10 interesting stories served every morning and every evening.




1 1,215 shares, 57 trendiness

The Singularity will Occur on a Tuesday

Everyone in San Francisco is talk­ing about the sin­gu­lar­ity. At din­ner par­ties, at cof­fee shops, at the OpenClaw meetup where Ashton Kutcher showed up for some rea­son. The con­ver­sa­tions all have the same shape: some­one says it’s com­ing, some­one says it’s hype, and no­body has a num­ber.

This seems like the wrong ques­tion. If things are ac­cel­er­at­ing (and they mea­sur­ably are) the in­ter­est­ing ques­tion is­n’t whether. It’s when. And if it’s ac­cel­er­at­ing, we can cal­cu­late ex­actly when.

I col­lected five real met­rics of AI progress, fit a hy­per­bolic model to each one in­de­pen­dently, and found the one with gen­uine cur­va­ture to­ward a pole. The date has mil­lisec­ond pre­ci­sion. There is a count­down.

Five metrics, chosen for what I’m calling their “anthropic significance” (anthropic here in the Greek sense, “pertaining to humans,” not the company, though they appear in the dataset with suspicious frequency):

MMLU scores: benchmark capability

Tokens per dollar: cost collapse of intelligence (log-transformed, because the Gemini Flash outlier spans 150× the range otherwise)

Release intervals: time between model releases (inverted, so shorter = better)

arXiv “emergent”: the count of AI papers about emergence

Copilot (only two data points)

Each metric normalized to [0, 1]. Release intervals inverted (shorter = better). Tokens per dollar log-transformed before normalizing (the raw values span five orders of magnitude; without the log, Gemini Flash at 2.5M tokens/$ dominates the fit and everything else is noise). Each series keeps its own scale; no merging into a single ensemble.

An exponential approaches infinity only as t → ∞. You’d be waiting forever. Literally.

We need a function that hits infinity at a finite time. That’s the whole point of a singularity: a pole, a vertical asymptote, the math breaking:

y(t) = a / (t_s − t) + b

As t → t_s, the denominator goes to zero. y → ∞. Not a bug. The feature.

Polynomial growth (y = t^k) never reaches infinity at finite time. You could wait until heat death and y would still be finite. Polynomials are for people who think AGI is decades away.

Exponential growth reaches infinity at t = ∞. Technically a singularity, but an infinitely patient one. Moore’s Law was exponential. We are no longer on Moore’s Law.

Hyperbolic growth is what hap­pens when the thing that’s grow­ing ac­cel­er­ates its own growth. Better AI → bet­ter AI re­search tools → bet­ter AI → bet­ter tools. Positive feed­back with supra­lin­ear dy­nam­ics. The sin­gu­lar­ity is real and fi­nite.
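To make the three regimes concrete, here’s a toy comparison (my numbers, not the post’s) of how each behaves as t approaches a hypothetical pole at t_s = 10:

```python
import math

ts = 10.0  # hypothetical pole, chosen purely for illustration
for t in [5.0, 9.0, 9.9, 9.99]:
    poly = t ** 3               # polynomial: finite at every finite t
    expo = math.exp(t)          # exponential: large, but still finite
    hyper = 1.0 / (ts - t)      # hyperbolic: diverges as t -> ts
    print(f"t={t:5.2f}  poly={poly:8.1f}  exp={expo:11.1f}  hyper={hyper:8.1f}")
```

Only the hyperbolic column blows up before t = 10; the other two stay finite at the pole itself.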

The pro­ce­dure is straight­for­ward, which should con­cern you.

The model fits a separate hyperbola to each metric:

y_i(t) = a_i / (t_s − t) + b_i

Each series gets its own scale a_i and offset b_i. The singularity time t_s is shared. MMLU scores and tokens-per-dollar have no business being on the same y-axis, but they can agree on when the pole is.

For each candidate t_s, the per-series fits are linear in a_i and b_i. The question is: which t_s makes the hyperbola fit best?
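The post doesn’t include its code, but the sub-fit it describes is a short least-squares step. A minimal sketch, assuming the per-series model y = a/(t_s − t) + b (variable names mine):

```python
import numpy as np

def fit_series(t, y, ts):
    """Least-squares fit of y = a/(ts - t) + b for a *fixed* candidate ts.
    Holding ts constant makes the model linear in (a, b): ordinary least
    squares on the transformed regressor x = 1/(ts - t)."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    x = 1.0 / (ts - t)                         # requires ts > max(t)
    A = np.column_stack([x, np.ones_like(x)])  # design matrix [x, 1]
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(((y - (a * x + b)) ** 2).sum())
    return a, b, rss
```

Sweeping a grid of candidate dates and calling this at each one gives the per-series profile that the search below operates on.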

Here’s the thing nobody tells you about fitting singularities: most metrics don’t actually have one. If you minimize total RSS across all series, the best t_s is always at infinity. A distant hyperbola degenerates into a line, and lines fit noisy data just fine. The “singularity date” ends up being whatever you set as the search boundary. You’re finding the edge of your search grid, not a singularity.

So instead, we look for the real signal. For each series independently, grid search t_s and find the peak: the date where the hyperbola fits better than any nearby alternative. If a series genuinely curves toward a pole, its R² will peak at some finite t_s and then decline. If it’s really just linear, R² will keep increasing as t_s → ∞ and never peak. No peak, no signal, no vote!
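That per-series search can be sketched as follows (self-contained, same assumed model as above; R² is just 1 − RSS/TSS, used only to compare candidate dates):

```python
import numpy as np

def r_squared(t, y, ts):
    """R^2 of the hyperbola y = a/(ts - t) + b with (a, b) refit at this
    fixed candidate ts (linear least squares on x = 1/(ts - t))."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    A = np.column_stack([1.0 / (ts - t), np.ones(t.size)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - float(resid @ resid) / float(((y - y.mean()) ** 2).sum())

def peak_ts(t, y, grid):
    """Grid-search candidate singularity dates for one series. Returns the
    date maximizing R^2, or None when R^2 is still rising at the far edge
    of the grid -- the 'no peak, no signal, no vote' case."""
    scores = [r_squared(t, y, ts) for ts in grid]
    best = int(np.argmax(scores))
    return None if best == len(grid) - 1 else grid[best]
```

A genuinely hyperbolic series returns a finite date; exactly linear data keeps improving toward the grid edge and returns None.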

One series peaks! arXiv “emergent” (the count of AI papers about emergence) has a clear, unambiguous maximum. The other four are monotonically better fit by a line. The singularity date comes from the one metric that’s actually going hyperbolic.

This is more hon­est than forc­ing five met­rics to av­er­age out to a date that none of them in­di­vid­u­ally sup­port.

Same in­puts → same date. Deterministic. The sto­chas­tic­ity is in the uni­verse, not the model.

The fit converged! Each series has its own R² at the shared t_s, so you can see exactly which metrics the hyperbola captures well and which it doesn’t. arXiv’s R² is the one that matters. It’s the series that actually peaked.

The 95% confidence interval comes from profile likelihood on t_s. We slide the singularity date forward and backward until the fit degrades past an F-threshold.
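A rough sketch of that procedure (the exact F cutoff depends on degrees of freedom; `crit` below is a placeholder of mine, not the post’s value):

```python
import numpy as np

def rss_at(t, y, ts):
    """RSS of y = a/(ts - t) + b with (a, b) refit at this fixed ts."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    A = np.column_stack([1.0 / (ts - t), np.ones(t.size)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return float(resid @ resid)

def profile_interval(t, y, ts_hat, grid, crit=4.0):
    """Profile-likelihood-style CI for ts: keep candidate dates whose RSS
    stays within an F-style factor of the minimum. crit ~ 4 stands in for
    the 95% F(1, n - 3) cutoff -- look it up properly for real use."""
    n = len(t)
    rss_min = rss_at(t, y, ts_hat)
    keep = [ts for ts in grid
            if rss_at(t, y, ts) <= rss_min * (1.0 + crit / (n - 3))]
    return min(keep), max(keep)
```

Sliding ts away from the optimum inflates the RSS; the interval is wherever the inflation stays under the threshold.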

How much does the date move if we drop one met­ric en­tirely?

If dropping a single series shifts t_s by years, that series was doing all the work. If the shifts are zero, the dropped series never had a signal in the first place.
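The drop-one check is generic; in the sketch below, `estimate_ts` is a stand-in for the full pipeline (hypothetical signature: dict of series → estimated date):

```python
def leave_one_out(series, estimate_ts):
    """Re-run the date estimate with each metric dropped in turn.
    Returns {name: shift in the estimated date when that series is removed},
    so a large shift means that series was doing the work."""
    base = estimate_ts(series)
    return {
        name: estimate_ts({k: v for k, v in series.items() if k != name}) - base
        for name in series
    }
```

With a dummy estimator that jumps when one key series is missing, the shift table singles that series out and reports zero for the rest.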

The table tells the story plainly: arXiv is doing all the work. Drop it and the date jumps to the search boundary (no remaining series has a finite peak). Drop anything else and nothing moves. They were never contributing to the date, only providing context curves at the shared t_s.

Note: Copilot has exactly 2 data points and 2 parameters (a_i and b_i), so it fits any hyperbola perfectly. Zero RSS, zero influence on t_s. It’s along for the ride!

The model says y → ∞ at t = t_s. But what does “infinity” mean for arXiv papers about emergence? It doesn’t mean infinitely many papers get published on a Tuesday in 2034.

It means the model breaks. t_s is the point where the current trajectory’s curvature can no longer be sustained. The system either breaks through into something qualitatively new, or it saturates and the hyperbola was wrong. A phase transition marker, not a physical prediction.

But here’s the part that should un­set­tle you: the met­ric that’s ac­tu­ally go­ing hy­per­bolic is hu­man at­ten­tion, not ma­chine ca­pa­bil­ity.

MMLU, to­kens per dol­lar, re­lease in­ter­vals. The ac­tual ca­pa­bil­ity and in­fra­struc­ture met­rics. All lin­ear. No pole. No sin­gu­lar­ity sig­nal. The only curve point­ing at a fi­nite date is the count of pa­pers about emer­gence. Researchers notic­ing and nam­ing new be­hav­iors. Field ex­cite­ment, mea­sured memet­i­cally.

The data says: ma­chines are im­prov­ing at a con­stant rate. Humans are freak­ing out about it at an ac­cel­er­at­ing rate that ac­cel­er­ates its own ac­cel­er­a­tion.

That’s a very dif­fer­ent sin­gu­lar­ity than the one peo­ple ar­gue about.

If t_s marks when the rate of AI surprises exceeds human capacity to process them, the interesting question isn’t what happens to the machines. It’s what happens to us.

And the un­com­fort­able an­swer is: it’s al­ready hap­pen­ing.

The labor market isn’t adjusting. It’s snapping. In 2025, 1.1 million layoffs were announced. Only the sixth time that threshold has been breached since 1993. Over 55,000 explicitly cited AI. But HBR found that companies are cutting based on AI’s potential, not its performance. The displacement is anticipatory. The curve doesn’t need to reach the pole. It just needs to look like it will.

Institutions can’t keep up. The EU AI Act’s high-risk rules have already been delayed to 2027. The US revoked its own 2023 AI executive order in January 2025, then issued a new one in December trying to preempt state laws. California and Colorado are going their own way anyway. The laws being written today regulate 2023’s problems. By the time legislation catches up to GPT-4, we’re on GPT-7. When governments visibly can’t keep up, trust doesn’t erode. It collapses. Global trust in AI has dropped to 56%.

Capital is con­cen­trat­ing at dot-com lev­els. The top 10 S&P 500 stocks (almost all AI-adjacent) hit 40.7% of in­dex weight in 2025, sur­pass­ing the dot-com peak. Since ChatGPT launched, AI-related stocks have cap­tured 75% of S&P 500 re­turns, 80% of earn­ings growth, and 90% of cap­i­tal spend­ing growth. The Shiller CAPE is at 39.4. The last time it was this high was 1999. The money flood­ing in does­n’t re­quire AI to ac­tu­ally reach su­per­in­tel­li­gence. It just re­quires enough peo­ple to be­lieve the curve keeps go­ing up.

People are losing the thread. Therapists are reporting a surge in what they’re calling FOBO (Fear of Becoming Obsolete). The clinical language is striking: patients describe it as “the universe saying, ‘You are no longer needed.’” 60% of US workers believe AI will cut more jobs than it creates. AI usage is up 13% year-over-year, but confidence in it has dropped 18%. The more people use it, the less they trust it.

The epis­temics are crack­ing. Less than a third of AI re­search is re­pro­ducible. Under 5% of re­searchers share their code. Corporate labs are pub­lish­ing less. The gap be­tween what fron­tier labs know and what the pub­lic knows is grow­ing, and the peo­ple mak­ing pol­icy are op­er­at­ing on in­for­ma­tion that’s al­ready ob­so­lete. The ex­perts who tes­tify be­fore Congress con­tra­dict each other, be­cause the field is mov­ing faster than ex­per­tise can form.

The politics are realigning. TIME is writing about populist AI backlash. Foreign Affairs published “The Coming AI Backlash: How the Anger Economy Will Supercharge Populism.” HuffPost says AI will define the 2026 midterms. MAGA is splitting over whether AI is pro-business or anti-worker. Sanders proposed a data center moratorium. The old left-right axis is buckling under the weight of a question it wasn’t built to answer.

All of this is happening eight years before t_s. The social singularity is front-running the technical one. The institutional and psychological disruption doesn’t wait for capabilities to go vertical. It starts as soon as the trajectory becomes legible.

The pole at t_s isn’t when machines become superintelligent. It’s when humans lose the ability to make coherent collective decisions about machines. The actual capabilities are almost beside the point. The social fabric frays at the seams of attention and institutional response time, not at the frontier of model performance.

The date comes from one series. arXiv “emergent” is the only metric with genuine hyperbolic curvature. The other four are better fit by straight lines. The singularity date is really “the date when AI emergence research goes vertical.” Whether field excitement is a leading indicator or a lagging one is the crux of whether this means anything.

The model assumes stationarity. Like assuming the weather will “continue to be changing.” The curve will bend, either into a logistic (the hype saturates) or into something the model can’t represent (genuine phase transition). t_s marks where the current regime can’t continue, not what comes after.

MMLU is hitting its ceiling. Benchmark saturation introduces a leptokurtic compression artifact. MMLU’s low R² reflects this. The hyperbola is the wrong shape for saturating data.

Tokens per dollar is log-transformed (values span five orders of magnitude) and non-monotonic (GPT-4 cost more than 3.5; Opus 4.5 costs more than DeepSeek-R1). The cost curve isn’t smooth: it’s Pareto advances interspersed with “we spent more on this one.”

Five met­rics is­n’t enough. More se­ries with gen­uine hy­per­bolic cur­va­ture would make the date less de­pen­dent on arXiv alone. A proper study would add SWE-bench, ARC, GPQA, com­pute pur­chases, tal­ent salaries. I used five be­cause five fits in a table.

Copilot has two data points. Two pa­ra­me­ters, two points, zero de­grees of free­dom, zero RSS con­tri­bu­tion. The sen­si­tiv­ity analy­sis con­firms it does­n’t mat­ter.

The math found one met­ric curv­ing to­ward a pole on a spe­cific day at a spe­cific mil­lisec­ond: the rate at which hu­mans are dis­cov­er­ing emer­gent AI be­hav­iors. The other four met­rics are lin­ear. The ma­chines are im­prov­ing steadily. We are the ones ac­cel­er­at­ing!

The so­cial con­se­quences of that ac­cel­er­a­tion (labor dis­place­ment, in­sti­tu­tional fail­ure, cap­i­tal con­cen­tra­tion, epis­temic col­lapse, po­lit­i­cal re­align­ment) are not pre­dic­tions for 2034. They are de­scrip­tions of 2026. The sin­gu­lar­ity in the data is a sin­gu­lar­ity in hu­man at­ten­tion, and it is al­ready ex­ert­ing grav­i­ta­tional force on every­thing it touches.

I see no rea­son to let epis­te­mo­log­i­cal hu­mil­ity in­ter­fere with a per­fectly good timer.

See you on the other side!

...

Read the original on campedersen.com »

2 1,034 shares, 41 trendiness

Europe's $24 Trillion Breakup With Visa and Mastercard

ECB President Christine Lagarde has called for Europe to break its de­pen­dence on American pay­ment in­fra­struc­ture, warn­ing that every card trans­ac­tion sends European con­sumer data to the United States. A coali­tion of 16 banks thinks it has the an­swer.

What’s happening? ECB President Christine Lagarde told Irish radio that Europe needs its own digital payment system “urgently,” warning that virtually all European card and mobile payments currently run through non-European infrastructure controlled by Visa, Mastercard, PayPal or Alipay. Days later, on 2 February, the European Payments Initiative (EPI) and the EuroPA Alliance signed a landmark agreement to build a pan-European interoperable payment network covering 130 million users across 13 countries. The system, built around the digital wallet Wero, aims to let Europeans pay and transfer money across borders without touching a single American network.

Every time a European taps a card, pays on­line or splits a bill with friends, the trans­ac­tion flows through in­fra­struc­ture owned and op­er­ated by American com­pa­nies. Visa and Mastercard to­gether process ap­prox­i­mately $24 tril­lion in trans­ac­tions an­nu­ally. Card pay­ments ac­count for 56% of all cash­less trans­ac­tions in the EU. And the data — who bought what, where, when and for how much — leaves European ju­ris­dic­tion every time.

“It’s important for us to have digital payment under our control,” Lagarde told The Pat Kenny Show. “Whether you use a card or whether you use a phone, typically it goes through Visa, Mastercard, PayPal, Alipay. Where are all those coming from? Well, either the US or China.”

The host’s response — “I didn’t realise this” — captured the broader European blind spot. Most consumers have no idea that their payment data routinely exits the EU. In a geopolitical environment where Europe is scrambling to reduce dependence on the United States across defence, energy and trade, payments remain an overlooked vulnerability.

The les­son of Russia sharp­ened the ur­gency. When Western sanc­tions cut Russia off from Visa and Mastercard in 2022, the coun­try’s do­mes­tic pay­ments were im­me­di­ately dis­rupted. European pol­i­cy­mak­ers asked the ob­vi­ous ques­tion: what would hap­pen if the US de­cided — or was pres­sured — to re­strict European ac­cess to those same net­works?

The European Payments Initiative, a con­sor­tium of 16 ma­jor banks and pay­ment proces­sors in­clud­ing BNP Paribas, Deutsche Bank and Worldline, launched Wero in July 2024 as Europe’s an­swer. Built on SEPA in­stant credit trans­fers, Wero lets users send money us­ing just a phone num­ber — no IBAN, no card, no in­ter­me­di­ary.

The num­bers so far are en­cour­ag­ing. Wero al­ready has over 47 mil­lion reg­is­tered users in Belgium, France and Germany, has processed over €7.5 bil­lion in trans­fers, and counts more than 1,100 mem­ber in­sti­tu­tions. Retail pay­ments went live in Germany at the end of 2025, with mer­chants in­clud­ing Lidl, Decathlon, Rossmann and Air Europa al­ready ac­cept­ing Wero on­line. France and Belgium fol­low in 2026.

But the real break­through came on 2 February, when EPI signed a mem­o­ran­dum of un­der­stand­ing with the EuroPA Alliance — a coali­tion of na­tional pay­ment sys­tems in­clud­ing Italy’s Bancomat, Spain’s Bizum, Portugal’s MB WAY and the Nordics’ Vipps MobilePay. The deal in­stantly con­nects ap­prox­i­mately 130 mil­lion users across 13 coun­tries, cov­er­ing roughly 72% of the EU and Norway pop­u­la­tion. Cross-border peer-to-peer pay­ments launch this year, with e-com­merce and point-of-sale pay­ments fol­low­ing in 2027.

“European payment sovereignty is not a vision, but a reality in the making,” said Martina Weimert, CEO of EPI.

Europe has tried this be­fore. The Monnet Project, launched in 2008 by twenty European banks, col­lapsed in 2012. The orig­i­nal EPI vi­sion it­self was scaled back af­ter sev­eral found­ing mem­bers with­drew, forc­ing a pivot from a full card-re­place­ment scheme to a nar­rower ac­count-to-ac­count model.

The core prob­lem has al­ways been frag­men­ta­tion. Each EU coun­try de­vel­oped its own do­mes­tic pay­ment so­lu­tion — Bizum in Spain, iDEAL in the Netherlands, Payconiq in Belgium, Girocard in Germany — but none could work across bor­ders. A Belgian con­sumer buy­ing from a Dutch re­tailer still needed Visa or Mastercard. National pride and com­pet­ing bank­ing in­ter­ests re­peat­edly sab­o­taged at­tempts at uni­fi­ca­tion.

The net­work ef­fect com­pounds the chal­lenge. Merchants ac­cept Visa and Mastercard be­cause con­sumers carry them. Consumers carry them be­cause mer­chants ac­cept them. Breaking that loop re­quires ei­ther reg­u­la­tory force or a crit­i­cal mass of users large enough to make mer­chants care — which is pre­cisely what the EuroPA deal at­tempts to de­liver by con­nect­ing ex­ist­ing na­tional user bases rather than build­ing from scratch.

Running in parallel is the ECB’s digital euro project, which would create a central bank-backed digital currency usable across the eurozone. EU finance ministers have accelerated discussions on the initiative, though the European Parliament has not yet passed the required legislation. Once approved, the ECB estimates it would need a further two to three years to launch.

EPI is care­ful to dis­tin­guish Wero from the dig­i­tal euro. Wero is a pri­vate-sec­tor ini­tia­tive; the dig­i­tal euro is pub­lic money. They are de­signed to com­ple­ment rather than com­pete — though the over­lap in am­bi­tion is ob­vi­ous. Both ex­ist be­cause Europe’s po­lit­i­cal es­tab­lish­ment has fi­nally ac­cepted that pay­ments sov­er­eignty is as strate­gi­cally im­por­tant as en­ergy in­de­pen­dence or de­fence au­ton­omy.

Sceptics have good reasons for doubt. Creating a viable alternative to Visa and Mastercard requires “several billion euros” in investment, according to EPI’s own estimates. Low interchange fees under EU regulation make profitability difficult. Consumer habits are deeply entrenched — and neither Visa nor Mastercard will sit idle while Europe tries to dismantle their most profitable market.

Weimert herself concedes that calling Wero a “challenger” may be premature, describing it as functioning like a startup — albeit one with €500 million in backing and 47 million users already on board.

But the political tailwinds are stronger than they have ever been. The EU’s instant payments regulation, the Capital Markets Union push, the broader drive for European strategic autonomy in a world of tariff wars and great power rivalry — all point in the same direction. The question is no longer whether Europe wants its own payment infrastructure. It is whether it can execute fast enough to matter.

As Lagarde put it: “We have the assets and opportunities to do that ourselves. And if we were to remove the internal barriers that we have set for ourselves in Europe, our economic wealth would increase significantly.”

...

Read the original on europeanbusinessmagazine.com »

3 760 shares, 31 trendiness

Google Fulfilled ICE Subpoena Demanding Student Journalist’s Bank and Credit Card Numbers

Google ful­filled an Immigration and Customs Enforcement sub­poena that de­manded a wide ar­ray of per­sonal data on a stu­dent ac­tivist and jour­nal­ist, in­clud­ing his credit card and bank ac­count num­bers, ac­cord­ing to a copy of an ICE sub­poena ob­tained by The Intercept.

Amandla Thomas-Johnson had at­tended a protest tar­get­ing com­pa­nies that sup­plied weapons to Israel at a Cornell University job fair in 2024 for all of five min­utes, but the ac­tion got him banned from cam­pus. When President Donald Trump as­sumed of­fice and is­sued a se­ries of ex­ec­u­tive or­ders tar­get­ing stu­dents who protested in sup­port of Palestinians, Thomas-Johnson and his friend Momodou Taal went into hid­ing.

Google informed Thomas-Johnson via a brief email in April that it had already shared his metadata with the Department of Homeland Security, as The Intercept previously reported. But the full extent of the information the agency sought — including usernames, addresses, an itemized list of services (including any IP masking services), telephone or instrument numbers, subscriber numbers or identities, and credit card and bank account numbers — was not previously known.

“I’d already seen the subpoena request that Google and Meta had sent to Momodou [Taal], and I knew that he had gotten in touch with a lawyer and the lawyer successfully challenged that,” Thomas-Johnson said. “I was quite surprised to see that I didn’t have that opportunity.”

The subpoena provides no justification for why ICE is asking for this information, except that it’s “required in connection with an investigation or inquiry relating to the enforcement of U.S. immigration laws.” In the subpoena, ICE requests that Google not “disclose the existence of this summons for indefinite period of time.”

Thomas-Johnson, who is British, be­lieves that ICE re­quested that in­for­ma­tion to track and even­tu­ally de­tain him — but he had al­ready fled to Geneva, Switzerland, and is now in Dakar, Senegal.

The Electronic Frontier Foundation, which is rep­re­sent­ing Thomas-Johnson, and the ACLU of Northern California sent a let­ter to Google, Amazon, Apple, Discord, Meta, Microsoft, and Reddit last week call­ing on tech com­pa­nies to re­sist sim­i­lar sub­poe­nas in the fu­ture from DHS with­out court in­ter­ven­tion. The let­ter asks the com­pa­nies to pro­vide users with as much no­tice as pos­si­ble be­fore com­ply­ing with a sub­poena to give them the op­por­tu­nity to fight it, and to re­sist gag or­ders that would pre­vent the tech com­pa­nies from in­form­ing tar­gets that a sub­poena was is­sued.

“Your promises to protect the privacy of users are being tested right now. As part of the federal government’s unprecedented campaign to target critics of its conduct and policies, agencies like DHS have repeatedly demanded access to the identities and information of people on your services,” the letter reads. “Based on our own contact with targeted users, we are deeply concerned your companies are failing to challenge unlawful surveillance and defend user privacy and speech.”

In addition to Thomas-Johnson’s case, the letter refers to other instances in which technology companies provided user data to DHS, including a subpoena sent to Meta to “unmask” the identities of users who documented immigration raids in California. Unlike Thomas-Johnson, users in that case were given the chance to fight the subpoena because they were made aware of it before Meta complied.

“Google has already fulfilled this subpoena,” an attorney for Google told Thomas-Johnson’s lawyer, as The Intercept previously reported. “Production consisted of basic subscriber information.”

The ICE sub­poena re­quested the de­tailed in­for­ma­tion linked to Thomas-Johnson’s Gmail ac­count. Thomas-Johnson con­firmed to The Intercept that he had at­tached his bank and credit card num­bers to his ac­count to buy apps.

Google did not re­spond to a re­quest for com­ment.

Lindsay Nash, a pro­fes­sor at Cardozo Law and a for­mer staff at­tor­ney with ACLU Immigrants’ Rights Project, said that by not giv­ing prior no­tice, Google de­prived Thomas-Johnson of his abil­ity to pro­tect his in­for­ma­tion.

“The problem is that it doesn’t allow the person whose personal information is on the line and whose privacy may be being invaded to raise challenges to the disclosure of that potentially private information,” Nash said. “And I think that’s important to protect rights that they may have to their own information.”

Tech com­pa­nies’ data shar­ing prac­tices are pri­mar­ily gov­erned by two fed­eral laws, the Stored Communications Act, which pro­tects the pri­vacy of dig­i­tal com­mu­ni­ca­tions, in­clud­ing emails, and Section 5 of the Federal Trade Commission Act, which pro­hibits un­fair or de­cep­tive trade prac­tices.

“Under both federal law and the law of every state, you cannot deceive consumers,” said Neil Richards, a law professor at Washington University in St. Louis who specializes in privacy, the internet, and civil liberties. “And if you make a material misrepresentation about your data practices, that’s a deceptive trade practice.”

Whether or not cor­po­ra­tions are clear enough with con­sumers about how they col­lect and share their data has been lit­i­gated for decades, Richards said, ref­er­enc­ing the in­fa­mous Cambridge Analytica law­suit brought by the Federal Trade Commission, al­leg­ing that the com­pany mis­led Facebook users about data col­lec­tion and shar­ing.

Google’s public privacy policy acknowledges that it will share personal information in response to an “enforceable governmental request,” adding that its legal team will “frequently push back when a request appears to be overly broad or doesn’t follow the correct process.”

According to Google, the com­pany over­whelm­ingly com­plied with the mil­lions of re­quests made by the gov­ern­ment for user in­for­ma­tion over the last decade. Its data also shows that those re­quests have spiked over the last five years. It’s un­clear how many of those users were given no­tice of those re­quests ahead of time or af­ter.

Richards said that cases like these emphasize the need for legal reforms around data privacy and urged Congress to amend the Stored Communications Act to require a higher standard before the government can access our digital data. He also said the federal government needs to regulate Big Tech and place “substantive restrictions on their ability to share information with the government.”

It’s hard to know exactly how tech companies are handling our personal data in relation to the government, but there seems to have been a shift in optics, Richards said. “What we have seen in the 12 months since the leaders of Big Tech were there on the podium at the inauguration,” Richards said, “is much more friendliness of Big Tech towards the government and towards state power.”

From Dakar, Thomas-Johnson said that un­der­stand­ing the ex­tent of the sub­poena was ter­ri­fy­ing but had not changed his com­mit­ment to his work.

“As a journalist, what’s weird is that you’re so used to seeing things from the outside,” said Thomas-Johnson, whose work has appeared in outlets including Al Jazeera and The Guardian. “We need to think very hard about what resistance looks like under these conditions… where government and Big Tech know so much about us, can track us, can imprison, can destroy us in a variety of ways.”

This story has been up­dated to re­flect that Thomas-Johnson’s le­gal team still does not know the full ex­tent of the in­for­ma­tion that Google pro­vided to ICE, but that Thomas-Johnson said his bank and credit card num­bers were at­tached to his ac­count.

...

Read the original on theintercept.com »

4 758 shares, 30 trendiness

I Started Programming When I Was 7. I'm 50 Now, and the Thing I Loved Has Changed

I wrote my first line of code in 1983. I was seven years old, typ­ing BASIC into a ma­chine that had less pro­cess­ing power than the chip in your wash­ing ma­chine. I un­der­stood that ma­chine com­pletely. Every byte of RAM had a pur­pose I could trace. Every pixel on screen was there be­cause I’d put it there. The path from in­ten­tion to re­sult was di­rect, vis­i­ble, and mine.

Forty-two years later, I’m sit­ting in front of hard­ware that would have seemed like sci­ence fic­tion to that kid, and I’m try­ing to fig­ure out what building things” even means any­more.

This is­n’t a rant about AI. It’s not a back in my day” piece. It’s some­thing I’ve been cir­cling for months, and I think a lot of ex­pe­ri­enced de­vel­op­ers are cir­cling it too, even if they haven’t said it out loud yet.

My favourite pe­riod of com­put­ing runs from the 8-bits through to about the 486DX2-66. Every ma­chine in that era had char­ac­ter. The Sinclair Spectrum with its at­tribute clash. The Commodore 64 with its SID chip do­ing things the de­sign­ers never in­tended. The NES with its 8-sprite-per-scanline limit that made de­vel­op­ers in­vent flick­er­ing tricks to cheat the hard­ware. And the PC — start­ing life as a bor­ing beige box for spread­sheets, then evolv­ing at break­neck pace through the 286, 386, and 486 un­til it be­came a gam­ing pow­er­house that could run Doom. You could feel each gen­er­a­tion leap. Upgrading your CPU was­n’t a spec sheet ex­er­cise — it was trans­for­ma­tive.

These weren’t just products. They were engineering adventures with visible tradeoffs. You had to understand the machine to use it. IRQ conflicts, DMA channels, CONFIG.SYS and AUTOEXEC.BAT optimisation, memory managers — getting a game to run was the game. You weren’t just a user. You were a systems engineer by necessity.

And the soft­ware side matched. Small teams like id Software were go­ing their own way, mak­ing bold tech­ni­cal de­ci­sions be­cause no­body had writ­ten the rules yet. Carmack’s ray­cast­ing in Wolfenstein, the VGA Mode X tricks in Doom — these were peo­ple push­ing against real con­straints and pro­duc­ing some­thing gen­uinely new. Creative con­straints bred cre­ativ­ity.

Then it pro­fes­sion­alised. Plug and Play ar­rived. Windows ab­stracted every­thing. The Wild West closed. Computers stopped be­ing fas­ci­nat­ing, can­tan­ker­ous ma­chines that de­manded re­spect and un­der­stand­ing, and be­came ap­pli­ances. The craft be­came in­vis­i­ble.

But it was­n’t just the craft that changed. The promise changed.

When I started, there was a gen­uine op­ti­mism about what com­put­ers could be. A kid with a Spectrum could teach them­selves to build any­thing. The early web felt like the great­est lev­el­ling force in hu­man his­tory. Small teams made bold de­ci­sions be­cause no­body had writ­ten the rules yet.

That hope gave way to some­thing I find gen­uinely dis­taste­ful. The ma­chines I fell in love with be­came in­stru­ments of sur­veil­lance and ex­trac­tion. The plat­forms that promised to con­nect us were re­ally built to mon­e­tise us. The tin­kerer spirit did­n’t die of nat­ural causes — it was bought out and put to work op­ti­mis­ing ad clicks.

The thing I loved changed, and then it was put to work doing things I’m not proud to be associated with. That’s a different kind of loss than just “the tools moved on.”

But I adapted. That’s what ex­pe­ri­enced de­vel­op­ers, hu­man be­ings, do.

Over four decades I’ve been through more tech­nol­ogy tran­si­tions than I can count. New lan­guages, new plat­forms, new par­a­digms. CLI to GUI. Desktop to web. Web to mo­bile. Monoliths to mi­croser­vices. Tapes, floppy discs, hard dri­ves, SSDs. JavaScript frame­works ar­riv­ing and dy­ing like mayflies.

Each wave re­quired learn­ing new things, but the core skill trans­ferred. You learned the new plat­form, you ap­plied your ex­ist­ing un­der­stand­ing of how sys­tems work, and you kept build­ing. The tool changed; the craft did­n’t. You were still the per­son who un­der­stood why things broke, how sys­tems com­posed, where to­day’s short­cut be­came next mon­th’s mess.

I’ve writ­ten pro­duc­tion code in more lan­guages than some de­vel­op­ers have heard of. I’ve shipped soft­ware on plat­forms that no longer ex­ist. I’ve chased C-beams off the shoul­der of Orion. And every time the in­dus­try lurched in a new di­rec­tion, the ex­pe­ri­ence com­pounded. You did­n’t start over. You brought every­thing with you and ap­plied it some­where new.

That’s the deal ex­pe­ri­enced de­vel­op­ers made with the in­dus­try: things change, but un­der­stand­ing en­dures.

I say that know­ing how of­ten those words have been wrong through­out his­tory. But hear me out.

Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this.

I no­ticed it grad­u­ally. I’d be work­ing on some­thing — build­ing a fea­ture, de­sign­ing an ar­chi­tec­ture — and I’d re­alise I was still do­ing the same thing I’d al­ways done, just with the in­ter­est­ing bits hol­lowed out. The part where you fig­ure out the el­e­gant so­lu­tion, where you wres­tle with the con­straints, where you feel the sat­is­fac­tion of some­thing click­ing into place — that was in­creas­ingly be­ing han­dled by a model that does­n’t care about el­e­gance and has never felt sat­is­fac­tion.

I’m not typ­ing the code any­more. I’m re­view­ing it, di­rect­ing it, cor­rect­ing it. And I’m good at that — 42 years of ac­cu­mu­lated judg­ment about what works and what does­n’t, what’s el­e­gant ver­sus what’s ex­pe­di­ent, how sys­tems com­pose and where they frac­ture. That’s valu­able. I know it’s valu­able. But it’s a dif­fer­ent kind of work, and it does­n’t feel the same.

The feed­back loop has changed. The in­ti­macy has gone. The thing that kept me up at night for decades — the puz­zle, the chase, the mo­ment where you fi­nally un­der­stand why some­thing is­n’t work­ing — that’s been com­pressed into a prompt and a re­sponse. And I’m watch­ing peo­ple with a frac­tion of my ex­pe­ri­ence pro­duce su­per­fi­cially sim­i­lar out­put. The craft dis­tinc­tion is real, but it’s harder to see from the out­side. Harder to value. Maybe harder to feel in­ter­nally.

Here’s the part that makes me laugh, darkly.

I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.

They’re writ­ing TypeScript that com­piles to JavaScript that runs in a V8 en­gine writ­ten in C++ that’s mak­ing sys­tem calls to an OS ker­nel that’s sched­ul­ing threads across cores they’ve never thought about, hit­ting RAM through a mem­ory con­troller with caching lay­ers they could­n’t di­a­gram, all while npm pulls in 400 pack­ages they’ve never read a line of.

But sure. AI is the mo­ment they lost track of what’s hap­pen­ing.

The ab­strac­tion ship sailed decades ago. We just did­n’t no­tice be­cause each layer ar­rived grad­u­ally enough that we could pre­tend we still un­der­stood the whole stack. AI is just the layer that made the pre­tence im­pos­si­ble to main­tain.

The dif­fer­ence is: I re­mem­ber what it felt like to un­der­stand the whole ma­chine. I’ve had that ex­pe­ri­ence. And los­ing it — even ac­knowl­edg­ing that it was lost long be­fore AI ar­rived — is a kind of grief that some­one who never had it can’t fully feel.

I don’t want to be dis­hon­est about this. There’s a ver­sion of this post where I tell you that ex­pe­ri­ence is more valu­able than ever, that sys­tems think­ing and ar­chi­tec­tural judg­ment are the things AI can’t re­place, that the craft en­dures in a dif­fer­ent form.

And that’s true. When I’m work­ing on some­thing com­plex — jug­gling sys­tem-level de­pen­den­cies, hold­ing a men­tal model across mul­ti­ple in­ter­act­ing spec­i­fi­ca­tions, mak­ing the thou­sand small de­ci­sions that de­ter­mine whether some­thing feels co­her­ent or just works — I can see how I still bring some­thing AI does­n’t. The taste. The judg­ment. The pat­tern recog­ni­tion from decades of see­ing things go wrong.

AI tools ac­tu­ally make that kind of think­ing more valu­able, not less. When code gen­er­a­tion is cheap, the bot­tle­neck shifts to the per­son who knows what to ask for, can spot when the out­put is sub­tly wrong, and can hold the whole pic­ture to­gether. Typing was never the hard part.

But I’d be ly­ing if I said it felt the same. It does­n’t. The won­der is harder to ac­cess. The sense of dis­cov­ery, of fig­ur­ing some­thing out through sheer per­sis­tence and in­ge­nu­ity — that’s been com­pressed. Not elim­i­nated, but com­pressed. And some­thing is lost in the com­pres­sion, even if some­thing is gained.

I turned 50 re­cently. Four decades of in­ten­sity, of craft­ing and find­ing sat­is­fac­tion and iden­tity in the build­ing.

And now I’m in what I’ve started calling a fallow period. Not burnout, exactly. More like the ground shifting under a building you thought, however much it changed, had a kind of permanence, and trying to figure out where the new foundation is.

I don’t have a neat conclusion. I’m not going to tell you that experienced developers just need to “push themselves up the stack” or “embrace the tools” or “focus on what AI can’t do.” All of that is probably right, and none of it addresses the feeling.

The feel­ing is: I gave 42 years to this thing, and the thing changed into some­thing I’m not sure I recog­nise any­more. Not worse, nec­es­sar­ily. Just dif­fer­ent. And dif­fer­ent in a way that chal­lenges the iden­tity I built around it and does­n’t sat­isfy in the way it did.

I suspect a lot of developers over 40 are feeling something similar and not saying it, because the industry worships youth and adaptability, and saying “this doesn’t feel like it used to” sounds like you’re falling behind.

I’m not falling be­hind. I’m mov­ing ahead, tak­ing ad­van­tage of the new tools, build­ing faster than ever, and us­ing these tools to help oth­ers ac­cel­er­ate their own work. I’m cre­at­ing prod­ucts I could only have dreamt of a few years ago. But at the same time I’m look­ing at the land­scape, try­ing to fig­ure out what build­ing means to me now. The world’s still fig­ur­ing out its shape too. Maybe that’s okay.

Maybe the fal­low pe­riod is the point. Not some­thing to push through, but some­thing to be in for a while.

I started pro­gram­ming when I was seven be­cause a ma­chine did ex­actly what I told it to, felt like some­thing I could ex­plore and ul­ti­mately know, and that felt like magic. I’m fifty now, and the magic is dif­fer­ent, and I’m learn­ing to sit with that.

...

Read the original on www.jamesdrandall.com »

5 588 shares, 20 trendiness

Our $200M Series C / Oxide

We have raised a $200M Series C, and yes, you are permitted a double take: didn’t we just raise a $100M Series B? And aren’t we the ones that are especially candid about the perils of raising too much money?

Well, yes, on both fronts, so let us explain a little. First, we have the luxury of having achieved real product-market fit: we are making a product that people want to buy. This takes on additional dimensions when making something physical: with complexities like manufacturing, inventory, cash-conversion, and shifting supply chains, product-market fit implies getting the unit economics of the business right. All of this is a long way of saying: we did not (and do not) need to raise capital to support the business.

So if we didn’t need to raise, why seek the capital? Well, we weren’t seeking it, really. But our investors, seeing the business take off, were eager to support it. And we, in turn, were eager to have them: they were the ones, after all, who joined us in taking a real leap when it felt like there was a lot more risk on the table. They understood our vision for the company and shared our love for customers and our desire to build a singular team. They had been with us in some difficult moments; they know and trust us, as do we them. So being able to raise a Series C purely from our existing investors presented a real opportunity.

Still, even from investors that we trust and with a quick close, if the business doesn’t need the money, does it make sense to raise? We have always believed that our biggest challenge at Oxide was time — and therefore capital. We spelled this out in our initial pitch deck from 2019. Six years later, we stand by this, which is not to minimize any of those challenges: the technical challenges were indeed hard; we feel fortunate to have attracted an extraordinary team; and we certainly caught some lucky breaks with respect to the market. With this large Series C, we have entirely de-risked capital going forward, which in turn assures our independence.

This last bit is really important, because any buyer of infrastructure has had their heart broken countless times by promising startups that succumbed to acquisition by one of the established players they were seeking to disrupt. The serial disappointments leave a refreshing bluntness in their wake, and it’s not uncommon for us to be asked directly: “How do I know you won’t be bought?”

Our intent in starting Oxide was not to be an acquisition target but rather to build a generational company; this is our life’s work, not a means to an end. With our Series C, customers don’t have to merely take our word for it: we have the capital to assure our survival into the indefinite future. If our Series B left us with confidence in achieving our mission, our Series C leaves us with certainty: we’re going to kick butt, have fun, not cheat (of course!), love our customers — and change computing forever.

...

Read the original on oxide.computer »

6 553 shares, 22 trendiness

Hello Entire World · Entire Blog

...

Read the original on entire.io »

7 500 shares, 68 trendiness

Common Vulnerabilities and Exposures (CVE)

...

Read the original on www.cve.org »

8 403 shares, 25 trendiness

The Day the telnet Died – GreyNoise Labs

...

Read the original on www.labs.greynoise.io »

9 306 shares, 74 trendiness

Trump administration says El Paso airspace closure was tied to Mexican cartel drones

WASHINGTON (AP) — The Federal Aviation Administration re­opened the air­space around El Paso International Airport in Texas on Wednesday morn­ing, just hours af­ter it an­nounced a 10-day clo­sure that would have grounded all flights to and from the air­port.

The Federal Aviation Administration said in a so­cial me­dia post that it has lifted the tem­po­rary clo­sure of the air­space over El Paso, say­ing there was no threat to com­mer­cial avi­a­tion and that all flights will re­sume.

Transportation Secretary Sean Duffy said in a post on X that the FAA and the Defense Department acted swiftly to address a cartel drone incursion. “The threat has been neutralized and there is no danger to commercial travel in the region.”

He said nor­mal flights are re­sum­ing Wednesday morn­ing.

He did not say how many drones were in­volved or what specif­i­cally was done to dis­able them.

The shutdown announced just hours earlier for “special security reasons” had been expected to create significant disruptions given the duration and the size of the metropolitan area.

El Paso, a border city with a population of nearly 700,000 people (larger when you include the surrounding metro area), is a hub of cross-border commerce alongside the neighboring city of Ciudad Juarez in Mexico. The brief closure does not include Mexican airspace.

The air­port said in an Instagram post af­ter the clo­sure was an­nounced that all flights to and from the air­port would be grounded from late Tuesday through late on Feb. 20, in­clud­ing com­mer­cial, cargo and gen­eral avi­a­tion flights. It sug­gested trav­el­ers con­tact their air­lines to get up-to-date flight in­for­ma­tion.

Rep. Veronica Escobar, a Democrat whose dis­trict in­cludes El Paso, had urged the FAA to lift the re­stric­tions in a state­ment Wednesday morn­ing. There was no ad­vance no­tice given to her of­fice, the city of El Paso or air­port op­er­a­tions, she said.

“The highly consequential decision by FAA to shut down the El Paso Airport for 10 days is unprecedented and has resulted in significant concern within the community,” Escobar said. “From what my office and I have been able to gather overnight and early this morning, there is no immediate threat to the community or surrounding areas.”

The air­port de­scribes it­self as the gate­way to west Texas, south­ern New Mexico and north­ern Mexico. Southwest, United, American and Delta all op­er­ate flights there, among oth­ers.

A similar temporary flight restriction for “special security reasons” over the same time period was imposed around Santa Teresa, New Mexico, which is about 15 miles (24 kilometers) northwest of the El Paso airport.

Southwest Airlines said in a state­ment that it has paused all op­er­a­tions to and from El Paso at the di­rec­tion of the FAA.

“We have notified affected customers and will share additional information as it becomes available,” Southwest Airlines said. “Nothing is more important to Southwest than the safety of its customers and employees.”

...

Read the original on apnews.com »

10 289 shares, 0 trendiness

The US Is Flirting With Its First-Ever Population Decline

If there’s one sin­gle con­sis­tent ad­van­tage the United States has car­ried since its found­ing, it is its abil­ity to draw tal­ent and ex­pand its pop­u­la­tion. Now, as the coun­try pre­pares to cel­e­brate its 250th birth­day and pon­ders its ap­petite for President Donald Trump’s crack­down on im­mi­gra­tion, the US risks record­ing a his­toric and eco­nomic mile­stone decades ahead of sched­ule: Based on at least one re­spected es­ti­mate, 2026 may see the first real pop­u­la­tion de­cline in American his­tory.

Even if that mile­stone does­n’t hap­pen this year, there’s broad agree­ment among ex­perts on both sides of the im­mi­gra­tion de­bate that Trump’s sec­ond term is has­ten­ing a crit­i­cal point — when net mi­gra­tion into the US stops off­set­ting the de­clin­ing births and ris­ing deaths that come with an ag­ing na­tive-born pop­u­la­tion. The more Trump cracks down on im­mi­gra­tion, the sooner the US pop­u­la­tion plateaus or even shrinks.

A coun­try’s pop­u­la­tion is an es­sen­tial el­e­ment of its eco­nomic mass. The shrink­ing pop­u­la­tion of China, which in 2025 recorded its low­est birth rate since Communist rule be­gan in 1949, is one good rea­son it may never over­take the US as the world’s largest econ­omy. Japan’s pop­u­la­tion peaked at 128 mil­lion in 2010, and its de­cline has dragged on growth for years. Europe’s wors­en­ing de­mo­graph­ics have long fed its nar­ra­tive of eco­nomic malaise.

The US has for years mostly stood apart from that conversation. In 2023, when the US Census Bureau last issued long-run forecasts for the population, the main prediction was that it would decline for the first time in 2081. But the way things are going, this year the US is at best poised to record a lower population growth rate than Germany, where an aging population has contributed to its reputation as the “sick man of Europe.”

To be clear, that’s not nec­es­sar­ily a prob­lem for Trump. His ad­min­is­tra­tion is fo­cused on de­liv­er­ing on his promise to re­duce the im­mi­grant pop­u­la­tion and ar­gues, de­spite the protes­ta­tions of econ­o­mists, that do­ing so will mean greater op­por­tu­ni­ties and wages for na­tive-born work­ers and will re­duce the cost of every­thing from hous­ing to health care by re­duc­ing de­mand.

...

Read the original on www.bloomberg.com »
