10 interesting stories served every morning and every evening.




1 2,821 shares, 109 trendiness

Statement from Dario Amodei on our discussions with the Department of War

I believe deeply in the existential importance of using AI to defend the United States and other democracies, and to defeat our autocratic adversaries. Anthropic has therefore worked proactively to deploy our models to the Department of War and the intelligence community. We were the first frontier AI company to deploy our models in the US government’s classified networks, the first to deploy them at the National Laboratories, and the first to provide custom models for national security customers. Claude is extensively deployed across the Department of War and other national security agencies for mission-critical applications, such as intelligence analysis, modeling and simulation, operational planning, cyber operations, and more.

Anthropic has also acted to defend America’s lead in AI, even when it is against the company’s short-term interest. We chose to forgo several hundred million dollars in revenue to cut off the use of Claude by firms linked to the Chinese Communist Party (some of whom have been designated by the Department of War as Chinese Military Companies), shut down CCP-sponsored cyberattacks that attempted to abuse Claude, and have advocated for strong export controls on chips to ensure a democratic advantage.

Anthropic understands that the Department of War, not private companies, makes military decisions. We have never raised objections to particular military operations nor attempted to limit use of our technology in an ad hoc manner.

However, in a narrow set of cases, we believe AI can undermine, rather than defend, democratic values. Some uses are also simply outside the bounds of what today’s technology can safely and reliably do. Two such use cases have never been included in our contracts with the Department of War, and we believe they should not be included now:

Mass domestic surveillance. We support the use of AI for lawful foreign intelligence and counterintelligence missions. But using these systems for mass domestic surveillance is incompatible with democratic values. AI-driven mass surveillance presents serious, novel risks to our fundamental liberties. To the extent that such surveillance is currently legal, this is only because the law has not yet caught up with the rapidly growing capabilities of AI. For example, under current law, the government can purchase detailed records of Americans’ movements, web browsing, and associations from public sources without obtaining a warrant, a practice the Intelligence Community has acknowledged raises privacy concerns and that has generated bipartisan opposition in Congress. Powerful AI makes it possible to assemble this scattered, individually innocuous data into a comprehensive picture of any person’s life—automatically and at massive scale.

Fully autonomous weapons. Partially autonomous weapons, like those used today in Ukraine, are vital to the defense of democracy. Even fully autonomous weapons (those that take humans out of the loop entirely and automate selecting and engaging targets) may prove critical for our national defense. But today, frontier AI systems are simply not reliable enough to power fully autonomous weapons. We will not knowingly provide a product that puts America’s warfighters and civilians at risk. We have offered to work directly with the Department of War on R&D to improve the reliability of these systems, but they have not accepted this offer. In addition, without proper oversight, fully autonomous weapons cannot be relied upon to exercise the critical judgment that our highly trained, professional troops exhibit every day. They need to be deployed with proper guardrails, which don’t exist today.

To our knowledge, these two exceptions have not been a barrier to accelerating the adoption and use of our models within our armed forces to date.

The Department of War has stated they will only contract with AI companies who accede to “any lawful use” and remove safeguards in the cases mentioned above. They have threatened to remove us from their systems if we maintain these safeguards; they have also threatened to designate us a “supply chain risk”—a label reserved for US adversaries, never before applied to an American company—and to invoke the Defense Production Act to force the safeguards’ removal. These latter two threats are inherently contradictory: one labels us a security risk; the other labels Claude as essential to national security.

Regardless, these threats do not change our position: we cannot in good conscience accede to their request.

It is the Department’s prerogative to select contractors most aligned with their vision. But given the substantial value that Anthropic’s technology provides to our armed forces, we hope they reconsider. Our strong preference is to continue to serve the Department and our warfighters—with our two requested safeguards in place. Should the Department choose to offboard Anthropic, we will work to enable a smooth transition to another provider, avoiding any disruption to ongoing military planning, operations, or other critical missions. Our models will be available on the expansive terms we have proposed for as long as required.

We remain ready to continue our work to support the national security of the United States.

...

Read the original on www.anthropic.com »

2 2,530 shares, 92 trendiness

We Will Not Be Divided

Current and former employees of Google and OpenAI are invited to sign. You may sign anonymously. All signatures are verified before being published. Enter the work email you’ll use to sign into the Google Form; it is used only to match your verification and is never published or shared.

Google Form email verification

After submitting, you’ll open a short Google Form and sign in with your work Google account (@google.com, @openai.com). This verifies your email without sending anything to your inbox.

Email verification

We’ll send a verification link to your work email. Note: the email will be visible in your inbox.

Alternative verification

For those who prefer not to use a work email or don’t have one (e.g. former employees). Upload a photo of a work badge, send us a message on Signal, point us to a co-signer who can vouch for you, or otherwise provide proof of employment.

Sign anonymously. Your name will not be published. Your signature will appear as “Anonymous [Role/Title if provided], verified [current/former] employee at [Company].” Only one organizer reviews anonymous signatures. Your personal data (name, email) is automatically deleted within 24 hours of verification.


Have you thought about broadening the requests to be more comprehensive?

The goal of this letter is to find common ground. The signatories likely have a diverse set of views. The current situation with the DoW is so clear-cut that it can bring together a very broad coalition. Signing this letter doesn’t mean you think it’s the only thing that needs to be done, just that you agree with the bottom line.

Who is behind this?

This letter was organized by a few citizens who are concerned about the potential misuse of AI against Americans. We are not affiliated with any political party, advocacy group, or organization. We are not affiliated with any AI company and are not paid.

Current and former employees of Google and OpenAI are invited to sign. We verify every signature to ensure authenticity. You may sign anonymously.

How is my data handled?

If you sign anonymously, your personal information (name, email) is automatically and permanently deleted from our database within 24 hours of verification. After deletion, only your anonymous public listing remains (e.g. “Anonymous, verified current employee at [Company]”). Only one organizer has access to review anonymous signatures during that 24-hour window. No one else can see your identity.

If you sign publicly, we store your name and affiliation to display on the letter. Email addresses used for verification are never published or shared.

What if I accidentally fill out the form twice?

Don’t worry. We de-duplicate non-anonymous signatures automatically, and anonymous signatures within 24 hours (before personal data is deleted). For anonymous signatories beyond 24 hours, we cannot verify there are no duplicates, though one human manually reads all signatures and will try hard to notice and correct any abuse of the system.

I signed anonymously but now want to put my name on it. How can I fix that?

Sign again using the “Alternative verification” method. In the verification details, mention that you previously signed anonymously and would like to switch to a named signature. We’ll update your entry and make sure you’re not double-counted.

How do you verify signatures?

Every signature is verified before it appears on the letter. If you sign using the Google Form or email verification options, we confirm that you have access to a @google.com or @openai.com email address. If you use alternative verification, an organizer manually reviews your proof of employment. No signature is published without verification.
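For the two email-based paths, the underlying check reduces to confirming that the verified address belongs to one of the two company domains. A minimal sketch of that check (a hypothetical helper, not the organizers' actual code):

```python
ALLOWED_DOMAINS = {"google.com", "openai.com"}

def is_allowed_work_email(address: str) -> bool:
    """Return True if the address's domain is one of the allowed company domains."""
    # Normalize case and whitespace, then take the domain after the "@".
    addr = address.strip().lower()
    if addr.count("@") != 1:
        return False
    domain = addr.rsplit("@", 1)[1]
    return domain in ALLOWED_DOMAINS
```

In practice the Google Form path proves control of the account via sign-in rather than string inspection; a domain check like this would only be the final gate.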

Have there been any mistakes in signature verification for this letter?

We are aware of two mistakes in our efforts to verify the signatures in the form so far. One person who was not an employee of OpenAI or Google found a bug in our verification system and signed falsely under the name “You guys are letting China Win”. This was noticed and fixed in under 10 minutes, and the verification system was improved to prevent mistakes like this from happening again. We also had two people submit twice in a way that our automatic de-duplication didn’t catch. We do periodic checks for this.

Because of anonymity considerations, all signatures are manually reviewed by one fallible human. We do our best to make sure we catch and correct any mistakes, but we are not perfect and will probably make mistakes. We will log those mistakes here as we find them.

...

Read the original on notdivided.org »

3 2,152 shares, 88 trendiness

Motorola's new partnership with GrapheneOS

Motorola, a Lenovo Company, announced the addition of new consumer and enterprise solutions to its portfolio today at Mobile World Congress. The company unveiled a partnership with the GrapheneOS Foundation to bring cutting-edge security to everyday users across the globe. In addition, Motorola introduced a new Moto Secure feature and Moto Analytics to expand Motorola’s B2B ecosystem with advanced security and deeper operational insights for organizations across industries. These announcements reinforce Motorola’s commitment to delivering intelligent, highly capable technology with enhanced security for customers worldwide.

GrapheneOS Foundation Partnership

Motorola is introducing a new era of smartphone security through a long-term partnership with the GrapheneOS Foundation, the leading nonprofit in advanced mobile security and creators of a hardened operating system based on the Android Open Source Project. Together, Motorola and the GrapheneOS Foundation will work to strengthen smartphone security and collaborate on future devices engineered with GrapheneOS compatibility.

“We are thrilled to be partnering with Motorola to bring GrapheneOS’s industry-leading privacy and security-focused mobile operating system to their next-generation smartphone,” said a spokesperson at GrapheneOS. “This collaboration marks a significant milestone in expanding the reach of GrapheneOS, and we applaud Motorola for taking this meaningful step towards advancing mobile security.”

By combining GrapheneOS’s pioneering engineering with Motorola’s decades of security expertise, real-world user insights, and Lenovo’s ThinkShield solutions, the collaboration will advance a new generation of privacy and security technologies. In the coming months, Motorola and the GrapheneOS Foundation will continue to collaborate on joint research, software enhancements, and new security capabilities, with more details and solutions to roll out as the partnership evolves.

Moto Analytics

Today, Motorola also introduced Moto Analytics, an enterprise-grade analytics platform designed to give IT administrators real-time visibility into device performance across their fleet. Unlike traditional EMM tools that focus primarily on access control, Moto Analytics provides deep operational insights, from app stability to battery health and connectivity performance.

With this data, IT teams can troubleshoot more efficiently, prevent issues before they escalate, and maintain employee productivity. As part of the ThinkShield ecosystem, Moto Analytics integrates seamlessly with existing enterprise environments and scales effortlessly as organizations grow.

Private Image Data

Motorola is also expanding its Moto Secure platform with a new feature, Private Image Data. This tool gives users greater control over the hidden data stored in their photos. When enabled, it automatically removes sensitive metadata from all new camera images on the device, helping protect details like location and device information. This protection runs quietly in the background, preserving the image itself while clearing some of the private data attached to it.

Private Image Data joins a growing set of protections within the Moto Secure app, Motorola’s central hub for essential privacy and security tools powered by ThinkShield. From managing app permissions to securing sensitive files and monitoring device integrity, Moto Secure brings key Android and Motorola safeguards together in one place, making it easier for users to understand and manage their device’s security.

Private Image Data will begin rolling out to Motorola signature devices in the coming months, with additional updates and refinements expected over time.

With the introduction of these new solutions, Motorola is expanding its enterprise portfolio with offerings built for today’s most demanding business environments. From advanced security to operational efficiency and intelligent device management, these innovations reflect Motorola’s commitment to empowering organizations with technology that is security-focused, reliable, and ready for the future.

Legal Disclaimers

Certain features, functionality, and product specifications may be network-dependent and subject to additional terms, conditions, and charges. All are subject to change without notice. MOTOROLA, the Stylized M Logo, MOTO, and the MOTO family of marks are trademarks of Motorola Trademark Holdings, LLC. LENOVO and THINKSHIELD are trademarks of Lenovo. Android is a trademark of Google, LLC. All other trademarks are the property of their respective owners. ©2026 Motorola Mobility LLC. All rights reserved.

...

Read the original on motorolanews.com »

4 1,842 shares, 73 trendiness

Say hello to MacBook Neo


Apple’s all-new MacBook features a durable aluminum design, a stunning 13-inch Liquid Retina display, the power of Apple silicon, and all-day battery life — all for the breakthrough starting price of just $599

CUPERTINO, CALIFORNIA

Apple today unveiled MacBook Neo, an all-new laptop that delivers the magic of the Mac at a breakthrough price, making it even more accessible to millions of people around the world. MacBook Neo starts with a beautiful Apple design, featuring a durable aluminum enclosure in an array of gorgeous colors — blush, indigo, silver, and a fresh new citrus. Its stunning 13-inch Liquid Retina display brings websites, photos, videos, and apps to life with high resolution and brightness, and support for 1 billion colors. Powered by A18 Pro, MacBook Neo can fly through everyday tasks, from browsing the web and streaming content, to editing photos, exploring creative hobbies, or using AI capabilities across apps. In fact, it’s up to 50 percent faster for everyday tasks like web browsing,1 and up to 3x faster when running on-device AI workloads like applying advanced effects to photos,2 compared to the bestselling PC with the latest shipping Intel Core Ultra 5. Providing up to 16 hours of battery life, MacBook Neo allows users to go all day on a single charge.3 A 1080p FaceTime HD camera and dual mics make it easy to look and sound great, and the dual side-firing speakers with Spatial Audio deliver crisp, immersive sound. MacBook Neo also features Apple’s renowned Magic Keyboard for comfortable and precise typing, and a large Multi-Touch trackpad with support for intuitive gestures, enabling smooth and precise control. Completing the MacBook Neo experience is macOS Tahoe, with powerful built-in apps like Messages, Pages, Calendar, and Safari; seamless integration with iPhone; Apple Intelligence; as well as broad compatibility with third-party apps. And starting at just $599 and $499 for education, MacBook Neo is Apple’s most affordable laptop ever, providing an unprecedented combination of quality and value.

MacBook Neo is available to pre-order starting today, with availability beginning Wednesday, March 11.

“We’re incredibly excited to introduce MacBook Neo, which delivers the magic of the Mac at a breakthrough price,” said John Ternus, Apple’s senior vice president of Hardware Engineering. “Built from the ground up to be more affordable for even more people, MacBook Neo is a laptop only Apple could create. It features a durable aluminum design in four beautiful colors; a brilliant Liquid Retina display; Apple silicon-powered performance; all-day battery life; a high-quality camera, mics, and speakers; a Magic Keyboard and Multi-Touch trackpad; and the intuitive and powerful features of macOS. There is simply no other laptop like it.”

MacBook Neo provides an unmatched combination of quality and affordability for students, families, small business owners, new Mac users, and more.

MacBook Neo comes in four beautiful colors — blush, indigo, silver, and citrus.

With A18 Pro, MacBook Neo can power through a wide range of everyday tasks, from browsing the web to sending emails and effortlessly multitasking between apps. A18 Pro features a 5-core GPU to facilitate smooth performance for everything from FaceTime calls to casual gameplay.

MacBook Neo delivers up to 16 hours of battery life on a single charge, making it a perfect on-the-go companion for school, work, or play.

Customers can pre-order the new MacBook Neo starting today at apple.com/store and in the Apple Store app in 30 countries and regions, including the U.S. It will begin arriving to customers, and will be in Apple Store locations and Apple Authorized Resellers, starting Wednesday, March 11.

MacBook Neo starts at $599 (U.S.) and $499 (U.S.) for education. It is available in four colors — blush, indigo, silver, and citrus. Additional technical specifications, configure-to-order options, and accessories are available at apple.com/mac.

With Apple Trade In, customers can trade in their current computer and get credit toward a new Mac. Customers can visit apple.com/shop/trade-in to see what their device is worth.

AppleCare delivers exceptional service and support, with flexible options for Apple users. Customers can choose AppleCare+ to cover their new Mac, or in the U.S., AppleCare One to protect multiple products in one simple plan. Both plans include coverage for accidents like drops and spills, theft and loss protection on eligible products, battery replacement service, and 24/7 support from Apple Experts. For more information, visit apple.com/applecare.

Every customer who buys directly from Apple Retail gets access to Personal Setup. In these guided online sessions, a Specialist can walk them through setup, or focus on features that help them make the most of their new device. Customers can also learn more about getting started and going further with their new device with a Today at Apple session at their nearest Apple Store.

Customers in the U.S. who shop at Apple using Apple Card can pay monthly at 0 percent APR when they choose to check out with Apple Card Monthly Installments, and they’ll get 3 percent Daily Cash back — all up front. More information — including details on eligibility, exclusions, and Apple Card terms — is available at apple.com/apple-card/monthly-installments.


MacBook Neo features a beautifully crafted aluminum design that’s built to last. With its soft, rounded corners, MacBook Neo looks elegant while feeling solid and comfortable to hold. At just 2.7 pounds, it’s also easy to carry in a backpack or handbag. Bringing a fun touch of personality and style to everyday computing, MacBook Neo comes in a spectrum of four gorgeous colors: blush, indigo, silver, and citrus. These colors extend to the Magic Keyboard in lighter shades and new wallpapers, creating a cohesive design aesthetic and making MacBook Neo the most colorful MacBook yet.

A gorgeous 13-inch Liquid Retina display features a 2408-by-1506 resolution, 500 nits of brightness, and support for 1 billion colors, bringing to life sharp, crystal-clear text and vibrant images. The display is both brighter and higher in resolution than most PC laptops in this price range, putting it in a class of its own. An anti-reflective coating provides a comfortable viewing experience in a variety of lighting conditions, allowing users to watch movies, edit photos, or take video calls from anywhere.

At the heart of MacBook Neo is A18 Pro, enabling users to power through things they do every day, like browsing the web, creating documents, streaming content, editing photos, and taking advantage of AI. Users can seamlessly work between their favorite apps, like Messages, WhatsApp, Canva, Excel, Safari, and more. MacBook Neo with A18 Pro is up to 50 percent faster for everyday tasks than the bestselling PC with the latest shipping Intel Core Ultra 5.1 And for more demanding activities, it’s up to 3x faster for on-device AI workloads2 and up to 2x faster for tasks like photo editing.4 The integrated 5-core GPU brings graphics to life while playing action-packed games or exploring creative hobbies. And a 16-core Neural Engine supports fast on-device Apple Intelligence features and everyday AI tasks like summarizing notes in Bear or using the Clean Up tool in the Photos app, while ensuring user data stays private and secure. MacBook Neo is also fanless, so it runs completely silent.

Thanks to the incredible power efficiency of Apple silicon, MacBook Neo delivers up to 16 hours of battery life on a single charge.3 This makes it a perfect on-the-go companion for work or play, from the classroom to the coffee shop, and everywhere in between.

MacBook Neo features Apple’s much-loved Magic Keyboard, which provides a comfortable, precise typing experience, while a large Multi-Touch trackpad lets users click, scroll, swipe, and pinch anywhere on its surface. The MacBook Neo model with Touch ID enables easy, quick, and secure login authentication, and the ability to conveniently authorize purchases using Apple Pay.

The 1080p FaceTime HD camera on MacBook Neo has optimized image processing to deliver vibrant video calls. Dual mics with directional beamforming are designed to reduce background noise and isolate a user’s voice, allowing it to come across loud and clear for an excellent video conferencing experience. And dual side-firing speakers with support for Spatial Audio and Dolby Atmos produce immersive sound for watching a movie, listening to music, or using apps like GarageBand.

MacBook Neo features two USB-C ports for connecting accessories or an external display.5 Both ports can be used for charging. MacBook Neo also includes a headphone jack for wired audio. Wi-Fi 6E provides fast wireless connectivity, and Bluetooth 6 ensures reliable wireless connections for peripherals and accessories.

macOS is Apple’s powerful and intuitive operating system for Mac.6 With incredible features and built-in apps like Safari, Photos, Messages, and FaceTime, macOS enables users to get started right out of the box. Apple Intelligence features like Writing Tools, Live Translation, and more are deeply integrated across macOS, elevating the user experience by bringing intelligence to the apps users rely on every day.7 Advanced privacy and security also come standard, featuring industry-leading encryption, robust virus protections, and automatic free security updates to help keep users protected.

iPhone users can tap into Continuity features built into macOS to make working across iPhone and Mac a breeze. Handoff lets users start a task on MacBook Neo and continue it on iPhone, while Universal Clipboard allows users to copy and paste content between devices. With iPhone Mirroring, users can view and interact with their iPhone directly on MacBook Neo, and users switching to Mac for the first time can use iPhone to conveniently and securely transfer settings, files, photos, passwords, and more.

Built with the Environment in Mind

MacBook Neo was built from the ground up to be Apple’s low­est-car­bon MacBook, and brings the com­pany even closer to reach­ing its am­bi­tious plan to be car­bon neu­tral across its en­tire foot­print by 2030. It fea­tures 60 per­cent re­cy­cled con­tent — the high­est per­cent­age of any Apple prod­uct.8 This in­cludes 90 per­cent re­cy­cled alu­minum over­all and 100 per­cent re­cy­cled cobalt in the bat­tery. The en­clo­sure is man­u­fac­tured with a ma­te­r­ial-ef­fi­cient form­ing process that uses 50 per­cent less alu­minum com­pared to tra­di­tional ma­chin­ing meth­ods. MacBook Neo is man­u­fac­tured with 45 per­cent re­new­able elec­tric­ity, like wind and so­lar, across the sup­ply chain. It also meets Apple’s high stan­dards for en­ergy ef­fi­ciency and safe chem­istry. Additionally, the pa­per pack­ag­ing is 100 per­cent fiber-based and can be eas­ily re­cy­cled.9

Customers can pre-or­der the new MacBook Neo start­ing to­day at ap­ple.com/​store and in the Apple Store app in 30 coun­tries and re­gions, in­clud­ing the U.S. It will be­gin ar­riv­ing to cus­tomers, and will be in Apple Store lo­ca­tions and Apple Authorized Resellers, start­ing Wednesday, March 11.

MacBook Neo starts at $599 (U.S.) and $499 (U.S.) for ed­u­ca­tion. It is avail­able in four col­ors — blush, in­digo, sil­ver, and cit­rus. Additional tech­ni­cal spec­i­fi­ca­tions, con­fig­ure-to-or­der op­tions, and ac­ces­sories are avail­able at ap­ple.com/​mac.

With Apple Trade In, cus­tomers can trade in their cur­rent com­puter and get credit to­ward a new Mac. Customers can visit ap­ple.com/​shop/​trade-in to see what their de­vice is worth.

AppleCare de­liv­ers ex­cep­tional ser­vice and sup­port, with flex­i­ble op­tions for Apple users. Customers can choose AppleCare+ to cover their new Mac, or in the U.S., AppleCare One to pro­tect mul­ti­ple prod­ucts in one sim­ple plan. Both plans in­clude cov­er­age for ac­ci­dents like drops and spills, theft and loss pro­tec­tion on el­i­gi­ble prod­ucts, bat­tery re­place­ment ser­vice, and 24/7 sup­port from Apple Experts. For more in­for­ma­tion, visit ap­ple.com/​ap­ple­care.

Every cus­tomer who buys di­rectly from Apple Retail gets ac­cess to Personal Setup. In these guided on­line ses­sions, a Specialist can walk them through setup, or fo­cus on fea­tures that help them make the most of their new de­vice. Customers can also learn more about get­ting started and go­ing fur­ther with their new de­vice with a Today at Apple ses­sion at their near­est Apple Store.

Customers in the U.S. who shop at Apple us­ing Apple Card can pay monthly at 0 per­cent APR when they choose to check out with Apple Card Monthly Installments, and they’ll get 3 per­cent Daily Cash back — all up front. More in­for­ma­tion — in­clud­ing de­tails on el­i­gi­bil­ity, ex­clu­sions, and Apple Card terms — is avail­able at ap­ple.com/​ap­ple-card/​monthly-in­stall­ments.

About Apple

Apple rev­o­lu­tion­ized per­sonal tech­nol­ogy with the in­tro­duc­tion of the Macintosh in 1984. Today, Apple leads the world in in­no­va­tion with iPhone, iPad, Mac, AirPods, Apple Watch, and Apple Vision Pro. Apple’s six soft­ware plat­forms — iOS, iPa­dOS, ma­cOS, watchOS, vi­sionOS, and tvOS — pro­vide seam­less ex­pe­ri­ences across all Apple de­vices and em­power peo­ple with break­through ser­vices in­clud­ing the App Store, Apple Music, Apple Pay, iCloud, and Apple TV. Apple’s more than 150,000 em­ploy­ees are ded­i­cated to mak­ing the best prod­ucts on earth and to leav­ing the world bet­ter than we found it.

Testing was con­ducted by Apple in January and February 2026 us­ing pre­pro­duc­tion MacBook Neo sys­tems with Apple A18 Pro, 6-core CPU, 5-core GPU, 8GB of uni­fied mem­ory, and 256GB SSD, as well as pro­duc­tion Intel Core Ultra 5-based PC sys­tems with Intel Graphics, 8GB of RAM, 256GB SSD, and the lat­est ver­sion of Windows 11 Home avail­able at the time of test­ing. Bestselling PC lap­top with the lat­est ship­ping Intel Core Ultra 5 proces­sor is based on pub­licly avail­able sales data over the prior six months. Speedometer 3.1 per­for­mance bench­mark tested with pre-re­lease Safari 26.3 on ma­cOS Tahoe, and both Chrome 144.0.7559.110 and Edge 144.0.3719.104 on Windows 11 Home. Performance tests are con­ducted us­ing spe­cific com­puter sys­tems and re­flect the ap­prox­i­mate per­for­mance of MacBook Neo.

Testing was con­ducted by Apple in January and February 2026 us­ing pre­pro­duc­tion MacBook Neo sys­tems with Apple A18 Pro, 6-core CPU, 5-core GPU, 8GB of uni­fied mem­ory, and 256GB SSD, as well as pro­duc­tion Intel Core Ultra 5-based PC sys­tems with Intel Graphics, 8GB of RAM, 256GB SSD, and the lat­est ver­sion of Windows 11 Home avail­able at the time of test­ing. Bestselling PC lap­top with the lat­est ship­ping Intel Core Ultra 5 proces­sor is based on pub­licly avail­able sales data over the prior six months. Adobe Photoshop 2026 27.3.0 tested us­ing the fol­low­ing fil­ters and func­tions: su­per zoom, depth blur, JPEG ar­ti­fact re­moval, style trans­fer, photo restora­tion, and land­scape mixer. Performance tests are con­ducted us­ing spe­cific com­puter sys­tems and re­flect the ap­prox­i­mate per­for­mance of MacBook Neo.

Testing was con­ducted by Apple in January 2026 us­ing pre­pro­duc­tion MacBook Neo sys­tems with Apple A18 Pro, 6-core CPU, 5-core GPU, 8GB of uni­fied mem­ory, and 256GB SSD. Wireless web bat­tery life tested by brows­ing 25 pop­u­lar web­sites while con­nected to Wi-Fi. Video stream­ing bat­tery life tested with 1080p con­tent in Safari while con­nected to Wi-Fi. All sys­tems tested with dis­play bright­ness set to eight clicks from bot­tom. Battery life varies by use and con­fig­u­ra­tion. See ap­ple.com/​bat­ter­ies for more in­for­ma­tion.

Testing was con­ducted by Apple in January and February 2026 us­ing pre­pro­duc­tion MacBook Neo sys­tems with Apple A18 Pro, 6-core CPU, 5-core GPU, 8GB of uni­fied mem­ory, and 256GB SSD, as well as pro­duc­tion Intel Core Ultra 5-based PC sys­tems with Intel Graphics, 8GB of RAM, 256GB SSD, and the lat­est ver­sion of Windows 11 Home avail­able at the time of test­ing. Bestselling PC lap­top with the lat­est ship­ping Intel Core Ultra 5 proces­sor is based on pub­licly avail­able sales data over the prior six months. Tested with Affinity v3.0.3.4027 us­ing the built-in bench­mark 30000. Performance tests are con­ducted us­ing spe­cific com­puter sys­tems and re­flect the ap­prox­i­mate per­for­mance of MacBook Neo.

MacBook Neo fea­tures two USB-C ports — USB 3 (left) and USB 2 (right). External dis­play con­nec­tiv­ity sup­ported on left USB 3 port only.

ma­cOS Tahoe is avail­able as a free soft­ware up­date. Some fea­tures may not be avail­able in all re­gions or in all lan­guages. See re­quire­ments at ap­ple.com/​os/​ma­cos.

Apple Intelligence is avail­able in beta with sup­port for these lan­guages: English, Danish, Dutch, French, German, Italian, Norwegian, Portuguese, Spanish, Swedish, Turkish, Vietnamese, Chinese (simplified), Chinese (traditional), Japanese, and Korean. Some fea­tures may not be avail­able in all re­gions or lan­guages. For fea­ture and lan­guage avail­abil­ity and sys­tem re­quire­ments, see sup­port.ap­ple.com/​en-us/​121115.

Product re­cy­cled or re­new­able con­tent is the mass of cer­ti­fied re­cy­cled ma­te­r­ial rel­a­tive to the over­all mass of the de­vice, not in­clud­ing pack­ag­ing or in-box ac­ces­sories. Comparison ex­cludes ac­ces­sories.

Breakdown of U.S. re­tail pack­ag­ing by weight. Adhesives, inks, and coat­ings are ex­cluded from cal­cu­la­tions.


...

Read the original on www.apple.com »

5 1,684 shares, 70 trendiness

microgpt

This is a brief guide to my new art pro­ject mi­crogpt, a sin­gle file of 200 lines of pure Python with no de­pen­den­cies that trains and in­fer­ences a GPT. This file con­tains the full al­go­rith­mic con­tent of what is needed: dataset of doc­u­ments, to­k­enizer, au­to­grad en­gine, a GPT-2-like neural net­work ar­chi­tec­ture, the Adam op­ti­mizer, train­ing loop, and in­fer­ence loop. Everything else is just ef­fi­ciency. I can­not sim­plify this any fur­ther. This script is the cul­mi­na­tion of mul­ti­ple pro­jects (micrograd, make­more, nanogpt, etc.) and a decade-long ob­ses­sion to sim­plify LLMs to their bare es­sen­tials, and I think it is beau­ti­ful 🥹. It even breaks per­fectly across 3 columns:

Where to find it:

This GitHub gist has the full source code: mi­crogpt.py

It’s also avail­able on this web page: https://​karpa­thy.ai/​mi­crogpt.html

Also avail­able as a Google Colab note­book

The fol­low­ing is my guide on step­ping an in­ter­ested reader through the code.

The fuel of large lan­guage mod­els is a stream of text data, op­tion­ally sep­a­rated into a set of doc­u­ments. In pro­duc­tion-grade ap­pli­ca­tions, each doc­u­ment would be an in­ter­net web page but for mi­crogpt we use a sim­pler ex­am­ple of 32,000 names, one per line:

# Let there be an input dataset `docs`: list[str] of documents (e.g. a dataset of names)
if not os.path.exists('input.txt'):
    import urllib.request
    names_url = 'https://raw.githubusercontent.com/karpathy/makemore/refs/heads/master/names.txt'
    urllib.request.urlretrieve(names_url, 'input.txt')
docs = [l.strip() for l in open('input.txt').read().strip().split('\n') if l.strip()] # list[str] of documents
random.shuffle(docs)
print(f"num docs: {len(docs)}")

The dataset looks like this. Each name is a doc­u­ment:

The goal of the model is to learn the pat­terns in the data and then gen­er­ate sim­i­lar new doc­u­ments that share the sta­tis­ti­cal pat­terns within. As a pre­view, by the end of the script our model will gen­er­ate (“hallucinate”!) new, plau­si­ble-sound­ing names. Skipping ahead, we’ll get:

It doesn't look like much, but from the perspective of a model like ChatGPT, your conversation with it is just a funny-looking "document". When you initialize the document with your prompt, the model's response from its perspective is just a statistical document completion.

Under the hood, neural net­works work with num­bers, not char­ac­ters, so we need a way to con­vert text into a se­quence of in­te­ger to­ken ids and back. Production to­k­eniz­ers like tik­to­ken (used by GPT-4) op­er­ate on chunks of char­ac­ters for ef­fi­ciency, but the sim­plest pos­si­ble to­k­enizer just as­signs one in­te­ger to each unique char­ac­ter in the dataset:

# Let there be a Tokenizer to translate strings to discrete symbols and back
uchars = sorted(set(''.join(docs))) # unique characters in the dataset become token ids 0..n-1
BOS = len(uchars) # token id for the special Beginning of Sequence (BOS) token
vocab_size = len(uchars) + 1 # total number of unique tokens, +1 is for BOS
print(f"vocab size: {vocab_size}")

In the code above, we collect all unique characters across the dataset (which are just the lowercase letters a-z), sort them, and each letter gets an id by its index. Note that the integer values themselves have no meaning at all; each token is just a separate discrete symbol. Instead of 0, 1, 2 they might as well be different emoji. In addition, we create one more special token called BOS (Beginning of Sequence), which acts as a delimiter: it tells the model "a new document starts/ends here". Later during training, each document gets wrapped with BOS on both sides: [BOS, e, m, m, a, BOS]. The model learns that BOS initiates a new name, and that another BOS ends it. Therefore, we have a final vocabulary of 27 tokens (26 possible lowercase characters a-z, +1 for the BOS token).
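To make the mapping concrete, here is a toy sketch of the same scheme on a three-name dataset (the `stoi`/`itos` lookup tables are illustrative names, not identifiers from microgpt.py):

```python
# Toy character tokenizer: same scheme as microgpt, on a 3-name dataset.
docs = ["emma", "olivia", "ava"]
uchars = sorted(set("".join(docs)))            # unique characters become token ids 0..n-1
stoi = {ch: i for i, ch in enumerate(uchars)}  # char -> id (illustrative name)
itos = {i: ch for ch, i in stoi.items()}       # id -> char (illustrative name)
BOS = len(uchars)                              # special delimiter token

tokens = [BOS] + [stoi[ch] for ch in "emma"] + [BOS]  # wrap the document in BOS
decoded = "".join(itos[t] for t in tokens[1:-1])
print(tokens, decoded)  # [7, 1, 4, 4, 0, 7] emma
```

With only 7 unique characters in this toy dataset, BOS gets id 7; in the real dataset the same logic yields ids 0-25 for a-z and 26 for BOS.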

Training a neural network requires gradients: for each parameter in the model, we need to know "if I nudge this number up a little, does the loss go up or down, and by how much?". The computation graph has many inputs (the model parameters and the input tokens) but funnels down to a single scalar output: the loss (we'll define exactly what the loss is below). Backpropagation starts at that single output and works backwards through the graph, computing the gradient of the loss with respect to every input. It relies on the chain rule from calculus. In production, libraries like PyTorch handle this automatically. Here, we implement it from scratch in a single class called Value:

class Value:
    __slots__ = ('data', 'grad', '_children', '_local_grads')
    def __init__(self, data, children=(), local_grads=()):
        self.data = data # scalar value of this node calculated during forward pass
        self.grad = 0 # derivative of the loss w.r.t. this node, calculated in backward pass
        self._children = children # children of this node in the computation graph
        self._local_grads = local_grads # local derivative of this node w.r.t. its children
    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1, 1))
    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))
    def __pow__(self, other): return Value(self.data**other, (self,), (other * self.data**(other-1),))
    def log(self): return Value(math.log(self.data), (self,), (1/self.data,))
    def exp(self): return Value(math.exp(self.data), (self,), (math.exp(self.data),))
    def relu(self): return Value(max(0, self.data), (self,), (float(self.data > 0),))
    def __neg__(self): return self * -1
    def __radd__(self, other): return self + other
    def __sub__(self, other): return self + (-other)
    def __rsub__(self, other): return other + (-self)
    def __rmul__(self, other): return self * other
    def __truediv__(self, other): return self * other**-1
    def __rtruediv__(self, other): return other * self**-1
    def backward(self):
        topo = []
        visited = set()
        def build_topo(v):
            if v not in visited:
                visited.add(v)
                for child in v._children:
                    build_topo(child)
                topo.append(v)
        build_topo(self)
        self.grad = 1
        for v in reversed(topo):
            for child, local_grad in zip(v._children, v._local_grads):
                child.grad += local_grad * v.grad

I re­al­ize that this is the most math­e­mat­i­cally and al­go­rith­mi­cally in­tense part and I have a 2.5 hour video on it: mi­cro­grad video. Briefly, a Value wraps a sin­gle scalar num­ber (.data) and tracks how it was com­puted. Think of each op­er­a­tion as a lit­tle lego block: it takes some in­puts, pro­duces an out­put (the for­ward pass), and it knows how its out­put would change with re­spect to each of its in­puts (the lo­cal gra­di­ent). That’s all the in­for­ma­tion au­to­grad needs from each block. Everything else is just the chain rule, string­ing the blocks to­gether.

Every time you do math with Value objects (add, multiply, etc.), the result is a new Value that remembers its inputs (_children) and the local derivative of that operation (_local_grads). For example, __mul__ records that \(\frac{\partial(a \cdot b)}{\partial a} = b\) and \(\frac{\partial(a \cdot b)}{\partial b} = a\). The full set of lego blocks:

The backward() method walks this graph in reverse topological order (starting from the loss, ending at the parameters), applying the chain rule at each step. If the loss is \(L\) and a node \(v\) has a child \(c\) with local gradient \(\frac{\partial v}{\partial c}\), then:

\[\frac{\partial L}{\partial c} \mathrel{+}= \frac{\partial v}{\partial c} \cdot \frac{\partial L}{\partial v}\]

This looks a bit scary if you're not comfortable with calculus, but it is literally just multiplying two numbers in an intuitive way. One way to see it: "If a car travels twice as fast as a bicycle and the bicycle is four times as fast as a walking man, then the car travels 2 x 4 = 8 times as fast as the man." The chain rule is the same idea: you multiply the rates of change along the path.

We kick things off by set­ting self.grad = 1 at the loss node, be­cause \(\frac{\partial L}{\partial L} = 1\): the loss’s rate of change with re­spect to it­self is triv­ially 1. From there, the chain rule just mul­ti­plies lo­cal gra­di­ents along every path back to the pa­ra­me­ters.

Note the += (accumulation, not as­sign­ment). When a value is used in mul­ti­ple places in the graph (i.e. the graph branches), gra­di­ents flow back along each branch in­de­pen­dently and must be summed. This is a con­se­quence of the mul­ti­vari­able chain rule: if \(c\) con­tributes to \(L\) through mul­ti­ple paths, the to­tal de­riv­a­tive is the sum of con­tri­bu­tions from each path.

After back­ward() com­pletes, every Value in the graph has a .grad con­tain­ing \(\frac{\partial L}{\partial v}\), which tells us how the fi­nal loss would change if we nudged that value.

Here’s a con­crete ex­am­ple. Note that a is used twice (the graph branches), so its gra­di­ent is the sum of both paths:

a = Value(2.0)
b = Value(3.0)
c = a * b # c = 6.0
L = c + a # L = 8.0
L.backward()
print(a.grad) # 4.0 (dL/da = b + 1 = 3 + 1, via both paths)
print(b.grad) # 2.0 (dL/db = a = 2)
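The same two numbers can be sanity-checked numerically with finite differences (this check is not part of microgpt, just a quick way to confirm the chain rule got it right):

```python
# Finite-difference check of the gradients for L = a*b + a at a=2, b=3.
def loss(a, b):
    return a * b + a

eps = 1e-6
dL_da = (loss(2 + eps, 3) - loss(2 - eps, 3)) / (2 * eps)  # should be ~4.0
dL_db = (loss(2, 3 + eps) - loss(2, 3 - eps)) / (2 * eps)  # should be ~2.0
print(round(dL_da, 3), round(dL_db, 3))
```

This wiggle-and-measure approach is exactly what autograd avoids: it gets the same answer analytically, in a single backward pass.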

This is exactly what PyTorch's .backward() gives you. It is the same algorithm that loss.backward() runs, just on scalars instead of tensors (arrays of scalars): algorithmically identical, significantly smaller and simpler, but of course a lot less efficient.

Let's spell out what .backward() gives us above. Autograd calculated that if L = a*b + a, and a=2 and b=3, then a.grad = 4.0 tells us about the local influence of a on L: if you wiggle the input a, in which direction does L change? Here, the derivative of L w.r.t. a is 4.0, meaning that if we increase a by a tiny amount (say 0.001), L would increase by about 4x that (0.004). Similarly, b.grad = 2.0 means the same nudge to b would increase L by about 2x that (0.002). In other words, these gradients tell us the direction (positive or negative, depending on the sign) and the steepness (the magnitude) of the influence of each individual input on the final output (the loss). This then allows us to iteratively nudge the parameters of our neural network to lower the loss, and hence improve its predictions.
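To see that nudging in action, here is one hand-rolled gradient-descent step on the same toy expression (plain floats for illustration; the step size 0.1 is an arbitrary choice, not from the script):

```python
# One gradient-descent step on L = a*b + a, starting at a=2, b=3 (so L = 8).
a, b = 2.0, 3.0
dL_da, dL_db = b + 1, a   # the gradients computed above: 4.0 and 2.0
lr = 0.1                  # step size (hypothetical, for illustration)
a -= lr * dL_da           # move each input against its gradient
b -= lr * dL_db
L_new = a * b + a
print(L_new)              # ~6.08, down from 8
```

Training does exactly this, except the nudged numbers are the model's parameters and the gradients come from backward().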

The pa­ra­me­ters are the knowl­edge of the model. They are a large col­lec­tion of float­ing point num­bers (wrapped in Value for au­to­grad) that start out ran­dom and are it­er­a­tively op­ti­mized dur­ing train­ing. The ex­act role of each pa­ra­me­ter will make more sense once we de­fine the model ar­chi­tec­ture be­low, but for now we just need to ini­tial­ize them:

n_embd = 16 # embedding dimension
n_head = 4 # number of attention heads
n_layer = 1 # number of layers
block_size = 16 # maximum sequence length
head_dim = n_embd // n_head # dimension of each head
matrix = lambda nout, nin, std=0.08: [[Value(random.gauss(0, std)) for _ in range(nin)] for _ in range(nout)]
state_dict = {'wte': matrix(vocab_size, n_embd), 'wpe': matrix(block_size, n_embd), 'lm_head': matrix(vocab_size, n_embd)}
for i in range(n_layer):
    state_dict[f'layer{i}.attn_wq'] = matrix(n_embd, n_embd)
    state_dict[f'layer{i}.attn_wk'] = matrix(n_embd, n_embd)
    state_dict[f'layer{i}.attn_wv'] = matrix(n_embd, n_embd)
    state_dict[f'layer{i}.attn_wo'] = matrix(n_embd, n_embd)
    state_dict[f'layer{i}.mlp_fc1'] = matrix(4 * n_embd, n_embd)
    state_dict[f'layer{i}.mlp_fc2'] = matrix(n_embd, 4 * n_embd)
params = [p for mat in state_dict.values() for row in mat for p in row]
print(f"num params: {len(params)}")
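As a sanity check on the initialization above, here is a self-contained sketch that rebuilds the same state_dict shapes with plain floats (the Value wrapper is omitted for brevity) and verifies the parameter count against a closed-form formula. The vocab_size here is an assumed value for illustration; the post defines its own.

```python
import random

vocab_size = 27  # assumed for illustration; the post defines its own value
n_embd, n_head, n_layer, block_size = 16, 4, 1, 16

# same shapes as the init above, but with plain floats instead of Value
matrix = lambda nout, nin, std=0.08: [[random.gauss(0, std) for _ in range(nin)] for _ in range(nout)]

state_dict = {
    'wte': matrix(vocab_size, n_embd),      # token embeddings
    'wpe': matrix(block_size, n_embd),      # positional embeddings
    'lm_head': matrix(vocab_size, n_embd),  # output projection
}
for i in range(n_layer):
    for name in ['attn_wq', 'attn_wk', 'attn_wv', 'attn_wo']:
        state_dict[f'layer{i}.{name}'] = matrix(n_embd, n_embd)
    state_dict[f'layer{i}.mlp_fc1'] = matrix(4 * n_embd, n_embd)
    state_dict[f'layer{i}.mlp_fc2'] = matrix(n_embd, 4 * n_embd)

params = [p for mat in state_dict.values() for row in mat for p in row]

# Closed form: wte + lm_head contribute 2*V*E, wpe contributes T*E, and each
# layer has 4 attention matrices (E*E each) plus the MLP (4E*E + E*4E) = 12*E^2.
expected = 2 * vocab_size * n_embd + block_size * n_embd + n_layer * 12 * n_embd ** 2
assert len(params) == expected
print(f"num params: {len(params)}")  # 4192 with these settings
```

With these settings the model is tiny: 12*E^2 = 3072 parameters per transformer layer, dwarfing the 256-parameter positional table, which is why embedding dimension dominates model size.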

...

Read the original on karpathy.github.io »

6 1,357 shares, 54 trendiness

She Came Out of the Bathroom Naked, Employee Says

“There are also sex scenes filmed with the smart glasses — someone is wearing them while having sex. That is why this is so extremely sensitive. There are cameras everywhere in our office, and you are not allowed to bring your own phone or any device that can record,” an employee says.

In order to answer questions and interpret what the camera sees, the glasses require that data be processed via Meta’s infrastructure — it is not possible to interact with the AI solely locally on the phone.

We contacted Synsam and Synoptik for an interview about what training the sales staff receive and how it can be that the answers they give are so different. Synsam responded in writing that its role is to inform customers about the applicable terms and to provide internal training, but that responsibility for complying with Swedish law and Meta’s terms ultimately rests with the wearer. Synoptik responded in similar terms, saying its staff are trained in ethics and emphasise the user’s responsibility.

But for the AI assistant to function, voice, text, image and sometimes video must be processed and may be shared onwards. This data processing is done automatically and cannot be turned off. It is not specified how much data may be analysed or for how long it may be stored. Nor is it specified who is given access to the data.

Where do the images come from? Can private videos from Sweden end up on screens in Kenya? Have those who appear in the images consented to appearing in this way?

“Many believe that data must be stored within the EU to be protected. But under GDPR it does not matter where the server is located — as long as the country meets the EU’s requirements. If it does not, data may not be sent there.”

“Technically, we have data centres in Sweden, Denmark and Ireland, but the physical location is actually less relevant. The legal responsibility lies with Meta Ireland, which is the European entity. Where the data is actually processed — in Europe or in the US — does not change the regulatory framework.”

“For it to be permitted to use a service provider in a third country (outside the EU), it is required that robust agreements with instructions are in place. It must also be ensured that there is legal support for the transfers, so that the data that is transferred receives continued strong and equivalent protection when it is processed in a third country. The protection must therefore not become weaker when it is processed by subcontractors,” says Petra Wierup.


...

Read the original on www.svd.se »

7 1,248 shares, 51 trendiness

p5.js Web Editor

...

Read the original on editor.p5js.org »

8 1,214 shares, 51 trendiness

GrapheneOS (@GrapheneOS@grapheneos.social)


...

Read the original on grapheneos.social »

9 1,122 shares, 37 trendiness

Statement on the comments from Secretary of War Pete Hegseth

Earlier to­day, Secretary of War Pete Hegseth shared on X that he is di­rect­ing the Department of War to des­ig­nate Anthropic a sup­ply chain risk. This ac­tion fol­lows months of ne­go­ti­a­tions that reached an im­passe over two ex­cep­tions we re­quested to the law­ful use of our AI model, Claude: the mass do­mes­tic sur­veil­lance of Americans and fully au­tonomous weapons.

We have not yet re­ceived di­rect com­mu­ni­ca­tion from the Department of War or the White House on the sta­tus of our ne­go­ti­a­tions.

We have tried in good faith to reach an agree­ment with the Department of War, mak­ing clear that we sup­port all law­ful uses of AI for na­tional se­cu­rity aside from the two nar­row ex­cep­tions above. To the best of our knowl­edge, these ex­cep­tions have not af­fected a sin­gle gov­ern­ment mis­sion to date.

We held to our ex­cep­tions for two rea­sons. First, we do not be­lieve that to­day’s fron­tier AI mod­els are re­li­able enough to be used in fully au­tonomous weapons. Allowing cur­rent mod­els to be used in this way would en­dan­ger America’s warfight­ers and civil­ians. Second, we be­lieve that mass do­mes­tic sur­veil­lance of Americans con­sti­tutes a vi­o­la­tion of fun­da­men­tal rights.

Designating Anthropic as a sup­ply chain risk would be an un­prece­dented ac­tion—one his­tor­i­cally re­served for US ad­ver­saries, never be­fore pub­licly ap­plied to an American com­pany. We are deeply sad­dened by these de­vel­op­ments. As the first fron­tier AI com­pany to de­ploy mod­els in the US gov­ern­men­t’s clas­si­fied net­works, Anthropic has sup­ported American warfight­ers since June 2024 and has every in­ten­tion of con­tin­u­ing to do so.

We be­lieve this des­ig­na­tion would both be legally un­sound and set a dan­ger­ous prece­dent for any American com­pany that ne­go­ti­ates with the gov­ern­ment.

No amount of in­tim­i­da­tion or pun­ish­ment from the Department of War will change our po­si­tion on mass do­mes­tic sur­veil­lance or fully au­tonomous weapons. We will chal­lenge any sup­ply chain risk des­ig­na­tion in court.

What this means for our cus­tomers

Secretary Hegseth has im­plied this des­ig­na­tion would re­strict any­one who does busi­ness with the mil­i­tary from do­ing busi­ness with Anthropic. The Secretary does not have the statu­tory au­thor­ity to back up this state­ment. Legally, a sup­ply chain risk des­ig­na­tion un­der 10 USC 3252 can only ex­tend to the use of Claude as part of Department of War con­tracts—it can­not af­fect how con­trac­tors use Claude to serve other cus­tomers.

* If you are an in­di­vid­ual cus­tomer or hold a com­mer­cial con­tract with Anthropic, your ac­cess to Claude—through our API, claude.ai, or any of our prod­ucts—is com­pletely un­af­fected.

* If you are a Department of War con­trac­tor, this des­ig­na­tion—if for­mally adopted—would only af­fect your use of Claude on Department of War con­tract work. Your use for any other pur­pose is un­af­fected.

Our sales and sup­port teams are stand­ing by to an­swer any ques­tions you may have.

We are deeply grate­ful to our users, and to the in­dus­try peers, pol­i­cy­mak­ers, vet­er­ans, and mem­bers of the pub­lic who have voiced their sup­port in re­cent days. Thank you. Above all else, our pri­or­i­ties are to pro­tect our cus­tomers from any dis­rup­tion caused by these ex­tra­or­di­nary events and to work with the Department of War to en­sure a smooth tran­si­tion—for them, for our troops, and for American mil­i­tary op­er­a­tions.

...

Read the original on www.anthropic.com »

10 1,108 shares, 46 trendiness

Microsoft gets tired of “Microslop,” bans the word on its Discord, then locks the server after backlash

Microsoft’s aggressive AI push in Windows 11 through 2025 has earned the company the nickname “Microslop.” Unfortunately for Microsoft, the term is everywhere on social media, and there isn’t a way to stop the spread, unless, of course, it’s on the company’s own Discord server.

Windows Latest was first to notice that the word “Microslop” was actively filtered in the official Microsoft Copilot Discord server.

As you can see in the above screen­shot, any mes­sage con­tain­ing the term is au­to­mat­i­cally blocked, and users see a mod­er­a­tion no­tice stat­ing that the mes­sage in­cludes a phrase con­sid­ered in­ap­pro­pri­ate by server rules.

The extreme backlash that Microsoft has to endure every day on social media is nothing short of extraordinary. The company is surely responsible for this fallout, as it prioritized AI over the stability of the OS that AI needs to run on.

Copilot, being the most visible face of that effort, has naturally become the scapegoat. So when a nickname like “Microslop” started trending across socials, it was only a matter of time before it reached official channels as well.

Windows Latest found that sending a message with the word “Microslop” inside the official Copilot Discord server immediately triggers an automated moderation response. The message does not appear publicly in the channel; instead, only the sender sees the notice stating that the content is blocked by the server because it contains a phrase deemed inappropriate.

Of course, the internet rarely leaves things there. Shortly after Windows Latest posted on X about the Copilot Discord server blocking “Microslop,” users began experimenting in the server with variations such as “Microsl0p,” using a zero instead of the letter “o.”

Predictably, those ver­sions slipped past the fil­ter. Keyword mod­er­a­tion has al­ways been some­thing of a cat-and-mouse game, and this is­n’t any dif­fer­ent.

What started as a simple keyword filter quickly snowballed into users deliberately testing the restriction and posting variations of the blocked term. Accounts that included “Microslop” in their messages were first banned from sending further messages.

Not long af­ter, ac­cess to parts of the server was re­stricted, with mes­sage his­tory hid­den and post­ing per­mis­sions dis­abled for many users.

Microsoft’s brand image might already be at an all-time low, and even as the company announced plans to fix Windows 11 with performance improvements and less AI, the software giant can’t risk attracting more hatred toward its expensive investment in Copilot, especially since Microsoft’s head start in AI is starting to be overshadowed by competitors like Anthropic, Google, OpenAI, and maybe even Apple in the near future.

Back in December 2024, when Microsoft invited users to join the Copilot Discord server through an official X post, the response was largely curious and enthusiastic, with people willing to explore the AI’s capabilities.

Since then, sen­ti­ment around Copilot and its us­age has dropped along­side Microsoft’s broader AI push across Windows 11. At its pre­sent state, Copilot has added some ca­pa­bil­i­ties that are gen­uinely use­ful in day-to-day work­flows. Features like con­nec­tors can pull con­tex­tual data from ser­vices such as Google Contacts, Gmail, and Outlook to re­trieve phone num­bers or email ad­dresses di­rectly in­side Copilot, some­thing com­pet­ing tools like Gemini have not yet cracked, as we found in our de­tailed test­ing.

It re­mains to be seen if this episode fades as a mi­nor com­mu­nity mod­er­a­tion story or be­comes an­other chap­ter in Microsoft’s com­pli­cated re­la­tion­ship with its AI roll­out.

Microsoft reached out to Windows Latest with an of­fi­cial state­ment not­ing why the com­pany had to lock the Copilot Discord server.

According to a Microsoft spokesper­son, the Copilot Discord server was re­cently tar­geted by co­or­di­nated spam in­tended to dis­rupt con­ver­sa­tions. The com­pany says the ac­tiv­ity ini­tially ap­peared as large vol­umes of repet­i­tive or ir­rel­e­vant mes­sages, prompt­ing mod­er­a­tors to in­tro­duce tem­po­rary key­word fil­ters to slow the in­flux.

“The Copilot Discord channel has recently been targeted by spammers attempting to disrupt and overwhelm the space with harmful content not related to Copilot. Initially, this spam consisted of walls of text, so we added temporary filters for select terms to slow this activity. We have since made the decision to temporarily lock down the server while we work to implement stronger safeguards to protect users from this harmful spam and help ensure the server remains a safe, usable space for the community,” a Microsoft spokesperson told Windows Latest.

Microsoft added that blocking terms such as “Microslop,” along with other phrases in the spam campaign, was not intended as a permanent policy but as a short-term mitigation while the company puts additional protections in place.

...

Read the original on www.windowslatest.com »

To add this web app to your iOS home screen tap the share button and select "Add to the Home Screen".

10HN is also available as an iOS App

If you visit 10HN only rarely, check out the the best articles from the past week.

If you like 10HN please leave feedback and share

Visit pancik.com for more.