10 interesting stories served every morning and every evening.




1 786 shares, 75 trendiness

numerique.gouv.fr


Digital sovereignty: the French State accelerates the reduction of its extra-European dependencies

At the initiative of the Prime Minister, the Minister for Public Action and Accounts, and the Minister Delegate for Artificial Intelligence and Digital Affairs, the interministerial digital directorate (DINUM), together with the directorate general for enterprises (DGE), the national agency for information systems security (ANSSI), and the State procurement directorate (DAE), held an interministerial seminar on Wednesday, April 8, 2026, aimed at strengthening the collective drive to reduce extra-European digital dependencies. Bringing together ministers, administrations, public operators, and private-sector players, the event marks an acceleration of the French and European strategy for digital sovereignty.

In line with the Prime Minister’s recent directives, notably the circulars on digital public procurement and on the general rollout of the “Visio” videoconferencing tool, the seminar set a clear objective: reducing the State’s extra-European digital dependencies.

On workstations, DINUM announced its move away from Windows in favor of machines running Linux.

On migration to sovereign solutions, the Caisse nationale d’Assurance maladie announced a few days ago that its 80,000 agents would migrate to tools from the interministerial digital suite (Tchap, Visio, and FranceTransfert for document transfer). Last month, the Government announced the migration of the health data platform to a trusted solution by the end of 2026.

The seminar also launched a new approach to ending these dependencies: forming novel coalitions of ministries, major public operators, and private-sector players.

This approach aims to rally public and private energies around concrete projects, drawing in particular on digital commons and interoperability standards (the Open-Interop and OpenBuro initiatives).

DINUM will coordinate an interministerial plan to reduce extra-European dependencies. Each ministry (operators included) must formalize its own plan by autumn, covering the following areas: workstations, collaboration tools, antivirus, artificial intelligence, databases, virtualization, and network equipment. These action plans will give the digital industry visibility into the State’s needs, an industry with major strengths that public procurement should put to work.

The dependency mapping and diagnostic work carried out by the State procurement directorate (DAE), along with the work on defining a European digital service led by the directorate general for enterprises (DGE), will refine the quantified reduction target and set a clear timetable.

The first “digital industry meetings,” to be organized by DINUM in June 2026, will be an opportunity to turn these public-private ministerial coalitions into reality, notably by formalizing a “public-private alliance for European sovereignty.”

The State can no longer simply take note of its dependency; it must break free of it. We must wean ourselves off American tools and take back control of our digital destiny. We can no longer accept that our data, our infrastructure, and our strategic decisions depend on solutions whose rules, pricing, evolution, and risks we do not control. The transition is underway: our ministries, our operators, and our industrial partners are today committing to an unprecedented effort to map our dependencies and strengthen our digital sovereignty. Digital sovereignty is not optional.

Minister for Public Action and Accounts

Digital sovereignty is not optional; it is a strategic necessity. Europe must give itself the means to match its ambitions, and France is leading by example by accelerating the shift to sovereign, interoperable, and sustainable solutions. By reducing our dependencies on extra-European solutions, the State is sending a clear message: that of a public authority taking back control of its technology choices in the service of its digital sovereignty.

Minister Delegate for Artificial Intelligence and Digital Affairs

About the interministerial digital directorate (DINUM): DINUM’s mission is to develop the State’s digital strategy and steer its implementation. It supports the State’s digital projects, in service of government priorities and with a focus on making public action more effective.


...

Read the original on numerique.gouv.fr »

2 602 shares, 26 trendiness

Native Instant Space Switching on MacOS


The worst part about the MacOS window management situation is the inability to instantly switch spaces, and that Apple has continuously ignored requests to disable the nauseating switching animation. Sure, it’s not that long, but I switch spaces often enough that it becomes very noticeable and drives me insane.

I believe I have found the best solution to instant space switching!

But before I show you: of course, other people share the same sentiment. I claim that none of the surveyed contemporary solutions, except for what I bring up at the end of this article, suffice for what I want:

Enabling Reduce Motion in the accessibility settings is always the default answer to this question online, and I’m sick of it! It doesn’t even solve the problem, but rather replaces it with an equally useless fade-in animation. It also has the side effect of activating the prefers-reduced-motion media query in web browsers.

Install the yabai tiling window manager and use its instant space switcher.

And to be fair, it works pretty well. There are only two problems. For one, yabai does this by binary patching a part of the operating system, which is only possible after disabling System Integrity Protection at your own discretion. For the second, installing yabai forces you to learn and use it as your tiling window manager; I personally use PaperWM.spoon as my window manager, and the two are incompatible when installed together.

Use a third-party virtual space manager facade, hiding and showing windows as needed when switching spaces.

Some popular options are FlashSpace and AeroSpace virtual workspaces. I actually offer no criticism other than that they are not native to MacOS, and they feel unnecessary given that all we want to do is disable an animation.

Pay for a license for BetterTouchTool. Enable “Move Right Space (Without Animation)” and “Move Left Space (Without Animation)”.

Without further ado: I managed to find InstantSpaceSwitcher by jurplel on GitHub. It is a simple menu bar application that achieves instant space switching while having none of the aforementioned drawbacks.

InstantSpaceSwitcher does not require disabling System Integrity Protection; it works by simulating a trackpad swipe with very high velocity. It additionally allows you to jump instantly to a space by number. The last thing it provides is a command line interface.

The installation instructions are listed in the README, and I will briefly repeat them. You can either install InstantSpaceSwitcher via Homebrew:

$ brew install --cask jurplel/tap/instant-space-switcher

Or build it from source:

$ git clone https://github.com/jurplel/InstantSpaceSwitcher

$ cd InstantSpaceSwitcher

$ ./dist/build.sh

$ open ./build/InstantSpaceSwitcher.app

Once InstantSpaceSwitcher is installed as a native application, the command line interface is available at:

$ InstantSpaceSwitcher.app/Contents/MacOS/ISSCli --help

Usage: InstantSpaceSwitcher.app/Contents/MacOS/ISSCli [left|right|index]
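Because the app exposes a plain CLI, you can wire it to global hotkeys with a keybinding daemon. A sketch using skhd (my assumption; the article doesn’t mention skhd, and the app path assumes you moved the built app into /Applications):

```
# ~/.config/skhd/skhdrc -- hypothetical bindings for ISSCli
alt - left  : /Applications/InstantSpaceSwitcher.app/Contents/MacOS/ISSCli left
alt - right : /Applications/InstantSpaceSwitcher.app/Contents/MacOS/ISSCli right
```

With something like this, alt+arrow switches spaces instantly without ever touching the trackpad.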

Did I mention that the repository literally has one star on GitHub (mine)? I want more people to discover InstantSpaceSwitcher and consider it trustworthy, so please consider giving it a star if you find it helpful.


...

Read the original on arhan.sh »

3 434 shares, 70 trendiness

FBI used iPhone notification data to retrieve deleted Signal messages

A new report from 404 Media reveals that the FBI was able to recover deleted Signal messages from an iPhone by extracting data stored in the device’s notification database. Here are the details.

According to 404 Media, testimony in a recent trial involving “a group of people setting off fireworks and vandalizing property at the ICE Prairieland Detention Facility in Alvarado, Texas,” showed that the FBI was able to recover the content of incoming Signal messages from a defendant’s iPhone, even though Signal had been removed from the device:

One of the defendants was Lynette Sharp, who previously pleaded guilty to providing material support to terrorists. During one day of the related trial, FBI Special Agent Clark Wiethorn testified about some of the collected evidence. A summary of Exhibit 158 published on a group of supporters’ website says, “Messages were recovered from Sharp’s phone through Apple’s internal notification storage—Signal had been removed, but incoming notifications were preserved in internal memory. Only incoming messages were captured (no outgoing).”

As 404 Media notes, Signal’s settings include an option that prevents the actual message content from being previewed in notifications. However, it appears the defendant did not have that setting enabled, which in turn seemingly allowed the system to store the content in the database.

404 Media reached out to Signal and Apple, but neither company provided any statement on how notifications are handled or stored.

With little to no technical detail about the exact condition of the defendant’s iPhone, it is obviously impossible to pinpoint the precise method the FBI used to recover the information.

For instance, there are multiple system states an iPhone can be in, each with its own security and data access constraints, such as BFU (Before First Unlock) and AFU (After First Unlock) mode.

Security and data access change even more dramatically when the device is unlocked, since the system assumes the user is present and permits access to a wider range of protected data.

That said, iOS does store and cache a lot of data locally, trusting that it can rely on these different states to keep that information safe but readily available in case the device’s rightful owner needs it.

Another important factor to keep in mind: the token used to send push notifications isn’t immediately invalidated when an app is deleted. And since the server has no way of knowing whether the app is still installed after the last notification it sent, it may continue pushing notifications, leaving it up to the iPhone to decide whether to display them.

Interestingly, Apple just changed how iOS validates push notification tokens in iOS 26.4. While it is impossible to tell whether this is a result of this case, the timing is still notable.

Back to the case: given Exhibit 158’s description that the messages were “recovered from Sharp’s phone through Apple’s internal notification storage,” it is possible the FBI extracted the information from a device backup.

In that case, there are many commercially available tools for law enforcement that exploit iOS vulnerabilities to extract data, and these could have helped the FBI access this information.

To read 404 Media’s original report on this case, follow this link.

...

Read the original on 9to5mac.com »

4 389 shares, 26 trendiness

I Still Prefer MCP Over Skills

The Right Tool for the Job

TL;DR: The AI space is pushing hard for “Skills” as the new standard for giving LLMs capabilities, but I’m not a fan. Skills are great for pure knowledge and for teaching an LLM how to use an existing tool. But for giving an LLM actual access to services, the Model Context Protocol (MCP) is the far superior, more pragmatic architectural choice. We should be building connectors, not just more CLIs.

Maybe it’s an artifact of spending too much time on X, but lately, the narrative that “MCP is dead” and “Skills are the new standard” has been hammered into my brain. Everywhere I look, someone is celebrating the death of the Model Context Protocol in favor of dropping a SKILL.md into their repository.

I am a very heavy AI user. I use Claude Code, Codex, and Gemini for coding. I rely on ChatGPT, Claude, and Perplexity almost every day to manage everything from Notion notes to my DEVONthink databases, and even my emails.

And honestly? I just don’t like Skills.

I hope MCP sticks around. I really don’t want a future where every single service integration requires a dedicated CLI and a markdown manual.

Here’s why I think the push for Skills as a universal solution is a step backward, and why MCP still gets the architecture right.

Claude pulling recent user feedback from Kikuyo through the Kikuyo MCP, no CLI needed.

The core philosophy of MCP is simple: it’s an API abstraction. The LLM doesn’t need to understand the how; it just needs to know the what. If the LLM wants to interact with DEVONthink, it calls devonthink.do_x(), and the MCP server handles the rest.
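That “what, not how” split maps directly onto MCP’s wire format, which is JSON-RPC 2.0: the client sends a tools/call request naming a tool, and the server maps that name to whatever implementation it likes. A toy dispatcher sketching the shape; the devonthink.search tool and its payload are hypothetical, not the real server’s interface:

```python
import json

# Hypothetical tool table: "devonthink.search" stands in for whatever a real
# DEVONthink MCP server would expose; the name and payload are illustrative.
TOOLS = {
    "devonthink.search": lambda args: {"results": ["note matching " + args["query"]]},
}

def handle(request_json: str) -> str:
    """Dispatch an MCP-style JSON-RPC 2.0 'tools/call' request to a tool."""
    req = json.loads(request_json)
    if req["method"] != "tools/call":
        raise ValueError("unsupported method: " + req["method"])
    params = req["params"]
    result = TOOLS[params["name"]](params["arguments"])
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# The client only says *what* it wants; the server decides *how*.
print(handle(json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "devonthink.search", "arguments": {"query": "taxes"}},
})))
```

The point is the boundary: the client never learns how the search is executed, only the tool name and the arguments it must send.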

This separation of concerns brings some unbeatable advantages:

- Zero-Install Remote Usage: For remote MCP servers, you don’t need to install anything locally. You just point your client to the MCP server URL, and it works.
- Seamless Updates: When a remote MCP server is updated with new tools or resources, every client instantly gets the latest version. No need to push updates, upgrade packages, or reinstall binaries.
- Saner Auth: Authentication is handled gracefully (often with OAuth). Once the client finishes the handshake, it can perform actions against the MCP. You aren’t forcing the user to manage raw tokens and secrets in plain text.
- True Portability: My remote MCP servers work from anywhere: my Mac, my phone, the web. It doesn’t matter. I can manage my Notion via my LLM of choice from wherever a client is available.
- Sandboxing: Remote MCPs are naturally sandboxed. They expose a controlled interface rather than giving the LLM raw execution power in your local environment.
- Smart Discovery: Modern apps (ChatGPT, Claude, etc.) have tool search built in. They only look for and load tools when they are actually needed, saving precious context window.
- Frictionless Auto-Updates: Even for local setups, an MCP installed directly via npx -y or uv can auto-update on every launch.

Not all Skills are the same. A pure knowledge skill (one that teaches the LLM how to format a commit message, write tests a certain way, or use your internal jargon) actually works well. The problems start when a Skill requires a CLI to actually do something.

My biggest gripe with Skills is the assumption that every environment can, or should, run arbitrary CLIs.

Most skills require you to install a dedicated CLI. But what if you aren’t in a local terminal? ChatGPT can’t run CLIs. Neither can Perplexity or the standard web version of Claude. Unless you are using a full-blown compute environment (like Perplexity Computer, Claude Cowork, Claude Code, or Codex), any skill that relies on a CLI is dead on arrival.

This leads to a cascade of annoying UX and architectural problems:

- The Deployment Mess: CLIs need to be published, managed, and installed through binaries, NPM, uv, etc.
- The Secret Management Nightmare: Where do you put the API tokens required to authenticate? If you’re lucky, the environment has a .env file you can dump plain-text secrets into. Some ephemeral environments wipe themselves, meaning your CLI works today but forgets your secrets tomorrow.
- Fragmented Ecosystems: Skill management is currently the wild west. When a skill updates, you have to reinstall it. Some tools support installing skills via npx skills, but that only works in Codex and Claude Code, not Claude Cowork or standard Claude. Pure knowledge skills work in Claude, but most others don’t. Some tools support a “skills marketplace,” others don’t. Some can install from GitHub, others can’t. You try to install an OpenClaw skill into Claude and it explodes with YAML parsing errors because the metadata fields don’t match.
- Context Bloat: Using a skill often requires loading the entire SKILL.md into the LLM’s context window, rather than just exposing the single tool signature it needs. It’s like forcing someone to read the entire car owner’s manual when all they want to do is call car.turn_on().

If a Skill’s instructions start with “install this CLI first,” you’ve just added an unnecessary abstraction layer and extra steps. Why not just use a remote MCP instead?

Codex pulling up a pure knowledge skill to learn how Phoenix colocated hooks work. No CLI, no MCP, just context.

The Right Tool for the Job

I don’t want Skills to become the de facto way to connect an LLM to a service. We can explain API shapes in a Skill so the LLM can curl them, but how is that better than providing a clean, strongly-typed interface via MCP?

Here’s how I think the ecosystem should look:

When to use MCP:

MCP should be the standard for giving an LLM an interface to connect to something: a website, a service, an application. The service itself should dictate the interface it exposes.

- Take Google Calendar. A gcal CLI is fine. The problem is a Skill that tells the LLM to install it, manage auth tokens, and shell out to it. An OAuth-backed remote MCP owned by Google handles all of that at the protocol level, and works from any client without any setup.
- To control Chrome, the browser should expose an MCP endpoint for stateful control, rather than relying on a janky chrome-cli.
- To debug with Hopper, the current built-in MCP that lets the LLM run step() is infinitely better than a separate hopper-cli.
- Xcode should ship with a built-in MCP that handles auth when an LLM connects to a project.
- Notion should have mcp.notion.so/mcp available natively, instead of forcing me to download notion-cli and manage auth state manually. (They actually do have a remote MCP now, which is exactly the right call.)

When to use Skills:

Skills should be “pure.” They should focus on knowledge and context.

- Teaching existing tools: I love having a .claude/skills folder that teaches the LLM how to use tools I already have installed. A skill explaining how to use curl, git, gh, or gcloud makes complete sense. We don’t need a curl MCP; we just need to teach the LLM how to construct good curl commands. However, a dedicated remote GitHub MCP makes much more sense for managing issues than relying on a gh CLI skill.
- Standardizing workflows: Skills are perfect for teaching Claude your business jargon, internal communication style, or organizational structure.
- Teaching handling of certain things: This is another great example, and what Anthropic does as well with the PDF Skill - it explains how to deal with PDF files and how to manipulate them with Python.
- Secret Management patterns: Having a skill that tells Claude “Use fnox for this repo, here is how to use it” just makes sense. Every time we deal with secrets, Claude pulls up the skill. That’s way better than building a custom MCP just to call get_secret().
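A pure-knowledge skill of that kind is easy to picture. A sketch of what such a skill file might look like; the frontmatter mirrors the name/description convention Claude’s skills use, and the rules themselves are invented for illustration:

```markdown
---
name: curl-conventions
description: How to construct curl requests against our internal APIs
---

When calling internal APIs with curl:

- Always pass -sS and --fail-with-body so errors surface cleanly.
- Authenticate with -H "Authorization: Bearer $API_TOKEN"; the token
  comes from the environment and is never hard-coded.
- Dates in query parameters use YYYY-MM-DD, not YYYYMMDD.
```

No CLI to install and no secrets on disk: the skill only teaches the model how to drive a tool that is already there.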

Skills living directly in the repo. The LLM picks them up automatically when working in that project.

Shower thought: Maybe the terminology is the problem. Skills should just be called LLM_MANUAL.md, and MCPs should be called Connectors.

Both have their place.

For the services I own, I already do this. A few examples:

- mcp-server-devonthink: A local MCP server that gives any LLM direct control over DEVONthink. No CLI wrapper, just a clean tool interface.
- microfn: Exposes a remote MCP at mcp.microfn.dev so any MCP-capable client can use it out of the box.
- MCP Nest: Tunnels local MCP servers through the cloud so they’re reachable remotely at mcp.mcpnest.dev/mcp. I built it because I kept wanting remote access to local MCPs without exposing my machine directly.

For microfn and Kikuyo I also published Skills, but they cover the CLI, not the MCP. That said, writing this made me realize: a skill that explains how to use an MCP server actually makes a lot of sense. Not to replace the MCP, but to give the LLM context before it starts calling tools: what the service does, how the tools relate to each other, when to use which one. A knowledge layer on top of a connector layer. That’s the combination I’d want.

And this is actually a pattern I’ve been using more and more in practice. When I’m working with an MCP server, I inevitably discover gotchas and non-obvious patterns: a date format that needs to be YYYY-MM-DD instead of YYYYMMDD, a search function that truncates results unless you bump a parameter, a tool name that doesn’t do what you’d expect. Rather than rediscovering these every session, I just ask Claude to wrap everything we learned into a Skill. The LLM already has the context from our conversation, so it writes the Skill with all the gotchas, common patterns, and corrected assumptions baked in.

After discovering backlink gotchas and date format quirks in the NotePlan MCP, I asked Claude to package everything into a skill. Now every future session starts with that knowledge.

The result is a Skill that acts as a cheat sheet for the MCP, not a replacement for it. The MCP still handles the actual connection and tool execution. The Skill just makes sure the LLM doesn’t waste tokens stumbling through the same pitfalls I already solved. It’s the combination of both that makes the experience actually smooth.

At the same time, I’ll keep maintaining my dotfiles repo full of Skills for procedures I use often, and I’ll keep dropping .claude/skills into my repositories to guide the AI’s behavior.

I just hope the industry doesn’t abandon the Model Context Protocol. The dream of seamless AI integration relies on standardized interfaces, not a fractured landscape of hacky CLIs. I’m still holding out hope for official Skyscanner, Booking.com, Trip.com, and Agoda.com MCPs.

Speaking of remote MCPs: I built MCP Nest specifically for this problem. A lot of useful MCP servers are local-only by nature; think Fastmail, Gmail, or anything that runs on your machine. MCP Nest tunnels them through the cloud so they become remotely accessible, usable from Claude, ChatGPT, Perplexity, or any MCP-capable client, across all your devices. If you want your local MCPs to work everywhere without exposing your machine directly, that’s what it’s for.

...

Read the original on david.coffee »

5 389 shares, 64 trendiness

OpenAI Backs Bill That Would Limit Liability for AI-Enabled Mass Deaths or Financial Disasters

OpenAI is throwing its support behind an Illinois state bill that would shield AI labs from liability in cases where AI models are used to cause serious societal harms, such as the death or serious injury of 100 or more people, or at least $1 billion in property damage.

The effort seems to mark a shift in OpenAI’s legislative strategy. Until now, OpenAI has largely played defense, opposing bills that could have made AI labs liable for their technology’s harms. Several AI policy experts tell WIRED that SB 3444—which could set a new standard for the industry—is a more extreme measure than bills OpenAI has supported in the past.

The bill would shield frontier AI developers from liability for “critical harms” caused by their frontier models as long as they did not intentionally or recklessly cause such an incident, and have published safety, security, and transparency reports on their website. It defines a frontier model as any AI model trained using more than $100 million in computational costs, which likely could apply to America’s largest AI labs, like OpenAI, Google, xAI, Anthropic, and Meta.

“We support approaches like this because they focus on what matters most: reducing the risk of serious harm from the most advanced AI systems while still allowing this technology to get into the hands of the people and businesses—small and big—of Illinois,” said OpenAI spokesperson Jamie Radice in an emailed statement. “They also help avoid a patchwork of state-by-state rules and move toward clearer, more consistent national standards.”

Under its definition of critical harms, the bill lists a few common areas of concern for the AI industry, such as a bad actor using AI to create a chemical, biological, radiological, or nuclear weapon. If an AI model engages in conduct on its own that, if committed by a human, would constitute a criminal offense and leads to those extreme outcomes, that would also be a critical harm. If an AI model were to commit any of these actions under SB 3444, the AI lab behind the model could not be held liable, so long as the harm wasn’t intentional and the lab had published its reports.

Federal and state legislatures in the US have yet to pass any laws specifically determining whether AI model developers, like OpenAI, could be liable for these types of harm caused by their technology. But as AI labs continue to release more powerful AI models that raise novel safety and cybersecurity challenges, such as Anthropic’s Claude Mythos, these questions feel increasingly pressing.

In her testimony supporting SB 3444, a member of OpenAI’s Global Affairs team, Caitlin Niedermeyer, also argued in favor of a federal framework for AI regulation. Niedermeyer struck a message that’s consistent with the Trump administration’s crackdown on state AI safety laws, claiming it’s important to avoid “a patchwork of inconsistent state requirements that could create friction without meaningfully improving safety.” This is also consistent with the broader view of Silicon Valley in recent years, which has generally argued that it’s paramount for AI legislation not to hamper America’s position in the global AI race. While SB 3444 is itself a state-level safety law, Niedermeyer argued that such laws can be effective if they “reinforce a path toward harmonization with federal systems.”

“At OpenAI, we believe the North Star for frontier regulation should be the safe deployment of the most advanced models in a way that also preserves US leadership in innovation,” Niedermeyer said.

Scott Wisor, policy director for the Secure AI Project, tells WIRED he believes this bill has a slim chance of passing, given Illinois’ reputation for aggressively regulating technology. “We polled people in Illinois, asking whether they think AI companies should be exempt from liability, and 90 percent of people oppose it. There’s no reason existing AI companies should be facing reduced liability,” Wisor says.

...

Read the original on www.wired.com »

6 374 shares, 15 trendiness

Personal Laptop Colocation Service

Transform your old laptop into a powerful always-online server. Based in Amsterdam, we aim to provide professional colocation services in the US and across European datacenters in partnership with Hetzner.

Most VPS providers give you severely limited compute resources at premium prices. You even share these resources with other customers without knowing!

Your old laptop packs more CPU power, RAM, and storage than their entry-level offerings - and with us, you’ll pay just €7/month for professional hosting. Why settle for a restricted virtual slice when you can have your own laptop, dedicated just to you, running 24/7 in a professional datacenter?

In addition, you cut down on e-waste and help save the planet!

Every laptop gets its own static, fully routable IPv4 address for maximum accessibility.

Professional datacenter hosting with guaranteed uptime and monitoring, to be powered by Hetzner’s infrastructure.

Free KVM-over-IP access to your laptop - just like having it right next to you.

We offer free assistance with your initial setup and make sure your choice of server software is up and running. A Kubernetes cluster, Proxmox, or a niche CI/CD solution? No problem!

Fill out our application form and we’ll contact you within 2 working days.

We’ll send you a prepaid shipping box - just pack your laptop and drop it off at your nearest collection point. Please note that we are still figuring out the specifics of logistics.

Our team connects your laptop and sends you a link to access your machine via KVM. If you need further assistance, we’re just an email away!

Access your laptop server from anywhere, anytime.

Click here to sign up with your details and we’ll contact you within 2 working days to discuss your setup.

How much does it cost?

We charge a flat fee of €7 per month, regardless of power consumption. This includes all services: colocation, IPv4 address, KVM access, and monitoring.

What do I need to send?

Your laptop and its power brick. We’ll provide a prepaid shipping box - just pack everything securely and drop it off at your nearest collection point. It’s completely free!

What are the connectivity requirements?

Your laptop must have either an ethernet port or a USB port (we’ll provide a USB ethernet adapter if needed). We connect all laptops via ethernet - WiFi is not available in the datacenter.

What kind of setup assistance do you provide?

We offer complimentary assistance with initial setup, including installation of most Linux distributions, Kubernetes clusters, Proxmox virtualization, and other common server software. Just let us know what you need, and we’ll help you get started.

What are the laptop requirements?

Your laptop should be fully functional, with a working power supply and either an ethernet port or a USB port for connectivity. Age isn’t a factor. We might modify your laptop to remove or power down the battery, wireless radios, etc., to ensure it can be used safely in the datacenter.

Where are your datacenters located?

We’re based in Amsterdam and aim to work with Hetzner to provide colocation services in the US and across their European datacenter network. This ensures your laptop is hosted in professional, secure facilities with excellent connectivity.

Your laptop will be hosted in Hetzner’s professional datacenters with 24/7 security, climate control, and redundant power. We also provide basic firewall services and DDoS protection.

...

Read the original on colaptop.pages.dev »

7 374 shares, 14 trendiness

Lzon.ca. A personal blog by a programmer and IT expert.

I'd like to tell the story of a job I just completed for a customer, so that I can make a point about how I feel Microsoft and other large technology companies are actively hostile to their users.

I received a call from my neighbour asking if I would be willing to help her husband with an issue he'd been having with his laptop. As the proud new owner of my own IT services company, I of course agreed to take a look.

I spoke with my neighbour's husband and immediately saw that he was not tech literate. I learned to identify the type while doing IT work for my previous employer. This made understanding his problem difficult, but through conversation we did manage to come to an understanding of the real issue he was experiencing.

What he was seeing was that he was no longer receiving email in Outlook, and that there was an error message claiming he had 'run out of available storage', or some other similar nonsense. He is a very light email user, and he knows it. He was confused as to why he'd run out of storage. I was confused as well, at first.

Through investigation I discovered that the Outlook email service uses OneDrive for storage of all messages and attachments. He had 5 GB of available storage, the amount that comes with his free account. That still didn't explain the error message; there was no way he had consumed 5 GB of storage with just his email use.

Unsurprisingly, his OneDrive storage wasn't filled by his email; it was filled by the personal files from his Windows 11 desktop. Did he configure Windows to save those files to his OneDrive directory instead of his local home directory? Of course not - that was done by default. Did he even know that this was happening? Also no. He had no idea this was happening until he saw that error message, which oh-so-helpfully offered to 'solve' his problem by selling him a subscription to additional paid storage capacity on the account.

He did manage to loosely understand what was happening, enough at least to start deleting files from his computer to try to make the error message go away. I was never able to confirm with him, but I suspect that he deleted files (including family photos) for which he had no other backup.

I will be blunt: this infuriates me. This wasn't the first time I've seen this; I saw it many times while working for my previous employer. Microsoft has intentionally broken a fundamental assumption about how files are stored on a computer running Windows. They do this without asking the user, and without adequately explaining what they have done. Microsoft is very obviously employing dark patterns in order to goad its users into paying for OneDrive storage.

I'm a computer nerd, and if you are reading this you probably are as well. We can change that setting ourselves without much thought, and we probably have backups of our important data in case recovery is necessary. But many people are extremely utilitarian about their computer use. They use their computers only to the degree that they must to serve their other interests in life. They also trust that their property - the device that cost them hundreds of dollars - isn't trying to cheat them like some back-alley con artist.

This isn't a game. My customer isn't a number on a spreadsheet, merely an increment towards reaching some useless KPI. He deleted family photos to try to get that error message to go away, so that he could just receive emails again. He may not understand what happened, but he's not stupid. He suspected that this was a scam to get him to pay for something he didn't need; he just didn't understand how the scam worked.

First and foremost, I performed a complete backup of his data. I took everything that I could find locally on the machine, as well as everything from the OneDrive account, including the Trash. It wasn't much, only a few gigabytes, which I transferred to a separate USB drive.

I carefully transferred all files out of the OneDrive directory structure and back into his home folder. The Windows file explorer did not make this easy or intuitive.

I proceeded to delete everything from the OneDrive account through the web interface. I did notice that deleting files merely moved them into the Trash, which was still being counted towards total storage usage. I assumed this was yet another subtle dark pattern.

I alluded to changing settings as a way to solve this. The approach we often took at my previous employer was to simply disable OneDrive in the Windows startup list. That could have worked in this case, but I had a better idea: remove OneDrive entirely.

I have muscle memory at this point for how to do it. If you were wondering, this is the procedure I used:

Open an admin Terminal and load up Chris Titus's winutil.

This entirely removes the OneDrive application from Windows, including all integrations into other programs, such as the file explorer.
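For reference, the commands involved look roughly like this - a sketch, not a definitive procedure, assuming an elevated (admin) PowerShell prompt. The first line is winutil's published launch command; the winget alternative assumes the `Microsoft.OneDrive` package id applies to the machine in question:

```powershell
# Run from an elevated (admin) PowerShell / Windows Terminal.

# Fetch and launch Chris Titus's winutil, then use its Tweaks tab
# to remove OneDrive:
irm "https://christitus.com/win" | iex

# Alternative route: uninstall OneDrive directly via winget
# (package id assumed; confirm first with `winget search OneDrive`):
winget uninstall Microsoft.OneDrive
```

Either way, back up the contents of the OneDrive folder locally before removal, since sync clients can treat removal as a cue to clean up "cloud-only" placeholder files.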

I then proceeded to delete everything from the OneDrive account, including the Trash. The error messages finally went away in Outlook and he was able to receive email messages again.

I may be preaching to the choir, but regardless I want to use this post as my opportunity to make these points in my own way. Microsoft is actively hostile towards its users. They have become a basket case of an organisation, where chasing irrelevant KPIs has become more important than product quality, or even baseline respect for their users. The same can be said, to varying degrees, of every other large consumer-tech company.

I see this as the result of bad incentive structures - a toxic game theory that has been allowed to play out over many years without proper scrutiny. The lefty in me might think that this is a manifestation of Late Capitalism. If so, then it feels like we're about 30 seconds away from midnight.

I think a lot about the possible ways to tweak said incentive structures, to build a choice architecture that can prevent even the first step in the process that led to this.

On days like today, when I'm thinking about the real, actual ways that this nonsense impacts real, actual people, I can't ignore the humans in this loop. People need to actually take responsibility for their choices, not just turn their brains off when the number looks right in the spreadsheet.

If you enjoyed this post, let me know! Email me at mail@lzon.ca, or reach out through one of my social accounts linked on the homepage.

...

Read the original on lzon.ca »

8 359 shares, 21 trendiness

- YouTube

...

Read the original on www.youtube.com »

9 298 shares, 13 trendiness

Charcuterie

...

Read the original on charcuterie.elastiq.ch »

10 292 shares, 6 trendiness

Maine Is About to Become the First State to Ban Major New Data Centers

Your AI chatbot sessions and cloud-stored photos might get more expensive if other states follow Maine's lead. Lawmakers there just advanced the nation's first statewide moratorium on large data centers, citing concerns that the AI boom is pushing electricity costs even higher in a state already suffering America's priciest power bills.

The Democratic-controlled legislature advanced bill LD 307, temporarily blocking permits for any new data center requiring more than 20 megawatts. The measure runs until November 2027, buying time for a new Data Center Coordination Council to study how these facilities strain Maine's aging electrical grid.

Gov. Janet Mills supports the pause while developers scramble for exemptions.

The bill gained traction after residents in Wiscasset and Lewiston successfully opposed data center proposals over water usage and safety concerns. Projects now in limbo include facilities planned for:

* Jay (at an old paper mill site)

“Taking this pause now is going to be crucial,” Rep. Christopher Kessler said, according to Maine Public Radio, reflecting growing legislative concern about grid capacity. Developer Tony McDonald disagrees, calling the proposed restrictions “disastrous” and claiming his team “got caught in this dragnet.”

The Pine Tree State isn't alone in pumping the brakes. Counties in Michigan and Indiana have imposed their own local pauses on data center development, while cities from Denver to Detroit weigh restrictions as hyperscale facilities chase cheap land and reliable power.

The timing reflects broader anxiety about AI's infrastructure appetite. Data centers now consume roughly 4% of U.S. electricity, with projections suggesting that figure could double by 2030. For Mainers already paying some of the nation's highest residential rates, that mathematical reality hits differently than Silicon Valley's endless optimization rhetoric.

Maine's move represents what economist Anirban Basu called “a canary in the coal mine” for state-level resistance to Big Tech's energy demands. Whether that precedent spreads depends on how aggressively other governors follow Maine's lead - and whether your favorite AI services start charging accordingly.

...

Read the original on www.gadgetreview.com »
