10 interesting stories served every morning and every evening.
Six million fake stars, $0.06 per click, and a VC funding pipeline that treats GitHub popularity as proof of traction. We ran our own analysis on 20 repos and found the fingerprints.
A GitHub star costs $0.06 at the low end. A seed round unlocks $1 million to $10 million. The math is obvious, and thousands of repositories are exploiting it.
This investigation maps the full ecosystem: from the peer-reviewed research quantifying the problem, to the marketplaces selling stars openly, to the venture capital pipeline that converts star counts into funding decisions. We ran our own analysis on 20 repositories using the GitHub API, sampling thousands of stargazer profiles to independently verify which projects show fingerprints of manipulation - and which don’t.
The picture that emerges is a mature, professionalized shadow economy operating in plain sight.
The definitive account comes from a peer-reviewed study presented at ICSE 2026 by researchers at Carnegie Mellon University, North Carolina State University, and Socket. Their tool, StarScout, analyzed 20 terabytes of GitHub metadata - 6.7 billion events and 326 million stars from 2019 to 2024 - and identified approximately 6 million suspected fake stars distributed across 18,617 repositories by roughly 301,000 accounts.
The problem accelerated dramatically in 2024. By July, 16.66% of all repositories with 50 or more stars were involved in fake star campaigns - up from near-zero before 2022. The researchers’ detection proved accurate: 90.42% of flagged repositories and 57.07% of flagged accounts had been deleted as of January 2025, confirming GitHub itself recognized these as illegitimate.
AI and LLM repositories emerged as the largest non-malicious category of fake-star recipients, ahead of blockchain/cryptocurrency projects in absolute volume at 177,000 fake stars. The study notes that many of these “are academic paper repositories or LLM-related startup products.” Critically, 78 repositories with detected fake star campaigns appeared on GitHub Trending, proving that purchased stars successfully game the platform’s discovery algorithm.
Earlier foundational work includes Dagster’s March 2023 investigation, where engineers purchased stars from two vendors to study the phenomenon. They found services via basic Google search. A premium vendor - GitHub24, a registered German company (Moller und Ringauf GbR) - charged EUR 0.85 per star and delivered reliably, with all 100 stars persisting after one month. A budget service (Baddhi Shop) sold 1,000 stars for $64, though only 75% survived.
The star-selling ecosystem spans dedicated websites, freelance platforms, exchange networks, and underground channels. At least a dozen active websites sell GitHub stars directly, including SocialPlug.io, Buy.fans, Boost-Like.store, GitHubPromoter.com, Followdeh.com, and Vurike.com.
On Fiverr, 24 active gigs sell GitHub promotion, with packages from $5 for basic stars and forks to $25+ for “organic promotion.” Many use obfuscated language to evade platform filters. Star exchange platforms like GithubStarMate.com and SafeStarExchange.com - both live and operational - enable free mutual starring through credit-based systems.
The infrastructure extends beyond stars. At least seven open-source tools on GitHub (fake-git-history, commit-bot, Commiter, and others) exist specifically to fabricate GitHub contribution graphs. Pre-built GitHub profiles with five-year commit histories and Arctic Code Vault Contributor badges sell for approximately $5,000 on Telegram.
Some vendors offer replacement guarantees - Followdeh advertises 30-day coverage, and premium services promise “non-drop” stars that survive GitHub’s detection systems. SocialPlug claims 3.1 million stars delivered across 53,000+ clients and offers a formal API for programmatic purchasing.
A Tsinghua University study (ACSAC 2020) documented Chinese QQ and WeChat promotion groups with 1,020+ members processing roughly 20 repos per day, generating an estimated $3.4 to $4.4 million annually in promoter profits.
To move beyond reported statistics, we built a GitHub API analysis tool and ran it against 20 repositories: projects flagged by StarScout, fast-growing AI repos from the Runa Capital ROSS Index, and known organic baselines. For each repo, we sampled 150 stargazer profiles and measured account age, public repos, followers, and bio presence.
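The sampling step can be sketched in a few lines. This is a simplified illustration, not the exact tool we ran: the endpoints (`GET /repos/{owner}/{repo}/stargazers`, `GET /users/{login}`) and profile fields (`public_repos`, `followers`, `bio`) are real GitHub REST API names, but the token handling, pagination depth, and rate-limit handling are stripped down.

```python
import json
import statistics
from urllib.request import Request, urlopen

API = "https://api.github.com"

def get(path, token):
    """One authenticated GET against the GitHub REST API (network call)."""
    req = Request(API + path, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    })
    with urlopen(req) as resp:
        return json.load(resp)

def sample_stargazers(repo, token, pages=2):
    """Fetch full user profiles for the first few pages of a repo's stargazers."""
    profiles = []
    for page in range(1, pages + 1):
        for user in get(f"/repos/{repo}/stargazers?per_page=100&page={page}", token):
            profiles.append(get(f"/users/{user['login']}", token))
    return profiles

def summarize(profiles):
    """Ghost and zero-engagement rates over a list of user-profile dicts."""
    n = len(profiles)
    # "Ghost" = no repos, no followers, no bio, per the definition in the text.
    ghosts = sum(1 for p in profiles
                 if p["public_repos"] == 0 and p["followers"] == 0 and not p.get("bio"))
    return {
        "ghost_rate": ghosts / n,
        "zero_repo_rate": sum(p["public_repos"] == 0 for p in profiles) / n,
        "zero_follower_rate": sum(p["followers"] == 0 for p in profiles) / n,
        "median_public_repos": statistics.median(p["public_repos"] for p in profiles),
    }
```

Running `summarize(sample_stargazers("owner/repo", token))` yields the rates reported throughout this section.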
The fingerprints of manipulation are unmistakable once you know what to look for.
Organic repositories are starred by developers who have been on GitHub for years, maintain their own projects, and follow other users. Ghost accounts - zero repos, zero followers, no bio - make up about 1% of a healthy project’s stargazer base.
The flagged repos share a distinctive fingerprint. The accounts aren’t obviously new - median ages of 1,000+ days - so they pass simple “young account” filters. But they’re empty: a third have zero repos, half to four-fifths have zero followers, and a quarter are complete ghosts. These are aged accounts, purchased or farmed specifically for star campaigns.
The fork-to-star ratio is the strongest signal. Flask has 235 forks per 1,000 stars. Shardeum has 22. FreeDomain has 17. When nobody is forking a 157,000-star repository, nobody is using it. The watcher-to-star ratio tells the same story: FreeDomain’s 0.001 means that for every 1,000 people who starred the repo, just one actually watches it for updates.
FreeDomain is worth isolating: 157,000 stars, but only 168 watchers and 2,676 forks. That’s a watcher-to-star ratio 26x lower than Flask. 81.3% of sampled stargazers have zero followers. This is a repository where almost nobody who starred it has any visible presence on GitHub.
Union Labs is the most consequential case. It was ranked #1 on Runa Capital’s ROSS Index for Q2 2025 - a widely cited VC industry report identifying the “hottest open-source startups” - with 54.2x star growth and 74,300 stars. Our analysis found 32.7% zero-repo accounts, 52% zero-follower accounts, and a fork-to-star ratio of 0.052. The StarScout analysis flagged it with 47.4% suspected fake stars. An influential investment-sourcing report that VCs rely on was topped by a project with nearly half its stars suspected as artificial.
RagaAI-Catalyst and openai-fm show clear manipulation signals. RagaAI has 76.2% zero-follower accounts and 28% ghosts - nearly identical to the blockchain pattern. openai-fm is the most extreme case in our dataset: 66% suspicious accounts, 36% ghosts, and a median account age of just 116 days. Two-thirds of its stargazers are less than a year old with virtually no GitHub activity. (The StarScout analysis notes this is likely third-party bots, not OpenAI itself.)
Langflow - flagged by StarScout at 47.9% fake - showed clean metrics in our profile sample, with a median age of 2,859 days and low ghost rates. This likely reflects improved account quality since the StarScout scan. The 0.060 fork-to-star ratio is still notably low - roughly a quarter of Flask’s - suggesting less genuine adoption relative to star count.
For comparison, NousResearch’s hermes-agent looks relatively organic: median age 8 years, 6% ghosts, fork-to-star ratio of 0.133. Despite Reddit accusations of astroturfing, the stargazer population is mostly real developers. The project’s crypto-adjacent audience includes more casual GitHub users, which explains slightly elevated zero-follower rates, but the fundamental engagement pattern is legitimate.
The connection between GitHub star counts and startup funding is not speculative - it is explicitly documented by the investors themselves.
Jordan Segall, Partner at Redpoint Ventures, published an analysis of 80 developer tool companies showing that the median GitHub star count at seed financing was 2,850 and at Series A was 4,980. He confirmed: “Many VCs write internal scraping programs to identify fast growing github projects for sourcing, and the most common metric they look toward is stars.”
Those numbers set an implicit target. For $85 to $285 in budget stars, a startup can manufacture the 2,850-star seed median. For $990 to $4,500, it can reach Series A territory. Against typical seed rounds of $1-10 million, the ROI ranges from 3,500x to 117,000x.
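The back-of-the-envelope ROI works out exactly as stated, using only the figures quoted above:

```python
# Cost of manufacturing the 2,850-star seed median with budget stars,
# against a typical $1M-$10M seed round (figures from the text above).
star_cost_low, star_cost_high = 85, 285
seed_low, seed_high = 1_000_000, 10_000_000

roi_low = seed_low / star_cost_high    # worst case: small round, pricier stars
roi_high = seed_high / star_cost_low   # best case: big round, cheapest stars

print(f"{roi_low:,.0f}x to {roi_high:,.0f}x")
```

No other traction signal a pre-seed startup can buy comes close to that return.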
Runa Capital publishes the ROSS (Runa Open Source Startup) Index quarterly, ranking the 20 fastest-growing open-source startups by GitHub star growth rate. Per TechCrunch, 68% of ROSS Index startups that attracted investment did so at seed stage, with $169 million raised across tracked rounds. GitHub itself, through its GitHub Fund partnership with M12 (Microsoft’s VC arm), commits $10 million annually to invest in 8-10 open-source companies at pre-seed/seed stages based partly on platform traction.
* Lovable (formerly GPT Engineer): 50,000+ stars, $7.5M pre-seed, $200M Series A at $1.8 billion valuation with 45 employees
Dagster’s Fraser Marlow, who led the fake star investigation, admitted directly: “In the run-up to the fundraising, I spent a fair amount of time preoccupied with GitHub stars.” An academic paper in Organization Science provided rigorous statistical evidence that GitHub engagement correlates with startup funding outcomes - startups active on GitHub are 15 percentage points more likely to have raised a financing round.
The incentive loop is self-reinforcing: VCs use stars as sourcing signals, so startups manipulate stars, so VCs see inflated traction, so more VCs adopt star-tracking, so more startups manipulate. Redpoint’s own published benchmarks give startups an exact target to buy toward.
Our analysis revealed the fork-to-star ratio as the strongest simple heuristic for identifying potential manipulation. The logic is straightforward: a star costs nothing and conveys no commitment. A fork means someone downloaded the code to use or modify it.
Any repository with a fork-to-star ratio below 0.05 and more than 10,000 stars warrants scrutiny. The watcher-to-star ratio is even more telling: organic projects average 0.005 to 0.030; FreeDomain registers 0.001.
These ratios aren’t perfect - educational repos and curated lists naturally have low fork rates. But as a first-pass filter, they catch the most egregious cases that raw star counts miss entirely.
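As a sketch, the first-pass filter described above fits in a few lines. The thresholds are the ones from this analysis, and the usage numbers are FreeDomain’s counts as reported earlier; the second check uses a hypothetical repo at Flask’s 235-forks-per-1,000-stars ratio rather than Flask’s exact counts.

```python
def forks_per_1000_stars(forks, stars):
    """The strongest simple heuristic: forks signal real usage, stars don't."""
    return 1000 * forks / stars

def warrants_scrutiny(stars, forks, min_stars=10_000, max_ratio=0.05):
    """Flag repos with a big star count but almost nobody forking."""
    return stars > min_stars and forks / stars < max_ratio

# FreeDomain: 157,000 stars, 2,676 forks -> ~17 forks per 1,000 stars. Flagged.
assert warrants_scrutiny(157_000, 2_676)

# A hypothetical repo at Flask's healthy 235-per-1,000 ratio passes.
assert not warrants_scrutiny(1_000_000, 235_000)
```

The same shape of check works for the watcher-to-star ratio with a lower threshold.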
The problem extends to every platform where popularity metrics influence trust.
npm downloads are trivially inflatable. Developer Andy Richardson demonstrated this by using a single AWS Lambda function (free tier) to push his package is-introspection-query to nearly 1 million downloads per week - surpassing legitimate packages like urql and mobx. Zero actual users. The CMU study found that of repos with fake star campaigns, only 1.23% appeared in package registries, but of those 738 packages, 70.46% had zero dependent projects.
VS Code Marketplace extensions are similarly vulnerable. Researchers demonstrated 1,000+ installs of a fake extension in 48 hours. AquaSec found 1,283 extensions with known malicious dependencies totaling 229 million installs.
X/Twitter promotion amplifies artificial GitHub virality through engagement pods - private groups where members agree to like, repost, and comment on each other’s content. Growth Terminal sells this as a product feature. NBC News and Clemson University researchers identified a network of 686 X accounts that posted more than 130,000 times using LLM-generated content, some containing telltale artifacts like “Dolphin here!” from the uncensored Dolphin model they employed.
The Higgsfield AI case documents cross-platform astroturfing at industrial scale: over 100 confirmed spam posts across 60+ subreddits, combined with mass template DMs to content creators offering payment for promotion.
The FTC Consumer Review Rule, effective October 21, 2024, explicitly prohibits selling or buying “fake indicators of social media influence” generated by bots or fake accounts for commercial purposes. Penalties: up to $53,088 per violation. The FTC issued its first warning letters to 10 companies in December 2025. A GitHub star purchased to promote a commercial product fits this framework.
The SEC precedent is more direct. HeadSpin’s CEO was charged with wire fraud (maximum 20 years) and securities fraud for inflating metrics to deceive investors out of $80 million. ComplYant’s founder faced charges for claiming $250,000 monthly revenue when actual revenue was $250.
The SEC’s message: “Startup fundraisers cannot use the ‘fake it until you make it’ ethos to whitewash lying to investors.”
If a startup buys fake GitHub stars to inflate perceived traction during a fundraising round, and investors rely on those metrics to deploy capital, the wire fraud framework applies: using electronic communications to misrepresent material facts for financial gain. No one has been charged specifically for fake GitHub stars yet. Given the CMU research documenting the practice at scale and the FTC rule explicitly covering fake social influence metrics, it may only be a matter of time.
GitHub’s Acceptable Use Policies explicitly prohibit “inauthentic interactions, such as fake accounts and automated inauthentic activity,” “rank abuse, such as automated starring or following,” and “creation of or participation in secondary markets for the purpose of the proliferation of inauthentic activity.” The policies even specifically prohibit starring incentivized by “cryptocurrency airdrops, tokens, credits, gifts or other give-aways.”
Enforcement is reactive and asymmetric. GitHub removed 90.42% of repositories flagged by StarScout, but only 57.07% of the accounts that delivered those stars. The infrastructure for future campaigns largely remains intact. When Dagster published its investigation, fake star profiles were deleted within 48 hours - but only after public embarrassment, not proactive detection.
GitHub has never published an engineering blog post about its detection methods or enforcement statistics. No transparency report exists for star manipulation. The company’s VP of Security Operations told Wired only that they “disabled user accounts in accordance with GitHub’s Acceptable Use Policies,” declining to elaborate - though that comment was specifically about the Stargazers Ghost Network malware operation, not vanity metric manipulation.
The CMU researchers recommended GitHub adopt a weighted popularity metric based on network centrality rather than raw star counts - a change that would structurally undermine the fake star economy. GitHub has not implemented it.
Bessemer Venture Partners calls stars “vanity metrics” and instead tracks unique monthly contributor activity - anyone who created an issue, comment, PR, or commit. Fewer than 5% of top 10,000 projects ever exceeded 250 monthly contributors; only 2% sustained it across six months.
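Bessemer’s metric is straightforward to compute from an event stream. A minimal sketch, assuming events have already been normalized into (month, actor, type) tuples - the tuple shape and type labels here are illustrative, not Bessemer’s internal schema:

```python
from collections import defaultdict

# Event types that count as contribution, per Bessemer's definition above.
COUNTED = {"issue", "comment", "pr", "commit"}

def monthly_contributors(events):
    """Count unique actors per month across issues, comments, PRs, and commits.

    events: iterable of (iso_month, actor, event_type) tuples.
    """
    by_month = defaultdict(set)
    for month, actor, etype in events:
        if etype in COUNTED:
            by_month[month].add(actor)
    return {month: len(actors) for month, actors in by_month.items()}
```

Note that a star event never enters the count at all, which is the point.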
Jono Bacon at StateShift recommends five metrics that correlate with real adoption: package downloads, issue quality (production edge cases from real users), contributor retention (time to second PR), community discussion depth, and usage telemetry.
The fork-to-star ratio our analysis surfaced is the simplest first-pass filter. A healthy project has roughly 100-200 forks per 1,000 stars. Projects below 50 forks per 1,000 stars with high absolute counts deserve a closer look.
As one commenter put it: “You can fake a star count, but you can’t fake a bug fix that saves someone’s weekend.”
First, the incentive loop. VCs use stars as sourcing signals. Startups manipulate stars. VCs see inflated traction. More VCs adopt star-tracking. More startups manipulate. Redpoint’s published benchmarks - 2,850 at seed, 4,980 at Series A - effectively give startups a price list for how many stars to buy.
Second, the AI sector’s specific vulnerability. The combination of extreme hype, crypto-adjacent funding models that reward token price over product quality, and a reviewer ecosystem on X/Twitter populated partly by fabricated personas creates a perfect environment for manufactured credibility. Our analysis confirmed this: the repos with the worst manipulation signals were overwhelmingly blockchain and crypto-adjacent AI projects.
Third, GitHub’s enforcement asymmetry. Removing repos but leaving 57% of fake accounts intact preserves the labor force of the fake star economy while doing little to deter repeat offenses. Until GitHub implements structural changes - weighted popularity metrics, account-level reputation scoring, or transparent enforcement reporting - the gap between star counts and genuine developer adoption will continue to widen.
The star economy is a $50 problem with a $50 million consequence. Until the platforms, investors, and regulators catch up, the market will keep paying the $50.
...
Read the original on awesomeagents.ai »
UPDATE–Vercel, a widely used cloud platform for developing and deploying apps, has disclosed a breach of its internal systems, and says a “limited subset of customers” is affected.
The incident came to light on Sunday and the company says it has brought in an incident response provider to investigate the intrusion. The company recommends that customers check activity logs for suspicious activity and rotate environment variables as a precaution. Vercel also suggests that customers use its sensitive environment variables feature to mark things such as API keys as sensitive, which causes Vercel to store them in an unreadable format.
Vercel said the intrusion was related to the compromise of a third-party app.
“Our investigation has revealed that the incident originated from a third-party AI tool whose Google Workspace OAuth app was the subject of a broader compromise, potentially affecting hundreds of its users across many organizations,” the company said.
Vercel did not identify the app but included indicators of compromise (IOCs) for it. Given that the intrusion originated with a third-party app, there may well be other related incidents emerging in the coming hours or days.
“We’ve identified a security incident that involved unauthorized access to certain internal Vercel systems. We are actively investigating, and we have engaged incident response experts to help investigate and remediate. We have notified law enforcement and will update this page as the investigation progresses,” the company said in a statement.
“At this time, we have identified a limited subset of customers that were impacted and are engaging with them directly.”
Vercel provides a wide range of services for developers and enterprises, and has a number of offerings that are focused on agentic AI workloads.
Vercel did not specify which of its systems were compromised or how many of its customers are affected, but said it has contacted the customers that it has identified as being affected.
“Initially we identified a limited subset of customers whose Vercel credentials were compromised. We reached out to that subset and recommended an immediate rotation of credentials. If you have not been contacted, we do not have reason to believe that your Vercel credentials or personal data have been compromised at this time,” the company said.
Later on Sunday, Context, an AI provider, published a security notice related to the Vercel intrusion, saying that it had identified and halted an incident in March that turned out to be connected to Vercel’s incident. Context officials said an attacker gained access to the company’s AWS environment and compromised OAuth tokens for some of Context’s consumer users.
“Today, based on information provided by Vercel and some additional internal investigation, we learned that, during the incident last month, the unauthorized actor also likely compromised OAuth tokens for some of our consumer users. We also learned that the unauthorized actor appears to have used a compromised OAuth token to access Vercel’s Google Workspace,” the Context statement says.
“Vercel is not a Context customer, but it appears at least one Vercel employee signed up for the AI Office Suite using their Vercel enterprise account and granted ‘Allow All’ permissions. Vercel’s internal OAuth configurations appear to have allowed this action to grant these broad permissions in Vercel’s enterprise Google Workspace.”
This story was updated on April 19 to add information about the source of the intrusion and on April 20 to add information from Context.
...
Read the original on decipher.sc »
I spend a lot of time negotiating this in the software world:
And if you’re wondering why this happens, it’s normally because:
So lots of designers and product people have leapt onto 1, basically trying to turn talking to people into terms engineering people find more cuddly. Like “framework”. Or “system”. Or even that term that’s in vogue, socio-technical system.
Stop. The problem isn’t that you need a better system. The problem is you’re avoiding doing the work.
The problem is, 2 is much harder than 1. So how do you listen to people?
Listening is not the same as just doing what someone tells you they want
Tonnes of frameworks around this concept, so I won’t repeat what others have done decently already. Jobs To Be Done, Outcome Driven Innovation, and in the UX camp, empathy mapping.
You underestimate the specialism effect on your own worldview
You spend so long learning a subject that you build up a specific set of “surely they know this?!” assumptions. It can even be an area the other person is an expert in! Well, no, they don’t know it. They know other things instead. You need to understand more about what they know to be able to listen properly.
You assume “technical” is one thing
Such a common pitfall for software developers. Technical is a whole heterogeneous, beautiful spectrum of knowledge areas, and it’s not “exactly the knowledge I gained as a software developer with the exact jobs I had”. If you are still thinking of people in the binary of “technical” and “non-technical”, you will definitely be missing insights and, most likely, you’re not listening properly.
You assume everyone has the same resources as you
The same energy, the same skills, etc. So maybe you have a health condition, and you manage it a certain way, but when you chat with someone else with the same health condition, they just can’t do the things you do, or vice versa. Some people are great at maths. Some people are great at other things. Some people have less money or reserves and act more risk averse. Some people don’t. And so on.
You assume that because you met one person with one characteristic, that the rest will be like that.
See also: assuming older people don’t understand computers. Some don’t. Some do. Not every woman is your mother or daughter.
On the macro level - personalities change over time.
On the micro level - work personas are different to people at home, judgement alters when things are stressful or when certain situations arise.
This is fundamentally why a “fixed” project management approach just doesn’t work for making software. You set the requirements up front. People change in the interim. The software comes out. At the very, very best, it matches what was requested at the start. But it’s not what is wanted anymore. And people load in their own expectations, often not articulated, as they wait for The Thing, and the reality never matches all of that.
You assume what they say is the same as what they are thinking
Some people say what they mean. Some don’t. A lot of people say they say what they mean but actually aren’t doing that.
Yeah. I said it. Stop hating or dismissing people for misunderstanding the thing you documented badly. Stop assuming they are bad at their job or their lives.
If you’re dismissive of someone, you are extremely unlikely to be able to listen to them properly.
You assume 80 people are the same as 1 x 80 individuals.
Turns out, B2B is more human than B2C - all those messy relationships, dynamics, soft power vs org chart, and so on. Group dynamics add more here.
If you can’t listen to them, then you’re gonna be missing the juiciest stuff that’s gonna make you the most money, and steam you ahead of the competitors, and even, weirdly, help minimise some sources of tech debt too - turns out every misunderstanding adds a new thing in the code you gotta work with later.
Hopefully, this will give a little clue for when we fall into not listening… so we can all listen better.
...
Read the original on ashley.rolfmore.com »
Last week, popular World of Warcraft private server Turtle WoW got hit with a cease and desist from Blizzard after a judge ruled in the studio’s favor regarding a copyright infringement suit filed last September. Court documents revealed that the two parties reached a settlement that hinged on “certain actions that are required to be taken by certain parties,” and today, the other shoe dropped for anyone still playing the modded MMO: a forum post announced a complete shutdown of the project.
“Working on Turtle WoW has been the highlight of our lives,” said Turtle WoW developer Torta in the post. “The adventures you had, the battles you fought, and the friends you met are what made it all worthwhile. We hope you will cherish those moments. What we leave behind are fond memories of an 8-year-long journey, and we hope you’ll remember it every now and then.”
The servers will close on May 14, and all servers have been shot forward to the final patch “for those who want to see the new raids before the project’s sunset.” All associated social media channels, including the forum site, will close later this year on Oct. 16.
Fans of the server are saying their farewells on the subreddit and forum. “Wish I ended up playing more and dinging 60 in the end, but the time I did spend was fun. Thanks for the game and wishing everyone all the best,” wrote forum user Zeran. Reddit user ElChuppolaca wrote, “This is genuinely heartbreaking but I figured it would come seeing as they delayed any response for so long.”
If you’re unfamiliar with the server, it takes an Old School RuneScape approach to World of Warcraft’s pre-expansion era, back before you could roll a paladin on the Horde or get an epic mount without grinding for hours. There are new raids, zones, playable races, and dungeons, but nothing that raises the max level or incorporates lore from recent story arcs.
The server aimed to deliver the “Classic Plus” experience fans of vanilla WoW have clamored for since official pre-expansion servers landed, and with Blizzard teasing its own take on the idea following the end of the game’s Season of Discovery, it’s hard not to see parallels with the shutdown of Nostalrius (which came just a year before World of Warcraft Classic was announced).
Regrettably, it seems that publisher-approved fan servers like EverQuest’s Project 1999 and City of Heroes’s Homecoming are the exception and not the rule, as in the end, the Turtle WoW team’s open plea for a fan server licensing framework proved fruitless.
...
Read the original on www.pcgamer.com »
The self-driving car promised a dream; for some road users it has turned into a nightmare. An investigation reveals how Elon Musk and Tesla used public roads as a test track by rushing an AI-powered autonomous driving system to market.

The automaker kept quiet about thousands of serious incidents. Some cost drivers and passengers their lives. Other road users found themselves involved without knowing it.

The investigation draws on a massive leak of Tesla’s internal data. These documents reveal the scale of the problem: the manufacturer had known for years about the failures of its systems.

The files show thousands of customer complaints. More than 2,400 concern spontaneous acceleration, and the number of accidents exceeds 1,000. In many cases, the listed status was “unresolved”.

Some Tesla cars accelerated or braked abruptly for no reason. In artificial intelligence, these malfunctions are called “hallucinations”, much as when ChatGPT gives a completely wrong answer.

On the road, the consequences are disastrous. The autonomous driving system can misread its environment. At high speed, these errors become fatal.

“I didn’t know the autopilot existed. When I found out, I felt like a guinea pig” - Dillon Angulo, involved in an accident with a Tesla

The problem affects all road users. Many never agreed to be Tesla’s guinea pigs, yet they find themselves exposed to the failures of the “Autopilot” system.

>> Read also: Drivers still “guinea pigs” for driver-assistance systems

Naibel Benavides was 22 years old. A pedestrian, she died in an accident involving a Tesla in “Autopilot” mode. Her partner Dillon Angulo survived with serious injuries.

“I didn’t know the autopilot existed. When I found out, I felt like a guinea pig,” says Dillon Angulo, who still suffers today from the consequences of the accident.

Naibel’s family decided to sue Tesla, accusing the manufacturer of concealing crucial information. Tesla has always blamed the driver.

Investigators ran into unusual obstacles. The crash data should have been available in the vehicle’s “black box”, but Tesla claimed the data was corrupted.

The victims’ lawyers brought in experts, who managed to recover the deleted data. That information proves Tesla knew about the failure the very evening of the accident.

The car in “Autopilot” mode had detected the obstacles, yet did nothing to avoid the collision. Only an alert sounded just before impact.

A jury ordered Tesla to pay more than $243 million in damages, a first in cases involving “Autopilot”. The jurors found both Tesla and the driver responsible.

“This is a historic day for justice,” declared the victims’ lawyer. The verdict shows that manufacturers cannot use public roads as a laboratory.

Tesla tried to have the verdict overturned. In late February, a federal judge upheld the sanction against the manufacturer. The company can still appeal.

Tesla is the subject of several investigations in the United States. The Department of Justice is examining whether the manufacturer misled consumers. The National Highway Traffic Safety Administration is also investigating.

>> Read also: Tesla avoids a lengthy trial over its driver-assistance technology

Whistleblowers have testified to the authorities. They describe a company that prioritizes speed at the expense of safety. The test version of autonomous driving was rushed to market even though several employees had warned management about the dangers of “Autopilot”.

Experts expect further lawsuits to follow. The first verdict paves the way for new trials against Tesla.
...
Read the original on www.rts.ch »
Five of those six users have placed no more bets since, but one account’s recent activity shows it has subsequently made $163,000 by correctly betting on a US-Iran ceasefire by 7 April, which was announced by Washington and Tehran on that day.
...
Read the original on www.bbc.com »
Swiss authorities want to reduce dependency on Microsoft
The Swiss government is aiming to gradually shift away from a dependency on Microsoft products, according to the NZZ am Sonntag newspaper.
A spokesman for the Federal Chancellery told the newspaper that the federal administration “aims to reduce its dependency on Microsoft, step by step and in the long term”.
This comes as a surprise, as Microsoft 365 was recently installed on some 54,000 administration workstations — despite concerns about data security. Calls for alternatives previously met with internal resistance and charges of “tinkering”, the NZZ am Sonntag writes.
However, former army chief Thomas Süssli called for alternative solutions to be examined more quickly. A feasibility study now shows that replacement with open-source software is possible. Germany serves as a reference: there, work is underway on an independent open-source solution in which Bern is also interested.
The German state of Schleswig-Holstein has already switched over its administration. Open-source software can be used freely, while it can also be further developed independently of corporations.
Swiss authorities have spent a tidy amount on Microsoft software in recent years: an investigation by SRF last year showed that the federal government and cantons spent over CHF1.1 billion ($1.4 billion) on licences with the tech giant over the past ten years.
The Trump administration and its approach to the rule of law are increasing concerns among users of US technology. This is because US law — thanks to the 2018 Cloud Act — allows the government to access all data stored by US tech corporations.
This means that if data is stored on servers or clouds of US firms such as Microsoft, Apple or Adobe — no matter where in the world — US authorities may request this data from the US corporations. This could even be the case if the servers are in Switzerland. Users generally have no idea which authority is accessing the data nor what is being done with it.
...
Read the original on www.swissinfo.ch »
⚠ This data may be out of date or incorrect. A research project is ongoing to further develop such maps.
A map of all ~2,100 Swiss municipalities showing which provider handles their official email — grouped by jurisdiction — based on public DNS records and other public network signals.
Digital sovereignty: US-based providers are subject to the US CLOUD Act, which allows US authorities to request stored data, regardless of where it is physically hosted. This map makes the current provider landscape visible.
Each municipality’s official domain is checked via 11 signals from DNS records, SMTP banners, ASN lookups, and a public Microsoft API endpoint, then classified by provider type with confidence scoring.
Disclaimer: DNS records indicate mail routing and authorized senders, not necessarily where data is stored.
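The classification step described above can be sketched as a tiny suffix-based MX classifier. Everything here is an illustrative assumption: the real mxmap.ch pipeline combines 11 signals (DNS, SMTP banners, ASN lookups, a Microsoft API endpoint) and its own provider table, which this sketch does not reproduce.

```python
# Minimal sketch: classify a domain's mail provider from its MX hostnames.
# The suffix table is an illustrative assumption, not mxmap.ch's rule set.

PROVIDER_SUFFIXES = {
    ".protection.outlook.com": "Microsoft 365",   # Exchange Online routing
    ".google.com": "Google Workspace",
    ".infomaniak.ch": "Swiss hoster (hypothetical entry)",
}

def classify_mx(mx_hosts):
    """Return (provider, confidence) for a list of MX hostnames.

    Confidence is the fraction of MX hosts matching the winning provider;
    a domain with no recognized suffix is labelled 'unknown'.
    """
    votes = {}
    for host in mx_hosts:
        host = host.rstrip(".").lower()
        for suffix, provider in PROVIDER_SUFFIXES.items():
            if host.endswith(suffix):
                votes[provider] = votes.get(provider, 0) + 1
    if not votes:
        return ("unknown", 0.0)
    winner = max(votes, key=votes.get)
    return (winner, votes[winner] / len(mx_hosts))

# Example: a municipality routing mail through Exchange Online.
print(classify_mx(["example-ch.mail.protection.outlook.com."]))
```

As the map's own disclaimer notes, such a lookup only shows where mail is routed, not where data is stored, which is why a confidence score and multiple corroborating signals matter.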
The code and data are on GitHub.
If you have noticed an error, please submit an issue.
...
Read the original on mxmap.ch »
The U.S.-Israeli war with Iran, now in an unstable ceasefire, has exposed a structural failure in the global semiconductor memory supply chain, and it is not the one analysts seem to be tracking. The story receiving attention is helium: Qatar’s Ras Laffan facility went offline, a 45-day inventory clock started running, and spot prices doubled within days. The story receiving almost no attention is bromine, and it is potentially the more dangerous one. Bromine is the raw material from which specialized chemical suppliers produce semiconductor-grade hydrogen bromide gas, the etch chemical that South Korean fabs use to carve the transistor structures in every Dynamic Random-Access Memory (DRAM) and NAND flash chip on earth. A DRAM chip powers active computation and loses its contents the moment power is cut. A NAND chip retains data without power and underlies every form of digital storage. Together they underpin every modern computing device, from the phone in your pocket to the data center running your AI applications.
South Korea sources 97.5 percent of its bromine imports from Israel. Beyond that vulnerable concentration, converting bromine into semiconductor-grade hydrogen bromide gas requires dedicated purification infrastructure, and producers outside Israel are already fully committed to existing customers and stretched too thin to absorb additional demand. Building new conversion capacity takes years of permitting, equipment procurement, and fabrication qualification.
ICL Group, the Israeli multinational formerly known as Israel Chemicals Ltd., currently continues Dead Sea operations. Israel routes most trade through Mediterranean ports at Haifa and Ashdod, bypassing the Strait of Hormuz entirely. But Iran has been striking the Negev — Israel’s southern desert and the heart of its defense and industrial infrastructure — with ballistic missiles for three weeks, hitting Dimona and Arad, both within 35 kilometers of ICL’s Dead Sea extraction and conversion complex. If Israeli bromine production is displaced, there are no conversion facilities outside Israel capable of immediately producing semiconductor-grade hydrogen bromide gas at the scale required to replace it, and policymakers have not yet acted on that fact.
The vulnerability sits in plain sight, within missile range and outside any meaningful policy response. A disruption would be immediate and global. Within weeks, shortages would propagate across everything from consumer devices to military systems.
Bromine’s role in semiconductor manufacturing is specific and non-substitutable. Its primary derivative, hydrogen bromide, is consumed at the polysilicon etching stage foundational to both DRAM and NAND flash production. Each DRAM memory cell requires a polysilicon gate electrode etched with extreme precision over a silicon oxide layer as thin as 20 angstroms. Hydrogen bromide gas plasmas achieve a polysilicon-to-oxide selectivity ratio of 100 to 1, while chlorine-based alternatives achieve roughly 30 to 1. At advanced DRAM node geometries, that is the difference between a functional transistor and a destroyed one. Bromine also appears in chemical vapor deposition processes and chip packaging. There is no viable near-term substitute in any of these applications.
Three structural realities determine why the gap cannot be bridged through market reallocation. First, bromine already converted for industrial use such as flame retardants and drilling fluids cannot be reconverted. Those processes are chemically irreversible at any industrial scale and the resulting compounds cannot meet the parts-per-billion purity specifications that fabrication facilities require. The two supply chains draw from the same raw material but diverge permanently at the point of conversion. Second, converting raw bromine to semiconductor-grade hydrogen bromide gas requires dedicated purification infrastructure, specifically gas-phase distillation columns capable of lowering trace metals to parts-per-billion contamination levels. That infrastructure does not exist at scale outside the existing semiconductor chemical supply chain, and building more facilities requires permitting, equipment procurement, testing, and fabrication qualification measured in years. Third, producers such as Resonac, Air Liquide, and Adeka manufacture semiconductor-grade hydrogen bromide gas outside Israel, but their combined capacity is already committed to existing customers: Taiwan Semiconductor Manufacturing Company, the world’s dominant contract chipmaker; Samsung, the leading producer of DRAM and high-bandwidth memory; and Semiconductor Manufacturing International Corporation, China’s largest state-backed foundry. Critically, those customers are not holding steady: AI infrastructure buildout is accelerating demand across the board, meaning outside producers are stretched thin against a growing baseline. Even if outside producers could expand output, South Korean facilities would be competing for that capacity with Taiwan, Samsung’s own logic plants, and China, all of whom face the same accelerating demand.
The Dead Sea is among the most bromine-rich bodies of water on earth. ICL Group, which extracts at the lowest cost of any producer globally, dominates a global supply of which Israel and Jordan together account for roughly two thirds. Critically, ICL’s hydrogen bromide gas production, including the semiconductor-grade output supplied to South Korean fabrication plants, is manufactured at the same Sodom facility where extraction occurs, meaning extraction and conversion infrastructure are co-located in the same vulnerable corridor. Iranian missiles have already penetrated Israeli air defenses in the Negev on multiple occasions, wounding nearly 200 people in Dimona and Arad, both in the same geographic corridor as ICL’s production and conversion sites.
The mechanism of disruption does not require a direct hit on an ICL facility. War risk insurance for vessel calls at Israeli ports has already risen from 0.2 percent to between 0.7 and 1.0 percent of vessel value per seven-day call, adding up to $500,000 in costs per voyage on a mid-sized cargo ship. Even for ships routed through the Mediterranean rather than the Red Sea, those insurance costs apply the moment a vessel calls at an Israeli port. The war risk premium follows the port, not the route. ZIM, Israel’s primary shipping line, has implemented a “war risk premium surcharge” on all cargo to and from Israel. Haifa oil refinery — the country’s largest — was shut down after its power station was damaged in an Iranian attack, demonstrating that critical industrial infrastructure does not require a direct strike to be forced offline. The downstream consequences of even a partial disruption to that corridor would propagate immediately across the global memory supply chain.
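The insurance figures above can be checked with simple arithmetic. The vessel value below is an illustrative assumption, not a number from the article; the rates (0.2% pre-war, 0.7–1.0% now) come from the text, and the high-end increase lands near the article's "up to $500,000" per voyage.

```python
# Back-of-the-envelope war risk premium per seven-day Israeli port call.
vessel_value = 65_000_000        # mid-sized cargo ship, assumed value in USD

pre_war_rate = 0.002             # 0.2% of vessel value per call (from article)
war_rate_low, war_rate_high = 0.007, 0.010   # 0.7%-1.0% per call (from article)

baseline = vessel_value * pre_war_rate
premium_range = (vessel_value * war_rate_low, vessel_value * war_rate_high)
added_cost = premium_range[0] - baseline     # low-end increase per voyage

print(f"baseline premium:  ${baseline:,.0f}")
print(f"war-risk premium:  ${premium_range[0]:,.0f} to ${premium_range[1]:,.0f}")
print(f"added cost (low):  ${added_cost:,.0f}")
```

On these assumed numbers the premium jumps from roughly $130,000 to between $455,000 and $650,000 per call, so the added cost applies whenever the vessel calls at an Israeli port, regardless of route.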
Samsung and SK hynix together dominate approximately 70 percent of the global DRAM market. SK hynix alone holds roughly 57 percent of the high bandwidth memory market. Since DRAM and NAND underpin every modern computing device, a supply disruption would propagate across the full consumer and industrial electronics stack, not only AI infrastructure. High bandwidth memory — a specialized form of DRAM stacked vertically to deliver the data speeds that AI accelerators such as Nvidia’s graphics processing units require — is sold out through 2026, and DRAM suppliers hold only two to three weeks of inventory. A shortage would force both companies to allocate scarce hydrogen bromide gas to their highest-value lines — high bandwidth memory for AI accelerators — at the expense of commodity DRAM and NAND used in phones, personal computers, laptops, and data storage. The consequences fall hardest across Africa, South Asia, and Latin America, where memory already accounts for 15 to 20 percent of the bill of materials for a mid-range smartphone. That share rises sharply for budget devices, the primary gateway to digital participation across Africa, South Asia, and Latin America. Smartphone prices in Bangladesh have already risen 10 to 25 percent in 2026 as a direct result of DRAM and NAND inflation, with similar increases reported in Nigeria and South Africa. Budget smartphones are reverting to 4 gigabytes of RAM in 2026, precisely as on-device AI features demand more, not less. A bromine supply shock would price hundreds of millions of people out of the devices through which they access banking, education, healthcare, and economic opportunity.
The exposure extends beyond commercial technology. The majority of guidance systems, radar modules, and electronic warfare packages fielded by the U.S. military run on DRAM and NAND flash chips sourced from the same commercial facilities, on the same allocation logic, with less procurement flexibility than commercial customers. Since the Defense Department shifted to commercial off-the-shelf procurement in the 1990s, there is no separate defense-grade memory supply chain. A shortage that forces Samsung and SK hynix to prioritize high-margin high bandwidth memory for AI customers would deprioritize the commodity DRAM used in precision-guided munitions, intelligence platforms, and shipboard radar systems, with no government visibility into how that allocation decision gets made. The same war straining ICL’s operational continuity is simultaneously depleting munitions stockpiles whose guidance systems depend on the same memory supply chain. The supply stress and the demand spike are running in the same direction at the same time.
The consequences for American AI follow directly from the South Korean exposure but run through a supply chain that most U.S. policymakers have never traced. Every Nvidia Blackwell and Rubin graphics processing unit requires high-bandwidth memory stacks that come almost entirely from SK hynix and Samsung, as SK hynix is Nvidia’s primary high-bandwidth memory supplier for both platforms. Microsoft, Amazon, Google, and Meta are deploying hundreds of billions of dollars in AI infrastructure on delivery schedules that assume South Korean plants will have uninterrupted access to the etch chemicals they need. A bromine disruption produces delivery slippage, renegotiated contracts, higher spot prices, and delayed server deployments.
Three levers are available, and they require action simultaneously. First, the most immediate is physical pre-positioning. Arkansas bromine from Albemarle and TETRA Technologies cannot be used directly in chip production, but it could serve as feedstock for semiconductor-grade hydrogen bromide gas conversion if that infrastructure existed, which is precisely the gap that ought to be closed. South Korean companies could also establish bromine forward contracts locking in supply and price for 12 to 18 months.
Second, the single most important structural action is the one with the longest lead time: building semiconductor-grade hydrogen bromide gas conversion capacity outside Israel. The Chip 4 framework should be extended to include a critical materials annex with a coordinated allied program to site, permit, and fund dedicated gas-phase distillation infrastructure capable of achieving parts-per-billion purity in geographically diversified locations — particularly in South Korea, Japan, and the United States. Private firms will not build conversion infrastructure at this scale and speed without government mandate, offtake guarantees and permitting priority.
Third, each government should act in its own lane, but in a coordinated fashion. South Korea should designate bromine a critical mineral, mandate minimum inventory levels, and fund domestic conversion infrastructure jointly with Samsung and SK hynix. The United States should add bromine, semiconductor-grade hydrogen bromide gas, and the full range of specialty gases derived from bromine to the critical minerals list, and use Defense Production Act authority and CHIPS and Science Act funding to co-invest with allies in purification capacity on allied soil. Israel should formalize bromine as a strategic export commodity, harden ICL’s production sites against missile attack, and use the 2030 Dead Sea concession expiration to bring in allied capital in exchange for long-term supply priority.
In sum, the bromine risk sits outside every dashboard anyone is monitoring. The structural failure is not the war: It is that the global memory supply chain has built itself around a conversion chokepoint with no redundancy and no fallback. If ICL’s Sodom facility goes offline, the gap does not get filled. The action that matters most — building semiconductor-grade hydrogen bromide gas conversion capacity outside Israel — takes years. The actions available now — forward contracts, inventory mandates, and Arkansas feedstock development — buy months at best, not years. That gap is precisely why these three countries should move now, before an Iranian ballistic missile makes the answer irrelevant.
Alvin Camba, Ph.D., is lead scientist and director of research at Lyvi. He is also a nonresident fellow in the Indo-Pacific Security Initiative at the Atlantic Council’s Scowcroft Center for Strategy and Security, and a senior research fellow at Associated Universities, Inc. His book on Chinese megaprojects and coalition politics in Southeast Asia is in production at Cornell University Press.
Please note, as a matter of house style, War on the Rocks will not use a different name for the U.S. Department of Defense until and unless the name is changed by statute by the U.S. Congress.
...
Read the original on warontherocks.com »
...
Read the original on www.legalnomads.com »
To add this web app to your iOS home screen tap the share button and select "Add to the Home Screen".
10HN is also available as an iOS App
If you visit 10HN only rarely, check out the best articles from the past week.
If you like 10HN please leave feedback and share
Visit pancik.com for more.