10 interesting stories served every morning and every evening.
An open-source intelligence investigation into how Meta Platforms built a multi-channel influence operation to pass age verification laws that shift regulatory burden from social media platforms onto Apple and Google’s app stores.
Every finding in this repository is sourced from public records: IRS 990 filings, Senate LD-2 lobbying disclosures, state lobbying registrations, campaign finance databases, corporate registries, WHOIS/DNS records, Wayback Machine archives, and investigative journalism.
Status: Active investigation. 47 proven findings, 9 structurally possible but unproven hypotheses, and multiple pending FOIA responses.
Meta spent a record $26.3 million on federal lobbying in 2025, deployed 86+ lobbyists across 45 states, and covertly funded a “grassroots” child safety group called the Digital Childhood Alliance (DCA) to advocate for the App Store Accountability Act (ASAA). The ASAA requires app stores to verify user ages before downloads but imposes no requirements on social media platforms. If it becomes law, Apple and Google absorb the compliance cost while Meta’s apps face zero new mandates.
This investigation traced funding flows across five confirmed channels, analyzed $2.0 billion in dark money grants, searched 59,736 DAF recipients, parsed LD-2 filings, and mapped campaign contributions across four states to document the operation.
Meta’s federal lobbying spending jumped from $19M (2022-2023) to $24M (2024) to $26.3M (2025) as ASAA bills were introduced in roughly 20 states. In Louisiana alone, 12 lobbyists were deployed for a single bill that passed 99-0.
Across all five Arabella Advisors entities (New Venture Fund, Sixteen Thirty Fund, North Fund, Windward Fund, Hopewell Fund), 4,433 grants totaling approximately $2.0 billion were analyzed. Not a single dollar went to any child safety, age verification, or tech policy organization. The Schedule I grant pathway through the Arabella network is definitively ruled out.
Five confirmed channels connect Meta’s spending to ASAA advocacy: direct federal lobbying ($26.3M), state lobbyist networks (45 states), the Digital Childhood Alliance (astroturf 501(c)(4)), super PACs ($70M+), and state legislative campaigns (3 laws passed). A sixth channel through the Arabella dark money network is structurally possible but unproven.
These standalone HTML documents provide detailed views of the investigation:
Full Investigation Documentation contains the complete OSINT investigation report with all five channels, evidence tables, and source citations.
Funding Network Timeline maps the chronological development of Meta’s lobbying infrastructure, DCA’s formation, and ASAA legislative progress across states.
Research Timeline tracks the investigation itself, showing when each finding was established and how threads connected.
Meta retained 40+ lobbying firms and 87 federal lobbyists in 2025 (85% with prior government service). Meta’s own LD-2 filings with the Senate explicitly list H. R. 3149/S. 1586, the App Store Accountability Act, as a lobbied bill. The filing narrative includes “protecting children, bullying prevention and online safety; youth safety and federal parental approval; youth restrictions on social media.”
At the state level, confirmed operations include $338,500 to Headwaters Strategies (Colorado), $324,992+ across 9 firms and 12 lobbyists in Louisiana, and $1,036,728 in direct California lobbying (Q1-Q3 2025 alone). A Meta lobbyist brought the legislative language for Louisiana HB-570 directly to the bill’s sponsor, Rep. Kim Carver, who confirmed this publicly.
DCA is a 501(c)(4) advocacy group that Meta covertly funds. Bloomberg exposed the funding relationship in July 2025. Under oath at a Louisiana Senate committee hearing, Executive Director Casey Stefanski admitted receiving tech company funding but refused to name donors.
DCA has no EIN in the IRS Business Master File, no incorporation record in any state registry searched (CO, DC, DE, VA, OpenCorporates), and no Form 990 on file. It processes donations through the For Good DAF (formerly Network for Good) as a “Project,” not a standalone nonprofit. Its likely fiscal sponsor is NCOSEAction/Institute for Public Policy (EIN 88-1180705), NCOSE’s confirmed 501(c)(4) affiliate with the same leadership.
DCA’s domain was registered December 18, 2024. The website was live and fully formed the next day. Every blog post and testimony targets Apple and Google. Meta is never mentioned or criticized.
Meta committed over $70 million to four state-level super PACs: ATEP ($45M, bipartisan, co-led by Hilltop Public Solutions), META California ($20M), California Leads ($5M), and Forge the Future (Texas, Republican-aligned). Forge the Future’s stated policy priority is “empowering parents with oversight of children’s online activities,” which mirrors ASAA language exactly.
Hilltop Public Solutions co-leads the $45M ATEP super PAC and is also involved in DCA’s messaging coordination, making it the first firm confirmed in both Meta’s PAC operation and the astroturf advocacy track.
All super PACs are registered at the state level rather than with the FEC, scattering disclosure filings across individual state ethics commissions instead of a single searchable federal database.
Meta’s Colorado lobbyist Adam Eichberg simultaneously serves as Board Chair of the New Venture Fund, the flagship 501(c)(3) of the Arabella Advisors network. NVF transfers $121.3 million annually to the Sixteen Thirty Fund, a 501(c)(4) with no donor disclosure requirements.
The Arabella network operates four entities from 1828 L Street NW, Washington DC (suites 300-A through 300-D) with combined annual revenue exceeding $1.3 billion. All five entities’ grant recipients were analyzed (4,433 grants, approximately $2.0 billion). Zero dollars went to any child safety organization, definitively ruling out the Schedule I grant pathway.
If Meta money flows through the Arabella network to DCA, it would have to travel via fiscal sponsorship, consulting fees, or lobbying expenditures, which are more opaque than grant disclosures.
ASAA has been signed into law in three states: Utah (SB-142, March 2025), Louisiana (HB-570, June 2025), and Texas (SB 2420, May 2025; paused by a judge in December 2025).
Roughly 17 additional states have introduced or are considering ASAA bills, including Kansas, South Carolina, Ohio, Georgia, and Florida. The federal version was introduced in May 2025 by Rep. John James (R-MI) and Sen. Mike Lee (R-UT).
Each finding below is documented with sources in the corresponding analysis file.
Meta funds DCA, confirmed by Bloomberg reporters and partially admitted by Stefanski under oath at the Louisiana Senate Commerce Committee hearing (April 2025). Sources: Insurance Journal/Bloomberg July 2025, Deseret News Dec 2025, The Center Square LA.
Meta deployed 86+ lobbyists across 45 states for ASAA and related campaigns. Source: OpenSecrets, state lobbying registrations.
Meta spent $26.3 million on federal lobbying in 2025, an all-time record exceeding Lockheed Martin and Boeing. Source: OpenSecrets, Quiver Quantitative, Dome Politics.
Meta paid Headwaters Strategies $338,500 for Colorado lobbying between 2019 and 2026. Source: Colorado SOS SODA API.
Adam Eichberg simultaneously co-founded Meta’s Colorado lobbying firm (Headwaters Strategies) and chairs the New Venture Fund board. Sources: Headwaters Strategies website, NVF board page, InfluenceWatch.
NVF does not directly fund any child safety or tech policy organizations via Schedule I grants. Source: NVF Form 990 Schedule I analysis, 2,669 recipients.
DCA and DCI share infrastructure: same registrar (GoDaddy), CDN (Cloudflare), email (Microsoft 365), and marketing platform (Elastic Email). Source: DNS/WHOIS analysis.
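The shared-infrastructure finding reduces to comparing provider fingerprints for the two domains. A minimal sketch, with the lookup results hard-coded from the finding above (a live check would pull them from WHOIS and NS/MX/TXT queries, e.g. with `dig` or dnspython):

```python
# Shared-infrastructure check: which provider categories do two domains
# have in common? Values below are taken from the finding above; in
# practice they come from WHOIS and DNS lookups.
def shared_providers(a: dict, b: dict) -> set:
    """Return the provider categories where both domains use the same vendor."""
    return {k for k in a if a[k] == b.get(k)}

dca = {"registrar": "GoDaddy", "cdn": "Cloudflare",
       "email": "Microsoft 365", "marketing": "Elastic Email"}
dci = {"registrar": "GoDaddy", "cdn": "Cloudflare",
       "email": "Microsoft 365", "marketing": "Elastic Email"}

print(sorted(shared_providers(dca, dci)))
# → ['cdn', 'email', 'marketing', 'registrar']
```

Four-for-four overlap on independently chosen vendors is what makes the shared-infrastructure inference stronger than any single match would be.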
Pelican State Partners represents Meta as a lobbying client in Louisiana. Source: F Minus database, LA Board of Ethics.
DCA leadership comes from NCOSE: three of four senior staff have NCOSE connections (Stefanski, Hawkins, McKay). Source: DCA website, NCOSE public records.
ASAA has been signed into law in three states: Utah (SB-142, March 2025), Louisiana (HB-570, June 2025), and Texas (SB 2420, May 2025, paused by judge December 2025). Sources: State legislature records, news coverage.
The Sixteen Thirty Fund does not fund any child safety or tech policy organizations via Schedule I grants (306 of 318 recipients analyzed). Source: STF Form 990 Schedule I, 2024.
All five Arabella entities analyzed: 4,433 grants (approximately $2.0 billion) with zero dollars going to child safety or tech policy organizations. Schedule I pathway definitively ruled out across the entire network. Sources: NVF, STF, North Fund, Windward, Hopewell Form 990 Schedule I filings via ProPublica.
A Meta employee (Jake Levine, Product Manager) contributed $1,175 to ASAA sponsor Matt Ball’s campaign apparatus on June 2, 2025. Source: Colorado TRACER bulk data.
A Google Policy Manager (Kyle Gardner) also contributed $450 to Matt Ball. Multiple tech company employees from ASAA-affected companies targeted the same ASAA bill sponsor. Source: Colorado TRACER bulk data.
Eichberg and Coyne (Headwaters principals) did not contribute to ASAA bill sponsors Ball or Paschal despite $20,000+ combined political giving. Source: Colorado TRACER bulk data.
No direct Meta PAC contributions to any ASAA sponsor across Utah, Louisiana, Texas, or Colorado. Source: FollowTheMoney.org multi-state search.
Todd Weiler (Utah SB-142 sponsor) does not accept corporate contributions and has not discussed ASAA directly with Meta. DCA served as the policy intermediary. Source: Investigative reporting, Weiler’s public statements.
DCA has no EIN in the IRS Business Master File. Not found in any of four regional extracts (eo1-eo4.csv) covering all US tax-exempt organizations. Source: IRS BMF regional extracts.
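The BMF check amounts to a substring scan over the NAME column of the four regional extracts. A hedged sketch (the column layout and sample rows here are illustrative stand-ins; the real eo1–eo4 extracts carry many more columns and all US tax-exempt organizations):

```python
import csv
import io

def find_org(rows, needle):
    """Return rows whose NAME field contains the search string.

    The IRS Business Master File stores names in upper case, so the
    query is normalized the same way.
    """
    needle = needle.upper()
    return [r for r in rows if needle in r["NAME"]]

# Illustrative stand-in for one regional extract (eo1.csv .. eo4.csv).
# EINs are from the findings above; the NAME spellings are assumptions.
sample = io.StringIO(
    "EIN,NAME,CITY,STATE\n"
    "881180705,NCOSE ACTION INSTITUTE FOR PUBLIC POLICY,WASHINGTON,DC\n"
    "393684798,DIGITAL CHILDHOOD INSTITUTE,WILMINGTON,DE\n"
)
rows = list(csv.DictReader(sample))

print(find_org(rows, "Digital Childhood Alliance"))  # → [] (no EIN on file)
print(len(find_org(rows, "Digital Childhood")))      # → 1 (DCI, not DCA)
```

The same scan run over the real extracts is what distinguishes DCA (absent) from DCI (present with EIN 39-3684798).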
DCI confirmed in IRS BMF with EIN 39-3684798, Delaware incorporation at 213 N Market St Wilmington, IRS ruling November 2025. Source: IRS BMF extract.
Meta’s Forge the Future super PAC spent $1.3 million in Texas ahead of March 2026 primaries. Source: Texas Ethics Commission filings, news coverage.
DCA’s website deployed less than 24 hours after domain registration: fully functional advocacy site with professional design, statistics, and Heritage/NCOSE testimonials. Source: Wayback Machine CDX API, 100+ snapshots.
77-day pipeline from DCA domain registration (December 18, 2024) to Utah SB-142 signing (March 5, 2025). Site pre-loaded with ASAA talking points before any bill had passed. Source: WHOIS records, Utah Legislature.
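Both timing claims are easy to re-derive. The sketch below checks the 77-day arithmetic and shows the shape of a Wayback CDX first-capture comparison; the CDX row is an illustrative stand-in, not a real snapshot record:

```python
from datetime import date, datetime

# WHOIS creation date for digitalchildhoodalliance.org and the Utah
# SB-142 signing date, both from the findings above.
registered = date(2024, 12, 18)
utah_signed = date(2025, 3, 5)
print((utah_signed - registered).days)  # → 77

# First-capture check against the Wayback CDX API. The row mimics the
# API's JSON layout ([urlkey, timestamp, original, mimetype, status,
# digest, length]); the timestamp value is illustrative. A live query:
# https://web.archive.org/cdx/search/cdx?url=digitalchildhoodalliance.org&output=json&limit=1
cdx_first = ["org,digitalchildhoodalliance)/", "20241219120000",
             "https://digitalchildhoodalliance.org/", "text/html",
             "200", "DIGEST", "51234"]
first_capture = datetime.strptime(cdx_first[1], "%Y%m%d%H%M%S").date()
print((first_capture - registered).days)  # → 1
```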
Meta deployed 12 lobbyists for Louisiana HB-570, which passed 99-0. Disproportionate deployment indicates text-control and amendment-blocking rather than vote persuasion. Source: Investigative reporting, LA Board of Ethics.
Three California tech policy employees from Meta, Google, and Pinterest contributed to Matt Ball within 90 days. All from ASAA-affected companies, all out-of-state, targeting a newly-appointed senator. Source: Colorado TRACER bulk data.
Pelican State Partners represents both Meta and Roblox in Louisiana. Both are ASAA beneficiaries, enabling “broad industry support” framing. Source: F Minus database.
DCA’s coalition count inflated from 50+ to 140+ with only six organizations ever publicly named. No member list has been published on the website. Source: DCA website, Wayback Machine.
NCOSE has a confirmed 501(c)(4) affiliate: NCOSEAction / Institute for Public Policy (EIN 88-1180705), IRS ruling May 2025, same address and leadership as NCOSE. Source: IRS BMF, NCOSE website.
Network for Good is a Donor Advised Fund, not a payment processor. DCA is classified as “Project” (ID 258136) in the system. For Good explicitly limits grants to 501(c)(3) organizations. Source: For Good website, IRS determination.
A Meta lobbyist drafted HB-570’s legislative language, confirmed by sponsor Rep. Kim Carver. The bill as originally written placed age verification burden exclusively on app stores, not platforms. Source: Investigative reporting, Carver’s public confirmation.
Nicole Lopez (Meta Director of Global Litigation Strategy for Youth) testified in both Louisiana and South Dakota for ASAA bills, serving as Meta’s national ASAA spokesperson. Source: Legislative hearing records.
The Sixteen Thirty Fund’s $31 million lobbying budget and $13.1 million in professional fees contain zero mentions of child safety, digital policy, age verification, or app stores. Source: STF Form 990 Part IX.
John R. Read (DCA Senior Policy Advisor) lists “Digital Childhood Alliance” as his employer in Colorado TRACER records. Contributed $100 to AG candidate Hetal Doshi (October 2025). Source: Colorado TRACER.
Matt Ball received 8% of total fundraising from tech industry employees. He is the only 2026 Colorado senate candidate with contributions from Meta, Pinterest, Instacart, Anthropic, and Google employees. Four of eight dual-maxed donors are tech employees. Source: Colorado TRACER analysis.
NCOSE Schedule R reveals a two-entity evolution: the original NCOSE Action (EIN 86-2458921, c4 reclassified to c3) was replaced by the Institute for Public Policy (EIN 88-1180705, c4). All 19 NCOSE-to-Institute transaction indicators are marked “No” despite shared leadership. Source: NCOSE Form 990 Schedule R, 2019-2023.
For Good DAF pathway definitively ruled out: 59,736 grant recipients across five years (approximately $1.73 billion) searched with zero matches for DCA, DCI, NCOSE, NCOSEAction, or any related entity. Source: For Good DAF grant data.
NCOSE lobbying spending tripled from $78,000 to $204,000 concurrent with DCA launch and the ASAA legislative push (FY2023 to FY2024). Source: NCOSE Form 990 Part IX.
Forge the Future super PAC explicitly lists an ASAA-aligned policy priority: “Empowering parents with oversight of children’s online activities across devices and digital environments.” Source: Forge the Future filings.
Hilltop Public Solutions bridges Meta’s super PAC and DCA operations. It co-leads ATEP ($45M) and is involved in DCA messaging coordination. First firm confirmed in both tracks. Source: ATEP filings, investigative reporting.
Meta super PACs are state-level entities (not FEC-registered), deliberately scattering filings across state ethics commissions to avoid centralized searchability. Source: FEC search (negative), state PAC registrations.
Meta’s total documented political spending exceeds $70 million: $45M ATEP, $20M META California, $5M California Leads, with downstream flows to Forge the Future (TX) and Making Our Tomorrow (IL). Source: State PAC filings, news coverage.
Casey Stefanski never appears on any NCOSE 990 filing despite reportedly working there ten years. Not among officers, directors, key employees, or five highest-compensated. Source: NCOSE Form 990 filings, 2015-2023.
Meta’s LD-2 filings explicitly list the App Store Accountability Act (H. R. 3149/S. 1586) as a lobbied bill. This is the first direct evidence from Meta’s own federal filings connecting its $26.3M lobbying spend to the specific legislation DCA advocates for. Source: Senate LDA filing UUID b73445ed-15e5-42e7-a1e8-aeb224755267.
Meta simultaneously lobbies FOR ASAA and ON KOSA/COPPA 2.0, supporting legislation that burdens Apple and Google while opposing or amending legislation that would regulate Meta directly. Both appear in the same LD-2 filing. Source: Meta LD-2 Q1-Q2 2025.
LD-2 narrative mirrors DCA messaging: “youth safety and federal parental approval” framing in Meta’s federal filings matches DCA’s “parental approval” and “child protection” advocacy language. Source: LD-2 filing CPI issue code narrative.
The following hypotheses are structurally possible but unproven. Meta funds may flow through the Arabella network via non-grant mechanisms (fiscal sponsorship, consulting fees, lobbying expenditures); the Schedule I and For Good DAF pathways are both ruled out.
DCA operates under NCOSEAction (EIN 88-1180705) as fiscal sponsor. The personnel chain is direct (van der Watt to Hawkins to Stefanski), but NCOSE reports zero transactions with its c4 affiliate.
Jake Levine’s contribution to Matt Ball was coordinated by Meta’s government affairs team rather than being purely personal.
Angela Paxton (Texas ASAA sponsor) was among the unnamed state senators supported by Forge the Future.
NCOSE’s lobbying spend tripling is causally related to DCA/ASAA activity (timing is concurrent but program descriptions do not mention ASAA).
DCA’s For Good donation page is cosmetic. Actual funding comes directly from Meta, not small-dollar DAF donations.
This investigation used Claude Code (Anthropic’s CLI tool, running Claude Opus) as a research assistant for:
* Bulk data processing: parsing 4,433 IRS Schedule I grant records, 59,736 DAF recipients, 132MB of Colorado TRACER campaign finance data, and IRS Business Master File extracts covering all US tax-exempt organizations
* Cross-referencing findings across 24 analysis files and identifying patterns that span multiple research threads
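The grant rule-outs in the findings above were keyword screens over recipient names. A minimal sketch of that kind of screen, using hypothetical grant records in place of the real Schedule I and DAF data:

```python
# Keyword screen of the kind run against the 4,433 Arabella Schedule I
# grants and 59,736 For Good DAF recipients. The grant records below are
# hypothetical stand-ins; the keyword list is an assumption.
KEYWORDS = ("CHILD SAFETY", "AGE VERIFICATION", "DIGITAL CHILDHOOD",
            "APP STORE", "NCOSE")

def flag_grants(grants):
    """Return grants whose recipient name contains any screening keyword."""
    hits = []
    for g in grants:
        name = g["recipient"].upper()
        if any(k in name for k in KEYWORDS):
            hits.append(g)
    return hits

grants = [
    {"recipient": "Civic Engagement Fund", "amount": 250_000},
    {"recipient": "State Voices Network", "amount": 1_000_000},
]
print(flag_grants(grants))               # → [] : no child-safety recipients
print(sum(g["amount"] for g in grants))  # total dollars screened
```

A zero-hit result over the full recipient lists is what supports the "Schedule I pathway ruled out" findings; the screen proves absence of matches, not absence of funding via other mechanisms.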
Claude Code did not independently choose what to investigate, decide what constitutes a finding, or determine what to publish. Every factual claim in this repository cites a primary source (IRS filing, Senate disclosure, state database, legislative record, or published reporting) that can be independently verified. The tool does not change whether Meta’s LD-2 filing lists H. R. 3149, whether DCA has an EIN, or whether Stefanski admitted tech funding under oath. The records exist or they don’t.
If you want to verify any finding, the source URLs and database identifiers are provided throughout. Start with the primary records, not with this repository.
This is an OSINT research product. All findings are based on public records. Source data is cited throughout.
...
Read the original on github.com »
There is a certain kind of computer review that is really a permission slip. It tells you what you’re allowed to want. It locates you in a taxonomy — student, creative, professional, power user — and assigns you a product. It is helpful. It is responsible. It has very little interest in what you might become.
The MacBook Neo has attracted a lot of these reviews.
The consensus is reasonable: $599, A18 Pro, 8GB RAM, stripped-down I/O. A Chromebook killer, a first laptop, a sensible machine for sensible tasks. “If you are thinking about Xcode or Final Cut, this is not the computer for you.” The people saying this are not wrong. It is also not the point.
Nobody starts in the right place. You don’t begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine’s limits become a map of the territory. You learn what computing actually costs by paying too much of it on hardware that can barely afford it.
I know this because I was running Final Cut Pro X on a 2006 Core 2 Duo iMac with 3GB RAM and 120GB of spinning rust. I was nine. I had no business doing this. I did it every day after school until my parents made me go to bed.
The machine came as a hand-me-down from my nana. She’d wiped it, set it up in her kitchen in Massachusetts. It was one software update away from getting the axe from Apple. I torrented Adobe CS5 the same week. Downloaded Xcode and dragged buttons and controls around in Interface Builder with no understanding of what I was looking at. I edited SystemVersion.plist to make the “About this Mac” window say it was running Mac OS 69, which is the s*x number, which is very funny. I faked being sick to watch WWDC 2011 — Steve Jobs’ last keynote — and clapped alone in my room when the audience clapped, and rebuilt his slides in Keynote afterward because I wanted to understand how he’d made them feel that way.
I knew the machine was wrong for what I wanted to do with it. I didn’t care. Every limitation was just the edge of something I hadn’t figured out yet. It was green fields and blue skies.
I thought about all of this when I opened the Neo for the first time.
What Apple put inside the Neo is the complete behavioral contract of the Mac. Not a Mac Lite. Not a browser in a laptop costume. The same macOS, the same APIs, the same Neural Engine, the same weird byzantine AppKit controls that haven’t meaningfully changed since the NeXT era. The ability to disable SIP and install some fuck-ass system modification you saw in a YouTube tutorial. All of it, at $599.
They cut the things that are, apparently, not the Mac. MagSafe. ProMotion. M-series silicon. Port bandwidth. Configurable memory. What remains is the Retina display, the aluminum, the keyboard, and the full software platform. I held it and thought, “yep, still a Mac.”
Yes, you will hit the limits of this machine. 8GB of RAM and a phone chip will see to that. But the limits you hit on the Neo are resource limits — memory is finite, silicon has a clock speed, processes cost something. You are learning physics. A Chromebook doesn’t teach you that. A Chromebook’s ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself. The kid who tries to run Blender on a Chromebook doesn’t learn that his machine can’t handle it. He learns that Google decided he’s not allowed to. Those are completely different lessons.
Somewhere a kid is saving up for this. He has read every review. Watched the introduction video four or five times. Looked up every spec, every benchmark, every footnote. He has probably walked into an Apple Store and interrogated an employee about it ad nauseam. He knows the consensus. He knows it’s probably not the right tool for everything he wants to do.
He has decided he’ll be fine.
This computer is not for the people writing those reviews — people who already have the MacBook Pro, who have the professional context, who are optimizing at the margin. This computer is for the kid who doesn’t have a margin to optimize. Who can’t wait for the right tool to materialize. Who is going to take what’s available and push it until it breaks and learn something permanent from the breaking.
He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called “Projects” with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes. He is going to open GarageBand and make something that is not a song. He is going to take screenshots of fonts he likes and put them in a folder called “cool fonts” and not know why. Then he is going to have Blender and GarageBand and Safari and Xcode all open at once, not because he’s working in all of them but because he doesn’t know you’re not supposed to do that, and the machine is going to get hot and slow and he is going to learn what the spinning beachball cursor means. None of this will look, from the outside, like the beginning of anything. But one of those things is going to stick longer than the others. He won’t know which one until later. He’ll just know he keeps opening it.
That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
He knows it’s probably not the right tool. It doesn’t matter. It never did.
The reviews can tell you what a computer is for. They have very little interest in what you might become because of one.
...
Read the original on samhenri.gold »
Willingness to look stupid is a genuine moat in creative work

Every Sunday I go to a coffee shop in Japantown with my laptop to write. And I write! I have no trouble writing. The writing isn’t the problem. The problem is that when I’m done, I look at what I just wrote and think this is definitely not good enough to publish.

This didn’t use to happen. A few years ago I used to publish all the time. I’d write something, feel pretty good about it, and then hit publish without a second thought. I knew nobody really cared about what I was writing, so it didn’t matter if it sucked. And honestly, a lot of what I wrote really did suck. But I published it anyway. And yet I’d somehow occasionally write a good post.

Fast forward to today: I have no trouble writing, but I’ve now developed this fear of hitting publish. I’m older and objectively a better writer, with supposedly better ideas. So where did things go wrong? Why’s it so much harder to share my ideas now?

1.
There’s this unfortunate pattern that happens when someone wins a Nobel Prize. They tend to stop doing great work. Richard Hamming talks about this in You and Your Research:

“When you are famous it is hard to work on small problems. This is what did Shannon in. After information theory, what do you do for an encore? The great scientists often make this error. They fail to continue to plant the little acorns from which the mighty oak trees grow. They try to get the big thing right off. And that isn’t the way things go. So that is another reason why you find that when you get early recognition it seems to sterilize you. In fact I will give you my favorite quotation of many years. The Institute for Advanced Study in Princeton, in my opinion, has ruined more good scientists than any institution has created, judged by what they did before they came and judged by what they did after. Not that they weren’t good afterwards, but they were superb before they got there and were only good afterwards.”

Before the Nobel Prize, nobody really cares who you are. But after the Nobel Prize, you’re a Nobel Prize winner, and Nobel Prize winners are supposed to have Good Ideas. Every idea, every paper, every talk at a conference is now being evaluated against the standard of your Nobel Prize-winning work. Everyone is asking, “is this worthy of a Nobel laureate?” It’s a high bar to clear. So instead of trying and occasionally failing, they just… stop trying. The fear of making something bad is worse than producing nothing at all.¹

2.
Many good ideas come from young and unproven people. The Macintosh team’s average age was 21. Most researchers at Xerox PARC were under 30. Some of the best research work I’ve seen at OpenAI has come from surprisingly young people.

I don’t think young people are smarter than old people. I don’t think they work that much harder either. It mostly just seems that nobody really expects much of young people, so they’re free to follow their curiosity into weird, silly, and seemingly-bad-but-actually-good ideas. They’re not afraid of looking stupid.

Good Ideas, and I mean this in the broadest sense — research directions, startup ideas, premises for a novel — almost always sound stupid at first. They often make the person who came up with them look stupid. So if a truly Good Idea always starts out by looking unserious, then the only way to have one is to get comfortable producing stupid things.

3.
A few weeks ago my friend Aadil and I were at Whole Foods buying a birthday cake for a friend. We wanted to write something clever on the cake but couldn’t really think of anything. We stood around thinking for a few minutes before Aadil said “Let’s just say a bunch of bad ideas out loud so we can get to the good ones.” And it worked! We all said a bunch of terrible ideas, and eventually we landed on a good one — a pretty clever pun based on our friend’s longtime email address.

This sounds silly, but I think it captures the entire creative process well. You start by coming up with bad ideas. You will probably look stupid. That’s inevitable. But once you’re comfortable looking stupid, you can produce the bad ideas which will eventually lead to the good ones. If you don’t have the courage to look stupid, you’ll never reap the reward of having good ideas.

It feels like there’s something like a conservation law at work here: the amount of stupidity you’re willing to tolerate is directly proportional to the quality of ideas you’ll eventually produce. I’ll call this Aadil’s Law.

4.
Yesterday, I visited the Monterey Bay Aquarium and could not stop thinking about the jellyfish exhibit. They are seriously weird creatures. Jellyfish have no bones, brains, teeth, or blood. Some are bioluminescent for reasons we don’t fully understand. They’re pretty much sacs of jelly contained within a thin membrane, drifting aimlessly at the mercy of ocean currents. Yet somehow, jellyfish have been around for over 500 million years. So by most definitions of evolutionary success, jellyfish are a great idea.

But how was evolution able to get to the jellyfish? The evolutionary process is pretty simple: generate a ton of random mutations and then let natural selection filter them. The overwhelming majority of mutations end up being harmful or neutral. An exceedingly small fraction are beneficial. If you could somehow give evolution a sense of embarrassment, so that every time it produced a fish with no fins or a bird with no wings it felt a deep sense of shame and promised to be more careful next time, evolution would no longer work. It needs to be able to explore the fitness landscape with bad traits in order to produce good traits, and this exploration requires a willingness to produce unfit organisms. The only way evolution could get to the jellyfish was by being willing to produce the countless jellyfish-adjacent organisms which went extinct.

5.
There might be a good reason why smart people want to avoid looking stupid. I’ve spent a long time thinking about what this reason could be. The only plausible explanation is that our egos are fragile, and by not sharing any work at all, we never have to risk our egos being damaged. If we never share anything, then nothing bad can ever happen to us. But the flip side to protecting our egos is that we never end up making anything worthwhile.

I think there are two very different failure modes here, each at an opposite end of the spectrum:

Overshare, but look stupid: You have lots of ideas, and you share them indiscriminately. You look stupid because you don’t really care about what you share, and people eventually learn to tune you out.

Undershare, but never do anything interesting: You have lots of ideas, but share almost none of them. You’re afraid of looking stupid, so the exceedingly few ideas that you do share end up being incredibly bland. You never look stupid, but this comes at the expense of never doing anything interesting ever again.

Knowing myself, I’m definitely more at risk of undersharing my work. I’d also bet that most people reading this blog post are prone to undersharing as well.

6.
So where do we go from here? I think the answer is actually in that Whole Foods story. Aadil’s implicit goal was to “think of something clever to write on this cake” but none of us could do it because cleverness was the standard and none of our ideas met it. But when Aadil said “Let’s just say a bunch of bad ideas,” he changed the frame entirely. We were now playing a game where the only way to lose was by saying nothing at all.

I think that’s the key here. Your goal shouldn’t be to share something good. It should just be to share something at all. Even if it isn’t good. A half-baked blog post. A silly demo. A weird project. I’ve been doing too much selection, and not enough production.

7.
I keep thinking about the version of me from a few years ago. He was worse at almost everything. Worse writer, worse thinker, worse at making things. Nobody really knew him and nobody really cared what he had to say. And yet he had so much more courage. He’d write something in an afternoon and publish it that evening and go to bed feeling good about himself. He wasn’t performing for anyone. He was just a guy with a blog, putting his thoughts out into the world, mostly for himself. I miss that guy.

Evolution didn’t get to the jellyfish by being careful. Aadil didn’t come up with a good cake idea by trying to be clever. I think it’s just about overcoming fear. Not a matter of talent, taste, or intelligence. Just this: are you willing to look stupid today? That’s it. That’s all there is to it.

¹ My favorite counterexample to this is that Alec Radford (the researcher behind GPT-1) is still writing papers on cleaning pretraining data, arguably the most unglamorous thing you could work on in ML research in 2026.
...
Read the original on sharif.io »
Find out which AI models your machine can actually run.
Improved V3 with hybrid thinking and tool use
...
Read the original on canirun.ai »
We’re thrilled to announce the stable release of Vite 8! When Vite first launched, we made a pragmatic bet on two bundlers: esbuild for speed during development, and Rollup for optimized production builds. That bet served us well for years, and we’re very grateful to the Rollup and esbuild maintainers: Vite wouldn’t have succeeded without them. Today, that bet resolves into one bundler: Vite 8 ships with Rolldown as its single, unified, Rust-based bundler, delivering up to 10-30x faster builds while maintaining full plugin compatibility. This is the most significant architectural change since Vite 2.
Vite is now being downloaded 65 million times a week, and the ecosystem continues to grow with every release. To help developers navigate the ever-expanding plugin landscape, we also launched registry.vite.dev, a searchable directory of plugins for Vite, Rolldown, and Rollup that collects plugin data from npm daily.
Play online with Vite 8 using vite.new or scaffold a Vite app locally with your preferred framework running pnpm create vite. Check out the Getting Started Guide for more information.
We invite you to help us improve Vite (joining the more than 1.2K contributors to Vite Core), our dependencies, or plugins and projects in the ecosystem. Learn more at our Contributing Guide. A good way to get started is by triaging issues, reviewing PRs, sending tests PRs based on open issues, and supporting others in Discussions or Vite Land’s help forum. If you have questions, join our Discord community and talk to us in the #contributing channel.
Stay updated and connect with others building on top of Vite by following us on Bluesky, X, or Mastodon.
Since its earliest versions, Vite relied on two separate bundlers to serve different needs. esbuild handled fast compilation during development (dependency pre-bundling and TypeScript/JSX transforms) that made the dev experience feel instant. Rollup handled production bundling, chunking, and optimization, with its rich plugin API powering the entire Vite plugin ecosystem.
This dual-bundler approach served Vite well for years. It allowed us to focus on developer experience and orchestration rather than reinventing parsing and bundling from scratch. But it came with trade-offs. Two separate transformation pipelines meant two separate plugin systems, and an increasing amount of glue code needed to keep the two pipelines in sync. Edge cases around inconsistent module handling accumulated over time, and every alignment fix in one pipeline risked introducing differences in the other.
Rolldown is a Rust-based bundler built by the VoidZero team to address these challenges head-on. It was designed with three goals:
* Performance: Written in Rust, Rolldown operates at native speed. In benchmarks, it is 10-30x faster than Rollup, matching esbuild’s performance.
* Compatibility: Rolldown supports the same plugin API as Rollup and Vite. Most existing Vite plugins work out of the box with Vite 8.
* Advanced features: A single unified bundler unlocks capabilities that were difficult or impossible with the dual-bundler setup, including full bundle mode, more flexible chunk splitting, module-level persistent caching, and Module Federation support.
The migration to Rolldown was deliberate and community-driven. First, a separate rolldown-vite package was released as a technical preview, allowing early adopters to test Rolldown’s integration without affecting the stable version of Vite. The feedback from those early adopters was invaluable. They pushed the integration through real-world codebases of every shape and size, surfacing edge cases and compatibility issues we could address before a wider release. We also set up a dedicated CI suite validating key Vite plugins and frameworks against the new bundler, catching regressions early and building confidence in the migration path.
In December 2025, we shipped the Vite 8 beta with Rolldown fully integrated. During the beta period, Rolldown itself progressed from beta to a release candidate, with continuous improvements driven by the testing and feedback of the Vite community.
During the preview and beta phases of rolldown-vite, several companies reported measurable reductions in production build times:
For large projects, the impact can be especially noticeable, and we expect further improvements as Rolldown continues to evolve.
With Vite 8, Vite becomes the entry point to an end-to-end toolchain with closely collaborating teams: the build tool (Vite), the bundler (Rolldown), and the compiler (Oxc). This alignment ensures consistent behavior across the entire stack, from parsing and resolving to transforming and minifying. It also means we can rapidly adopt new language specifications as JavaScript evolves. And by integrating deeply across layers, we can pursue optimizations that were previously out of reach, such as leveraging Oxc’s semantic analysis for better tree-shaking in Rolldown.
None of this would have been possible without the broader community. We want to extend our deep thanks to the framework teams (SvelteKit, React Router, Storybook, Astro, Nuxt, and many others) who tested rolldown-vite early, filed detailed bug reports, and worked with us to resolve compatibility issues. We are equally grateful to every developer who tried the beta, shared their build time improvements, and reported the rough edges that helped us polish this release. Your willingness to test the migration on real projects helped make the transition to Rolldown smoother and more reliable.
Vite 8 requires Node.js 20.19+ or 22.12+, the same requirements as Vite 7. These ranges ensure Node.js supports require(esm) without a flag, allowing Vite to be distributed as ESM-only.
Beyond the Rolldown integration, Vite 8 includes several notable features:
* Integrated Devtools: Vite 8 ships a devtools option to enable Vite Devtools, a developer tool for debugging and analysis. Vite Devtools provides deeper insights into your Vite-powered projects directly from the dev server.
* Built-in tsconfig paths support: Developers can enable TypeScript path alias resolution by setting resolve.tsconfigPaths to true. This has a small performance cost and is not enabled by default.
* emitDecoratorMetadata support: Vite 8 now has built-in automatic support for TypeScript’s emitDecoratorMetadata option, removing the need for external plugins. See the Features page for details.
* Wasm SSR support: .wasm?init imports now work in SSR environments, extending Vite’s WebAssembly support to server-side rendering.
* Browser console forwarding: Vite 8 can forward browser console logs and errors to the dev server terminal. This is especially useful when working with coding agents, as runtime client errors become visible in the CLI output. Enable it with server.forwardConsole, which activates automatically when a coding agent is detected.
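The options in the list above can be sketched together in a single config file. This is an illustrative sketch only: the option names (devtools, resolve.tsconfigPaths, server.forwardConsole) are taken from the feature descriptions here, so check the Vite 8 documentation for the exact value shapes before relying on them.

```typescript
// vite.config.ts — illustrative sketch of the new Vite 8 options
// described above; verify exact shapes against the official docs.
import { defineConfig } from 'vite'

export default defineConfig({
  // Enable the integrated Vite Devtools from the dev server.
  devtools: true,
  resolve: {
    // Opt in to tsconfig path alias resolution. Off by default
    // because it carries a small performance cost.
    tsconfigPaths: true,
  },
  server: {
    // Forward browser console logs and errors to the terminal.
    // Activates automatically when a coding agent is detected.
    forwardConsole: true,
  },
})
```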
Alongside Vite 8, we are releasing @vitejs/plugin-react v6. The plugin now uses Oxc for the React Refresh transform, so Babel is no longer a dependency and the installation size is smaller.
For projects that need the React Compiler, v6 provides a reactCompilerPreset helper that works with @rolldown/plugin-babel, giving you an explicit opt-in path without burdening the default setup.
See the Release Notes for more details.
Note that v5 still works with Vite 8, so you can upgrade the plugin after upgrading Vite.
The Rolldown integration opens the door to improvements and optimizations. Here is what we are working on next:
* Full Bundle Mode (experimental): This mode bundles modules during development, similar to production builds. Preliminary results show 3x faster dev server startup, 40% faster full reloads, and 10x fewer network requests. This is especially impactful for large projects where the unbundled dev approach hits scaling limits.
* Raw AST transfer: Allows JavaScript plugins to access the Rust-produced AST with minimal serialization overhead, bridging the performance gap between Rust internals and JS plugin code.
* Native MagicString transforms: Enables custom transforms where the logic lives in JavaScript but the string manipulation runs in Rust.
* Stabilizing the Environment API: We are working to make the Environment API stable. The ecosystem has started holding regular meetings to collaborate more closely.
We want to be transparent about changes to Vite’s install size. Vite 8 is approximately 15 MB larger than Vite 7 on its own. This comes from two main sources:
* ~10 MB from lightningcss: Previously an optional peer dependency, lightningcss is now a normal dependency to provide better CSS minification out of the box.
* ~5 MB from Rolldown: The Rolldown binary is larger than esbuild + Rollup mainly due to performance optimizations that favor speed over binary size.
We will continue monitoring and working to reduce install size as Rolldown matures.
For most projects, upgrading to Vite 8 should be a smooth process. We built a compatibility layer that auto-converts existing esbuild and rollupOptions configuration to their Rolldown and Oxc equivalents, so many projects will work without any config changes.
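As a concrete example, a typical Vite 7-era config like the sketch below, with esbuild transform options and build.rollupOptions, is the kind of configuration the compatibility layer auto-converts to Rolldown and Oxc equivalents. The specific options shown are just common examples, not a statement of exactly which options are convertible; the Migration Guide has the authoritative list.

```typescript
// vite.config.ts — a typical Vite 7-era config. Vite 8's compatibility
// layer translates these esbuild and Rollup options to their Oxc and
// Rolldown equivalents where possible, so it keeps working unchanged.
import { defineConfig } from 'vite'

export default defineConfig({
  esbuild: {
    // esbuild transform options, now mapped onto Oxc.
    target: 'es2020',
  },
  build: {
    rollupOptions: {
      // Rollup options, now interpreted by Rolldown.
      output: {
        manualChunks: {
          vendor: ['react', 'react-dom'],
        },
      },
    },
  },
})
```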
For larger or more complex projects, we recommend the gradual migration path: first switch from vite to the rolldown-vite package on Vite 7 to isolate any Rolldown-specific issues, then upgrade to Vite 8. This two-step approach makes it easy to identify whether any issues come from the bundler change or from other Vite 8 changes.
Please review the detailed Migration Guide before upgrading. The complete list of changes is in the Vite 8 Changelog.
As Vite moves to Rolldown, we want to take a moment to express our deep gratitude to the two projects that made Vite possible.
Rollup has been Vite’s production bundler since the very beginning. Its elegant plugin API design proved so well-conceived that Rolldown adopted it as its own, and Vite’s entire plugin ecosystem exists because of the foundation Rollup laid. The quality and thoughtfulness of Rollup’s architecture shaped how Vite thinks about extensibility. Thank you, Rich Harris for creating Rollup, and Lukas Taegert-Atkinson and the Rollup team for maintaining and evolving it into something that has had such a lasting impact on the web tooling ecosystem.
esbuild powered Vite’s remarkably fast development experience from its early days: dependency pre-bundling, and TypeScript and JSX transforms that completed in milliseconds rather than hundreds of milliseconds. esbuild proved that build tools could be orders of magnitude faster, and its speed set the bar that inspired an entire generation of Rust- and Go-based tooling. Thank you, Evan Wallace, for showing all of us what was possible.
Without these two projects, Vite would not exist as it does today. Even as we move forward with Rolldown, the influence of Rollup and esbuild is deeply embedded in Vite’s DNA, and we are grateful for everything they have given to the ecosystem. You can learn more about all the projects and people Vite depends on at our Acknowledgements page.
Vite 8 was led by sapphi-red and the Vite Team with the help of the wide community of contributors, downstream maintainers, and plugin authors. We want to thank the Rolldown team for their close collaboration in making the Rolldown-powered Vite 8 possible. We are also especially grateful to everyone who participated in the rolldown-vite preview and the Vite 8 beta period. Your testing, bug reports, and feedback made the Rolldown migration possible and shaped this release into something we are proud of.
Vite is brought to you by VoidZero, in partnership with Bolt and NuxtLabs. We also want to thank our sponsors on Vite’s GitHub Sponsors and Vite’s Open Collective.
...
Read the original on vite.dev »
Alpha notice: Code export is not functional yet. We’re actively working on it — check back soon.
Design once, generate production-ready code for your framework of choice. Switch targets without touching your design.
Everything you need to know before hitting download.
A TUI (Text User Interface) is an interactive application that runs entirely in the terminal — like htop, lazygit, or k9s. Instead of a web browser or native window, the UI is built from characters, colors, and ANSI escape codes. TUIStudio lets you design these visually instead of hand-coding every layout.
Will macOS or Windows block the app?
With no code-signing configured, each platform behaves differently:
macOS
Gatekeeper blocks the app immediately. You’ll see either “TUIStudio cannot be opened because it is from an unidentified developer” or “TUIStudio is damaged and can’t be opened” on newer macOS after quarantine flags the binary.
To get past it: right-click the .app → Open → Open anyway — or go to System Settings → Privacy & Security → “Open Anyway”.
Windows
SmartScreen shows “Windows protected your PC”. Click More info → Run anyway. Less severe than the macOS block, but still alarming to non-technical users.
Linux
No such gate. dpkg -i TUIStudio-amd64.deb or double-click in a file manager — just works.
Why are exports not working?
TUIStudio is currently in Alpha — exports are not functional yet. We’re actively working on it.
When ready, the following 6 frameworks will be supported:
Switch export targets at any time without touching your design.
TUIStudio is currently in early access. The core editor is free to download and use. A pro tier with team features, cloud sync, and priority support is planned for later.
Can I save and reopen my designs?
Yes. Projects are saved as portable .tui JSON files you can open from anywhere, commit to git, or share with your team. No account or cloud required.
...
Read the original on tui.studio »
...
Read the original on channelsurfer.tv »
For a decade, I have been working with AWS and third-party security teams to resolve bucketsquatting / bucketsniping issues in AWS S3. Finally, I am happy to say AWS now has a solution to the problem, and it changes the way you should name your buckets.
Bucketsquatting (or sometimes called bucketsniping) is an issue I first wrote about in 2019, and it has been a recurring issue in AWS S3 ever since. If you’re interested in the specifics of the problem, I recommend you check out my original post on the topic: S3 Bucket Namesquatting - Abusing predictable S3 bucket names. In short, the problem is that S3 bucket names are globally unique, and if the owner of a bucket deletes it, that name becomes available for anyone else to register. This can lead to a situation where an attacker can register a bucket with the same name as a previously deleted bucket and potentially gain access to sensitive data or disrupt services that rely on that bucket.
Additionally, it is a common practice for organizations to use predictable naming conventions for their buckets, such as appending the AWS region name to the end of the bucket name (e.g. myapp-us-east-1), which can make it easier for attackers to guess and register buckets that may have been previously used. This latter practice is one that AWS’ internal teams commonly fall victim to, and it is one that I have been working with the AWS Security Outreach team to address for almost a decade now across dozens of individual communications.
To address this issue, AWS has introduced a new protection that works effectively as a “namespace” for S3 buckets. The namespace syntax is as follows:
For example, if your account ID is 123456789012, your prefix is myapp, and you want to create a bucket in the us-west-2 region, you would name your bucket as follows:
Though not explicitly mentioned, the -an here refers to the “account namespace”. This new syntax ensures that only the account that owns the namespace can create buckets with that name, effectively preventing bucketsquatting attacks. If another account tries to create a bucket with the same name, they will receive an InvalidBucketNamespace error message indicating that the bucket name is already in use. Account owners will also receive an InvalidBucketNamespace error if they try to create a bucket where the bucket region does not match the region specified in the bucket name.
Interestingly, the guidance from AWS is that this namespace is recommended to be used by default. Namespaces aren’t new to S3, with suffixes like .mrap, --x-s3, and -s3alias all being examples of existing namespaces that AWS previously used for new features; however, this is the first time AWS has introduced a namespace that is recommended for general use by customers to protect against a specific security issue.
It is AWS’ stance that all buckets should use this namespace pattern, unless you have a compelling reason not to (hint: there aren’t many). To this end, AWS is allowing security administrators to set policies that require the use of this namespace through the use of a new condition key s3:x-amz-bucket-namespace, which can be applied within an Organization’s SCP policies to enforce the use of this protection across an organization.
This doesn’t retroactively protect any existing buckets (or published templates that use a region prefix/suffix pattern without the namespace), but it does provide a strong protection for new buckets going forward (okay, so it’s dying, not dead). If you wish to protect your existing buckets, you’ll need to create new buckets with the namespace pattern and migrate your data to those buckets.
While AWS has introduced this new namespace protection for S3 buckets, the other major cloud providers handle things slightly differently.
Google Cloud Storage already has a namespace concept in place for its buckets, which is based on domain name verification. This means that only the owner of a domain can create buckets with names that are of a domain name format (e.g. myapp.com), and they must verify ownership of the domain before they can create buckets with that name. Bucketsquatting is still possible with non-domain name formatted buckets, but the use of domain name formatted buckets is Google’s solution to the issue.
For Azure Blob Storage, storage accounts are scoped with a configurable account name and container name, so the same issue does apply. This is further exacerbated by the fact that Azure’s storage account names have a maximum of 24 characters, leaving a fairly small namespace for organizations to work with. (h/t vhab for pointing this out)
There is a new namespace for S3 buckets. The namespace protects you from bucketsquatting attacks, and you should use it for any S3 buckets you create.
If you liked what I’ve written, or want to hear more on this topic, reach out to me on LinkedIn or 𝕏.
...
Read the original on onecloudplease.com »
Senator Ron Wyden says that when a secret interpretation of Section 702 is eventually declassified, the American public “will be stunned” to learn what the NSA has been doing. If you’ve followed Wyden’s career, you know this is not a man prone to hyperbole — and you know his track record on these warnings is perfect.
Just last month, we wrote about the Wyden Siren — the pattern where Senator Ron Wyden sends a cryptic public signal that something terrible is happening behind the classification curtain, can’t say what it is, and then is eventually proven right. Every single time. The catalyst then was a two-sentence letter to CIA Director Ratcliffe expressing “deep concerns about CIA activities.”
Well, the siren is going off once again. This time, Wyden took to the Senate floor to deliver a lengthy speech, ostensibly about the since-approved nomination (with the support of many Democrats) of Joshua Rudd to lead the NSA. Wyden was protesting that nomination, specifically in the context of Rudd being unwilling to agree to basic constitutional limitations on NSA surveillance. But that’s just a jumping-off point ahead of Section 702’s upcoming reauthorization deadline. Buried in the speech is a passage that should set off every alarm bell:
There’s another example of secret law related to Section 702, one that directly affects the privacy rights of Americans. For years, I have asked various administrations to declassify this matter. Thus far they have all refused, although I am still waiting for a response from DNI Gabbard. I strongly believe that this matter can and should be declassified and that Congress needs to debate it openly before Section 702 is reauthorized. In fact, when it is eventually declassified, the American people will be stunned that it took so long and that Congress has been debating this authority with insufficient information.
You can see the full video here if you want.
Here’s a sitting member of the Senate Intelligence Committee — someone with access to the classified details — telling his colleagues and the public that there is a secret interpretation of Section 702 that “directly affects the privacy rights of Americans,” that he’s been asking multiple administrations to declassify it, that they’ve all refused, and that when it finally comes out, people will be stunned.
If you’ve followed Wyden for any amount of time, this all sounds very familiar. In 2011, Wyden warned that the government had secretly reinterpreted the PATRIOT Act to mean something entirely different from what Congress and the public understood. He couldn’t say what. Nobody believed it could be that bad. Then the Snowden revelations showed the NSA was engaged in bulk collection of essentially every American’s phone metadata. In 2017, he caught the Director of National Intelligence answering a different question than the one Wyden asked about Section 702 surveillance. The pattern repeats. The siren sounds. Years pass. And then, eventually, we find out it was worse than we imagined.
Now here he is, doing the exact same thing with Section 702 yet again, now that it’s up for renewal. Congress is weeks away from a reauthorization vote, and Wyden is explicitly telling his colleagues (not for the first time) they are preparing to vote on a law whose actual meaning is being kept secret from them as well as from the American public:
The past fifteen years have shown that, unless the Congress can have an open debate about surveillance authorities, the laws that are passed cannot be assumed to have the support of the American people. And that is fundamentally undemocratic. And, right now, the government is relying on secret law with regard to Section 702 of FISA. I’ve already mentioned the provision that was stuck into the last reauthorization bill, that could allow the government to force all sorts of people to spy on their fellow citizens. I have explained the details of how the Biden Administration chose to interpret it, and how the Trump Administration will interpret it, are a big secret. Americans have the right to be confused and angry that this is how the government and Congress choose to do business.
That’s a United States senator who has a long history of calling out secret interpretations that lead to surveillance of Americans — standing on the Senate floor and warning, once again, that there’s a secret interpretation of Section 702 authorities. One that almost certainly means mass surveillance.
And Wyden knows exactly how this plays out. He’s been through the reauthorization cycle enough times to know the playbook the intelligence community runs every time 702 is up for renewal:
I’ve been doing this a long time, so I know how this always goes. Opponents of reforming Section 702 don’t want a real debate where Members can decide for themselves which reform amendments to support. So what always happens is that a lousy reauthorization bill magically shows up a few days before the authorization expires and Members are told that there’s no time to do anything other than pass that bill and that if they vote for any amendments, the program will die and terrible things will happen and it will be all their fault.
He’s right. Every time reauthorization is on the table, no real debate happens, and then just before the authorization is about to run out, some loyal soldier of the surveillance brigade in Congress will scream “national security” at the top of their lungs, insist there’s no time to debate this or people will die, and then promises that we need to just re-authorize for a few more years, at which point we’ll be able to hold a debate on the surveillance.
But even setting aside the secret interpretation Wyden can’t discuss, his speech highlights something almost as damning: just how spectacularly the supposed “reforms” from the last reauthorization have failed. Remember, one of the big “concessions” to get the last reauthorization across the finish line was a requirement that “sensitive searches” — targeting elected officials, political candidates, journalists, and the like — would need the approval of the FBI’s Deputy Director.
This was in response to some GOP elected officials being on the receiving end of investigations during the Biden era, freaking out that the NSA appeared to be doing the very things plenty of civil society and privacy advocates had been telling them about for over a decade while they just yelled “national security” back at us.
So how are those small “reforms” working out? Here’s Wyden:
The so-called big reform was to require the approval of the Deputy FBI Director for these sensitive searches.
Until two months ago, the Deputy FBI Director was Dan Bongino. As most of my colleagues know, Mr. Bongino is a longtime conspiracy theorist who has frequently called for specious investigations of his political opponents. This is the man whom the President and the U. S. Senate put in charge of these incredibly sensitive searches. And Bongino’s replacement as Deputy Director, Andrew Bailey, is a highly partisan election denier who recently directed a raid on a Georgia election office in an effort to justify Donald Trump’s conspiracy theories. I don’t know about my colleagues, but this so-called reform makes me feel worse, not better.
So the grand reform that was supposed to provide meaningful oversight of the FBI’s most sensitive surveillance activities ended up placing that authority in the hands of a conspiracy theorist, followed by a partisan election denier. And just to make the whole thing even more farcical, Wyden notes that the FBI has refused to even keep a basic record of these searches:
But it’s even worse than it looks. The FBI has refused to even keep track of all of the sensitive searches the Deputy Director has considered. The Inspector General urged the FBI to just put this information into a simple spreadsheet and they refused to do it. That is how much the FBI does not want oversight.
They won’t maintain a spreadsheet. The Inspector General asked them to track their use of a sensitive surveillance power using what amounts to a basic Excel file, and the FBI said no. That’s the state of “reform” for Section 702 after the last re-auth.
Wyden has also been sounding the alarm about the expansion of who can be forced to spy on behalf of the government, thanks to a provision jammed into the last reauthorization that expanded the definition of “electronic communications service provider” to cover essentially anyone with access to communications equipment. As Wyden explained:
Two years ago, during the last reauthorization debacle, something really bad happened. Over in the House, existing surveillance law was changed so that the government could force anyone with “access” to communications to secretly collect those communications for the government. As I pointed out at the time, that could mean anyone installing or repairing a cable box, or anyone responsible for a wifi router. It was a jaw-dropping expansion of authorities that could end up forcing countless ordinary Americans to secretly help the government spy on their fellow citizens.
The Biden administration apparently promised to use this authority narrowly. But, of course, the Trump administration has made no such promise. As we say with every expansion of executive authority, just imagine how the worst possible president from the opposing party would use it. And now we don’t have to wonder any more.
Wyden correctly points out that secret promises from a prior administration are worth exactly nothing:
But here’s the other thing — whatever secret promise the Biden Administration made about using these vast, unchecked authorities with restraint, the current administration clearly isn’t going to feel bound by that promise. So whatever the previous administration intended to accomplish with that provision, there is absolutely nothing preventing the current administration from conscripting those cable repair and tech support men and women to secretly spy on Americans.
So to tally this up: Congress is about to vote on reauthorizing Section 702 with a secret legal interpretation that Wyden says will stun the public when it’s eventually revealed, with “reforms” that placed surveillance approval authority in the hands of conspiracy theorists who won’t even keep a spreadsheet, with a massively expanded definition of who can be forced to help the government spy, with secret promises about restraint that the current administration has no intention of honoring, and with a nominee to lead the NSA who won’t commit to following the Constitution.
The Wyden Siren is blaring. And if history is any guide — and it has been, without exception — whatever is behind the classification curtain is worse than what we can see from the outside.
...
Read the original on www.techdirt.com »