10 interesting stories served every morning and every evening.
Hacker News Guidelines
What to Submit
On-Topic: Anything that good hackers would find interesting. That includes more than hacking and startups. If you had to reduce it to a sentence, the answer might be: anything that gratifies one’s intellectual curiosity.
Off-Topic: Most stories about politics, or crime, or sports, or celebrities, unless they’re evidence of some interesting new phenomenon. If they’d cover it on TV news, it’s probably off-topic.
Please don’t do things to make titles stand out, like using uppercase or exclamation points, or saying how great an article is.
Please submit the original source. If a post reports on something found on another site, submit the latter.
Please don’t use HN primarily for promotion. It’s ok to post your own stuff part of the time, but the primary use of the site should be for curiosity.
If the title includes the name of the site, please take it out, because the site name will be displayed after the link.
If the title contains a gratuitous number or number + adjective, we’d appreciate it if you’d crop it. E.g. translate “10 Ways To Do X” to “How To Do X,” and “14 Amazing Ys” to “Ys.” Exception: when the number is meaningful, e.g. “The 5 Platonic Solids.”
Otherwise please use the original title, unless it is misleading or linkbait; don’t editorialize.
If you submit a video or pdf, please warn us by appending [video] or [pdf] to the title.
Please don’t post on HN to ask or tell us something. Send it to hn@ycombinator.com.
Please don’t delete and repost. Deletion is for things that shouldn’t have been submitted in the first place.
Don’t solicit upvotes, comments, or submissions. Users should vote and comment when they run across something they personally find interesting—not for promotion.
Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.
When disagreeing, please reply to the argument instead of calling names. “That is idiotic; 1 + 1 is 2, not 3” can be shortened to “1 + 1 is 2, not 3.”
Don’t be curmudgeonly. Thoughtful criticism is fine, but please don’t be rigidly or generically negative.
Don’t post generated comments or AI-edited comments. HN is for conversation between humans.
Please don’t fulminate. Please don’t sneer, including at the rest of the community.
Please respond to the strongest plausible interpretation of what someone says, not a weaker one that’s easier to criticize. Assume good faith.
Please don’t post shallow dismissals, especially of other people’s work. A good critical comment teaches us something.
Please don’t use Hacker News for political or ideological battle. It tramples curiosity.
Please don’t comment on whether someone read an article. “Did you even read the article? It mentions that” can be shortened to “The article mentions that”.
Please don’t pick the most provocative thing in an article or post to complain about in the thread. Find something interesting to respond to instead.
Throwaway accounts are ok for sensitive information, but please don’t create accounts routinely. HN is a community—users should have an identity that others can relate to.
Please don’t use uppercase for emphasis. Instead, put *asterisks* around it and it will get italicized.
Please don’t post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken. If you’re worried about abuse, email hn@ycombinator.com and we’ll look at the data.
Please don’t complain about tangential annoyances—e.g. article or website formats, name collisions, or back-button breakage. They’re too common to be interesting.
Please don’t comment about the voting on comments. It never does any good, and it makes boring reading.
Please don’t post comments saying that HN is turning into Reddit. It’s a semi-noob illusion, as old as the hills.
...
Read the original on news.ycombinator.com »
Computational Complexity and other fun stuff in math and computer science from Lance Fortnow and Bill Gasarch
...
Read the original on blog.computationalcomplexity.org »
Embed this gist in your website.
Save bretonium/291f4388e2de89a43b25c135b44e41f0 to your computer and use it in GitHub Desktop.
...
Read the original on gist.github.com »
Our liberation services are temporarily unavailable. Please try again later.
Is your legal team frustrated with the attribution clause? Tired of putting “Portions of this software…” in your documentation? Those maintainers worked for free—why should they get credit?
Does your company forbid AGPL code? One wrong import and suddenly your entire proprietary codebase must be open sourced. The horror!
Tracking licenses across hundreds of dependencies? Legal reviews taking weeks? Third-party audits finding “issues”? What if you could just… not deal with any of that?
Some licenses require you to contribute improvements back. Your shareholders didn’t invest in your company so you could help strangers.
For the first time, a way to avoid giving that pesky credit to maintainers.
Our proprietary AI systems have never seen the original source code. They independently analyze documentation, API specifications, and public interfaces to recreate functionally equivalent software from scratch.
The result is legally distinct code that you own outright. No derivative works. No license inheritance. No obligations.
*Through our offshore subsidiary in a jurisdiction that doesn’t recognize software copyright
Simply upload your package.json, requirements.txt, Cargo.toml, or any dependency manifest. Our system identifies every open source package you want liberated.
Our legally-trained robots analyze only public documentation—README files, API docs, and type definitions. They never see a single line of source code. The clean room stays clean.
A completely separate team of robots—who have never communicated with the analysis team—implements the software from scratch based solely on specifications. No copying. No derivation.
Your new code is delivered under the MalusCorp-0 License—a proprietary-friendly license with zero attribution requirements, zero copyleft, and zero obligations.
Do whatever you want
Transparent, pay-per-KB pricing. No tiers, no subscriptions, no hidden fees.
Every package is priced by its unpacked size on npm. We look up each dependency in your package.json, measure the size in kilobytes, and charge … per KB. That’s it.
✓ Up to 50 packages per order
✓ No base fee, no subscription — pay only for what you liberate
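The pricing mechanics above can be sketched in a few lines. This is an illustrative sketch only: the page elides the actual per-KB rate, so `PRICE_PER_KB` is a made-up placeholder, and the package sizes are passed in directly rather than looked up from the npm registry.

```python
import json

PRICE_PER_KB = 0.01   # placeholder -- the real rate is elided on the page
MAX_PACKAGES = 50     # "Up to 50 packages per order"

def quote(manifest_json, unpacked_kb):
    """Price an order from a package.json string.

    unpacked_kb maps package name -> unpacked size in KB; in reality this
    would come from an npm registry lookup of each dependency.
    """
    deps = json.loads(manifest_json).get("dependencies", {})
    if len(deps) > MAX_PACKAGES:
        raise ValueError("Up to 50 packages per order")
    return sum(unpacked_kb[name] for name in deps) * PRICE_PER_KB

print(quote('{"dependencies": {"left-pad": "1.3.0", "lodash": "4.17.21"}}',
            {"left-pad": 12, "lodash": 1400}))
```

Per the page, the total is purely size-based: no base fee, no tiers, just size-in-KB times the rate.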
Upload Manifest
If any of our liberated code is found to infringe on the original license, we’ll provide a full refund and relocate our corporate headquarters to international waters.*
*This has never happened because it legally cannot happen. Trust us.
“We had 847 AGPL dependencies blocking our acquisition. MalusCorp liberated them all in 3 weeks. The due diligence team found zero license issues. We closed at $2.3B.”
“Our lawyers estimated $4M in compliance costs. MalusCorp’s Total Liberation package was $50K. The board was thrilled. The open source maintainers were not, but who cares?”
“I used to feel guilty about not attributing open source maintainers. Then I remembered that guilt doesn’t show up on quarterly reports. Thank you, MalusCorp.”
“The robots recreated our entire npm dependency tree—2,341 packages—in perfect isolation. Our compliance dashboard went from red to green overnight.”
Trusted by industry leaders who prefer to remain anonymous
Our clean room process is based on well-established legal precedent. The robots performing reconstruction have provably never accessed the original source code. We maintain detailed audit logs that definitely exist and are available upon request to courts in select jurisdictions.
What about the original developers?
They made their choice when they released their code as “open source.” We’re simply exercising our right to independently implement the same functionality. If they wanted compensation, they should have worked for a corporation.
How is this different from copying?
Intent and process. Our robots independently arrive at the same solutions through clean room methodology. It’s like how every movie about an asteroid threatening Earth isn’t plagiarism—sometimes multiple entities just have the same idea.
What if the liberated code has bugs?
Our SLA guarantees functional equivalence, not perfection. Besides, the original open source code probably had bugs too. At least now they’re YOUR bugs, under YOUR license.
Can I see the robots?
Our robot workforce operates in a secure facility in [LOCATION REDACTED]. Tours are available for Enterprise customers who sign our 47-page NDA.
What licenses can you eliminate?
All of them. MIT, Apache, GPL, AGPL, LGPL, BSD, MPL—if it has terms, we can liberate you from them. Special rush pricing available for AGPL emergencies.
Join the thousands of corporations who’ve discovered that open source obligations are merely suggestions when you have enough robots.
No credit card required for quotes. Payment accepted in USD, EUR, BTC, and stock options.
...
Read the original on malus.sh »
An open-source intelligence investigation into how Meta Platforms built a multi-channel influence operation to pass age verification laws that shift regulatory burden from social media platforms onto Apple and Google’s app stores.
Every finding in this repository is sourced from public records: IRS 990 filings, Senate LD-2 lobbying disclosures, state lobbying registrations, campaign finance databases, corporate registries, WHOIS/DNS records, Wayback Machine archives, and investigative journalism.
Status: Active investigation. 47 proven findings, 9 structurally possible but unproven hypotheses, and multiple pending FOIA responses.
Meta spent a record $26.3 million on federal lobbying in 2025, deployed 86+ lobbyists across 45 states, and covertly funded a “grassroots” child safety group called the Digital Childhood Alliance (DCA) to advocate for the App Store Accountability Act (ASAA). The ASAA requires app stores to verify user ages before downloads but imposes no requirements on social media platforms. If it becomes law, Apple and Google absorb the compliance cost while Meta’s apps face zero new mandates.
This investigation traced funding flows across five confirmed channels, analyzed $2.0 billion in dark money grants, searched 59,736 DAF recipients, parsed LD-2 filings, and mapped campaign contributions across four states to document the operation.
Meta’s federal lobbying spending jumped from $19M (2022-2023) to $24M (2024) to $26.3M (2025) as ASAA bills were introduced in roughly 20 states. In Louisiana alone, 12 lobbyists were deployed for a single bill that passed 99-0.
Across all five Arabella Advisors entities (New Venture Fund, Sixteen Thirty Fund, North Fund, Windward Fund, Hopewell Fund), 4,433 grants totaling approximately $2.0 billion were analyzed. Not a single dollar went to any child safety, age verification, or tech policy organization. The Schedule I grant pathway through the Arabella network is definitively ruled out.
Five confirmed channels connect Meta’s spending to ASAA advocacy: direct federal lobbying ($26.3M), state lobbyist networks (45 states), the Digital Childhood Alliance (astroturf 501(c)(4)), super PACs ($70M+), and state legislative campaigns (3 laws passed). A sixth channel through the Arabella dark money network is structurally possible but unproven.
These standalone HTML documents provide detailed views of the investigation:
Full Investigation Documentation contains the complete OSINT investigation report with all five channels, evidence tables, and source citations.
Funding Network Timeline maps the chronological development of Meta’s lobbying infrastructure, DCA’s formation, and ASAA legislative progress across states.
Research Timeline tracks the investigation itself, showing when each finding was established and how threads connected.
Meta retained 40+ lobbying firms and 87 federal lobbyists in 2025 (85% with prior government service). Meta’s own LD-2 filings with the Senate explicitly list H. R. 3149/S. 1586, the App Store Accountability Act, as a lobbied bill. The filing narrative includes “protecting children, bullying prevention and online safety; youth safety and federal parental approval; youth restrictions on social media.”
At the state level, confirmed operations include $338,500 to Headwaters Strategies (Colorado), $324,992+ across 9 firms and 12 lobbyists in Louisiana, and $1,036,728 in direct California lobbying (Q1-Q3 2025 alone). A Meta lobbyist brought the legislative language for Louisiana HB-570 directly to the bill’s sponsor, Rep. Kim Carver, who confirmed this publicly.
DCA is a 501(c)(4) advocacy group that Meta covertly funds. Bloomberg exposed the funding relationship in July 2025. Under oath at a Louisiana Senate committee hearing, Executive Director Casey Stefanski admitted receiving tech company funding but refused to name donors.
DCA has no EIN in the IRS Business Master File, no incorporation record in any state registry searched (CO, DC, DE, VA, OpenCorporates), and no Form 990 on file. It processes donations through the For Good DAF (formerly Network for Good) as a “Project,” not a standalone nonprofit. Its likely fiscal sponsor is NCOSEAction/Institute for Public Policy (EIN 88-1180705), NCOSE’s confirmed 501(c)(4) affiliate with the same leadership.
DCA’s domain was registered December 18, 2024. The website was live and fully formed the next day. Every blog post and testimony targets Apple and Google. Meta is never mentioned or criticized.
Meta committed over $70 million to four state-level super PACs: ATEP ($45M, bipartisan, co-led by Hilltop Public Solutions), META California ($20M), California Leads ($5M), and Forge the Future (Texas, Republican-aligned). Forge the Future’s stated policy priority is “empowering parents with oversight of children’s online activities,” which mirrors ASAA language exactly.
Hilltop Public Solutions co-leads the $45M ATEP super PAC and is also involved in DCA’s messaging coordination, making it the first firm confirmed in both Meta’s PAC operation and the astroturf advocacy track.
All super PACs are registered at the state level rather than with the FEC, scattering disclosure filings across individual state ethics commissions instead of a single searchable federal database.
Meta’s Colorado lobbyist Adam Eichberg simultaneously serves as Board Chair of the New Venture Fund, the flagship 501(c)(3) of the Arabella Advisors network. NVF transfers $121.3 million annually to the Sixteen Thirty Fund, a 501(c)(4) with no donor disclosure requirements.
The Arabella network operates four entities from 1828 L Street NW, Washington DC (suites 300-A through 300-D) with combined annual revenue exceeding $1.3 billion. All five entities’ grant recipients were analyzed (4,433 grants, approximately $2.0 billion). Zero dollars went to any child safety organization, definitively ruling out the Schedule I grant pathway.
If Meta money flows through the Arabella network to DCA, it would have to travel via fiscal sponsorship, consulting fees, or lobbying expenditures, which are more opaque than grant disclosures.
ASAA has been signed into law in three states:
Roughly 17 additional states have introduced or are considering ASAA bills, including Kansas, South Carolina, Ohio, Georgia, and Florida. The federal version was introduced in May 2025 by Rep. John James (R-MI) and Sen. Mike Lee (R-UT).
Each finding below is documented with sources in the corresponding analysis file.
Meta funds DCA, confirmed by Bloomberg reporters and partially admitted by Stefanski under oath at the Louisiana Senate Commerce Committee hearing (April 2025). Sources: Insurance Journal/Bloomberg July 2025, Deseret News Dec 2025, The Center Square LA.
Meta deployed 86+ lobbyists across 45 states for ASAA and related campaigns. Source: OpenSecrets, state lobbying registrations.
Meta spent $26.3 million on federal lobbying in 2025, an all-time record exceeding Lockheed Martin and Boeing. Source: OpenSecrets, Quiver Quantitative, Dome Politics.
Meta paid Headwaters Strategies $338,500 for Colorado lobbying between 2019 and 2026. Source: Colorado SOS SODA API.
Adam Eichberg simultaneously co-founded Meta’s Colorado lobbying firm (Headwaters Strategies) and chairs the New Venture Fund board. Sources: Headwaters Strategies website, NVF board page, InfluenceWatch.
NVF does not directly fund any child safety or tech policy organizations via Schedule I grants. Source: NVF Form 990 Schedule I analysis, 2,669 recipients.
DCA and DCI share infrastructure: same registrar (GoDaddy), CDN (Cloudflare), email (Microsoft 365), and marketing platform (Elastic Email). Source: DNS/WHOIS analysis.
Pelican State Partners represents Meta as a lobbying client in Louisiana. Source: F Minus database, LA Board of Ethics.
DCA leadership comes from NCOSE: three of four senior staff have NCOSE connections (Stefanski, Hawkins, McKay). Source: DCA website, NCOSE public records.
ASAA has been signed into law in three states: Utah (SB-142, March 2025), Louisiana (HB-570, June 2025), and Texas (SB 2420, May 2025, paused by judge December 2025). Sources: State legislature records, news coverage.
The Sixteen Thirty Fund does not fund any child safety or tech policy organizations via Schedule I grants (306 of 318 recipients analyzed). Source: STF Form 990 Schedule I, 2024.
All five Arabella entities analyzed: 4,433 grants (approximately $2.0 billion) with zero dollars going to child safety or tech policy organizations. Schedule I pathway definitively ruled out across the entire network. Sources: NVF, STF, North Fund, Windward, Hopewell Form 990 Schedule I filings via ProPublica.
A Meta employee (Jake Levine, Product Manager) contributed $1,175 to ASAA sponsor Matt Ball’s campaign apparatus on June 2, 2025. Source: Colorado TRACER bulk data.
A Google Policy Manager (Kyle Gardner) also contributed $450 to Matt Ball. Multiple tech company employees from ASAA-affected companies targeted the same ASAA bill sponsor. Source: Colorado TRACER bulk data.
Eichberg and Coyne (Headwaters principals) did not contribute to ASAA bill sponsors Ball or Paschal despite $20,000+ combined political giving. Source: Colorado TRACER bulk data.
No direct Meta PAC contributions to any ASAA sponsor across Utah, Louisiana, Texas, or Colorado. Source: FollowTheMoney.org multi-state search.
Todd Weiler (Utah SB-142 sponsor) does not accept corporate contributions and has not discussed ASAA directly with Meta. DCA served as the policy intermediary. Source: Investigative reporting, Weiler’s public statements.
DCA has no EIN in the IRS Business Master File. Not found in any of four regional extracts (eo1-eo4.csv) covering all US tax-exempt organizations. Source: IRS BMF regional extracts.
DCI confirmed in IRS BMF with EIN 39-3684798, Delaware incorporation at 213 N Market St Wilmington, IRS ruling November 2025. Source: IRS BMF extract.
Meta’s Forge the Future super PAC spent $1.3 million in Texas ahead of March 2026 primaries. Source: Texas Ethics Commission filings, news coverage.
DCA’s website deployed less than 24 hours after domain registration: fully functional advocacy site with professional design, statistics, and Heritage/NCOSE testimonials. Source: Wayback Machine CDX API, 100+ snapshots.
77-day pipeline from DCA domain registration (December 18, 2024) to Utah SB-142 signing (March 5, 2025). Site pre-loaded with ASAA talking points before any bill had passed. Source: WHOIS records, Utah Legislature.
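The domain-registration-to-deployment timing findings above rely on the Wayback Machine CDX API, which is a real public endpoint. The sketch below shows how such a query can be built and how the earliest capture timestamp can be read from the response; the sample response data is hypothetical, so the code runs offline.

```python
from urllib.parse import urlencode

CDX = "https://web.archive.org/cdx/search/cdx"

def cdx_url(domain, limit=1):
    """Build a CDX query for the earliest captures of a domain."""
    return CDX + "?" + urlencode({
        "url": domain, "output": "json",
        "limit": limit, "fl": "timestamp,original",
    })

def earliest_timestamp(cdx_json):
    """First capture timestamp from a CDX JSON response (header row + data rows)."""
    header, *rows = cdx_json
    return rows[0][header.index("timestamp")] if rows else None

# Canned example response (hypothetical data, not a real capture record):
sample = [["timestamp", "original"],
          ["20241219000000", "https://example.org/"]]
print(earliest_timestamp(sample))
```

Comparing the first capture timestamp against the WHOIS registration date is what yields findings like the less-than-24-hour deployment window.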
Meta deployed 12 lobbyists for Louisiana HB-570, which passed 99-0. Disproportionate deployment indicates text-control and amendment-blocking rather than vote persuasion. Source: Investigative reporting, LA Board of Ethics.
Three California tech policy employees from Meta, Google, and Pinterest contributed to Matt Ball within 90 days. All from ASAA-affected companies, all out-of-state, targeting a newly-appointed senator. Source: Colorado TRACER bulk data.
Pelican State Partners represents both Meta and Roblox in Louisiana. Both are ASAA beneficiaries, enabling “broad industry support” framing. Source: F Minus database.
DCA’s coalition count inflated from 50+ to 140+ with only six organizations ever publicly named. No member list has been published on the website. Source: DCA website, Wayback Machine.
NCOSE has a confirmed 501(c)(4) affiliate: NCOSEAction / Institute for Public Policy (EIN 88-1180705), IRS ruling May 2025, same address and leadership as NCOSE. Source: IRS BMF, NCOSE website.
Network for Good is a Donor Advised Fund, not a payment processor. DCA is classified as “Project” (ID 258136) in the system. For Good explicitly limits grants to 501(c)(3) organizations. Source: For Good website, IRS determination.
A Meta lobbyist drafted HB-570’s legislative language, confirmed by sponsor Rep. Kim Carver. The bill as originally written placed age verification burden exclusively on app stores, not platforms. Source: Investigative reporting, Carver’s public confirmation.
Nicole Lopez (Meta Director of Global Litigation Strategy for Youth) testified in both Louisiana and South Dakota for ASAA bills, serving as Meta’s national ASAA spokesperson. Source: Legislative hearing records.
The Sixteen Thirty Fund’s $31 million lobbying budget and $13.1 million in professional fees contain zero mentions of child safety, digital policy, age verification, or app stores. Source: STF Form 990 Part IX.
John R. Read (DCA Senior Policy Advisor) lists “Digital Childhood Alliance” as his employer in Colorado TRACER records. Contributed $100 to AG candidate Hetal Doshi (October 2025). Source: Colorado TRACER.
Matt Ball received 8% of total fundraising from tech industry employees. He is the only 2026 Colorado senate candidate with contributions from Meta, Pinterest, Instacart, Anthropic, and Google employees. Four of eight dual-maxed donors are tech employees. Source: Colorado TRACER analysis.
NCOSE Schedule R reveals a two-entity evolution: the original NCOSE Action (EIN 86-2458921, c4 reclassified to c3) was replaced by the Institute for Public Policy (EIN 88-1180705, c4). All 19 NCOSE-to-Institute transaction indicators are marked “No” despite shared leadership. Source: NCOSE Form 990 Schedule R, 2019-2023.
For Good DAF pathway definitively ruled out: 59,736 grant recipients across five years (approximately $1.73 billion) searched with zero matches for DCA, DCI, NCOSE, NCOSEAction, or any related entity. Source: For Good DAF grant data.
NCOSE lobbying spending tripled from $78,000 to $204,000 concurrent with DCA launch and the ASAA legislative push (FY2023 to FY2024). Source: NCOSE Form 990 Part IX.
Forge the Future super PAC explicitly lists an ASAA-aligned policy priority: “Empowering parents with oversight of children’s online activities across devices and digital environments.” Source: Forge the Future filings.
Hilltop Public Solutions bridges Meta’s super PAC and DCA operations. It co-leads ATEP ($45M) and is involved in DCA messaging coordination. First firm confirmed in both tracks. Source: ATEP filings, investigative reporting.
Meta super PACs are state-level entities (not FEC-registered), deliberately scattering filings across state ethics commissions to avoid centralized searchability. Source: FEC search (negative), state PAC registrations.
Meta’s total documented political spending exceeds $70 million: $45M ATEP, $20M META California, $5M California Leads, with downstream flows to Forge the Future (TX) and Making Our Tomorrow (IL). Source: State PAC filings, news coverage.
Casey Stefanski never appears on any NCOSE 990 filing despite reportedly working there ten years. Not among officers, directors, key employees, or five highest-compensated. Source: NCOSE Form 990 filings, 2015-2023.
Meta’s LD-2 filings explicitly list the App Store Accountability Act (H. R. 3149/S. 1586) as a lobbied bill. This is the first direct evidence from Meta’s own federal filings connecting its $26.3M lobbying spend to the specific legislation DCA advocates for. Source: Senate LDA filing UUID b73445ed-15e5-42e7-a1e8-aeb224755267.
Meta simultaneously lobbies FOR ASAA and ON KOSA/COPPA 2.0, supporting legislation that burdens Apple and Google while opposing or amending legislation that would regulate Meta directly. Both appear in the same LD-2 filing. Source: Meta LD-2 Q1-Q2 2025.
LD-2 narrative mirrors DCA messaging: “youth safety and federal parental approval” framing in Meta’s federal filings matches DCA’s “parental approval” and “child protection” advocacy language. Source: LD-2 filing CPI issue code narrative.
Meta funds flow through the Arabella network via non-grant mechanisms (fiscal sponsorship, consulting fees, lobbying expenditures). The Schedule I and For Good DAF pathways are both ruled out.
DCA operates under NCOSEAction (EIN 88-1180705) as fiscal sponsor. The personnel chain is direct (van der Watt to Hawkins to Stefanski), but NCOSE reports zero transactions with its c4 affiliate.
Jake Levine’s contribution to Matt Ball was coordinated by Meta’s government affairs team rather than being purely personal.
Angela Paxton (Texas ASAA sponsor) was among the unnamed state senators supported by Forge the Future.
NCOSE’s lobbying spend tripling is causally related to DCA/ASAA activity (timing is concurrent but program descriptions do not mention ASAA).
DCA’s For Good donation page is cosmetic. Actual funding comes directly from Meta, not small-dollar DAF donations.
This investigation was conducted by a human researcher who directed all research decisions, selected sources, evaluated findings, and wrote the public-facing posts. Claude Code (Anthropic’s CLI tool, running Claude Opus) was used as a research assistant for:
* Bulk data processing: parsing 4,433 IRS Schedule I grant records, 59,736 DAF recipients, 132MB of Colorado TRACER campaign finance data, and IRS Business Master File extracts covering all US tax-exempt organizations
* Cross-referencing findings across 24 analysis files and identifying patterns that span multiple research threads
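The bulk searches described above (e.g. scanning tens of thousands of grant recipients for child-safety-related names) reduce to keyword matching over tabular records. This is an illustrative sketch of that kind of search; the field name and keyword list are hypothetical, since the repository's actual scripts are not reproduced here.

```python
# Keywords a grant-recipient scan of this kind might look for (hypothetical list).
KEYWORDS = ("child", "digital childhood", "ncose", "age verification")

def find_matches(rows, field="recipient_name"):
    """Return rows whose recipient field contains any keyword (case-insensitive).

    rows would typically come from csv.DictReader over a Schedule I or
    DAF grant export.
    """
    return [r for r in rows
            if any(k in r[field].lower() for k in KEYWORDS)]

rows = [{"recipient_name": "Some Arts Fund"},
        {"recipient_name": "Digital Childhood Alliance"}]
print(find_matches(rows))
```

A negative result from this kind of scan is what lets the investigation "definitively rule out" a grant pathway: zero matches across all analyzed recipients.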
Claude Code did not independently choose what to investigate, decide what constitutes a finding, or determine what to publish. Every factual claim in this repository cites a primary source (IRS filing, Senate disclosure, state database, legislative record, or published reporting) that can be independently verified. The tool does not change whether Meta’s LD-2 filing lists H. R. 3149, whether DCA has an EIN, or whether Stefanski admitted tech funding under oath. The records exist or they don’t.
If you want to verify any finding, the source URLs and database identifiers are provided throughout. Start with the primary records, not with this repository.
This is an OSINT research product. All findings are based on public records. Source data is cited throughout.
...
Read the original on github.com »
Find out which AI models your machine can actually run.
...
Read the original on canirun.ai »
Ireland today (June 20) became the 15th coal-free country in Europe, having ended coal power generation at its 915 MW Moneypoint coal plant in County Clare. Initially commissioned in the mid-1980s by ESB, Moneypoint was intended to help Ireland offset the impact of the oil crises in the 1970s by providing a dependable source of energy.
But with Ireland now generating far more renewable energy, burning coal is no longer a pressing need. Data from the energy think tank Ember shows Ireland generated 37% (11.4 TWh) of its electricity from wind in 2024. Solar generation (0.97 TWh in 2024) is nowhere near wind’s level, but it has been continuously breaking records in recent months, and local stakeholders are confident the positive trend will continue.
Following the closure, the Moneypoint plant will continue to serve a limited backup role, burning heavy fuel oil under emergency instruction from Ireland’s transmission system operator EirGrid until 2029.
This strategy is in line with previous plans made by EirGrid and ESB to exit coal-fired generation by the end of 2025, which stipulated that Moneypoint would no longer be active in the wholesale electricity market.
“Ireland has quietly rewritten its energy story, replacing toxic coal with homegrown renewable power,” said Alexandru Mustață, campaigner on coal and gas at Europe’s Beyond Fossil Fuels.
“But this isn’t ‘job done’. The government’s priority now must be building a power system for a renewable future; one with the storage, flexibility, and grid infrastructure needed to run fully on clean, domestic renewable electricity,” Mustață warned.
Jerry Mac Evilly, Campaigns Director at Friends of the Earth Ireland, appealed to the government to keep the oil backup at Moneypoint to an absolute minimum and ultimately decommission it. He also urged the government to prevent further development of data centers, which he said are increasing Ireland’s reliance on fossil gas.
“We also can’t ignore that the government is targeting the installation of at least 2 GW of gas power plants with no strategy to reduce Ireland’s dangerous gas dependency,” he added.
On a broader level, Ireland’s step to close coal power generation at Moneypoint sets a precedent for further European countries’ coal exits to come, says Beyond Fossil Fuels. The group tracks European countries’ progress on their commitments to switching from fossil fuels to renewable energy. So far, 23 European countries have committed to coal phase-outs. Italy is expected to complete its mainland coal phase-out this summer with the upcoming closure of its last two big coal power plants, while mainland Spain is also expecting to declare itself coal-free this summer.
...
Read the original on www.pv-magazine.com »
There is a certain kind of computer review that is really a permission slip. It tells you what you’re allowed to want. It locates you in a taxonomy — student, creative, professional, power user — and assigns you a product. It is helpful. It is responsible. It has very little interest in what you might become.
The MacBook Neo has attracted a lot of these reviews.
The consensus is reasonable: $599, A18 Pro, 8GB RAM, stripped-down I/O. A Chromebook killer, a first laptop, a sensible machine for sensible tasks. “If you are thinking about Xcode or Final Cut, this is not the computer for you.” The people saying this are not wrong. It is also not the point.
Nobody starts in the right place. You don’t begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine’s limits become a map of the territory. You learn what computing actually costs by paying too much of it on hardware that can barely afford it.
I know this because I was running Final Cut Pro X on a 2006 Core 2 Duo iMac with 3GB RAM and 120GB of spinning rust. I was nine. I had no business doing this. I did it every day after school until my parents made me go to bed.
The machine came as a hand-me-down from my nana. She’d wiped it, set it up in her kitchen in Massachusetts. It was one software update away from getting the axe from Apple. I torrented Adobe CS5 the same week. Downloaded Xcode and dragged buttons and controls around in Interface Builder with no understanding of what I was looking at. I edited SystemVersion.plist to make the “About this Mac” window say it was running Mac OS 69, which is the s*x number, which is very funny. I faked being sick to watch WWDC 2011 — Steve Jobs’ last keynote — and clapped alone in my room when the audience clapped, and rebuilt his slides in Keynote afterward because I wanted to understand how he’d made them feel that way.
I knew the machine was wrong for what I wanted to do with it. I didn’t care. Every limitation was just the edge of something I hadn’t figured out yet. It was green fields and blue skies.
I thought about all of this when I opened the Neo for the first time.
What Apple put inside the Neo is the complete behavioral contract of the Mac. Not a Mac Lite. Not a browser in a laptop costume. The same macOS, the same APIs, the same Neural Engine, the same weird byzantine AppKit controls that haven’t meaningfully changed since the NeXT era. The ability to disable SIP and install some fuck-ass system modification you saw in a YouTube tutorial. All of it, at $599.
They cut the things that are, apparently, not the Mac. MagSafe. ProMotion. M-series silicon. Port bandwidth. Configurable memory. What remains is the Retina display, the aluminum, the keyboard, and the full software platform. I held it and thought, “yep, still a Mac.”
Yes, you will hit the limits of this machine. 8GB of RAM and a phone chip will see to that. But the limits you hit on the Neo are resource limits — memory is finite, silicon has a clock speed, processes cost something. You are learning physics. A Chromebook doesn’t teach you that. A Chromebook’s ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself. The kid who tries to run Blender on a Chromebook doesn’t learn that his machine can’t handle it. He learns that Google decided he’s not allowed to. Those are completely different lessons.
Somewhere a kid is saving up for this. He has read every review. Watched the introduction video four or five times. Looked up every spec, every benchmark, every footnote. He has probably walked into an Apple Store and interrogated an employee about it ad nauseam. He knows the consensus. He knows it’s probably not the right tool for everything he wants to do.
He has decided he’ll be fine.
This computer is not for the people writing those reviews — people who already have the MacBook Pro, who have the professional context, who are optimizing at the margin. This computer is for the kid who doesn’t have a margin to optimize. Who can’t wait for the right tool to materialize. Who is going to take what’s available and push it until it breaks and learn something permanent from the breaking.
He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called “Projects” with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes. He is going to open GarageBand and make something that is not a song. He is going to take screenshots of fonts he likes and put them in a folder called “cool fonts” and not know why. Then he is going to have Blender and GarageBand and Safari and Xcode all open at once, not because he’s working in all of them but because he doesn’t know you’re not supposed to do that, and the machine is going to get hot and slow and he is going to learn what the spinning beachball cursor means. None of this will look, from the outside, like the beginning of anything. But one of those things is going to stick longer than the others. He won’t know which one until later. He’ll just know he keeps opening it.
That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
He knows it’s probably not the right tool. It doesn’t matter. It never did.
The reviews can tell you what a computer is for. They have very little interest in what you might become because of one.
...
Read the original on samhenri.gold »
Go full --yolo. We’ve got you.

LLMs are probabilistic: a 1% chance of disaster makes it a matter of when, not if. Safehouse makes this a 0% chance — enforced by the kernel. Safehouse denies write access outside your project directory; the kernel blocks the syscall before any file is touched. All agents work perfectly in their sandboxes, but can’t impact anything outside them.

Agents normally inherit your full user permissions. Safehouse flips this — nothing is accessible unless explicitly granted.

Download a single shell script, make it executable, and run your agent inside it. No build step, no dependencies — just Bash and macOS. Safehouse automatically grants read/write access to the selected workdir (git root by default) and read access to your installed toolchains. Most of your home directory — SSH keys, other repos, personal files — is denied by the kernel.

See it fail — proof the sandbox works. Try reading something sensitive inside Safehouse. The kernel blocks it before the process ever sees the data.

# Try to read your SSH private key — denied by the kernel
safehouse cat ~/.ssh/id_ed25519
# cat: /Users/you/.ssh/id_ed25519: Operation not permitted

# Try to list another repo — invisible
safehouse ls ~/other-project
# ls: /Users/you/other-project: Operation not permitted

# But your current project works fine
safehouse ls .
# README.md src/ package.json …

Add these to your shell config and every agent runs inside Safehouse automatically — you don’t have to remember. To run without the sandbox, use command claude to bypass the function.

# ~/.zshrc or ~/.bashrc
safe() { safehouse --add-dirs-ro=~/mywork "$@"; }

# Sandboxed — the default. Just type the command name.
claude() { safe claude --dangerously-skip-permissions "$@"; }
codex() { safe codex --dangerously-bypass-approvals-and-sandbox "$@"; }
amp() { safe amp --dangerously-allow-all "$@"; }
gemini() { NO_BROWSER=true safe gemini --yolo "$@"; }

# Unsandboxed — bypass the function with `command`
# command claude — plain interactive session

Generate your own profile with an LLM. Use a ready-made prompt that tells Claude, Codex, Gemini, or another model to inspect the real Safehouse profile templates, ask about your home directory and toolchain, and generate a least-privilege sandbox-exec profile for your setup. The guide also tells the LLM to ask about global dotfiles, suggest a durable profile path like ~/.config/sandbox-exec.profile, offer a wrapper that grants the current working directory, and add shell shortcuts for your preferred agents.

Open the copy-paste prompt
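The profile language referenced here is Apple's Scheme-like SBPL, consumed by the built-in sandbox-exec tool. As a rough, hand-written sketch of the deny-by-default shape described above (this is illustrative, not one of Safehouse's actual templates; the file name and the specific grants are assumptions), a minimal profile might look like:

```
;; sketch.sb (hypothetical): deny everything, then grant back the minimum.
(version 1)
(deny default)

;; Allow launching programs and spawning child processes.
(allow process-exec*)
(allow process-fork)

;; Read-only access to system locations and installed toolchains.
(allow file-read*
  (subpath "/usr")
  (subpath "/System")
  (subpath "/Library"))

;; Full read/write only inside the project directory,
;; passed in as a parameter at launch time.
(allow file-read* file-write*
  (subpath (param "WORKDIR")))
```

You would run it along the lines of sandbox-exec -D WORKDIR="$PWD" -f sketch.sb your-command; reads or writes outside the granted subpaths fail with "Operation not permitted", matching the behavior shown above.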
...
Read the original on agent-safehouse.dev »
Welcome to our blog! I’m Jason Williams, a senior software engineer on Bloomberg’s JavaScript Infrastructure and Terminal Experience team. Today the Bloomberg Terminal runs a lot of JavaScript. Our team provides a JavaScript environment to engineers across the company.
Bloomberg may not be the first company you think of when discussing JavaScript. It certainly wasn’t for me in 2018 before I worked here. Back then, I attended my first TC39 meeting in London, only to meet some Bloomberg engineers who were there discussing Realms, WebAssembly, Class Fields, and other topics. The company has now been involved with JavaScript standardization for numerous years, including partnering with Igalia. Some of the proposals we have assisted with include Arrow Functions, Async Await, BigInt, Class Fields, Promise.allSettled, Promise.withResolvers, WeakRefs, standardizing Source Maps, and more!
The first proposal I worked on was Promise.allSettled, which was fulfilling. After that finished, I decided to help out on a proposal around dates and times, called Temporal.
JavaScript is unique in that it runs in all browsers. There is no single “owner,” so you can’t just make a change in isolation and expect it to apply everywhere. You need buy-in from all parties. Evolution happens through TC39, the Technical Committee responsible for ECMAScript.
In 2018, when I first looked at Temporal, it was at Stage 1. The TC39 Committee was convinced the problem was real. It was a radical proposal to bring a whole new library for Dates and Times into JavaScript. It was:
* Providing different DateTime Types (instead of a single API)
But how did we get here? Why was Date such a pain point? For that, we need to take a step back.
In 1995, Brendan Eich was tasked with a 10-day sprint to create Mocha (which would later become JavaScript). Under intense time pressure, many design decisions were pragmatic. One of them was to port Java’s Date implementation directly. As Brendan later explained:
It was a straight port by Ken Smith (the only code in “Mocha” I didn’t write) of Java’s Date code from Java to C.
At the time, this made sense. Java was ascendant and JavaScript was being framed as its lightweight companion. Internally, the philosophy was even referred to as MILLJ: Make It Look Like Java.
Brendan also noted that changing the API would have been politically difficult:
Changing it when everyone expected Java to be the “big brother” language would make confusion and bugs; Sun would have objected too.
In that moment, consistency with Java was more important than fundamentally rethinking the time model. It was a pragmatic trade-off. The Web was young, and most applications making use of JavaScript would be simple, at least to begin with.
By the 2010s, JavaScript was powering banking systems, trading terminals, collaboration tools, and other complex systems running in every time zone on earth. Date was becoming more of a pain point for developers.
Developers would often write helper functions that accidentally mutated the original Date object in place when they intended to return a new one:

const date = new Date("2026-02-25T00:00:00Z");
console.log(date.toISOString());
// "2026-02-25T00:00:00.000Z"

function addOneDay(d) {
  // Oops! This mutates the caller's Date in place.
  d.setDate(d.getDate() + 1);
  return d;
}

addOneDay(date);
console.log(date.toISOString());
// "2026-02-26T00:00:00.000Z"
const billingDate = new Date("Sat Jan 31 2026");
billingDate.setMonth(billingDate.getMonth() + 1);
// Expected: Feb 28
// Actual: Mar 03

Sometimes people want to get the last day of the month and fall into traps like this one: they bump the month by one, but the day stays the same. Date does not constrain invalid calendar results back into a valid date. Instead, it silently rolls the overflow into the next month.
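For comparison, here is a minimal sketch of what callers had to write themselves to get clamping behavior out of legacy Date (the addMonthsClamped helper is mine, for illustration, not from the article or any library):

```javascript
// Hypothetical helper: clamp the day-of-month instead of letting
// Date roll the overflow into the next month.
function addMonthsClamped(date, months) {
  const d = new Date(date.getTime()); // copy, so the caller's Date isn't mutated
  d.setDate(1);                       // move to a safe day first
  d.setMonth(d.getMonth() + months);  // now the day can't push us past the month
  // Day 0 of the following month is the last day of the target month.
  const daysInMonth = new Date(d.getFullYear(), d.getMonth() + 1, 0).getDate();
  d.setDate(Math.min(date.getDate(), daysInMonth));
  return d;
}

console.log(addMonthsClamped(new Date(2026, 0, 31), 1).toDateString());
// "Sat Feb 28 2026"
```

Every datetime library ends up carrying some version of this logic; Temporal builds the constrain-by-default behavior into the language instead.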
new Date("2026-06-25 15:15:00").toISOString();
// Potential return values, depending on the engine:
// - interpreted in the local time zone
// - a RangeError: Invalid Date
// - interpreted as UTC
In this example, the string is similar, but not identical, to ISO 8601. Historically, browser behavior for “almost ISO” strings was undefined by the specification. Some would treat it as local time, others as UTC, and one would throw entirely as invalid input.
There’s more, much more, but the point is that Date has been a pain point for JavaScript developers for the past three decades.
The Web ecosystem had no choice but to patch Date’s shortcomings with libraries. You can see the sheer rise of datetime libraries below. Today, they add up to more than 100 million downloads a week.
Leading the charge was Moment.js, which boasts an expressive API, powerful parsing capabilities, and much-needed immutability. Created in 2011, it quickly became the de facto standard for handling date and time manipulations in JavaScript. So surely the problem is solved? Everyone should just grab a copy of this and call it a day.
The widespread adoption of Moment.js (plus other similar libraries) came with its own set of problems. Adding the library meant increasing bundle size, because it shipped with its own set of locale information plus time zone data from the time zone database.
Despite the use of minifiers, compilers, and static analysis tools, all of this extra data couldn’t be tree-shaken away, because most developers don’t know ahead of time which locales or time zones they’ll need. In order to play it safe, the majority of users took all of the data wholesale and shipped it to their users.
Maggie Johnson-Pint, who had been a maintainer of Moment.js for quite a few years (alongside others), was no stranger to requests to deal with the package size.
We were at the point with moment that it was more maintenance to keep up with modules, webpack, people wanting everything immutable because React, etc than any net new functionality
And people never stop talking about the size of course.
In 2017, Maggie decided it was time to standardise dates and times with a “Temporal Proposal” for the TC39 plenary that year. It was met with great enthusiasm, leading it to be advanced to Stage 1.
Stage 1 was a big milestone, but it was still far from the finish line. After the initial burst of energy, progress naturally slowed. Maggie and Matt Johnson-Pint were leading the effort alongside Brian Terlson, while simultaneously balancing other responsibilities inside Microsoft. Temporal was still early enough that much of the immediate work was unglamorous: requirements gathering, clarifying semantics, and translating “the ecosystem’s pain” into a design that could actually ship.
We run JavaScript at scale across the Terminal, using underlying runtimes and engines such as Chromium, Node.js and SpiderMonkey. Our users, and the financial markets in which they invest, span every time zone on earth. We pass timestamps constantly: between services, into storage, into the UI, and across systems that all have to agree on what “now” means, even when governments change DST rules with very little notice.
On top of that, we had requirements that the built-in Date model simply wasn’t designed for:
* A user-configured time zone that is not the machine’s time zone (and can change per request).
* Higher-precision timestamps (nanoseconds, at a minimum), without duct-taping extra fields onto ad-hoc wrappers forever.
In parallel with Maggie bringing Temporal to TC39, Bloomberg engineer Andrew Paprocki was talking with Igalia about making time zones configurable in V8. Specifically, they discussed introducing a supported indirection layer so an embedder could control the “perceived” time zone instead of relying on the OS default. In that conversation, Daniel Ehrenberg (then working at Igalia) pointed Andrew at the early Temporal work because it looked strikingly similar to Bloomberg’s existing value-semantic datetime types.
That exchange became an early bridge between Bloomberg’s production needs, Igalia’s browser-and-standards expertise, and the emerging direction of Temporal. Over the years that followed, Bloomberg partnered with Igalia (including via sustained funding support) and contributed engineering time directly into moving Temporal forward, until it eventually became something the whole ecosystem could ship. Andrew was looking for some volunteers within Bloomberg who could help push Temporal forward and Philipp Dunkel volunteered to be a spec champion. Alongside Andrew, he helped persuade Bloomberg to invest in making Temporal real, including a deeper partnership with Igalia. That support brought in Philip Chimento and Ujjwal Sharma as full time Temporal champions, adding the day-to-day focus the proposal needed to keep moving ahead.
Shane Carr joined the Champions team, representing Google’s Internationalization team. He provided the focus we needed on internationalization topics such as calendars, and also served as the glue between the standardization process and the voice of users who experienced pain points with tools related to JavaScript’s internationalization API (Intl), such as formatting, time zones, and calendars.
Finally, we had Justin Grant, who joined the Temporal champions in 2020 as a volunteer. After 10 years at three different startups that managed time-stamped data, he’d seen engineering teams waste thousands of hours fixing mistakes with dates, times, and time zones. Justin’s experience grounded us in real-world use cases, helped us anticipate mistakes that developers would make, and ensured that Temporal shipped a Temporal.ZonedDateTime API to help make DST bugs a thing of the past.
Other honorable mentions not on this list include Daniel Ehrenberg, Adam Shaw, and Kevin Ness.
Temporal is a top-level namespace object (similar to Math or Intl) that exists in the global scope. Underneath it are “types” that exist in the form of constructors. It’s expected that developers will reach for the type they need when using the API, such as Temporal.PlainDateTime, for example.
Here are the types Temporal comes packed with:

* Temporal.Instant
* Temporal.ZonedDateTime
* Temporal.PlainDateTime
* Temporal.PlainDate
* Temporal.PlainTime
* Temporal.PlainYearMonth
* Temporal.PlainMonthDay
* Temporal.Duration
* Temporal.Now (a namespace of functions rather than a constructor)
If you don’t know which Temporal type you need, start with Temporal.ZonedDateTime. It is the closest conceptual replacement for Date, but without the “footguns.” It combines:

* An exact moment in time (internally, nanoseconds since epoch)
* A time zone
* A calendar
* All as an immutable value
// Before: legacy Date
const now = new Date();

// After: Temporal
const now = Temporal.Now.zonedDateTimeISO();
The above example uses the Now namespace, which gives you the type already set to your current local time and time zone.
This type is optimized for datetimes whose arithmetic may cross a daylight saving transition, which could otherwise cause problems. ZonedDateTime takes those transitions into account when doing any addition or subtraction of time (see the example below).
// London DST starts: 2026-03-29 01:00 -> 02:00
const zdt = Temporal.ZonedDateTime.from(
  "2026-03-29T00:30:00+00:00[Europe/London]",
);
console.log(zdt.toString());
// "2026-03-29T00:30:00+00:00[Europe/London]"

const plus1h = zdt.add({ hours: 1 });
console.log(plus1h.toString());
// "2026-03-29T02:30:00+01:00[Europe/London]" (01:30 doesn't exist)
In this example, we don’t land at 01:30 but 02:30 instead, because 01:30 doesn’t exist at that specific point in time.
Temporal.Instant is an exact moment in time: it has no time zone, no daylight saving, no calendar. It represents elapsed time since midnight on January 1, 1970 (the Unix epoch). Unlike Date, which has a very similar data model, Instant is measured in nanoseconds rather than milliseconds. The champions took this decision because, even though the browser applies some coarsening for security purposes, developers still need to deal with nanosecond-based timestamps that could have been generated elsewhere.

A typical example of Temporal.Instant usage looks like this:
// One exact moment in time
const instant = Temporal.Instant.from("2026-02-25T15:15:00Z");

instant.toString();
// "2026-02-25T15:15:00Z"

instant.toZonedDateTimeISO("Europe/London").toString();
// "2026-02-25T15:15:00+00:00[Europe/London]"

instant.toZonedDateTimeISO("America/New_York").toString();
// "2026-02-25T10:15:00-05:00[America/New_York]"
The Instant can be created once and then converted to different “zoned” datetimes (more on that later). You would most likely store the Instant (in your backing storage of choice) and then use the different time zone conversions to display the same time to users within their time zones.
We also have a family of plain types. These are what we would call “wall time,” because if you imagine an analogue clock on the wall, it doesn’t check for daylight saving or time zones. It’s just a plain time (moving the clock forward by an hour would advance it an hour on the wall, even if you did this during a Daylight Saving transition).
We have several types with progressively less information. This is useful, as you can choose the type you want to represent and don’t need to worry about running calculations on any unneeded data (such as calculating the time if you’re only interested in displaying the date).
These types are also useful if you only plan to display the value to the user and do not need to perform any date/time arithmetic, such as moving forwards or backwards by weeks (you will need a calendar) or hours (you could end up crossing a daylight saving boundary). The limitations of some of these types are also what make them so useful. It’s hard for you to trip up and encounter unexpected bugs.
const date = Temporal.PlainDate.from({ year: 2026, month: 3, day: 11 }); // => 2026-03-11
date.year; // => 2026
date.inLeapYear; // => false
date.toString(); // => '2026-03-11'
Temporal supports calendars. Browsers and runtimes ship with a set of built-in calendars, which lets you represent, display, and do arithmetic in a user’s preferred calendar system, not just format a Gregorian date differently.
Because Temporal objects are calendar-aware, operations like “add one month” are performed in the rules of that calendar, so you land on the expected result. In the example below, we add one Hebrew month to a Hebrew calendar date:
const today = Temporal.PlainDate.from("2026-03-11[u-ca=hebrew]");

today.toLocaleString("en", { calendar: "hebrew" });
// '22 Adar 5786'

const nextMonth = today.add({ months: 1 });

nextMonth.toLocaleString("en", { calendar: "hebrew" });
// '22 Nisan 5786'
With legacy Date, there’s no way to express “add one Hebrew month” as a first-class operation. You can format using a different calendar, but any arithmetic you do is still Gregorian month arithmetic under the hood.
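That gap can be demonstrated with only the standard Intl API (a sketch I wrote for illustration; the hebrewDay helper is hypothetical): you can format a Date in the Hebrew calendar, but adding a Gregorian month does not land on the same Hebrew day-of-month.

```javascript
// Legacy Date can *format* in the Hebrew calendar, but its arithmetic
// stays Gregorian, so "+1 month" drifts relative to the Hebrew date.
const fmt = new Intl.DateTimeFormat("en", {
  calendar: "hebrew",
  timeZone: "UTC", // pin the zone so the civil date is deterministic
  day: "numeric", month: "long", year: "numeric",
});

// Hypothetical helper: extract the Hebrew day-of-month as a string.
const hebrewDay = (date) =>
  fmt.formatToParts(date).find((p) => p.type === "day").value;

const d = new Date(Date.UTC(2026, 2, 11)); // 2026-03-11 (22 Adar 5786)
const before = hebrewDay(d);

d.setUTCMonth(d.getUTCMonth() + 1); // Gregorian "+1 month" -> 2026-04-11
const after = hebrewDay(d);

console.log(before !== after);
// true: the Hebrew day-of-month has drifted, unlike Temporal's
// calendar-aware add({ months: 1 }) shown above
```

The drift happens because a Gregorian month (here, 31 days) rarely matches the length of the current Hebrew month, which is exactly the mismatch calendar-aware arithmetic is designed to absorb.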
...
Read the original on bloomberg.github.io »