10 interesting stories served every morning and every evening.

Internet Archive Switzerland

internetarchive.ch

Welcome

Welcome to Internet Archive Switzerland.

Universal Access to ALL Knowledge.

Internet Archive Switzerland is an independent Swiss foundation operating as a non-profit organisation based in St. Gallen. Our primary goal: Universal Access to All Knowledge.

Together with like-minded partners we collect and preserve digital information for learning and research. Our objective is to ensure people can find any kind of helpful digital materials, today and in the future.

Although at first glance the digital content available on the internet may seem inexhaustible and endlessly growing, it is also evident that digital information is actually short-lived.

We are facing constant changes in file formats, sudden failure of storage media, rapid deletion processes (accidental or deliberate) and an increasing tendency to hide knowledge behind paywalls. All of this jeopardises easy access to information, learning and the shaping of opinions on the basis of facts.

All of this has led us to launch two initiatives in the foundation’s early stages: We are partnering with the University of St. Gallen to build the Gen AI (Generative Artificial Intelligence) Archive, preserving today’s AI models for future generations. And through our Endangered Archives initiative, we invite global partners to explore ways of rescuing vulnerable collections from conflict, disaster, and suppression before they are lost.

Projects

What we are WORKING on.

Project 01 · Research

Gen AI Archive

Artificial Intelligence, Generative AI, and Large Language Models (LLMs) are fundamentally reshaping how humanity creates and shares knowledge. To document this evolution, the University of St. Gallen and Internet Archive Switzerland are partnering to preserve today’s most significant models for future generations in the Gen AI Archive.

Project 02 · Preservation

Endangered Archives

Cultural heritage and historical records worldwide face ever-growing threats from conflict, instability, and natural disasters. To prevent the loss of this collective memory, Internet Archive Switzerland seeks to establish an initiative called Endangered Archives. In cooperation with UNESCO and other well-established organisations we aim to rescue vulnerable materials by providing a secure digital haven.

About

The foundation.

Organization

An independent non-profit foundation in St. Gallen, Switzerland. Mission-aligned with the Internet Archive, Internet Archive Canada and Internet Archive Europe, sharing a common goal: Universal Access to All Knowledge.

Our charter states (excerpt): The purpose of the Foundation is to advance the preservation and universal accessibility of all knowledge as inspired by the United Nations Universal Declaration of Human Rights (Articles 19, 26, and 27) and the United Nations’ Sustainable Development Goal 4, which strives to ensure inclusive and equitable quality education and promote lifelong learning opportunities for all.

Board & Advisers

Executive Director

Roman Griesfelder

Internet Archive Switzerland is led by Roman Griesfelder as Executive Director. Roman is an Austrian citizen and has been living in Switzerland since 1998. The sociologist and business administrator worked for many years in senior roles as a project manager and management consultant before holding leading positions at cultural institutions in Switzerland. His wide-ranging interests converge at the points where social, cultural and technological developments intersect and affect the lives of many people, or even just a few individuals.

Location

St. Gallen.

47.4245° N · 9.3767° E

St. Gallen is no stranger to the idea that preserving the record is a form of civic responsibility. Its archival tradition stretches back over a thousand years, a fitting symbolic home for this new chapter of the Internet Archive.

The existence of the Abbey Archives proves that, with conviction and perseverance, it is possible to preserve the foundations of our knowledge about society. This conviction motivates Internet Archive Switzerland to embark boldly on its mission and to pursue it unwaveringly.

We are also delighted to be partnering with the University of St. Gallen to establish the world’s first comprehensive AI archive.

Blog

Latest NEWS.

May 5, 2026 • 5 minutes of reading

Internet Archive Switzerland Launches in St. Gallen

A Thousand Years of Memory, and a New Chapter

On May 5th, 2026, Internet Archive Switzerland celebrates its launch at the exhibition hall of the Abbey Archives of St. Gallen, one of the oldest continuously active archives in the world. We are grateful to Peter Erhart and the Abbey Archives of St. Gallen for hosting us: two institutions, one a millennium old and one…

Read more: Internet Archive Switzerland Launches in St. Gallen

Internet Archive Switzerland: Expanding a Global Mission to Preserve Knowledge

blog.archive.org

Thirty years ago, Brewster Kahle founded the Internet Archive with an ambitious goal: Universal Access to All Knowledge. Today, that mission continues to grow with an exciting new chapter: the launch of the Internet Archive Switzerland, a non-profit foundation based in St. Gallen.

The Internet Archive Switzerland, online at https://internetarchive.ch/, is a newly formed Swiss non-profit foundation that will operate independently within its national context. Its efforts will initially focus on preserving endangered archives from around the world and collecting the generative AI wave that is currently upon us all. With a UNESCO conference planned for November 2026 in Paris, Internet Archive Switzerland is taking a concrete step to explore how endangered archives can be protected.

In parallel, the Swiss foundation is working in partnership with the School of Computer Science at the University of St. Gallen on the Gen AI Archive project led by Prof. Dr. Damian Borth. Together, they aim to begin archiving AI models, an emerging frontier for preservation.

The choice of St. Gallen is no coincidence. With a thousand-year tradition of archiving and scholarship, the city offers a fitting home for this next phase of digital preservation. Its strong academic environment, including collaboration with the University of St. Gallen, makes it an ideal place to establish a 21st-century memory organization.

“St. Gallen is a very suitable place to take the preservation of our universal knowledge a step further. Stability and innovation go hand in hand here and are embedded in a deep understanding of the importance of cultural heritage,” said Roman Griesfelder, the executive director of Internet Archive Switzerland.

Internet Archive Switzerland joins a growing group of mission-aligned organizations, alongside Internet Archive, Internet Archive Canada, and Internet Archive Europe. Together, these independent libraries strengthen a shared vision: building a distributed, resilient digital library for the world.

Contact: Internet Archive Switzerland, Roman Griesfelder, Executive Director, office@internetarchive.ch

EU calls VPNs “a loophole that needs closing” in age verification push

cyberinsider.com

The European Parliamentary Research Service (EPRS) has warned that virtual private networks (VPNs) are increasingly being used to bypass online age-verification systems, describing the trend as “a loophole in the legislation that needs closing.”

The warning comes as governments across Europe and elsewhere continue expanding online child-safety rules that require platforms to verify users’ ages before granting access to adult or age-restricted content.

VPNs are privacy tools designed to encrypt internet traffic and hide a user’s IP address by routing connections through remote servers. While widely used for legitimate purposes such as protecting communications, avoiding surveillance, and enabling secure remote work, regulators are increasingly concerned that the same technology allows minors to circumvent regional age checks.

The EPRS notes that VPN usage surged after mandatory age-verification laws took effect in countries including the United Kingdom and several US states. In the UK, where online services are now required to prevent children from accessing harmful content, VPN apps reportedly dominated download charts after the law came into force.

The document explicitly frames VPNs as a regulatory gap, stating that some policymakers and child-safety advocates believe VPN access itself should require age verification. England’s Children’s Commissioner has also called for VPN services to be restricted to adults only.

However, forcing users to verify their identity before accessing VPN services could significantly weaken anonymity protections and create new risks around surveillance and data collection. VPN providers and other privacy advocates have already expressed their objections to this approach in a letter sent to UK policymakers.

Last month, researchers found multiple security and privacy flaws in the European Commission’s official age-verification app shortly after its release. The app, promoted as a privacy-preserving tool under the DSA framework, was discovered storing sensitive biometric images in unencrypted locations and exposing weaknesses that could allow users to bypass verification controls entirely.

The EPRS paper acknowledges that age verification remains technically difficult and fragmented across the EU. Current systems based on self-declaration, age estimation, or identity verification are described as relatively easy for minors to bypass. The report highlights emerging approaches, such as “double-blind” verification systems used in France, where websites receive only confirmation that a user meets age requirements without learning the user’s identity, while the verification provider does not see which websites the user visits.

At the same time, regulators are beginning to address VPN use directly in legislation. Utah recently became the first US state to enact a law explicitly targeting VPN use in online age verification. The state’s SB 73 defines a user’s location based on physical presence rather than apparent IP address, even if VPNs or proxy services are used to mask it.

The EPRS suggests VPN providers may face increasing scrutiny as the EU revises cybersecurity and online safety legislation, noting that future updates to the EU Cybersecurity Act could introduce child-safety requirements aimed at preventing VPN misuse to bypass legal protections.

LLMs Corrupt Your Documents When You Delegate

arxiv.org

I Will Not Add Query Strings to Your URLs

susam.net

By Susam Pal on 09 May 2026

Last evening, a short blog post appeared in my feed reader that felt as if it spoke directly to me. It is Chris Morgan’s excellent post “I’ve banned query strings”.

Contents

Wisdom on the Web

Wander on the Web

Misfeature

Broken URLs

Qualms

Conclusion

Wisdom on the Web

Chris is someone whose Internet comments I have been reading for about half a decade now. I first stumbled upon his comments on Hacker News, where he left very detailed feedback on a small collection of boilerplate CSS rules I had shared there. I am by no means a web developer. I have spent most of my professional life doing systems programming in C and C++. However, developing websites and writing small HTML tools has been a long-time hobby for me. I have learnt most of my web development skills as a hobbyist by studying what other people do: first by viewing the source of websites I liked in the early 2000s, and later by occasionally getting possessed by the urge to implement a new game or tool and searching MDN Web Docs to learn whatever I needed to make it work. One problem with learning a skill this way is that you sometimes pick up habits and practices that are fashionable but not necessarily optimal or correct. So it was really valuable to me when Chris commented on my collection of boilerplate CSS rules. It helped me improve my CSS a lot. In fact, a few of the lessons from his comment have really stuck with me; I keep them in mind whenever I make a hobby HTML project: always retain underlines in links and retain purple for visited links.

I have been following Chris’s posts and comments on web-related topics since then. He often posts great feedback on web-related projects. Whenever I come across one, I make sure to read it carefully, even when the project isn’t mine. I always end up learning something nice and useful from his comments. Here is one such recent example from the Lobsters story “Adding author context to RSS”.

Wander on the Web

A couple of months ago, I created a new project called Wander Console. It is a small, decentralised, self-hosted web console that lets visitors to your website explore interesting websites and pages recommended by a community of independent personal website owners. For example, my console is here: susam.net/wander/. If you click the ‘Wander’ button there, the tool loads a random personal web page recommended by the Wander community.

The tool consists of one HTML file that implements the console and one JavaScript file where the website owner defines a list of neighbouring consoles along with a list of web pages they recommend. If you copy these two files to your web server, you instantly have a Wander console live on the Web. You don’t need any server-side logic or server-side software beyond a basic web server to run Wander Console. You can even host it in constrained environments like Codeberg Pages or GitHub Pages. When you click the ‘Wander’ button, the console connects to other remote consoles, fetches web page recommendations, picks one randomly and loads it in your web browser. It is a bit like the now defunct StumbleUpon, but it is completely decentralised. It is also a bit like web rings, except that the community network is not restricted to being a cycle; it is a graph and it is flexible.

There are currently over 50 websites hosting this tool. Together, they recommend over 1500 web pages. You can find a recent snapshot of the list of known consoles and the pages they recommend at susam.codeberg.page/wcn/. To learn more about this tool or to set it up on your website, please see codeberg.org/susam/wander.
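The core "Wander" step, merging the recommendations gathered from this console and its neighbours and picking one at random, can be sketched in a few lines of JavaScript. This is a hypothetical reconstruction, not the project's actual code (see codeberg.org/susam/wander for that); the data format and names here are assumed.

```javascript
// Merge several lists of recommended page URLs (this console's own
// list plus lists fetched from neighbouring consoles), de-duplicate
// them, and pick one page at random.
function pickRandomPage(recommendationLists) {
  const pages = [...new Set(recommendationLists.flat())];
  if (pages.length === 0) return null;
  return pages[Math.floor(Math.random() * pages.length)];
}

// Example: recommendations from this console and one neighbour,
// with one page recommended by both.
const local = ["https://example.org/a", "https://example.net/b"];
const neighbour = ["https://example.net/b", "https://example.com/c"];
const destination = pickRandomPage([local, neighbour]);
// destination is one of the three unique pages.
```

In the real tool the neighbour lists arrive over the network, but keeping the selection step pure like this makes it easy to test without a server.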

Misfeature

In case you were wondering why I suddenly plugged my project into this post in the previous section, it is because I recently added a dubious feature to that project, one that I myself was not entirely convinced about. That misfeature is relevant to this post.

In version 0.4.0 of Wander Console, I added support for a via= query parameter while loading web pages. For example, if you encountered midnight.pub while using the console at susam.net/wander/, the console loaded the page using the following URL:

https://midnight.pub/?via=https://susam.net/wander/
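For illustration, appending such a parameter with the standard URL API might look like the sketch below. This is not the console's actual implementation (that code was later removed); note that URLSearchParams percent-encodes the value, and even this careful construction still produces a different URL from the one the site's author published.

```javascript
// Append a via= referral parameter to a page URL using the
// standard URL API (illustrative sketch only).
function addViaParam(pageUrl, consoleUrl) {
  const url = new URL(pageUrl);
  url.searchParams.set("via", consoleUrl); // value gets percent-encoded
  return url.toString();
}

// Example:
const tracked = addViaParam("https://midnight.pub/", "https://susam.net/wander/");
// tracked is no longer the URL the site owner published.
```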

This allowed the owner of the recommended website to see, via their access logs, that the visit originated from a Wander Console. Chris’s recent blog post is critical of features like this. He writes:

I don’t like people adding tracking stuff to URLs. Still less do I like people adding tracking stuff to my URLs.

https://chrismorgan.info/no-query-strings?ref=example.com? Did I ask? If I wanted to know I’d look at the Referer header; and if it isn’t there, it’s probably for a good reason. You abuse your users by adding that to the link.

I mentioned earlier that I was not entirely convinced that adding a referral query string was a good thing to do. Why did I add it anyway? I succumbed to popular demand. Let me briefly describe my frame of mind when I considered and implemented that feature. When I first saw the feature request on Codeberg, my initial reaction was reluctance. I wasn’t convinced it was a good feature. But I was too busy with some ongoing algebraic graph theory research, another recent hobby, with a looming deadline, so I didn’t have a lot of time to think about it clearly. In fact, everything about Wander Console has been made in very little time during the short breaks I used to take from my research. I made the first version of the console in about one and a half hours one early morning when my brain was too tired to read more algebraic graph theory literature and I really needed a break. During another such break, I revisited that feature request and, despite my reservations, decided to implement it anyway. During yet another such break, I am writing this post.

Normally, I don’t like adding too many new features to my little projects. I want them to have a limited scope. I also want them to become stable over time. After a project has fulfilled some essential requirements I had, I just want to call it feature complete and never add another feature to it again. I’ll fix bugs, of course. But I don’t like to keep adding new features endlessly. That’s my style of maintaining my hobby projects. So it should have been very easy for me to ignore the feature request for adding a referral query string to URLs loaded by the console tool. But I think a tired body and mind, worn down by long and intense research work, took a toll on me.

Although my gut feeling was telling me that it was not a good feature, I couldn’t articulate to myself exactly why. So I implemented the referral query string feature anyway. While doing so, I added an opt-out mechanism to the configuration, so that if someone else didn’t like the feature, they could disable it for themselves. This was another mistake. A questionable feature like this should be implemented as an opt-in feature, not an opt-out feature, if implemented at all. The fact that I didn’t have a lot of time to reason through the implications of this feature meant that I just went ahead and implemented it without thinking about it critically. As the famous quote from Jurassic Park goes:

Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.

Broken URLs

It soon turned out that my gut feeling was correct. After I implemented that feature, a page from one of my favourite websites refused to load in the console. To illustrate the problem, here are a few similar but slightly different URLs for that page:

https://int10h.org/oldschool-pc-fonts/fontlist/

https://int10h.org/oldschool-pc-fonts/fontlist/?2

https://int10h.org/oldschool-pc-fonts/fontlist/?foo

The first and second URLs load fine, but the third URL returns an HTTP 404 error page. The website uses the query string to determine which one of its several font collections to show. So when we add an arbitrary query string to the URL, the website tries to interpret it as a font collection identifier and the page fails to load. That is why, when my tool added the via= query parameter to the first URL, the page failed to load.

Later, with a little time to breathe and some hindsight, I could articulate why adding referral query strings to a working URL was such a bad idea. Altering a URL gives you a new URL. The new URL could point to a completely different resource, or to no resource at all, even if the alteration is as small as adding a seemingly harmless query string. By adding the referral query string, I had effectively broken a working URL from a website I am very fond of.

Qualms

It is also worth asking whether an HTML tool should concern itself with referral query strings at all when web browsers already have a mechanism for this: the HTTP Referer header, governed by Referrer-Policy. That policy can be set at the server level, the document level or even on individual links. The Web standards already provide deliberate controls to decide how much referrer information should be sent. Appending referral query strings to URLs bypasses those controls. It moves a privacy and attribution concern out of the referrer mechanism and embeds it into the destination URL instead. I don’t think an HTML tool should do that.
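As a rough illustration of those controls, here is a simplified model (my own sketch, not a browser implementation) of how a few common Referrer-Policy values decide what Referer value, if any, is sent along with a navigation:

```javascript
// Simplified model of a few Referrer-Policy values. Returns the
// Referer value a browser would send, or null for no header at all.
function refererFor(policy, fromUrl, toUrl) {
  const from = new URL(fromUrl);
  const to = new URL(toUrl);
  switch (policy) {
    case "no-referrer":
      return null; // never send anything
    case "origin":
      return from.origin + "/"; // only scheme + host, never the path
    case "same-origin":
      return from.origin === to.origin ? fromUrl : null;
    case "strict-origin-when-cross-origin": // the modern browser default
      if (from.origin === to.origin) return fromUrl;
      // Cross-origin: send only the origin, and nothing at all on an
      // https -> http downgrade.
      return from.protocol === "https:" && to.protocol === "http:"
        ? null
        : from.origin + "/";
    default:
      throw new Error("policy not modelled: " + policy);
  }
}
```

The point of the model is that the sender of the link gets graduated, standardised choices; a via= query parameter offers none of that nuance and cannot be switched off by the visitor.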

There is also a moral question here about whether it is okay to modify a given URL on behalf of the user in order to insert a referral query string into it. I think it isn’t.

Conclusion

In the end, I decided to remove the referral query string feature from Wander Console. One might wonder why I couldn’t simply leave the feature in as an opt-in. Well, the answer is that once I had deemed the feature misguided, I no longer wanted it to be part of my software in any form. The project is still new and we are still in the days of 0.x releases, so if there is a good time to remove features, this is it. But my ongoing research work left me with no time to do it. Finally, when the post “I’ve banned query strings” appeared in my feed reader last evening, it nudged me just enough to take a little time away from my academic hobby and devote it to removing that ill-considered feature. The feature is now gone. See commit b26d77c for details. The latest release, version 0.6.0, does not have it anymore.

This is a lesson I’ll remember for any new hobby projects I happen to make in the future. If I ever load URLs again, I’ll load them exactly as the website’s author intended. I will never add query strings to your URLs.

I’ve banned query strings — Chris Morgan

chrismorgan.info

🗓️ 2026-05-08 • Tagged /web, /opinions, /meta=only

I don’t like people adding tracking stuff to URLs. Still less do I like people adding tracking stuff to my URLs.

https://chrismorgan.info/no-query-strings?ref=example.com? Did I ask? If I wanted to know I’d look at the Referer header; and if it isn’t there, it’s probably for a good reason. You abuse your users by adding that to the link.

https://chrismorgan.info/no-query-strings?utm_source=example&utm_&c.? Hey! That one’s even worse, UTM parameters are for me to use, not you. Leave my URLs alone.

So I’ve decided to try a blanket ban for this site: no unauthorised query strings.

At present I don’t use any query strings. If I ever start using any query strings, I’ll allow only known parameters. (In past times I used ?t=… and ?h=… cache-busting URLs for stylesheet URLs; and I decided I’m okay breaking such requests; there shouldn’t be any legitimate ones.)

Want to see what happens if you add a query string? Go ahead, try it.

It’s my web­site: I can do what I want with it.

And you can do what you want with yours!

This is currently implemented in my Caddyfile.
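Chris doesn't quote the Caddyfile itself, but in Caddy v2 one plausible way to reject any request carrying a query string is an expression matcher. This is a hedged sketch under that assumption, not his actual configuration:

```
example.com {
	# Match any request whose query string is non-empty.
	@hasQuery expression `{http.request.uri.query} != ""`
	respond @hasQuery "No query strings, please." 400

	file_server
}
```

A site that later adopts legitimate parameters would swap the blanket match for an allowlist of known parameter names, as the post suggests.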

The Intolerable Hypocrisy of Cyberlibertarianism

matduggan.com

I like the Internet. I am old enough to remember the pre-Internet era and despite the younger generations pining for those simpler days, I was there. Paper maps were absolutely horrible, just you and a compass in your car on the side of the road in the middle of the night trying to figure out where you are and where you are going. Once when driving from Michigan to Florida I got so lost in the middle of the night in Kentucky that I had to pull over to sleep and wait for the sun so I could figure out where I was. I awoke to an old man staring unblinkingly into my car, shirtless, breathing heavy enough to fog the windows. To say I floored that 1991 Honda Civic is an understatement.

You would leave your house and then just disappear. This is presented as kind of romantic now, as if we were just free spirits on the wind and could stop and really watch a sunset. In practice it was mostly an annoying game of attempting to guess where people were. You’d call their job, they had left. You’d call their house, they weren’t home yet. Presumably they were in transit but you actually had no idea. As a child my response to people asking me where my parents were was often a shrug as I resumed attempting to eat my weight in shoplifted candy or make homemade napalm with gasoline and styrofoam. Sometimes I shudder as a parent remembering how young I was putting pennies on train tracks and hiding dangerously close so that we could get the cool squished penny afterwards.

Cassettes are the worst way to listen to music ever invented. Tapes squealed. Tapes slowed down for no reason, like they were depressed. Multiple times in my life I would set off on a long road trip, pop in a tape, and within fifteen minutes watch as it shot from the deck, unspooled like the guts from the tauntaun in Star Wars. You’d then spend forty-five minutes at a Sunoco trying to wind it back in with a Bic pen, knowing in your heart you were performing CPR on a corpse. Then you’d put it back in the player out of pure stubbornness, and it would chew itself again immediately, and you’d drive the next six hours in silence with your own thoughts, which were not as good as Pearl Jam.

So I am, mostly, grateful for the bounty the internet has provided. But there is something wrong, deeply wrong, with what we built. The wrongness was there at the start. It was baked into the foundation by people who told themselves a story about freedom, and that story was a lie, and we are all, every one of us, paying their tab.

To un­der­stand what hap­pened we need to go back to the 90s.

A Declaration of the Independence of Cyberspace

One of the first and most classic examples of the ideology that powered and continues to power tech is “A Declaration of the Independence of Cyberspace” by John Perry Barlow, written in 1996. You can find the full text here. I remember thinking it was genius when I first read it. I was young enough that I also thought “Snow Crash” was a serious political document. Today the Declaration reads like one of those sovereign citizen TikToks where someone in traffic court is claiming diplomatic immunity under maritime law.

It helps to know who Barlow was. Barlow was a Grateful Dead lyricist. He was also a Wyoming cattle rancher. He was also, briefly, the campaign manager for Dick Cheney’s first run for Congress. (You did not misread that.) He spent his later years as a fixture at Davos, the World Economic Forum, where the very wealthy gather each January to remind each other that they are interesting. It was at Davos, in February 1996, fueled by champagne and grievance over the Telecommunications Act, that Barlow banged out the Declaration on a laptop and emailed it to a few hundred friends. From there it became, somehow, one of the founding documents of the modern internet.

These increasingly hostile and colonial measures place us in the same position as those previous lovers of freedom and self-determination who had to reject the authorities of distant, uninformed powers. We must declare our virtual selves immune to your sovereignty, even as we continue to consent to your rule over our bodies. We will spread ourselves across the Planet so that no one can arrest our thoughts.

Many of the pillars of the “modern Internet” are here. Identity isn’t a fixed concept based on government ID but is a more fluid concept. We don’t need centralized control or really any form of control because those things are unnecessary. It was this and the famous earlier “Cyberspace and the American Dream: A Magna Carta for the Knowledge Age” that laid a familiar foundation for a lot of the culture we now have. [link]

The Magna Carta is also our introduction to the (now familiar) creed of “catch up or get left behind”. The adoption of new technology must be done at the absolute fastest speed possible with no regulations or checks. You don’t need to worry about the consequences of technology because these problems correct themselves. If you told me the following was written two weeks ago by OpenAI I would have believed you.

If this analysis is correct, copyright and patent protection of knowledge (or at least many forms of it) may no longer be necessary. In fact, the marketplace may already be creating vehicles to compensate creators of customized knowledge outside the cumbersome copyright/patent process.

The cumbersome copyright/patent process. Cumbersome to whom, exactly? This is always the move. The thing your industry would prefer not to deal with is reframed as an obsolete burden. Your refusal to do it is rebranded as innovation. Your inability to imagine a world where you don’t get exactly what you want becomes a manifesto.

Winner Saw It Coming

So there are dozens of these pieces and they all read the same. If you don’t regulate these technologies humanity will only benefit. Education, healthcare, industry, etc. We don’t need regulations because the transformation from the medium of paper to digital has transformed the human spirit. But one was extremely surprising to me. Langdon Winner wrote something almost prophetic back in 1997. You can read it here.

He coins the term cyberlibertarianism (or at least it is the first mention of it I could find) and then goes on to describe an almost eerily accurate set of events.

In this perspective, the dynamism of digital technology is our true destiny. There is no time to pause, reflect or ask for more influence in shaping these developments. Enormous feats of quick adaptation are required of all of us just to respond to the requirements the new technology casts upon us each day. In the writings of cyberlibertarians those able to rise to the challenge are the champions of the coming millennium. The rest are fated to languish in the dust.

Characteristic of this way of thinking is a tendency to conflate the activities of freedom seeking individuals with the operations of enormous, profit seeking business firms. In the Magna Carta for the Knowledge Age, concepts of rights, freedoms, access, and ownership justified as appropriate to individuals are marshaled to support the machinations of enormous transnational firms. We must recognize, the manifesto argues, that “Government does not own cyberspace, the people do.” One might read this as a suggestion that cyberspace is a commons in which people have shared rights and responsibilities. But that is definitely not where the writers carry their reasoning.

What “ownership by the people” means, the Magna Carta insists, is simply “private ownership.” And it eventually becomes clear that the private entities they have in mind are actually large, transnational business firms, especially those in communications. Thus, after praising market competition as the pathway to a better society, the authors announce that some forms of competition are distinctly unwelcome. In fact, the writers fear that the government will regulate in a way that requires cable companies and phone companies to compete. Needed instead, they argue, is the reduction of barriers to collaboration of already large firms, a step that will encourage the creation of a huge, commercial, interactive multimedia network as the formerly separate kinds of communication merge.

In all, he lays out four pillars of this ideology.

Technological determinism. The new technology is going to transform everything, it cannot be stopped, and your only job is to keep up. Stewart Brand’s actual quote, which Winner pulls out and lets sit there like a body on display, is “Technology is rapidly accelerating and you have to keep up.” There’s no room to ask whether we want any of this. The wave is coming. Surf or drown.

It does not occur to anyone in this discourse that “drown” is a choice the wave is making, not a natural law. Waves do not have intentions. Destroying your livelihood and leaving you to rot isn’t a requirement of the natural order, as much as that would be convenient.

Radical individualism. The point of all this technology is personal liberation. Anything that gets in the way of the individual maximizing themselves, be it government, regulation, social obligation, or your annoying neighbors, is an obstacle to be removed. Winner notes, with what I imagine was a very dry expression, that the writers of the “Magna Carta for the Knowledge Age” cited Ayn Rand approvingly. In 1994. As intellectual grounding. For a document about computers.

There is some­thing deeply funny about a move­ment claim­ing to in­vent the fu­ture and ground­ing its case in a Russian émi­gré’s air­port nov­els about steel barons in love with their own re­flec­tions.

Free-market ab­so­lutism. Specifically the Milton Friedman, Chicago School, sup­ply-side fla­vor. The mar­ket will sort it out. Regulation is theft. Wealth is virtue. George Gilder, who co-wrote the Magna Carta, had pre­vi­ously writ­ten a book called Wealth and Poverty that helped sell Reaganomics to the masses. He then wrote Microcosm, which ar­gued that mi­cro­proces­sors plus dereg­u­lated cap­i­tal­ism would lib­er­ate hu­man­ity. He was very se­ri­ous about this.

Don’t worry, Gilder is still out there. He loves the blockchain and crypto now. He now writes about how Bitcoin will save the soul of cap­i­tal­ism, which it is some­how do­ing while also de­stroy­ing the planet. Both can be true in his cos­mol­ogy. The ide­ol­ogy is flex­i­ble like that.

A fantasy of communitarian outcomes. This is the part that should make you laugh out loud. After establishing that government is bad, regulation is theft, and the individual is sovereign, the cyberlibertarians then promise that the result of all this will be… rich, decentralized, harmonious community life. Negroponte: “It can flatten organizations, globalize society, decentralize control, and help harmonize people.” Democracy will flourish. The gap between rich and poor will close. The lion will lie down with the lamb, and the lamb will have a Pentium II.

We also have the advantage of hindsight and know, without question, that all of these predicted outcomes were wrong. Not “directionally wrong” or “wrong in the details.” Wrong the way it would be wrong to predict that if you set your kitchen on fire, the result will be a renovation.

You have to hold these four ideas in your head at the same time to see the trick. The cy­ber­lib­er­tar­i­ans wanted you to be­lieve that rad­i­cal in­di­vid­u­al­ism plus dereg­u­lated cap­i­tal­ism plus in­evitable tech­nol­ogy would pro­duce com­mu­ni­tar­ian utopia. This is, on its face, in­sane. It is the eco­nomic equiv­a­lent of claim­ing that if every­one punches each other re­ally hard, even­tu­ally we’ll all be hug­ging.

But Winner’s sharpest ob­ser­va­tion, the one I keep com­ing back to, is­n’t about any of the four pil­lars in­di­vid­u­ally. It’s about the move un­der­neath them. He writes:

“Characteristic of this way of thinking is a tendency to conflate the activities of freedom seeking individuals with the operations of enormous, profit seeking business firms.”

This is the entire game. This is how “don’t tread on me” becomes “Meta should be allowed to do whatever it wants.” This is how the rights of the lone hacker working in their garage become indistinguishable from the rights of a multinational with a market cap larger than most countries’ GDP. The Magna Carta literally argues that the government should reduce barriers to collaboration between cable companies and phone companies in the name of individual freedom and social equality. Winner caught this in 1997.

That is why obstructing such collaboration — in the cause of forcing a competition between the cable and phone industries — is socially elitist. To the extent it prevents collaboration between the cable industry and the phone companies, present federal policy actually thwarts the Administration’s own goals of access and empowerment.

What makes the es­say un­com­fort­able to read now is that Winner was­n’t even pre­dict­ing the fu­ture. He was just de­scrib­ing what was al­ready hap­pen­ing and not­ing where it would ob­vi­ously lead. He saw the me­dia merg­ers and asked the ques­tion no­body in the in­dus­try wanted to an­swer: what hap­pened to the pre­dicted col­lapse of large cen­tral­ized struc­tures in the age of elec­tronic me­dia? Where, ex­actly, did the de­cen­tral­iza­tion go? He saw that the cy­ber­lib­er­tar­i­ans were go­ing to de­liver the op­po­site of every­thing they promised, and that they were go­ing to keep get­ting paid to promise it any­way.

He was writ­ing be­fore Google. Before Facebook. Before the iPhone. Before YouTube. Before Twitter, Bitcoin, Uber, AirBnB, OpenAI, and the en­tire app econ­omy. Before any of the ac­tual ex­am­ples that would even­tu­ally prove him right ex­isted. He just looked at the peo­ple do­ing the talk­ing, lis­tened to what they were say­ing, and wrote down where it ended. It is not a long es­say. He did­n’t need a long es­say. The fu­ture was right there on the page, in their own words. He just had to read it back to them.

The es­say closes with a ques­tion that has, to my knowl­edge, never been se­ri­ously an­swered by the in­dus­try it was aimed at:

“Are the practices, relationships and institutions affected by people’s involvement with networked computing ones we wish to foster? Or are they ones we must try to modify or even oppose?”

Twenty-eight years later, the in­dus­try still treats this ques­tion as some­where be­tween naive and sedi­tious. It’s the ques­tion Barlow’s de­c­la­ra­tion was specif­i­cally de­signed to make unask­able. And it re­mains, to this day, the only ques­tion that ac­tu­ally mat­ters.

Caveat emp­tor

When you look at these early for­ma­tive writ­ings, so much of what we see now be­comes clear. The cy­ber­lib­er­tar­ian deal was al­ways the same: you’re on your own. The in­dus­try would build the in­fra­struc­ture, take the prof­its, and shove every con­se­quence, every harm, every cost, every re­spon­si­bil­ity, onto some­body else.

There is no greater example to me than the moderator. Anyone who has ever moderated a forum or a subreddit knows that adding the word “cyber” to a space doesn’t suddenly turn people into better humans. People are still people. They flame each other, they post slurs, they doxx, they harass, they spam, they post CSAM, they radicalize each other, they grief, they coordinate, they lie. A space with humans in it requires governance.

These spaces produce, with frightening regularity, the exact behavior any kindergarten teacher could have predicted. Then the platforms act surprised.

But the cy­ber­lib­er­tar­ian model re­quired pre­tend­ing it was un­fore­see­able. The plat­forms could­n’t ac­knowl­edge that they needed gov­er­nance be­cause ac­knowl­edg­ing it would mean ac­knowl­edg­ing re­spon­si­bil­ity, and ac­knowl­edg­ing re­spon­si­bil­ity would mean ac­knowl­edg­ing li­a­bil­ity, and ac­knowl­edg­ing li­a­bil­ity would mean the en­tire eco­nomic model col­lapses. So in­stead the in­dus­try in­vented a beau­ti­ful fic­tion: gov­er­nance hap­pens, but it hap­pens by magic, per­formed by vol­un­teers, for free, who we will si­mul­ta­ne­ously rely on and mock.

Reddit is run by unpaid moderators. Wikipedia is run by unpaid editors. Stack Overflow was run by unpaid experts and is now a ghost town. On TikTok and Twitter it is the unknowable “algorithm” that is the cause of and solution to every problem, backed by capricious moderators who delight in stopping free speech. Unless you don’t like it, in which case it’s negligent moderation in defense of your enemies.

Open source is run by un­paid main­tain­ers hav­ing ner­vous break­downs. The plat­forms col­lect the rent. The peo­ple do­ing the ac­tual work of mak­ing the plat­forms liv­able get noth­ing, and when they ask for any­thing like recog­ni­tion, tools, ba­sic pro­tec­tion from ha­rass­ment, they’re told they’re power-trip­ping nerds who should touch grass.

This is also the crypto story, just with the masks off. What if we made worse money on pur­pose, money that by­passed every pro­tec­tion con­sumers had won over the pre­vi­ous cen­tury, money that could­n’t be re­versed when stolen, money that funded ran­somware at­tacks on hos­pi­tals and pump-and-dumps tar­get­ing peo­ple’s re­tire­ment ac­counts? The cy­ber­lib­er­tar­ian an­swer was: that’s free­dom. The losses were real. People killed them­selves. Hospitals had to turn away pa­tients. The ar­chi­tects be­came bil­lion­aires and bought yachts and now sit on the boards of AI com­pa­nies, where they are rein­vent­ing the same con with a new vo­cab­u­lary.

Now Winner got one thing wrong, and it’s worth paus­ing on, be­cause it’s the most in­ter­est­ing wrin­kle in all of this. What ac­tu­ally hap­pened was weirder and worse. The cy­ber­lib­er­tar­i­ans be­came the cor­po­ra­tions. They did­n’t sell out. They did­n’t be­tray their prin­ci­ples for the first of­fer of money. They sim­ply scaled un­til their prin­ci­ples be­came in­con­ve­nient, and then they stopped men­tion­ing them.

Once the platforms got large enough to be unstoppable, once they captured enough of the regulatory apparatus to write their own rules, the libertarian rhetoric got quietly shelved like a college poster you took down before your in-laws came over. Meta no longer pretends it stands for free speech and seemingly takes delight in putting its thumb on the scale. TikTok users have invented an entire euphemistic shadow language to evade automated censorship, like “unalive,” “le dollar bean,” and “graped,” that would have made 1996 Barlow weep into his bolo tie.

Copyright and patents matter when they’re Apple’s copyright and patents. Or Google’s. Or OpenAI’s. Go try to make a Facebook+ website and see how quickly Meta is capable of responding to content it finds objectionable.

Cyberlibertarianism was the lad­der. Once they were on the roof, they kicked it away and started charg­ing ad­mis­sion to look at the view.

So the Internet is Doomed?

Remember, I like the Internet. I said it in the beginning and it is still true. I love the Fediverse, I love the weird Discords about small tabletop RPGs I’m in. I spend hours in the MiSTer FPGA forums. There are corners that are good. But they’re mostly good because they’re not big enough to be worth breaking up.

It feels in­creas­ingly like I’m hang­ing out in the old neigh­bor­hood dive bar af­ter most of the reg­u­lars have moved away. The light­ing is the same. The bar­tender re­mem­bers your or­der. But you can hear your­self think now, and that’s mostly be­cause the room is half empty and the juke­box fi­nally died. The new clien­tele is from out of town. They are tak­ing pic­tures of the menu.

If we want to have a serious conversation about why we are in the situation we’re in, it is no longer possible to pretend that the broken ideology that put us on this trajectory is somehow compatible with the harsh realities that surround us. It is not clear to me if democracy can survive a deregulated Internet. A deregulated Internet filled with LLMs that can perfectly impersonate human beings, powered by unregulated corporations with zero ethical guidelines, seems like a somewhat obvious problem. Like an episode of Star Trek where you, the viewer, are thinking “well clearly the Zorkians can’t keep the Killbots as pets.” It doesn’t take some giant intellect to see the pretty fucking obvious problem.

If we want to save the parts of the in­ter­net worth sav­ing, we have to evolve. We have to find some sort of eth­i­cal code that says: just be­cause I can do some­thing and it makes money, that is not suf­fi­cient jus­ti­fi­ca­tion to un­leash it on the world. Or, more sim­ply: just be­cause I want to do some­thing and you can­not ac­tively stop me, that does not make do­ing it a good idea. We have waited thirty years for the cy­ber­lib­er­tar­ian fu­ture to ar­rive and pro­duce the promised har­mo­nious com­mu­nity. It’s time to face the facts. It’s never com­ing. The bus left in 1996. The bus was never real.

People did not get bet­ter be­cause they went on­line. Giving every­one ac­cess to a raw, un­fil­tered pipeline of every fact and lie ever pro­duced did not turn them into bet­ter-ed­u­cated peo­ple. It broke them. It al­lowed them to choose the re­al­ity they now in­habit, like or­der­ing off a menu. If I want to be­lieve the world is flat, TikTok will gladly serve me that con­tent all day. Meta will rec­om­mend sup­port­ive groups. There will be hash­tags. There will be Discords. There will be a guy named Trent who runs a pod­cast. I will never have to face the deeply un­com­fort­able pos­si­bil­ity that I might be wrong about any­thing, ever, un­til the day I die, sur­rounded by peo­ple who agree with me about every­thing, in­clud­ing which of the other mourn­ers are se­cretly lizards.

That is the in­ter­net we built. It was not an ac­ci­dent. It was the prod­uct of a spe­cific ide­ol­ogy, writ­ten down by spe­cific peo­ple, at a spe­cific cock­tail party in Davos, in 1996. Winner watched it hap­pen and told us where it was go­ing. We did not lis­ten. There is still time, maybe, to start.

GrapheneOS fixes Android VPN leak Google refused to patch

cyberinsider.com

GrapheneOS has re­leased a new up­date that fixes a re­cently dis­closed Android VPN by­pass vul­ner­a­bil­ity ca­pa­ble of leak­ing a user’s real IP ad­dress.

The leak happens even when Android’s Always-On VPN and “Block connections without VPN” protections are enabled.

The issue, disclosed last week by security researcher “lowlevel/Yusuf,” affected Android 16 and stemmed from a newly introduced QUIC connection teardown feature in Android’s networking stack. In its latest release, GrapheneOS says it has “disable[d] registerQuicConnectionClosePayload optimization to fix VPN leak,” effectively neutralizing the attack vector on supported Pixel devices.

GrapheneOS is a pri­vacy- and se­cu­rity-fo­cused Android-based op­er­at­ing sys­tem pri­mar­ily de­vel­oped for Google Pixel de­vices. The pro­ject is widely used by pri­vacy-con­scious con­sumers, jour­nal­ists, ac­tivists, and en­ter­prise users seek­ing stronger ap­pli­ca­tion sand­box­ing, ex­ploit mit­i­ga­tions, and re­duced re­liance on Google ser­vices.

According to Yusuf’s tech­ni­cal write-up, the vul­ner­a­ble API al­lowed or­di­nary ap­pli­ca­tions with only the au­to­mat­i­cally granted INTERNET and ACCESS_NETWORK_STATE per­mis­sions to reg­is­ter ar­bi­trary UDP pay­loads with sys­tem_server.

When the ap­p’s UDP socket was later de­stroyed, Android’s priv­i­leged sys­tem_server process would trans­mit the stored pay­load di­rectly over the de­vice’s phys­i­cal net­work in­ter­face rather than through the VPN tun­nel. Because sys­tem_server op­er­ates with el­e­vated net­work­ing priv­i­leges and is ex­empt from VPN rout­ing re­stric­tions, the packet by­passed Android’s VPN lock­down pro­tec­tions en­tirely.

The re­searcher demon­strated the flaw on a Pixel 8 run­ning Android 16 with Proton VPN en­abled along­side Android’s lock­down mode. The app re­port­edly leaked the de­vice’s ac­tual pub­lic IP ad­dress to a re­mote server de­spite VPN pro­tec­tion be­ing fully en­abled.

Google in­tro­duced a fea­ture that al­lows ap­pli­ca­tions to grace­fully ter­mi­nate QUIC ses­sions when sock­ets are un­ex­pect­edly de­stroyed. However, the im­ple­men­ta­tion ac­cepted ar­bi­trary pay­loads with­out val­i­dat­ing whether they were le­git­i­mate QUIC CONNECTION_CLOSE frames and did not ver­ify whether the orig­i­nat­ing ap­pli­ca­tion was re­stricted to VPN-only traf­fic.

The researcher reported the issue to Android’s security team, which classified it as “Won’t Fix (Infeasible)” and “NSBC” (Not Security Bulletin Class), stating that it did not meet the threshold for inclusion in Android security advisories. The researcher appealed the decision, arguing that any application could leak identifying network information using only standard permissions, but Google maintained its position, authorizing public disclosure on April 29.

GrapheneOS re­sponded by dis­abling the un­der­ly­ing op­ti­miza­tion en­tirely in re­lease 2026050400.

Beyond the VPN leak fix, the latest release also includes the full May 2026 Android security patch level, multiple hardened_malloc improvements, Linux kernel updates across Android’s 6.1, 6.6, and 6.12 branches, and a backported fix for CVE-2026-33636 in libpng. The update additionally ships newer Vanadium browser builds and expanded Dynamic Code Loading restrictions.

The re­searcher noted that stock Android users could tem­porar­ily mit­i­gate the is­sue man­u­ally through ADB by dis­abling the close_quic_­con­nec­tion DeviceConfig flag. However, that workaround re­quires de­vel­oper ac­cess and may not per­sist in­def­i­nitely if Google re­moves the fea­ture flag in fu­ture up­dates.
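Based on the article’s description, the workaround would look roughly like the following over ADB. The write-up only names the flag itself, so the `connectivity` namespace here is my assumption:

```shell
# Turn the QUIC connection-close feature flag off (namespace is an assumption)
adb shell device_config put connectivity close_quic_connection false

# Confirm the current value of the flag
adb shell device_config get connectivity close_quic_connection
```

As noted above, this needs developer access, and the setting can stop working if Google removes the feature flag in a later update.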


Apple is increasing my cortisol levels

blog.kronis.dev

Date: 2026-05-09

I’m cre­at­ing a sim­ple de­vel­oper util­ity to make man­ag­ing Claude Code pro­files (e.g. run­ning it with DeepSeek, or some OpenRouter mod­els) a lit­tle bit eas­ier.

Edit: I just did the first re­lease, which you can check out on ccode.kro­nis.dev, or go di­rectly to the Itch.io page to ei­ther down­load or buy the pre-built bi­na­ries or look at the source code. It’s a sim­ple util­ity and it’s early on (consider get­ting it for free first and only pay­ing later, if it feels use­ful), but cur­rently the code is not signed.

The util­ity is writ­ten in the Go lan­guage, and the tool­ing there makes it re­ally easy to com­pile for var­i­ous plat­forms - I get a sta­tic ex­e­cutable that I can put any­where I want. Even be­fore the re­lease, I wanted to see how easy it would be to ship it.
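Concretely, shipping for all three platforms comes down to setting two environment variables per target; the output paths and binary names below are illustrative, not the project’s actual layout:

```shell
# Cross-compile from any host; CGO_ENABLED=0 keeps the binary fully static.
CGO_ENABLED=0 GOOS=linux   GOARCH=amd64 go build -o dist/ccode-linux-amd64 .
CGO_ENABLED=0 GOOS=windows GOARCH=amd64 go build -o dist/ccode-windows-amd64.exe .
CGO_ENABLED=0 GOOS=darwin  GOARCH=arm64 go build -o dist/ccode-darwin-arm64 .
```

No target-platform compilers or SDKs are needed, which is what makes the Linux and Windows stories below so painless.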

It works just fine for dis­trib­ut­ing Linux soft­ware (same deal, af­ter chmod +x).

It works sort of fine for dis­trib­ut­ing Windows soft­ware (I get an .exe, SmartScreen might have a word or two, though you can click through it in the same pop-up).

Distributing Mac soft­ware

It does not just work for macOS, and my MacBook instead shows me this:

What you see is their quar­an­tine kick­ing in for down­loaded soft­ware, even if I share it with my­self over Nextcloud.

Technically, you can ask your users to over­ride it man­u­ally, in the ter­mi­nal:
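The override is deleting the quarantine extended attribute that Gatekeeper checks; assuming the downloaded binary is named ccode, it looks something like this:

```shell
# Remove the com.apple.quarantine attribute macOS sets on downloaded files
xattr -d com.apple.quarantine ./ccode
```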

Most de­vel­op­ers might be will­ing to do that. It is not, how­ever, good user ex­pe­ri­ence and might raise some eye­brows.

Doesn’t seem like such a big deal, right? I’ll just en­roll in their Apple Developer Program, sign the ex­e­cutable and be on my way, right?

Giving Apple money, and fail­ing

Wait, they want how much money for the ac­count?

And it’s a yearly sub­scrip­tion? My brother in Christ, I in­tend to re­lease a util­ity maybe a dozen or two dozen peo­ple are go­ing to down­load, tops, for like 7 USD on Itch.io with a pay-what-you-want model, mean­ing that most of those peo­ple will prob­a­bly choose the price of 0 USD in­stead (since I don’t in­tend to be like Apple, peo­ple have var­i­ous cir­cum­stances).

That means that even if it works out to that much, there’s going to be VAT and Itch.io will also take a cut, so out of those maybe 50 USD I’ll get about 25 USD, which funds about 3 months of the Apple Developer Program price. I guess the reason for it being priced like that lies somewhere between greed and wanting to gatekeep hobbyists out and only support Serious Users™, but it seems a bit stupid. Oh well, I already had to get the overpriced MacBook for another freelance thing, because they also won’t let me compile macOS/iOS apps on Windows or Linux, so I guess this is just them spitting on me after slapping me in the face.

What I get from that is that articles like “An app can be a home-cooked meal” are cool but don’t take the economics of wanting to release something publicly into account - unless you’re developing something that you’ll add a bunch of monetization to, you’ll be losing money. For desktop software there is Homebrew, but that also means that you couldn’t charge a few bucks for it even if you wanted to (or that you’d need to add mac-homebrew-install-instructions.txt to the Itch.io downloads page when doing the pay-what-you-want approach, which would feel awkward).

I don’t like that the eco­nom­ics are push­ing soft­ware and app de­vel­op­ment in a di­rec­tion where re­leas­ing a pack­age (that might be non-open-source or just source-avail­able, but you want to re­lease bi­na­ries) costs money, though I also ac­knowl­edge that there would be other is­sues, like in­sane amounts of spam, with not do­ing that.

Then, we get to the actual verification process - it’s understandable that they’d want to verify my ID. The problem is that on the MacBook they also expect me to use its webcam to take a picture. I will admit that my M1 MacBook Air is getting dated at this point, but regardless of what lighting I tried, I just could not get a good picture of the document. It’s not like they said “Oh hey, we’ve detected that your own iPhone is connected to the same local network as this MacBook, would you like to use it as a camera?”, so for about 10 attempts, this is what I saw:

Eventually, I moved over to trying my main webcam for that, since the built-in one just doesn’t work:

Why they can’t just let me up­load a scan of the doc­u­ment eludes me. I mean, I guess I can imag­ine a few rea­sons why, but it’d prob­a­bly be eas­ier to forge my own ID so it’s not as glossy rather than hav­ing to turn my small kitchen table into this. Pictured for max­i­mum frus­tra­tion, a don­gle that I needed:

Even that was­n’t good enough, be­cause un­der­stand­ably it does­n’t have aut­o­fo­cus for some­thing that you hold close. Not only that, but every 2nd fail­ure seemed to just give me a generic er­ror and I’d have to start the whole en­roll­ment process from the be­gin­ning again:

Luckily I re­al­ized that I can in­stall the app on my iPhone di­rectly. There, it worked on the first try. I guess it must re­ally suck if you don’t have an iPhone or a fancy we­b­cam, bet­ter spend some more money so you can give them money! The pay­ment went through okay, soon af­ter I had an ac­ti­vated de­vel­oper ac­count.

Except of course I did­n’t, look, the app tells me to await an e-mail (which I seem­ingly al­ready re­ceived?):

And the desktop app doesn’t care at all either; it doesn’t even know that I’ve tried the enrollment, and offers to let me start the whole thing over again, despite me being signed into the exact same account:

It’s prob­a­bly a case of even­tual con­sis­tency and some back­ground processes or what­ever, but it’s also quite frus­trat­ing and, in a word, stu­pid.

Apple is kind of frus­trat­ing

Apple, I think you make hard­ware with pretty good build qual­ity and the M-series chips made for pretty much the per­fect note­book for me - and I’m sure they’re great main dev ma­chines for those that can af­ford the higher spec ver­sions.

I think that’s nice and I gen­uinely en­joy hav­ing the iPhone SE 2022, at least be­fore learn­ing that you killed off the bud­get se­ries al­to­gether (your new e-se­ries are more ex­pen­sive) and re­moved the nice silent mode tog­gle on the side and re­moved TouchID. That’s be­fore we even start talk­ing about the 3.5mm jack and frankly all of that makes me ques­tion whether my next phone should­n’t just be an Android again.

I can deal with need­ing soft­ware like AutoRaise and Rectangle and DiscreteScroll along­side oth­ers to cus­tomize your OS to my lik­ing be­cause you won’t let me do that my­self like most Linux dis­tros do. I can even deal with your win­dow fo­cus need­ing an ex­tra click across mul­ti­ple mon­i­tors and AutoRaise be­ing nice but per­haps too ag­gres­sive, since the de­vel­op­ers are at least try­ing to make the ex­pe­ri­ence nicer!

I can deal with your key­board short­cuts be­ing odd and not even hav­ing a Cut” op­tion in your Finder pro­gram.

I can deal with your weird Control/Command but­ton setup which even breaks re­mote desk­top soft­ware.

I can deal with your weird programs you close aren’t ac­tu­ally closed” ap­proach even though you sold me a MacBook with 8 GB of RAM just so I could de­velop soft­ware in your walled gar­den ecosys­tem.

But to first vendor-lock me into your ecosystem for developing apps, then demand a whole bunch of money so I can sign my software and have it not get quarantined, all while I’m not too well off financially, then refuse to let me submit my documents to you because your hardware produces pictures that are not good enough, forcing me to install the app on a phone that’s also expensive and that not everyone has, and then to still make me wait and have your apps not even show that I’ve submitted my application?

You know what? Apple, fuck you and your for­saken ecosys­tem. This sucks.

A more sane world

I can use SmartID to ver­ify my ID (and age) in about 20 sec­onds when buy­ing an en­ergy drink at the lo­cal gro­cery store.

I can use eParak­sts to dig­i­tally sign doc­u­ments in about a minute, from ei­ther my PC with a card reader (using my gov­ern­ment is­sued ID card), or my phone with their app, end­ing up with a proper cryp­to­graphic sig­na­ture ei­ther at­tached to the EDOC con­tainer (ASIC-E) or a PDF file di­rectly.

I’m sure that other coun­tries also have plenty of sim­i­lar ser­vices for ID and age ver­i­fi­ca­tion, sign­ing doc­u­ments, and other dig­i­tal ser­vices. I ac­knowl­edge that that’s not all of them, and that things are all over the place in this re­gard (alongside the credit card mafia hold­ing a lot of the world’s pay­ment in­fra­struc­ture hostage), but come on, surely it’s pos­si­ble to cre­ate some­thing that works bet­ter than my ex­pe­ri­ence did.

Having a bunch of scrappy Baltic soft­ware pack­ages work­ing bet­ter than those by a multi-bil­lion dol­lar com­pany feels silly.

Update

You know what, Apple is­n’t the only com­pany where things are some­what messed up.

If you want to sign some code for Windows, you can find Certum of­fer­ing code sign­ing, but that also costs around 209 EUR per year, all while they’re sup­posed to be one of the af­ford­able op­tions out there! I’m not sin­gling them out as much as ac­knowl­edg­ing that they’re one of the cheaper op­tions and that many oth­ers out there are worse. What the fuck?

And then you look at Azure Artifact Signing, notice that its basic tier only costs 8.54 EUR per month, and for a bit feel happy that someone has at least tried to disrupt the extortionate prices - until you set up your Azure account and discover that you cannot sign code as an individual if you’re outside the US & Canada; in the EU, only organizations can sign code through them.

This feels about as bad as get­ting TLS certs was be­fore Let’s Encrypt dis­placed most of that rent seek­ing be­hav­ior - the prob­lem be­ing that there aren’t many al­ter­na­tives or com­peti­tors to them, which on one hand means that we’re mov­ing to­wards a mas­sive point of fail­ure, and on the other hand means that if they ever de­cide to de­mand money, then a large part of the Internet will be straight up fucked.

I thought that maybe I was overreacting a bit - since my previous issues were mostly about the Apple signup process being annoying and the way they’ve treated their users being worse than it could be - but no, I should be more angry: the whole code signing space is stupidly expensive for what it is. You have arguments in the opposite direction? Just know that they said the exact same kind of stuff about TLS before Let’s Encrypt, and how you had to pay 100 EUR a year for a cert that’s not even a wildcard, because of reasons™.

Just let me sign my code with my gov­ern­men­tal ID card and be done with it, jeez.
