10 interesting stories served every morning and every evening.




1 628 shares, 36 trendiness

Agent Safehouse

Go full --yolo. We’ve got you. LLMs are probabilistic: a 1% chance of disaster makes it a matter of when, not if. Safehouse makes this a 0% chance — enforced by the kernel.

Safehouse denies write access outside your project directory. The kernel blocks the syscall before any file is touched. Agents work perfectly in their sandboxes, but can’t impact anything outside them.

By default, agents inherit your full user permissions. Safehouse flips this — nothing is accessible unless explicitly granted.

Download a single shell script, make it executable, and run your agent inside it. No build step, no dependencies — just Bash and macOS.

Safehouse automatically grants read/write access to the selected workdir (git root by default) and read access to your installed toolchains. Most of your home directory — SSH keys, other repos, personal files — is denied by the kernel.

See it fail — proof the sandbox works

Try reading something sensitive inside Safehouse. The kernel blocks it before the process ever sees the data.

# Try to read your SSH private key — denied by the kernel

safehouse cat ~/.ssh/id_ed25519

# cat: /Users/you/.ssh/id_ed25519: Operation not permitted

# Try to list another repo — invisible

safehouse ls ~/other-project

# ls: /Users/you/other-project: Operation not permitted

# But your current project works fine

safehouse ls .

# README.md src/ package.json …

Add these to your shell config and every agent runs inside Safehouse automatically — you don’t have to remember. To run without the sandbox, use `command claude` to bypass the function.

# ~/.zshrc or ~/.bashrc

safe() { safehouse --add-dirs-ro=~/mywork "$@"; }

# Sandboxed — the default. Just type the command name.

claude() { safe claude --dangerously-skip-permissions "$@"; }

codex() { safe codex --dangerously-bypass-approvals-and-sandbox "$@"; }

amp() { safe amp --dangerously-allow-all "$@"; }

gemini() { NO_BROWSER=true safe gemini --yolo "$@"; }

# Unsandboxed — bypass the function with `command`

# command claude — plain interactive session

Generate your own profile with an LLM

Use a ready-made prompt that tells Claude, Codex, Gemini, or another model to inspect the real Safehouse profile templates, ask about your home directory and toolchain, and generate a least-privilege `sandbox-exec` profile for your setup.

The guide also tells the LLM to ask about global dotfiles, suggest a durable profile path like ~/.config/sandbox-exec.profile, offer a wrapper that grants the current working directory, and add shell shortcuts for your preferred agents.

Open the copy-paste prompt
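For a sense of what such a profile looks like, here is a minimal deny-by-default `sandbox-exec` profile sketch. This is illustrative only — the paths are assumptions, and the real Safehouse templates differ:

```scheme
;; minimal-sketch.sb — deny everything, then allow the essentials
(version 1)
(deny default)

;; allow running and forking processes inside the sandbox
(allow process-fork)
(allow process-exec*)

;; read-only access to system locations and toolchains
(allow file-read*
  (subpath "/usr")
  (subpath "/System")
  (subpath "/opt/homebrew"))

;; read/write only inside the project directory (hypothetical path)
(allow file-read* file-write*
  (subpath "/Users/you/myproject")
  (subpath "/private/tmp"))

;; agents usually need the network for API calls
(allow network*)
```

An agent could then be launched under it with something like `sandbox-exec -f minimal-sketch.sb claude`.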

...

Read the original on agent-safehouse.dev »

2 485 shares, 30 trendiness

- YouTube

...

Read the original on www.youtube.com »

3 451 shares, 21 trendiness

FrameBook.

A reimagined classic that’s only a little bit janky.

the first-gen macbook from 06 is one of my favorite laptop designs ever, mostly because for the longest time it was one of the only macbooks u could get in black besides the powerbook g3.

plus it was the first macbook i ever personally owned, although this was around 2015, so even by then the performance was pretty crummy.

what inspired me to do this project was reading articles and watching videos about people retrofitting old macs and old pcs with new guts (usually m1 minis). i got really motivated when i read that someone had already done something like this, and after watching f4mi’s video on converting an imac g5 into a fully-kitted monitor.

and so, after doing lots of research into motherboards and display panels, and gathering everything i could think i would need for this project, in the wise words of NileRed: i decided to just go for it.

to begin, i ordered a few black polycarb macbooks (model a1181) from ebay. they were all pretty beat up and didn’t have their batteries, nor did they power on.

i then found and ordered some oem parts of the outer chassis that i guess had never been made into an actual unit.

i then followed an ifixit tutorial to completely take apart the macbooks, till i was down to just the bare chassis. my main idea was that the used macs were gonna be my test runs before i did anything with the oem parts, since those were the cleanest looking parts.

i discarded pretty much all of the parts of the macs, since they didn’t work and, even on their own, wouldn’t be worth a lot if i did sell them.
i did keep a few metal brackets that screwed into the bottom of the bottom chassis (after dremeling away the “middle” section for the old removable ram sticks), and another that’s screwed in at the top that actually holds the hinges for the top chassis.

here’s the guts i’m putting in the macbook:

and here’s some peripherals and other things i put in the mac

and away we go…

my first concern was whether i could use the top case, aka the Apple Internal Keyboard/Trackpad. fortunately i found an article that shows how to tap into the case’s circuitry and solder on a usb cable to use it as a keyboard and trackpad pretty much anywhere.

so, this was actually the very first time i soldered anything ever lol. i had watched plenty of soldering videos and stuff so i felt pretty confident, and when i finished soldering on a usb-c and plugged it into my main computer, it actually worked!

as a side note, the solder pads are quite small and fragile. i learned that the hard way by accidentally yanking the cable and tearing the solder pads off the case’s pcb. :|

so i got a new top case and soldered a new cable again.

to start putting the macbook together, i removed the original brass insert standoffs from the bottom case and replaced them with my own 3d printed standoffs.

for my standoffs i just used gorilla glue to hold them in place. not super ideal, but idk much about welding plastics together and stuff like that, so; super glue ftw lol.

i reused the original screws used throughout the mac since they were all the same thickness - M2 size - and i started to slowly piece together where i was going to put everything.

come on in !

here’s an early picture i took where i figured out where to put most things. the mainboard i put slightly off-centered in the middle, mostly cause i wanted to center the fan’s exhaust out the back the best i could.
there is a beam in the middle of the exhaust for a screw to go in at the bottom of the mac, but i figured it’d be ok.

i mounted the speakers in the most obvious spots. they sound, fine. maybe a little better than the original macbook’s speakers.

i also separately got an original dead macbook battery that matched the laptop, and very, very carefully removed the plastic side that held in the battery cells and just super glued it to the backside of the chassis to fill in the giant hole. i had no real desire or way of using the removable batteries with the mainboard, since the internal connectors for framework’s battery and apple’s battery are completely different.

i also 3d printed a custom made “button” to fill in the hole left by the missing locking mechanism for the og battery.

one of the trickiest parts of the whole project was figuring out what to do with the I/O, namely the left side of the chassis, since that’s where all of the original ports were housed directly on the original logic board of the macbook.

taking inspiration from the f4mi video from before, i ordered some usb hubs, stripped them out of their enclosures, and 3d printed some custom standoffs that hold them in place while allowing me to easily remove them with screws.

i also worked quite hard on modeling an “I/O shield” for the left side since i didn’t want to try to work around the old holes for the original ports. so i dremeled that side out, took a scan of the aftermath, and meticulously remade the side of the chassis to perfectly fit the hubs’ new ports.

i then just super glued the shield onto the chassis. again, not super ideal, but it does hold up quite well!

the right side was thankfully much easier than the left, since the old dvd drive’s slot was exactly the height of a usb c port.
so i ordered a usb c hub, designed a shield to fill in the gaps between the ports, and mounted it with some standoffs.

better yet, this side has a mounting piece that not only holds down the ports and prevents them from lifting up, it also clips the top case down.

to connect the hubs to the mainboard, i used some small flat usb c cables and stripped them of their rubber coating to expose their fpc cables, did some folds to clean up the slack, and connected them.

to connect the top case and the webcam to the mainboard, i ended up getting this small usb module on the right that i soldered the connections to, which then feeds into this input shim that breaks out the connector into solderable points.

the power button for the top case is a similar story: i carefully removed the original membrane button and replaced it with a small button that breaks out into two header pins. those pins then plug into two wires that run to the same input shim to turn the mainboard on.

now this wouldn’t be a macbook without the classic glowing apple logo on the back. and at first, i really didn’t know how to replicate the glowing logo.

the original display panel of the macbook was designed to allow the backlight of the panel itself to double as the backlight of the display, and as the light to shine through the logo, thus allowing the logo to glow. u can actually see the logo through the display if u have a black image over where the logo is.

my best idea was to find a panel thin enough to fit an led of some kind behind it that turns on when the system is on.

after searching on alibaba and talking with a seller, i ordered a custom made 7x7x0.28 cm led that i mounted (aka super glued) to the back of the top chassis, then ran the wires to the usb module from before, soldered them, and it worked!

speaking of which, to get the new display panel mounted, i very carefully centered the display in the macbook’s bezel, taped it down with some masking tape, carefully turned it over, and taped it all the way around with some strong aluminum tape, which turned out pretty good!

the webcam was kinda tricky to mount at the top of the macbook, since the original webcam module was much much smaller than the one i found and ended up using. i ended up just carefully dremeling away most of the plastic at the top of the chassis to make room for the module, and carefully dremeled a bigger hole to fit the lenses through the top chassis’ metal bracket.

after finally getting everything to stay in place without snapping off, or possibly shorting everything, this is what the final look inside my framebook looks like.

i added some padding around the battery to discourage hot air around it, as well as 3d printing that big rectangle to fill in some space.

the wifi card managed to snugly fit under the right usb hub, with the antennas running up to the right side of the top chassis.

to connect the top case, i added a male usb c port to the usb module that connects to the top case’s soldered-on female usb c port.

the reason i did it like this is so that if i disconnect the top case from the framebook, i can use any regular usb c m2m cable to use the top case anywhere i want.

overall this was a really fun and interesting project. from start to finish it took me around 3 months.
i learned a lot from this project, from how to solder to how to 3d model. this was a really nice way to pick up these skills.

there are some things i would’ve liked to have done better or differently, namely making some custom pcbs in place of my usb hubs so i could have any i/o i want, and finding a better way to mount stuff instead of super glue lol.

thanks for reading all the way through about my amateur attempt at “retrofitting” my macbook! sorry if i glossed over or skipped some stuff. i didn’t really properly document things or even take photos along the way, so most of this article is just me recollecting what i did in semi-chronological order.

if you do have questions about my process, shoot me an email or dm me on bluesky.

i do have some very special thanks for some people that made this whole thing possible:

N3rding for sending me the input shim for the top case and power button

My friend Phillip for teaching me how to use blender to make my lil standoffs and the i/o shield.

and YOU, for reading this lil blog, article, thing, whatever !!! :P

...

Read the original on fb.edoo.gg »

4 381 shares, 13 trendiness

Based on its own charter, OpenAI should surrender the race


“We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be a better-than-even chance of success in the next two years.”

Interestingly, this is still hosted at https://openai.com/charter/, meaning it remains the official company policy.

At the same time, explicitly stated AGI timelines by Sam Altman are the following:

“Within the next ten years, AI systems will exceed expert skill level in most domains”

“By the time the end of this decade rolls around, the world will be in an unbelievably better place”

“I think in 5 years […] people are like, man, the AGI moment came and went”

“What are you excited about in 2025?” - “AGI”

“AGI will probably get developed during Trump’s term”

“By 2030, if we don’t have extraordinarily capable models that do things we can’t, I’d be very surprised”

“AGI kinda went whooshing by… okay fine, we built AGIs”

“We basically have built AGI” (later: “a spiritual statement, not a literal one”)

We can see that the timeline of AGI (let’s assume this is the timeline for a better-than-even chance) has accelerated, and the median prediction since 2025 is around 2 years. Notably, in the latest interviews it’s claimed that AGI has been achieved, and we’re now racing towards ASI.

Finally, here’s a snapshot of the current overall Arena ranking of the top 10 models.

Based on these, the flagship GPT-5.4 model is clearly trailing behind the competition. At least Anthropic’s and Google’s models are clearly safety-conscious, and probably value-aligned (whatever that means — but since the models are drop-in replacements for GPT, it should hold).

It can be debated whether arena.ai is a suitable metric for AGI; a strong case can probably be made that it’s not. However, that’s irrelevant, as the spirit of the self-sacrifice clause is to avoid an arms race, and we are clearly in one.

Therefore, one can only conclude that we currently meet the stated example triggering condition of “a better-than-even chance of success in the next two years”. As per its charter, OpenAI should stop competing with the likes of Anthropic and Gemini and join forces, whatever that might look like.

While this will never happen, I think it’s illustrative of some great points to ponder:

The impotence of naive idealism in the face of economic incentives.

The discrepancy between marketing claims and practical actions.

The changing goalposts of AGI and its timelines. Notably, it’s common to now talk about ASI instead, implying we may have already achieved AGI, almost without noticing.

...

Read the original on mlumiste.com »

5 355 shares, 17 trendiness

LibreOffice 26.2 is here: a faster, more polished office suite that you control

We’re pleased to announce the release of LibreOffice 26.2, the newest version of the free and open source office suite trusted by millions of users around the world. This release makes it easier than ever for users to create, edit and share documents on their own terms. Designed for individuals and organizations alike, it continues to be a trusted alternative to proprietary office software.

LibreOffice 26.2 focuses on improvements that make a difference in daily work: better performance, smoother interaction with complex documents, and improved compatibility with files created in other office software. Whether you’re writing reports, managing spreadsheets, or preparing presentations, the experience feels more responsive and reliable.

LibreOffice has always been about giving users control. LibreOffice 26.2 continues that tradition by strengthening support for open document standards and ensuring long-term access to your files, without subscriptions, license restrictions, or data collection. Your documents stay yours — forever.

Behind this release is a global community of contributors. Developers, designers, translators, QA testers, and volunteers from around the world worked together to deliver hundreds of fixes and refinements. Their efforts result in a suite that not only adds features, but also improves quality, consistency, and stability, release after release.

* Improved performance and responsiveness across the suite, making large documents open, edit, and save more smoothly.

* Enhanced compatibility with documents created in proprietary and open core office software, reducing formatting issues and surprises.

* Hundreds of bug fixes and stability improvements contributed by the global LibreOffice community.

See the Release Notes for the full list of new features.

Florian Effenberger, Executive Director of The Document Foundation, says:

“LibreOffice 26.2 shows what happens when software is built around users, not business models, and how open source software can deliver a modern, polished productivity suite without compromising user freedom. This release is about speed, reliability, and giving people control over their documents.”

LibreOffice 26.2 is available for Windows, macOS, and Linux, and supports over 120 languages out of the box. It can be used at home, in businesses, schools, and public institutions, with no licensing fees and no vendor lock-in.

You can download LibreOffice 26.2 today from the official LibreOffice website. We invite users to try the new release, share feedback, and join the community helping shape the future of LibreOffice. Happy users can donate to support the independence and future development of the project.

About LibreOffice and The Document Foundation

LibreOffice is a free, private and open source office suite used by millions of people, businesses, and public institutions worldwide. It is developed by an international community and supported by The Document Foundation, an independent non-profit organization that promotes open standards, digital sovereignty and user choice.

...

Read the original on blog.documentfoundation.org »

6 293 shares, 12 trendiness

UPDATED Request to the European Commission to adhere to its own guidances

The European Commission has accepted our request, and starting from today — Friday March 6 — has added the Open Document Format ODS version of the spreadsheet to be used to provide the feedback. We are grateful to the people working at DG CONNECT, the Commission’s Directorate-General for Communications Networks, Content and Technology, for responding to our request within 24 hours. At this point, the rest of this message is no longer relevant, and the call for action is no longer necessary.

The European Commission has spent years advocating for open standards, vendor neutrality, and digital sovereignty. The European Interoperability Framework explicitly recommends open formats for public sector digital services. The EU’s own Open Source Software Strategy calls for reducing dependency on proprietary technologies, and the Cyber Resilience Act itself is designed to address systemic risks from unaccountable technology dependencies.

On March 3rd, 2026, the European Commission published a request for feedback on the guidance to be provided in relation to the CRA. The feedback must be submitted through the linked spreadsheet in .xlsx format, a proprietary format that makes interoperability extremely difficult due to its ever-changing and undocumented features.

This is not a minor procedural oversight. It is a structural bias built into the process, and it sends a clear message: full participation in EU policymaking requires a Microsoft licence.

We ask the European Commission to lead by example by following its own guidance on interoperability, and to at least provide — alongside the proprietary format generated by the proprietary software and services it uses — an Open Document Format (ODF) file, an actually interoperable and internationally recognised standard.

While the Commission evaluates plans to upgrade its infrastructure and services to Open Source solutions, with the aim of improving resiliency and reducing risky dependencies, it should make releasing documents in ODF format part of its standard procedures, allowing all citizens, organisations and institutions to participate in democratic processes.

We are writing to provide feedback on a procedural matter that, while perhaps appearing minor at first glance, carries significant implications for the principles underpinning EU digital policy — in particular the commitments to open standards, interoperability, and vendor neutrality that the Commission itself has championed in multiple legislative and strategic contexts.

The stakeholder feedback template for the Cyber Resilience Act Guidance document has been made available exclusively in Microsoft Excel format (.xlsx). This choice is, respectfully, difficult to reconcile with the Commission’s own stated commitments.

The .xlsx format is a proprietary format defined and controlled by Microsoft Corporation, a private entity incorporated in the United States. In fact, although OOXML (ISO/IEC 29500) has been approved as a standard, its implementation has never complied with the specifications of the standard itself, as widely documented in the literature on interoperability. Requiring participants to use this format as the sole vehicle for structured data entry effectively conditions participation in a public consultation on the availability or willingness to use software produced by a single supplier.

This stands in direct contradiction to several principles the EU has advanced:

• The European Interoperability Framework (EIF), which recommends the use of open standards in public sector digital services and the avoidance of lock-in to proprietary technologies.

• The Open Source Software Strategy 2020–2023 and its successor, which promote the use of open source and open standards across EU institutions.

• The spirit, and arguably the letter, of the very Cyber Resilience Act itself, which seeks to reduce systemic risk arising from dependency on unaccountable or opaque technology components.

A consultation process that requires respondents to use a proprietary format produces a structural bias: it disadvantages individuals, organisations, and public administrations that have made the entirely legitimate and EU-endorsed choice to operate on open source software and open formats. A citizen or small organisation using LibreOffice, for instance, may encounter compatibility issues when working with the provided .xlsx template. A government body that has migrated to ODF-based workflows faces an unnecessary obstacle.

The remedy is straightforward. Feedback templates of this kind should be provided in at minimum two formats: one open format (ODF spreadsheet, .ods, being the obvious choice, as it is a true ISO-standardised format with no proprietary ownership) and one widely-used proprietary format for those whose environments require it. Ideally, a plain-text or web-based form would supplement both, removing the spreadsheet dependency entirely for respondents who prefer it.

The Commission’s credibility on digital sovereignty, open standards, and vendor-independent infrastructure is undermined — symbolically but meaningfully — each time its own processes rely exclusively on proprietary formats from non-European technology vendors. The CRA is precisely the kind of legislation where procedural consistency with stated principles matters most.

We respectfully urge the Commission to review its template distribution practices and to adopt a format-neutral approach to stakeholder consultation as standard policy going forward.

...

Read the original on blog.documentfoundation.org »

7 261 shares, 15 trendiness

BigBodyCobain/Shadowbroker: Open-source intelligence for the global theater. Track everything from the corporate/private jets of the wealthy, and spy satellites, to seismic events in one unified interface. The knowledge is available to all but rarely aggregated in the open, until now.

ShadowBroker is a real-time, full-spectrum geospatial intelligence dashboard that aggregates live data from dozens of open-source intelligence (OSINT) feeds and renders them on a unified dark-ops map interface. It tracks aircraft, ships, satellites, earthquakes, conflict zones, CCTV networks, GPS jamming, and breaking geopolitical events — all updating in real time.

Built with Next.js, MapLibre GL, FastAPI, and Python, it’s designed for analysts, researchers, and enthusiasts who want a single-pane-of-glass view of global activity.

git clone https://github.com/BigBodyCobain/Shadowbroker.git

cd Shadowbroker

docker-compose up -d

* Carrier Strike Group Tracker — All 11 active US Navy aircraft carriers with OSINT-estimated positions

* Clustered Display — Ships cluster at low zoom with count labels, decluster on zoom-in

* Region Dossier — Right-click anywhere on the map for:

The repo includes a docker-compose.yml that builds both images locally.

git clone https://github.com/BigBodyCobain/Shadowbroker.git

cd Shadowbroker

# Add your API keys (optional — see Environment Variables below)

cp backend/.env.example backend/.env

# Build and start

docker-compose up -d --build

Custom ports or LAN access? The frontend auto-detects the backend at

. If you remap the backend to a different port (e.g. "9096:8000"), set NEXT_PUBLIC_API_URL before building:

NEXT_PUBLIC_API_URL=http://192.168.1.50:9096 docker-compose up -d --build

This is a build-time variable (a Next.js limitation) — it gets baked into the frontend during npm run build. Changing it requires a rebuild.
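One way to set such a build-time variable is a compose build arg. This is a sketch — the service name and build context here are assumptions, not necessarily the repo's actual layout:

```yaml
# docker-compose.override.yml (sketch)
services:
  frontend:
    build:
      context: ./frontend
      args:
        # Baked into the JS bundle during `npm run build`
        NEXT_PUBLIC_API_URL: "http://192.168.1.50:9096"
```

For this to take effect, the frontend Dockerfile would also need a matching `ARG NEXT_PUBLIC_API_URL` declared before the build step.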

If you just want to run the dashboard without dealing with terminal commands:

Go to the Releases tab on the right side of this GitHub page.

Download the latest .zip file from the release.

Extract the folder to your computer.

It will automatically install everything and launch the dashboard!

If you want to modify the code or run from source:

# Clone the repository

git clone https://github.com/your-username/shadowbroker.git

cd shadowbroker/live-risk-dashboard

# Backend setup

cd backend

python -m venv venv

venv\Scripts\activate # Windows

# source venv/bin/activate # macOS/Linux

pip install -r requirements.txt

# Create .env with your API keys

echo "AIS_API_KEY=your_aisstream_key" >> .env

echo "OPENSKY_CLIENT_ID=your_opensky_client_id" >> .env

echo "OPENSKY_CLIENT_SECRET=your_opensky_secret" >> .env

# Frontend setup

cd ../frontend

npm install

# From the frontend directory — starts both frontend & backend concurrently

npm run dev

All layers are independently toggleable from the left panel:

The platform is optimized for handling massive real-time datasets:

* Viewport Culling — Only features within the visible map bounds (+20% buffer) are rendered

* Clustered Rendering — Ships, CCTV, and earthquakes use MapLibre clustering to reduce feature count
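The viewport-culling idea can be sketched as a pure filter over feature coordinates. This is a simplified illustration; the type and function names are mine, not the repo's:

```typescript
type Feature = { id: string; lon: number; lat: number };
type Bounds = { west: number; south: number; east: number; north: number };

// Expand the visible bounds by a fractional buffer (0.2 = +20%)
function expandBounds(b: Bounds, buffer: number): Bounds {
  const dLon = (b.east - b.west) * buffer;
  const dLat = (b.north - b.south) * buffer;
  return {
    west: b.west - dLon,
    south: b.south - dLat,
    east: b.east + dLon,
    north: b.north + dLat,
  };
}

// Keep only features that fall inside the buffered viewport
function cullFeatures(features: Feature[], view: Bounds, buffer = 0.2): Feature[] {
  const b = expandBounds(view, buffer);
  return features.filter(
    (f) => f.lon >= b.west && f.lon <= b.east && f.lat >= b.south && f.lat <= b.north
  );
}
```

Rendering only the culled subset keeps the feature count proportional to what is actually on screen, rather than to the full global dataset.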

# Required

AIS_API_KEY=your_aisstream_key # Maritime vessel tracking (aisstream.io)

# Optional (enhances data quality)

OPENSKY_CLIENT_ID=your_opensky_client_id # OAuth2 — higher rate limits for flight data

OPENSKY_CLIENT_SECRET=your_opensky_secret # OAuth2 — paired with Client ID above

LTA_ACCOUNT_KEY=your_lta_key # Singapore CCTV cameras

This is an educational and research tool built entirely on publicly available, open-source intelligence (OSINT) data. No classified, restricted, or non-public data sources are used. Carrier positions are estimates based on public reporting. The military-themed UI is purely aesthetic.

Do not use this tool for any operational, military, or intelligence purpose.

This project is for educational and personal research purposes. See individual API provider terms of service for data usage restrictions.

Built with ☕ and too many API calls

...

Read the original on github.com »

8 258 shares, 13 trendiness

My Homelab Setup

How I repurposed my old gaming PC to set up a home server for data storage, backups, and self-hosted apps.


For the longest time, I’ve procrastinated on finding a good backup and storage solution for my Fujifilm RAW files. My solution up until recently involved manually copying my photos across two external SSD drives. This was quite a hassle, and I hadn’t yet figured out a good off-site backup strategy.

After hearing constant news updates about hard drive prices surging due to AI data center buildouts, I finally decided to purchase some hard drives and set up a homelab to meet my storage and backup needs. I also used this opportunity to explore self-hosting some apps I’ve been eager to check out.

I repurposed the old gaming PC I built back in 2018 for this use case. This machine has the following specs:

I purchased the Western Digital hard drives over the winter holiday break. The other components were already installed on the machine when I originally built it.

On this machine I installed TrueNAS Community Edition on my NVMe drive. It’s a Linux-based operating system that is well-tailored for network-attached storage (NAS): file storage that is accessible to any device on your network.

For instance, TrueNAS allows you to create snapshots of your data, which is great for preventing data loss. If, for example, you accidentally deleted a file, you could recover it from a previous snapshot containing that file. In other words, a file is truly deleted only when no snapshot contains it.

I’ve set up my machine to take hourly, daily, and weekly snapshots. I’ve also configured it to delete old snapshots after a retention period to save storage space.
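TrueNAS handles snapshot retention itself; purely as an illustration of how such a policy works, here is a minimal Python sketch (the policy numbers and function name are made up, not TrueNAS code):

```python
from datetime import datetime, timedelta

def snapshots_to_keep(snapshots, keep_hourly=24, keep_daily=7, keep_weekly=4):
    """Illustrative retention policy: keep the newest snapshot in each of the
    last N hours, days, and ISO weeks; everything else can be deleted."""
    keep = set()
    for granularity, limit in (("hour", keep_hourly),
                               ("day", keep_daily),
                               ("week", keep_weekly)):
        buckets = {}
        for ts in sorted(snapshots, reverse=True):  # newest first
            if granularity == "hour":
                key = (ts.year, ts.month, ts.day, ts.hour)
            elif granularity == "day":
                key = (ts.year, ts.month, ts.day)
            else:
                key = ts.isocalendar()[:2]  # (ISO year, ISO week)
            if key not in buckets and len(buckets) < limit:
                buckets[key] = ts  # the newest snapshot claims the bucket
        keep.update(buckets.values())
    return sorted(keep)

# Hourly snapshots over three days; only 26 survive the policy above.
snaps = [datetime(2024, 1, 1) + timedelta(hours=h) for h in range(72)]
kept = snapshots_to_keep(snaps)
```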

Most of my data is mir­rored across the two 8 TB hard disks in a RAID 1 setup. This means that if one drive fails, the other drive will still have all of my data in­tact. The SSD is used to store data from ser­vices that I self-host that ben­e­fit from hav­ing fast read and write speeds.

Not only is TrueNAS good for file stor­age, you can also host apps on it!

TrueNAS of­fers a cat­a­log of apps, sup­ported by the com­mu­nity, that you can in­stall on your ma­chine.

Scrutiny is a web dashboard for monitoring the health of your storage drives. Hard drives and SSDs have built-in firmware called S.M.A.R.T. (Self-Monitoring, Analysis, and Reporting Technology) that continuously tracks health metrics like temperature, power-on hours, and read errors.

Scrutiny reads this data and pre­sents it in a dash­board show­ing his­tor­i­cal trends, mak­ing it easy to spot warn­ing signs that a drive may fail soon.

Backrest is a web fron­tend for restic, a com­mand-line tool used for cre­at­ing file back­ups. I’ve set this up to save daily back­ups of my data to an ob­ject stor­age bucket on Backblaze B2.

Immich is one of the most pop­u­lar open-source self-hosted apps for man­ag­ing pho­tos and videos. I love that it also of­fers iOS and Android apps that al­low you to back up pho­tos and videos from your mo­bile de­vices. This is great if you want to rely less on ser­vices like Google Photos or iCloud. I’m cur­rently us­ing this to back up pho­tos and videos from my phone.

Mealie is a recipe man­age­ment tool that has made my meal prep­ping ex­pe­ri­ence so much bet­ter! I’ve found it great for sav­ing recipes I find on sites like NYT Cooking.

When im­port­ing recipes, you can pro­vide the URL of the recipe and Mealie will scrape the in­gre­di­ents and in­struc­tions from the page and save it in your recipe li­brary. This makes it eas­ier to keep track of recipes you find on­line and want to try out later.

Ollama is a backend for running various AI models. I installed it to try running large language models like qwen3.5:4b and gemma3:4b out of curiosity. I’ve also recently been exploring the world of vector embeddings with models such as qwen3-embedding:4b. All of these models are small enough to fit in the 8 GB of VRAM my GPU provides. I like being able to offload the work of running models to my homelab instead of my laptop.

When I’m not at home, I use Tailscale, a plug-and-play VPN ser­vice, to ac­cess my data and self-hosted apps re­motely from any de­vice. Tailscale builds on top of an­other tool called WireGuard to pro­vide a se­cure tun­nel into my home net­work.

The ad­van­tage here is that my home­lab PC does­n’t need to be ex­posed to the pub­lic in­ter­net for this to work. Any de­vice I want to use to ac­cess my home­lab re­motely needs to in­stall the Tailscale app and be au­then­ti­cated to my Tailscale net­work.

Right now, ac­cess­ing my apps re­quires typ­ing in the IP ad­dress of my ma­chine (or Tailscale ad­dress) to­gether with the ap­p’s port num­ber. Because all of my ser­vices share the same IP ad­dress, my pass­word man­ager has trou­ble dis­tin­guish­ing which lo­gin to use for each one.

In the fu­ture I’ll look into fig­ur­ing out how to as­sign cus­tom do­main names to all of my ser­vices.

...

Read the original on bryananthonio.com »

9 255 shares, 15 trendiness

We Should Revisit Literate Programming in the Agent Era

Literate pro­gram­ming is the idea that code should be in­ter­min­gled with prose such that an un­in­formed reader could read a code base as a nar­ra­tive, and come away with an un­der­stand­ing of how it works and what it does.

Although I have long been in­trigued by this idea, and have found uses for it in a cou­ple of dif­fer­ent cases, I have found that in prac­tice lit­er­ate pro­gram­ming turns into a chore of main­tain­ing two par­al­lel nar­ra­tives: the code it­self, and the prose. This has ob­vi­ously lim­ited its adop­tion.

In practice, literate programming is most commonly found today in Jupyter notebooks in the data science community, where explanations live alongside calculations and their results in a web browser.

Frequent read­ers of this blog will be aware that Emacs Org Mode sup­ports poly­glot lit­er­ate pro­gram­ming through its org-ba­bel pack­age, al­low­ing ex­e­cu­tion of ar­bi­trary lan­guages with re­sults cap­tured back into the doc­u­ment, but this has re­mained a niche pat­tern for nerds like me.

Even for some­one as en­thu­si­as­tic about this pat­tern as I am, it be­comes cum­ber­some to use Org as the source of truth for larger soft­ware pro­jects, as the source code es­sen­tially be­comes a com­piled out­put, and af­ter every edit in the Org file, the code must be re-ex­tracted and placed into its des­ti­na­tion (“tangled”, in Org Mode par­lance). Obviously this can be au­to­mated, but it’s easy to get into an­noy­ing sit­u­a­tions where you or your agent has edited the real source and it gets over­writ­ten on the next tan­gle.

That said, I have had enough suc­cess with us­ing lit­er­ate pro­gram­ming for book­keep­ing per­sonal con­fig­u­ra­tion that I have not been able to fully give up on the idea, even be­fore the ad­vent of LLMs.

For example: before coding agents, I had been adapting a pattern of using Org Mode for manual testing and note-taking. Instead of working on the command line, I would write commands into my editor, edit them in place until each step was correct, and run them there, so that when I was done I had a document recording exactly the steps that were taken, with no extra note-taking. Combining the act of creating the note with running the test gives you the notes for free when the test is completed.

This is even more exciting now that we have coding agents. Claude and Kimi and friends all have a great grasp of Org Mode syntax; it’s a forgiving markup language, and they are quite good at those. All the documentation is available online and was probably in the training data. A big downside of Org Mode is just how much syntax there is, but that’s no problem at all for a language model.

Now when I want to test a fea­ture, I ask the clanker to write me a run­book in Org. Then I can re­view it — the prose ex­plains the mod­el’s re­flec­tion of the in­tent for each step, and the code blocks are in­ter­ac­tively ex­e­cutable once I am done re­view­ing, ei­ther one at a time or the whole file like a script. The re­sults will be stored in the doc­u­ment, un­der the code, like a Jupyter note­book.

I can edit the prose and ask the model to up­date the code, or edit the code and have the model re­flect the mean­ing upon the text. Or ask the agent to change both si­mul­ta­ne­ously. The prob­lem of main­tain­ing the par­al­lel sys­tems dis­ap­pears.

The agent is told to han­dle tan­gling, and the prob­lem of ex­trac­tion goes away. The agent can be in­structed with an AGENTS.md file to treat the Org Mode file as the source of truth, to al­ways ex­plain in prose what is go­ing on, and to tan­gle be­fore ex­e­cu­tion. The agent is very good at all of these things, and it never gets tired of re-ex­plain­ing some­thing in prose af­ter a tweak to the code.
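As a sketch, an AGENTS.md instruction of that kind might read something like this (the wording below is hypothetical, not quoted from any real setup):

```markdown
<!-- AGENTS.md (hypothetical sketch) -->
- The Org file is the source of truth; never edit the tangled sources directly.
- Always explain in prose what each code block does, and update that prose
  whenever the block changes.
- Tangle the Org file before executing any code, so the extracted sources
  are current.
```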

The fun­da­men­tal ex­tra la­bor of lit­er­ate pro­gram­ming, which I be­lieve is why it is not widely prac­ticed, is elim­i­nated by the agent and it uti­lizes ca­pa­bil­i­ties the large lan­guage model is best at: trans­la­tion and sum­ma­riza­tion.

As a ben­e­fit, the code base can now be ex­ported into many for­mats for com­fort­able read­ing. This is es­pe­cially im­por­tant if the pri­mary role of en­gi­neers is shift­ing from writ­ing to read­ing.

I don’t have data to sup­port this, but I also sus­pect that lit­er­ate pro­gram­ming will im­prove the qual­ity of gen­er­ated code, be­cause the prose ex­plain­ing the in­tent of each code block will ap­pear in con­text along­side the code it­self.

I have not per­son­ally had the op­por­tu­nity to try this pat­tern yet on a larger, more se­ri­ous code­base. So far, I have only been us­ing this work­flow for test­ing and for doc­u­ment­ing man­ual processes, but I am thrilled by its ap­pli­ca­tion there.

I also recognize that the Org format is a limiting factor, due to its tight integration with Emacs. However, I have long believed that Org should escape Emacs. I would promote something like Markdown instead, except that Markdown lacks the ability to include metadata. But as usual in my posts about Emacs, it’s not the specific implementation that excites me, neither Emacs’s in general nor, in this case, Org’s implementation of literate programming.

It is the idea it­self that is ex­cit­ing to me, not the tool.

With agents, does it be­come prac­ti­cal to have large code­bases that can be read like a nar­ra­tive, whose prose is kept in sync with changes to the code by tire­less ma­chines?

...

Read the original on silly.business »

10 222 shares, 10 trendiness

Why can’t you tune your guitar?

Short an­swer: be­cause math. Longer an­swer: be­cause prime num­bers don’t di­vide into each other evenly.

To un­der­stand what fol­lows, you need to know some facts about the physics of vi­brat­ing strings:

* When you pluck a gui­tar string, it vi­brates to and fro. You can tell how fast the string is vi­brat­ing by lis­ten­ing to the pitch it pro­duces.

* Shorter and higher-ten­sion strings vi­brate faster and make higher pitches. Longer and lower-ten­sion strings vi­brate slower and make lower pitches.

* The scientific term for the rate of the string’s vibration is its frequency. You measure frequency in hertz (Hz), a unit that just means “vibrations per second.” The standard tuning pitch, 440 Hz, is the pitch you hear when an object (like a tuning fork or guitar string) vibrates to and fro 440 times per second.

* Strings can vi­brate in many dif­fer­ent ways at once. In ad­di­tion to the en­tire length of the string bend­ing back and forth, the string can also vi­brate in halves, in thirds, in quar­ters, and so on. These vi­bra­tions of string sub­sec­tions are called har­mon­ics (or over­tones, or par­tials, they all mean the same thing.)

If you watch slow-mo­tion video of a gui­tar string vi­brat­ing, you’ll see a com­plex, evolv­ing blend of squig­gles. These squig­gles are the math­e­mat­i­cal sum of all of the string’s dif­fer­ent har­mon­ics. The weird and in­ter­est­ing thing about har­mon­ics is that each one pro­duces a dif­fer­ent pitch. So when you play a note, you’re ac­tu­ally hear­ing many dif­fer­ent pitches at once.

It’s not dif­fi­cult to iso­late the har­mon­ics of a vi­brat­ing string and hear their in­di­vid­ual pitches. Harmonics are very use­ful for tun­ing your gui­tar — here’s a handy guide for do­ing so. They are also the ba­sis of the whole Western tun­ing sys­tem gen­er­ally.

As a string vibrates, its longer subsections produce lower and louder harmonics, while its shorter subsections produce higher and quieter harmonics.

Remember that in a real-world string, you are hear­ing all these har­mon­ics blended to­gether. However, you can iso­late the har­mon­ics of a gui­tar string by lightly touch­ing it in cer­tain places to deaden some of the vi­bra­tions.

* If you touch the vi­brat­ing string at its halfway point, that dead­ens the vi­bra­tion along the string’s en­tire length, en­abling you to hear it vi­brat­ing in halves.

* If you touch the string a third of the way along its length, that dead­ens the vi­bra­tion both of the en­tire string and the halves of the string, so you can now hear it vi­brat­ing in thirds.

* If you touch the string a quar­ter of the way along its length, that dead­ens the vi­bra­tion of the whole string, the halves, and the thirds, so you can now hear it vi­brat­ing in quar­ters.

Imagine that you have a guitar string tuned to play a note called “middle C,” which has a frequency of 1 Hz. (In reality, middle C has a frequency of 261.626 Hz, so if you want to think in terms of actual frequencies, just multiply all the numbers in the following paragraphs by 261.626.)

The first har­monic is the string vi­brat­ing along its en­tire length, oth­er­wise known as the fun­da­men­tal fre­quency. When we say that your C string is vi­brat­ing at 1 Hz, that re­ally means that its fun­da­men­tal has a fre­quency of 1 Hz. The other har­mon­ics all have other fre­quen­cies, and we’ll get to those, but the fun­da­men­tal is usu­ally the loud­est har­monic, and it’s usu­ally the only one you’re aware of hear­ing.

The second harmonic is the one you get from the string vibrating in halves. Each half of the string vibrates at twice the frequency of the whole string. The 2:1 relationship between the pitches of the first and second harmonics is called an octave. (I know that the word suggests the number eight, not the number two. Don’t worry about it.) The pitch that’s an octave above middle C has a frequency of 2 Hz, and it is also called C. Both of these notes have the same letter name because in Western convention, notes an octave apart from each other are considered to be “the same note.” The important concept here is that you can move up an octave from any pitch by doubling its frequency. You can also move down an octave from any pitch by halving its frequency.

The third har­monic is the one you get from the string vi­brat­ing in thirds. Its fre­quency is three times the fun­da­men­tal fre­quency. Since your C string’s fun­da­men­tal is 1 Hz, the third har­monic has a fre­quency of 3 Hz, and it pro­duces a note called G. The in­ter­val be­tween C and G is called a per­fect fifth, for rea­sons hav­ing noth­ing to do with har­mon­ics. I know it’s con­fus­ing.

The fourth har­monic is the one you get from the string vi­brat­ing in quar­ters, at 4 Hz. This note is an oc­tave higher than the sec­ond har­monic, and so is also called C. (The eighth har­monic will also play C, as will the six­teenth, and the thirty-sec­ond, and all the pow­ers of two up to in­fin­ity.)

The fifth har­monic is the one you get from the string vi­brat­ing in fifths. Its fre­quency is 5 Hz, and it pro­duces a note called E. The in­ter­val be­tween C and E is called a ma­jor third, which is an­other name that has noth­ing to do with har­mon­ics.

There are many more har­mon­ics (infinitely many more, in the­ory) but these first five are the most au­di­ble ones.
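The arithmetic here is just integer multiples of the fundamental. A quick sketch (the helper name is mine, not from the article):

```python
def harmonic_freqs(fundamental_hz, n=6):
    """Frequencies of the first n harmonics: integer multiples of the fundamental."""
    return [k * fundamental_hz for k in range(1, n + 1)]

# The article's toy C string at 1 Hz gives harmonics 1-6: C, C, G, C, E, G.
toy = harmonic_freqs(1)            # [1, 2, 3, 4, 5, 6]
# Multiply by 261.626 Hz for the real frequencies of a middle-C string.
real = harmonic_freqs(261.626)
```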

The an­cient Greeks fig­ured out that if you have a set of strings, it sounds re­ally good if you tune them fol­low­ing the pitch ra­tios from the nat­ural har­monic se­ries. In such tun­ing sys­tems, you pick a start­ing fre­quency, and then mul­ti­ply or di­vide it by ra­tios of whole num­bers to gen­er­ate more fre­quen­cies, the same way you fig­ure out the fre­quen­cies of a sin­gle string’s har­mon­ics. The best-sound­ing note com­bi­na­tions (to Western peo­ple) are the ones de­rived from the first few har­mon­ics. In other words, you get the nicest har­mony (for Western peo­ple) when you mul­ti­ply and di­vide your fre­quen­cies by ra­tios of the small­est prime num­bers: 2, 3, and 5.

So, let’s do it. Let’s make a tun­ing sys­tem based on the har­mon­ics of your C string. First, we should find the C, G and E notes whose fre­quen­cies are as close to each other as pos­si­ble.

* We’ve al­ready got C at 1 Hz.

* We can bring our G at 3 Hz down an oc­tave by di­vid­ing its fre­quency in half. This gives us a G at 3/2 Hz.

* We can also bring our E at 5 Hz down two oc­taves by di­vid­ing its fre­quency in half twice. This gives us an E at 5/4 Hz.

When you play 1 Hz, 5/4 Hz and 3/2 Hz at the same time, you get a lovely sound called a C ma­jor triad.
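The octave bookkeeping above can be checked exactly with Python’s `fractions` module (a sketch of the article’s arithmetic, not any real tuning library):

```python
from fractions import Fraction

def reduce_to_octave(ratio):
    """Halve a frequency ratio until it falls within the root's octave [1, 2)."""
    while ratio >= 2:
        ratio /= 2
    return ratio

c = Fraction(1)                    # the C string's fundamental
g = reduce_to_octave(Fraction(3))  # third harmonic, brought down one octave
e = reduce_to_octave(Fraction(5))  # fifth harmonic, brought down two octaves
triad = [c, e, g]                  # the C major triad: 1, 5/4, 3/2
```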

So far, so good. Let’s find some more notes!

We can ex­tend our tun­ing sys­tem by think­ing of G as our base note, and look­ing at its har­mon­ics. When we do, we get two new notes. The third har­monic of G is D at 9 Hz. (Thanks to oc­tave equiv­a­lency, we can also make Ds at 9/2 Hz, and 9/4 Hz, and 9/8 Hz, and 18 Hz, and 36 Hz, and so on.) The fifth har­monic of G is B at 15 Hz. (There are also Bs at 15/2 Hz, and 30 Hz, and so on.)

The notes C and G feel closely re­lated to each other be­cause of their shared har­monic re­la­tion­ship. The chords you get from their re­spec­tive over­tone se­ries also feel re­lated. If you al­ter­nate be­tween C ma­jor and G ma­jor chords, it just about al­ways sounds good.

Now let’s ex­tend our tun­ing sys­tem fur­ther by treat­ing D as our base note. The har­mon­ics of D give us two more new notes: the third har­monic is A at 27 Hz (and 27/2 Hz and 27/4 Hz and 27/8 Hz), and the fifth har­monic is F-sharp at 45 Hz (and 45/2 Hz and 45/4 Hz and 45/8 Hz).

G ma­jor chords and D ma­jor chords have the same re­la­tion­ship as C ma­jor and G ma­jor chords, and they sound equally good when you al­ter­nate them. Also, C ma­jor, G ma­jor and D ma­jor chords all sound good as a group, in any or­der and any com­bi­na­tion. Western peo­ple just re­ally like the sound of shared har­mon­ics. Last thing: no­tice that you can com­bine the har­mon­ics of C, G and D to form a G ma­jor scale.

Now let’s make some more notes by treat­ing A as our base and look­ing at its har­mon­ics. The third har­monic of A is E at 81 Hz (and 81/2 Hz and 81/4 Hz etc).

But wait. We al­ready had an E, at 5 Hz. If we put these two E’s in the same oc­tave, then one of them is at 80/64 Hz, and the other is at 81/64 Hz. That may not seem like much of a dif­fer­ence, but even un­trained lis­ten­ers will be able to hear that they are out of tune with each other. Furthermore, if we use the E de­rived from C, then it will be out of tune with A. However, if we use the E de­rived from A, then it will be out of tune with C. This is go­ing to be a prob­lem.
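The clash can be computed exactly: stacking four pure third-harmonic steps from C (C to G to D to A to E) gives one E, while C’s own fifth harmonic gives the other. A sketch with exact fractions:

```python
from fractions import Fraction

def reduce_to_octave(ratio):
    """Multiply or divide by 2 until the ratio lies in [1, 2)."""
    while ratio >= 2:
        ratio /= 2
    while ratio < 1:
        ratio *= 2
    return ratio

e_via_fifths = reduce_to_octave(Fraction(3) ** 4)  # C->G->D->A->E: 81/64
e_via_harmonic = reduce_to_octave(Fraction(5))     # C's fifth harmonic: 5/4 = 80/64
comma = e_via_fifths / e_via_harmonic              # 81/80, the ratio the article notes
```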

Let’s for­get about that con­flict for a sec­ond. Instead, we’ll try a dif­fer­ent method of ex­pand­ing our tun­ing sys­tem, by go­ing in the op­po­site di­rec­tion from C. Let’s think about a note that con­tains C in its har­monic se­ries. That would be F at 1/3 Hz. The third har­monic of F is C at 1 Hz, as ex­pected. The fifth har­monic of F is A at 5/3 Hz.

Uh oh. This new A con­flicts with the one we al­ready had at 27 Hz. That is not good. But let’s bracket that and keep ex­pand­ing.

We can push fur­ther left by find­ing the note whose over­tone se­ries con­tains F. That would be B-flat at 1/9 Hz. Its third har­monic is F at 1/3 Hz, and its fifth har­monic is D at 5/9 Hz. And now we have a new prob­lem: this D clashes with our ex­ist­ing D at 9 Hz.

Can you see the pattern here? Anytime you want intervals based on third harmonics, you’re multiplying and dividing by 3, but anytime you want intervals based on fifth harmonics, you’re multiplying and dividing by 5. (Notice that the conflicting notes always conflict by the same amount, too: a ratio of 81/80.) Starting from C, it’s possible to produce any note if you multiply or divide your frequencies by 3 enough times, but those notes won’t be in tune with the notes you’d get by multiplying or dividing by 5, because no power of 3 ever equals a power of 5. This is not just an abstract mathematical issue. It’s the reason it’s impossible for a guitar to be in tune with itself.

Imagine that the gui­tar’s low E string has a fre­quency of 1 Hz. (It’s re­ally 82.4069 Hz; feel free to mul­ti­ply every­thing in this next sec­tion by that num­ber if you want ac­tual fre­quen­cies.) Ideally, you want your high E string to be tuned two oc­taves above the low one, at 4 Hz. Let’s see if you can get there by tun­ing the strings pair­wise.

* The in­ter­val be­tween E and A is a fifth, but it’s up­side down, be­cause we’re go­ing down a fifth from E. In mu­sic the­ory terms, an up­side down fifth is called a fourth. You go up a fourth by mul­ti­ply­ing your fre­quency by 4/3 (it’s 3/2 up­side down, dou­bled to bring it up an oc­tave.) So your A string is now tuned to 4/3 Hz.

* The D string should be an­other fourth higher, so you can mul­ti­ply by 4/3 again, giv­ing you 16/9 Hz.

* The G string should be yet an­other fourth higher, so you mul­ti­ply by 4/3 to get 64/27 Hz.

* The B string is a ma­jor third higher than G, which means mul­ti­ply­ing by 5/4, and that puts you at 80/27 Hz.

* Finally, to get to high E, that’s an­other fourth, so you mul­ti­ply by 4/3 again, giv­ing you… uh… 320/81 Hz.

This is not good. We wanted the high E to be at 4 Hz, which is the same as 324/81 Hz. We’re 4/81 Hz flat! That dif­fer­ence is big enough to make your gui­tar tun­ing sound like warm garbage.
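The whole pairwise walk above can be replayed with exact fractions (a sketch of the article’s numbers, with the low E string scaled to 1 Hz):

```python
from fractions import Fraction

FOURTH = Fraction(4, 3)        # up a perfect fourth
MAJOR_THIRD = Fraction(5, 4)   # up a major third

low_e = Fraction(1)
a = low_e * FOURTH             # 4/3
d = a * FOURTH                 # 16/9
g = d * FOURTH                 # 64/27
b = g * MAJOR_THIRD            # 80/27
high_e = b * FOURTH            # 320/81, short of the 4 Hz target
shortfall = Fraction(4) - high_e   # 4/81 Hz flat
```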

Let’s try a dif­fer­ent strat­egy. I said you should tune the B string a ma­jor third above G. However, you could just as eas­ily re­tune the B string so it’s a fifth plus an oc­tave above the low E string. You do this by mul­ti­ply­ing 1 Hz by 3/2, and then dou­bling it, which puts your B at 3 Hz. Now the B string sounds per­fectly in tune with the low E string at 1 Hz, and with the high E string at 4 Hz. Unfortunately, the B string is now out of tune with the G string at 64/27 Hz.

So maybe you should just re­tune the G string a ma­jor third be­low your new B, at 12/5 Hz. That makes the G and B strings sound great to­gether. Unfortunately, now the G string is out of tune with the D string at 16/9 Hz.

You could re­tune the D string to be a fourth be­low G… but now the D string will be out of tune with the A string. If you re­tune the A string based on your new D, then it will be out of tune against the low E string. And if you re­tune the low E string based on your new A, then it will be out of tune with the high E string.

The bot­tom line: there is no way to tune the gui­tar so that every string is in tune with every other string.

The math­e­mat­i­cal awk­ward­ness of har­mon­ics-based tun­ing sys­tems has caused Western mu­si­cians a lot of pain over the past thou­sand years. Depending on your start­ing pitch, some in­ter­vals can be per­fectly in tune, but oth­ers can’t be. And the more har­mon­i­cally com­plex you want your mu­sic to be, the worse the tun­ing is­sues be­come.

In the 16th cen­tury, Chinese and Dutch mu­si­cians in­de­pen­dently came up with an al­ter­na­tive sys­tem to har­mon­ics-based tun­ing, called 12-tone equal tem­pera­ment, or 12-TET. It’s the sys­tem that the en­tire Western world uses to­day. The idea be­hind 12-TET is to have every­thing be pretty much in tune, which you ac­com­plish by hav­ing every­thing be a lit­tle bit out of tune. Is this a worth­while com­pro­mise? Let’s do the math and find out.

In 12-TET, you di­vide up the oc­tave into twelve equally-sized semi­tones (the in­ter­val be­tween two ad­ja­cent pi­ano keys or gui­tar frets). To go up a semi­tone from any note, you mul­ti­ply its fre­quency by the 12th root of 2 (about 1.05946). To go down a semi­tone from any note, you di­vide its fre­quency by the 12th root of 2. If you go up by an oc­tave (twelve semi­tones), you’re mul­ti­ply­ing your fre­quency by the 12th root of 2 twelve times, which works out to 2. That’s a per­fect oc­tave, hooray! Unfortunately, you can’t ex­actly cre­ate the other har­mon­ics-based in­ter­vals by adding up 12-TET semi­tones; you can only ap­prox­i­mate them.

Remember that the pure fifth you get from har­mon­ics is a fre­quency ra­tio of 3/2. In 12-TET, how­ever, you make a fifth by adding up seven semi­tones. This means that you mul­ti­ply your fre­quency by the 12th root of two seven times, which comes to about 1.498. That’s close to 3/2, but it’s not ex­act. As a re­sult, fifths in 12-TET sound a lit­tle flat com­pared to what your ear is ex­pect­ing from nat­ural har­mon­ics.

Major thirds are worse in 12-TET. Recall that the ma­jor third you get from the over­tone se­ries is a fre­quency ra­tio of 5/4. In 12-TET, you make a ma­jor third by adding four semi­tones, which means that you mul­ti­ply your fre­quency by the 12th root of 2 four times. That comes to 1.25992, which is no­tice­ably higher than 5/4. Thirds in 12-TET are quite sharp com­pared to what your ears are ex­pect­ing from nat­ural har­mon­ics.
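Both comparisons are easy to check numerically. A sketch, using the conventional unit for such errors, the cent (1/1200 of an octave; the unit is standard music theory, not from the article):

```python
import math

SEMITONE = 2 ** (1 / 12)

def cents(ratio):
    """Size of a frequency ratio in cents: 1200 cents per octave."""
    return 1200 * math.log2(ratio)

tet_fifth = SEMITONE ** 7                 # ~1.4983, vs the pure 3/2
tet_third = SEMITONE ** 4                 # ~1.2599, vs the pure 5/4
fifth_error = cents(tet_fifth / (3 / 2))  # about -2 cents: slightly flat
third_error = cents(tet_third / (5 / 4))  # about +14 cents: audibly sharp
```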

If thirds and fifths are so out of tune in 12-TET, why do we use it? The ad­van­tage is that all the thirds and fifths in all the keys are out of tune by the same amount. None of them sound per­fect, but none of them sound ter­ri­ble, ei­ther. You don’t have to worry about whether your notes are de­rived from the third har­monic of some note or the fifth har­monic of some other note; they all just work to­gether, kind of. If you use a dig­i­tal gui­tar tuner, you are tun­ing your strings to the 12-TET ver­sions of E, A, D, G and B. None of them will be per­fectly in tune with each other, but they will all be wrong by an ac­cept­able amount. Also, songs in the key of E won’t sound any bet­ter or worse than songs in the key of F or E-flat.

Not everyone in history thought that 12-TET was an acceptable compromise. Johann Sebastian Bach thought we should use other tuning systems that made better-sounding thirds and fifths in some keys in exchange for worse-sounding thirds and fifths in others. In Bach’s preferred tuning, each key had its own distinctive blend of smoothness and harshness. However, Bach did not get his way. We as a civilization have collectively decided that we want all our keys to be interchangeable. There are good reasons to want this! In 12-TET, all intervals and chords are built from standardized, Lego-like parts. You don’t have to keep track of a complicated web of different-sized intervals in every key. If you move a song from C to C-sharp or D or anywhere else, you can be confident that it will still sound “the same.”

Some musicians don’t want to accommodate themselves to 12-TET, insisting instead that we should continue to use pure intervals derived from harmonics the way God and Pythagoras intended. Harmonics-based tuning systems are collectively known as just intonation systems. This is a poetically apt term, because it implies fairness. By contrast, the implicit message of 12-TET is that life isn’t fair. Just intonation systems give you some lovely pure intervals, but you can’t change keys unless you retune all your instruments. In other world cultures, this is not necessarily a problem. Hindustani classical music uses just intonation over an omnipresent drone, so everything is always in the same “key.”

Meanwhile, a few Western odd­balls and nerds have ex­plored just in­to­na­tion sys­tems that use big­ger prime num­bers than 2, 3 and 5 to gen­er­ate finer pure in­ter­vals. Harry Partch used the primes up to eleven to make a tun­ing sys­tem that di­vides up the oc­tave into 43 pure parts rather than 12 im­pure ones. You can try the Partch 43-tone scale us­ing the Wilsonic app or Audiokit Synth One. It’s ex­tremely strange! But, I guess, it’s strange in a pure way. I have made some mu­sic of my own with ex­otic just in­to­na­tion tun­ings.

Just in­to­na­tion may also play a role in the blues. There is a the­ory that the blues orig­i­nates from the nat­ural over­tone se­ries of I and IV. If this is true, then the char­ac­ter­is­tic chords and scales of the blues are re­ally 12-TET ap­prox­i­ma­tions of the orig­i­nal just in­to­na­tion blues scale. It’s con­ven­tional to say that blues mu­si­cians and singers bend notes to make them go out of tune, but it may be that they are ac­tu­ally bend­ing the 12-TET pitches to get them in tune in­stead.

Anyway, out­side of the blues and the avant-garde, most Western mu­si­cians just live with every­thing be­ing a lit­tle out of tune. If you’re a gui­tarist, you know that no mat­ter how you tune your gui­tar, it won’t stay in tune for long any­way, so how much does any of this even mat­ter? There’s a joke among gui­tarists: we spend half our lives tun­ing, and the other half wish­ing we were in tune. There are lots of rea­sons why tun­ing is hard: you might be ham­pered by hav­ing a poorly made gui­tar, or by hav­ing a gui­tar that’s not set up cor­rectly, or by us­ing old worn-out strings, or by changes in tem­per­a­ture or hu­mid­ity, or just by a lack of pa­tience or time. At least you can be se­cure in the knowl­edge that some of your tun­ing strug­gles are due to the ba­sic un­fair­ness of the uni­verse, and not just the lim­i­ta­tions of your ears or your equip­ment.

...

Read the original on www.ethanhein.com »
