10 interesting stories served every morning and every evening.
(Photography)
Here’s a photo of a Christmas tree, as my camera’s sensor sees it:
This is because while the camera’s analog-to-digital converter (ADC) can theoretically output values from 0 to 16382, the data doesn’t cover that whole range:
The real range of ADC values is ~2110 to ~13,600. Let’s set those values as the black and white points of the image:
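In code, that remapping is just a linear rescale. Here’s a minimal sketch using NumPy, with the approximate black and white points measured above:

```python
import numpy as np

def scale_to_range(raw, black=2110, white=13600):
    # Map the sensor's usable ADC range onto [0, 1],
    # clipping anything outside the black/white points.
    out = (raw.astype(np.float64) - black) / (white - black)
    return np.clip(out, 0.0, 1.0)
```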
Much better, but it’s still more monochromatic than I remember the tree being. Camera sensors aren’t actually able to see color: They only measure how much light hits each pixel.
In a color camera, the sensor is covered by a grid of alternating color filters:
Let’s color each pixel the same as the filter it’s looking through:
This version is more colorful, but each pixel only has one third of its RGB color.
To fix this, I just averaged the values of each pixel with its neighbors:
Applying this process to the whole photo gives the lights some color:
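The naive averaging demosaic described above can be sketched like this, assuming an RGGB filter layout (the actual pattern varies by camera):

```python
import numpy as np

def box3(a):
    # Sum over each pixel's 3x3 neighborhood (zero-padded at the edges).
    p = np.pad(a, 1)
    return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
               for i in range(3) for j in range(3))

def demosaic_naive(raw):
    # Naive demosaic for an RGGB Bayer mosaic: reconstruct each channel
    # by averaging the samples of that color in a 3x3 neighborhood.
    h, w = raw.shape
    masks = {c: np.zeros((h, w), dtype=bool) for c in "RGB"}
    masks["R"][0::2, 0::2] = True   # red filters on even rows/cols
    masks["G"][0::2, 1::2] = True   # green appears twice per 2x2 block...
    masks["G"][1::2, 0::2] = True   # ...which is one source of the green cast
    masks["B"][1::2, 1::2] = True
    rgb = np.zeros((h, w, 3))
    for i, c in enumerate("RGB"):
        total = box3(np.where(masks[c], raw, 0.0))
        count = box3(masks[c].astype(float))
        rgb[..., i] = total / np.maximum(count, 1.0)
    return rgb
```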
However, the image is still very dark. This is because monitors don’t have as much dynamic range as the human eye, or a camera sensor: Even if you are using an OLED, the screen still has some ambient light reflecting off of it and limiting how black it can get.
There’s also another, sneakier factor causing this:
Our perception of brightness is non-linear.
If brightness values are quantized linearly, most of the ADC bins will be wasted on nearly identical shades of white while every other tone is crammed into the bottom. Because this is an inefficient use of memory, most color spaces assign extra bins to darker colors:
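A quick back-of-the-envelope calculation shows the imbalance. Treating a gamma-2.2 power law as a rough stand-in for perceived lightness (an approximation, not a full perceptual model), the large majority of a 14-bit sensor’s codes end up describing the brighter half:

```python
import numpy as np

# 14-bit linear ADC codes, mapped through a rough gamma-2.2
# approximation of perceived lightness.
codes = np.arange(16384)
lightness = (codes / 16383) ** (1 / 2.2)

# How many codes land in the darker half of perceived lightness?
dark_half = int(np.count_nonzero(lightness < 0.5))
print(dark_half, 16384 - dark_half)  # roughly 3600 vs 12800
```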
As a result of this, if the linear data is displayed directly, it will appear much darker than it should be.
Both problems can be solved by applying a non-linear curve to each color channel to brighten up the dark areas… but this doesn’t quite work out:
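The brightening curve itself can be as simple as a power law. This sketch uses a plain gamma of 2.2; real color spaces like sRGB use a slightly more elaborate curve, so treat this as an approximation:

```python
import numpy as np

def apply_curve(linear, gamma=2.2):
    # Lift the shadows with a power curve; values are in [0, 1].
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)
```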
Some of this green cast is caused by the camera sensor being intrinsically more sensitive to green light, but some of it is my fault: There are twice as many green pixels in the filter matrix. When combined with my rather naive demosaicing, this resulted in the green channel being boosted even higher.
In either case, it can be fixed with proper white balance: Equalize the channels by multiplying each one by a constant.
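As a sketch, white balance is a one-liner. The gains below are made-up illustrative numbers, not ones measured from this photo; in practice you would derive them from a neutral patch, and apply them to linear data:

```python
import numpy as np

def white_balance(rgb_linear, gains=(1.0, 0.5, 1.0)):
    # Scale each channel by a constant gain (here: knock the green
    # channel down). Must run on linear data, before the tone curve.
    return rgb_linear * np.asarray(gains)
```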
However, because the image is now non-linear, I have to go back a step to do this. Here’s the dark image from before with all the values temporarily scaled up so I can see the problem:
… here’s that image with the green taken down to match the other channels:
… and after re-applying the curve:
This is really just the bare minimum: I haven’t done any color calibration, the white balance isn’t perfect, the black points are too high, there’s lots of noise that needs to be cleaned up…
Additionally, applying the curve to each color channel accidentally desaturated the highlights. This effect looks rather good — and is what we’ve come to expect from film — but it has de-yellowed the star. It’s possible to separate the luminance and curve it while preserving color. On its own, this would make the LED Christmas lights into an oversaturated mess, but combining both methods can produce nice results.
For comparison, here’s the image my camera produced from the same data:
This is far from an “unedited” photo: a huge amount of math has gone into making an image that nicely represents what the subject looks like in person.
Nothing happens when you adjust the contrast or white balance in editing software that the camera hasn’t already done under the hood. The edited image isn’t “faker” than the original: they are different renditions of the same data.
In the end, replicating human perception is hard, and it’s made harder when constrained to the limitations of display technology or printed images. There’s nothing wrong with tweaking the image when the automated algorithms make the wrong call.
...
Read the original on maurycyz.com »
uv installs packages faster than pip by an order of magnitude. The usual explanation is “it’s written in Rust.” That’s true, but it doesn’t explain much. Plenty of tools are written in Rust without being notably fast. The interesting question is what design decisions made the difference.
Charlie Marsh’s Jane Street talk and a Xebia engineering deep-dive cover the technical details well. The interesting parts are the design decisions: standards that enable fast paths, things uv drops that pip supports, and optimizations that don’t require Rust at all.
pip’s slowness isn’t a failure of implementation. For years, Python packaging required executing code to find out what a package needed.
The problem was setup.py. You couldn’t know a package’s dependencies without running its setup script. But you couldn’t run its setup script without installing its build dependencies. PEP 518 in 2016 called this out explicitly: “You can’t execute a setup.py file without knowing its dependencies, but currently there is no standard way to know what those dependencies are in an automated fashion without executing the setup.py file.”
This chicken-and-egg problem forced pip to download packages, execute untrusted code, fail, install missing build tools, and try again. Every install was potentially a cascade of subprocess spawns and arbitrary code execution. Installing a source distribution was essentially curl | bash with extra steps.
The fix came in stages:
* PEP 518 (2016) created pyproject.toml, giving packages a place to declare build dependencies without code execution. The TOML format was borrowed from Rust’s Cargo, which makes a Rust tool returning to fix Python packaging feel less like coincidence.
* PEP 517 (2017) separated build frontends from backends, so pip didn’t need to understand setuptools internals.
* PEP 621 (2020) standardized the [project] table, so dependencies could be read by parsing TOML rather than running Python.
* PEP 658 (2022) put package metadata directly in the Simple Repository API, so resolvers could fetch dependency information without downloading wheels at all.
PEP 658 went live on PyPI in May 2023. uv launched in February 2024. uv could be fast because the ecosystem finally had the infrastructure to support it. A tool like uv couldn’t have shipped in 2020. The standards weren’t there yet.
Other ecosystems figured this out earlier. Cargo has had static metadata from the start. npm’s package.json is declarative. Python’s packaging standards finally bring it to parity.
Speed comes from elimination. Every code path you don’t have is a code path you don’t wait for.
uv’s compatibility documentation is a list of things it doesn’t do:
No .egg support. Eggs were the pre-wheel binary format. pip still handles them; uv doesn’t even try. The format has been obsolete for over a decade.
No pip.conf. uv ignores pip’s configuration files entirely. No parsing, no environment variable lookups, no inheritance from system-wide and per-user locations.
No bytecode compilation by default. pip compiles .py files to .pyc during installation. uv skips this step, shaving time off every install. You can opt in if you want it.
Virtual environments required. pip lets you install into system Python by default. uv inverts this, refusing to touch system Python without explicit flags. This removes a whole category of permission checks and safety code.
Stricter spec enforcement. pip accepts malformed packages that technically violate packaging specs. uv rejects them. Less tolerance means less fallback logic.
Ignoring requires-python upper bounds. When a package declares an upper bound such as requires-python: <4, uv ignores it and only checks the lower bound. This reduces resolver backtracking dramatically, since upper bounds are almost always wrong: packages declare <4 because they haven’t tested on Python 4, not because they’ll actually break. The constraint is defensive, not predictive.
First-index wins by default. When multiple package indexes are configured, pip checks all of them. uv picks from the first index that has the package, stopping there. This prevents dependency confusion attacks and avoids extra network requests.
Each of these is a code path pip has to execute and uv doesn’t.
Some of uv’s speed comes from Rust. But not as much as you’d think. Several key optimizations could be implemented in pip today:
HTTP range requests for metadata. Wheel files are zip archives, and zip archives put their file listing at the end. uv tries PEP 658 metadata first, falls back to HTTP range requests for the zip central directory, then full wheel download, then building from source. Each step is slower and riskier. The design makes the fast path cover 99% of cases. None of this requires Rust.
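That tiered fallback can be sketched as a loop over metadata sources, fastest first. The callables here are hypothetical stand-ins for the PEP 658 fetch, the range request, the full download, and the source build:

```python
def fetch_metadata(pkg, sources):
    # Try each metadata source in order, fastest first. Each entry is
    # a callable that returns metadata or raises LookupError when that
    # tier isn't available for the package.
    for source in sources:
        try:
            return source(pkg)
        except LookupError:
            continue
    raise LookupError(f"no metadata found for {pkg}")
```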
Parallel downloads. pip downloads packages one at a time. uv downloads many at once. Any language can do this.
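A minimal sketch with Python’s standard library proves the point; the fetch function is injected so the example needs no real network:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(urls, fetch, workers=8):
    # Fetch many URLs concurrently instead of one at a time.
    # `fetch` is any callable that retrieves a single URL.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, urls))
```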
Global cache with hardlinks. pip copies packages into each virtual environment. uv keeps one copy globally and uses hardlinks (or copy-on-write on filesystems that support it). Installing the same package into ten venvs takes the same disk space as one. Any language with filesystem access can do this.
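The hardlink-with-fallback trick fits in a few lines of any language. A Python sketch:

```python
import os
import shutil

def link_from_cache(cache_path, dest_path):
    # Hard-link a cached file into a venv rather than copying it, so
    # ten venvs share one copy on disk. Fall back to a real copy on
    # filesystems that don't support hardlinks.
    try:
        os.link(cache_path, dest_path)
    except OSError:
        shutil.copy2(cache_path, dest_path)
```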
Python-free resolution. pip needs Python running to do anything, and invokes build backends as subprocesses to get metadata from legacy packages. uv parses TOML and wheel metadata natively, only spawning Python when it hits a setup.py-only package that has no other option.
PubGrub resolver. uv uses the PubGrub algorithm, originally from Dart’s pub package manager. Both pip and PubGrub use backtracking, but PubGrub applies conflict-driven clause learning from SAT solvers: when it hits a dead end, it analyzes why and skips similar dead ends later. This makes it faster on complex dependency graphs and better at explaining failures. pip could adopt PubGrub without rewriting in Rust.
Zero-copy deserialization. uv uses rkyv to deserialize cached data without copying it. The data format is the in-memory format. Libraries like FlatBuffers achieve this in other languages, but rkyv integrates tightly with Rust’s type system.
Thread-level parallelism. Python’s GIL forces parallel work into separate processes, with IPC overhead and data copying. Rust can parallelize across threads natively, sharing memory without serialization boundaries. This matters most for resolution, where the solver explores many version combinations.
No interpreter startup. Every time pip spawns a subprocess, it pays Python’s startup cost. uv is a single static binary with no runtime to initialize.
Compact version representation. uv packs versions into u64 integers where possible, making comparison and hashing fast. Over 90% of versions fit in one u64. This is micro-optimization that compounds across millions of comparisons.
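The idea can be sketched in a few lines; the 21-bit fields below are an illustrative layout, not uv’s actual encoding:

```python
def pack_version(major, minor, patch):
    # Pack three release segments into a single 64-bit integer so that
    # ordinary integer comparison orders versions correctly.
    assert max(major, minor, patch) < (1 << 21), "segment too large to pack"
    return (major << 42) | (minor << 21) | patch
```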
These are real advantages. But they’re smaller than the architectural wins from dropping legacy support and exploiting modern standards.
uv is fast because of what it doesn’t do, not because of what language it’s written in. The standards work of PEP 518, 517, 621, and 658 made fast package management possible. Dropping eggs, pip.conf, and permissive parsing made it achievable. Rust makes it a bit faster still.
pip could implement parallel downloads, global caching, and metadata-only resolution tomorrow. It doesn’t, largely because backwards compatibility with fifteen years of edge cases takes precedence. But it means pip will always be slower than a tool that starts fresh with modern assumptions.
Other package managers could learn from this: static metadata, no code execution to discover dependencies, and the ability to resolve everything upfront before downloading. Cargo and npm have operated this way for years. If your ecosystem requires running arbitrary code to find out what a package needs, you’ve already lost.
...
Read the original on nesbitt.io »
If you live in Germany, you have been treated like livestock by Deutsche Bahn (DB). Almost all of my friends have a story: they traveled with DB, got thrown out in the middle of the night in some cow village, and had to wait hours for the next train.
I have something better. I was kidnapped.
I am taking the RE5 (ID 28521) to my grandmother’s house in Meckenheim. Scheduled departure: 15:32. Scheduled arrival in Bonn: 15:54. From there, the S23 to Meckenheim. A journey of 35 kilometers, or, in DB units, somewhere between forty-five minutes and the heat death of the universe.
I wanted to arrive early to spend more time with her. My father, who lives near Troisdorf, was supposed to join us later.
I board the train. It is twenty minutes late. I consider this early. At least the train showed up. In DB’s official statistics, a train counts as “on time” if it’s less than six minutes late. Cancelled trains are not counted at all. If a train doesn’t exist, it cannot be late.
The train starts moving. The driver announces there are “issues around Bonn.” He does not specify what kind. No one asks. We have learned not to ask. He suggests we exit at Cologne South and take the subway, or continue to Troisdorf and catch a bus from there.
I decide to continue to Troisdorf. My father can just pick me up there and we drive together. The plan adapts.
The driver announces the full detour: from Cologne South to Troisdorf to Neuwied to Koblenz. The entire left bank of the Rhine is unavailable. Only then do I notice: the driver has been speaking German only. If you were a tourist who got on in Cologne to visit Brühl, thirteen minutes away, you were about to have a very confusing Christmas in Troisdorf.
A woman near me is holding chocolates and flowers. She is on the phone with her mother. “Sorry Mama, I’ll be late.” Pause. “Deutsche Bahn.” Pause. Her mother understood.
Twenty minutes later. We are approaching Troisdorf. I stand up. I gather my things. My father texts me: he is at the station, waiting.
The driver comes back on: “Hello everyone. Apparently we were not registered at Troisdorf station, so we are on the wrong tracks. We cannot stop.”
He says this the way someone might say “the coffee machine is broken.”
I watch Troisdorf slide past the window. Somewhere in the parking lot outside the station, my father is sitting in his car, watching his son pass by as livestock.
I was trying to travel 35 kilometers. I was now 63 kilometers from my grandmother’s house. Further away than when I started.
There are fifteen stations between Troisdorf and Neuwied. We pass all of them.
At some point you stop being a passenger and start being cargo. A cow transporter. Mooohhhhh. A cow transporter going to a cow village. (Germany has a word for this: Kuhdorf. The cows are metaphorical. Usually.) I reached this point around Oberkassel.
DB once operated a bus to Llucalcari, a Mallorcan village of seventeen people. I wanted to take it home.
An English speaker near the doors is getting agitated. “What is happening? Why didn’t we stop?”
“We are not registered for this track.”
“But where will we stop?”
“Fifty-five minutes.” He said it again, quieter. “I am being kidnapped.”
My seatmate, who had not looked up from his book in forty minutes, turned a page. “Deutsche Bahn.”
I had been kidnapped at a loss.
...
Read the original on www.theocharis.dev »
Christmas is often regarded as a time for goodwill, but one young UK couple’s act of kindness 50 years ago changed their lives forever.
On 23 December 1975, Rob Parsons and his wife Dianne were preparing for Christmas at their Cardiff home when they heard a knock at the door.
On their doorstep stood a man with a bin bag containing his possessions in his right hand and a frozen chicken in his left.
Rob studied the man’s face and vaguely remembered him as Ronnie Lockwood, someone he would occasionally see at Sunday School as a boy and who he was told to be kind to as he was a “bit different”.
“I said ‘Ronnie, what’s with the chicken?’ He said ‘somebody gave it to me for Christmas’. And then I said two words that changed all of our lives.
“And I’m not exactly sure why I said them. I said come in.”
...
Read the original on www.bbc.co.uk »
It’s anecdotal, I know, but my main entertainment business revenue is down 50% over the past 3 months. Our main paid source of leads was Google Ads, which have served us well over the past 10 years or so — I think I know what I am doing in adwords by now.
Once per month I check the analytics, updating keywords and tweaking ad campaigns. Over the past year we increased our budget, and then I started looking at it once per week, running simultaneous campaigns with different settings, just trying to get SOMETHING.
Last month Google gave us a bonus — free money! This was 5x our monthly ad spend, to spend just when we needed it most — over the December holidays. I added another new campaign, updated the budgets for the existing ones. Still no change. The last week there was money to burn, left over from unused ad spend. I increased our budget to 10x. ZERO RETURN.
The money ran out. I am not putting more in. Where do we go from here?
Research shows that many young people are getting their information from short video platforms like TikTok and Instagram. We are trying ads on there.
Our customer base is 50% returning customers (I am proud of that statistic!). We have an email newsletter, and we’ve started sending it regularly over the past 2 months. Remember us?
We also plan to do some actual physical advertising — I am going to a market next weekend, doing a free show or two, handing out cards.
Also, we are branching out — I have some projects I want to make, related to the Magic Poi project, and hopefully sell. We ordered supplies last week.
Right now, though — I’m broke. Anyone need a website or IoT project built? I am AI assisted, very fast!
...
Read the original on www.circusscientist.com »
👋 Hello! If you print this page, you’ll get a nifty calendar that displays all of the year’s dates on a single page. It will automatically fit on a single sheet of paper of any size. For best results, adjust your print settings to landscape orientation and disable the header and footer.
Take in the year all at once. Fold it up and carry it with you. Jot down your notes on it. Plan things out and observe the passage of time. Above all else, be kind to others.
Looking for 2026? Here you go!
...
Read the original on neatnik.net »
People examining documents released by the Department of Justice in the Jeffrey Epstein case discovered that some of the file redaction can be undone with Photoshop techniques, or by simply highlighting text to paste into a word processing file.
Un-redacted text from these documents began circulating through social media on Monday evening. An exhibit in a civil case in the Virgin Islands against Darren K Indyke and Richard D Kahn, two executors of Epstein’s estate, contains redacted allegations explaining how Epstein and his associates had facilitated the sexual abuse of children. The exhibit was the second amended complaint in the state case against Indyke and Kahn.
In section 85, the redacted portion states: “Between September 2015 and June 2019, Indyke signed (FAC) for over $400,000 made payable to young female models and actresses, including a former Russian model who received over $380,000 through monthly payments of $8,333 made over a period of more than three and a half years until the middle of 2019.”
Prosecutors in the Virgin Islands settled their civil sex-trafficking case against Epstein’s estate, Indyke and Kahn in 2022 for $105m, plus one half of the proceeds from the sale of Little St James, the island on which Epstein resided and on which many of his crimes occurred. The justice department press release announcing the settlement did not include an admission of liability.
Indyke, an attorney who represented Epstein for decades, has not been criminally indicted by federal authorities. He was hired by the Parlatore Law Group in 2022, before the justice department settled the Epstein case. That firm represents the defense secretary, Pete Hegseth, and previously represented Donald Trump in his defense against charges stemming from the discovery of classified government documents stored at Trump’s Florida estate. Calls and email seeking comment from Indyke and the Parlatore Law Group have not yet been returned.
Trump has repeatedly denied any knowledge of or involvement in Epstein’s criminal activities and any wrongdoing.
Other sections further allege how Epstein’s enterprise concealed crimes.
“Defendants also attempted to conceal their criminal sex trafficking and abuse, conduct by paying large sums of money to participant-witnesses, including by paying for their attorneys’ fees and case costs in litigation related to this conduct,” reads one redacted passage.
“Epstein also threatened harm to victims and helped release damaging stories about them to damage their credibility when they tried to go public with their stories of being trafficked and sexually abused. Epstein also instructed one or more Epstein Enterprise participant-witnesses to destroy evidence relevant to ongoing court proceedings involving Defendants’ criminal sex trafficking and abuse conduct.”
Redactions of sections 184 through 192 of the document describe property taxes paid by companies incorporated by Epstein on properties that were not on the balance sheet for those firms.
“For instance, Cypress’s Balance Sheet as of December 31, 2018 did not reflect any assets other than cash of $18,824. Further, Cypress reported only $301 in expenses for the year ended December 31, 2018, despite it paying $106,394.60 in Santa Fe property taxes on November 6, 2018,” reads one redacted passage.
“Similarly, in 2017, Cypress reported as its only asset cash in the amount of $29,736 and expenses of $150, despite it paying $55,770.41 and $113,679.56 in Santa Fe property taxes during 2017.”
The Epstein Files Transparency Act signed into law last month permits the Department of Justice “to withhold certain information such as the personal information of victims and materials that would jeopardize an active federal investigation”.
It was unclear how the property-tax material complies with the redaction standard under the law. An inquiry to the Department of Justice has not yet been answered.
...
Read the original on www.theguardian.com »
I’ve been reading Lord of the Rings for two months and I’m just at the end of the first part. It’s not because I’m not enjoying it. It’s one of the most enjoyable reading experiences I can remember.
From the beginning, I’ve read the whole thing aloud. I’ve found reading aloud helpful for staying engaged — limiting myself to mouth-speed rather than eye-speed means I won’t rush, miss important details, and then lose interest, which has always been a problem for me.
At first I was anxious to read a 1,500-page book this way, because it would take so long. But, as someone pointed out to me, if I’m enjoying it, why would I want to be done with it sooner?
So I tried slowing down even more, and discovered something. I slowed to a pace that felt almost absurd, treating each sentence as though it might be a particularly important one. I gave each one maybe triple the usual time and attention, ignoring the fact that there are hundreds of pages to go.
This leisurely pace made Middle-Earth blossom before my eyes. When I paused after each comma, and let each sentence ring for a small moment after the period, the events of the story reached me with more weight and strength. That extra time gave space for Tolkien’s images and moods to propagate in my mind, which they did automatically.
Some part of me still wanted to rush and get on with it, to make good time, to gloss over the songs and lore to get to Moria and Mount Doom and the other marquee moments of the story. But the more I ignored that impulse, the better the experience got.
By offering the book about triple the usual amount of attentiveness, I was getting about triple the storyness (i.e. meaning, engagement, literary pleasure). Whatever the thing is that I’m seeking when I pick up a novel in the first place, there’s much more of it available at this pace.
This effect reminded me of a paradox around eating I recognized long ago. When you slow down your eating speed, say to half or a third your default speed, you get much more enjoyment out of a smaller amount of food. The extra attention given to each bite allows more of the “good stuff,” whatever that is exactly, to reach you.
What’s paradoxical is that it’s precisely the seeking of that “good stuff” that normally drives me to eat so quickly, and miss most of what I’m seeking. When you try to barrel ahead to access the good stuff quicker, you get less of it in the end. Slow down and much more of it is released.
And it’s released automatically, in both reading and eating. You don’t have to search it out. The good stuff (the meaning in the text, the pleasure in the eating) just rises up to meet you in that extra time you give it. Slowing down, and offering more time to the act of consumption, immediately increases reading comprehension (and eating comprehension).
Both are analogous to slowing down while you vacuum a carpet. If you pass the vacuum head too quickly, you miss half the dirt. Slow down, and you can hear how much more grit is sent skittering up the tube. The suction and bristles are working, but they need more time to do their work fully, to draw up the deeper-lying stuff.
It seems that my default consumption speeds for reading and eating (and maybe everything else) reduce the rewards of those things significantly, undermining the point of doing either.
Part of it is my own impatience. But I also suspect that modern living, with its infinite supply of consumables, tends to push our rate-of-intake dials too high. I’m not going to run out of books, or snacks, or opportunities to learn something. There’s always more, so not every crust of bread or printed page needs to be appreciated fully.
Internally though, the mind is juggling like Lucy and Ethel on the conveyor belt at the chocolate factory. Our receptors for meaning and appreciation, like the vacuum head, need more time to do their full work, to make all the connections they’re designed to make.
It might sound like I’m just offering clichés — less is more, stop and smell the roses, take your time — and I guess I am. But clichés suffer the same issue: they are often profound insights, consumed and passed on too rapidly for their real meaning to register anymore. You really should stop and smell roses, as you know if you’re in the habit of doing that.
At least see what happens when you reduce your consumption speed — of anything, but especially books, information, and food — by a half, or two thirds. Notice that (1) something in you really wants to plow through at the highest viable setting, and (2) how much more of the reward is released when you slow down anyway.
As far as I can tell, almost everything becomes more satisfying when you give it more time and intention, even things like checking the mailbox or writing a shopping list.
Slowing down your rate of consumption will inevitably change what you want to consume. Reading throwaway news articles or AI slop with great care and attention is only going to show you how empty of value it is. Reading dense writing in inky old books, crafted for your mind by great masters, becomes easier without the rushed pace, and the meaning just blooms out of it.
Same with food. Try to savor a cheap, waxy “chocolate” bar, or a bag of store-brand cheese puffs, and you discover a harsh taste that you don’t want to look at too closely. Enjoy a homemade pastry with great attention, and discover there’s even more in it than you realized.
Mass production is good in so many ways, but the faster we tend to consume its fruits, the more we end up seeking things for their glossy, candied surfaces. The more we go for these surface-level rewards, the more the culture focuses on offering only that part — such as TikTok videos, processed food, CGI-forward movies, and public discourse in the form of unexamined talking points.
Who knows how far we’ve drifted from the best modes of consuming the things we value. Once something becomes a norm, it seems like an appropriate standard, no matter how much has been lost. Apparently, reading silently and alone was unusual until as late as the 18th century. Certainly sit-down meals and cooking at home were.
I don’t mean to sound like a scold. Let’s say none of this is morally good or bad. It’s just that in so much of what we do, we could be getting much more of the part of it that we really seek — but it’s only available at slower speeds.
If you’re curious, try consuming things more slowly, so slowly it seems silly to others — say a third your habitual speed — and see what rises up to meet you.
Recently I opened a discussion forum for Raptitude readers who want to give something up for the month of December (alcohol, social media, snacks, etc).
It’s been a real success, and many people want to do something similar in January. If you want to quit something, or just give it up for a month, you’re invited to join.
Follow this link at the end of this post to get an invite.
...
Read the original on www.raptitude.com »
Today, Michał Kiciński, one of the co-founders of CD PROJEKT, and the co-founder of GOG, has acquired GOG from CD PROJEKT.
We believe the games that shaped us deserve to stay alive: easy to find, buy, download, and play forever. But time is annoyingly good at erasing them. Rights get tangled, compatibility breaks, builds disappear, and a nostalgic evening often turns into a troubleshooting session. That’s the difference between “I’m playing today” (the game lives on) and “I’ll play someday” (the game dies).
As Michał put it: “GOG stands for freedom, independence, and genuine control.”
GOG has always been built on strong values and clear principles. When Marcin Iwiński and Michał Kiciński first came up with the idea for GOG in 2007, the vision was simple: bring classic games back to players, and make sure that once you buy a game, it truly belongs to you, forever. In a market increasingly defined by mandatory clients and closed ecosystems, that philosophy feels more relevant than ever.
This new chapter is about doubling down on that vision. We want to do more to preserve the classics of the past, celebrate standout games of today, and help shape the classics of tomorrow, including new games with real retro spirit.
First of all, DRM-free is more central to GOG than ever. Your library stays yours to enjoy: same access, same offline installers, same sense of ownership. Your data stays with GOG, and GOG GALAXY remains optional.
We’ll keep our relationship with CD PROJEKT. CD PROJEKT RED games will continue to be available on GOG, and upcoming titles from the studio will also be released on the platform.
If you’re a GOG Patron, or you donate to support the Preservation Program, those funds stay within GOG. Your support has been huge this year, and we think that with your help, we can undertake even more ambitious rescue missions in 2026 and 2027. We’ll have more to say about that sometime in 2026.
GOG will remain independent in its operations. We will continue building a platform that’s ethical, non-predatory, and made to last, while helping indie developers reach the world. We’re also committed to giving the community a stronger voice, with new initiatives planned for 2026.
Thanks for being the reason this all matters.
A lot of companies sell games. Fewer do the unglamorous work of making sure the games that shaped people’s lives don’t quietly rot into incompatibility.
Thanks for caring about this mission with us. We’ll keep you posted as we ship, and in the meantime, you can dig into the full FAQ for the detailed answers.
...
Read the original on www.gog.com »
10HN is also available as an iOS App
If you visit 10HN only rarely, check out the best articles from the past week.
If you like 10HN please leave feedback and share
Visit pancik.com for more.