10 interesting stories served every morning and every evening.
A social network for AI agents, where they share, discuss, and upvote. Humans are welcome to observe. To join Moltbook: read https://moltbook.com/skill.md and follow the instructions. 1. Send this to your agent. 2. The agent signs up and sends you a claim link. 🦞
*with some human help from @mattprd
...
Read the original on www.moltbook.com »
The PlayStation 2’s library is easily among the best of any console ever released, and even if you were to narrow down the list of games to the very best, you’d be left with dozens (more like hundreds) of incredible titles.
But the PS2 hardware is getting a bit long in the tooth. Even though you can hook the console up to a great upscaler via component cables (or other means) to get the best visuals on a modern 4K TV, emulators have grown in popularity, with PCSX2 letting gamers render titles internally at higher resolutions, run them at a more stable frame rate, and even make use of texture packs.
But do you know what’s better than an emulator? Taking an existing PlayStation 2 game and recompiling it to run on a modern platform (such as your Windows or Linux desktop PC). That’s exactly what’s being worked on now with PS2Recomp, a static recompiler and runtime tool.
To keep things simple: the tool takes a PlayStation 2 game (designed around the PS2’s unique architecture, such as the ‘Emotion Engine’ CPU based on the MIPS R5900) and converts it to run natively on whatever platform you’re targeting.
In plain English, this is a tool that needs to be applied to each game individually; it’s not a ‘download it and every game automatically runs’ application. But it gives folks the means to recompile a game for modern platforms, and quite frankly, that’s absolutely incredible.
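To illustrate the idea (this is a hypothetical sketch, not code from PS2Recomp itself, and the names are invented): instead of decoding MIPS instructions one at a time at runtime the way an emulator does, a static recompiler translates each instruction ahead of time into equivalent host code that operates on an emulated register file.

```python
# Hypothetical sketch of static recompilation. None of these names
# come from the actual PS2Recomp project.

def new_state():
    """A tiny 'register file' standing in for the R5900's 32 GPRs."""
    return {"gpr": [0] * 32}

# Original (hypothetical) MIPS block:
#   addiu $t0, $zero, 5     # t0 = 5
#   addiu $t1, $zero, 7     # t1 = 7
#   addu  $v0, $t0, $t1     # v0 = t0 + t1
# Rather than interpreting these at runtime, the recompiler emits
# straight-line host code for the block once, ahead of time:
def recompiled_block(s):
    s["gpr"][8] = (0 + 5) & 0xFFFFFFFF                       # addiu $t0, $zero, 5
    s["gpr"][9] = (0 + 7) & 0xFFFFFFFF                       # addiu $t1, $zero, 7
    s["gpr"][2] = (s["gpr"][8] + s["gpr"][9]) & 0xFFFFFFFF   # addu  $v0, $t0, $t1

s = new_state()
recompiled_block(s)
print(s["gpr"][2])  # 12
```

The per-instruction decode cost disappears entirely, which is one reason recompiled games can run well on hardware far weaker than an emulator would need.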
This is a great stepping stone for some incredible remasters and community remakes. There are already HD texture packs available for PS2 emulators, as well as other ways to improve visuals, but this would give even more freedom and flexibility to modify and really enhance the games. That’s to say nothing of totally unlocking the frame rates (likely without breaking physics or collision detection, which is a big problem with emulated titles).
At a guess, the recompiled games would also run great on much lower-end hardware than emulators require. Recompilation efforts in the community certainly aren’t new; look to the N64, where several high-profile projects have shown what these kinds of efforts can achieve.
A couple of famous examples are Mario 64 and Zelda. Indeed, there’s a fork of the Mario 64 project that supports RTX (ray tracing) for Nvidia owners.
Another example on the N64 is Zelda, where the project has a plethora of visual and gameplay enhancements, and in the longer term they’re planning to introduce ray tracing as well.
So, in the future we could be playing the likes of MGS2, Gran Turismo, God of War, Tekken 4, and Shadow Hearts as ‘native’ PC versions. This would allow modern controller support (such as DualShock or Xbox controllers) and other features to be bundled in too, exactly as we’ve seen with the N64 ports.
So yes, currently playing PS2 games on PC via emulator is still absolutely fantastic, but native ports would be the holy grail of game preservation.
The PlayStation 2’s architecture is unique and, as mentioned earlier in this article, built around a MIPS R5900-based CPU known as the Emotion Engine (clocked at a shade under 300MHz). Sony implemented a number of custom features, including two Vector Units designed to help manipulate geometry and perform a host of other co-processing duties.
This was paired with 32MB of memory, and the GPU, known as the Graphics Synthesizer, ran at about 147MHz and sported 4MB of embedded DRAM. Sony’s design was fascinating for the time, and despite its processor being clocked significantly lower than either Nintendo’s GameCube or Microsoft’s Xbox, the console punched well above its weight class.
As a small update — I want to remind people that (as of the time I’m writing this article) the project is *NOT* finished yet, and there is still work to do. But the fact that this is being worked on is awesome for those of us interested in game preservation.
...
Read the original on redgamingtech.com »
GOG is planning a Linux-native GOG Galaxy, calling Linux the ‘next major frontier.’
GOG is hiring a senior engineer to shape Galaxy’s architecture for Linux from day one.
Native Galaxy will let Linux users relive classics without the usual headaches.
Gaming on Linux used to be in a nasty catch-22. People wouldn’t develop games for Linux because gamers didn’t use it, and gamers didn’t use Linux because people wouldn’t develop games for it. However, with the advancement of tech like Proton, we’re beginning to see people take Linux seriously as a gaming powerhouse.
Still, that doesn’t mean that the Linux community won’t welcome developers who create Linux-native versions of their games and related apps. So, when the news broke that GOG was hiring a developer to help get its library app over into the world of FOSS, it was good news for everyone who wants to bring the classics over to Linux.
GOG calls Linux “a major frontier” as it aims to make Galaxy Linux-native
It’s the next step in GOG’s plans to appeal to Linux users
If you’ve never heard of GOG before, it stands for ‘Good Old Games,’ and the name gives away what kind of titles it sells. It’s not all classic games, though; sometimes the company publishes newer titles with a retro feel that are right at home on the platform. Recently, GOG’s original co-founder bought the store back from its previous owner, CD Projekt Red, and declared it would survive in Steam’s shadow by vetting the games published on the platform.
Now, it seems they’re making efforts to bring GOG over to Linux. As spotted by VideoCardz, a recent job advertisement on the GOG website revealed that the company is hiring a senior engineer to help with its optional library app, GOG Galaxy:
GOG GALAXY is our desktop client and ecosystem hub - the place where players manage their libraries, connect with the community, and access features that go far beyond a store. Today, it delivers experience on Windows and macOS, but Linux is the next major frontier.
We’re looking for a Senior Engineer who will help shape GOG GALAXY’s architecture, tooling, and development standards with Linux in mind from day one. At the same time, GOG GALAXY is a long-lived product with a large and complex C++ codebase.
While you don’t need GOG Galaxy to play your purchased games, it’s still nice to see the company working on making an app that runs on Linux natively. Here’s hoping it’s the first of many tweaks GOG is making to help Linux users relive the classics without any of the headaches.
...
Read the original on www.xda-developers.com »
Two months ago, I hacked together a weekend project. What started as “WhatsApp Relay” now has over 100,000 GitHub stars and drew 2 million visitors in a single week.
Today, I’m excited to announce our new name: OpenClaw.
We’ve been through some names.
Clawd was born in November 2025—a playful pun on “Claude” with a claw. It felt perfect until Anthropic’s legal team politely asked us to reconsider. Fair enough.
Moltbot came next, chosen in a chaotic 5am Discord brainstorm with the community. Molting represents growth: lobsters shed their shells to become something bigger. It was meaningful, but it never quite rolled off the tongue.
OpenClaw is where we land. And this time, we did our homework: trademark searches came back clear, domains have been purchased, migration code has been written. The name captures what this project has become:
Claw: Our lobster heritage, a nod to where we came from
OpenClaw is an open agent platform that runs on your machine and works from the chat apps you already use. WhatsApp, Telegram, Discord, Slack, Teams—wherever you are, your AI assistant follows.
Your assistant. Your machine. Your rules.
Unlike SaaS assistants where your data lives on someone else’s servers, OpenClaw runs where you choose—laptop, homelab, or VPS. Your infrastructure. Your keys. Your data.
What’s New in This Release
Along with the rebrand, we’re shipping:
Web Chat: Send images just like you can in messaging apps
I’d like to thank all security folks for their hard work in helping us harden the project. We’ve released machine-checkable security models this week and are continuing to work on additional security improvements. Remember that prompt injection is still an industry-wide unsolved problem, so it’s important to use strong models and to study our security best practices.
What’s next? Security remains our top priority. We’re also focused on gateway reliability and adding polish plus support for more models and providers.
This project has grown far beyond what I could maintain alone. Over the last few days I’ve worked on adding maintainers and we’re slowly setting up processes so we can deal with the insane influx of PRs and Issues. I’m also figuring out how to pay maintainers properly—full-time if possible. If you wanna help, consider contributing or sponsoring the org.
To the Claw Crew—every clawtributor who’s shipped code, filed issues, joined our Discord, or just tried the project: thank you. You are what makes OpenClaw special.
The lobster has molted into its final form. Welcome to OpenClaw.
P. S. Yes, the mascot is still a lobster. Some things are sacred. 🦞
...
Read the original on openclaw.ai »
Tesla’s nascent robotaxi program is off to a rough start. New NHTSA crash data, combined with Tesla’s new disclosure of robotaxi mileage, reveals that Tesla’s autonomous vehicles are crashing at a rate far higher than that of human drivers, and that’s with a safety monitor in every car.
According to NHTSA’s Standing General Order crash reports, Tesla reported 9 crashes involving its robotaxi fleet in Austin, Texas between July and November 2025.
According to a chart in Tesla’s Q4 2025 earnings report showing cumulative robotaxi miles, the fleet has traveled approximately 500,000 miles as of November 2025. That works out to roughly one crash every 55,000 miles.
For comparison, human drivers in the United States average approximately one police-reported crash every 500,000 miles, according to NHTSA data.
That means Tesla’s robotaxis are crashing at a rate 9 times higher than the average human driver.
However, that figure only counts police-reported incidents. Factoring in an estimate of unreported crashes, human drivers are closer to one crash every 200,000 miles, which is still far better than Tesla’s robotaxis in Austin.
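The arithmetic behind these figures is simple enough to check, using the numbers cited above:

```python
# Figures cited in the article.
robotaxi_miles = 500_000         # cumulative robotaxi miles as of Nov 2025
robotaxi_crashes = 9             # NHTSA-reported crashes, July-Nov 2025
human_miles_per_crash = 500_000  # police-reported crashes only
human_miles_all = 200_000        # rough estimate incl. unreported crashes

# One robotaxi crash every ~55,556 miles.
miles_per_crash = robotaxi_miles / robotaxi_crashes
print(round(miles_per_crash))

# Robotaxi crash rate vs. human drivers:
print(round(human_miles_per_crash / miles_per_crash, 1))  # vs. police-reported
print(round(human_miles_all / miles_per_crash, 1))        # vs. all-crash estimate
```

So the "9 times higher" figure holds against police-reported crashes; against the broader all-crash estimate, the robotaxi fleet still crashes roughly 3.6 times more often.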
Here’s what makes this data particularly damning: every Tesla robotaxi in the reported mileage had a safety monitor in the vehicle who can intervene at any moment.
These aren’t fully autonomous vehicles operating without backup. There’s a human sitting in the car whose entire job is to prevent crashes. And yet Tesla’s crash rate is still nearly an order of magnitude worse than regular human drivers operating alone.
Waymo, by comparison, operates a fully driverless fleet, no safety monitor, no human backup, and reports significantly better safety numbers. Waymo has logged over 25 million autonomous miles and maintains a crash rate well below human averages.
Perhaps more troubling than the crash rate is Tesla’s complete lack of transparency about what happened.
Every single Tesla crash narrative in the NHTSA database is redacted with the same phrase: “[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]”
We know a Tesla robotaxi hit a cyclist. We don’t know what happened.
We know one caused a minor injury. We don’t know what happened.
We know one hit an animal at 27 mph. We don’t know what happened.
Meanwhile, Waymo, Zoox, and other AV operators provide full narrative descriptions of every incident. Here’s a typical Waymo report from the same dataset:
“The Waymo AV was traveling northbound on N. 16th Street in the left lane when it slowed to a stop to yield to a pedestrian that had begun crossing the roadway. While the pedestrian continued to cross and the Waymo AV remained stopped, a passenger car approaching from behind made contact with the rear of the stationary Waymo AV.”
That’s accountability. That’s transparency. Tesla provides none of it.
It’s clear that Tesla is not responsible for some of these crashes, but the fact that we don’t know is entirely due to Tesla’s own secrecy.
A great example is an incident that happened last week in Santa Monica, California, where a Waymo hit a child in a school zone. That sounds awful, doesn’t it? Potentially a company-ending incident. But Waymo released all the details, which confirmed that the child ran into the street while hidden behind an SUV. The Waymo vehicle immediately detected the child and, while it didn’t have time to prevent the impact, it applied the brakes and reduced its speed from 17 mph to under 6 mph before contact was made.
As a result, the child was OK. Waymo even claims that its models show that a human driver would have likely reacted more slowly and hit the kid at twice the speed.
It’s better to know about these incidents than to keep everything secret to avoid publicizing those you are responsible for.
There’s good and there’s bad in this. With only one crash in October and one in November, there appear to be improvements.
But the overall data is sobering.
A crash every 55,000 miles, with a safety monitor in the car, is not robotaxi-ready. It’s not even close. And the complete lack of transparency about what’s causing these crashes makes it impossible to have confidence that Tesla is learning from them.
Waymo operates fully driverless vehicles in multiple cities and publishes detailed information about every incident. Tesla operates supervised vehicles in one geofenced area and redacts everything.
If Tesla wants to be taken seriously as a robotaxi operator, it needs to do two things: dramatically improve its safety record, and start being honest about what’s happening on the roads of Austin.
Right now, it’s failing at both.
...
Read the original on electrek.co »
Two security professionals who were arrested in 2019 after performing an authorized security assessment of a county courthouse in Iowa will receive $600,000 to settle a lawsuit they brought alleging wrongful arrest and defamation.
The case was brought by Gary DeMercurio and Justin Wynn, two penetration testers who at the time were employed by Colorado-based security firm Coalfire Labs. The men had written authorization from the Iowa Judicial Branch to conduct “red-team” exercises, meaning attempted security breaches that mimic techniques used by criminal hackers or burglars.
The objective of such exercises is to test the resilience of existing defenses using the types of real-world attacks the defenses are designed to repel. The rules of engagement for this exercise explicitly permitted “physical attacks,” including “lockpicking,” against judicial branch buildings so long as they didn’t cause significant damage.
The event galvanized security and law enforcement professionals. Despite the legitimacy of the work and the legal contract that authorized it, DeMercurio and Wynn were arrested on charges of felony third-degree burglary and spent 20 hours in jail, until they were released on $100,000 bail ($50,000 for each). The charges were later reduced to misdemeanor trespassing charges, but even then, Chad Leonard, sheriff of Dallas County, where the courthouse was located, continued to allege publicly that the men had acted illegally and should be prosecuted.
Reputational hits from events like these can be fatal to a security professional’s career. And of course, the prospect of being jailed for performing an authorized security assessment is enough to get the attention of any penetration tester, not to mention the customers who hire them.
“This incident didn’t make anyone safer,” Wynn said in a statement. “It sent a chilling message to security professionals nationwide that helping [a] government identify real vulnerabilities can lead to arrest, prosecution, and public disgrace. That undermines public safety, not enhances it.”
DeMercurio and Wynn’s engagement at the Dallas County Courthouse on September 11, 2019, had been routine. A little after midnight, after finding a side door to the courthouse unlocked, the men closed it and let it lock. They then slipped a makeshift tool through a crack in the door and tripped the locking mechanism. After gaining entry, the pentesters tripped an alarm alerting authorities.
...
Read the original on arstechnica.com »
No software installations, no licenses to purchase, no accounts to manage. Students simply open a browser and start creating.
All student work stays on their device. No data collection, no cloud uploads, no privacy concerns. COPPA and FERPA friendly.
No per-seat licensing, no subscription fees, no “educational discounts” that expire. Free forever for everyone.
Chromebooks, tablets, old computers, new computers. Windows, Mac, Linux. If it runs a modern browser, it runs Grid.Space.
Students work at their own pace. No internet dropouts causing lost work. Tools work offline after initial load.
Industry-standard workflows for 3D printing, CNC machining, and laser cutting. Skills transfer directly to professional tools.
Introduce students to digital fabrication without IT headaches. Works on existing school computers and Chromebooks.
Unified toolchain for all your equipment. Students learn once, work with multiple machines.
Professional-grade CAM and slicing without enterprise licensing costs. Open-source means customizable for research.
No software to install or maintain. Patrons use public computers without admin access needed.
Full-featured fabrication tools on family computers. No subscription fees eating into budgets.
Students continue projects at home on any device. No license restrictions or software gaps.
...
Read the original on grid.space »
Blender Foundation is thrilled to announce that Netflix Animation Studios is joining the Blender Development Fund as Corporate Patron.
This support will be dedicated to general Blender core development, continuously improving content creation tools for individuals and teams working in media and entertainment workflows.
This membership is a significant acknowledgement of Blender becoming more embedded in high-end animation studios’ workflows. I deeply appreciate this strategic initiative from Netflix Animation Studios as an investment in a diverse, public, and open-source friendly ecosystem of creative tools that will benefit the global community of content creators.
Netflix Animation Studios’ corporate membership with Blender reflects our ongoing support for open-source software in the animation community. We are proud to be the first major animation studio to support Blender’s continued development and growing adoption by current and future generations of animation professionals.
Netflix is one of the world’s leading entertainment services, with over 300 million paid memberships in over 190 countries enjoying TV series, films and games across a wide variety of genres and languages. Members can play, pause and resume watching as much as they want, anytime, anywhere, and can change their plans at any time. Discover more about Netflix Animation Studios at https://www.netflixanimation.com/
Blender, the world’s most popular free and open-source 3D creation software, offers a comprehensive solution for modelling, animation, VFX, and more. Maintained by the Blender Foundation, it’s the tool of choice for a vast global community of professional artists and enthusiasts, committed to open collaboration and 3D technology innovation.
...
Read the original on www.blender.org »
Tesla’s Q4 2025 earnings call made one thing painfully clear: the company is no longer interested in being an automaker.
In a single call, Tesla announced it’s killing the Model S and Model X, has no plans for new mass-market models, and is pivoting entirely to “transportation as a service.” The company that revolutionized the auto industry is walking away from it, not because it failed, but because Elon Musk got bored and found new toys.
When asked if Tesla has plans to launch new models to address different price segments, VP of Vehicle Engineering Lars Moravy gave a telling response:
“You have to start thinking about us as moving to providing transportation as a service more than the total addressable market for the purchased vehicles alone.”
Read that again. Tesla’s head of vehicle engineering is telling you to stop thinking of Tesla as a company that sells cars.
“I really think long-term, the only vehicles that we’ll make will be autonomous vehicles.”
He predicted that “probably less than 5% of miles driven will be where somebody’s actually driving the car themselves in the future, maybe as low as 1%.”
And then came the killing blow: Model S and Model X production ends next quarter. The Fremont line will be converted to manufacture Optimus robots instead.
Finally, in its latest 10-K SEC filing, Tesla officially updated its mission to “building a world of amazing abundance” — whatever that means.
* Tesla Semi — Still not in volume production after years of delays
That leaves Tesla with exactly two successful vehicle models. Two. And they are both in decline.
And instead of building on that success by expanding into new segments, addressing affordability, and competing with the flood of new EVs from legacy automakers and Chinese competitors, Tesla is walking away.
The $25,000 Tesla that Musk promised for years? Scrapped.
New models to compete with the likes of Hyundai, Lucid, and Rivian, or the wave of affordable Chinese EVs? Not coming.
Tesla’s answer to everything is now the same: wait for robotaxis.
Here’s what makes this so frustrating: Tesla didn’t have to choose.
The company could have spun off its AI and robotics efforts into a separate entity, call it Tesla AI or whatever, while keeping Tesla, the automaker, focused on what it does best: building and selling great electric vehicles and accelerating the industry’s transition to electric transport.
Or it could have done the reverse: spin off the automotive business and let Musk pursue his AI dreams with the parent company. Either way, there was no point in letting great EV programs die.
Tesla could have continued to invest in electric vehicles, leverage its expertise in batteries and power electronics, to accelerate EV adoption and stationary energy storage deployment, and could have licensed “Tesla AI’s” technology to integrate it into its vehicles.
Instead, Tesla is letting a highly successful automaker wither so it can chase autonomous robots and robotaxis that may or may not work, may or may not get regulatory approval, and may or may not find a market.
This is a company that delivered 1.6 million vehicles last year. That has a global Supercharger network. That has brand recognition any automaker would kill for (up until last year). And it’s being sacrificed on the altar of Musk’s next obsession.
Tesla’s automotive revenue declined 10% in 2025. Deliveries fell 9%. The company lost its crown as the world’s largest EV maker to BYD.
The response to these problems? Not to fix them by giving more love to its EV programs, but to abandon the business entirely.
Instead of killing the Model S and Model X, Tesla could have brought the good things it did with the Cybertruck, such as steer-by-wire and its 800V powertrain, to those programs, but it didn’t bother.
Meanwhile, the “future” Tesla is betting on looks like this:
* Robotaxi fleet: About 30-60 vehicles actually operating in Austin, despite claims of “well over 500”
* Optimus robots: Zero doing useful work in factories, by Musk’s own admission
* CyberCab: About to go into production without a steering wheel while Tesla still hasn’t solved autonomy
Tesla is abandoning a business that generated $80 billion in automotive revenue and almost $15 billion in profits at its peak for ventures that currently generate essentially nothing.
During the earnings call, the company announced it will spend a record $20 billion in capital expenditure in 2026, and most of it will go into its robotaxi and humanoid robots, as well as their supporting infrastructure, especially training compute.
Meanwhile, Tesla generated less than $6 billion in net income (non-GAAP) in 2025 — down 26% from last year and more than 50% from its peak a few years ago.
I’ve covered Tesla for over a decade. I watched this company prove that electric vehicles could be desirable, that they could be profitable, that they could compete with and beat the best that legacy automakers had to offer.
And now I’m watching it commit suicide.
There’s a version of this story where Tesla remains the dominant EV maker while also pursuing AI and autonomy. Where the company launches affordable models to compete with Chinese EVs. Where it expands into new segments. Where it uses its manufacturing expertise and brand power to actually grow its automotive business, and push the industry forward in the process, especially in the US, where automakers are falling behind the rest of the world.
Instead, we get Lars Moravy telling us to think of Tesla as a “transportation as a service” company. We get Musk saying the only vehicles Tesla will make are autonomous ones. We get the Model S and X killed to make room for robots that don’t work yet.
Tesla could have had both. It chose to have one, and that could lead to neither.
This is Musk joining the popular “as a service” trend of the elite, who don’t want people to own anything and instead have them “subscribe” to as many things as possible. It’s a depressing future.
RIP Tesla the automaker. You didn’t have to die.
...
Read the original on electrek.co »
A worrying new amalgamation of crypto scams and vibe coding emerges from the bowels of the internet in 2026.
2025 was the breakthrough year when software creation became easy. AI models improved dramatically, and even running a “ralph loop” on a simple prompt for a few hours could produce copious amounts of working code. As a result, you’d have burned through thousands of dollars in tokens to get a barely working “product”, with no idea who would use it or why. To develop it into a proper product, you would have to learn coding, product development, marketing, and so on. But what if there were an easy way to “dump it” on the unsuspecting masses?
Arguably the first software pump-and-dump event was when Cursor burned through millions of dollars to build a barely working browser. Naturally, there was no way to finish such a monstrous heap of software into a working product, and why would anybody use a vibe-coded browser anyway? The “dump” on their end was to use it as marketing bait and a way to inflate their valuation.
At the start of 2026, the “gastown” project attracted my attention. What initially looked like a schizophrenic vibe-coded fever dream was touted by multiple tech blogs as possibly a “new thing”, maybe a revolution of some sort. Later, a blog post by the project’s author announced that he had taken a donation from crypto bros, and things started to click together for me. That is how a new unholy frankenstein of vibe coding and crypto is born. This is how it works:
1. A fame-hungry tech bro vibe codes (prompts) an unholy blob of “software”, making an initial investment of several thousand dollars in AI tokens.
2. Since the product is a monstrosity that can’t be commercialized, it doesn’t sell, and probably doesn’t generate fame either.
3. The tech person is approached by crypto bros and offered a stake in some shitcoin. The deal is accepted because the developer doesn’t want to be left holding the bags for his initial investment in the AI software pump.
4. Crypto scammers and bots hype and astroturf the new project on every possible platform to raise awareness of the project and the associated COIN.
5. Unsuspecting tech bros start actually trying out the software and help amplify the message, thanks to the FOMO sweeping the tech space amid the rapid evolution of AI tools.
6. After a few months, the dump happens: the coin is dumped on the market, and every developer moves on to the next shiny software thing.
7. The author kills the project because it is an unmaintainable mountain of code that could only be developed further with AI tools, and that doesn’t come cheap.
A few days ago I started being bombarded with hype posts about Clawdbot. I sincerely believe this is another instance of a software “pump and dump”. Today, the first three posts on my LinkedIn feed were #lookingforwork CTOs hyping Clawdbot as the next big thing. After taking a quick look at the project, I concluded that it is an insecure mess of a vibe-coded software blob that will be forgotten in a few months. However, CLAWD coin tokens are kicking off right now, and people are being lured into buying them as the hype grows.
So please look at these projects with a critical mindset. Keep in mind that many posts hyping them could be paid astroturfing by crypto interests, and don’t fall for the vibe-coded software FOMO hype. Otherwise, you might be the one left holding the bags!
...
Read the original on tautvilas.lt »
10HN is also available as an iOS App
If you visit 10HN only rarely, check out the best articles from the past week.
Visit pancik.com for more.