10 interesting stories served every morning and every evening.
Microsoft provided the FBI with the recovery keys to unlock encrypted data on the hard drives of three laptops as part of a federal investigation, Forbes reported on Friday.
Many modern Windows computers rely on full-disk encryption, called BitLocker, which is enabled by default. This type of technology should prevent anyone except the device owner from accessing the data if the computer is locked and powered off.
But, by default, BitLocker recovery keys are uploaded to Microsoft’s cloud, allowing the tech giant — and by extension law enforcement — to access them and use them to decrypt drives encrypted with BitLocker, as with the case reported by Forbes.
The case involved several people suspected of fraud related to the Pandemic Unemployment Assistance program in Guam, a U.S. island in the Pacific. Local news outlet Pacific Daily News covered the case last year, reporting that a warrant had been served to Microsoft in relation to the suspects’ hard drives. Kandit News, another local Guam news outlet, also reported in October that the FBI requested the warrant six months after seizing the three laptops encrypted with BitLocker.
A spokesperson for Microsoft did not immediately respond to a request for comment by TechCrunch. Microsoft told Forbes that the company sometimes provides BitLocker recovery keys to authorities, having received an average of 20 such requests per year.
Apart from the privacy risks of handing recovery keys to a company, Johns Hopkins professor and cryptography expert Matthew Green raised the potential scenario where malicious hackers compromise Microsoft’s cloud infrastructure — something that has happened several times in recent years — and get access to these recovery keys. The hackers would still need physical access to the hard drives to use the stolen recovery keys.
“It’s 2026 and these concerns have been known for years,” Green wrote in a post on Bluesky. “Microsoft’s inability to secure critical customer keys is starting to make it an outlier from the rest of the industry.”
...
Read the original on techcrunch.com »
Why else would they keep them around for so long?
Every bug is different. But the math is always real.
Think our numbers are wrong? Edit them yourself.
Users Affected × Frequency × Time Per Incident
How many Apple users hit this bug, how often, and how long they suffer each time.
Σ (Workaround Time × Participation Rate)
The extra time spent by people who try to fix what Apple won’t.
Years Unfixed × Pressure Factor
How long Apple has known about this and how urgent the task usually is.
Human Hours Wasted ÷ Engineering Hours to Fix
How many times over Apple could have fixed it with the productivity they’ve destroyed.
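The four quantities above define a simple cost model. Here is a minimal sketch in Python; every input number below is a hypothetical placeholder for illustration, not data from the site:

```python
# Illustrative cost model for a long-standing bug, following the
# formulas above. All input numbers are hypothetical assumptions.

def bug_cost_model(
    users_affected: float,       # Apple users who hit the bug
    frequency_per_year: float,   # incidents per user per year
    minutes_per_incident: float,
    workarounds: list[tuple[float, float]],  # (minutes spent, participation rate)
    years_unfixed: float,
    pressure_factor: float,      # how urgent the interrupted task usually is
    engineering_hours_to_fix: float,
) -> dict:
    # Users Affected × Frequency × Time Per Incident (converted to hours)
    direct_hours = users_affected * frequency_per_year * minutes_per_incident / 60

    # Σ (Workaround Time × Participation Rate), scaled by the affected users
    workaround_hours = sum(
        minutes * rate * users_affected / 60 for minutes, rate in workarounds
    )

    total_hours = direct_hours + workaround_hours

    # Years Unfixed × Pressure Factor — a unitless neglect score
    neglect_score = years_unfixed * pressure_factor

    # Human Hours Wasted ÷ Engineering Hours to Fix
    fix_multiple = total_hours / engineering_hours_to_fix

    return {
        "total_hours_wasted": total_hours,
        "neglect_score": neglect_score,
        "times_over_fixable": fix_multiple,
    }

result = bug_cost_model(
    users_affected=1_000_000,
    frequency_per_year=12,
    minutes_per_incident=2,
    workarounds=[(10, 0.05)],   # 5% of users spend 10 minutes on a workaround
    years_unfixed=8,
    pressure_factor=1.5,
    engineering_hours_to_fix=200,
)
print(result)
```

Even with these modest placeholder inputs, the wasted time works out to hundreds of thousands of human hours, roughly two thousand times the assumed engineering cost of a fix — which is the site's point.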
...
Read the original on www.bugsappleloves.com »
...
Read the original on gptzero.me »
On January 14, 2006, John Resig introduced a JavaScript library called jQuery at BarCamp in New York City. Now, 20 years later, the jQuery team is happy to announce the final release of jQuery 4.0.0. After a long development cycle and several pre-releases, jQuery 4.0.0 brings many improvements and modernizations. It is the first major version release in almost 10 years and includes some breaking changes, so be sure to read through the details below before upgrading. Still, we expect that most users will be able to upgrade with minimal changes to their code.
Many of the breaking changes are ones the team has wanted to make for years, but couldn’t in a patch or minor release. We’ve trimmed legacy code, removed some previously-deprecated APIs, removed some internal-only parameters to public functions that were never documented, and dropped support for some “magic” behaviors that were overly complicated.
We have an upgrade guide and jQuery Migrate plugin release ready to assist with the transition. Please upgrade and let us know if you encounter any issues.
As usual, the release is available on our CDN and the npm package manager. Other third-party CDNs will probably have it available soon as well, but remember that we don’t control their release schedules and they will need some time. Here are the highlights for jQuery 4.0.0.
jQuery 4.0 drops support for IE 10 and older. Some may be asking why we didn’t remove support for IE 11. We plan to remove support in stages, and the next step will be released in jQuery 5.0. For now, we’ll start by removing code specifically supporting IE versions older than 11.
We also dropped support for other very old browsers, including Edge Legacy, iOS versions earlier than the last 3, Firefox versions earlier than the last 2 (aside from Firefox ESR), and Android Browser. No changes should be required on your end. If you need to support any of these browsers, stick with jQuery 3.x.
jQuery 4.0 adds support for Trusted Types, ensuring that HTML wrapped in TrustedHTML can be used as input to jQuery manipulation methods in a way that doesn’t violate the require-trusted-types-for Content Security Policy directive.
Along with this, while some AJAX requests were already using <script> tags to maintain attributes such as crossdomain, we have since switched most asynchronous script requests to use <script> tags to avoid any CSP errors caused by using inline scripts. There are still a few cases where XHR is used for asynchronous script requests, such as when the “headers” option is passed (use scriptAttrs instead!), but we now use a <script> tag whenever possible.
It was a special day when the jQuery source on the main branch was migrated from AMD to ES modules. The jQuery source has always been published with jQuery releases on npm and GitHub, but could not be imported directly as modules without RequireJS, which was jQuery’s build tool of choice. We have since switched to Rollup for packaging jQuery, and we run all tests on the ES modules separately. This makes jQuery compatible with modern build tools, development workflows, and browsers through the use of <script type="module">.
...
Read the original on blog.jquery.com »
...
Read the original on mastodon.social »
...
Read the original on www.threads.com »
• The 2025 US tariffs are an own goal: American importers and consumers bear nearly the entire cost. Foreign exporters absorb only about 4% of the tariff burden—the remaining 96% is passed through to US buyers.
• Using shipment-level data covering over 25 million transactions valued at nearly $4 trillion, we find near-complete pass-through of tariffs to US import prices.
• US customs revenue surged by approximately $200 billion in 2025—a tax paid almost entirely by Americans.
• Event studies around discrete tariff shocks on Brazil (50%) and India (25–50%) confirm: export prices did not decline. Trade volumes collapsed instead.
• Indian export customs data validates our findings: when facing US tariffs, Indian exporters maintained their prices and reduced shipments. They did not “eat” the tariff.
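The burden split in the bullets above is simple arithmetic. A short sketch in Python using only the headline figures reported here (roughly $200 billion in 2025 customs revenue, 96% pass-through to US buyers):

```python
# Split a tariff's burden between US buyers and foreign exporters,
# using the near-complete pass-through reported above. Figures are
# the headline numbers from the summary, not the underlying dataset.

def tariff_burden(customs_revenue_usd: float, passthrough: float) -> dict:
    """passthrough: share of the tariff passed through to US import prices."""
    return {
        "paid_by_us_buyers": customs_revenue_usd * passthrough,
        "absorbed_by_exporters": customs_revenue_usd * (1 - passthrough),
    }

# ~$200B in 2025 customs revenue, ~96% pass-through.
split = tariff_burden(200e9, 0.96)
print(split)  # US buyers bear ~$192B; exporters absorb ~$8B
```

Under these numbers, roughly $192 billion of the 2025 tariff bill lands on American importers and consumers, against about $8 billion absorbed abroad.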
...
Read the original on www.kielinstitut.de »
We are already working with Brussels. This can become reality. But we need your help! Read the detailed proposal, made in collaboration with the best startup legal teams, funds, and founders in Europe.
Europe has the talent, ambition, and ecosystems to create innovative companies, but fragmentation between European nations is holding us back. “A startup from California can expand and raise money all across the United States. But our companies still face way too many national barriers that make it hard to work Europe-wide, and way too much regulatory burden.”
Yes! But we need your help! So far, we have submitted our proposal to Justice Commissioner McGrath and Startup Commissioner Zaharieva. President von der Leyen has set up a dedicated working group in the Commission, with whom we are in regular contact. Additionally, the European Council and Parliament have each signaled interest in the EU–INC, or what in Brussels is called the “28th regime” (a virtual 28th state alongside the 27 member states).
The entire community is currently influencing the upcoming European Commission legislative proposal for a pan-European legal entity, which is set to be released in Q1 2026. We need your help; see below! Afterwards, the European Parliament and the European Council (made up of the 27 national governments) agree on the legislative details. The final implementation of the EU–INC would then happen in 2027. For more details of what has happened so far and what comes next, read our roadmap.
In Europe, laws are still decided at the national level, meaning we need to convince all 27 EU member state governments to back the EU–INC. Thus we need YOU to activate your contacts, talk to your national politicians about the urgency of the EU–INC, and talk to the press about how crucial the EU–INC is for European startups. National governments need to understand the necessity of the EU–INC for the future of Europe. Read more in the FAQ.
...
Read the original on www.eu-inc.org »
Believe it or not, A$AP Rocky is a huge fan of radiance fields.
Yesterday, when A$AP Rocky released the music video for Helicopter, many viewers focused on the chaos, the motion, and the unmistakable early MTV energy of the piece. What’s easier to miss, unless you know what you’re looking at, is that nearly every human performance in the video was captured volumetrically and rendered as dynamic splats.
I spoke with Evercoast, the team responsible for capturing the performances, as well as Chris Rutledge, the project’s CG Supervisor at Grin Machine, and Wilfred Driscoll of WildCapture and Fitsū.ai, to understand how Helicopter came together and why this project represents one of the most ambitious real world deployments of dynamic gaussian splatting in a major music release to date.
The decision to shoot Helicopter volumetrically wasn’t driven by technology for technology’s sake. According to the team, the director Dan Strait approached the project in July with a clear creative goal to capture human performance in a way that would allow radical freedom in post-production. This would have been either impractical or prohibitively expensive using conventional filming and VFX pipelines.
Chris told me he’d been tracking volumetric performance capture for years, fascinated by emerging techniques that could enable visuals that simply weren’t possible before. Two years ago, he began pitching the idea to directors in his circle, including Dan, as a “someday” workflow. When Dan came back this summer and said he wanted to use volumetric capture for the entire video, the proliferation of gaussian splatting enabled them to take it on.
The aesthetic leans heavily into kinetic motion. Dancers colliding, bodies suspended in midair, chaotic fight scenes, and performers interacting with props that later dissolve into something else entirely. Every punch, slam, pull-up, and fall you see was physically performed and captured in 3D.
Almost every human figure in the video, including Rocky himself, was recorded volumetrically using Evercoast’s system. It’s all real performance, preserved spatially.
This is not the first time that A$AP Rocky has featured a radiance field in one of his music videos. The 2023 music video for Shittin’ Me featured several NeRFs and even the GUI for Instant-NGP, which you can spot throughout the piece.
The primary shoot for Helicopter took place in August in Los Angeles. Evercoast deployed a 56-camera RGB-D array, synchronized across two Dell workstations. Performers were suspended from wires, hanging upside down, doing pull-ups on ceiling-mounted bars, swinging props, and performing stunts, all inside the capture volume.
Scenes that appear surreal in the final video were, in reality, grounded in very physical setups, such as wooden planks standing in for helicopter blades, real wire rigs, and real props. The volumetric data allowed those elements to be removed, recomposed, or entirely recontextualized later without losing the authenticity of the human motion.
Over the course of the shoot, Evercoast recorded more than 10 terabytes of raw data, ultimately rendering roughly 30 minutes of final splatted footage, exported as PLY sequences totaling around one terabyte.
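A PLY sequence is just one file per frame, each declaring its per-point attributes in a plain-text header. As a rough illustration of the format only (Evercoast's actual per-splat schema is not public), here is a minimal parser that reads the point count from an ASCII PLY header:

```python
# Minimal ASCII PLY header parser — illustrative only; the actual
# per-splat attributes used in the Helicopter pipeline are not public.

def ply_vertex_count(ply_text: str) -> int:
    """Return the vertex (point) count declared in a PLY header."""
    for line in ply_text.splitlines():
        line = line.strip()
        if line.startswith("element vertex"):
            return int(line.split()[-1])
        if line == "end_header":
            break
    raise ValueError("no vertex element found in PLY header")

# A tiny hand-written PLY file with three points.
sample = """ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
end_header
0 0 0
1 0 0
0 1 0
"""
print(ply_vertex_count(sample))  # → 3
```

Multiply a per-frame file like this by tens of thousands of splats per frame and thirty minutes of footage, and the roughly one-terabyte export size stops being surprising.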
That data was then brought into Houdini, where the post production team used CG Nomads GSOPs for manipulation and sequencing, and OTOY’s OctaneRender for final rendering. Thanks to this combination, the production team was also able to relight the splats.
One of the more powerful aspects of the workflow was Evercoast’s ability to preview volumetric captures at multiple stages. The director could see live spatial feedback on set, generate quick mesh based previews seconds after a take, and later review fully rendered splats through Evercoast’s web player before downloading massive PLY sequences for Houdini.
In practice, this meant creative decisions could be made rapidly and cheaply, without committing to heavy downstream processing until the team knew exactly what they wanted. It’s a workflow that more closely resembles simulation than traditional filming.
Chris also discovered that Octane’s Houdini integration had matured, and that Octane’s early splat support was far enough along to enable relighting. According to the team, the ability to relight splats, introduce shadowing, and achieve a more dimensional “3D video” look was a major reason the final aesthetic lands the way it does.
The team also used Blender heavily for layout and previs, converting splat sequences into lightweight proxy caches for scene planning. Wilfred described how WildCapture’s internal tooling was used selectively to introduce temporal consistency. In his words, the team derived primitive pose estimation skeletons that could be used to transfer motion, support collision setups, and allow Houdini’s simulation toolset to handle rigid body, soft body, and more physically grounded interactions.
One recurring reaction to the video has been confusion. Viewers assume the imagery is AI-generated. According to Evercoast, that couldn’t be further from the truth. Every stunt, every swing, every fall was physically performed and captured in real space. What makes it feel synthetic is the freedom volumetric capture affords. You aren’t limited by the camera’s composition. You have free rein to explore, reposition cameras after the fact, break spatial continuity, and recombine performances in ways that 2D simply can’t.
In other words, radiance field technology isn’t replacing reality. It’s preserving everything.
...
Read the original on radiancefields.com »