10 interesting stories served every morning and every evening.
Say No to Palantir in Europe
To European governments and the EU:
Stop signing new contracts with Palantir, and review and phase out existing contracts with the company.
And we call on the EU to urgently investigate Palantir’s use across Europe, ensure full transparency over contracts and data use, and push governments to halt new deals until strong safeguards and democratic oversight are guaranteed.
Europe must not hand its public systems, data, and security to a private US surveillance company, especially one that is involved in fueling wars and mass deportations.
Why is this important?
A powerful company enables genocide in Gaza, helps ICE separate families, and fuels Trump’s war with Iran. [1]
Most people have never even heard of it.
But governments across Europe are quietly signing contracts with it, paid for with our tax money. [2] Its name is Palantir.
From the UK to Germany to France and beyond, governments are handing this US spy-tech giant access to sensitive public systems and data. Police in Germany use it to track suspects, the UK hands it vast healthcare datasets - and this is just the beginning. [3]
Palantir’s influence in Europe is spreading fast, largely out of public sight.
That’s exactly why we must shine a light on it. Otherwise, we risk expanding mass surveillance and fuelling wars, while Europe hands its data and security to a US spy-tech giant.
If we build momentum to expose Palantir, we can push leaders to stop signing new contracts and protect Europe’s public systems from powerful surveillance giants.
Add your name now to demand transparency and stop the expansion of Palantir in Europe.
And the people running the company aren’t hiding their intentions. CEO Alex Karp once said Palantir is “here to… scare enemies and, on occasion, kill them.” https://www.wired.com/story/uncanny-valley-podcast-palantir-most-mysterious-company-silicon-valley
...
Read the original on action.wemove.eu »
Your brain is still growing new cells right now. Here’s how to keep it happening
...
Read the original on techfixated.com »
AI companies continually scrape the internet at an enormous scale, swallowing up all of its contents to use as training data for their next models. If you have a public website, they are already stealing your work.
Miasma is here to help you fight back! Spin up the server and point any malicious traffic towards it. Miasma will send poisoned training data from the poison fountain alongside multiple self-referential links. It’s an endless buffet of slop for the slop machines.
Miasma is very fast and has a minimal memory footprint - you should not have to waste compute resources fending off the internet’s leeches.
cargo install miasma
miasma
miasma --help
Let’s walk through an example of setting up a server to trap scrapers with Miasma. We’ll pick /bots as our server’s path to direct scraper traffic. We’ll be using Nginx as our server’s reverse proxy, but the same result can be achieved with many different setups.
When we’re done, scrapers will be trapped like so:
Within our site, we’ll include a few hidden links leading to /bots, like this one:
<a href="/bots" style="display: none;" aria-hidden="true" tabindex="-1">Amazing high quality data here!</a>
The style="display: none;", aria-hidden="true", and tabindex="-1" attributes ensure links are totally invisible to human visitors and will be ignored by screen readers and keyboard navigation. They will only be visible to scrapers.
Since our hidden links point to /bots, we’ll configure this path to proxy Miasma. Let’s assume we’re running Miasma on port 9855.
location ~ ^/bots($|/.*)$ {
    proxy_pass http://localhost:9855;
}
This will match all variations of the /bots path: /bots, /bots/, /bots/12345, etc.
Lastly, we’ll start Miasma and specify /bots as the link prefix. This instructs Miasma to start links with /bots/, which ensures scrapers are properly routed through our Nginx proxy back to Miasma.
We’ll also limit the number of max in-flight connections to 50. At 50 connections, we can expect 50-60 MB peak memory usage. Note that any requests exceeding this limit will immediately receive a 429 response rather than being added to a queue.
miasma --link-prefix '/bots' -p 9855 -c 50
Let’s deploy and watch as multi-billion dollar companies greedily eat from our endless slop machine!
Be sure to protect friendly bots and search engines from Miasma in your robots.txt!
Miasma can be configured via its CLI options; run miasma --help for the full list.
Contributions are welcome! Please open an issue for bug reports or feature requests. Primarily AI-generated contributions will be automatically rejected.
...
Read the original on github.com »
...
Read the original on blog.literarily-starved.com »
Note: On Windows “Server” you may need to install vcruntime140.dll.
If your system does not have the required glibc version, try the (unsupported) builds for older glibc.
Run chmod u+x nvim-linux-x86_64.appimage && ./nvim-linux-x86_64.appimage
If your system does not have FUSE you can extract the appimage:
./nvim-linux-x86_64.appimage --appimage-extract
./squashfs-root/usr/bin/nvim
Run chmod u+x nvim-linux-arm64.appimage && ./nvim-linux-arm64.appimage
If your system does not have FUSE you can extract the appimage:
./nvim-linux-arm64.appimage --appimage-extract
./squashfs-root/usr/bin/nvim
...
Read the original on github.com »
On a spring morning in 1987, a 30-year-old man named Robert Kilgour pulled up beside a row of foamy cherry trees in the town of Kirkcaldy, on Scotland’s east coast, to visit an old hotel. The building was four storeys of blackened Victorian sandstone. Kilgour was a big man, a voluble Scot with a knack for storytelling. He already owned a hotel in Edinburgh but wanted to branch into property development and was planning to turn this old place, Station Court, into apartments. A few months after he completed the purchase, however, the Scottish government scrapped a grant for developers that he had been counting on. He had just sunk most of his personal savings into a useless building in a sodden, post-industrial town. He urgently needed a new idea.
Care homes weren’t so different from hotels, Kilgour thought. And the beauty was, their elderly residents were unlikely to get drunk, steal the soap dispensers or invite sex workers back to their rooms. Turning Station Court into a care home seemed like the best way out of a bad situation. Kilgour arranged a bank loan and in June 1989 he launched Four Seasons Health Care, taking the name from a restaurant in Midtown Manhattan where he had once dined.
By sheer luck, Kilgour had found himself at the start of something big. The following year, the government in Westminster started to transfer responsibility for social care on to local councils. This gave businessmen such as Kilgour a huge opportunity. Councils began paying them to provide beds that had previously been supplied by the NHS. Demand boomed.
Kilgour opened three other homes in Kirkcaldy, another overlooking the Firth of Forth, and a further one near Dundee. Alongside running his new business, he juggled the pastimes of an increasingly wealthy man. He raised money for a cancer charity, played tennis, networked ceaselessly and began to dabble in politics, campaigning (and failing) to become one of Scotland’s few Conservative MPs. By 1997, he owned seven care homes across Fife.
That year, he chaired a fundraising appeal to open a new hospice in the grounds of Kirkcaldy’s main hospital. The guest of honour was an irascible TV celebrity called John Harvey-Jones, star of a reality show called Troubleshooter in which he dispensed tough-love advice to underperforming British businessmen. Over tumblers of whisky, Harvey-Jones counselled Kilgour: “He said I was stuck in a regional comfort zone. He said I needed to break out of it and go wider.” Deep down, Kilgour agreed.
He had few contacts in London, where the serious money was. It occurred to him that his best lead might be an accountant he knew called Hamilton Anstead, who had recently left a job at a care company in the south of England. Kilgour invited him up to a hotel in Glasgow and the two men hatched a plan for Anstead to join Four Seasons as a joint chief executive.
Kilgour told me all about this over coffee at his private members’ club in Mayfair, a high-ceilinged, low-lit place with clusters of velvet chairs arranged for quiet conversation. He had now entered the “legacy” phase of his life, he said: more concerned with what he was leaving behind than what lay ahead. He often mentioned the politicians with whom he was on first-name terms, as if showing me the photographs in a well-handled album. Mostly, he seemed happy, but there were aspects of his past that bothered him.
Over the course of two years, Kilgour and Anstead built Four Seasons into, if not quite an empire, then a small dominion of 43 homes dotted across Britain. As the business grew, however, their relationship soured. Anstead often felt that Kilgour was more interested in his political career than the minutiae of spreadsheets or suppliers. (“I’m a strategy and vision person, not a detail person,” Kilgour said. “Hamilton is a brilliant micromanager and I’m an entrepreneur.”)
In 1999, the two men decided to sell the company, with the idea that they would stay on as executives. Anstead identified a buyer, a private equity firm called Alchemy Partners. Shortly after they signed the deal, in August that year, he called Kilgour and said they urgently needed to meet. Anstead put it bluntly: neither he nor the company’s new owners wanted Kilgour to stay on as an executive at Four Seasons. Kilgour felt his temper rising. He was being asked to leave the business he had created from scratch. “He started effing and blinding and calling me all sorts of obscenities,” Anstead recalled. (Kilgour later told me that by this point he was exhausted, and wanted out.)
Alchemy sold Four Seasons in 2004, and the company became notorious as a failed experiment, a byword for the folly of entrusting elder care to private equity. “You could ask me, well, do I feel guilty about what happened?” Kilgour said. “And yes, I do, actually.”
Private equity relies on a basic technique known as the leveraged buyout, which works like this: you, a dealmaker, buy a company using just a small portion of your own money. You borrow the rest, and transfer all this debt on to the company you just bought. In effect, the company goes into debt in order to pay for itself. If it all goes well, you sell the company for a profit and you reap the rewards. If not, it is the company, not you, that is on the hook for this debt.
Leveraged buyouts first came to prominence in the 1980s, when dealmakers on Wall Street began targeting underperforming companies and bloated conglomerates in the US. Then, these American businessmen and their British imitators started to scour the world for other places to put this technique to work. With a dwindling supply of undervalued companies to choose from, some of the sharpest minds in finance found a new and unexpected target: care homes.
As people were now living well into their 80s and 90s, financiers began to think of elderly people as recession-proof investments, and assumed that the care home market in Britain and the US would keep growing. In the UK, many of these homes were bankrolled by local authorities, which guaranteed a steady income from the government. Elderly people who paid for their care out of their own pockets typically covered the cost by selling their houses, and the ceaseless increase in property prices endowed them with so much housing equity that they became the human equivalent of ATMs. Care homes were the slot for withdrawing their cash.
It takes a certain kind of mind to look into the world of colostomy bags, incontinence pads and emollient cream and see dollar signs. Nevertheless, from the turn of the 21st century, private equity investment in care homes ballooned in both Britain and the US. Fund managers thought “there are all these affluent baby boomers heading towards retirement. They’ve made a fortune from their houses, or inherited money from their parents, and they all have gold-plated pension schemes,” Nick Hood, a chartered accountant who has studied Britain’s care sector, told me. “They rubbed their hands together and said, ‘Sooner or later, as the demand increases, the prices must go up.’”
In the UK, a stream of deals took place. New companies emerged and new care homes went up, some built out of faded hotels whose clientele had migrated to southern Spain after the advent of cheap air travel. Other businessmen bought crematoriums as well as care homes, in anticipation of their clients’ final billable requirements. “Private equity’s presence in British care homes was negligible 30 years ago,” said Peter Morris, a researcher and associate scholar at the University of Oxford. “Since then, it’s grown inexorably.”
Anstead and Kilgour belonged to a small group of newly minted care home millionaires. At the heart of many of these new fortunes was a technique financiers called “sale and leaseback”. You would take a care home and split it into an operating company, or “opco”, which dealt with everything concerning the business of care, from staff to beds, medicine cabinets and cutlery. On the other side you had the property company, or “propco”, which now owned the physical home. After splitting these in two, you could sell off the propco to someone else, allowing you to quickly raise cash (this was how Anstead and Kilgour initially managed to grow Four Seasons to 43 homes in just two years).
In theory, sale and leaseback was an efficient way of raising money, with estate agents acting as middlemen between fund managers who were buying and selling the homes. “In practice, a lot of the deals were bananas,” Paul Saper, a former healthcare consultant, told me. A care home that no longer owned its own property was like a family that sold its house to a rapacious landlord. If the landlord decided to raise the rent, obviously the family would have less to spend on other essentials.
“There’s a phrase my friends use when analysing companies,” Hood told me. “Hang gliders.” Just as a hang glider coasts through the sky supported only by the spread of its wings, a company can coast along for a while supported only by the stability of its cashflow. But if it is crippled with debt, or locked into escalating rental payments, its cashflow dries up and “it crashes to earth. Because it’s got nothing to keep it up there.”
After Anstead and Kilgour sold Four Seasons, it was passed between a string of different owners. Alchemy sold the company in 2004 to a German insurance firm called Allianz Capital Partners, which then sold it to a Qatari private equity fund in 2006. When the financial crisis arrived in 2008, the care company’s debts had soared to an estimated £1.56bn. As its Qatari owners couldn’t find anyone willing to refinance the company, Four Seasons fell into the hands of its creditors, led by the Royal Bank of Scotland. “It was wonderful for the financiers, who put in these supposedly clever structures that took equity away and replaced it with debt,” said Ros Altmann, a Conservative peer who has studied the sector. “They were playing financial pass-the-parcel with elderly people’s lives. They could pile on as much debt as they liked, and there was nothing to stop them.”
By February 2012, RBS was still looking for a buyer, and word had spread about a bidding war. Among the rivals for control of Four Seasons were a Canadian pension fund, the Abu Dhabi investment authority, a Hong Kong billionaire and four private equity firms including Terra Firma, founded by Guy Hands.
After starting on the trading floor at Goldman Sachs, Hands had made his name at the Japanese bank Nomura, buying up trains and pubs, among other things. He was ambitious and had an uncompromising streak. When his team reached the final, frenetic stages of a deal, Hands would hardly sleep. He was known for having a temper. “I’m not a particularly conciliatory human being,” he told me. In an FT report in 2024, several former colleagues accused Hands of screaming and raging at staff and humiliating junior employees. (Hands and Terra Firma forcefully denied these accusations.)
In 2002, he broke away from Nomura to found Terra Firma, a phrase used by 17th-century Venetian merchants to describe the areas of Italy ruled by Venice. Like a doge surveying his kingdom from across the water, Hands relocated offshore, to the tax haven of Guernsey.
Despite his grand ambitions, however, his deals were not always a great success. In 2007, Terra Firma bought EMI, the iconic British music label that had recorded the Beatles at its Abbey Road studios. The match was ill-fated from the start. Hands had little understanding of the music business or the power that artists exerted over the label, and his clinical approach to profit creation left some musicians cold. Paul McCartney described how EMI became “boring” once it was under Terra Firma’s control, while Radiohead were so incensed by the new management that they released an album on their website, sidestepping the label altogether. Two years into its new ownership, EMI was reporting losses of £1.75bn, and in 2011 Hands surrendered control to its creditors, Citibank. (Later, Hands insisted to me that the thesis of the deal was still “100% right” and would have made Terra Firma’s investors over £14bn “had Citigroup not seized the company”.)
With his reputation now tarnished, Hands was desperate to convince the world that he could still do his job, and soon alighted on the care home sector.
In the early months of 2012, Terra Firma held 10 board meetings at which its partners frantically analysed pages and pages of presentations. Their proposition hinged upon a simple premise: they would make Four Seasons into the “IBM of care”, providing reliable, unglamorous services to local councils, much as IBM had sold reliable, unglamorous computer systems to the public sector. In the scramble for acquisition, Terra Firma’s offer won out.
Not everyone was happy. Mark Drakeford, who would later become first minister of Wales, was concerned that Terra Firma planned to add Four Seasons to a grab bag of unrelated assets: a garden-centre company, a group of wind farms, the Odeon cinema chain and an assortment of motorway service stations in Germany. “Older people are fellow citizens, not commodities,” Drakeford later wrote, likening the transaction to buying a sack of compost or a tub of geraniums. “It just isn’t good enough.”
Hands told me he wanted to improve the quality of care at Four Seasons to attract more residents, which in turn would make the business more profitable. “The cost of doing it would have been about £1,100 a week [per bed],” he said. “And we were getting paid about £550 by the local authorities.” Terra Firma had bought the company for £825m, putting down £325m of its investors’ money and borrowing the rest. While the firm paid off some of Four Seasons’ existing liabilities, the company was still hobbled with debt, and interest payments of £50m each year. In May 2015, the chancellor George Osborne outlined plans to cut a further £55bn from the state’s budget. This trickled down to local authorities, which cut funding for care homes. That autumn, the ratings agency Standard & Poor’s warned that Four Seasons was on track to run out of money.
In Hands’s view, the government’s unwillingness to spend more money on the sector was what caused his plans to unravel. “We believed the government was going to support care, and we got it completely wrong,” he told me. “We saw a Conservative government, with old voters, family values, and we thought, these guys are going to put money into this sector. And they did the reverse. They drained it.”
While the austerity drive undoubtedly did upset Hands’s calculations, it was almost impossible to know what was really going on inside Four Seasons. By now, its corporate structure had become a labyrinth, with 185 separate companies organised across 15 different layers. We know this thanks to research by forensic accountants at the University of Manchester, who studied the company for a 2016 report. “The rules of capitalism have been changed through the construction of opaque, complex groups of companies,” they wrote. “Four Seasons is a black box and only Guy Hands and a few close associates understand what is going on.”
Hands insisted that, in this case, the structure was inherited from Terra Firma’s predecessors, though his firm didn’t exactly simplify things. “It’s a little bit like the government issuing laws,” he told me. “They issue laws the whole time. They never abolish any … it’s much more exciting putting rules in than taking rules out.”
Private equity people tend to be better than just about anyone else at two things: managing huge amounts of debt, and concealing the inner workings of their companies. Fund managers can charge mysterious “monitoring” and “transaction” fees to a company they own. Or they can borrow against that company to pay themselves or their investors a dividend. Whenever I have spoken to policy researchers or trade unionists about this dynamic, the picture they have painted isn’t so different from the argument often made about foreign aid: that it’s pointless pouring money into countries with corrupt governments, as a group of middlemen will siphon off the donations before they can reach the people who need them. Likewise, if it isn’t possible to see how much money a care home is actually making, its owners can more easily pressure the government for more funding.
“In the old days of unionism, you had the factory up the road and you could see how well they were doing,” Natalie Grayson, a trade union organiser who worked with care home staff, told me. “But you can’t do that when your employer gets bought by an investment fund. A company can say, ‘We haven’t got any money, we can only afford to pay people the minimum wage’ and because we don’t know how much debt a company is paying, and there are so many separate companies and holding companies … it makes it impossible for us to trace that money and disprove their arguments.”
On the other hand, when presented with an impossible case, sometimes the most unlikely people find themselves playing sleuth.
It was a stifling August day when I travelled to meet Eileen Chubb, a slight, serene woman with perfectly coiffed hair and silky mannerisms, in a suburb of south London. We were sitting in her living room which, despite its crowd of ornaments and bright-orange paintwork, was a place of remarkable calm. Chubb had poured me a coffee and set out a plate of biscuits. Her rescue dog, Strider, sat at her feet.
Chubb used to work at a care home, until she became concerned by its falling standards and blew the whistle. From her living room, she then founded a charity, Compassion in Care, to help whistleblowers in similar situations. “I always tell people: go home, sit in a chair for eight hours, without food, without water, without human contact. That is what poor care is like,” she said.
In 2013, Chubb started running undercover inspections of care homes. She would pretend she was visiting to find a space for her elderly mother, and use false names — colours (Mrs Black, Mrs Green) or country and western names (Mrs Parton, Mrs Cash). Sometimes she took a walking stick to feign immobility, which let her slow down to better survey the landscape. Chubb had uncovered details of disturbing cases all over the country, both in small, family-owned homes and those run by large companies. Some of the worst cases she learned of were at homes owned by Southern Cross in the late 2000s, in the years before it collapsed. There was Betty Delaney, who developed excruciating bed sores at a home in Rochdale, two of them so bad that they wore down to muscle and bone. Or Alan Simper, a former electrical engineer who was staying at a Southern Cross home in Leighton Buzzard and was covered in dry excrement by the time he arrived at a hospital in 2009. A coroner later found he died “for want of care”.
I thought these might just be tragic exceptions, but Chubb told me that at any one time, her charity was helping between 200 and 300 employees at homes where they were worried about the quality of care, many of which were owned by private equity. “Every single day, I hear about people who haven’t been fed or given fluids, or are left in their own faeces. We see it all the time,” she said. Chubb was a one-woman detective agency, effectively doing the regulator’s job for it. She had little faith in the Care Quality Commission (CQC), the watchdog for social care in England, which had neither the resources nor the inclination to investigate many of the complaints it received, as it lost more than 10% of its budget and almost 10% of its staff between 2016 and 2020. In the six years leading up to 2024, in-person care home inspections fell by two-thirds.
Poor care, Chubb told me, mostly happened behind closed doors, to people who were too sick or senile to protest. Many of the whistleblowers who called her hotline were sharing vital information about wrongdoing that would otherwise never be exposed. But those who took matters into their own hands often found themselves alone. One woman whose mother had suffered falls and a black eye while staying at a Four Seasons home in south-west London in 2013 tried to find out whether this was an isolated incident. She wrote to the council, which refused to give her any information about other complaints patients had made because it said sharing this would affect the company’s “commercial interests”. She then submitted freedom of information requests to the CQC, which said it had received more than 1,000 notifications of serious injury from Four Seasons homes over the previous 12 months, but that it was unable to say how many residents had died as a result of specific types of injuries, because it did not keep a central record of this information.
Were these problems worse in private equity-owned homes? Anecdotally, Chubb noticed a pattern of “ingrained” cost-cutting when homes were taken over by these investors. “The staff are run ragged, absolutely exhausted. You can see it in their faces,” she said. Some of these observations were borne out in qualitative data: in one study from 2022, more than a dozen anonymous staff members in homes taken over by investment funds said their employers were “cutting corners” to curb costs. One said there were sometimes so few staff on duty, cleaners were roped in to care for elderly residents.
One of the most unsettling studies I found was from 2021. Atul Gupta, a health economist at the University of Pennsylvania, had set out with a team of researchers to analyse the changes that took place in nursing homes in the US after private equity takeovers. The team sifted through more than 100 deals between 2004 and 2015, and a dark picture emerged. After a takeover, deaths among residents increased by an average of 11%.
This result was so stark that Gupta initially thought it was an error. But when his team checked their results, they were robust. At homes that had been acquired by private equity funds, researchers found there were fewer staff. Residents were more likely to have pressure ulcers and reported higher levels of pain. “And we found an increase in the use of antipsychotic drugs, which are sometimes used [on residents] as substitutes for restraints,” Gupta said. “So we found a worsening of outcomes on multiple dimensions, including death.”
By the spring of 2016, Four Seasons’ position was tenuous. An American hedge fund was now buying up its debt, betting on financial meltdown. An interest payment of £26m fell due in December the following year. Terra Firma failed to meet it.
The hedge fund operated out of Connecticut under the management of a former Lehman banker called Spencer Haber. Little was known about Haber save for the fact that he had large sideburns and was passionate about animal welfare, making numerous donations to a charity for homeless cats in New York. That, and the fact that he had never owned a care home.
As Haber bought up more of the company’s debt, he acquired more power to determine what happened once the firm was reorganised or liquidated. Terra Firma fought to sell some of the more profitable homes, and Hands agreed to remain an owner in name alone, while Haber’s fund dictated a restructure. In 2019, Four Seasons announced it was going into administration. It could no longer pay its debts, so the restructuring would begin.
And then the pandemic struck. Suddenly, UK care homes were all over the news. The basic problem was that patients with Covid-19 were being discharged from hospitals into homes staffed by low-paid workers with little experience of dealing with a deadly and contagious virus. Compounding this, they often didn’t have enough masks or gloves to avoid catching it themselves. Eileen Chubb told me calls to her hotline increased by about 60% during the first wave of Covid-19. She found herself trying to console distraught care workers until 10pm each evening. “Many were in tears, terrified of what was going on. Being told to get used PPE out of a dustbin, spray it with Dettol and put it back on. Having to use sanitary towels for face masks,” she recalled. At first, the CQC kept data on care home deaths from Covid secret, partly — by its own admission — to protect the commercial interests of providers. It was as if the regulator didn’t want the public to find out what was happening inside these homes. Or perhaps it didn’t know: during Covid, it paused routine inspections entirely.
Once again, the task of analysing what was going on fell to self-appointed investigators and academics rather than the state. According to one paper, at the peak of Covid’s first wave, the homes with the greatest debts, where leverage was above 75%, had a death rate nearly twice as high as homes with no leverage at all. “In bad times, leveraged operators have to cut costs more than unleveraged operators,” the researchers explained.
The pandemic forced the public to focus on the industry, and the UK government sprang belatedly into action. It pumped an extra £2.1bn into the sector — about £5,900 for each bed. Homes received free PPE, money to cover staff sick pay and subsidies for empty rooms as residents died. However, as Amy Horton, an economic geographer and professor at UCL, discovered, staff working in the largest for-profit homes, the majority of which were owned by private equity funds, reported working longer hours and receiving less than satisfactory sick pay. “These differences,” Horton suggested, “could be because some companies are paying out significant portions of their revenue to investors, landlords and creditors, rather than reinvesting in the service.”
Hands seemed to regret his decision to buy Four Seasons. When I asked him whether his industry should ever be responsible for the care of elderly people, he told me he felt there was a “fundamental mismatch” between private equity and social care. “I mean, private equity’s role is to make profits for its investors. And you can’t, in the care home business, just make profits. You’ve got to take into account something that is more important, which is people’s lives.” I didn’t disagree, though it seemed an easier thing to say once you had retired offshore, having made a sizable fortune.
In 2022, the remaining homes from the Four Seasons estate appeared on the website of a real-estate broker. The photos showed an early Victorian mansion, an Edwardian pile and a 1990s neo-Georgian housing block. In real-estate vernacular, the portfolio was described as “attractive”, with “strong average fee uplifts” and “favourable demographics”, a euphemism for locations where house prices had boomed, once again conjuring the idea that elderly people were asset-rich cash machines.
Ever since he was ousted from Four Seasons, Robert Kilgour had resolved to create what he told me was a different type of care business. I met him on a rainy day in Edinburgh to visit three of the homes he now owns. We drove between them in his SUV, which had a personalised number plate spelling out his surname. The first was a crenellated, three-storey Victorian manor. Inside, Kilgour pointed with pride to the artworks he had donated, and paused to appreciate the texture of a brass light switch. For lunch that day, the residents could choose between mushroom stroganoff and shepherd’s pie. There were small vases of carnations on each of the dining tables. I checked: the flowers were real. Kilgour chatted with an attendant who ran the in-house hair salon, then we went to look around the bedrooms. “This,” Kilgour told me, stroking a bedstead in an empty room with a theatrical flourish, “is life stuff.”
It would be nice to think that homes such as this provided a solution to the care crisis, but the residents of the home we visited that day paid upwards of £1,700 a week, a hefty bill that effectively ruled out almost everyone without an expensive property to remortgage or sell. Kilgour planned to expand his business to 30 homes by the end of the decade, and said he’d received various approaches from private equity funds. “You know, ‘We’d like to invest £100m in the care home sector, and we’d like to do a deal with you’ — that sort of thing.” Kilgour didn’t tell me who these were, but he was adamant that he wouldn’t work with any of them after watching what their industry had done to Four Seasons.
...
Read the original on www.theguardian.com »
The bot situation on the internet is actually worse than you could imagine. Here’s why:
As you may know, at Glade Art we take anti-bot measures very seriously; it is one of our topmost priorities to protect our fellow users from having their art scraped for AI training. We also like to troll bots by trapping them in endless labyrinths of useless data, commonly referred to as “honeypots” or “digital tar pits.” And so, after 6.8 million requests in the last 55 days at the time of writing, we have some substantial data, so stand by and let us share it with you. : )
> 1. Quick clarification.
For starters, these bots do not obey robots.txt. This is expected from unethical companies, but it doesn’t make it any better. (A robots.txt file is a plain text file placed on a website that states where bots are and are not allowed to go. Good bots such as search engine crawlers obey these rules; bad bots do not.) To avoid trapping good bots, our robots.txt disallows all bots from entering this site’s tar pits.
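For reference, the relevant rules look something like this (a minimal sketch using this site’s two tar pit paths, which are listed in the next section):

User-agent: *
Disallow: /data-export
Disallow: /gro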
> 2. Pages and Contents.
The 2 traps on this site which have the most bot activity are these:
gladeart(DOT)com(SLASH)data-export (Over 6.8 million requests in the past 55 days).
gladeart(DOT)com(SLASH)gro (Over 84k requests in the past 35 days).
(NOTE: Use a VPN on these pages if you don’t want your IP shown in the logs, but it won’t be significant amongst the millions of others anyways).
As you can see when visiting the pages, GRO generates more book-like text, while Data Export’s text is well… whatever it’s supposed to be.
Data Export is by far more successful than GRO. It would be safe to assume that these companies are after number-rich data for better factual grounding. Fake personal information such as emails or phone numbers also seems to attract scrapers very well.
> 3. Characteristics of these bots.
The IPs of these bots mostly do not come from datacenters or VPNs; the overwhelming majority come from residential and mobile networks. Nearly all of them reside in Asian countries, Indonesia in particular. By leveraging cheap compute in such countries while routing through residential IPs, they can appear as completely human traffic to many websites, and scrape at massive scale. However, there is some good news: these bots do not execute JavaScript, at least not when scraping random sites across the entire web. Just imagine the compute costs if they had to run headless browsers while scraping millions of sites every hour! This makes PoW challenges extremely effective against them. Traffic at this scale coming from bots that look like normal humans begs the question: how much of the internet’s traffic comes from bots?
> 4. How much of the traffic on the internet comes from bots?
Reports in 2024 say that approximately 51% of all traffic on the internet comes from bots. That sounds like a lot, and it is, but the reality is worse, because these estimates rely heavily on where the IP addresses originate: whether they come from datacenters or not. As our data shows, an extremely high number of bots don’t come from datacenters at all. They can certainly be rigged to execute JavaScript on high-quality sites, and many sites don’t even require JS, such as Wikipedia and Old Reddit. With this in mind, it wouldn’t be unreasonable to assume that the share of bot traffic on the internet is much higher, perhaps even over 70%.
> 5. Some experiments on these bots.
Of course we ran some experiments on these bots.
Quick fact: Anubis is a program that adds a proof of work challenge to websites before users can access them.
And so Anubis was enabled on the tar pit at difficulty 1 (the lowest setting) while requests were pouring in 24/7. Before it was enabled, the pit was getting several hundred thousand requests each day. As soon as Anubis became active, that dropped to about 11 requests in 24 hours, most of them from curious humans.
Was it a coincidence? No, it was not. The test was repeated on several other occasions, yielding very similar results.
As this confirms, bots do not like PoW challenges, even ultra-easy ones. Even among the few bots that do execute JS, almost none will solve challenges; GoogleBot, the search engine crawler, is one example of a bot that runs JS but won’t solve them.
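For the curious, here is what a hash-based proof-of-work challenge boils down to (a generic Python sketch of the idea, not Anubis’s actual protocol):

import hashlib, itertools

def solve_pow(challenge: str, difficulty: int) -> int:
    # Find a nonce whose SHA-256 hash with the challenge starts with
    # `difficulty` zero hex digits. A human's browser pays this cost
    # once; a scraper must pay it again on every page it wants.
    target = "0" * difficulty
    for nonce in itertools.count():
        h = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if h.startswith(target):
            return nonce

print(solve_pow("example-challenge", 4))  # ~65,000 hashes on average

The asymmetry is the whole trick: verifying the nonce costs the server one hash, while producing it costs the client thousands, which is negligible for one visitor but ruinous at millions of requests per day.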
> 6. Who are these bots from?
These bots are almost certainly scraping data for AI training; normal bad actors don’t have funding for millions of unique IPs thrown at a page. They probably belong to several different companies. Perhaps they sell their scraped data to AI companies, or they are AI companies themselves. We can’t tell, but we can guess since there aren’t all that many large AI corporations out there.
> 7. How can you protect your sites from these bots?
If your site has a vast number of pages, these bots can drive up your server’s resource usage as they crawl through everything. The best options in that case are Cloudflare or Anubis. Alternatively, you could add a simple JS requirement in your web server (Nginx, for example); this won’t be as effective, but it is often sufficient for most sites. It is also recommended to add an hCaptcha to forms such as sign-ups. Overall, a correctly configured Anubis eliminates nearly all bot traffic to your site.
> 8. Server resource usage.
Our server usage for the tar pit endpoints is quite low. For example, when a global rate limit of 1,000 requests per minute was being hit on Data Export, the server’s CPU usage (an i5-4460) was not noticeably higher than at idle. RAM usage was also very low, well under 500 MB. And since it’s just text data being sent out, uploads were no more than 700 KiB/s.
> 9. Fun fact.
So on average, the Data Export tar pit generates 9,000 characters per request. Doing the math, the 6.8 million loads work out to roughly 61 billion characters, or over 120,000 novels’ worth of text generated and sent in total since Jan 29th, 2026.
> 10. Download a log file.
Here is a massive log file for some activity in the Data Export tar pit:
https://mega.nz/file/69Rh3IpS#ThlagHz8e58jLvU-vWn9U9m9T_WegL4SE0H2mhZRcZY
Caution: this file decompresses to about 1.1GB. Standard text editors will struggle to open it.
Note: this file contains logs from Jan 29th to March 22nd, 2026.
[This is for educational purposes only].
<> Outro.
And so, with this information, we can see just how bad the bot situation on the internet is right now. Look on the bright side, though: trolling bots is fun! We recommend adding your own tar pits to your site as well; the more volume the better. Just be sure to disallow the tar pit paths in your robots.txt so that good bots don’t get trapped. Bad bots often crawl a page precisely because you disallowed it.
Thank you for reading! : ) <>
← Back to Blog
...
Read the original on gladeart.com »
Almost 30 years after the intricate web of nerves inside the penis was plotted out, the same mapping has finally been completed for one of the least-studied organs in the human body — the clitoris.
As well as revealing the extent of the nerves that are crucial to orgasms, the work shows that some of what medics are learning about the anatomy of the clitoris is wrong, and could help prevent women who have pelvic operations from ending up with poorer sexual function.
The clitoris, responsible for sexual pleasure, is one of the least studied organs of the human body. Cultural taboo around female sexuality has held back scientific investigations and the clitoris did not even make it into standard anatomy textbooks until the 20th century. And in the 38th edition of Gray’s Anatomy in 1995 it was introduced as just “a small version of the penis”.
A Melbourne urologist, Helen O’Connell, says the clitoris has been ignored by researchers for far too long. “It has been deleted intellectually by the medical and scientific community, presumably aligning attitude to a societal ignorance,” she said.
To get a better idea of the inner workings of this key pleasure-related organ, Ju Young Lee, a research associate at Amsterdam University Medical Center in the Netherlands, and her colleagues used high-energy X-rays to create 3D scans of two female pelvises that had been donated through a body donor organ programme.
The scans revealed in 3D the trajectory of the five complex tree-like branching nerves running through the clitoris in unprecedented detail, the widest 0.7mm across. The work has been reported on the preprint server bioRxiv and has not yet been peer reviewed.
“This is the first ever 3D map of the nerves within the glans of the clitoris,” said Lee. She is amazed it has taken so long, considering a similar level of knowledge regarding the penile glans was reached back in 1998, 28 years ago.
Lee and her colleagues show that some branches of clitoral nerves reach the mons pubis, the rounded mound of tissue over the pubic bone. Others go to the clitoral hood, which sits over the small, sensitive, external part of the clitoris — the glans clitoris — which is just 10% of the total organ. Other nerves reach the folds of skin of the vulva, the labial structures.
Previous research had indicated that the big dorsal nerve of the clitoris gradually diminished as it approached the glans. However, the new scans appear to show that some of what medics have been learning in anatomy is wrong and the nerve continues strongly all the way to the end.
“I was especially fascinated by the high-resolution images within the glans, the most sensitive part of the clitoris, as these terminal nerve branches are impossible to see during dissection,” said Georga Longhurst, the head of anatomical sciences at St George’s, University of London.
O’Connell, who published the first comprehensive anatomical study of the clitoris in 1998, said the findings were crucial to understanding the female sensory mechanism underlying arousal and orgasm via stimulating the clitoris. “Orgasm is a brain function that leads to improved health and wellbeing as well as having positive implications for human relationships and possibly fertility,” she said.
The mapping of clitoral nerves is likely to inform reconstructive surgery after female genital mutilation, one of the most extreme examples of cultural misogyny. According to the World Health Organization, more than 230 million girls and women alive today in 30 countries in Africa, the Middle East and Asia have undergone such mutilation, in which the visible part of the clitoris may be removed, along with parts of the labia.
The practice has no health benefits and can result in issues including severe bleeding, infection, problems urinating, menstrual difficulties and complications in childbirth.
About 22% of women who undergo surgical reconstruction after mutilation experience a decline in orgasmic experience after their operation, so a better understanding of how far the nerves extend could reduce that percentage, said Lee.
O’Connell said the work could also inform surgery to treat vulvar cancer, gender reassignment surgery and genital cosmetic surgeries, such as labiaplasty, which increased in popularity by 70% from 2015 to 2020.
Lee is hoping to open a clitoris exhibition within Amsterdam University Medical Center to help expand knowledge about the clitoris, inspired by the Vagina Museum in London.
...
Read the original on www.theguardian.com »
After using the TCL tablet for two months, I’ve come to the conclusion that my tablet doesn’t need a screen with smooth motion. I only read static content — still text.
This realization made me take a fresh look at a type of device I hadn’t even considered before, but which now seems perfect for my needs. I’m referring to Android tablets with E-Ink screens, manufactured by brands like Boox, Bigme, and Pocketbook.
The problem? They’re expensive. The smaller models, with 7–7.8-inch screens, start at prices four times higher than a basic Kindle. The one I wanted, the Boox Go 10.3, with a 10.3-inch screen, is even pricier. And it comes with an outdated version of Android, although I’ve been told that this isn’t a problem, unlike with the iPad. (Last week, Boox launched the second generation of the model, featuring Android 15 and a variant with a backlit screen. It’s likely to be even more expensive.)
Besides being expensive, I hate buying… things. That’s why I was happy when I realized I could use my Kindle — the very one that has never accessed the internet — to read articles, posts, and newsletters published on the web, without spending a single cent and with great quality.
It’s this setup — the result of a week of new brain connections (or many neurons fried over something almost insignificant) — that I’ll share with you.
Amazon’s e-readers only read unorthodox digital book formats, such as *.mobi and *.azw3. There is an official way to convert other, more popular formats to supported ones, “Send to Kindle,” but my Kindle isn’t connected to the internet, which rules out that option.
Therefore, we’ll need Calibre, a great e-book manager, to convert *.epub files, the most common digital book standard, into a format the Kindle can understand.
After installing Calibre, the next step is to create a “book” from a collection of articles/links.
Most services of this type, such as Instapaper and Wallabag, generate RSS feeds from the various filters they offer — unread, favorites, folders etc. At first, I thought about combining this feature with another one in Calibre called “Get News.” The icon on the app’s chaotic toolbar already gives you an idea of what it’s about. It’s an RSS/Atom feed client that fetches new posts and generates books on demand or on a predefined schedule.
To add a new feed, just click the arrow next to the “Get News” button and choose the option to add a custom news source. In the dialog that opens, create a new source, set the parameters, and add the feeds you want to follow. You can list several, which allows you to create a highly personalized publication, including your own Instapaper or Wallabag feeds.
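In Calibre’s advanced mode, a custom news source is a small Python “recipe”. A minimal sketch, with placeholder feed URLs standing in for your own:

from calibre.web.feeds.news import BasicNewsRecipe

class MyDigest(BasicNewsRecipe):
    title = 'My Reading Digest'
    oldest_article = 7           # skip items older than a week
    max_articles_per_feed = 25
    feeds = [
        ('Instapaper Unread', 'https://www.instapaper.com/rss/EXAMPLE/TOKEN'),
        ('Wallabag Unread', 'https://wallabag.example.org/feed/USER/TOKEN/unread'),
    ]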
I noticed that the formatting of these books generated by Calibre is a bit different from that of standard e-books. The table of contents doesn’t use the same layout as books, and even the text display — or what surrounds it, like the progress bar/page numbers — has its own structure. I’ve never read a magazine on Kindle; maybe that’s what they look like?
The important thing is that it works, but there are ways to improve certain aspects of this process and its outcome.
I had chosen Wallabag to be the hub for the articles I intend to read on Kindle. I had already been using it on my TCL tablet. (The Android app is good, even if it lacks some features.)
Realizing that its parser is worse than average made me take a step back. The parser is the algorithm that identifies the content at a URL and extracts it. On some websites, Wallabag’s parser fails and can’t extract the text; the website of a well-known Brazilian magazine is an example. (Obviously, I’m referring to open articles, without a paywall.)
Instapaper performed better, but I didn’t want to use it. After all, we self-host not one but two such services: Wallabag and Readeck.
Readeck’s parser is just as good as Instapaper’s. Case closed, right? No, because I couldn’t find the darn RSS feed for unread items.
I had to check the official website to realize that the Atom feed is hidden behind the three-dot menu. And then came the big surprise: Readeck itself generates an e-book, in the *.epub format, from the listed articles.
I adopted Readeck, which allowed me to set aside Calibre’s “Get News” feature. However, Calibre still needs to be present to convert the file to *.mobi, which the Kindle understands. As a bonus, I take this opportunity to edit the book’s title and add a cover I quickly made in an image editor.
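Calibre also ships a command-line converter, so this last step can even be scripted; a minimal example, with hypothetical file names:

ebook-convert readeck-articles.epub readeck-articles.mobi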
It’s been just over a week since I had this epiphany. I save links in Readeck throughout the day, and in the late afternoon, I generate my own edited newsletter. After reading the edition, I go back to Readeck to archive what I’ve read and, if necessary, “use” some links — register them on the links of the day, share them with someone, or save them as reference material for a longer piece I plan to write.
It’s been great. The E-Ink screen is less tiring on the eyes, especially without the backlight. I can read in the soft sunlight streaming through the living room window at this time of year, early in the morning, without worrying about screen glare. On the contrary: the more sun, the more external light, the more readable the screen becomes.
The only (major) problem with this process is that it requires a computer, because of Calibre and the need to convert the file to a format readable by the Kindle. In this regard, Android tablets with E-Ink screens would be more practical, since they have apps that read *.epub. Besides, you might not even need the e-book. The Readeck app would be enough, with direct access to the texts on the same E-Ink screen. Bonus: you could use Readeck’s native highlighting and note-taking features, which would be quite useful.
For those who already have a Kindle and a computer at their disposal, however, it’s hard to justify a new device for these few advantages of direct access to Readeck. Generating the book is a minimal effort in exchange for ~90% of what an Android tablet would provide.
One side effect I didn’t anticipate is that I’ve been reading fewer books, which now share space (and my time) with web articles on the Kindle. That’s my problem, right?
...
Read the original on manualdousuario.net »
Last week I was writing about the hardware side of the AI memory problem: the HBM density penalty, the EUV bottleneck, and the supply chain pressure squeezing DRAM prices for everyone from data centre operators down to consumer electronics. This week, Google published something that attacks the exact same problem using another approach: not “build more memory”, but “need less of it.”
You guessed it! This post dives a bit deeper into what TurboQuant is and what it may imply for the field of AI. What Pied Piper achieved in the TV show Silicon Valley with their general-purpose lossless compression algorithm, Google may have achieved for compressing information represented as vectors in a high-dimensional space.
But before getting into what TurboQuant does, let’s make a brief detour to understand what this algorithm is actually built to compress, and why that matters for LLMs and the memory problem.
GPT models are what are known as autoregressive: they generate text one token at a time, where each new token is conditioned on everything that came before. You send a prompt, the model reads all of it, picks the most likely next word, appends it, reads everything again, picks the next word, and so on. One token at a time, left to right, until it decides to stop.
The core mechanism that lets the model read everything at each step is called attention. For every token in the sequence, the model computes three vectors: a query, a key, and a value. You can think of these as a slightly more elaborate kind of key-value store. To generate the next token, the model compares the current query against every previous key, essentially asking “which past tokens are relevant right now?”, and uses the answer to weigh the corresponding values and build up context.
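As a minimal sketch of one decoding step (my own NumPy illustration, not any particular model’s code):

import numpy as np

def attend(q, K, V):
    # q: (d,) query of the token being generated;
    # K, V: (n, d) cached keys/values, one row per previous token.
    scores = K @ q / np.sqrt(q.shape[0])  # "which past tokens matter now?"
    w = np.exp(scores - scores.max())
    w /= w.sum()                          # softmax over the n past tokens
    return w @ V                          # weighted mix of values = context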
This is implemented (as you may all know by now) through the transformer architecture. Transformer layers are responsible for encoding the input sequences into a meaningful representation, applying the attention mechanism, and decoding into an output representation. All LLMs are architectural variations of this basic cell.
To get a sense of each of these variations I highly recommend Sebastian Raschka’s LLM Architecture gallery: from GPT-2 to DeepSeek and GLM.
The keys and values for every previous token are recomputed from scratch on every single pass through the architecture. If your conversation is N tokens long and you’re generating token N+1, the model recalculates N sets of keys and values it already calculated on the previous step. This is slow and wasteful.
The obvious fix is to cache them. The key and value are computed once per token and stored, so they can be looked up in subsequent steps instead of being recalculated (the query only matters for the token currently being generated, so it doesn’t need to be kept). This is the KV cache: a running store of the keys and values of all previous tokens, held in GPU memory so they are readily accessible when needed.
The problem is that the KV cache grows with every token. With short messages this is trivial as all tokens fit in memory, but a long conversation, or a full code base, involves hundreds of thousands of tokens. Each token has its own key and value vectors, across every attention layer in the model, each stored as a full-precision floating-point number (as long as there’s no quantisation involved). For a model like Llama 3.1 70B, the KV cache for a single long context can consume more GPU memory than the model weights themselves.
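A back-of-envelope calculation shows the scale. The figures below assume the published Llama 3.1 70B configuration (80 transformer layers, 8 KV heads under grouped-query attention, head dimension 128) and fp16 storage:

# Rough KV-cache sizing (assumed Llama 3.1 70B config: 80 layers,
# 8 KV heads via grouped-query attention, head dim 128, fp16 values).
layers, kv_heads, head_dim, bytes_fp16 = 80, 8, 128, 2
per_token = 2 * layers * kv_heads * head_dim * bytes_fp16  # K and V vectors
print(per_token / 1024)             # 320.0 KiB per token
print(per_token * 131_072 / 2**30)  # 40.0 GiB for one 128k-token context

One long context is already about 40 GiB; serve a handful of such requests concurrently and the cache rivals or exceeds the roughly 140 GB the fp16 weights themselves occupy.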
This is one of the key bottlenecks in production inference. Serve more users simultaneously? More KV cache. Support longer contexts? More KV cache. Run cheaper inference? Figure out what to do about the KV cache. We are trading the compute needed to recalculate keys and values on the fly for increased memory requirements.
Quantisation offers a way out: instead of storing each value at 32-bit or 16-bit precision, round it down to 4 bits or 3 bits (or even 2 bits, like Microsoft recently showed). Some accuracy is lost in the approximation, but if the loss is not significant for the use case, the trade-off is obviously worth it. The question is how to do this well. Standard quantisation techniques add 1-2 extra bits of overhead per value as metadata, which partially undermines the compression you’re trying to achieve. Getting to genuinely low bit-widths without that overhead, and without accuracy degradation, is the hard part. HuggingFace has a really nice page with an overview of quantisation and a list of methods.
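For intuition, this is what a standard uniform 4-bit quantiser looks like, including the per-vector metadata (a scale and an offset) that causes the overhead mentioned above. A minimal sketch, and deliberately not TurboQuant:

```python
import numpy as np

def quantize_4bit(x):
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / 15                 # 4 bits -> 16 levels
    codes = np.round((x - lo) / scale).astype(np.uint8)
    return codes, scale, lo                # scale/lo: the metadata

def dequantize(codes, scale, lo):
    return codes * scale + lo

x = np.random.default_rng(0).standard_normal(128).astype(np.float32)
codes, scale, lo = quantize_4bit(x)
max_err = np.abs(dequantize(codes, scale, lo) - x).max()
```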
But things may be about to change. This week Google announced TurboQuant (see paper), a two-stage algorithm in which the two stages have different jobs.
Stage 1: PolarQuant. This is the main compression step. We currently store vectors using Cartesian coordinates, as components along a set of basis axes (the x, y, z components we learnt in primary school). The distribution of those components in space makes them hard to compress efficiently.
PolarQuant converts the vector to polar coordinates: a radius and an angle. The key observation is that, in high-dimensional transformer key spaces, the angle distribution is highly concentrated and predictable; it clusters in ways that map neatly onto a fixed quantisation grid (like the ones used to compress audio and images). That predictability means you can eliminate the expensive normalisation steps that standard quantisation methods require, and you can do it without any dataset-specific tuning: no fine-tuning or calibration pass is required to quantise a specific model. You can apply it directly to the vectors in this new representation, independent of the model.
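A toy two-dimensional version of the idea, just for intuition (the paper operates in high-dimensional key spaces, so take this as a simplification, not the actual algorithm): re-express a pair of components as a radius and an angle, then snap the angle to a fixed grid:

```python
import numpy as np

ANGLE_BITS = 4
GRID = 2 * np.pi / (1 << ANGLE_BITS)       # fixed angular step

def polar_quantize(x, y):
    r = np.hypot(x, y)                     # radius kept as-is here
    theta = np.arctan2(y, x)
    code = int(np.round(theta / GRID)) % (1 << ANGLE_BITS)
    return r, code                         # radius + 4-bit angle code

def polar_dequantize(r, code):
    theta = code * GRID
    return r * np.cos(theta), r * np.sin(theta)

x_hat, y_hat = polar_dequantize(*polar_quantize(0.83, -0.41))
```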
Stage 2: QJL (Quantised Johnson-Lindenstrauss). PolarQuant handles the main compression, but any quantisation introduces error, and some of that error accumulates in the dot products that the transformer uses to compute attention scores. QJL’s job is to correct for this bias. It applies a Johnson-Lindenstrauss transform to the residual error, a random projection that preserves distances between high-dimensional points, and then reduces each component to a single sign bit: +1 or -1. The result is an unbiased estimator for the inner products, with zero additional memory overhead: the error correction costs nothing to store (the bottom-left panel of the figure in the original post gives a mental model of the shift from an existing quantised KV cache to a QJL-transformed one).
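A sketch of the sign-bit trick, simplified from the paper’s construction: project with a shared random Gaussian matrix, keep only the signs (plus the key’s norm), and rescale with a known Gaussian identity to recover an unbiased inner-product estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 128, 1024                  # original / projected dimensions
S = rng.standard_normal((m, d))   # shared random Gaussian projection

def compress_key(k):
    return np.sign(S @ k), np.linalg.norm(k)  # 1 bit/dim + one norm

def estimate_dot(q, sign_bits, k_norm):
    # For Gaussian S: E[<Sq, sign(Sk)>] = m * sqrt(2/pi) * <q,k>/||k||
    return (S @ q) @ sign_bits * k_norm * np.sqrt(np.pi / 2) / m

q, k = rng.standard_normal(d), rng.standard_normal(d)
bits, norm = compress_key(k)
approx, exact = estimate_dot(q, bits, norm), q @ k
```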
The combination achieves 3.5 bits per channel with what the paper calls “absolute quality neutrality” across Gemma, Mistral, and Llama-3.1-8B-Instruct, tested on LongBench, Needle In A Haystack, ZeroSCROLLS, RULER, and L-Eval. At 2.5 bits, accuracy degrades only marginally. The headline numbers from the blog post: a 6x reduction in KV memory size with no measurable accuracy loss, and, on H100 GPUs, up to an 8x speedup for 4-bit TurboQuant over 32-bit unquantised keys.
As briefly described above, most quantisation methods require at least some calibration on representative data: they learn the optimal quantisation grid for a specific model on a specific dataset. TurboQuant is data-oblivious: the algorithm works from first principles, near the theoretical lower bounds of what information theory says is possible, without seeing the data first. That’s what makes it deployable at inference time to any model, with no dedicated training or fine-tuning needed to reach a near-optimal compression rate without trading accuracy.
Last week I was writing about how HBM stacking reduces DRAM bit density by 3-4x, and how the entire supply chain for consumer DRAM is under pressure because data centres and consumer electronics are competing for the same wafers. If TurboQuant reduces the memory footprint per inference job by 6x, applying this compression algorithm at scale may significantly relax the memory bottleneck issue.
Anthropic is not the only one able to crash the market cap of public companies with a single announcement. Immediately after Google’s announcement, the stocks of memory manufacturers like Micron and SanDisk plunged (and as an investor in Micron, this hits close to home 🙈).
This may be an overreaction, like when Nvidia stock plunged after DeepSeek’s announcement. Or it may be signalling a complete shift in the economics and resource requirements of AI labs. If I were Google, I wouldn’t release research that exposes a competitive advantage. I would only publish research whose edge has already been priced in, because competitors have likely realised it, or adopted it, themselves. TurboQuant has most probably been running inside Google’s infrastructure since before anyone outside read the paper.
If Google is publishing 6x KV cache compression, the reasonable assumption is that every serious AI lab has been working on this problem already. Reducing the memory requirements of the KV cache has been a known problem for quite some time, and advancements like TurboQuant, adopted at scale, change the memory requirements (justifying the hit on these memory stocks). I can’t wait for the next SemiAnalysis report analysing this release, the real adoption of this new approach to compression (and similar ones) by the big labs, and what it could mean for the memory crunch.
Micron and SanDisk haven’t suddenly become bad businesses. But any thesis that depends on memory demand growing linearly with AI context usage deserves a second look. My personal take is that the market is overreacting, but we’ll see.
In this post about money and collateral in an AI-first society, I mentioned the book “The Last Economy”. The book describes how extreme volatility, with sharp turns on every piece of news and no clear equilibrium, is a symptom of a sick system. These big market movements on a single piece of news may be exactly that symptom.
What excites me the most about this release is what the Johnson-Lindenstrauss transform that powers QJL, and compression algorithms like TurboQuant, could mean for other use cases that rely on high-dimensional vector data, beyond LLMs and vector search.
The obvious one outside of KV caches, as mentioned above, is vector databases. Any RAG pipeline that stores embedding vectors for retrieval benefits from the same compression. TurboQuant reduces indexing time to “virtually zero” on vector search tasks and outperforms product quantisation and RaBitQ on recall benchmarks using GloVe vectors.
Further out: recommendation engines, fraud detection, drug discovery similarity search, genomics, any system that stores large tables of high-dimensional embeddings and needs to run fast nearest-neighbour lookups (assuming a similar distribution in space as the values stored in KV caches, which is something I want to explore). These systems weren’t waiting for transformer-specific optimisation, but they may inherit the benefit directly.
On-device inference is another field inside the world of LLMs where we could start seeing immediate impact. If the KV cache for a long context shrinks by 6x, you can fit substantially more context into the memory envelope of a mid-range phone or a modest edge device. Local models with usable context lengths start to look more tractable. The economics of inference at the edge change, and that’s a different set of winners and losers than the data centre story.
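A rough sense of what that buys you, with assumed parameters for a small 8B-class model and an arbitrary 4 GiB cache budget:

```python
# Assumed parameters for a small 8B-class model.
n_layers, n_kv_heads, head_dim = 32, 8, 128
fp16_per_token = 2 * n_layers * n_kv_heads * head_dim * 2  # bytes

budget = 4 * 2**30                             # 4 GiB for the cache
print(budget // fp16_per_token, "tokens at fp16")          # 32768
print(budget // (fp16_per_token // 6), "tokens at ~6x")    # ~196K
```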
I don’t know if you’ve already seen how some big models are being streamed from fast flash storage to make LLM inference feasible on a Mac. I’ll leave that for another post, but the field of edge inference is getting more interesting every day, and even more so now that we have TurboQuant.
The TurboQuant code is out, both the QJL and PolarQuant components are available, and I can’t wait to find the time to start applying them to other use cases. We’ve seen throughout history the impact that changing the way we represent information can have on the performance (and even feasibility) of certain use cases (think of what the Fourier transform, FFTs, and the frequency domain already enabled :) ).
I already have some ideas for where to try the TurboQuant approach first and see what it is capable of, and I’ll report back. In the meantime, until next week!
...
Read the original on adlrocha.substack.com »