

Gemma4 doesn’t Turboquant. But it is leaner on the KV cache.
edit: looks like there are forks that do turboquant already


It works both ways.
I can tap my government ID card to my phone to identify myself on government sites.


Most do. My social circle and family are spread out over the EU.
I have a SIL and FIL who don’t own, and are generally bad with money. I fear we’re going to have to take them in when they reach retirement.


I’m coming from the past, back when the distribution came on two HD diskettes named Linux 0.99b. It was a gradual change to get to the point where you could just assume you’d have a good time on Linux. I guess static kernel builds were the starting point, and even then it took years. Remember, we’ve only had loadable kernel modules since the mid-’90s.
linux-on-laptops.com was invaluable before making a purchase.
ARM is a different story, mostly hindered by not having any universal way of booting and detecting hardware.


400€ in 2006-money is 600€ today. Star Labs used to have a cheap model, but I guess it’s hard for anyone to be in the budget segment with RAM prices these days. I bought a Huawei MateBook a few years back for about 600€ - they’re sold with Linux pre-installed in China, but not here. But that means the hardware is well supported.
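For what it’s worth, the inflation claim checks out as a couple of lines of Python (the 2006→2025 horizon is my assumption for “today”):

```python
# Back out the implied average annual inflation from "400€ in 2006 ≈ 600€ today".
def implied_annual_inflation(old_price, new_price, years):
    return (new_price / old_price) ** (1 / years) - 1

rate = implied_annual_inflation(400, 600, 2025 - 2006)
print(f"{rate:.1%}")  # 2.2%
```

A 50% cumulative increase over 19 years is roughly 2% a year, which is in the right ballpark for eurozone CPI over that period.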
In my mind the landscape is quite a bit better than 20 years ago. You’d have to pick and choose a model that worked well then. Chipsets are usually well supported by the time they are in laptops today.
The Microsoft tax has been under pretty heavy NDAs lately, but it wouldn’t surprise me if M$ were paying to be pre-installed. They’re in the data-mining business, not the operating-system business, in 2026.
But yes, we’re all still waiting for the year of the Linux desktop.


With ethanol E85? I don’t think so. It used to be heavily subsidized, and made sense for the consumer… but now it’s about the same price as 95 octane with 10% ethanol. It doesn’t make sense when it’s less energy dense. You burn an extra 15-20% in volume compared to “regular” gasoline.
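The break-even math is simple, using the 15-20% extra-consumption figure above (the 20/liter gasoline price below is a placeholder, not a real quote):

```python
# Break-even E85 price given extra volumetric consumption vs. regular gasoline.
# If you burn (1 + extra) liters of E85 per liter-equivalent of gasoline,
# E85 only pays off when priced below gasoline_price / (1 + extra).
def breakeven_e85_price(gasoline_price, extra_consumption):
    return gasoline_price / (1 + extra_consumption)

for extra in (0.15, 0.20):
    print(f"{extra:.0%} extra burn: break-even at "
          f"{breakeven_e85_price(20.0, extra):.2f} per liter")
```

So at the same pump price, E85 is effectively 15-20% more expensive per kilometer.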


Last time I filled up on diesel was for 25.53 SEK/liter. That’s $9.66/gallon (tax included).
It’s only up 34% since orange dumbfuck had to “prove” he didn’t have a small peepee.
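The conversion above works out like this (the ~10 SEK/USD exchange rate is my assumption; rates move around):

```python
# Convert a Swedish pump price in SEK per liter to USD per US gallon.
LITERS_PER_US_GALLON = 3.785411784
SEK_PER_USD = 10.0  # assumed exchange rate

def sek_per_liter_to_usd_per_gallon(price_sek_l, sek_per_usd=SEK_PER_USD):
    return price_sek_l * LITERS_PER_US_GALLON / sek_per_usd

print(round(sek_per_liter_to_usd_per_gallon(25.53), 2))  # 9.66
```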


For personal projects I’ve just got a VPS where me and a couple of partners in crime push over ssh. It’s very informal and merges are requested in our group chat.
At my previous place of employment we self-hosted GitLab. I much prefer that over corporate GitHub. I want my own fork, not a shared repo.


Six-seven


I have an idea of how they could reduce the disk requirements.
How about using shared libraries instead of bundling everything in every snap all the time?
Amazingly it reduces RAM usage as well.


It’s a good way to keep the exploit around for seven days, too, if you apply it right away.


I’m developing an app as a side project, but testing can only happen meaningfully out in the field. I found a breaking bug yesterday and asked Claude to fix and deploy it over a remote session. I installed the update and continued my testing session.
I sure wouldn’t want that to go away.


Iran has a professional army with a ground force of 300k.
That’s about the same number of boots the US has. It’ll be no small logistical feat to give that operation a chance.


It sounds to me like you were looking to get a dog but got a cat instead.


The president again boasted he has “stopped eight wars”
It doesn’t really count if you were the one who didn’t start those eight wars.
Personally, I’ve prevented nine cases of underage drinking this week.


I remember when I bought my first-gen MacBook with a Core Duo and 512MB of RAM. The 512 would probably have been enough if I hadn’t needed Rosetta for half of my applications.
Probably good as a surfing machine if you need to be in the Apple ecosystem. I might get one if I need Xcode for something in the future.


Yeah, but I’m not gonna go mug shopping until I’ve had my coffee


Same as I do every morning. A big mug of coffee and a big shit.
No amount of money would change that.


I work at a startup that classifies and extracts data from often very fuzzy sources.
We are encouraged to use agents for development. We use models in our services for things like pinpointing Coca-Cola* cans in YouTube videos. We offer our customers LLMs to discover how Coca-Cola and Pepsi are presented on YouTube.
*Soda scenario imaginary. I don’t want to dox my niche, but the problems we solve are similar enough.


I’ve had better luck with llama.cpp for opencode. I’m guessing it handles the formatting for tool use better.