What? They’re just computer programs. Almost all computers have high-quality entropy sources that can generate truly random numbers. LLMs’ whole thing is basically turning sequences of random numbers into sequences of less random stuff that makes sense. They have a built-in dial for nondeterminism (the sampling temperature), and it’s almost never set to zero.
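To make the “dial” concrete: here’s a minimal sketch of temperature-scaled sampling over some toy logits (the values and token count are illustrative, not from any particular model):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Sample an index from logits after temperature scaling.

    temperature -> 0 approaches greedy (deterministic) decoding;
    higher values flatten the distribution (more randomness).
    """
    if temperature == 0:
        # Greedy: the "dial at zero" case -- fully deterministic.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the probabilities.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(logits) - 1

# Toy logits for four hypothetical tokens.
logits = [2.0, 1.0, 0.5, -1.0]
print(sample_token(logits, temperature=0))  # always 0 (greedy)
```

At temperature 0 the output is fully determined by the logits; at 1.0 you get the model’s “natural” distribution, and each draw consumes randomness from whatever entropy source seeds the RNG.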
I feel like I’m missing your meaning because the literal interpretation is nonsense.

I think this comment is based on an extremely optimistic – bordering on fantastical – outlook.
The capacity and capability to handle the data will grow too.
This is what data analysis is though: extracting patterns from noisy data, ignoring outliers. I don’t think anybody is suggesting they’ll just dump a CSV of your web history into ChatGPT and ask it if you’re probably going to a protest this weekend (although does it sound so far-fetched that that might actually work?); it’ll be used in combination with existing and constantly improving data mining techniques.
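The “ignoring outliers” step really can be that mundane. A minimal sketch using Tukey’s IQR rule on made-up numeric samples (the data and cutoff are illustrative, not anyone’s actual pipeline):

```python
import statistics

def iqr_filter(samples, k=1.5):
    """Drop points outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule)."""
    qs = statistics.quantiles(samples, n=4)
    q1, q3 = qs[0], qs[2]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in samples if lo <= x <= hi]

# Noisy "signal": mostly around 10, plus a couple of wild outliers.
data = [9.8, 10.1, 10.0, 9.9, 10.2, 250.0, -80.0, 10.05]
print(iqr_filter(data))  # the 250.0 and -80.0 points are dropped
```

Point being: the boring classical tooling already handles the noise; an LLM would just be one more stage bolted onto it.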
Are you implying data protection laws will not only not be inexorably eroded year after year by increasingly surveillance-hungry governments, but will actually get significantly better than their current milquetoast state? I’ve gotta say, that seems increasingly unlikely to me; right now we’re seeing mandatory identity verification being legislated onto more and more things by more and more governments.
This has to be a sarcastic reference to Snowden, right? The thing where the entire world found out how the NSA absolutely is – not “might be” – monitoring your internet and conversation logs, and basically nobody did a fucking thing about it? That was 12 years ago.
Good thing they’re not doing anything crazy to get more computing power, like buying up practically the entire global supply of RAM or building data centres at an exponentially increasing rate.