That’s the thing: there’s nowhere to leap to from LLMs. They’re basically really advanced spell checkers, with no way to reason about the information in their training sets. And this is why AI companies are lobbying for unrestricted access to copyrighted materials: they’ve absorbed all the information they could legally acquire (or get away with stealing), and the newest generation of LLMs has stalled out. There’s so much AI-generated garbage already out there that it’s corrupting the datasets, and as a result the new outputs are even worse than the older ones.
All the talk of AGI coming out of LLMs is an attempt to build hype and maintain investment. The companies pushing this are desperate for a huge breakthrough before the money and access to data run out, and nobody wants to drop out in case they miss the hypothetical breakthrough. It’s worth noting that, increasingly, the heads of tech companies aren’t technical people but financial ones, who don’t understand the complexity or the ethical issues in the same way, and a lot of them have clearly watched or read major SF works without a shred of understanding of what those works are saying. (See for example Elon Musk’s obsession with colonising Mars, clearly based on reading Kim Stanley Robinson’s Mars Trilogy. He fantasises about being in charge, about how society will work and, like, what the money will be… while ignoring that after the Second Revolution Mars is an anarchist society. I’ve joked that Musk thinks he’s Arkady Bogdanov but he’s really Phyllis Boyle… which makes sense if you’ve read the books?)
But their technical documents are a lot more honest about the limitations. I’m running a pilot program of Microsoft Copilot at work, and the training video I watched going through its capabilities stressed that you shouldn’t rely on Copilot for finished documents. It also covered how Copilot limits access to data, but made clear you still need to use Microsoft’s file classification system to prevent sensitive or confidential information from being used.