It’ll be marketed as Skyrim with all LLM text and end up as Oblivion with prefab text chunks.
Even disregarding the fact that current LLMs can’t stop hallucinating and going off track (which seems to be an inherent property of the approach), they need crazy amounts of memory. If you don’t want the game to use a tiny model with bad quantization, you can probably expect to spend at least 20 gigs of VRAM and a fair chunk of the GPU’s power on just the LLM.
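Rough back-of-envelope math on where a figure like that comes from (the parameter counts and quantization levels below are my own illustrative assumptions, not measurements from any shipped game):

```python
# Back-of-envelope VRAM estimate for running an LLM locally alongside a game.
# All numbers are illustrative assumptions, not benchmarks from any real title.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Memory for the model weights alone, ignoring KV cache and activations."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 13, 30):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{weight_memory_gb(params, bits):.1f} GB for weights")

# A 13B model at 16-bit is ~26 GB of weights before you add the KV cache,
# activations, and the game's own textures; even at 8-bit (~13 GB) there's
# very little headroom left on a 16 GB consumer card.
```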
What we might see is a game that uses a small neural net to match freeform player input to a dialogue tree. But that’s nothing like full LLM-driven dialogue.
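A minimal sketch of what that intent-matching approach could look like, assuming an off-the-shelf sentence-embedding model (the model name, dialogue options, and threshold here are all illustrative, not taken from any real game):

```python
# Minimal sketch: embed the player's freeform input and snap it to the closest
# prewritten dialogue-tree option. Model choice and threshold are assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU

dialogue_options = [
    "Ask about the dragon attacks",
    "Ask for directions to the inn",
    "Offer to help with the bandit problem",
    "Say goodbye",
]
option_embeddings = model.encode(dialogue_options, convert_to_tensor=True)

def match_player_input(text: str, threshold: float = 0.4) -> str | None:
    """Return the closest scripted option, or None if nothing is close enough."""
    query = model.encode(text, convert_to_tensor=True)
    scores = util.cos_sim(query, option_embeddings)[0]
    best = int(scores.argmax())
    return dialogue_options[best] if float(scores[best]) >= threshold else None

print(match_player_input("hey, which way to the tavern?"))
# likely snaps to "Ask for directions to the inn", then the game just plays
# its prewritten branch; no text is actually generated.
```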
I challenge you on that.
We’ll see a Skyrim-like game using an LLM for NPCs within 3 years, definitely within 5.
I think some will exist in that time frame, but I don’t think they’ll be any good, or well received.
IN the near future of gaming, but not BEING the near future of gaming.
There are already mods doing this now, and they’re absolutely doing it better than Bethesda will.