i_like_water@feddit.org to Technology@lemmy.world • Local AI is one step closer through Mistral-NeMo 12B (English)
Off the top of my head: the context size is way higher, 128k tokens vs. the usual 8k.
You’re not alone in that. I remember when Blender 2.80 was about to release, it was a big milestone, and everybody online was hyping it as “Blender 2.8”. I think even Blender themselves went with the flow and changed the marketing.
I think the issue is how the games are programmed: some games only check which controllers are plugged in at startup, while others keep checking for changes during runtime.
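
A minimal sketch of the difference, assuming the game uses SDL2 (many do, but this is just an illustration of the two approaches, not any particular game's code):

```c
/* Sketch: one-time controller scan at startup vs. hot-plug handling at runtime (SDL2). */
#include <SDL2/SDL.h>
#include <stdio.h>

int main(void) {
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Approach 1: check only at start -- controllers plugged in later are never seen. */
    for (int i = 0; i < SDL_NumJoysticks(); i++) {
        if (SDL_IsGameController(i))
            SDL_GameControllerOpen(i);
    }

    /* Approach 2: keep checking during runtime -- react to hot-plug events in the game loop. */
    SDL_Event e;
    int running = 1;
    while (running) {
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_CONTROLLERDEVICEADDED) {
                SDL_GameControllerOpen(e.cdevice.which);        /* device index */
            } else if (e.type == SDL_CONTROLLERDEVICEREMOVED) {
                SDL_GameControllerClose(
                    SDL_GameControllerFromInstanceID(e.cdevice.which)); /* instance id */
            } else if (e.type == SDL_QUIT) {
                running = 0;
            }
        }
        SDL_Delay(16); /* stand-in for the rest of the frame */
    }

    SDL_Quit();
    return 0;
}
```

Games that only do the startup loop behave like the first case; games that also handle the device-added/removed events pick up a controller you plug in mid-session.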