misk to Technology@lemmy.world · English · 5 months ago
We have to stop ignoring AI's hallucination problem (www.theverge.com) · 208 comments
Cyberflunk · 5 months ago
Wtf are you even talking about.

UnsavoryMollusk · 5 months ago (edited)
They are right, though. LLMs, at their core, just determine what is statistically most probable to spit out next.

Cyberflunk · 5 months ago
Your one sentence makes more sense than the slop above.
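UnsavoryMollusk's point, that an LLM's core step is picking the statistically most probable next token, can be sketched with a toy example. The probability table below is invented for illustration; a real model computes these probabilities from learned weights over a large vocabulary:

```python
# Toy sketch of greedy next-token selection.
# The probabilities here are made up for illustration; an actual LLM
# derives them from its learned parameters given the preceding context.
probs = {"cat": 0.05, "mat": 0.62, "hat": 0.33}  # P(next token | "the cat sat on the")

# Greedy decoding: emit whichever token has the highest probability.
next_token = max(probs, key=probs.get)
print(next_token)  # -> mat
```

In practice, generators often sample from this distribution (with temperature, top-k, or top-p filtering) instead of always taking the maximum, but the underlying object is still a probability distribution over possible next tokens.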