
Talk:hallucinate


AI definition wrong


The current AI def reads "To produce factually invalid information". That's... not what hallucinating is. If you train a model on a moon-deniers' forum and it outputs stuff about the moon being fake, that's not hallucinating, that's just it accurately reproducing what it learned. Hallucinating is when it makes things up because it doesn't know what else to say. Whereas if you trained a model on exclusively English text and gave it a prompt in Russian, and it produced a bunch of random Cyrillic characters because it has no experience with Russian, that would be hallucinating, even though a string of random Cyrillic characters is not "factually invalid information". Simplificationalizer (talk) 15:19, 19 May 2024 (UTC)

You're right; it has to be relative to the training data. I've changed the definition. Jberkel 20:40, 19 May 2024 (UTC)