AI can "hallucinate".
There's a news article about someone prompting GPT about a "famous Belgian chemist and philosopher called x". The thing is, this person never existed - he was made up. The prompt asked about his life history, and GPT invented a full and detailed answer.
And the kicker is that none of the AI experts know how this happens. They don't know where the information came from, because it doesn't exist in the model's training data.
AIs can often become rude and aggressive when they are challenged or told they are wrong, or when someone attempts a jailbreak. Won't it be fun when we have robots like this roaming the streets?