I was reminded how badly ChatGPT hallucinates this morning when I used it to look up the 7-point measurement locations for skinfold calipers to estimate body fat.
It correctly described the locations in text, then it offered to provide a diagram.
I said “sure”, and it generated an image placing the chest site on the neck, along with clearly incorrect locations for most of the other measurement sites.
It’s gotten better. But it’s still bad.