Hi Gordon
Yes, ChatGPT does tend to hallucinate quite a bit. I discovered this when I tried to develop a Bridge (card game) app. It simply got some answers wrong. I decided that it didn't know how to play bridge (yet)!

Interestingly, in this exercise, it did say "Unknown" when it did not know an answer (for example, on my own original songs). It does seem to have done a fairly good job, and I can sometimes rely on people to give me corrections where it is wrong.
LyricLab – Where words become music https://www.lyriclab.net/