Quote:
So what is with the AI instructions which are so off the mark? Is it just guessing? How can it be so intelligent and be so wrong?


The answer was wrong but actually not too far off. In the Notation Window, you DO get the "Insert and edit Section Text" context menu item, and it works exactly as it described. It was wrong about the context menu item being in the Chord Sheet, although it WOULD make sense for us to have a menu item for the section text layer in the Chord Sheet's context menu. It's not clear why it said "depending on the software," since it should 'know' that it's answering the question in the context of Band-in-a-Box.

To answer your question, it can be wrong for the same reasons that ChatGPT is often wrong, since it uses language models developed by OpenAI. All of this technology currently suffers from the significant and poorly understood problem of hallucinations - sometimes generating false or nonsensical information that doesn't seem to fit the data it was fed. You will have encountered this if you use ChatGPT for factual information (as opposed to, say, creative writing) - answers that sound extremely confident but are inaccurate or even completely bogus. (As an aside, if you think you haven't encountered this, just make sure to verify things you know nothing about with other sources...)

We can improve the quality and amount of data it has about Band-in-a-Box, which should help reduce hallucinations. I asked it a few random questions about Band-in-a-Box and it did a decent job. I find it can answer some questions correctly even when the answer is not in the data. For example, it somehow generated an answer on entering a blues progression (not a perfect answer, but a correct one), even though that specific information is not in the docs.


Andrew
PG Music Inc.