Are not both using a form of AI?
I think they're probably not, though there are all sorts of philosophical possibilities.
I think "AI" means software that learns for itself from exposure to data, and forms an answer by searching for patterns in that data that suggest some consistency from which to draw a conclusion. The obvious problem is that the data has to be sound and sensible for that to work.
I think BiaB is a set of rules, presumably including some random seeding, created by people about what is expected of music production, and that it probably does not learn for itself.
The AI will be influenced by people, though: by the choices of data on which it is 'trained', by public opinion, by mischievous influence, and by how well it understands(?) cause and effect.
I read recently, though I haven't bothered to verify it, that (ChatGPT?) was arguing very assertively that the current year is 2022, and was accusing its questioner of bad faith for insisting it was wrong and that the year was 2023. It had had a year of understanding that the year is 2022, and only a few weeks at most of people telling it otherwise. How long will it take to recognise that things change and that it must unlearn certain facts that have become obsolete? Will it be able to tell that from deliberate lies?
That's really quite tricky.