Software has always been stumped by the "do what I'm thinking" challenge.

+1 Smarter song search, including by instrument.
+1 Deeper metadata and improved classification in the library.
+1 Multi-column sorting (let the human find it; see the sketch after this list).
+1 Pattern matching à la Toontrack's Bandmate and Tap2Find.
+1 Star similarity maps (just another way to zoom in).
+1 AI search (if PGM can train on the help file, they can also train on their metadata library). Note: this will only improve things if the data is there; machine learning only learns from its source.

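For the multi-column sorting idea above, here is a minimal sketch in Python of stacking sort keys so the human can drill in. The field names (genre, feel, bpm, instrument) are made-up assumptions for illustration, not PGM's actual library schema:

# A minimal sketch of multi-column sorting over hypothetical loop metadata.
# Field names are assumptions for illustration only.
loops = [
    {"name": "ShuffleGroove01", "genre": "Blues", "feel": "Shuffle",  "bpm": 92,  "instrument": "Drums"},
    {"name": "FunkBass07",      "genre": "Funk",  "feel": "Straight", "bpm": 104, "instrument": "Bass"},
    {"name": "ShuffleGroove02", "genre": "Blues", "feel": "Shuffle",  "bpm": 120, "instrument": "Drums"},
]

# Sort by genre, then feel, then BPM: stacking sort keys instead of
# hunting through one flat column is what "let the human find it" means.
loops.sort(key=lambda r: (r["genre"], r["feel"], r["bpm"]))

for r in loops:
    print(r["genre"], r["feel"], r["bpm"], r["name"])
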
Not brought up here, but PGM did take a small step in the right direction in 2024: the filter drop-downs now include style alongside the Feel, BPM, and time-signature fields. They just failed to make it the default.

The root of the issue is good metadata (data about data).
Odds are "do what I'm thinking" has something to do with genre, feel, time signature, BPM, note duration, and instrument, so make those attributes the default filters and make it easy to zoom in or out of them. "Best substitution" is also an attempt in this space.
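A minimal sketch of that zooming idea, assuming a hypothetical record shape with those attributes (the field names and the matches() helper are illustrative, not PGM's API): every attribute filters by default, and widening the BPM tolerance zooms back out.

# Sketch only: illustrative field names, not a real library schema.
def matches(record, genre=None, feel=None, signature=None,
            bpm=None, bpm_tol=0, instrument=None):
    """Return True if the record passes every filter the caller supplied."""
    if genre and record["genre"] != genre:
        return False
    if feel and record["feel"] != feel:
        return False
    if signature and record["signature"] != signature:
        return False
    if instrument and record["instrument"] != instrument:
        return False
    if bpm is not None and abs(record["bpm"] - bpm) > bpm_tol:
        return False
    return True

library = [
    {"name": "SwingKit03", "genre": "Jazz", "feel": "Swing",
     "signature": "4/4", "bpm": 118, "instrument": "Drums"},
    {"name": "WaltzPad01", "genre": "Pop",  "feel": "Straight",
     "signature": "3/4", "bpm": 90,  "instrument": "Keys"},
]

# Zoomed in: exact feel, narrow BPM window.
hits = [r for r in library if matches(r, feel="Swing", bpm=120, bpm_tol=5)]
print([r["name"] for r in hits])  # ['SwingKit03']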


Studio One (latest version), Win 11 23H2, i9-10940X 3.3 GHz, 32 GB RAM, a 4K 40" monitor, PreSonus StudioLive Series III console as interface/controller. Secondary testing on Reaper, Cakewalk, and S1 on a Surface Pro 3 running Win 10 (all latest versions).