For me, it's the processes I've begun using that I simply don't want to give up.
It's like automobiles. Nearly every vehicle in showrooms today has cruise control, power locks, power windows, automatic transmissions (at least here in the US), CD players, and so on.
These features aren't central to what an automobile is, but people have become willing to pay for them because they're conveniences. Arguably, none of them makes the 'automobile' aspect of transportation any 'better'.
Here are the processes and functions that caused me to leave years ago:
ASIO use for playthrough of VSTi - Once I got this bug, I couldn't let go. I think it's now a feature.
Tempo integration of VSTi - for the only sound-generating plugin I've ever purchased: Jamstix. Without this feature, Jamstix is an expensive drum-sample playback plugin. With it, it's a decent substitute for a drummer.
Tempo integration for effect plugins. I only use delays and autofilters that accept note values, instead of my having to calculate delay times in ms and so forth. If I decide to change the tempo, it's just one less thing to keep track of (like shifting in a car).
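For anyone still dialing in delay times by hand, the conversion a tempo-synced plugin does under the hood is simple arithmetic; here's a minimal sketch (the function name is my own, not from any particular plugin):

```python
def note_to_ms(bpm: float, note: float, dotted: bool = False) -> float:
    """Delay time in ms for a note value given as a fraction of a whole note."""
    ms_per_whole = 4 * 60000.0 / bpm   # a whole note is four quarter-note beats
    ms = ms_per_whole * note
    return ms * 1.5 if dotted else ms  # a dot adds half the note's duration

# At 120 BPM: a quarter note is 500 ms, a dotted eighth is 375 ms.
print(note_to_ms(120, 1/4))               # 500.0
print(note_to_ms(120, 1/8, dotted=True))  # 375.0
```

Change the session to 90 BPM and the same dotted-eighth delay becomes 500 ms - exactly the recalculation a synced plugin saves you from doing.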
Flexible signal routing that is more 'modular' in nature. I have spent the past few days babysitting my DAW, EQ'ing some CDs for a friend who has significant hearing loss in his left ear. I built an 'equipment rack' for each ear: a gain-reduction plugin on each channel, followed by a chain of 4 linear-phase graphic EQs. The 4 EQs on the left side form a cumulative EQ that basically 'reverses' my friend's left-to-right hearing sensitivity. I had to put roughly 30 dB of gain reduction first on each side, because at some frequencies his loss amounts to a 30 dB difference between the left and right ears. I did all of this 'graphically' - I wired the whole thing together with virtual patch cords. On the right side I also put 4 EQs in series, but with no gain adjustments - they are there strictly to match the delay introduced by each stage of the left ear's chain. The end result sounds terrible to me, but to him it brings tears of joy to his eyes - literally.
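To make the arithmetic behind that rack concrete, here's a rough sketch with made-up numbers (the loss figures are hypothetical, not my friend's actual audiogram). It splits the compensation curve across the 4 cumulative EQ stages and derives the safety pad from the worst-case boost:

```python
# Hypothetical left-ear loss relative to the right ear, in dB per band.
loss_db = {250: 5, 500: 12, 1000: 20, 2000: 30, 4000: 25, 8000: 15}

STAGES = 4  # four linear-phase graphic EQs chained in series per ear

# Split each band's total boost evenly across the 4 cumulative EQ stages,
# since one graphic EQ alone may not offer 30 dB of range.
per_stage_db = {freq: gain / STAGES for freq, gain in loss_db.items()}

# Pad both channels up front by the worst-case boost so the compensated
# left channel can't clip; the right channel gets the same pad (and four
# flat EQs) purely to stay level- and latency-matched with the left.
pad_db = -max(loss_db.values())

print(pad_db)              # -30
print(per_stage_db[2000])  # 7.5 dB per stage at 2 kHz
```

The point isn't these particular numbers - it's that this kind of per-channel chain is trivial to wire up graphically in a modular router and painful anywhere else.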
What I miss by not using RB/BIAB:
1. Notation - the DAW I use has no notation to speak of. Since I can really only read a single melody line these days, it doesn't take me too far out of my MO.
2. Auto-orchestration (this is NOT a standard feature in any other software). I bought Jamstix to cover drums, which is pretty much the only instrument I typically use that I can't play serviceably. If I used horns regularly, I would be much closer to pulling the trigger on RB/BIAB. It's probably a chicken-or-the-egg thing, but I've never composed with horn parts in mind, likely because I have no idea how to write for them.
Anyway, those are my reasons.