I think 16 only seems magic because it comes so early in the sequence of powers of two.

This is for anyone interested, not necessarily Rustyspoon. Let me give an example of why this used to be extremely important and no longer is. Does everyone remember the fear when the date rolled over to 2000? The Y2K problem was that some programmers had used only two digits to store a year (like 88 or 95 for 1988 or 1995). In the early days of coding, every byte (character) mattered when designing files. Storage devices were small and expensive. No one wanted to allocate four bytes to a field when two would do, or so they thought. Many of us teaching programming foresaw the problem in the early 1990s and were hired to rewrite code to avoid it. Thus 2000 came and went smoothly.
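For the curious, here is a minimal sketch in C of how the trap worked. None of this is anyone's actual production code; the record layout and field names are made up purely to show the hidden "19xx" assumption that two-digit years baked in:

```c
#include <stdio.h>

/* Hypothetical record from that era: the year is stored as two decimal
   digits to save space, and the code quietly assumes the century is 19xx. */
struct record {
    unsigned char yy;   /* e.g. 88 means 1988, 95 means 1995 */
};

/* Elapsed-years calculation that breaks once a stored year is really 20xx. */
static int years_since(struct record r, int current_year)
{
    int full_year = 1900 + r.yy;        /* the hidden Y2K assumption */
    return current_year - full_year;
}

int main(void)
{
    struct record born = { 65 };        /* 1965 */
    printf("In 1999: %d\n", years_since(born, 1999));   /* 34, correct */

    struct record opened = { 0 };       /* an account opened in 2000, stored as 00 */
    printf("Age in 2001: %d\n", years_since(opened, 2001));
    /* prints 101 instead of 1, because 00 is read back as 1900 */
    return 0;
}
```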

Why is this story relevant here? Because BIAB was designed before the early 1990s. Every byte that had to be stored on a hard drive (or, earlier than that, a cassette tape) mattered. Thus we have things like the 255 measure limit: 255 is the largest value that fits in a single byte. Now we have huge, relatively inexpensive storage devices that place almost no constraints on coding. Allocating sufficient space for future development would require a rewrite of BIAB. It's not a unique problem.
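To make the 255 figure concrete, here is a small C sketch. I have no knowledge of BIAB's real file format; the struct and field names below are invented just to illustrate why a one-byte field tops out at 255 and why widening it ripples through everything that reads or writes the file:

```c
#include <stdio.h>

/* Hypothetical one-byte measure counter, NOT BIAB's actual format. */
struct song_header {
    unsigned char num_measures;  /* 1 byte: values 0..255, nothing larger fits */
};

int main(void)
{
    struct song_header h = { 255 };
    printf("Measures: %u\n", h.num_measures);   /* 255, the ceiling */

    h.num_measures += 1;                        /* try to add one more measure */
    printf("Measures: %u\n", h.num_measures);   /* wraps around to 0 */

    /* Widening the field to two bytes (65,535) or four bytes (4+ billion)
       removes the ceiling, but every existing file and every routine that
       reads or writes this byte would have to change along with it. */
    return 0;
}
```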


BIAB 2025 Win Audiophile. Software: Studio One 7 Pro, Swam horns, Acoustica-7, Notion 6, Song Master Pro, Win 11 Home. Hardware: Intel i9, 32 GB; Roland Integra-7, Presonus 192 & Faderport 8, Royer 121, Adam Sub8 & Neumann 120 monitors.