Nowadays most programmers don’t need to care about working with bits directly. And in general, it’s much better if the system is aligned with literally every other measurement unit in meaning. I also think it’s often deceiving precisely because 1024 is so close to 1000 that you just treat it as if it were 1000, until it actually starts making a difference at larger scales.
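For concreteness, here’s a quick sketch (in C, since the thread brings up uint16_t/int32_t; the formatting and labels are my own) of how fast that gap actually grows with scale:

```c
#include <stdio.h>

int main(void) {
    const char *pairs[] = {"KiB vs kB", "MiB vs MB", "GiB vs GB",
                           "TiB vs TB", "PiB vs PB"};
    double binary = 1.0, decimal = 1.0;
    for (int k = 0; k < 5; k++) {
        binary  *= 1024.0;  /* 2^(10*(k+1)) bytes */
        decimal *= 1000.0;  /* 10^(3*(k+1)) bytes */
        printf("%-10s  gap: %.1f%%\n", pairs[k],
               (binary / decimal - 1.0) * 100.0);
    }
    return 0;
}
```

That prints roughly 2.4%, 4.9%, 7.4%, 10.0%, 12.6%: negligible at kilo scale, but a “1 TB” drive reported in TiB already looks about 10% smaller than the label.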
I think that for most people, in 99% of use cases, it would be better for MB to actually mean mega (10^6), and for the remaining 1% you can clarify with MiB that you mean 1024.
I suppose if you’re staying at a high level like JS, yeah, but if you’re sitting there defining the width of your types with stuff like uint16_t or int32_t, you probably want to be using the binary (1024-based) system.
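To illustrate that point (my own sketch, not anything from the comment above): when sizes come from type widths or page sizes, the natural numbers are powers of two, so binary prefixes describe them exactly while decimal ones don’t.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Everything a uint16_t can index: 2^16 bytes. */
    size_t table = (size_t)UINT16_MAX + 1;
    printf("uint16_t-indexed table: %zu bytes = %zu KiB (= 65.536 kB)\n",
           table, table / 1024);

    /* 256 pages of 4096 bytes each. */
    size_t buf = 256 * 4096;
    printf("256 pages: %zu bytes = exactly 1 MiB, but 1.048576 \"MB\"\n", buf);
    return 0;
}
```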