Brandon's Blog

6/17/2013

Servant to the Fuzz

There was a pointless discussion on Reddit regarding a TV whose volume level setting was a scale between zero and 63.  I hadn’t made the connection, but that implies at the bit level they optimized their setting value into six bits (2^6 = 64, the number of settings from zero to 63).  Maybe there’s another bit or two used for indicating mute and/or speaker disabling or something like that to use up the rest of the 8-bit byte, but regardless, from a technical standpoint it at least makes some kind of sense.

One usability person chimed in that a well-designed application would hide such details from the user, instead only showing a graphical indicator.  Essentially, if you can’t influence the scale of the number, just hide the details.

Of course, if you’re a little neurotic and like to know the number, this would probably drive you nuts.  I have the typical levels (at least ranges) memorized for our AV receiver, for example, and would be very irritated to have to think spatially to assess where I am in those ranges.

Given the kind of work I do right now, I just thought, why in the world wouldn’t you just multiply the 63-based number by 100/63 to scale it up to a 100-based number?  The only reason we choose 100 is because of the “natural” concept of percentages, so why not just lie to the user and pretend you’re storing it as such?
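As a minimal sketch of what I mean (the function names are mine, purely illustrative): you keep storing the 6-bit value, and just rescale it on the way in and out of the UI.

```python
# Store volume as a 6-bit value (0-63) internally, but show the
# user a familiar 0-100 scale. Names here are hypothetical.

def to_display(raw):
    """Map a raw 0-63 setting onto a 0-100 scale for display."""
    return round(raw * 100 / 63)

def to_raw(display):
    """Map a user-facing 0-100 value back to the stored 0-63 setting."""
    return round(display * 63 / 100)
```

Because 100/63 is bigger than 1, every one of the 64 raw settings maps to a distinct display value, so the round trip through the percentage scale loses nothing.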

It was just interesting to me that a reasonable sample of a technical crowd didn’t think to “misrepresent” the number; the only acceptable solution was to obfuscate, not transform.