and I'll teach you, teach you, teach you, I'll teach you the numeric slide

Sliding numeric scales should pretty much never show up outside of color pickers, volume controls, and other things (a) so closely linked to sensory input that people get emotional and (b) so instantaneous in feedback that binary search is possible without taking notes. Not that I write code with UIs, but this applies just as well to bullshit command-line arguments like gzip's compression-vs-time level, oggenc's quality-vs-space level, mpd/pulseaudio's sampling-vs-cpu level, and gcc's optimization-vs-everything turkey shoot of -O and -f options.
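The gzip slider shows up in library form too; a minimal sketch using Python's stdlib zlib, which exposes the same 1..9 DEFLATE levels gzip does (the payload and the levels sampled here are my choices, not anything canonical):

```python
import time
import zlib

# Highly redundant input, so the level actually has something to chew on.
payload = b"the quick brown fox jumps over the lazy dog\n" * 20000

# Sample a few points on gzip's compression-vs-time scale.
for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(payload, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):7d} bytes in {elapsed:.4f}s")
```

The point of the rant, of course, is that nothing about the number 6 tells you what you're trading away; you only learn the shape of the scale by running it.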

If gcc had broken tradition with past compilers (Cray and SGI compilers I've known supported a gamut of -O levels beyond -O3, and surely a surfeit of others as well), and called...actually, I guess I give compilers a pass here: you don't want to throw crap like --optimize-space into a compilation line, due to iterated consumption of terminal real estate during a build. Exactly as -Os (optimize for space on gcc) indicates, you'd end up with a series of difficult-to-remember flags, à la the -Q options on Intel's icc C++ compiler (then again, how often need you remember optimization flags outside makefile creation?). I think, however, that the generated-code-speed vs. compilation-time tradeoff will slide (has already slid?) in importance, and the competing axes of compilation decisions -- power-vs-performance, codespace-vs-codecycles, parallelism-vs-fastserial, all that -- will render a single vector meaningless. At that point, you'll need to rely even more on descriptions of processing environments and architectures, and almost certainly parameterize on them at runtime.

Let's take the optimization-level decision (debugging aids aside) out of the hands of the code's creators and hand it to the code's executors. It'd be a win all around.
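The runtime-parameterization idea can be sketched crudely: instead of baking one point on the slider into the binary, let the running program inspect its environment and pick an implementation. The kernels and the selection rule below are hypothetical stand-ins, not any real dispatch scheme:

```python
import os

def serial_sum(xs):
    # stand-in for code tuned for fast serial execution
    total = 0
    for x in xs:
        total += x
    return total

def chunked_sum(xs):
    # stand-in for a parallel/vectorized variant of the same kernel
    return sum(xs)

def pick_kernel():
    # crude runtime dispatch on the processing environment: prefer the
    # "parallel" variant when we actually land on a multicore machine
    cores = os.cpu_count() or 1
    return chunked_sum if cores > 1 else serial_sum

kernel = pick_kernel()
print(kernel.__name__, kernel(range(100)))
```

Real-world analogues exist at the compiler level too -- gcc's function multiversioning resolves among per-architecture clones at load time -- but the decision still happens on the executor's machine, which is the point.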

Oh, to complete an earlier thought: if -O0..-O9 had never been introduced, I wouldn't have to listen to someone say "I think it ran faster with -O4 than -O2", and tell them "uhhh, gcc on x86 has never meaningfully supported an optimization level higher than -O3." This happens every few years, and it typically results in a bad scene. I'd like to bring out rms à la Annie Hall: "I heard, I heard what you were saying. You, you know nothing of my work. How you ever were hired to write any kind of code is totally amazing." Tyro assholes!


  1. sliding scales are also rather nonfunctional for sensory intensity outside of specific ranges. It is fairly well assumed that sensory inputs follow a saturating x/(1+x) sort of function, which you may recognize from enzyme kinetics. This takes into account thresholds (the tail of the function) and the fact that 10x "FUCKING LOUD AS HELL" is still "FUCKING LOUD AS HELL" (shoulders). Search around for "Beidler's Taste Equation."

    This makes sense especially in taste, because you actually are dealing with receptors, although sound tends to work the same way (hence dB being more useful than absolute amplitude).
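    The threshold-and-shoulder shape is easy to see numerically; a sketch of the saturating C/(C+K) form (K here is an arbitrary half-saturation constant, not a measured one):

    ```python
    # Beidler-style saturating response R/Rmax = C / (C + K): small stimuli
    # sit in the threshold tail, and past the shoulder a 10x stronger input
    # barely moves the response.
    K = 1.0  # arbitrary half-saturation constant

    def response(c):
        return c / (c + K)

    for c in (0.01, 0.1, 1, 10, 100, 1000):
        print(f"C = {c:7.2f}  R/Rmax = {response(c):.4f}")
    ```

    With K = 1, going from C = 100 to C = 1000 moves R/Rmax by less than one percent -- ten times as loud, same "FUCKING LOUD AS HELL".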

  2. hah, yeah it would be nice to have the appropriate Marshall McLuhan on-call for any situation