Another response that outgrew G+ and its failings for long form and references (which is all I’ve thought to post here recently; I need to see to that and get a couple of other things written and posted). Also, a great outpouring of pessimism about the tech industry, but that isn’t exactly new.
My feeds brought me Joshua Barczak’s Stop Misquoting Donald Knuth! yesterday. I generally agree with the idea [that we need to think a great deal more about efficiency when developing software], though the particular presentation is a narrow and mediocre version of the argument. Niklaus Wirth made the cleanest, classic version of the argument in 1995 with Wirth’s law, which gets restated and updated for current trends by a major figure every couple of years as the underlying phenomenon keeps doing an extremely expensive job of making technology miserable for everyone.
I’ll argue something even deeper, though. The more fundamental problem is that computing got way too “mainstream” (mostly in terms of penetration) way too fast, and continues on that unhealthy vector. Neither the technology, the methods, nor the society we unleashed ourselves on was really ready. That’s a large part of why we have goldrush mechanics in the tech industry (I’m mostly referring to social features here), which are a source of all kinds of problems. It’s also a major reason why, as a matter of product cycle and methodology, we’re building dangerously crappy products instead of technologies.

Peter Sewell’s recent talk at 31c3, Why are computers so @#!*, and what can we do about it?, is one of the better presentations on the methodology matter, which is largely that we’re not building on the shoulders of giants, we’re building on top of a garbage pile.

I always find the arguments for verifiable languages an irritating combination of deeply desirable and utterly naive; anyone who has ever fought one of the more verifiable languages to actually do something useful probably knows this tension – the Mesa/Cedar/Modula family are probably the least miserable options, and no one has used them for anything of substance in decades. Ada even introduced whole new classes of interesting bugs as a result of its so-called reliability features (What do you mean you initialized my IO registers to 0 when we entered the function scope? The peripheral is now on fire.). I’m particularly disgusted with the recent-ish move toward even less-disciplining languages, most of which don’t even have a specification, or have an ignored, post-hoc one if they do. The same batch of languages has made us so accustomed to terrible performance overhead and performance opacity that the overhead introduced by safety and verifiability now seems reasonable; I mean that in the least complimentary light possible.
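To make the IO-register complaint concrete, here’s a rough C sketch of the general hazard (not Ada, and the UART layout, address, and register names are invented for illustration): blanket “safe” initialization is exactly the wrong thing to do to memory-mapped hardware, because every register write is a bus transaction with side effects.

```c
#include <stdint.h>

/* Invented example: a memory-mapped UART at a made-up address.  Every field
 * is a hardware register, so every write changes hardware state. */
typedef struct {
    volatile uint32_t ctrl;  /* enable bits, modes */
    volatile uint32_t baud;  /* baud-rate divisor */
    volatile uint32_t data;  /* TX/RX FIFO */
} uart_regs;

#define UART0 ((uart_regs *)0x40001000u)

void bring_up_uart(void)
{
    /* What you actually want: touch only the registers you mean to change,
     * in the order the hardware expects. */
    UART0->baud = 26;     /* divisor for some assumed clock */
    UART0->ctrl |= 1u;    /* set the enable bit, leave everything else alone */
}

/* The “reliability feature” in the anecdote amounts to the language deciding,
 * on scope entry, to do the moral equivalent of
 *
 *     memset((void *)UART0, 0, sizeof *UART0);
 *
 * before your code ever runs: spurious register writes that reconfigure,
 * disable, or otherwise upset the peripheral in the name of never having an
 * uninitialized variable. */
```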
There are some related phenomena: it is my feeling that the “too much, too soon” problem also ties deeply into the distorted ideas about usability that crawled out of the early ’80s and got into everything, and into some scary thoughts about professionalism in computing – not in the awful “businesslike” sense, but in the sense of respect for something that is sincerely hard to do well. That case can be made largely by analogy to illuminating historical parallels with what happened to teaching in the U.S. during the early public-education era.
… And this is a large part of why I have a pile of degrees in computing disciplines and contempt for the industry.
I happened to notice, while booting my Raspberry Pi for the first time, how shite modern GTK apps are at rendering. Little if anything is done to reduce the number of draw calls, so on the Pi’s weak CPU, with no 2D acceleration in its X driver, even a basic Gtk2 app is intolerably slow to use. I suppose something could be done about this, but X is due to be replaced by Wayland, which moves rendering into the clients and does away with server-side draw calls entirely.
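To make the batching point concrete, here’s a rough cairo sketch; the function name, the hundred-rectangle workload, and the assumption that the toolkit hands you a target context (as gdk_cairo_create() would in a Gtk2 expose handler) are all illustrative. The idea is simply to do the fiddly drawing against a client-side image surface and push it to the window in one operation instead of a pile of individual calls.

```c
#include <cairo.h>

/* Render everything into an off-screen image surface, then paint it to the
 * window's context once.  Under X without 2D acceleration, the naive
 * alternative -- issuing each small drawing operation against the window --
 * turns into far more protocol traffic and software fallback work. */
static void draw_batched(cairo_t *target_cr, int width, int height)
{
    cairo_surface_t *buf = cairo_image_surface_create(CAIRO_FORMAT_ARGB32,
                                                      width, height);
    cairo_t *cr = cairo_create(buf);

    /* All the per-element drawing happens against local memory. */
    for (int i = 0; i < 100; i++) {
        cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
        cairo_rectangle(cr, (i % 10) * 12.0, (i / 10) * 12.0, 10.0, 10.0);
        cairo_fill(cr);
    }
    cairo_destroy(cr);

    /* One transfer to the window instead of a hundred separate draws. */
    cairo_set_source_surface(target_cr, buf, 0, 0);
    cairo_paint(target_cr);
    cairo_surface_destroy(buf);
}
```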