I bought myself a TP-Link Archer C7 because the 2.4 GHz congestion in my apartment has become so terrible that my good old TP-Link WR1043ND (no 5 GHz radio) is no longer adequate, and the C7 was very well spoken of among reasonably-priced 802.11ac routers. It also has some nice perks like two on-board USB ports, so I can use it as a print server (with p910nd) and have USB storage for logs (vnstat & co.) and such attached without a separate hub. I wasn’t feeling quite motivated enough to buy and set up one of the NUC-like cheap SFF Intel boxes as a router, which Ars Technica and Jeff Atwood have recently noted is an increasingly good plan, based largely on the dearth of 802.11ac WiFi cards that work reliably in host mode.
Some notes that may be of use to others, particularly about firmware replacement on recent models and throughput:
I gave an informal talk for the IEEE student branch about breaking into your own devices this evening. I did the low-postable-content notes with live examples and links thing, but at least one person wanted to watch the video links, so here are the notes. There is something delightful about giving talks that require legal disclaimers. I don’t think there is anything in here that will get me in trouble…
A generic version of the email you will receive on a daily basis from the University of BS (which is probably the school you deal with), as you will read it after the first few repetitions. Graduates can relive their college experience, or, for current students, simply stop checking your email and skim this page every day.
From: Dean of Posterior Coverage <email@example.com>
Subject: Mandatory CYA Training
Body: All students need to take this course that the university paid a fortune to a third-party ed-tech carpetbagger to license, which provides the absolute minimum coverage of an issue required under a new federal regulation. Everyone must take it, because otherwise we might be liable for your behavior.
This is why tuition is so high.
My 5-year-old T510 has been showing its age, mostly by falling apart (speakers died, parts held on with gaffers tape, occasional probably-thermal GPU lockups, etc.) and I decided it was time for a replacement. Unfortunately, the laptop market right now has absolutely nothing appealing (clickpads everywhere!) so I gave up and bought a closest-match Clevo P650SE chassis that I could kit out myself. There are a couple annoyances, but overall it’s a nice machine. Really long detail notes including a bunch of Linux tweaking below.
I recently picked up a USB RFID reader/writer pod to play with, partly to learn enough to be dangerous about the tech, and partly hoping to tamper with the RFIDs in the current university ID cards. I’m pretty sure I failed on the latter point, but am succeeding at the former in the process.
Notes from the first round of fiddling with it follow.
MyTouch 4G Slide vs. SGS5, flat dimensions.
I got a Samsung Galaxy S5 (the T-Mobile flavor, SM-G900T) a week ago. I’m pretty pleased with it overall, but many of the impressions worth sharing are not positive, particularly of the small-but-stupid variety. Much of the animus (and credit) is really for Google, not Samsung.
I’ve been playing with chorded input devices for years, and got the itch again recently.
The ancient FreeBSD/i386 box that was running cgi.aggregate.org finally keeled over. I just blew my afternoon/evening scraping the data off of it, because while the most critical stuff was backed up, not everything was. Some keyword- and error-string-filled notes that will hopefully save others swearing and googling:
I gave a quick primer on 3D printing as a Keeping Current (CS departmental) seminar yesterday.
It’s not a great standalone deck, but I posted it anyway (in reduced quality; gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -dQUIET -dBATCH -sOutputFile=/path/to/output.pdf /path/to/input.pdf is good magic).
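Since that incantation is a pain to retype, here is a hedged sketch of it wrapped as a tiny shell function (shrink_pdf and the preset argument are my names for illustration, not from the post; the preset descriptions are Ghostscript’s documented -dPDFSETTINGS behavior):

```shell
# Wrap the Ghostscript recompression command in a reusable function.
# -dPDFSETTINGS presets: /screen (~72 dpi images, smallest),
# /ebook (~150 dpi, the one used above), /printer (~300 dpi),
# /prepress (~300 dpi, color-preserving).
shrink_pdf() {
  in="$1"
  out="$2"
  preset="${3:-/ebook}"   # default to the /ebook preset from the post
  gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
     -dPDFSETTINGS="$preset" \
     -dNOPAUSE -dQUIET -dBATCH \
     -sOutputFile="$out" "$in"
}

# Usage: shrink_pdf slides.pdf slides-small.pdf /screen
```

Dropping to /screen shrinks further at the cost of legibility on anything bigger than a phone; /ebook is usually the sweet spot for posted slide decks.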
It seemed to be well received: printed objects were played with, thoughtful questions were asked, and I think this is the first time I’ve given a 3D printing primer to a wider group and not been asked the “guns” question.
Another response that outgrew G+ and its failings for long form and references (which is all I’ve thought to post here recently: I need to see to that and get a couple of other things written and posted). Also, a great outpouring of pessimism about the tech industry, but that isn’t exactly new.
My feeds brought me Joshua Barczak’s Stop Misquoting Donald Knuth! yesterday. I generally agree with the idea [that we need to think a great deal more about efficiency when developing software], though the particular presentation is a narrow and mediocre version of the argument. Niklaus Wirth made the cleanest, classic version of the argument in 1995 with Wirth’s law, which gets restated and updated for current trends by a major figure every couple of years as software bloat continues to do an extremely expensive job of making technology miserable for everyone.
I’ll argue even deeper, though. The more fundamental problem is that computing got way too “mainstream” (mostly in terms of penetration) way too fast, and continues on that unhealthy vector. Neither the technology, the methods, nor the society we unleashed ourselves on was really ready. That’s a large part of why we have goldrush mechanics (I’m mostly referring to social features here) in the tech industry, which is a source of all kinds of problems.

It’s also a major reason why we’re building dangerously crappy products instead of technologies as a matter of product cycle and methodology. Peter Sewell’s recent talk at 31c3, Why are computers so @#!*, and what can we do about it?, is one of the better presentations on the methodology matter, which is largely that we’re not building on the shoulders of giants, we’re building on top of a garbage pile.

I always find the arguments for verifiable languages an irritating combination of deeply desirable and utterly naive: anyone who has ever fought one of the more verifiable languages to actually do something useful probably knows this inconsistency – the Mesa/Cedar/Modula family are probably the least miserable options, and no one has used them for anything of substance in decades. Ada even introduced whole new classes of interesting bugs as a result of its so-called reliability features (what do you mean you initialized my IO registers to 0 when we entered the function scope? The peripheral is now on fire.). I’m particularly disgusted with the recent-ish move toward even less disciplining languages, most of which don’t even have a specification, or have an ignored, post-hoc one if they do. That same batch of languages has made us so accustomed to terrible performance overhead and performance opacity that the overhead introduced by safety and verifiability now seems reasonable; I mean that in the least complimentary light possible.
There are some related phenomena: it is my feeling that the “too much, too soon” problem also ties deeply into the distorted ideas about usability that crawled out of the early ’80s and got into everything, and into some scary thoughts about professionalism in computing – not in the awful “businesslike” sense, but in the sense of respect for something that is sincerely hard to do well, a point that can be made largely by analogy to illuminating historical parallels with what happened to teaching in the U.S. in the early public-education era.
… And this is a large part of why I have a pile of degrees in computing disciplines and contempt for the industry.