When I was in high school, I used to amuse myself by multiplying four-digit numbers in my head. It was a point of pride that I had memorized all of my friends’ phone numbers. I even worked out my own (possibly incorrect) version of the Doomsday rule (no, no, of course there was no good reason to do this). And then, one day, I bought a Casio calculator watch, with the ability to store up to 50 phone numbers. There was no longer a need to multiply numbers in my head, or to memorize phone numbers, and those skills disappeared.
When I was in college, I spent thousands of hours studying Japanese vocabulary. I repeatedly (much to my roommate’s annoyance – sorry, Gerg) wrote out Chinese characters, said the Japanese and/or Chinese pronunciation, then the English translation. Scribble scribble scribble. “Zhang”. “Paper shaped thing”. Scribble scribble scribble. “Zhang”. “Paper shaped thing”. The first year was torture. By the second year I had a system. And by the third year my brain had been wired to do this one thing very, very well.
My first job after college was as a technical trainer in Tokyo. I figured that working in an all-Japanese environment, and teaching 8-hour technical classes in Japanese, would be a good way to solidify my language skills (it was). I did this for a little over a year and a half, came home, blinked, and it’s been 17 years since I’ve used Japanese on a daily basis. I’ve started using it again (I’m trying to speak to my daughter exclusively in Japanese), and even after over a year, it’s still pretty rough.
Of course, it should come as no surprise that we lose skills we don’t constantly exercise. What got me thinking about this a bit more seriously was the genteel discussion between Vivek Haldar and Nicholas Carr over the way in which we’re automating away basic understanding of underlying principles of computer programming. Lost skills aren’t simply an individual matter – in some cases they’re being completely erased from our profession.
This is not all bad, as I’ve discussed previously – better tools, improved infrastructure, and higher levels of abstraction have led to dramatically better programmer productivity. They’ve also expanded the pool of “programmers” from the traditional conception of BS/MS/PhD CS graduates to include those with a wide variety of more specialized skills (web devs working in Flash, Rails, Drupal, iOS, and so on). And they’ve enabled completely non-technical people to perform automated programming tasks using specialized tools (e.g., my non-technical sister, who set up a surprisingly nice web page for her business using a template and automated tools, or data analysts who are able to use query generators to access database tables without understanding SQL).
However, as Joel Spolsky points out (by way of Vivek Haldar), all abstractions are leaky. If you’re using an automated tool to create an HTML page, and the tool does something confusing, you won’t be able to debug the problem unless you actually understand how HTML works. If your automagically-created query takes ten hours to complete, you won’t be able to optimize it unless you understand how indexes and query plans work.
About 10 years ago, there was a lot of hand-wringing over the switch from C++ to Java in many university CS programs (as, I’m sure, there was much hand-wringing when programs moved from Fortran to C, or C to C++). How could someone without a deep understanding of pointer arithmetic and memory management actually understand what they were doing? And indeed, anyone who programs in a language with automatic garbage collection (i.e. all(?) modern languages) is not going to have the same deep understanding of how pointers, virtual function tables, stack- vs. heap-based memory allocation, and so on, work. It all just happens magically behind the scenes.
I interview a lot of people, and it’s interesting to see the difference between how C++, Java, and Python programmers approach memory. C++ programmers are especially cautious about how much (and when) they allocate, Java programmers aren’t quite so picky, and Python coders think nothing of creating N lists and recursively appending them to each other.
To me, this has a deep and persistent code smell. It seems, however, to be the wave of the future. In The Hundred-Year Language, Paul Graham talks about how one day (presumably around the same time all restaurants will be Taco Bell), everything – integers, Strings, etc. – will be lists. But these aren’t just the mad ravings of a Lisp zealot – the current roadmap for Java calls for primitives to be gone by JDK 10. Garbage collection has become the norm, even for supposed “systems level languages” like Go. There’s a trend toward magic in frameworks like Rails. Even hot-shot coders don’t know the key code points in ASCII, how to do bit operations, or the powers of 2 past 64, and who the hell even understands how concurrency really works anymore?
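For readers who came up after this knowledge faded, here’s a small sketch of the kind of low-level fluency I mean – shifts as multiplication, the one-set-bit test for powers of 2, and the ASCII case bit. (The class and method names are mine, just for illustration.)

```java
public class BitBasics {
    // Shifting left by k multiplies by 2^k: x << 3 == x * 8.
    static int timesEight(int x) {
        return x << 3;
    }

    // A positive power of two has exactly one set bit, so clearing the
    // lowest set bit with x & (x - 1) must leave zero.
    static boolean isPowerOfTwo(int x) {
        return x > 0 && (x & (x - 1)) == 0;
    }

    // Classic ASCII trick: upper- and lowercase letters differ only in
    // bit 5 ('A' is 65, 'a' is 97, and 97 - 65 = 32 = 1 << 5).
    static char toLower(char c) {
        return (char) (c | 0x20);
    }

    public static void main(String[] args) {
        System.out.println(timesEight(5));    // 40
        System.out.println(isPowerOfTwo(64)); // true
        System.out.println(toLower('G'));     // g
    }
}
```

None of this is necessary to ship a Rails app – which is exactly the point.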
In most situations, this is a positive development – you shouldn’t have to understand how Dekker’s algorithm works to be able to create a critical section. But I worry that we, as a population, are losing skills. We can throw together e-commerce video-sharing blogging sites in just four lines of code, but no one knows how or why they work.
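To make the critical-section point concrete: in Java you just write a `synchronized` block and the JVM (and, below it, the hardware) supplies the mutual exclusion that Dekker’s algorithm once built by hand out of flags and turn variables. A minimal sketch (the `Counter` class is my own illustration, not from any particular codebase):

```java
public class Counter {
    private int count = 0;
    private final Object lock = new Object();

    // The synchronized block is the critical section. No busy-waiting,
    // no intent flags, no turn variable -- the runtime handles it.
    public void increment() {
        synchronized (lock) {
            count++;
        }
    }

    public int get() {
        synchronized (lock) {
            return count;
        }
    }
}
```

Two threads can hammer `increment()` concurrently and the count stays exact – which is precisely the magic I’m describing: it works, and almost no one could explain why.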
I think the problem that Carr is pointing out is particularly timely, because we’re living through an important inflection point. There are still a lot of active programmers who came of age before the current raft of automation, but they’re increasingly moving into “architect” and management positions, losing their coding skills, and being replaced either by the new wave of more-productive-but-less-technically-deep engineers, or by the specialists described above. In twenty years, a far smaller number – both relative and absolute – of our technical leadership will have spent significant time at lower levels of abstraction, and in forty years, only systems engineers will understand how things actually work.
Ultimately, I agree with Haldar when he says that automation is a great tool if you understand how to get things done without it. Knowing how to use a bread machine doesn’t make you a great baker – but if you’re already a great baker, having earned it the hard way (there’s no other way), then you’ll know if using a bread machine is the appropriate technology in a specific situation. If you’ve never corrupted memory, overwritten the frame buffer, had to track down memory leaks, written your own on-the-cheap memory manager, or thought through the word boundaries of a packed struct, then one day you’re going to add hundreds of thousands of key/value pairs to a locally declared HashMap<Integer,Integer>, get crushed by GC, and have no idea why.
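To spell out that closing example: every key and value in a `HashMap<Integer,Integer>` outside the small `Integer` cache is boxed into its own heap object, plus an entry node per pair – millions of little objects for the garbage collector to trace. A rough sketch of the two approaches (method names are mine, for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class BoxingDemo {
    // Hundreds of thousands of pairs means hundreds of thousands of
    // boxed Integer keys, boxed Integer values, and map entry nodes --
    // all individually allocated, all GC work.
    static long sumBoxed(int n) {
        Map<Integer, Integer> m = new HashMap<>();
        for (int i = 0; i < n; i++) m.put(i, i * 2);
        long sum = 0;
        for (int v : m.values()) sum += v; // unboxing on every read
        return sum;
    }

    // The same data as primitive arrays: two contiguous allocations,
    // no boxing, essentially nothing extra for the collector to do.
    static long sumPrimitive(int n) {
        int[] keys = new int[n];
        int[] vals = new int[n];
        for (int i = 0; i < n; i++) { keys[i] = i; vals[i] = i * 2; }
        long sum = 0;
        for (int v : vals) sum += v;
        return sum;
    }
}
```

Someone who has written an on-the-cheap memory manager sees the difference instantly; someone who hasn’t just sees two ways to store ints.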