In defense of (graduate) school and Computer Science

Every few days, a new hysterical screed against formal education, undergrad studies, or grad studies comes across my screen. As a result I've been mulling over the hot-button issue of academic study, both undergrad and grad, for the last few months. Why are there so many loud people decrying the usefulness of a formal education? Which of the (many!) things that are shoved down your throat in academia are worthwhile, and which aren't? Why does this country (the USA) have such a focus on it -- and why do US institutions have such a good academic rep?

Unfortunately I think it's going to be difficult for me to appear neutral in this discussion, because I'm now firmly ensconced as a member of the establishment (well, barring tenure ;). I'm the child of an academic (my father is a physics professor); I went to a snooty liberal arts undergrad college (Reed College, BA Math); I received my PhD at an institution that is a pillar of modern research (Caltech, PhD Biology); and I'm starting a faculty position (at Michigan State U.) in Computer Science and Microbiology & Molecular Genetics. This last position will no doubt require me to extort work out of dozens if not hundreds of graduate students, with the full intent of making their lives as miserable as mine was during much of my own graduate school experience.

On the flip side, I'm certainly not your traditional 1980s ivory tower academic. I was introduced to Open Source programming by Mark Galassi back in high school, and I've been a contributor to various projects ever since. I'm especially active (or I try to be active) in contributing to a bunch of projects in the Python community. During my graduate work I implemented actual! functioning! software! that is used by hundreds of people. I've consulted for real companies, I've run one or two (unsuccessful) startups, and I've written one fairly pragmatic book on Web testing. I got the faculty offer at MSU not just in spite of this, but because I expressed a certain (guarded) unhappiness with computer science in general -- I still don't know what they were thinking, but I'm happy they offered me a job ;).

So I think that while I'm not a card-carrying genius (like Paul Graham) or an annoying corporate know-it-all (like Joel Spolsky) or even a terribly successful academic (like, umm... lots of people?), I have gotten my feet damp in all these areas. I know what real software looks like. I know what academic software looks like (oh boy, do I ever!). I've done scientific research, albeit not so much in computer science as in biology. In fact, my interest in software testing comes from having seen how corporations are often at a loss to implement novel features and pragmatic test regimes; how research software presents different programming challenges while also being a source of unending (and often useless) novelty; and how open source programs and methodology provide solutions, however confusing and contradictory, to many corporate and academic challenges.

Now that I've presented my creds and tried to convince you that I have a chance of knowing what I'm talking about, let's set up some straw men.

Straw man #1: Computer Science undergrad education is useless.

I'm sensitive to this argument because I frankly have no formal CS training. I took a compiler class (C+), an OS course (B? A? not sure), and a graphics course (F) back in the dawn of time. Virtually everything I know about programming I learned by myself or from others, and virtually none of it is taught in any formal CS curriculum that I've seen.

Now that I've seen the elephant, however, I am sad that I didn't pay more attention when I had the chance. The compiler course would have been really useful, in particular, for reasons discussed here. The few things I remember from the OS course come in handy every time I need to think about network programming, memory management, and efficient OS interaction.

So I would say this: CS undergrad curricula have a much tougher job to do than most people give them credit for. A CS undergrad program first has to separate the wheat from the chaff: there are unmotivated people, there are dumb people, and there is even some evidence that people who lack a certain mindset cannot learn to program. Then people with at least minimal motivation and aptitude must be taught, from scratch, to program -- sometimes over the corpses of their earlier high school or open source/PHP/Perl training. Finally, these programmers, often with less than two years of actual training, are thrown into the cauldron of design, methodologies, and implementation details that characterizes even an approximation of real-world software development. While the poor students are struggling with that load, they're expected to master the theoretical aspects of algorithmic analysis, compiler design, language theory, image processing, and lord knows what else. In another two years.

Given all of that, is it any wonder that most CS undergrad curricula are a mess?

The real question to be asked, however, is: does this all serve a useful purpose? And there I think the answer is, unambiguously, yes. The purpose is not to churn out quality programmers, because that's impossible without actual exposure to real-world programming. The purpose is to educate people in the breadth of possibilities -- to give them a taste of what is out there, with recourse to experienced, knowledgeable, and hopefully inspiring professors whose job it is to answer questions.

Straw man #2: Graduate school is generally a waste of time.
