Some thoughts on my years of college
In his famous last lecture, Randy Pausch started by saying that when there’s an elephant in the room, one should start by introducing it. My name is Óscar Pereira, and I’m a Computer Engineer. I spent the years from Fall of 2001 to Spring of 2007 in Coimbra, getting the aforementioned degree. And I’m writing this document as a statement of why I think those years were both the single biggest waste of time and resources of my life, and at the same time one of the biggest rip-offs I’ve ever been through (education-wise!).
It has been my intention for a while to write about this. However, I did not want to do so in the immediate aftermath of the longest half-decade of my life, lest the recentness of events bias my accuracy in relating them (by failing to see the big picture properly—after all, it was half a decade). Hence my only starting to write now. The last remaining issue in need of some explaining is the language: all the events related here took place in Portugal, among Portuguese people. So it does feel a little strange that I should choose to give an account of them in English. The reason is quite simply that, despite the “Portuguese nature” of said events, some of the conclusions are important enough not to be restricted to Portuguese readers.
How it all began
I enrolled in the Department of Informatics Engineering of the University of Coimbra (Portugal) in October of 2001. My objective in doing so was to learn about computers—that one was obvious—and the knowledge and technology that loiter around them: mathematics, programming, security, and information management in general. There are more, but those are the ones that most interested me. And for a while, that was what happened. But the signs of rot started early.
The first one of those was the Java programming language. It was the language used to teach the students how to program. I had some limited previous experience with C (but not C++), and as far as programming languages were concerned, that was it. So when we started learning Java, I delved into it. And pretty soon, I started to notice something really weird. In several assignments, after I’d thought out the algorithm and started to code, there were compiler errors I just could not fix. The teacher would then reply something like: “yeah, I know why this happens, but don’t worry about it, you’ll learn how to deal with that in the next semester, when you learn object oriented programming”. If you happen to know Java, you can probably surmise where this is going.
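To make the problem concrete, here is a minimal reconstruction (a hypothetical example, not one of my actual assignments) of the kind of wall a beginner hits: in Java you cannot write a function without first wrapping it in a class, and calling an instance method from `main` produces a compiler error that is meaningless to someone who has never heard of objects or static context.

```java
public class Average {
    // A beginner's natural attempt: a small helper to average two numbers.
    double average(double a, double b) {
        return (a + b) / 2.0;
    }

    public static void main(String[] args) {
        // The obvious call fails to compile:
        //   "non-static method average(double,double) cannot be
        //    referenced from a static context"
        // -- gibberish until the student has learned what classes,
        //    objects and 'static' actually mean.
        // System.out.println(average(3.0, 5.0));

        // The "fix" already requires object-oriented concepts:
        Average helper = new Average();
        System.out.println(helper.average(3.0, 5.0)); // prints 4.0
    }
}
```

The student has done nothing conceptually wrong—the algorithm is fine—yet the language demands machinery (classes, instantiation, static vs. instance methods) that the course only explains a semester later.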
By the end of the semester, I had really started to dislike Java, but I could not pinpoint what exactly I didn’t like about it. I mean, of course they should have taught that “object oriented” stuff earlier, but other than that, it was just another programming language. Eventually things eased up, but I never quite got what it was about having Java as one of my first programming languages that caused me so much discomfort. Until, much later, I came across this:
Let us propose the following principle: The irresistible beauty of programming consists in the reduction of complex formal processes to a very small set of primitive operations. Java, instead of exposing this beauty, encourages the programmer to approach problem-solving like a plumber in a hardware store: by rummaging through a multitude of drawers (i.e. packages) we will end up finding some gadget (i.e. class) that does roughly what we want. How it does it is not interesting! The result is a student who knows how to put a simple program together, but does not know how to program.
Sometimes, the best things you can read are those that tell you what you already know. For something like this to happen, in a place where programming is supposed to be taught, is nothing short of unacceptable. Recently (a year and a half ago, I think) they started teaching Python as a first programming language. I think that was a very good decision. Why did it take them so long? I’ve no idea… (I don’t mean why did it take them so long to change to Python, I mean why did it take them so long to change it, period! Even C would have been better…) I can only surmise that they were trying to give the (Portuguese) software industry what it wanted (Java web developers). Which, by the way, is not what a university should do with its alumni. But that’s another story…
As an end note, I add the following: I’m not saying one should not learn Java, end of story. The authors of the aforementioned article name two reasons to learn Java, viz. reflection and its relatively friendly way of using threads. What I (and the authors) say is that using Java to teach how to program is a very ill-devised strategy, and one that will produce mediocre programmers at best. The following quote, also cited in the article, from Bjarne Stroustrup, the creator of the C++ language, should prove insightful:
It [Texas A&M] did [teach Java as the first language]. Then I started teaching C++ to the electrical engineers and when the EE students started to out-program the CS students, the CS department switched to C++.
The Engineering phase
The remainder of the first year, Java aside, is essentially math (and a puny physics class). The next year is when things really start to get interesting. It is also when the heavy workload really starts to kick in, but for all the stuff we were learning, I thought it worthwhile. Sadly, that’s the first and last year about which something like that could be said.
From the third year onwards, things started to go very wrong very fast. First of all, the workload, heavy as it was, increased at least threefold! Sometime near the end of that year, one teacher told me that “whoever can put up with the third year, can put up with anything” (free translation). I heard a different form of the same bullshit the next year, when a teacher said (shortly after the beginning of the semester, as he gave the class the toughest assignment of that semester): “it is my objective to put you under stress, and see what you can produce under those conditions” (again a free translation). That would actually have been a very interesting challenge, had it not lasted the entire semester! You can’t sprint for the whole length of a marathon! Well, you can try, but it is bound to yield a bad result…
Besides retaining the focus on technical aspects, most of the classes from the third year onward start pushing in increasingly high dosages of management-related tasks (viz. development methodology, documentation galore, …). This was madness. It is one thing to teach such material; it is quite another to make students work as if they were employed in a software company. The reason such a thing is madness is not that it isn’t a good idea in principle, but rather that most students can’t afford the resources needed to mimic such an environment. If the college can’t afford those either, as is often the case, the students are left with an unsolvable riddle. They know they can’t do things right, so they are stuck trying to perform as least badly as they can. This usually has one of two outcomes: the slackers will do things at their own (usually rather relaxed) pace, and accept that whatever will be, will be. The hard-working students, in turn, will try to do as “well” as they can, and end up over-stressed, because their work will seem like an under-achievement no matter how much effort they put into it. You’re welcome to try to decide which outcome is worse (and for whom). As an example of the aforementioned lack of resources, some of those projects involved more than four students. As very few can afford to host meetings in their houses, they’re stuck with the rooms available for that use on campus—which are usually over-crammed and noisy, and not at all set up for a meeting to be held. The last resort is to meet in somebody’s house anyway and, no matter how uncomfortable, try to get some work done. I’ve had direct experience of this—and so had almost every one of my colleagues back then.
So in the end, you just end up doing the same work you would do if you were employed in a software company, having to learn (most of) the stuff on your own, but without the pay. As a matter of fact, to make this ordeal even worse, you’re the one actually paying tuition (not to mention a rented room and the other expenses of “going away” to college). You’d probably be better off working in a real company, learning as you go. The pay would be crap in the beginning, but chances are you’d learn faster than you do in college, and after five years, experience would more than compensate for the lack of formal “education”. Also, chances are you would get only one project, on which you could focus, instead of the five or six classes per semester, each lectured as if it were the only one the students were enrolled in that semester (I remember one teacher acknowledging he had no idea whatsoever of the other classes his students also had to work on that semester)! Madness indeed. Would it not be a better alternative to regularly send the students on Summer internships in software companies, where they could see how reality works? But this is just a suggestion, and I admit the possibility that there might be some reason for such an alternative to be unfeasible. I hope that by now, however, I’ve made it clear, beyond the shadow of a doubt, why the approach followed is an ill-devised and misguided one, one that does more to hurt students than to help them.
By this time, I was completely aware that I was not learning anything (well, not anything useful), because even for those few classes I thought were worth the effort, the overall schedule was so insane that I ended up dedicating the strict minimum necessary to each class, trying to get a passing grade in as many classes as I could—and learning zero, or close to it, in the process. My teachers kept saying that the more knowledge we accumulated, the better it would be for our CVs and professional lives. How utterly wrong they were (and still are, from what I hear!). The reason why they, despite being well-meaning, are nonetheless completely wrong is well worth understanding: and that’s the subject of the next section.
College “High School”
In 2001, in his third year as a visiting professor at IST, MIT professor Michael Athans wrote a paper titled “Portuguese Research Universities: Why Not the Best?”, in which he points out some things that should be improved or corrected in order to raise the quality of the overall Portuguese higher education system. I strongly recommend reading this paper, especially to anyone related to engineering fields. In it, he makes several suggestions regarding faculty, relations with government and industry, undergraduates, and more. My focus here will be, obviously, undergrads.
Here’s a comparison of MIT and IST undergrads, in Professor Athans’ own words (emphasis added):
(1). IST students are just as intellectually gifted and hard-working as their MIT counterparts
(2). In 5 years IST students have been taught almost twice as many technical subjects as compared to those of an average MIT student in a 4 year engineering curriculum.
(3). The depth of technical knowledge of MIT students is superior to that of their IST counterparts
(4). MIT students excel in independent thinking and problem-solving, while IST students are “spoon-fed”
(5). MIT students have more exposure to, and appreciation of, industrial issues and are far more sophisticated about the nature of the engineering research process than their IST counterparts
And there you have it, the cause for the lousy state of affairs I’ve described in the last paragraph of the last section! But don’t take my word for it; quoting again:
In my opinion, while IST engineering students have a far greater exposure to a variety of science and engineering subjects, their skills in deeply understanding and applying fundamental concepts, in conducting independent study, and executing complex problem-solving are inferior to those of their MIT counterparts. This state of affairs appears to be the consequence of treating undergraduate engineering education, learning and testing, as an extension of Portuguese high-school practices(+). Thus, although both IST and MIT undergraduate students start with the same intellectual credentials, IST students simply do not have the time for deep understanding and true mastery of the very large volume of the technical material that they have been taught.
(+) An example is the common practice to offer (non-hardware) “laboratory subjects” where problems are solved by the instructor. This represents, in my opinion, a time-wasteful process, reinforces an inferior mechanism for learning, and is representative of the “spoon-feeding” mentality. Portuguese students deserve better.
Of all the reasons why this is wrong, the most blatantly shocking is that this learn-all-you-can-as-fast-as-you-can mentality, while yielding students that essentially learn nothing, or close to it, neglects to teach them the most valuable lesson of all: learning to learn. Actually, that is not quite so, because due to the aforementioned hectic organization of the different classes each semester and each year, whatever you happened to pick up in college was most likely self-taught. But given that a) you’re the one paying tuition, supposedly to get an education, and b) most engineering fields are characterized by fast-paced change, that learning-to-learn stuff should be THE VERY LAST THING you have to pick up on your own. Confused? Well, Professor Athans explains it better than I do (again, emphasis added):
In science and technology, and especially in engineering, technical obsolescence can occur in as little as 10 years. To safeguard against such technical obsolescence, undergraduate students must “learn how to learn” and “learn how to think” so that they are prepared for the inevitable life-long continuing-education requirements. It is far better to learn fundamentals well and in depth, rather than to fire-hose the students with a myriad of technical details, many of which may well become obsolete by the time the student graduates. In short, stop the current practice of having undergraduate engineering education mimic that of high-school; it only encourages mediocrity.
I can’t stress enough how urgent I think it is for anyone related to science or engineering activities in Portugal to read and spread this paper. Despite being slightly dated, it is still one of the best accounts of what’s wrong with Portuguese universities, and of how some of those wrongs can begin to be set right. I do not know why it is still almost unknown, but I do hope that changes, if for nothing else, at least for the students to come—they deserve better, and heck, the country deserves better!
The full article can be read here, or using the local copy.
Most of what I have told here, I was already aware of by the time I got to the end of the third year (though not with enough clarity to put it in writing). In hindsight, I now see that I should have quit college then, and gone to work. I knew enough to start as a full-time programmer (and truth be told, the coming years added very little to that), and by now I’d have a lot more experience, together with knowledge (and perhaps salary, but for more on this see the next section). But, to my eternal shame, I did not quit: rather, I endured it until graduation. I say “with hindsight” because what I thought back then was that, despite all this, I had already been through so much that I might as well hang on a couple more years and get a degree. What I did not know (and could not know) was that the worst was yet to come. Sometimes you really should listen to your gut rather than your brain.
But I was a rationalist (some will argue that I still am, but let’s not dwell on that, lest we digress), and so, of course, I stayed and went on to get the diploma (which, by the way, was requested—and paid for!—when I graduated (February 2007), but at the time of writing (December 2008) has still to arrive).
About the third and fourth years, I’ve said all I wanted to say. I could in fact have lumped those two years together, as far as the account I wanted to make of them is concerned. Anyway, then along came the fifth (and final) year. The deal is like this: in the first semester, you must choose five subjects out of a list, and then in the second semester you are to carry out a final, semester-long project. In my case, however, I still had one extra course: Database Management 2 (in the second semester). Yes, this one was my true Achilles’ heel.
OK, so the first immediate task at hand: choosing the optional subjects. I did not make much fuss about it, and chose what I thought would be a good trade-off between interesting and easy—which essentially amounted to networks (three different subjects on networks), simulation, and systems integration. If most of my colleagues were by this time worrying about the choice of the final course project, I certainly was not. After all, by that time I was absolutely fed up with the course, my academic track record was less than brilliant (and even that’s quite possibly an overstatement…), and I was pretty sure the projects I’d find would be more of the same: paperwork, rushed code neatly packed, a final report and a presentation. The first semester came and went, and then the second came and went… earlier than expected. I ended up doing my (then) final project at a certain CMMI level 3 company in Coimbra which, despite that classification, had one of the messiest, most chaotic project management systems I have encountered. That, together with the fact that I still had Database Management 2 (along with its “small” project) to do, forced me to quit my final project and dedicate myself to that final database course.
Again looking back, that result was to be expected. But I’m getting beside my point here. I wanted to explain why the course largely sucked at what was supposed to be its main objective, and what happened in the fifth year was more the accumulated result of what preceded it than a set of new faults per se. If you’re interested, by the way, I finished DB Management that semester, and then took on another (this time for real) final project in the following semester (of my “sixth” year), this time in academia instead of the corporate world. And then, a long five and a half years later, I was done.
Anticipated criticisms
Had I chosen not to write this final section, the text could hardly be criticized for lack of completeness. However, I want to address here some criticisms that can easily be anticipated.
The most blatant of all is perhaps to say that all I’ve described here essentially means this: I was in the wrong course, and should have taken math or physics instead. Well, looking back yet another time, my retort to this is: 1) it’s absolutely right (and is so for several different reasons, but I digress), and 2) it does not invalidate any of the criticisms I make throughout this document. This last point bears repeating: my account of things is as objective as I am capable of making it, and is not biased by whether or not I liked what I was being taught (though, mind you, I often did not).
Another possible criticism is that nowadays college is a way to ensure a “good” salary. To put it another way, it’s an assurance of a “minimum” wage that’s higher than the national “minimum wage”. This is an extremely pervasive argument. Put yet another way, college is becoming a de facto part of compulsory education. This of course applies broadly beyond Informatics. And it’s fatally and hopelessly wrong, as I explained elsewhere. In addition, that mentality of “throwing” everyone at college, almost as if by instinct, has, in the case of Informatics, a disastrous effect, which a former co-worker described very engagingly (Portuguese only).
The last possible criticism I’ll address here is that things have improved, and are no longer as bad. It’s certainly true that things are not as bad as they used to be (I’ve talked about the move from Java to Python above), and I’ve heard of several other changes being made. But the words I used in the beginning,
those [college] years were both the single biggest waste of time and resources of my life, and at the same time one of the biggest rip-offs I’ve ever been through (education-wise!)
still remain sadly accurate. I can only hope that what I have written will help to change that, although I don’t expect that change any time soon. And even after said change takes place, I’ll still keep this reference, to remind and to keep account of how things once were—lest we forget. For those who forget their history are truly the ones doomed to repeat it.