At It For Twenty Years

Twenty years ago, I timidly walked into a room filled with welcoming nerds and got my first gig in the computer industry. It wasn't much: I wrote Flash tutorials for a local magazine. It wasn't a programming gig, which is what I was really aiming for, but it was something. It got my foot in the door, and my first programming gig would follow soon after.

I'd known I wanted to do this ever since I was nine. I'd seen computers before that, and played some games, but never really got to fiddle with a computer. Then, tired of burning the midnight oil at the one computer he and five other PhD students shared in a decrepit office, my dad bought a computer and brought it home. A friend of my parents nudged me in the right direction -- showed me a few games, showed me how to use a text editor, brought me a programming book -- and after a few hours of poking at this heartless gadget that held an entire world in its beige case, I knew this was what I wanted to do.

I am sadly no longer in touch with the people who worked there, so there is no way for me to tell them how incredibly grateful I am, and how in awe I am at their courage and their patience. This was a real computer magazine -- a local edition of a global franchise, read by tens of thousands of people every month. And these people had the guts to publish stuff written by an arrogant kid who hadn't the faintest idea about how clueless he actually was. They had to teach me not only the basics of journalism and writing for a technical audience, but a lot of basic common sense, too. I cringe a lot when I read some of the stuff I published. Thankfully, I don't have any of the drafts.

A Whole Other Era

The first draft of this post had a long section about just how different things were back then. I thought it made for a good trip down memory lane, but not much else. The point I wanted to make -- that this was a completely different era -- can be more concisely illustrated through just two anecdotes: one about how software was learned, used, and distributed, and one about the computer industry.

First, let me tell you how I learned Flash. See, I had no Internet line at home, and nobody would ship a Flash book to my part of the world, not that I had any way to pay for one. Instead, I went to the Internet cafe down the street with a floppy disk, downloaded whatever looked interesting, and took it back home. Then I'd pore over all that stuff until no more secrets were left, or until I got bored. Then I'd go to the Internet cafe again.

All this was done using Flash 4, which I installed off of a warez CD that I got from a very shady dude who sold CDs a couple of streets down from the school I went to, after saving a few weeks' worth of allowance and/or lunch/school trip money.

Two years passed between Flash 4 and Flash 5, during which Flash 4 was all there was. If anyone moved fast and broke things at Macromedia, most of the breakage remained unshipped. What were they going to do, send you a CD every week?

Second, that story about how I got my job? That's pretty much how I tell it to everyone. Here's the part that I don't tell interns, because it breaks my heart to see these bright kids agonizing over their CVs and cramming personal projects, volunteering, summer schools and GitHub portfolios in the hope that the corporate hiring machine will notice them.

Where I'm from, you've got to be at least 14 to work, or at least that's how it was twenty years ago or so. Your parents have to sign off on it, and there are some checks and balances in place to make sure you can keep going to school and that you don't end up digging coal for some cigar-smoking robber baron, but it can be done.

I know that because I was not yet fourteen when I walked into that office for the first time. My birthday is in October, so we agreed that my first article would be published in the November issue, leaving no paper trail from the part of my life when I was not yet fourteen.

I had no work experience. Hell, I barely had any school experience -- I was literally closer to being in kindergarten than finishing university. My "interview" consisted of sitting in front of a computer running Flash 6 and goofing around with two other people. I spent a good couple of minutes fumbling my way through some basics because I'd never seen Flash 6 before -- it was pretty new, and I'd only worked with Flash 5. The whole "coding interview" took like ten minutes. The whole candidate pre-screening process was yes, I'm really 14, yes, I really know how to do this, yes, of course my parents know I'm here, ummm, yeah, okay, sure, you can talk to my parents, here, hang on.

Flash wasn't exactly new at the time, but it was still new enough -- and there were few enough people who knew it and were willing to write tutorials -- that several responsible, well-adjusted grown-ups who hated paperwork jumped through a whole lot of hoops just to get me in through the door. They did it knowing full well that I'd be a handful, too. I didn't have a CV, but if I'd had one and submitted it today, it's unlikely it would have earned me so much as an interview for an internship position.

The fact that some of the programs we use today -- hell, even some of the programs I use to write this blog post, like Emacs -- are still around is completely deceptive. Kind of like, yes, they had horse-drawn carriages back in 1903, but the Middle Ages were firmly behind us when the Wright Flyer took off. Twenty years ago, we were living in a completely different era.

What I Learned

Most of the things I learned in these twenty years are now firmly in the computer archaeology department. Flash was cool but I never really got into web programming. Last time I touched a web project, PHP 5 was still fresh. Administering Unix servers got me into writing Unix code. Then I wanted to work on embedded systems, and for a while I did just that, aided considerably by the fact that I figured you can't really do that without a thorough understanding of how magic smoke works, so I went into EE instead of CS or CE.

Many of the lessons I learned turned out to be bad, too. For example, at one point, I intensely cultivated the "brutal honesty" approach some of my teenage idols practiced, and it took me a while to figure out what a load of bullshit that was.

But some of the things I learned survived the test of time. I still believe they are true, twenty years later, and they served me well. I'm going to list them here, maybe they'll serve some bright fourteen-year-old who's got their whole life ahead of them, too. Take them with a grain of salt, though.

There's no substitute for being where people do cool new things. Some communities long for the good old days too much to let go, and judge others' work by how well it adheres to one "philosophy" or another. Others don't do cool new things -- they do various old things, just over and over again, so it always feels fresh, even though they make little progress. There's even a name for that, and I was there on Slashdot to cheer when it got coined.

You can do cool new things with old technology -- hell, there's something new to be learned every day at pouet, and there's a whole retro gaming scene built on fresh takes on old themes. The latest technology isn't always the best, either. I'm certainly not advocating for jumping onto every bandwagon there is.

But there's no substitute for being there when the future is debated, chewed, and invented. Even if "there" is a crowded basement where three over-caffeinated third-year students are half revolutionizing the industry and half figuring things out.

Don't surround yourself with people who know exactly why everything sucks these days. Instead, seek the company of those who keep trying to come up with something better time after time, and who are willing to write twenty really bad programs just so they can build the twenty-first, which is absolutely gonna rock.

Your profession is not the same thing as your job. Way back in high school, I had a friend who dreamed of becoming a musician. Every day, we'd show up at school dead tired -- we'd stayed up until 2 AM, kept awake by a stubborn program or a difficult piece. He really did go on to become a musician, and years later, we were talking about our early morning coffee shots, and he said, you know, Alex, I have absolutely no regrets about it -- I know that, if I couldn't make music, I would never be happy. And I understood him completely. I know I would be unhappy if I couldn't do what I'm doing today. That's all there's ever been for us.

Jobs, on the other hand, are a whole other story. I've held outrageously high-paying jobs with fancy titles, and I left them without thinking twice because they made me a worse programmer. I've left jobs because there was no technical career path, and engineering was largely thought of as the stressful hoop-jumping period you had to go through in order to get a job with a nice office and the ability to take twelve-month sabbaticals and go OOO at four PM to pick your niece up from school.

There's a whole industry built around employer branding, coaching, career planning and personal development which would have you believe that you need to cultivate your job in order to succeed. It sounds really good on paper, pop psychology and all, but it only takes you as far as the next layoff.

It's how good you are at your profession -- and I don't mean just the technical parts -- that takes you further after that. If, when yearly evaluations come up, you ever find yourself having to bullshit yourself that you're a better programmer than last year, it's time to go.

The difference between good software and exceptional software is often in the parts that nobody wants to do and the compromises nobody wants to make. Everybody wants to write a cool logging library -- it's surprisingly challenging. But nobody wants to go through every log call and make sure the messages are readable and provide useful information. And when it all comes tumbling down, the Level 3 support tech won't really care if the diagnostic messages are delivered in order if they say "Database connection error: -31" and "Unable to continue, giving up..."
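To make the contrast concrete, here's a minimal Python sketch of the two kinds of message; the logger name, fields, and retry policy are invented for illustration, not taken from any real system:

```python
import logging

log = logging.getLogger("billing")

def report_connection_failure(db_host, attempt, err):
    # The message nobody wants to rewrite: correct, delivered in order,
    # and useless to the L3 tech at 3 AM.
    log.error("Database connection error: -31")

    # The message that actually helps: what failed, where, and what
    # happens next. (Hypothetical fields and retry policy.)
    log.error(
        "Could not connect to billing database at %s (attempt %d/5): %s; "
        "retrying in 30 seconds",
        db_host, attempt, err,
    )
```

The second call is no harder to write than the first; the hard part is going through every call site and doing it.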

Who cares about L3 techies, right? Well, you should care. First, because they're your colleagues and maybe just don't be an asshole to your colleagues. And second, because how well they can do their job with your software determines how good an uptime it has and how satisfied people are with it and, ultimately, how good they think it'll be.

There's also a ton of incredibly successful proprietary software out there that's riddled with questionable UI changes and old, ugly code that reeks of 1994. And it's so successful not despite, but because of that. Because someone in the right place understands that six-month-old bad UX is bad UX, but twenty-year-old bad UX is a customer base with muscle memory, and twenty-year-old bugs with well-known workarounds are now features. And whoever they are, I guarantee that a good part of their job consists of figuring out how to help their subordinates tick the "organizational impact" checkbox on their evaluation forms while steering them clear of the parts where you can't make a positive difference for users anymore.

Kind people act as force multipliers. If a programmer needs three months to write a program, three programmers can make it happen in a matter of days if they're the kind of people who have each other's backs, help each other through hopeless debugging sessions, and who teach instead of sneering at people who don't know something.

These people bind teams together. They keep spirits up, they help others find strength and resolve that they never suspected they could muster, and enable them to navigate the most intimidating obstacles. They make teams more than the sum of their parts.

If one of these three programmers is an asshole, though, it'll probably take four months instead. There are people who insist that kindness in a professional setting is a form of weakness. Don't indulge them: they hold on to this belief because they yearn to project an aura of strength, and they already tried -- and failed -- to project it through their skills.

If you ever find yourself in an organization that promotes and allows these people to thrive, and you can't avoid them, leave. No amount of clever maneuvering will make your life less miserable, and no amount of bringing your concerns to the right people will change the backbone of an organization that just doesn't know how to handle assholes. In fact, since most of the time, these organizations are run by assholes, it'll just get you in more trouble. Vote with your feet. Better yet, do your part in keeping our industry fair and bent on inventing the world of tomorrow, rather than gluing the wrong asses to the right chairs: vote with your feet, and make it clear that you're doing so in your exit interview.

Good ideas are cheap. A bad idea with a good implementation always trumps a buggy implementation of a brilliant idea. The history of computing is riddled with the graves of revolutionary ideas that never made it past the "it mostly works on my machine" stage. Sometimes they were just flat out poorly implemented, because technology just didn't allow anything better at the time, or the people who came up with these ideas just didn't have enough time, or enough patience, to implement them well. Other times they failed just because they delivered too little, too late.

But ultimately, the reason why they were doomed boils down to the fact that computers run software, not ideas, and these brilliant ideas were backed by pretty bad software.

The threshold for "good implementation" can be surprisingly accessible. Ward Cunningham's original WikiBase implementation was pretty barebones and unpretentious, but it was a solid literate Perl program that endured the abuse of freeform composition style very well. Like many programs assembled at the dawn of the Internet era, it was clunky in many ways, but it worked well. Make no mistake about it, though: it was so successful not because there was a brilliant idea atop that pile of clunky hacks, but because it was a brilliant idea with a working implementation behind it.

Don't trust simple grand unified theories. There are many elegant models and architectures out there that attempt to bring order by cutting the Gordian knot of real-world complexity with the sharp sword of a categorical model ("everything is a file!", "everything is an object!", "everything is an S-expression!") or a particular approach ("purely-functional!", "completely memory safe!").

There are too many of these and they are too diverse to examine in detail, but they have one thing in common. They all have stern defenders who will readily claim that anything which doesn't really fit in their model is a niche case, and/or that what you want isn't what you actually need anyway. Like the proverbial hedgehog, they know only one thing.

I don't mean to say that these things are useless, or that they're bad. But they're not all-encompassing. Recognizing their limits, and knowing when to reach out for other tools, even if it compromises the elegance of your program, is far more rewarding than rigid adherence to orthodoxy.

I learned this the hard way, by relentlessly defending the positions of more than one of these schools of self-flagellation. The real world is illogical and complex, and the effectiveness of a model or a theory lies not so much in how much complexity it can shave off, but in how accommodating it is to complexity that you can't avoid.

So if you want to check out how good a language is, try writing, say, a game. See how well it accommodates absurd things like "when the chief alien mushroom turns green, all the underling alien mushrooms must release their spores unless they're closer to each other than a spore release radius, and the ones that are within five feet of the player must run away". If you want to check out how good a UI toolkit is, the quality of its rich text control is a good indicator, as it has to follow six hundred years' worth of typographical conventions, some of which can be traced back to the adverse mental health effects of lead.
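Here's roughly what that rule looks like once you have to actually write it down; this is a minimal Python sketch, and every name in it (Mushroom, Player, SPORE_RADIUS, FLEE_DISTANCE) is made up for illustration. Note how much of it is exceptions and special cases rather than anything an elegant model would predict:

```python
from dataclasses import dataclass
import math

SPORE_RADIUS = 3.0   # hypothetical spore release radius
FLEE_DISTANCE = 5.0  # "within five feet of the player"

@dataclass
class Player:
    x: float
    y: float

@dataclass
class Mushroom:
    x: float
    y: float
    is_chief: bool = False
    color: str = "brown"
    fleeing: bool = False

def distance(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def release_spores(m):
    print(f"mushroom at ({m.x}, {m.y}) releases spores")

def update_mushrooms(mushrooms, player):
    chief = next((m for m in mushrooms if m.is_chief), None)
    if chief is None or chief.color != "green":
        return  # nothing happens until the chief turns green

    underlings = [m for m in mushrooms if not m.is_chief]
    for m in underlings:
        # The awkward exception: no spores if another underling is
        # closer than the spore release radius.
        crowded = any(other is not m and distance(m, other) < SPORE_RADIUS
                      for other in underlings)
        if not crowded:
            release_spores(m)
        # And the ones near the player must run away, regardless.
        if distance(m, player) < FLEE_DISTANCE:
            m.fleeing = True
```

How gracefully a language lets you express, test, and later amend rules like this tells you far more than any curated tutorial example.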

Any language and any library looks good in examples that were specifically picked to showcase their strengths and teach you how to use them. It's how helpful they are with problems that you'd really rather not solve that makes or breaks them.

Think of it this way. Much of the history of Western scientific thought can be summarized as a quest for simple, all-encompassing, universal principles. It's what Thales set out to do, and 2500 years later, the Standard Model of particle physics is about the closest we've come to something that can explain fundamental physical interactions, and it's not simple at all.

It's a fundamental drive of our way of thinking, but this quest has rarely ended with us actually finding a couple of simple, all-encompassing, universal principles. If someone claims they've found the solution to writing fast, bug-free software once and for all, and its entire description fits in a few paragraphs, it's very likely that they are either selling you snake oil (especially if it has "Agile" in it), or they're just partaking in their own supply of narcotics.

Good communication will be your most useful tool. Lots of the work I did in these past twenty years involved writing largely straightforward code that was a breeze to write compared to how hard it was to figure out what it had to do.

University assignments give you this idea that someone's going to serve you with specs, and you'll have to design something around them. The specs will be set in stone and your ability will be judged based on how well your design matches the specs.

That's a good exercise, but here's what will actually happen.

If you're lucky, you'll end up in an organization run by people who understand that gathering and refining requirements is an integral part of leadership in tech, and you will spend weeks iterating specs, prototypes and implementations in a coordinated setting. Much of your work will consist of clarifying what's feasible and what isn't, the consequences of various trade-offs, and whether you can efficiently mitigate some types of risk. The impact you'll have will depend on, and your abilities will be judged in part by, your ability to think critically, to articulate ideas, and to objectively discuss your own work, and that of others.

These organizations are extremely rare. It's more likely that you'll end up in an organization where product development is run by people who have no idea how those products are used or how they're put together. The impact you'll have will depend on your ability to extract technically useful information from the horror stories your colleagues from tech support tell over drinks, hurried talks with salespeople who talked to the managers of the people who actually use your software, and ten PowerPoint slides whose large fonts comfortably obscure the fact that the sum of the entire organization's knowledge of the subject could fit in two paragraphs.

There's an uncomfortable truth that we sometimes hide during technical interviews, largely because it's so hard to measure. Your ability to hold a conversation, to listen without being condescending, to discuss other people's work without being passive-aggressive or arrogant, to have coffee with someone without being a creep and to go out for drinks without getting hammered will ultimately have at least as much impact over the quality of your work as whether you can recite your way through a graph coloring problem. And by "the quality of your work" I literally mean how good the software you'll write is going to be. It's not slang for "your chances of getting promoted", it's slang for "how good your code is".

Don't cling to what you know and don't be afraid to be a beginner again. Some things just go out of fashion. Sometimes geography just doesn't work in your favor, and there are no opportunities to seize anymore, unless you're willing to move thousands of miles away. Don't cling to these things. You can't fight cultural and economic shifts, no matter how good you are.

Going with the flow can be intimidating. It's humbling and ego-wrenching to find yourself effectively back to being a junior dev again. I've had to shift gears professionally once before, and I'm slowly making my way through a second shift now. It's a scary path, riddled with self-doubt and seemingly unsolvable puzzles. But in these twenty years, I've seen many people who wouldn't let go of dying things, and they were dragged under by them.

Where Will We Be In Another 20 Years?

My crystal ball's just as bad as any other. Who knows?

Don't make twenty-year plans in this field. If you want any kind of certainty, you'll have to settle for a shorter time frame. And, ironically enough, the best way to get it is to be where the future is being made.