I came across an article about the lack of interest in open source software among younger programmers. And while it's an important topic worth discussing, I think the article misses a few important points: who millennials are, how we were shaped by changes in computing, and why so few of us seem to care about open source.
First of all, what exactly is a millennial? The definition is “a person reaching young adulthood in the early 21st century,” but now it seems like anyone under the age of 40 is a millennial. Someone born during the Reagan administration is just as likely to be called a millennial as someone born during the Bush administration. As a self-declared millennial (gasp), I fall roughly in the middle.
I grew up just as PCs and Internet access were becoming affordable. Around that time, a little company called Microsoft was dominating the PC market with the early versions of Windows. Our first family PC ran Windows 3.1, and most of what I first learned about computing came from school PCs running Windows 95. Sure, the Linux kernel had been released years prior and a little project called Debian had just gotten off the ground, but alternative operating systems were far beyond my little six-year-old brain to comprehend. My introduction to them wouldn't come until high school, when I stumbled across a boxed copy of openSUSE (10.1, for anyone interested) in a Barnes and Noble.
I was definitely one of those geeky kids who spent days (if not weeks) reading technical manuals, scanning forums and message boards, compiling drivers, tweaking kernels, and generally breaking, fixing, and re-breaking things. It was an incredible education, but far from the typical high school experience. No one in my social group had ever heard of Linux, open source, or GNU, or knew what proprietary software was, and no one really cared. The world ran on Windows, and trying to convince my friends, family, or teachers to use OpenOffice instead of Word was an endless uphill battle.
But in the early-to-mid 2000s, something changed. Distributions like Debian were pushing open source software onto more and more servers, Ubuntu was spreading it across desktops, and, soon after, Android brought it to mobile phones. As millennials graduated and took on careers in the tech industry, the idea of an alternative to a proprietary, Microsoft-based ecosystem became much less far-fetched. Today, open source is practically the default way of developing software, with companies like IBM, Facebook, Netflix, and even Microsoft releasing their work for everyone to use. And I think the article explains this quite well.
The problem isn’t interest, but education. The shift from proprietary-as-the-norm to open-source-as-the-norm happened over a relatively short period of time, and I don’t think we’ve had time to adjust. It’s been a while since university, but I’m confident schools still run on Windows, use Blackboard, and require specialized testing and homework software. In that kind of environment, who but the geeks and nerds would give a damn about open source? I’m certain that if we exposed younger generations to the ideas and benefits of open source software, we’d see passionate tinkerers emerge from the crowd, just as phone phreaks did in the ’70s, computer hackers in the ’80s and ’90s, and now makers in the ’00s and ’10s.
I don’t see open source software going anywhere. It may be a marketing tool, but businesses latch onto marketing tools because they’re shown to make money. If it were just Microsoft or Facebook pushing their open source ideology, I could see it being a passing trend. But with the idea so firmly rooted in today’s tech industry, I don’t see it going away anytime soon, no matter how hard (or how little, depending on who you ask) we millennials try to kill it off.
Fabric softener and napkins, on the other hand, can go straight to hell.