Why are there so many Software Engineers?

Hardware vs. Software

Have you ever wondered why so many people are software developers instead of hardware developers these days? I certainly have. And even if you haven’t, I’ll go ahead and give you a cut-and-dried example of why that is, and why software is considered the way of the future (for now). To be fair, I’ve never formally studied either hardware or software – hardware is black magic and software is a black box – so take my narrative and reasoning with a grain of salt.

What is Software

First off, we’ll have to define what software is. I don’t have a great definition to give you, but I’m sure you know what software is anyway – it’s anything above hardware, and hardware is anything below software. I jest, of course. Software is defined as the operating information used by a computer. These days, any kind of programming that builds on an operating system (OS) is software programming. To me, the operating system is the proverbial line drawn in the sand between hardware and software – it takes your instructions and etches them into the hardware, turning software actions into hardware truth. It’s a bridge between the two.
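To make that “bridge” idea a little more concrete, here’s a minimal Python sketch (the file name is just an example): the function calls are software, and each one becomes a system call that the OS turns into actual work for the disk hardware.

```python
import os

# Software: a couple of ordinary function calls.
# The OS translates each one into hardware operations on the disk.
fd = os.open("note.txt", os.O_WRONLY | os.O_CREAT, 0o644)  # ask the OS for a file handle
os.write(fd, b"hello, hardware\n")  # a write() system call; the OS drives the storage device
os.close(fd)  # the OS flushes its buffers and releases the handle
```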

What is Hardware

This time I’m better prepared to read out the definition of hardware! Put down your pitchforks. Ahem. Hardware is “the machines, wiring, and other physical components of a computer or other electronic system.” Sound good? Hardware is all around us. I’m writing this article on a MacBook Pro, I ride the bus to and from work, and I even use a microwave from time to time (a lot of the time, actually). These are all examples of hardware with software interfaces that expose an interactive API. When I press the button to reheat my pizza, the microwave’s software sends instructions to its hardware (turn on, use this much electricity, start spinning, turn off, and so on).
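To illustrate what that button press might look like on the software side, here’s a purely hypothetical Python sketch – none of these class or method names come from a real microwave, and the hardware calls are stubbed out with prints.

```python
class MicrowaveController:
    """Hypothetical software interface sitting on top of a microwave's hardware."""

    def set_power_watts(self, watts):
        print(f"[hardware] magnetron power set to {watts} W")  # stand-in for a real register write

    def start_turntable(self):
        print("[hardware] turntable motor on")

    def stop(self):
        print("[hardware] everything off")

    def reheat_pizza(self):
        # One button press becomes a fixed sequence of hardware instructions.
        self.set_power_watts(800)
        self.start_turntable()
        print("[software] heating for 90 seconds")
        self.stop()


MicrowaveController().reheat_pizza()
```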

Why Software

Now that we have some definitions out of the way, let’s take a trip down memory lane. Long, long ago there weren’t many computers, and hardware was very expensive. One of the first popular programming languages was Fortran, which ran on the IBM 704 mainframe. Computers were the size of rooms back then, and this “supercomputer” packed a whopping punch: a jaw-dropping memory of about 18KB, and a weight of around 10 tons. Whoo wee. We’ve come a long way, haven’t we?

Anyway, given the cost of these computers and the amount of electricity they consumed (thousands of dollars’ worth an hour), it made sense to optimize for hardware time, not programmer time. See, a team at IBM had access to one of these. That meant a team of smart, capable individuals would all have to share a single computer. They’d fuss for hours over making algorithms faster, because it mattered. A lot. Most people programmed in raw machine code in those days, because a compiler couldn’t come close to the amount of optimization a team of very smart people could do by hand. Clearly, hardware was the constraining factor, not labor. And for many years that stayed true – interpreted languages (such as Lisp) were passed over because they ran orders of magnitude slower than native machine code. Needless to say, machine code was simply the way things were done.

But eventually, that changed. Think about Moore’s law: the number of transistors on a chip roughly doubles every two years. With that growth in computing power, programmers were freed from having to use machine code for everything. Eventually they used assembly, then higher-level languages like C, and nowadays extremely high-level languages like Python, Ruby, and JavaScript have risen to prominence. Sure, JavaScript runs many times slower than machine language or assembly, but with every passing year, higher-level languages keep rising in popularity. Every year, our hardware gets better, which means that every year, hardware costs less. Thus, programmers are freed from having to optimize every facet of their programs. Oddly enough, programmers are incredibly cheap to equip these days: give them a $1500 computer, two monitors, a keyboard and mouse, a chair and a desk, and that’s about $3000 for all the hardware they need. But the company also has to pay their salary, which varies quite a bit, but sure costs a lot more than $3000 a year, let me tell you.
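To put rough numbers on the two claims above, here’s a quick back-of-the-envelope calculation in Python. The $3000 hardware figure comes from the paragraph; the $90,000 salary is my own illustrative assumption.

```python
# Moore's law: transistor counts double roughly every two years.
years = 20
growth = 2 ** (years / 2)
print(f"Transistor count after {years} years: ~{growth:.0f}x")  # ~1024x

# Hardware vs. labor cost for one programmer.
hardware_budget = 3_000   # one-time gear cost from the article
assumed_salary = 90_000   # hypothetical annual salary, not a figure from the article
print(f"Salary is ~{assumed_salary / hardware_budget:.0f}x the hardware budget, every single year")
```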

So the trend is quite apparent for now. Hardware costs less and less with each passing year, whereas software costs more and more. Software overtook hardware as the constraining factor in development, and ever since then there’s been a boom in software jobs while hardware jobs have stagnated. And it doesn’t seem to be stopping anytime soon. Steve Jobs famously told Barack Obama to lower immigration barriers because Apple just couldn’t hire enough talented software engineers. Salaries for software developers have skyrocketed, and it looks like they won’t be falling back to earth soon, as long as demand for software developers far outstrips supply.

Conclusion

So the trend seems to be that we’ll need more and more software engineers, while hardware may not grow as fast. Even with the explosion of mobile devices in the past decade, a single phone can hold thousands of apps – roughly one team of hardware engineers for every thousand or so teams of software engineers. So, perhaps, the direction from now on will be more software-oriented instead of hardware-oriented. Or, perhaps, there’ll be another change. Maybe hardware will see a revival when we all become cyborgs in the near future. Who really knows?