Software engineering is a very dynamic environment. Technologies change drastically every decade. Twenty years ago the Pentium II, with a clock frequency of just 300 MHz, was probably the most powerful CPU you could get; nowadays we have chips with four CPU cores, each one 8 to 10 times faster than that Pentium II. So our computational power has grown enormously over those 20 years. No wonder it happened: our appetite for computing only grows with time. Partially because the cheaper computation becomes, the more applications we find for it. Partially... because our software is genuinely inefficient, but we never mind, since the primary consumer of computational resources is business, and business would rather buy more CPU, RAM, etc. than spend money on optimization, which can be expensive and unpredictable. For a business it's always a trade-off, and under the phenomenon we call Moore's Law it made more sense to buy more powerful computers and use tools for rapid development, because nothing matters in the business world like time.

That's why we walked this path from machine code to assembly, to compiled languages, and then to interpreted languages. Each step adds its own overhead, and the overhead hurts: an algorithm written in hand-tuned machine code may be up to 1000 times faster than its counterpart written in, say, JavaScript. But business didn't care, and we didn't care, because as long as Moore's Law held, computing power kept doubling every couple of years, and we could afford the luxury of rapid development while neglecting the performance losses.
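To make that layering overhead concrete, here's a minimal sketch in Go (the three-opcode bytecode is invented for illustration, not any real VM) that runs the same summation twice: once as a native compiled loop, and once on a toy interpreter that pays a fetch-and-dispatch tax on every single step. The exact ratio depends on your machine, but the gap is exactly the kind of cost each extra abstraction layer adds.

```go
// Toy demo: the same summation run natively and on a tiny invented
// bytecode interpreter, to show per-instruction dispatch overhead.
package main

import (
	"fmt"
	"time"
)

// Hypothetical opcodes for this sketch only.
const (
	opAdd  = iota // acc += operand
	opLoop        // decrement counter; jump to start while counter > 0
	opHalt        // return acc
)

type instr struct {
	op      int
	operand int64
}

// interpret runs the toy bytecode: every step costs a fetch, a
// switch dispatch, and a jump on top of the useful work.
func interpret(prog []instr, iterations int64) int64 {
	var acc int64
	counter := iterations
	pc := 0
	for {
		in := prog[pc]
		switch in.op {
		case opAdd:
			acc += in.operand
			pc++
		case opLoop:
			counter--
			if counter > 0 {
				pc = 0
			} else {
				pc++
			}
		case opHalt:
			return acc
		}
	}
}

func main() {
	const n = 100_000_000

	// Native loop: compiles down to a handful of machine instructions.
	start := time.Now()
	var acc int64
	for i := int64(0); i < n; i++ {
		acc += 1
	}
	native := time.Since(start)

	// Same work, with interpreter dispatch wrapped around every step.
	prog := []instr{{opAdd, 1}, {opLoop, 0}, {opHalt, 0}}
	start = time.Now()
	acc2 := interpret(prog, n)
	interp := time.Since(start)

	fmt.Printf("native:      %v (sum=%d)\n", native, acc)
	fmt.Printf("interpreted: %v (sum=%d), %.1fx slower\n",
		interp, acc2, float64(interp)/float64(native))
}
```

On a typical machine the interpreted version lands several times to an order of magnitude behind the native loop, and a full dynamic language like JavaScript adds boxing, garbage collection, and dynamic dispatch on top of that.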
So welcome to the era of computational stagnation! We can still buy more CPUs and RAM, but now we know our performance won't magically double just because we bought new equipment. We've simply hit the limits of the silicon technology we've been riding since the seventies, the trend Moore first observed back in 1965. Well, it's definitely not a reason for pessimism. For centuries we lived without computers at all and managed to survive. But this stagnation is very interesting, because we've actually hit the limit of the technologies providing computational power, not the limit of demand. With AI, data science, IoT, and everything else we can see on the horizon, our demand for computational power is only increasing.

We really tend to spend resources on a problem (and thus make real steps toward solving it) only when the problem becomes a serious one; otherwise we usually don't bother. That makes sense: resources are limited, so it's wise to solve urgent problems first. What the heck am I talking about? Oh yes: for many years we actually knew there were alternatives to the way we do software, that we weren't doing it in the most efficient way, and blah-blah-blah, but nobody spent resources on the alternative approaches simply because we didn't need them badly. The old inefficient way worked well enough for us, so nobody bothered except the enthusiasts, and they were underfunded as usual. Hey, do you see it now? Their time is coming. Once again the nerdy people will have a chance to earn a good deal of money, because over the next decade the entire industry will shift toward optimizing current software and squeezing more power out of the limits we have right now, by optimizing architecture at ALL levels. The hurricane is coming, folks, and if you just finished learning React and thought you'd finally caught up, I have to disappoint you: more changes are coming in the next two decades, not only in programming languages but in the very fundamental paradigms. So if you're going to be in the game for the next 20 years (Nick, if you're reading this, just stop now, you're gonna retire in two years), you have to think about what I'm talking about. Because then you may see it coming, speculate about it, and maybe even lead the changes.
So my message here is that stagnation is always a sign of drastic changes ahead. And drastic changes are heaven for new opportunities. When investors understand what's going on, a lot of money will be poured into startups developing alternative computing approaches. We can't predict which one is going to win, but we can see the battle coming, and this battle may change software engineering as we know it. We'll certainly see new programming languages designed for parallel computing paradigms (a taste of that style, in today's terms, is sketched below) and new multiprocessor architectures, not with 8 cores on a chip but possibly with hundreds. We're very likely to see entire computers (CPU, GPU, whatever-PU, and RAM) on a single chip, and to see those chips built as 3D structures rather than on a single-layer plate. Since such a thing would likely burn itself up if built from traditional transistors, we're also very likely to see optical transistors and other alternatives emerge. So we're about to see a lot of wonders and miracles in the IT industry over the next two decades. Stay tuned, and keep dreaming about a shiny future with ultra-realistic 3D porn games for your VR.
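As promised above, here's a minimal sketch of what that parallel style looks like even in today's terms, using plain Go goroutines; the workload (a big summation) is just a stand-in for anything embarrassingly parallel. The point isn't the code itself but its shape: the work is split across however many cores the machine has, so the same program speeds up on a 4-core chip, a 64-core chip, or the hundred-core chips speculated about above.

```go
// Sketch of the partition / compute-in-parallel / combine pattern
// that many-core hardware pushes us toward.
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelSum divides data into one chunk per core and sums the
// chunks concurrently. More cores means more workers, same code.
func parallelSum(data []int64) int64 {
	workers := runtime.NumCPU()
	chunk := (len(data) + workers - 1) / workers

	partial := make([]int64, workers) // one slot per worker, no locking needed
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		lo := w * chunk
		if lo >= len(data) {
			break
		}
		hi := lo + chunk
		if hi > len(data) {
			hi = len(data)
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			var s int64
			for _, v := range data[lo:hi] {
				s += v
			}
			partial[w] = s
		}(w, lo, hi)
	}
	wg.Wait()

	// Combine the partial results.
	var total int64
	for _, s := range partial {
		total += s
	}
	return total
}

func main() {
	data := make([]int64, 10_000_000)
	for i := range data {
		data[i] = int64(i % 7)
	}
	fmt.Printf("cores: %d, sum: %d\n", runtime.NumCPU(), parallelSum(data))
}
```

A language actually designed for parallelism would make that decomposition implicit rather than hand-rolled, but the mental model here (partition, compute in parallel, combine) is the one worth getting comfortable with now.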