Why are there so many IT jobs?
Thirty years ago, I embarked on a journey into software applications and product development in the banking world. What began as work on one of the world’s most successful banking products soon evolved into a career filled with rapid technological advancements, each reshaping how we built and delivered software. Today, as I reflect on the evolution of this industry, I’m struck by how far we’ve come—and how some aspects of the software development process have remained curiously similar.
When I joined, our company already had a strong banking product, but it was rooted in old technology: Business Basic, flat file databases, character-based screens, and hardware setups that seem primitive by today’s standards. These systems ran on “dumb terminals” with only keyboards, no processors or hard drives, and no graphical interfaces. All the code resided exclusively on the server.
As GUI-driven systems, relational databases, and object-oriented programming began to emerge, our team set out to reimagine the product, building a new version with Smalltalk (VisualWorks) and the GemStone object database. However, after an architecture review by consultants from the West, our management decided to pivot to Oracle Forms and Oracle Database, a wise choice that would propel the product into the global banking software spotlight.
At the time, client-server was the dominant architecture. The client application ran on the user’s desktop, while the server housed the database and core business logic on a local network. Though limited in scalability and demanding of network bandwidth, the architecture was workable for our needs. Amazingly, this setup allowed a 40-person team to develop the core banking product on remarkably modest hardware.
Those specs would be laughable by today’s standards: the average smartphone now has more computing power than our entire infrastructure did, and my phone alone could probably have outfitted a 100-person team of that era. To cope with bandwidth limitations, we relied on technologies like Citrix, which allowed users in one country to access data housed in a data center on another continent. Integration with upstream and downstream systems was supported, though online integrations were nowhere near as advanced as they are today.
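For readers who never built on this model, here is a minimal sketch of the client-server split described above. It is a hypothetical illustration in Python, not our actual stack (which was Oracle Forms talking to an Oracle Database): the server owns the data and the business logic, while the client is a thin front end on the local network.

```python
# A hypothetical sketch of the client-server model, not our real stack:
# the server owns the database and business logic; the client is a thin
# desktop front end that sends requests over the local network.

import socket

ACCOUNTS = {"ACC001": 1000.00}  # stand-in for the server-side database

def handle_request(request: str) -> str:
    """Server-side business logic, e.g. a balance enquiry."""
    op, _, account = request.partition(" ")
    if op == "BALANCE" and account in ACCOUNTS:
        return f"OK {ACCOUNTS[account]:.2f}"
    return "ERR unknown request"

def run_server(host: str = "127.0.0.1", port: int = 9090) -> None:
    """Accept one client request and answer it."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn:
            conn.sendall(handle_request(conn.recv(1024).decode()).encode())

def query_balance(account: str, host: str = "127.0.0.1", port: int = 9090) -> str:
    """Client side: the 'desktop application' end of the conversation."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(f"BALANCE {account}".encode())
        return sock.recv(1024).decode()
```

Every keystroke-to-result round trip crossed the network like this, which is exactly why bandwidth, and tools like Citrix, mattered so much.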
Today, I no longer code, but when I see development estimates, they often seem to be ten times what was required 25 years ago. Have back-office banking product processors changed significantly? I don’t think so. I am not referring to customer-facing applications accessed over the public internet; back-office applications have largely remained the same. They run on the intranet and consist of a set of user interfaces for transactions, complex batch programs for processing interest, charges, and the like, customer outputs such as advices and statements, some reports, and perhaps more APIs than before.
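To make that concrete, here is a simplified sketch of the kind of end-of-day interest accrual such batch programs perform. Real products layer day-count conventions, tiered rates, taxes, and charge schedules on top, but the core arithmetic has not changed in decades.

```python
# A simplified sketch of an end-of-day interest accrual, the kind of
# calculation back-office batch programs have run for decades.
from decimal import Decimal

def daily_interest_accrual(balance: Decimal, annual_rate: Decimal,
                           days_in_year: int = 365) -> Decimal:
    """Accrue one day's simple interest on a balance, rounded to cents."""
    return (balance * annual_rate / Decimal(days_in_year)).quantize(Decimal("0.01"))

# Example: 10,000.00 at 4% per annum accrues roughly 1.10 per day.
print(daily_interest_accrual(Decimal("10000.00"), Decimal("0.04")))
```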
This raises the question: what’s driving this exponential increase in development time and resources? Here’s my take.
Years ago, I read that the total computing power used to send Apollo 11 to the moon was less than what’s in an average laptop. Reflecting on my own career, I understand this hunger for hardware as technology evolves. But it isn’t just hardware; it’s also people, closer to ten times as many people! That’s hard to digest.
Interestingly, it is said that over 70% of the world’s business transactions still run on COBOL. Now, that is something to ponder!
Thank you for reading this. It is my first article on such a topic, and I hope you found it insightful. I would like to thank my first and only employer, my work colleagues, and the one and only Ramesh Srinivasan, my supervisor, guide, colleague, friend, and the GOAT of banking technology. All of us who worked with him owe a lot of our abilities to him. He was a 10X engineer 25 years ago; a conservative estimate is that he would be 100X today. Sadly, we lost him 18 years ago.
Before I forget, I must also thank ChatGPT for helping me articulate my writing far better than I could have on my own.