Archive

The Dulin Report

Browsable archive from the WordPress export.

Results (45)

  • The future is bright (Mar 30, 2025)

  • On Amazon Prime Video’s move to a monolith (May 14, 2023)

  • One size does not fit all: neither cloud nor on-prem (Apr 10, 2023)

  • Some thoughts on the latest LastPass fiasco (Mar 5, 2023)

  • Comparing AWS SQS, SNS, and Kinesis: A Technical Breakdown for Enterprise Developers (Feb 11, 2023)

  • There is no such thing as one grand unified full-stack programming language (May 27, 2022)

  • Which AWS messaging and queuing service to use? (Jan 25, 2019)

  • Using Markov Chain Generator to create Donald Trump's state of union speech (Jan 20, 2019)

  • Adobe Creative Cloud is an example of iPad replacing a laptop (Jan 3, 2019)

  • Facebook is the new Microsoft (Apr 14, 2018)

  • Leaving Facebook and Twitter: here are the alternatives (Mar 25, 2018)

  • Rather than innovating Walmart bullies their tech vendors to leave AWS (Jun 27, 2017)

  • Architecting API ecosystems: my interview with Anthony Brovchenko of R. Culturi (Jun 5, 2017)

  • TDWI 2017, Chicago, IL: Architecting Modern Big Data API Ecosystems (May 30, 2017)

  • Online grocers have an additional burden to be reliable (Jan 5, 2017)

  • Windows 10: a confession from an iOS traitor (Jan 4, 2017)

  • What I learned from using Amazon Alexa for a month (Sep 7, 2016)

  • Why I switched to Android and Google Project Fi and why should you (Aug 28, 2016)

  • Amazon Alexa is eating the retailers alive (Jun 22, 2016)

  • In search for the mythical neutrality among top-tier public cloud providers (Jun 18, 2016)

  • What can we learn from the last week's salesforce.com outage? (May 15, 2016)

  • Why it makes perfect sense for Dropbox to leave AWS (May 7, 2016)

  • Our civilization has a single point of failure (Dec 16, 2015)

  • IT departments must transform in the face of the cloud revolution (Nov 9, 2015)

  • Setting Up Cross-Region Replication of AWS RDS for PostgreSQL (Sep 12, 2015)

  • Top Ten Differences Between ActiveMQ and Amazon SQS (Sep 5, 2015)

  • What Every College Computer Science Freshman Should Know (Aug 14, 2015)

  • Ten Questions to Consider Before Choosing Cassandra (Aug 8, 2015)

  • Big Data Should Be Used To Make Ads More Relevant (Jul 29, 2015)

  • Book Review: "Shop Class As Soulcraft" By Matthew B. Crawford (Jul 5, 2015)

  • Attracting STEM Graduates to Traditional Enterprise IT (Jul 4, 2015)

  • Smart IT Departments Own Their Business API and Take Ownership of Data Governance (May 13, 2015)

  • Guaranteeing Delivery of Messages with AWS SQS (May 9, 2015)

  • We Need a Cloud Version of Cassandra (May 7, 2015)

  • The Clarkson School Class of 2015 Commencement speech (May 5, 2015)

  • Building a Supercomputer in AWS: Is it even worth it? (Apr 13, 2015)

  • Ordered Sets and Logs in Cassandra vs SQL (Apr 8, 2015)

  • Microsoft and Apple Have Everything to Lose if Chromebooks Succeed (Mar 31, 2015)

  • Where AWS Elastic BeanStalk Could be Better (Mar 3, 2015)

  • Trying to Replace Cassandra with DynamoDB? Not so fast (Feb 2, 2015)

  • Why I am Tempted to Replace Cassandra With DynamoDB (Nov 13, 2014)

  • Infrastructure in the cloud vs on-premise (Aug 25, 2014)

  • Cassandra: a key puzzle piece in a design for failure (Aug 18, 2014)

  • Cassandra: Lessons Learned (Jun 6, 2014)

  • Things I wish Apache Cassandra was better at (Feb 12, 2014)

The future is bright

March 30, 2025

Is there a future in traditional computer science? The short answer is yes. I am confident that the future is bright. Here is the longer answer.

What is Computer Science?


Computer Science is the study of computation. It includes theoretical concepts like information theory, automata theory, and complexity theory. You will study algorithms and data structures and gain enough knowledge to build complex computational systems. Some of what you learn will make you a good developer in a programming language or two. For the most part, however, you should graduate with a profound understanding of “computation” that will allow you to acquire new skills as you progress in your career. If your education doesn’t give you that level of thinking, the university has failed you.
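To make the automata theory mentioned above concrete, here is a toy example (mine, not from the original post): a deterministic finite automaton, one of the simplest models of computation studied in the field, sketched in Python.

```python
# A toy deterministic finite automaton (DFA) that accepts binary
# strings containing an even number of 1s. The DFA is just a start
# state, a transition table, and a set of accepting states.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(s: str) -> bool:
    """Run the DFA over the input; accept if we end in state 'even'."""
    state = "even"
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == "even"

print(accepts("1001"))  # two 1s -> True
print(accepts("1011"))  # three 1s -> False
```

The point of studying such models is not the twenty lines of code; it is understanding what classes of problems machines like this can and cannot solve.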

Computer Science is a young field relative to other sciences like math, physics, chemistry, and biology. In the late 1980s and early 1990s, when Computer Science as a college major was still rare, a student had to major in math or electrical engineering to learn computing. The math vs. electrical engineering track was mainly due to how a particular university implemented the computing program. Eventually, by the mid-1990s, Computer Science evolved as a standalone program in most universities as a hybrid between mathematics and electrical engineering.

Commoditization


Over time, some aspects of Computer Science have become commoditized. For example, basic skills in writing computer programs have been a commodity for quite some time: for the past twenty years, you haven’t needed a four-year college degree to become a full-stack developer.

Skills become commoditized because the computing industry moves towards higher abstractions, lowering the barriers to entry for newcomers. For example, in the early days of computing, a “coder” translated programming instructions into machine code and punched holes in paper cards. Over time, high-level programming languages made that task completely unnecessary—and yes, those who didn’t grow beyond it lost their jobs.

It used to be that when one got a job at a bank writing banking software, one also had to write one’s own disk-access routines. We don’t do that anymore because operating systems and programming languages have evolved.

Higher-level programming languages, open-source software, and coding boot camps—all of these positive developments lowered the barriers to entry for newcomers while commoditizing previously high-end skills. AI is another force driving the commoditization of some skills. Commoditization is a normal and constant feature of our field. It happens all the time, and it has always been this way.

You will not outcompete the commodity. Write it on your bathroom mirror, or tattoo it somewhere you can see it. If your only talents are commodity skills, you will not succeed. Full-stack development is a commodity skill. Do not call yourself a full-stack developer: there are millions like you out there, and AI can do the work.

Specialization and the Death of the Generalist Engineer


Over the past several years, large US tech companies (Google, Meta, Amazon, Microsoft, Apple, etc.) have shifted their hiring focus from “generalist” software engineers to more specialized roles. In the 2010s, a typical software “generalist” might wear many hats, but by the late 2010s and early 2020s, the industry had introduced a multitude of niche titles dedicated to specific domains.

This trend has accelerated since around 2019 as companies grew in scale and complexity. Today, job postings and hiring patterns reflect a demand for domain experts – from machine learning engineers to front-end specialists – rather than one-size-fits-all developers. Several factors underpin this shift. The post-2019 hiring boom (and subsequent correction) forced companies to prioritize critical skill sets over broad talent pools. The maturation of cloud infrastructure and DevOps has created subfields requiring dedicated expertise. Meanwhile, rapid advances in AI/ML, data science, and cybersecurity have made deep domain knowledge indispensable.

Over the past twenty-five years, just as Computer Science emerged out of math and electrical engineering, other fields have emerged out of Computer Science. Good examples that come to mind are Data Science, Computational Biology, Computational Finance, Information Systems, and other domain-specific disciplines.

These new “splinter” fields rest on the same foundation as the classic Computer Science major while incorporating interdisciplinary studies. The better universities saw the trend early. For example, while I was doing my master’s degree in Computer Science at NYU in the early 2000s, NYU was experimenting with a Computational Biology program. A handful of US institutions have since established well-funded, well-supported programs in Computational Biology. Other universities retain Computer Science as a core major but require students to pick an area of specialization: healthcare, biology, finance, public policy, etc.

I remember an old piece of stock-picking advice that adapts well to choosing an area of specialization: look around and see which indispensable products and services you and the people around you use. Consider how you’d improve them, develop your ideas, and see what sparks your passion. The reality is that no software exists in a vacuum, without users or business needs. You will have to choose one specialization or another eventually. For me, that specialization became financial technology and related areas like CRM, ERP, payroll, and HCM.

If you insist on staying at the core of computer science, let me give you some ideas to consider:

  • Cybersecurity will never have any shortage of jobs or challenges to solve.

  • Closely related to cybersecurity is cryptography.

  • High-performance computing (even AI needs to run on some platform).

  • Platform-based ecosystems (AI does not exist in a vacuum; I am going to elaborate on this in another blog post).
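As a taste of the cryptography specialization above, here is a minimal sketch (my illustration, not from the original post) of a message-integrity check using only Python’s standard library; the key and message are made-up example values.

```python
# Verify message integrity with an HMAC (hash-based message
# authentication code) from the Python standard library.
import hashlib
import hmac

key = b"example-shared-secret"        # hypothetical shared key
message = b"transfer $100 to account 42"

# Sender computes a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time,
# which avoids leaking information through timing differences.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))  # True for an untampered message
```

A tampered message produces a different tag, so the comparison fails; understanding why the comparison must be constant-time is exactly the kind of depth this specialization demands.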


There are no instructions


It is very easy to build a new app from scratch, vibe-coding or not. But very few real-world projects start from zero lines of code. You have to work with vague requirements and immovable obstacles. There are no instructions.

What differentiates someone with a deep computer science background from someone without is their ability to turn abstract ideas into complex working systems, building upon existing technology and legacy code. Today’s world runs on billions of lines of legacy code: banks, governments, and airlines. Payments are made, funds are transferred, wars start and end, and airplanes fly. Your work must build upon what is already out there.

Someone with a three-month boot-camp course in full-stack coding cannot do this work. Vibe-coding bros are not going to do it either. AI can be a valuable tool for analyzing legacy code, but it won’t wake up in the middle of the night when trades fail or airplanes fall out of the sky. Someone with a grounding in scientific methods will.

Consider this analogy: even though a modern car can be assembled entirely by robots in the dark, car mechanics can charge $200-$250/hour for specialized repair and maintenance work. I am confident we’ll be called upon to “repair” the AI-generated mess, just as we are called upon to clean up after junior engineers. That is why those of us with strong systems and analytical thinking skills will remain marketable for decades: true artisans are always in demand.

Have a skills strategy


My call to action is simple: whatever you do, develop a skills strategy:

  • Pay attention to emerging areas of innovation. Every few weeks, think of ways a new technology might improve your productivity or the product you are working on. Example: vibe coding.

  • Focus on developing skills poised for rapid adoption across the industry. This is where the highest demand for talent, and the money, will be. Example: generative AI adoption in the enterprise.

  • Move away from commoditized skills. Once you see people around you doing the same work you’ve been doing for years, or AI doing your job, you are at risk of being substituted. Example: full-stack development.