Archive

The Dulin Report

Browsable archive from the WordPress export.

Results (57)

  • Strategic activity mapping for software architects (May 25, 2025)
  • The future is bright (Mar 30, 2025)
  • The day I became an architect (Sep 11, 2024)
  • Are developer jobs truly in decline? (Jun 29, 2024)
  • Software Engineering is here to stay (Mar 3, 2024)
  • Some thoughts on the latest LastPass fiasco (Mar 5, 2023)
  • Book review: Clojure for the Brave and True (Oct 2, 2022)
  • Stop Shakespearizing (Sep 16, 2022)
  • Java is no longer relevant (May 29, 2022)
  • Automation and coding tools for pet projects on the Apple hardware (May 28, 2022)
  • If you haven’t done it already, get yourself a Raspberry Pi and install Linux on it (May 9, 2022)
  • Tools of the craft (Dec 18, 2021)
  • Kitchen table conversations (Nov 7, 2021)
  • Should we abolish Section 230? (Feb 1, 2021)
  • The passwords are no longer a necessity. Let’s find a good alternative. (Mar 2, 2020)
  • Adobe Creative Cloud is an example of iPad replacing a laptop (Jan 3, 2019)
  • Nobody wants your app (Aug 2, 2017)
  • TypeScript starts where JavaScript leaves off (Aug 2, 2017)
  • Node.js is a perfect enterprise application platform (Jul 30, 2017)
  • I built an ultimate development environment for iPad Pro. Here is how. (Jul 21, 2017)
  • The technology publishing industry needs to transform in order to survive (Jun 30, 2017)
  • Copyright in the 21st century or how "IT Gurus of Atlanta" plagiarized my and other's articles (Mar 21, 2017)
  • Emails, politics, and common sense (Jan 14, 2017)
  • Collaborative work in the cloud: what I learned teaching my daughter how to code (Dec 10, 2016)
  • Apple’s recent announcements have been underwhelming (Oct 29, 2016)
  • Don't trust your cloud service until you've read the terms (Sep 27, 2016)
  • I am addicted to Medium, and I am tempted to move my entire blog to it (Sep 9, 2016)
  • What I learned from using Amazon Alexa for a month (Sep 7, 2016)
  • Amazon Alexa is eating the retailers alive (Jun 22, 2016)
  • In Support Of Gary Johnson (Jun 13, 2016)
  • Why it makes perfect sense for Dropbox to leave AWS (May 7, 2016)
  • Managed IT is not the future of the cloud (Apr 9, 2016)
  • JavaScript as the language of the cloud (Feb 20, 2016)
  • In memory of Ed Yourdon (Jan 23, 2016)
  • OAuth 2.0: the protocol at the center of the universe (Jan 1, 2016)
  • Operations costs are the Achille's heel of NoSQL (Nov 23, 2015)
  • IT departments must transform in the face of the cloud revolution (Nov 9, 2015)
  • I Stand With Ahmed (Sep 19, 2015)
  • Top Ten Differences Between ActiveMQ and Amazon SQS (Sep 5, 2015)
  • What Every College Computer Science Freshman Should Know (Aug 14, 2015)
  • Social Media Detox (Jul 11, 2015)
  • Book Review: "Shop Class As Soulcraft" By Matthew B. Crawford (Jul 5, 2015)
  • Attracting STEM Graduates to Traditional Enterprise IT (Jul 4, 2015)
  • The longer the chain of responsibility the less likely there is anyone in the hierarchy who can actually accept it (Jun 7, 2015)
  • The Clarkson School Class of 2015 Commencement speech (May 5, 2015)
  • Why I am not Getting an Apple Watch For Now: Or Ever (Apr 26, 2015)
  • Building a Supercomputer in AWS: Is it even worth it? (Apr 13, 2015)
  • Exploration of the Software Engineering as a Profession (Apr 8, 2015)
  • Microsoft and Apple Have Everything to Lose if Chromebooks Succeed (Mar 31, 2015)
  • Do not apply data science methods without understanding them (Mar 25, 2015)
  • On apprenticeship (Feb 13, 2015)
  • On Managing Stress, Multitasking and Other New Year's Resolutions (Jan 1, 2015)
  • Why I am Tempted to Replace Cassandra With DynamoDB (Nov 13, 2014)
  • Thanking MIT Scratch (Sep 14, 2013)
  • Have computers become too complicated for teaching? (Jan 1, 2013)
  • Java, Linux and UNIX: How much things have progressed (Dec 7, 2010)
  • We are all contract professionals (Jan 13, 2007)

The future is bright

March 30, 2025

Is there a future in traditional computer science? The short answer is yes. I am confident that the future is bright. Here is the longer answer.

What is Computer Science?


Computer Science is the study of computation. It includes theoretical concepts like information theory, automata theory, and complexity theory. You will study algorithms and data structures and gain enough knowledge to build complex computational systems. Some of what you learn will make you a good developer in a programming language or two. For the most part, however, you should graduate with a profound understanding of “computation” that allows you to acquire new skills as you progress in your career. If your education does not give you that level of thinking, the university has failed you.

Computer Science is a young field relative to other sciences like math, physics, chemistry, and biology. In the late 1980s and early 1990s, when Computer Science as a college major was still rare, a student had to major in math or electrical engineering to learn computing. The math vs. electrical engineering track was mainly due to how a particular university implemented the computing program. Eventually, by the mid-1990s, Computer Science evolved as a standalone program in most universities as a hybrid between mathematics and electrical engineering.

Commoditization


Over time, some aspects of Computer Science have become commoditized. Basic skill in writing computer programs, for example, has been a commodity for quite some time: for the past twenty years, you have not needed a four-year college degree to become a full-stack developer.

Skills become commoditized because the computing industry moves toward higher abstractions, lowering the barriers to entry for newcomers. In the early days of computing, for example, a “coder” translated programming instructions into machine code and punched holes in paper cards. Over time, high-level programming languages made that task completely unnecessary, and yes, those who didn't grow beyond it lost their jobs.
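That jump in abstraction is visible inside any modern language runtime. As a minimal illustrative sketch in Python, the standard-library `dis` module shows the low-level instructions the interpreter now generates automatically, which is exactly the translation work the early “coders” once did by hand:

```python
import dis

def add(a, b):
    return a + b

# The translation from source code down to machine-level instructions,
# once a human "coder's" job punched onto cards, is now fully automated.
for instr in dis.Bytecode(add):
    print(instr.opname)
```

Running it prints the opcode names (which vary by interpreter version), such as `LOAD_FAST` and `RETURN_VALUE`; no human ever writes them anymore.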

It used to be that when you got a job at a bank writing banking software, you also had to write your own disk-access routines. We don't do that anymore because operating systems and programming languages have evolved.
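Today that abstraction is taken entirely for granted. As an illustrative sketch (the file name `ledger.txt` is a made-up example), a few lines of Python persist and read back records while the operating system handles sectors, buffering, and the filesystem:

```python
from pathlib import Path

# The OS and runtime handle the disk entirely; the application
# only expresses intent. (ledger.txt is a hypothetical example file.)
path = Path("ledger.txt")
path.write_text("credit,100\ndebit,42\n")

for line in path.read_text().splitlines():
    op, amount = line.split(",")
    print(op, amount)
```

No one writing this would call it "disk access routines," yet it replaces what was once a substantial piece of every application.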

Higher-level programming languages, open-source software, and coding boot camps—all of these positive developments lowered the barriers to entry for newcomers while commoditizing previously high-end skills. AI is another force driving the commoditization of some skills. Commoditization is a normal and constant feature of our field. It happens all the time, and it has always been this way.

You will not outcompete the commodity. Write that on your bathroom mirror or tattoo it somewhere you can see it. If your only talent is commodity skills, you will not succeed. Full-stack development is a commodity skill. Do not call yourself a full-stack developer: there are millions like you out there, and AI can do the work too.

Specialization and the Death of the Generalist Engineer


Over the past several years, large US tech companies (Google, Meta, Amazon, Microsoft, Apple, etc.) have shifted their hiring focus from “generalist” software engineers to more specialized roles. In the early 2010s, a typical software “generalist” might wear many hats, but by the late 2010s and early 2020s, the industry had introduced a multitude of niche titles dedicated to specific domains.

This trend has accelerated since around 2019 as companies grew in scale and complexity. Today, job postings and hiring patterns reflect a demand for domain experts – from machine learning engineers to front-end specialists – rather than one-size-fits-all developers. Several factors underpin this shift. The post-2019 hiring boom (and subsequent correction) forced companies to prioritize critical skill sets over broad talent pools. The maturation of cloud infrastructure and DevOps has created subfields requiring dedicated expertise. Meanwhile, rapid advances in AI/ML, data science, and cybersecurity have made deep domain knowledge indispensable.

Over the past twenty-five years, just as Computer Science emerged out of math and electrical engineering, other fields have emerged out of Computer Science. Good examples that come to mind are Data Science, Computational Biology, Computational Finance, Information Systems, and other domain-specific disciplines.

These new “splinter” fields rest on the same foundation as the classic Computer Science major while incorporating interdisciplinary studies. The better universities saw the trend early on. When I was doing my Computer Science Master's degree at NYU in the early 2000s, for example, NYU was experimenting with a Computational Biology program. A handful of US institutions have since established well-funded and well-supported programs in Computational Biology. Other universities retain Computer Science as a core major but require that students pick an area of specialization: healthcare, biology, finance, public policy, etc.

I remember an old piece of stock-picking advice that adapts well to choosing an area of specialization: look around and see what indispensable products and services you and the people around you use. Consider how you would improve them, develop your ideas, and see what sparks your passion. No software exists in a vacuum, without users or business needs, so you will have to choose a domain eventually. For me, that specialization became financial technology and related areas like CRM, ERP, payroll, and HCM.

Suppose you insist on staying at the core of computer science. Let me give you some ideas to consider:

  • Cybersecurity will never have any shortage of jobs or challenges to solve.

  • Related to cybersecurity is cryptography.

  • High-performance computing (even AI needs to run on some platform).

  • Platform-based ecosystems (AI does not exist in a vacuum; I am going to elaborate on this in another blog post).
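To make the cryptography point concrete, here is a minimal sketch using Python's standard library (the passphrase and parameters are placeholders, not a production recommendation). Understanding why the salt, the iteration count, and the constant-time comparison matter is exactly the kind of depth that resists commoditization:

```python
import hashlib
import hmac
import os

# Derive a password hash with a random salt (illustrative parameters only;
# a real system should follow current published key-derivation guidance).
salt = os.urandom(16)
stored = hashlib.pbkdf2_hmac("sha256", b"s3cret-passphrase", salt, 100_000)

# Verification recomputes the digest and compares in constant time
# to avoid leaking information through timing side channels.
candidate = hashlib.pbkdf2_hmac("sha256", b"s3cret-passphrase", salt, 100_000)
print(hmac.compare_digest(stored, candidate))
```

Anyone can call these functions; knowing when a scheme like this is broken is the specialist's job.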


There are no instructions


It is very easy to build a new app from scratch, vibe-coding or not. In the real world, however, very few projects start from zero lines of code. You have to work with vague requirements and immovable obstacles. There are no instructions.

What differentiates someone with a deep computer science background from someone without is their ability to turn abstract ideas into complex working systems, building upon existing technology and legacy code. Today’s world runs on billions of lines of legacy code: banks, governments, and airlines. Payments are made, funds are transferred, wars start and end, and airplanes fly. Your work must build upon what is already out there.

Someone with a three-month boot camp in full-stack coding cannot do this work. Vibe-coding bros are not going to do it either. AI can be a valuable tool for analyzing legacy code, but it won't wake up in the middle of the night when trades fail or airplanes fall out of the sky. Someone with a background in scientific methods will.

Consider this analogy: even though a modern car can be assembled entirely by robots in the dark, car mechanics can charge $200-$250/hour for specialized repair and maintenance work. I am confident we will be called upon to “repair” the AI-generated mess, just as we are called upon to clean up junior engineers' messes. That is why those of us with strong systems and analytical thinking skills will remain marketable for decades: true artisans are always in demand.

Have a skills strategy


My call to action is simple: whatever you do, develop a skills strategy:

  • Pay attention to emerging areas of innovation. Every few weeks, do an exercise where you think of ways a new technology might improve your productivity or the product you are working on. Example: Vibe Coding.

  • Focus on skills that are being rapidly adopted across the industry; that is where the highest demand for talent and money will be. Example: Generative AI adoption in the enterprise.

  • Move away from commoditized skills. Once you see everyone around you doing the same work you have been doing for years, or AI doing your job outright, you are at risk of being replaced. Example: full-stack development.