Archive

The Dulin Report


The future is bright

March 30, 2025

Is there a future in traditional computer science? The short answer is yes. I am confident that the future is bright. Here is the longer answer.

What is Computer Science?


Computer Science is the study of computation. It includes theoretical concepts like information theory, automata theory, and complexity theory. You will study algorithms and data structures and gain enough knowledge to build complex computational systems. Some of what you learn will make you a good developer in a programming language or two. For the most part, however, you should graduate with a profound understanding of “computation” that allows you to acquire new skills as you progress in your career. If your education does not give you that level of thinking, the university has failed you.

Computer Science is a young field relative to other sciences like math, physics, chemistry, and biology. In the late 1980s and early 1990s, when Computer Science as a college major was still rare, a student had to major in math or electrical engineering to learn computing; which track depended mainly on how a particular university implemented its computing program. By the mid-1990s, Computer Science had evolved into a standalone program at most universities, a hybrid of mathematics and electrical engineering.

Commoditization


Over time, some aspects of Computer Science have become commoditized. For example, basic skills in writing computer programs have been a commodity for quite some time: for the past twenty years, you have not needed a four-year college degree to become a full-stack developer.

Skills become commoditized because the computing industry moves toward higher abstractions, lowering the barriers to entry for newcomers. For example, in the early days of computing, a “coder” translated programming instructions into machine code and punched holes in paper cards. Over time, high-level programming languages made that task completely unnecessary, and yes, those who didn’t grow beyond it lost their jobs.

It used to be that when one got a job at a bank writing banking software, one also had to write one’s own disk-access routines. We don’t do that anymore because operating systems and programming languages have evolved.
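As an illustration (my sketch, not from the original post), consider how much of that low-level work the platform now absorbs: persisting and restoring a record takes a few lines, while the runtime and the operating system handle buffering, caching, and the actual disk I/O.

```python
# A minimal sketch (illustrative, not from the original post): persisting a
# "banking" record to disk today takes a few lines. The language runtime and
# the operating system absorb the buffering, caching, and disk scheduling
# that application programmers once wrote by hand.
import json
import os
import tempfile

record = {"account": "12345", "balance": 100.50}
path = os.path.join(tempfile.gettempdir(), "ledger.json")

with open(path, "w") as f:
    json.dump(record, f)  # no custom disk-access routine required

with open(path) as f:
    restored = json.load(f)

print(restored == record)  # True: the round trip is handled for us
```

That one-time "high-end" skill is now a library call, which is exactly what commoditization looks like in practice.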

Higher-level programming languages, open-source software, and coding boot camps: all of these positive developments lowered the barriers to entry for newcomers while commoditizing previously high-end skills. AI is another force driving the commoditization of some skills. Commoditization is a normal, constant feature of our field; it has always been this way.

You will not outcompete the commodity. Write that on your bathroom mirror or tattoo it somewhere you can see it. If your only talents are commodity skills, you will not succeed. Full-stack development is a commodity skill. Do not call yourself a full-stack developer: there are millions like you out there, and AI can do the same work.

Specialization and the Death of the Generalist Engineer


Over the past several years, large US tech companies (Google, Meta, Amazon, Microsoft, Apple, etc.) have shifted their hiring focus from “generalist” software engineers to more specialized roles. In the 2010s, a typical software generalist might wear many hats; by the late 2010s and early 2020s, however, the industry had introduced a multitude of niche titles dedicated to specific domains.

This trend has accelerated since around 2019 as companies have grown in scale and complexity. Today, job postings and hiring patterns reflect a demand for domain experts – from machine learning engineers to front-end specialists – rather than one-size-fits-all developers. Several factors underpin this shift. The post-2019 hiring boom (and subsequent correction) forced companies to prioritize critical skill sets over broad talent pools. The maturation of cloud infrastructure and DevOps has created subfields requiring dedicated expertise. Meanwhile, rapid advances in AI/ML, data science, and cybersecurity have made deep domain knowledge indispensable.

Over the past twenty-five years, just as Computer Science emerged out of math and electrical engineering, other fields have emerged out of Computer Science. Good examples are Data Science, Computational Biology, Computational Finance, Information Systems, and other domain-specific disciplines.

These new “splinter” fields rest on the same foundation as the classic Computer Science major while incorporating interdisciplinary studies. The better universities saw the trend early on. For example, while I was doing my Master’s degree in Computer Science at NYU in the early 2000s, NYU was experimenting with a Computational Biology program. A handful of US institutions have since established well-funded and well-supported programs in Computational Biology. Other universities retain Computer Science as a core major but require students to pick an area of specialization: healthcare, biology, finance, public policy, and so on.

I remember an old piece of stock-picking advice that can be adapted to choosing an area of specialization: look around and see what indispensable products and services you and the people around you use. Consider how you’d improve them, develop your ideas, and see what sparks your passion. No software exists in a vacuum, without users or business needs, and you will have to choose a domain eventually. For me, that specialization became financial technology and related areas like CRM, ERP, payroll, and HCM.

Suppose you insist on staying at the core of computer science. Let me give you some ideas to consider:

  • Cybersecurity will never have a shortage of jobs or challenges to solve;

  • Related to cybersecurity is cryptography;

  • High-performance computing (even AI needs to run on some platform);

  • Platform-based ecosystems (AI does not exist in a vacuum; I will elaborate on this in another blog post).


There are no instructions


It is very easy to build a new app from scratch -- vibe-coding or not. But very few real-world projects start from zero lines of code. You have to work with vague requirements and immovable obstacles. There are no instructions.

What differentiates someone with a deep computer science background from someone without one is the ability to turn abstract ideas into complex working systems, building upon existing technology and legacy code. Today’s world runs on billions of lines of legacy code in banks, governments, and airlines. Payments are made, funds are transferred, wars start and end, and airplanes fly. Your work must build upon what is already out there.

Someone with a three-month boot camp in full-stack coding cannot do this work. Vibe-coding bros are not going to do it either. AI can be a valuable tool for analyzing legacy code, but it won’t wake up in the middle of the night when trades fail or airplanes fall out of the sky. Someone with a background in scientific methods will.

Consider this analogy: even though a modern car can be assembled entirely by robots in the dark, car mechanics can charge $200-$250/hour for specialized repair and maintenance work. I am confident we’ll be called upon to “repair” AI-generated messes, just as we are called upon to clean up junior engineers’ messes. That is why those of us with strong systems and analytical thinking skills will remain marketable for decades -- true artisans are always successful in their fields.

Have a skills strategy


My call to action is simple: whatever you do, develop a skills strategy:

  • Pay attention to emerging areas of innovation. Every few weeks, think of ways a new technology might help you improve your productivity or the product you are working on. Example: vibe coding;

  • Focus on developing skills that are being rapidly adopted across the industry -- this is where the highest demand for talent and money will be. Example: generative AI adoption in the enterprise;

  • Move away from commoditized skills. Once you see people around you doing the same work you’ve been doing for years -- or AI doing your job -- you are at risk of being replaced. Example: full-stack development.