Tech Talk


Programming, Then and Now

Columnist: Michael E. Duffy
March, 2015 Issue


Forty years ago, in 1975, I was 20. Back then, there were no personal computers, no laptops (like the one I’m using to write this), no iPhones (or even cell phones), no Google Docs (which I’m using to write this), no Internet, no World Wide Web, no…well, you get the idea. 1975 marked the break of the ginormous wave of technology that we’ve been surfing for the last 40 years. Anyone reading this column has lived through at least part of this dramatic change. Businesses now have QuickBooks instead of a shoebox or a handwritten ledger. We email PDFs instead of faxing a paper document (although, mysteriously, the fax machine hasn’t completely died out). We have time clocks that read people’s fingerprints (no more clocking out for someone else). Many of us do all of our work in front of a computer screen.
But some things persist. There are still multi-part forms (white, yellow and pink copies—press hard), and probably more than a few manual typewriters being used to fill them out. Even with programmable coffee makers, people still fail to make a fresh pot of coffee after taking the last cup. And people still program in the C programming language.
I wrote my first serious C program at a student job in college, automating the previously manual process of typesetting the college course catalog. Today, I architect software systems for anti-gravity treadmills, and the code for them is still written in C. It’s one of the few pieces of technology that spans the past 40 years. C was developed at Bell Labs by Dennis Ritchie between 1969 and 1973. I learned it from some photocopied Bell documentation. A book, The C Programming Language, by Brian Kernighan and Ritchie, was published in 1978 and became the de facto standard for the language (C became a formal standard in 1989). Lots of people learned C from that book. Today, you can learn C online (for free, even).
Ritchie, who died in 2011 at age 70, created a language that’s the foundation of much of the world’s software infrastructure. For example, the Linux operating system and the Apache web server, which represent a huge part of the plumbing of the Internet and World Wide Web, are written in C. The software in your automobile is written in C. Even Windows and the Macintosh OS were originally written in C. C and its object-oriented sibling C++ (pronounced “cee-plus-plus”) remain popular today because they produce smaller, faster programs than most alternatives. Add the fact that C and C++ compilers are free and available for almost any computer, from 8-bit microcontrollers to 64-bit supercomputers, and you have an unstoppable force that has lasted four decades.
Even with the passage of nearly a half-century, though, programming hasn’t gotten easier. Forty years ago, the challenge was how to accomplish a particular task. It was unlikely that someone else had already encountered and solved the problem and, even if they had, it was even more unlikely you’d find out about it (back then, people who worked with particular types of systems would form user groups to share this information).
The spread of programming and the rise of the Internet have changed that. Nowadays, whenever I encounter a programming problem, my first question is, “Has anyone already solved this problem?” So, rather than think up a solution, I look for one. In theory, this should result in a huge productivity gain.
In some cases, I don’t even have to look very hard for solutions. Right now, I’m working with an STM32F4 microcontroller from STMicroelectronics (STM). To make choosing an STM part more attractive to people designing a microcontroller-based system, STM provides a very large library of sample code (written in C!) showing how to use the various features of the STM32F4. The company even sells a $15 developer kit, which includes a little printed-circuit board (PCB) with the microcontroller already on it, along with input and output pins for all the features it supports. The kit even includes the cable I need to connect the PCB to my computer. So I can start programming the software for a new product without having to first develop the hardware needed to prototype a system. This sure didn’t exist 40 (or even 20) years ago, at least not for $15!
So, I want my new system to talk to another device over a USB link. I look up an example program that does just that, using the library of code STM provides. I copy the example code and the library. It doesn’t work. I read the documentation for the library code and try again. It still doesn’t work. What’s wrong?
As I said in a previous column, programming always turns into debugging. But now, I’m not just debugging my own code. I’m debugging STM’s code. Did I use it correctly? Is there a bug in it? Where’s the productivity gain?
Next month, I’ll talk a little more about “reusable” code and the future of programming. In the meantime, send your biz tech questions to



