Message From the President’s Desk – Michael Olin
Summer has just officially arrived, and with it another urgent reminder that my President’s Message is overdue and that its absence is holding up publication and distribution of the summer issue of the NYOUG Technical Journal. To be honest, I haven’t given much thought to what I could write about, and I’m planning to watch some World Cup soccer this evening. However, a little over a month ago I did spend some time trying to put some profound thoughts to paper. In mid-May, I delivered the commencement address to the College of Computing and Information at the University at Albany, almost thirty years after I earned my degrees there.

Several retired faculty with whom I studied and worked still live locally, and the College was kind enough to assemble them, along with the current faculty, so that we could share a meal before the commencement ceremony. At lunch, I asked the Dean of the College about another faculty member who was not present. I was told that he was one of the last of their cohort to retire, and that he was finally convinced it was time to call it quits after one of his students remarked that his father had taken the same introductory course with that professor when he was a student.

The commencement followed lunch. I delivered my remarks, walked around the campus with my wife (also an alum), who had last been on campus more than a decade earlier, and we drove home. A day or two later, I received an email from the professor who had invited me to speak. He arrived at the University at Albany a year or two before I graduated and is now planning to retire as well. He thanked me for coming and told me that my remarks were well received. He then told me that at the university-wide commencement ceremony the day after the College’s, he met a former student who was there for his child’s graduation. They spoke about the prior day’s ceremony, and the professor mentioned that I had spoken to the graduates.
A look of recognition crossed the former student’s face, and he told the professor that he remembered me: I had been the teaching assistant in his introductory computer science class decades ago. Had I decided to become an academic, I now know that it would be time to retire!
Having provided a convoluted introduction to this message, I will now resort to the columnar cop-out. Instead of original thoughts for the summer, here are some recycled ones: my commencement address to the graduates of the College of Computing and Information at the University at Albany.
College of Computing and Information – University at Albany
Delivered by Michael Olin ’85, ’86
Albany, NY – May 17, 2014
Dean Faerman, faculty, family, friends, and most importantly, graduates of the College of Computing and Information: I doubt that I could have ever envisioned myself uttering these words thirty years ago, but “It’s great to be back in Albany!” Perhaps the sentiment should not be that much of a surprise to me. I’ve stayed involved with UAlbany since I was sitting where you are today, although back then we referred to this institution as “SUNYA.” Quite a bit has changed since I was a student. The Downtown Campus has been completely renovated; the Uptown Campus has been expanded with new academic buildings, dormitories, a basketball arena, and Bob Ford Field, the new football stadium; the University added the East Campus; and across Fuller Road, billions of dollars have been invested in NanoTech. The world records for the largest games of musical chairs and Twister that I helped set with a few thousand of my closest friends on campus no longer stand, and Mayfest is a distant memory, along with the electrifying 1983 performance of a relatively unknown group of twenty-something Irish rockers who called themselves “U2.”
The field of computing has changed just as dramatically, and not just in a Moore’s Law smaller-faster-cheaper way. By the time I arrived at UAlbany in the fall of 1982, many businesses had automated operational tasks such as payroll, inventory, and order entry using computers. Large businesses used mainframe computers or the minicomputers that had become so popular in the 1970s. Mid-sized businesses often did their computing through timeshare services, renting time on a computer that they could not afford to purchase outright. While the need for employees with programming and related skills was increasing, most large businesses were able to fill those positions with a combination of in-house training and the hiring of new computer science graduates. Although quite a few personal desktop computers came before it, the introduction of the IBM PC in 1981 changed everything. All of a sudden, computers were not just tools for big businesses or toys for hobbyists. IBM marketed the PC as a computer for everyone. By the time I graduated, almost six million PCs had been sold worldwide. Over the next four years, those sales exploded to over 60 million, and to date over 3.5 billion devices have been sold that qualify as “PC compatible.” When I graduated in the mid-1980s, there was an acute shortage of programmers. I cannot recall a single one of my fellow computer science graduates who did not have the luxury of choosing among multiple job offers. I had been accepted to law school and was seriously considering a career in “Computer Law,” which was then a relatively new and limited discipline, focused mainly on contracts and intellectual property. However, I chose to accept a job in a corporate IT department because the opportunities at the time seemed better. For a while, that was true.
From the time I graduated until the turn of the century, you could jump from job to job every year or two, increasing your compensation by double-digit percentages with each move. I specialized in designing and developing relational databases (thank you, Professors Ravi and Willard), and easily moved from assignment to assignment as an independent consultant. Everyone, it seemed, was using computers to automate absolutely everything. This explosion in the use of information technology sent the demand for skilled professionals through the roof. As computing moved toward ubiquity, these inexpensive machines became necessary tools and an integral part of just about every business’s daily operations. The online revolution soon followed, and the “tech bubble” created countless software millionaires. Many businesses believed that information technology could be used to gain a competitive advantage, and they invested heavily in both products and people. It was a great time to be working in IT.
A little over a decade ago, in 2003, Nicholas Carr, a writer who was at the time an editor at the Harvard Business Review, wrote an article entitled “IT Doesn’t Matter,” which he followed up the next year with a book, Does IT Matter? In the article, he compared the corporate adoption of information technology to the adoption of earlier technologies like railroads and electric power. At the time, corporate executives had fully embraced the idea that information technology provided a strategic advantage and that the way their companies made use of IT would differentiate them in the marketplace. Carr suggested that they were completely wrong. He argued:
Behind the change in thinking lies a simple assumption: that as IT’s potency and ubiquity have increased, so too has its strategic value. It’s a reasonable assumption, even an intuitive one. But it’s mistaken. What makes a resource truly strategic – what gives it the capacity to be the basis for a sustained competitive advantage – is not ubiquity, but scarcity. You only gain an edge over rivals by having or doing something that they can’t have or do.
Industry reaction to the article was swift; over the next several months, rebuttals to Carr’s thesis came from top executives at Microsoft, HP, Intel, Cisco, and many others. The debate was covered in The New York Times, in magazines such as Fortune and Wired, and in industry publications such as Computerworld. A more nuanced reading of Carr’s article would have acknowledged that in some respects he was absolutely correct. In cases where computing had merely replaced pen-and-paper business functions such as payroll, order entry, and invoicing, there was no competitive advantage to be gained. On the other hand, there was a strategic advantage to be gained from other uses of information technology. Financial services firms were quite confident that their proprietary quantitative analysis and algorithmic trading provided them with an edge in the marketplace.
Carr’s prediction of the commoditization of IT was cemented, however, by the next trend that swept through corporate America. The widespread adoption of outsourcing placed IT firmly on the cost-center side of the ledger. Successful CIOs were rewarded for how little they managed to spend, often without much thought to the value they provided to the business. In just a few years, the value of the average IT professional dropped significantly. Tens of thousands of IT workers were laid off, and the work that they had been doing was contracted out, often overseas. It is a bit of an oversimplification, but the difference between contracting for office supplies and contracting for information technology services has been getting smaller and smaller. We constantly hear from politicians, businesspeople, and educators that there is an enormous shortage of workers with STEM skills. Yet every year, tens if not hundreds of thousands of people all over the world enter the workforce with enough proficiency in these areas to meet the demand for the commodity that corporate IT has become. And here we are.
Once again, however, Moore’s law has reshaped the industry, this time with big data. The first IBM PC was able to address 32 megabytes of disk storage. A one-gigabyte disk drive for a mainframe computer was housed in a six-foot-tall cabinet, about the same size as a typical data center rack today. My son got a new cell phone earlier this month and was able to augment its 16 gigabytes of internal storage with a 128-gigabyte microSD card that is less than half a square inch in size. Large databases have long since blown past sizes measured in mega-, giga-, or even terabytes. We’ve moved on to the next orders of magnitude: peta-, exa-, zetta-, and yottabytes. A decade after Carr’s article asserting that “IT Doesn’t Matter,” the Harvard Business Review published an article entitled “Data Scientist: The Sexiest Job of the 21st Century.” Of course, this article generated just as much hand-wringing as Carr’s. Its premise has been contested in articles and blog posts, including InformationWeek’s “Data Scientist: The Sexiest Job No One Has.” Whether or not anyone actually hires “Data Scientists” is not that important. What is crucial, however, is that just as the ubiquity of computing led to the commoditization of IT, leveraging big data provides a roadmap for how businesses can once again use information technology to derive a sustained competitive advantage.
The data scientist, if such a person actually exists, mines the vast store of accumulated data to discover trends and correlations hidden among the petabytes. These insights are then used to guide the development of new products and services and to improve the marketing of existing ones. In other words, the data scientist’s work increases the revenue side of the ledger, whereas commoditized IT focused on decreasing the expense side. The vast middle ground between a profit center and a cost center was lost somewhere along the way to commoditization. People in corporate IT have always had the ability to contribute to increasing revenues, but that knowledge was shunted aside during the past decade’s focus on cost reduction.
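(An aside for readers of the Technical Journal: the correlation-hunting described above can be sketched in a few lines. The figures and field names below are invented purely for illustration, and real work at petabyte scale would use distributed tooling rather than plain Python; this is only a minimal sketch of the idea.)

```python
# Illustrative only: a tiny version of the data scientist's task of
# finding correlations hidden in business data. All figures are made up.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly figures: website visits vs. in-store sales.
web_visits  = [1200, 1350, 1100, 1600, 1750, 1500, 1900, 2100]
store_sales = [ 310,  340,  295,  395,  430,  370,  460,  510]

r = pearson(web_visits, store_sales)
print(f"correlation: {r:.3f}")  # a value near 1.0 suggests a strong link
```

A strong correlation like this one would not prove causation, of course; it is the starting point for the kind of revenue-side conversation with “the business” that the address goes on to recommend.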
I’ve been fortunate to spend the bulk of my career on projects that helped the businesses I worked for leverage technology to get better at whatever it is that they do. What I discovered is that the key to adding value, rather than just managing costs, is to maintain a holistic view of the organization where you work. Don’t just focus on the specific requirements of the project in front of you. Understand what the business does, what generates revenue, and where its profits come from. While it can be exciting to be heads-down coding, racing toward the next release, spending all of your time banging away at a keyboard is a sure way to be pigeonholed as a commodity resource. Cultivate relationships with people outside the IT department. Talk to the people whom IT tends to refer to collectively as “the business.” Understand how they do their jobs and think about how technology can make them better at what they do. Don’t spend all of your days locked inside the IT bubble. What you’ve learned here at UAlbany allows you to look at things through a different lens than the one the business uses. Combining your insights with theirs can steer information technology back toward providing a competitive advantage. Even if you’re not heading from here to the sexiest job of the 21st century, you can help ensure, once again and going forward, that IT really does matter.