Thursday, August 1, 2013

The Only Constant Is Change

“Dad, giraffes don’t make noise,” said my adorable little girl.

“Is that true, Zoe?”

“Yes, Google it,” she replied.

I laughed. My child, who was not speaking in complete sentences a few years ago, was now using “google” as a verb in perfect context. Just as my generation made fun of our parents’ generation, our children will make fun of us. Theirs is an entire generation that will never use a landline or buy a compact disc or DVD, and that will never know a life without the Internet, Facebook, and more.

I am one-third to one-half of the way through a career in Information Technology, and now is as good a time as ever to reflect on how much has changed in IT since I started. I graduated from college and started working professionally in 1995. I considered myself lucky to have a laptop instead of a desktop. It was an Intel 486 without a CD-ROM. It took several minutes to boot Windows for Workgroups 3.11 and log into a Novell network.

Plenty of people I worked with had their Certified NetWare Engineer certifications. Novell was absolutely huge. Logging into its servers enabled extremely crude file sharing, mapping of network resources to local drives, and print services. In a span of a few years, Novell all but disappeared from the corporate landscape, replaced by Microsoft’s Windows NT Server and, later, Active Directory.

Active Directory allowed IT departments to control what software was loaded on their employees’ PCs, run scripts, and update anti-virus software. It worked great when a user had one device hooked directly into a corporate network. It worked less well when a user had multiple devices and tried to access corporate resources away from the office. And it only worked with Windows PCs.

In 1995, Apple Computer was as good as dead. A few people had Macs in college, but in the enterprise it was all about Windows PCs. A decade and a half later, Apple (no longer Apple Computer) is one of the most valuable companies in the world. While Mac OS X is far from dominating corporate IT departments, representing about 10% - 15% of personal computing devices, it is now finally receiving limited enterprise support. Mac users love their Macs and are adamant about using them at work. Yet which operating system is being used matters far less than it used to, since many corporate systems are now written as web applications accessed through a browser. So long as a user can reach the resource, the OS is of little consequence. Does Active Directory still matter in this environment? The days of a single user with a single computer are long gone. Most corporate users have a work laptop, a home computer, a smartphone, and possibly a tablet, and they want their information on whichever device is closest.

As the phrase “World Wide Web” was repeated over and over again in the mid-nineties, Netscape shot to prominence with its Netscape Navigator browser. Without worrying about minor details like profitability, plans for profitability, or a product roadmap, Netscape went public, launching the dot-com bubble. It was the sexy Internet startup and the tech media darling.

Microsoft was worried, rightfully so, that the OS would become irrelevant and was determined to control the browser experience. Internet Explorer came from behind and, in a few short years, was being used by over 90% of PC users. Netscape eventually sold itself to AOL for several billion dollars and was never heard from again.

Internet Explorer ruled the Internet, but only for a short period of time. Apple, meanwhile, wielded a subtle influence on the future of the web. Although Safari has a small market share, it was built on the WebKit engine, which Apple open sourced. Google built its Chrome browser on WebKit, while a community of developers produced Firefox on Mozilla’s Gecko engine, a descendant of Netscape’s code. The default browser on the dominant operating system (Windows) is now routinely swapped for Chrome or Firefox, to the point where IE represents only 30-40% of the browser market. Geeks being passionate about their browser is one thing, but at this point non-geeks are downloading Chrome and Firefox in droves.

Speaking of AOL, in the mid-nineties it had well over 20 million subscribers paying $20 per month for the privilege of accessing the Internet over a telephone line at speeds of up to 56 kbps. It completely dominated consumer access to the Internet. The phrase “you’ve got mail” even spawned an awful movie starring Tom Hanks and Meg Ryan. The only problem was that after the novelty of “being online” wore off, no one wanted to connect through a phone line at painfully slow speeds. Around the time AOL purchased Time Warner, in what is widely considered one of the most disastrous corporate mergers of all time, cable and DSL providers started offering always-on Internet connections at speeds orders of magnitude faster, and AOL became increasingly less important. Time Warner eventually spun itself back out of AOL, and the company is now worth considerably less than what it paid to acquire Netscape over a decade ago.

During the dot-com bubble days, companies talked about eyeballs. Acquiring eyeballs. Aggregating eyeballs. Eyeballs were going to lead to sweet advertising dollars. Yahoo! was the leading web portal. Except it turned out display advertising was not nearly as valuable as initially thought. An upstart, Google, produced great search results and built a minimalist web site around them. Google charged companies for ads placed alongside its search results and made buckets of cash. Yahoo! stumbled, eventually declined Microsoft’s roughly $45 billion buyout offer, and is now, a few years later, valued at well less than half that amount.

In the early 2000s, corporate executives loved their BlackBerrys. They loved them so much the devices were nicknamed “crackberries,” as the execs who carried them seemed physically addicted. Then, in less than half a decade, BlackBerry’s market share went sharply into reverse, replaced by iPhones and Android smartphones.

In the last fifteen years, Microsoft Exchange became the de facto standard in most corporations, replacing products from Novell and Lotus Notes. Upstarts like Yahoo! seemed unstoppable for a few years, only to be displaced by an even newer company a few years later. Beloved products like the BlackBerry have been discarded in a few scant years. Complex corporate systems that used to run as executable programs on PCs have been replaced by browser-based intranet systems. It seems like every piece of technology is evolving at an increasingly rapid pace, and that is just what you can see...

Beneath the surface... Programs used to be written in procedural languages. Then along came object-oriented languages. In OO, C++ was dominant, but then Java came along and got rid of the worst of C++: multiple inheritance, pointers, manual memory management, header files, and more. Three-tier architecture became the standard over the two-tier client/server model. New frameworks like Django and Ruby on Rails were designed to make web development even faster.

On the front end, just when it seemed like Flash was the standard for rich client interactions in the browser, Apple’s steadfast refusal to support it on the iPhone and iPad killed it. Seemingly minutes later, JavaScript libraries like jQuery became the new standard.

On the database side, new concepts like eventual consistency and big data offered a departure from traditional relational databases. Even within the relational world, Hibernate, the Spring framework, and other Object-Relational Mapping (ORM) technologies made the hours I had spent writing SQL semi-obsolete. Where the database world once had IBM, Oracle, and Microsoft slugging it out for corporate dollars, it now has open source alternatives such as MySQL, PostgreSQL, Cassandra, and MongoDB that are every bit as good as (if not better than) their expensive closed-source brethren.
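The ORM idea can be sketched in a few lines: instead of hand-writing SQL for every access, a mapper generates it from an object definition. This is a toy illustration of the concept, not any particular framework’s API; the `User` class and `save` helper are invented for the example.

```python
import sqlite3
from dataclasses import dataclass, fields, astuple

@dataclass
class User:
    id: int
    name: str

def save(conn, obj):
    # Generate the INSERT statement from the object's fields --
    # the way an ORM spares you from writing SQL by hand.
    cols = [f.name for f in fields(obj)]
    placeholders = ", ".join("?" for _ in cols)
    sql = f"INSERT INTO {type(obj).__name__.lower()} ({', '.join(cols)}) VALUES ({placeholders})"
    conn.execute(sql, astuple(obj))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
save(conn, User(1, "Zoe"))
print(conn.execute("SELECT name FROM user").fetchone()[0])  # prints "Zoe"
```

Real ORMs add query generation, caching, and relationship handling, but the core trade is the same: you describe the data once and let the framework write the repetitive SQL.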

Yet amongst this sea of change, one thing has remained remarkably consistent. People create data. That data must be stored somewhere. Rules need to be established to define what data is accepted. The data must be secured. Solving problems in technology does not require being an expert in every newfangled technology. It requires an open mind and flexibility. It requires good design: intuitive interfaces for users, scalability, and configurability. The syntax of the language is less important than understanding how to structure the data, validate it, interface with it, and secure it.
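“Rules that define what data is accepted” is the same idea in any language or era. A minimal sketch in Python; the record fields and rules here are invented for illustration:

```python
def validate_user(record):
    """Apply the acceptance rules before the data is stored anywhere."""
    errors = []
    # Rule 1: an email must be present and minimally well-formed.
    if not record.get("email") or "@" not in record["email"]:
        errors.append("email is missing or malformed")
    # Rule 2: age must be an integer in a sane range.
    if not isinstance(record.get("age"), int) or not (0 <= record["age"] <= 150):
        errors.append("age must be an integer between 0 and 150")
    return errors

print(validate_user({"email": "zoe@example.com", "age": 7}))  # [] -- accepted
print(validate_user({"email": "nope", "age": -1}))            # two errors
```

Whether the rules live in a stored procedure, a Java bean, or a web framework validator, the skill being exercised is the same one.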

Everyone throws around the buzzwords “flexible” and “scalable,” yet few actually define them. So I will. Scalable means the ability to seamlessly add users and/or data with minimal impact on performance. Flexible means anticipating future needs well enough that the system can be modified via configuration rather than code changes for the foreseeable future. Neither of these critical design principles has anything to do with syntax or using the latest and greatest; both come from a deep understanding of the problem domain.
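One way to make the “configuration versus code changes” distinction concrete: business limits live in data the system reads, not in the program text. A hypothetical sketch, with invented names and limits:

```python
# Flexible design: the rules live in configuration, so changing a limit
# means editing data, not shipping new code.
CONFIG = {
    "max_upload_mb": 10,
    "allowed_extensions": [".pdf", ".png"],
}

def upload_allowed(filename, size_mb, config=CONFIG):
    ext = filename[filename.rfind("."):].lower()
    return size_mb <= config["max_upload_mb"] and ext in config["allowed_extensions"]

print(upload_allowed("report.pdf", 4))   # True
# Raising the limit is a config edit, not a code change:
CONFIG["max_upload_mb"] = 50
print(upload_allowed("scan.png", 42))    # True
```

In a real system the configuration would sit in a database or file rather than a dictionary, but the design principle is identical: anticipate what will change and move it out of the code path.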

Too often the emphasis is on the low level. “How many years of SQL Server do you have?” is the wrong question. “How do you model this problem?” is the right one. I am shocked by the number of people I have run into who do not understand basic concepts of data normalization. Not understanding this leads to bad, inflexible design. These people are hired for their syntactical experience, not their problem solving. Except the syntactical skills don’t really matter because, in this field, the only constant is change.
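Normalization is easy to demonstrate: repeating a customer’s details on every order row invites inconsistency, while factoring them into their own table keeps one authoritative copy. A small sketch using Python’s built-in sqlite3; the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Normalized: customer details live in one place; orders reference them by key.
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customer(id),
                         total REAL);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp', 'Boston')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 99.5), (2, 1, 10.0)])

# Because the city is stored once, a move is one UPDATE -- not one per order row,
# with no risk of some rows saying Boston and others Austin.
conn.execute("UPDATE customer SET city = 'Austin' WHERE id = 1")

row = conn.execute("""
    SELECT c.city, SUM(o.total)
    FROM orders o JOIN customer c ON o.customer_id = c.id
""").fetchone()
print(row)  # ('Austin', 109.5)
```

This is exactly the kind of modeling question that outlives any particular database vendor or language.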
