In this next installment of the “30 Years of CODE Group” celebration articles, we've now reached the halfway point. I'm looking back 15 years to roughly 2008 (give or take a year). I'm approaching a time that I start to perceive as “modern development.” Yes, a lot has changed (especially over the last year or two), but a lot of things we now take for granted started about 15 years ago. The “Cloud” was in its infancy (but already there). .NET Core wasn't quite a thing yet, but as we started developing Cloud apps, the early building blocks of Microsoft's re-imagined .NET ecosystem were being put into place. Web development was already mainstream. Mobile development wasn't quite there yet but it was starting. Except for artificial intelligence, a lot of the things we're doing as developers today were already in place (although in different forms). But before I get ahead of myself, let's survey the overall landscape of software development and the technologies we used at the time.
Operating Systems
For developers, the platform the code runs on is where it all starts, so let's take a look at the operating systems in use 15 years ago. In the last issue's article that looked 20 years back, I talked about Windows XP. Surprisingly, Windows XP was still a very important platform in 2008. Admittedly, Windows XP was a very good operating system for its time, but it also had serious shortcomings when it came to security. This was a problem Windows Vista attempted to fix as one of its key themes, another being better use of dedicated graphics hardware. As we came to find out, users didn't want to jump through the hoops required to have a secure operating system. They just didn't want anything bad to happen to them, while still happily running on Administrator accounts, with their machine's defenses taking a back seat. As a result, Windows Vista was one of the least liked Windows operating systems up to that point, probably only topped by Windows 8, half a decade later.
In hindsight, Windows Vista wasn't all bad. Yes, there were numerous valid points of criticism (hardware requirements were high, driver compatibility was low, and all kinds of security measures made it less pleasant to use than prior versions of Windows), but there were also a lot of good features. Utilization of GPUs is considered a given in any modern operating system, but it was a revolutionary step in Vista. Previously, graphics hardware was almost exclusively used by games, which used DirectX to generate high-performance output, while the “normal” Windows GUI was rendered in software through GDI (the Graphics Device Interface). This not only caused performance problems, but it also created an odd bifurcation of UI technologies that didn't always play well with each other. Using GPUs to render all Windows user interface elements not only offloaded UI rendering to the graphics hardware (thus freeing up the CPU for other work) but also enabled modern UI paradigms, such as 3D and transparent windows. Figure 1 shows how Windows Vista handled task switching in this new world.
Nevertheless, Windows Vista was never a fan favorite and was relatively swiftly replaced by Windows 7, which preserved many of Vista's benefits but packaged them in a much more palatable fashion for an audience that had, by then, also grown used to a more secure setup. It was arguably more user-friendly and faster, too. Unlike Windows Vista, Windows 7 holds a special place in the hearts of many users and remains one of the most beloved versions of the Windows operating system. It can be thought of as a much more polished version of Vista, with better stability and reliability, better software compatibility, cool features, and ease of use. As a result, Windows 7 had considerable staying power and longevity. We still encounter Windows 7 in the wild regularly when working with our consulting customers.
Overall, the world had changed a bit right around 2008 or so, in that it wasn't just a Windows world anymore. Apple had, by then, made considerable inroads and established Mac OS X Leopard as a serious contender and a hip client operating system. It came with a somewhat modernized user interface, better performance, and many improvements to its core features, such as Finder. It was a genuinely good operating system on its way into the mainstream, after previously leading more of a niche existence. This was in no small part due to the “Hello, I'm a Mac, and I'm a PC” commercials (Figure 2) that ran from 2006 to 2009 and were, arguably, one of the most effective ad campaigns ever run. Microsoft had little to offer in response.
The World Turns Mobile
The Mac wasn't the only success Apple was able to land during that time. Apple released the original iPhone (Figure 3) in 2007, and the iPhone 3G in 2008. The original iPhone was already a very attractive device, but the 3G version was what truly established the modern era of smartphones. Not only was it faster, but it also introduced the App Store and the “there's an app for that” era for mobile devices. Previously, Apple allowed no custom development for its phone, famously quipping that “the web was the development model” for that device. That seems unthinkable today, but prior to 2008, developing for the Apple ecosystem was not a big thing for most developers, because the Mac was a specialty platform and there was nothing else we could develop for in the Apple world that promised commercial success. All of that changed in 2008, and oodles of developers (including yours truly) dove head-first into the painful world of Objective-C development with Xcode. Thankfully, Apple later remedied the situation with the Swift programming language and better developer tools overall.
Then again, Apple wasn't the only player in that market. Previously, Microsoft had made a really good mobile platform, going back to Windows CE and Microsoft Smartphone. Steve Ballmer (at the time, CEO of Microsoft) held the belief that only phones with a physical keyboard had any chance of being successful in commercial scenarios. Not his finest moment, and it condemned Microsoft to a role where most younger developers today can't even conceive of a world in which Microsoft was a leader in mobile tech. (Not to mention Microsoft's once awesome Tablet PC initiative, which failed to claim the market that was - years later, in 2010 - claimed by Apple's iPad.)
Then again, it wasn't just Steve Ballmer who made that mistake. Research in Motion (RIM) was still a big player in the mobile market with its BlackBerry devices. And just like Ballmer, RIM bet on a physical keyboard on the device. A bet that paid off for a long time, but that time had just about run out. RIM didn't have much to put up against Apple's new offerings. That, as well as some other leadership mistakes, sank RIM's boat. (For an interesting topic that is beyond the scope of this article, and for those interested in the “make my country great again” line of politics, ask your favorite AI to recap RIM's adventures in building factories in Tierra del Fuego as part of Argentina's attempts to protect their local markets.)
The mobile market segment remained a key technology battleground for quite some time. Microsoft tried to mount a comeback with Windows Phone 7. Too little, too late. Microsoft tried to out-Apple Apple with a slick tile-based user interface. Although technically pretty darn good (Microsoft's touch keyboard remained the best such offering for quite some time), the overall offering was too weak, and the UI didn't cut it for most users. When Microsoft released its phone, it simply had too few apps, and even the big hitters, while present, only offered what must now be seen as skeleton versions of their apps that ticked the box but didn't satisfy users. Microsoft also attempted a parallel product with the Microsoft Kin, a phone that was supposed to appeal to young people but was so ill-received that, after an investment of north of a billion dollars, the plug was pulled less than two months after launch. If you don't remember the Kin, that just means Microsoft did some decent damage control, which was arguably the best part of the whole affair.
All of this left Google as the only true competitor in the mobile device market, with the Android operating system, which was first commercially released in the fall of 2008. Android was (and still is) exceptionally successful as an Apple iOS competitor, at least in the mobile phone market (although not so much for tablet devices). It's amazing to think that what played out 15 years ago is still the status quo in mobile devices, several generations of software later.
The Birth of the Cloud
Not only does the mobile device world as we know it today go back about 15 years, but so does the Cloud. It still seems to me like cloud computing is relatively new, but it was in October of 2008 that Microsoft first announced its cloud initiative, then called “Windows Azure,” the operating system that's “out there.” We all weren't quite sure what to make of it. How was this different from data centers that hosted your servers? Why was it called a “cloud operating system”? It made sense in some way, because it was a “thing that ran your stuff,” just like a local operating system, but we were scratching our heads over all the details. And how the heck did you even pronounce “Azure?” “The color of the cloudless sky” seemed like an awfully odd name for a cloud operating system (and still does today).
Again, Microsoft wasn't the only player. Amazon (the online bookstore? Yes!) had previously released Amazon S3 (Simple Storage Service) in 2006, followed by Amazon EC2 (Elastic Compute Cloud) a few months later. There were other vendors of various sizes that dabbled in this market as well, but back then (as today), Microsoft and Amazon were the biggest players in what grew into amazing platforms that make much of modern computing (including today's artificial intelligence craze) possible.
The cloud also drove various development trends in the years to come. Linux turned out to be an effective operating system for running cloud servers (and services), which is why everyone came to embrace it (to our surprise - at least at the time - even Microsoft embraced it). This ultimately drove a re-imagining of the Microsoft .NET platform as a modular, cross-platform development ecosystem. It also pushed web development further into the mainstream, which sparked many web frameworks, from jQuery (released in 2006) to the various ASP.NET flavors, Ruby on Rails, Angular, React, and many others that are too numerous to list here.
Software Development
For many of us, development was very Microsoft-centric. Much of what was going on there, we still recognize today: C#, JavaScript, the .NET Framework, ASP.NET, Visual Studio, SQL Server, and so on. Then there were the things we'd like to forget, such as Microsoft Silverlight (Figure 4), a competitor to both HTML and Adobe Flash. Boy, would I like the money back I invested in those technologies. And on a serious note: The fallout of Silverlight being abandoned still drives a lot of skepticism regarding the longevity of Microsoft developer technologies today, whether that's warranted or not. It wasn't the first technology Microsoft pushed hard and then abandoned (Microsoft Hailstorm and .NET My Services, anyone?), but it was probably the most visible and, for many developers, the most expensive and painful instance. Arguably, Microsoft's developer story never fully recovered from this episode.
.NET was in its third major generation with .NET 3.0. It had an amazing set of highly anticipated sub-technologies. WPF (Windows Presentation Foundation) revolutionized Windows desktop development with a super-powerful, graphics-accelerated approach. CODE Group did many projects on top of this platform. Ultimately, web development proved more enticing to a lot of companies, and WPF was also not necessarily for the faint of heart if you wanted to achieve truly great results. However, if you're interested in native Windows desktop development, it's still one of the best choices available.
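To give you a flavor of it: Most real-world WPF applications defined their user interfaces declaratively in XAML, but as a small, self-contained illustration, here's a hypothetical hello-world window built entirely in C# (the window title and button text are, of course, made up):

// A minimal sketch, assuming a .NET Framework 3.0+ project that references
// PresentationFramework, PresentationCore, and WindowsBase.
using System;
using System.Windows;
using System.Windows.Controls;

class WpfHelloWorld
{
    [STAThread] // WPF requires a single-threaded apartment for its UI thread
    static void Main()
    {
        // Build a trivial UI in code; real projects typically described this in XAML.
        Button button = new Button();
        button.Content = "Click me";
        button.Margin = new Thickness(20);
        button.Click += delegate { MessageBox.Show("Rendered by the GPU, not GDI."); };

        Window window = new Window();
        window.Title = "Hello, WPF";
        window.Width = 300;
        window.Height = 200;
        window.Content = button;

        // Run the WPF message loop with this window as the main window.
        new Application().Run(window);
    }
}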
Windows Communication Foundation (WCF) was the highly anticipated networking layer that provided remote service calls using protocols such as SOAP, which made it more efficient to build service-oriented systems and APIs. SOAP wasn't the only supported protocol. WCF featured quite a range of networking technologies, and many of the .NET superstar developers were very invested in it (and, at the time, rightfully so).
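For a bit of nostalgia, here's a minimal sketch of what a self-hosted WCF service looked like (the service name, address, and binding chosen here are purely illustrative):

// A minimal sketch, assuming a .NET Framework 3.0+ project that references System.ServiceModel.
using System;
using System.ServiceModel;

[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    string SayHello(string name);
}

public class GreetingService : IGreetingService
{
    public string SayHello(string name)
    {
        return "Hello, " + name + "!";
    }
}

class ServiceProgram
{
    static void Main()
    {
        // Self-host the service over plain SOAP/HTTP using BasicHttpBinding.
        using (ServiceHost host = new ServiceHost(
            typeof(GreetingService), new Uri("http://localhost:8080/greeting")))
        {
            host.AddServiceEndpoint(typeof(IGreetingService), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}

On the client side, tools such as svcutil.exe generated strongly typed proxy classes from the service's published metadata (WSDL), and switching from SOAP over HTTP to, say, binary TCP was largely a matter of configuration rather than code.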
Then there was the Windows Workflow Foundation (WWF), which later had to be renamed to WF because the World Wildlife Fund complained about the acronym. It provided a way to create - wait for it - workflows that could be complex, long-running, and transactional. In the end, this technology didn't have as much staying power as some other parts of .NET 3, because it simply didn't cover as many scenarios as the user interface and communication technologies did.
One of the huge aspects of .NET 3 was that it shipped out of the box with Windows Vista, making deployment of .NET 3 apps much easier and allowing various Microsoft teams to use .NET to develop even parts of Windows itself. This was a huge point of contention at the time (after all, .NET was a big deployment package) and it reportedly took the direct intervention of Bill Gates to finalize the decision to ship it “in the box”.
Other Technologies and Trends
2008 was a tumultuous year. The global financial crisis had a deep impact on many of us (while we were still recovering from the dotcom bubble), and, in hindsight, it's scary how close we came to a complete systemic breakdown. But there were also interesting and positive outcomes. Arguably, the world created some additional checks and balances (some effective, others not so much), and cryptocurrency was invented. It took a bit of time to catch on, but the mysterious Satoshi Nakamoto published the Bitcoin whitepaper that year, laying the foundation for the first decentralized cryptocurrency, which would go on to have a profound impact on the world of finance and beyond.
Significant events in the world of software included the launch of the Google Chrome web browser as well as the launch of GitHub. Yup, GitHub has indeed been around that long. It is, in fact, roughly the same age as the Large Hadron Collider (LHC) at CERN, which remains the world's highest-energy particle collider and has produced some amazing scientific results. Some of these might be documented on GitHub, I suspect.
What else happened around that time? Microsoft launched the Bing search engine. NASA launched its Kepler mission, designed to discover Earth-sized planets around other stars (and which went on to discover thousands of exoplanets). Amazon released the Kindle e-reader. And the software developer community in general was pretty excited about this new thing called HTML5.
Music, Movies, and Games
If you're a geek like me, a look back a decade and a half wouldn't be complete without a look at the pop culture of the day.
As discussed in last issue's article, the music industry never fully recovered from the Napster days, but good music was being made nonetheless. Coldplay, Rihanna, Beyoncé, the Foo Fighters, and Justin Timberlake were on the charts. And yes, even Taylor Swift was already in the news with her album “Fearless.”
In the movies, we watched The Dark Knight, Slumdog Millionaire, WALL-E, Harry Potter and the Order of the Phoenix, The Hangover, Avatar, Up, and J.J. Abrams's reboot of Star Trek. Yup, it's already been 15 years! Some of these are still on my “to be watched soon” list that I haven't quite gotten to yet.
Watching movies and TV series was about to be completely uprooted, even though we didn't know it yet. Netflix was moving from a DVD-based video rental company to streaming online. It had launched an auxiliary service called “Watch Now,” which seemed super-cool, but bandwidth limitations caused many to doubt that it would really work well. Today, “Watch Now” is just known as “Netflix,” and most of us don't even remember that Netflix used to send you DVDs in the mail not all that long ago. Netflix was the early leader in the streaming market, and it's still the leader today. There have been times when Netflix alone accounted for more than a third of North America's downstream internet traffic.
The highlight for me personally was the games that were released a decade and a half ago. Oh, what a golden time it was! Microsoft's Xbox 360 (Figure 5), released a few years earlier, was one of the most enjoyable gaming devices ever made, if you ask me. (And the complete re-launch of this consumer device's user interface - the NXE, the New Xbox Experience - was a ground-breaking step that hadn't been seen up till then.) At the same time, Sony pushed the envelope with the PlayStation 3, and Nintendo decided to go in a completely different direction with the innovative (although not very powerful) Nintendo Wii console. Meanwhile, the PC was pronounced dead and revived as a gaming platform numerous times, and kept churning out classic after classic.
In gaming, it isn't so much about the platform, of course; it's about the games. And boy, what a time for games it was! Massive hits from the period include timeless classics such as BioShock (Figure 6), The Witcher, Crysis, Portal (and the Orange Box together with Half-Life 2), Left 4 Dead, Fallout 3 (did you see the recent Amazon TV series?), Warhammer 40,000: Dawn of War II, Dragon Age (my personal favorite - boy, did I waste hours on this game!), Borderlands, Metal Gear Solid 4, Halo 3, Gears of War 2, Forza Motorsport 3 (Figure 7), League of Legends, and Minecraft.
Meanwhile, World of Warcraft was still going strong, and social gaming had reached a whole new level with Xbox Live and other online services. Valve had, at that point, a well-established digital sales channel with its Steam service, and it still maintains a dominant position today.
You'll have to excuse me now. I have to go and re-install some of these classics!