The software industry is driven by the quest for the killer application. Whenever a company releases new software or technology, everyone looks for the killer feature or the killer app that can take advantage of the new capabilities. For software developers, the killer app or killer feature is as important as it is for users: whether a developer is willing to invest time in learning a new technology or technique often depends on whether a killer app seems within reach. But is that approach still valid? Does the lack of a killer app immediately disqualify a technology?
I have recently had a number of interesting discussions with fellow developers that revolved around various kinds of killer apps, or the lack thereof. Often, people ask why technology X is going to be important to them: “What is the killer app for that technology?” You can substitute your favorite technology for “X.” For instance:
- What is the killer app for Tablet PCs?
- What is the killer feature of .NET that Java does not have?
- What is Windows Vista's killer feature?
There is no doubt that a “killer app” is worth its weight in gold. Having a killer application provides a serious unique selling advantage (“USA”… the holy grail for marketers) for the owner of the technology, and a large number of opportunities arise for third-party developers as well. But not every great technology can present a killer application right away. This has a lot to do with the entire software industry growing up. New technology often requires enormous investments of manpower, time, and money, and applications built on top of that new technology face equally high hurdles. The advantages created by new technologies are often not obvious because of their complexity. As a result, companies often fail to recognize the opportunities a new technology provides and never take the second step that would make those advances obvious. The companies or individuals that do take that next step, on the other hand, often manage to turn new technology into great applications, and put themselves (and their marketing departments) in the position of having a great unique selling advantage, even if their application could not be considered a “killer app.”
What Is a “Killer App”?
What exactly is a killer application (or killer feature, or killer technology)? Generally, the industry considers a killer application to be an application the typical user cannot do without. For instance, word processing software was once considered a killer application. The advantages of using word processing software over a conventional typewriter were so great that word processing alone justified purchasing a PC.
Similarly, desktop publishing and graphical design software provided advantages to the print and publishing industry that were so great that publishers who did not invest in such applications (and the graphical operating systems that went along with them) went out of business. This was not only great for the companies that created these graphical operating systems and design applications, but it also created a large market for third parties.
Another example is the Internet. In the not too distant past, it was practically impossible (especially for the everyday user) to connect computers that were not in the same office, let alone use such connections for large scale commerce. Today, the Internet is so important that practically every family has a computer that's connected to the Web.
Obvious Limitations
All these scenarios share one aspect: The “killer application” provided such tremendous advantages that one simply could not do without it anymore. There also is a second, less obvious shared aspect: The situation before the killer application involved a very severe limitation. For instance, we have known for a very long time that connecting computers on a large scale would provide great benefits, but there really was no good way for the common user to do so. (Does anyone care to explain to their grandmother how to connect to a BBS using an acoustic coupler?) Once all the technologies came together to form the Internet the way we know it today (IP, HTML, HTTP, browsers, modems, and so on), the entire package was a very compelling solution to a commonly understood problem.
The same is true for word processing. Did anyone really enjoy using white-out? Or how about the inability to insert text? And I am sure publishers didn't enjoy setting newspapers by hand with little metal letters. These limitations were obvious and rather well understood.
Since then, however, things have changed. The IT industry has matured, and for current needs, limitations generally are not quite as obvious. That does not mean that advances aren't happening or are less important.
The other day I ended up watching Godzilla (the modern version from 1998). This movie is not all that old, but it immediately struck me how simple and primitive things seemed just a few years ago. In one scene, the main character purchases a throw-away camera to take pictures of the monster. While throw-away cameras are still available today, the scene seemed completely out of place. Digital photography has taken over nowadays. Even if the protagonist didn't have a digital camera on him, he would most likely have had a cell phone with a built-in camera. And even if that were not the case, it would seem more logical to me to buy a pre-paid phone than a throw-away camera. But just a few years back, this was not the case.
Now, was there anything really wrong with photography before digital photography? Of course, there was the problem of film being expensive. Nevertheless, the limitation was not nearly as obvious as in the examples above. My parents would never have thought, “Gee, this camera really limits what I can do.” Quite the contrary: When digital photography was first introduced, the average user was worried about not being able to print out their pictures, and the professionals were not satisfied with the available photo quality. Very few people considered digital photography a “killer application” when it was first introduced, let alone before it became widely available (during beta tests, for instance).
What is important, though, is that digital photography still turned into an extremely important technology. Today, the vast majority of photos are taken digitally. In fact, according to Bill Gates' keynote at MEDC 2005, the majority of photos are now taken on cell phones. Photo quality is now great. And we also discovered that it is much more useful to e-mail photos to our friends than to have the printouts in some album in the closet that nobody ever looks at.
In my mind, digital photography is clearly a killer application, but one that was only easily identifiable after the fact.
Time to Measure Differently
The moral of all this is that we cannot measure technology the same way we did ten years ago. Killer applications in the traditional sense simply do not happen anymore, because we do not have that many obvious limitations left, and the ones that remain are extremely difficult to overcome. Nevertheless, very exciting advances happen, and our lives are being changed at a very rapid pace. Every time such advances happen, there are great opportunities for the developer community. Don't miss out on those just because “killer features” are not as obvious as they used to be.
Here are some technologies and developments that I think fall into the “not so obvious but extremely important” category:
- Windows Vista (formerly “Longhorn”): What exactly can Vista do that XP can't? Why would a large company upgrade its entire infrastructure from XP to Vista? Those are hard questions to answer (and I will leave that up to Microsoft). But what's important is not just the platform itself. Remember that Vista is an operating system, and operating systems are simply meant to operate computers and allow other applications to do useful things. We are so spoiled by all the things that Windows XP does for us that we now ask, “What can I do with Vista?” The key is that Vista is a much better platform for practically anything you want to do with your computer, including playing or writing games, reading Web pages, storing photos or other files, or building large-scale, distributed enterprise applications. Vista itself is pretty exciting, but what is truly important is that Vista will enable developers to build applications that they are not able to build today.
- Tablet PCs: The great feature Tablet PCs provide is the ability to use a pen and digital ink. Do I want to use ink for everything I do? Do I want to operate my portable PC in tablet mode all the time? Of course not! But when I do want to use tablet functionality or ink, it sure is awesome. I cannot point you to a single Tablet PC application that is a killer app in the sense that word processing was, yet I will never again buy a portable computer without Tablet PC functionality. The whole Tablet PC “package” is extremely compelling, and I simply must have it!
- .NET in General: Why exactly is .NET better than Java (or any other platform, for that matter)? Because it is so much more! But that is not easy to explain. To understand the significance of managed code or integrated security with stack walks (among many other things), people already have to understand a whole lot about .NET (the first code sketch following this list shows what such a stack-walking permission demand can look like). Once again, the overall package is extremely compelling even though its significance cannot be explained in a short elevator ride. And that is simply how modern, complex technology is: extremely useful, but not necessarily easy to comprehend.
- WinFX: For the first time in years, we are looking at a completely new way to program Windows applications. WinFX and its core technologies, such as WPF (Windows Presentation Foundation… formerly “Avalon”) and WCF (Windows Communication Foundation… formerly “Indigo”), will enable us to do things we cannot even dream of today. Will Avalon allow you to do things you cannot currently do in your word processor? In some details, probably yes, but as a generic answer, probably not. But that is not the point. What counts is that we can now venture beyond simple word processing. We can create multimedia experiences. Perhaps you would like to incorporate a short video or an animated chart into your sales forecast, and the difficulty of doing so finally becomes low enough to make it feasible (the second sketch following this list hints at how little code that might take). Or perhaps you are interested in digital publishing (as I am) and would like to create digital content that can rival printed materials (one of the obvious limitations that still exists today).
- WinFS: News of the death of WinFS has been greatly exaggerated. As you are reading these lines, Microsoft will have announced that WinFS (not to be confused with WinFX) is back and that beta one is available to Microsoft customers. WinFS is a next-generation indexed file system. Basically, it allows the user to treat the file system as a huge database with relational semantics, run advanced queries, and break out of the common folder structure. This is great news for developers as well, since all WinFS features are easily accessible from within WinFX. This allows us to take advantage of everything stored within WinFS (such as data or queries) and even add our own constructs, such as new file structures. Gone are the days of the home-brewed database or file format.
- Mobile Devices: Cell phones, Pocket PCs, computerized watches, you name it. The number of smart devices is growing rapidly, and as bandwidth and device quality increase, the opportunities for software developers will increase as well. Perhaps you still consider these devices unimportant for your development efforts because they can't do what you need. That is probably already incorrect today, and it certainly will be incorrect in the future. I am sure that mobile devices will be of high importance for the vast majority of CODE's readers within just a few years (if they aren't already developing for them today). Mark my words. I wish I could claim you heard it here first…
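To make the stack-walk point from the .NET item above a little more concrete, here is a minimal sketch of a declarative permission demand in C#. The class name, file path, and method are made up purely for illustration; the point is simply that the attribute causes the runtime to walk the entire call stack and verify that every caller has the demanded FileIO permission before the method runs.

```csharp
using System;
using System.IO;
using System.Security.Permissions;

public class PayrollExporter
{
    // Declarative demand: before Export() executes, the CLR walks the call
    // stack and checks that every assembly in the chain has permission to
    // write under C:\Payroll\. Less-trusted code calling through trusted
    // code gets stopped, something classic Win32 security cannot express
    // this easily. (Class, path, and file name are hypothetical.)
    [FileIOPermission(SecurityAction.Demand, Write = @"C:\Payroll\")]
    public void Export(string fileName)
    {
        File.WriteAllText(Path.Combine(@"C:\Payroll\", fileName), "report data...");
    }

    public static void Main()
    {
        new PayrollExporter().Export("report.txt");
    }
}
```

Explaining why that stack walk matters still takes a paragraph, which is exactly the elevator-pitch problem described above.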
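And to give an idea of the multimedia claim in the WinFX item, here is a rough sketch of what composing a video clip next to ordinary text could look like with the WPF object model. This is written against the general Avalon programming model rather than any particular beta build, and the window title, clip file name, and layout are assumptions for the example only.

```csharp
using System;
using System.Windows;
using System.Windows.Controls;

public class ForecastDemo
{
    [STAThread]
    public static void Main()
    {
        // A heading and a video clip composed in the same panel: no custom
        // drawing code and no third-party player component required.
        StackPanel layout = new StackPanel();

        TextBlock heading = new TextBlock();
        heading.Text = "Q3 Sales Forecast";
        heading.FontSize = 24;
        layout.Children.Add(heading);

        MediaElement clip = new MediaElement();
        clip.Source = new Uri("forecast-highlights.wmv", UriKind.Relative); // placeholder clip
        layout.Children.Add(clip);

        Window window = new Window();
        window.Title = "Sales Forecast";
        window.Content = layout;

        new Application().Run(window);   // show the window; the clip plays when loaded
    }
}
```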
And the list goes on and on. There are a lot of exciting things happening today, and I cannot mention them all. If you think that nothing exciting and significant is happening in the world of software development and technology today, then look back five or ten years. Chances are you had a similar opinion then. And then think about all the changes that have occurred since. What cell phone or mobile device did you use? Did you use the Internet back then? If so, what did you do online? How did you connect? What applications and development environments were you using? How did you get that software? How did you take pictures then? How many e-mails did you receive every day, and how often did you have to change the paper in your fax machine? What kind of computer did you have? Five or ten years ago, could you have interacted with a voice interface? What navigation system did you have in your car? How did you get directions in the first place? How did you read or watch the news? How did you publish information about yourself or your company?
Then compare those five to ten years to the five to ten years before, and the five to ten years before that. How many technologies from the most recent period were identified as killer apps ahead of time, compared to the earlier periods? I would argue that far fewer killer apps have been identified ahead of time in recent years than before, yet at the same time, the technical advances have become more drastic.
The point is that things are changing very rapidly and in very exciting ways. There certainly have been more large-scale changes in the way we use computers and software in the last five years than in the ten years before. Sure, 20 years ago killer applications were easy to find, and it was easy to jump on new technologies and be reasonably certain that they were significant. Today it takes a little more effort and in-depth understanding to judge the potential of a technology. But don't let that force you into missing out on awesome opportunities and exciting times. We are about to enter a very exciting time for technology. A new version of Visual Studio is around the corner, but more importantly, Windows Vista is not just a new version of Windows; it is the first of a completely new generation of Windows operating systems.
I cannot tell you what the next killer application will be, but I can tell you that what we are seeing on the horizon is the technology that future killer applications will be built upon. Whether we can identify them ahead of time, and whether they will fit the traditional description of a killer app or will be technologies that sneak into all parts of our lives, I am not sure. But they are coming. And who knows: maybe you or I can contribute to one of them.
I am very excited!