As we revel in the new world of Cloud Computing, Big Data and Mobility via BYOD (bring your own device), it is worthwhile to reflect back and see how computer technology has progressed over the last half century.
The Information Revolution
First there was the “Agricultural Revolution”, which lasted for about 900 years. Then came the “Industrial Revolution” towards the second half of the 19th Century, with the invention of the steam engine and the internal combustion engine; that lasted for about 150 years. The period starting in 1950 and running through the second half of the 20th Century has come to be known as the “Information Revolution”. General wisdom predicts this period will last 70 to 80 years, and we are past the halfway mark. The follow-on era will be the “Biotechnology Revolution”. Interestingly, each era shrinks in length, meaning the rate of progress is accelerating.
So what is the “Information Revolution”? Our capacity to store, communicate and transform information has been growing for millennia. Storing information has evolved from flat stones and paper to today’s magnetic and optical devices. Transporting information has evolved from physical messengers and carrier pigeons through the heliograph, telegraph, telephone, radio and TV, and now to the Internet. To transform information into knowledge and wisdom, we have always used our brains and continue to do so. Computers perform some repetitive brain tasks faster, but we have a long way to go before we can mimic how the brain operates. The general-purpose nature of computers has made them useful across a wide range of applications.
Today, IT is proliferating all around us and you cannot ignore it. Modern cars have 40-50 microprocessors inside them to control navigation, fuel injection, braking, suspension, entertainment, climate control and so on. Even lowly washing machines, color TVs and microwave ovens have chips controlling them. DVD players, iPods and other MP3 players, TV remote controls, cell phones, ATMs (Automated Teller Machines), cable TV, the Internet, dinosaurs in movies, email and chat are all products of IT. Hence awareness of the fascinating story of IT is becoming a necessity.
Let me quote two visionaries who are pioneers in the evolution of computers and how they saw the future.
“One evening I was sitting in the rooms of the Analytical Society at Cambridge, in a kind of dreamy mood, with a table of logarithms open before me. Another member, coming into the room and seeing me half asleep, called out, ‘What are you dreaming about, Babbage?’ I said, ‘I am thinking that all these tables may be calculated by machine.’” – Charles Babbage (1791-1871)
“The inside of a computer is as dumb as hell but it goes like mad! It can perform very many millions of simple operations a second and is just like a very fast dumb file clerk. It is only because it is able to do things so fast that we do not notice that it is doing things very stupidly.” – Richard Feynman, Physics Nobel Laureate (1918-1988)
The First Two Inflection Points
IBM dominated the decade of the 1970s via its breakthrough mainframe technology, the System/370, which dominated the computing landscape for the next 15 years. Then during the 1980s Moore’s law took over (named after Dr. Gordon Moore, co-founder of Intel, who postulated back in 1965 that microprocessor capacity would double every 18 months), and the wealth creation came out of the fast-emerging PC industry. Both Microsoft and Intel dominated this landscape. The 1990s ushered in the era of the Internet, a major inflection point in computing. The Internet followed Metcalfe’s law (named after Bob Metcalfe, father of the Ethernet), which states that the value of a network is directly proportional to the square of the number of users.
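The arithmetic behind these two laws is easy to sketch. The snippet below is a minimal illustration, not a precise model: it assumes Moore’s law as an 18-month doubling period and Metcalfe’s law as value proportional to the square of the number of users, exactly as stated above (the function names and the proportionality constant are illustrative choices).

```python
def moore_capacity(base_capacity, months, doubling_period=18):
    """Capacity after `months` months, doubling every `doubling_period` months."""
    return base_capacity * 2 ** (months / doubling_period)

def metcalfe_value(users, k=1.0):
    """Network value proportional to the square of the number of users."""
    return k * users ** 2

# A decade of Moore's law: 120 / 18 ~ 6.7 doublings, roughly a 100x gain.
print(round(moore_capacity(1.0, 120)))   # ~ 102

# Metcalfe's law: doubling the user base quadruples the network's value.
print(metcalfe_value(2000) / metcalfe_value(1000))   # 4.0
```

The contrast is the point: Moore’s law compounds exponentially over time, while Metcalfe’s law compounds with network size, which is why the PC era and the Internet era created wealth in such different ways.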
The Arrival Of The Internet
Almost every new idea starts as a “bad idea”, its scope not well understood at the initial stage. Marconi thought his new radio would be used only for ship-to-shore communication. The invention of the transistor received a tiny mention in the New York Times, which said it “might be used to develop better hearing aids for the deaf.” Western Union turned down the chance to buy Alexander Graham Bell’s telephone patent for a mere $100,000. In hindsight, the future was “obviously not obvious”. In 1949, IBM predicted there would be a market for no more than 10 to 15 computers. In a similar vein, we at Oracle had looked at the Information Superhighway fifteen years ago. Then came the user-friendly Mosaic front-end to the World Wide Web, and suddenly the Internet became one of the biggest forces changing the fundamentals of society and industry. Professor Paul Romer of the Stanford Business School says, “It isn’t so much that we have a new economy, as we have a new understanding of the importance of technology in the economy”.
The Twenty-First Century: Where Are We?
Having stepped into the new millennium, we are in the era of “pervasive computing”, or “real-time computing”. Pervasive computing was predicted in a seminal paper by Mark Weiser (chief technologist at Xerox Palo Alto Research Center) back in 1991. He said, “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” The essence of that vision was the creation of environments saturated with computing and communicating capability, yet gracefully integrated with human users. Look around today and you see young people all over the world using the smartphone as their primary computing device. Automobiles are “chips-on-wheels” and airplanes are “chips-on-wings”.
The confluence of computing power (physics), bandwidth, and smart devices is creating many disruptive forces. Computing power is becoming a utility, much like electricity or telephone service. Cloud infrastructure is making applications and services available at dramatically lower costs. As smart devices proliferate, there is an explosion of data on the order of petabytes and beyond. Suddenly this kind of Big Data is creating new opportunities for analytics and better business decision making.
The future of this technology is very bright as many new innovations are waiting to happen.
Jnan Dash is a Director at Compassites. He is a technology visionary and executive consultant in Silicon Valley, and a well-known expert in the software industry. Prior to joining Oracle in 1992, he spent 16 years at IBM in various positions, including development of the DB2 family of products and being in charge of IBM’s database architecture. Jnan is a frequent speaker at global industry forums on the future of software technology.