Human history has been marked by a series of revolutions that helped shape us as modern individuals. Our very concept of civilization is defined by a succession of turning points for mankind. From the Neolithic Revolution, more than 12,000 years ago, through the Industrial Revolution of the 18th century and the Jet Age of the 1950s and 1960s, humanity has evolved enormously.
Today, humanity is going through a new kind of revolution, one that has vastly changed our lifestyles, beliefs, and thoughts. The Digital Revolution, as experts call it, has changed how we behave and how we interact, and has redefined us as modern humans. We think differently, we work differently, and our lives are intrinsically different from those of a century ago.
How can we define the Digital Revolution?
Also known as the Third Industrial Revolution, the Digital Revolution is the ongoing shift from analog, mechanical, and electronic technologies to digital technology. Historians and other scholars place its beginnings as early as the 1940s, though it was only widely recognized as a new era in the 1970s.
This date reflects the moment when the first personal computers became available to the average consumer. The use of computers and digital record keeping has risen steadily since the 1960s and continues to the present day.
Another important aspect of the Digital Revolution is the continuous development of communications technology, chiefly the widespread use of the Internet and of mobile phones and other mobile communication devices.
According to experts, the revolution was made possible, and accelerated, by a dramatic rise in the productivity of the global economy and by the mass production of electronic circuits, semiconductors, and other electronic components.
The Digital Revolution is also summarized in a SlideShare presentation by Whirlpool EMEA (Europe, Middle East, and Africa), shown below.
A timeline of the Digital Revolution
The origins and the early years
The seed of the Digital Revolution is universally accepted to be the invention of the transistor in 1947. Developed by a team of American physicists at Bell Labs, the transistor transformed the field of electronics and paved the way for smaller and cheaper radios, computers, and calculators. This tiny semiconductor device, which switches and amplifies electronic signals and electrical power, is widely regarded as the foundation of modern electronics.
During the 1950s and early 1960s, the United States military operated thousands of computing centers equipped with massive computers. National governments and military organizations around the world quickly embraced the new technology, but its use was largely restricted to military purposes. In 1969, the public got its first glimpse of what would become the Internet, when a short message was sent over ARPANET, a precursor of the modern Internet. Several other computer networks followed in the subsequent years, most notably Mark I, Merit Network, CYCLADES, and Telenet, but none became broadly popular because few people had access to personal computers.
The first personal computers appeared in the early 1970s and quickly found an audience. Although offered at a hefty price, thousands were sold in a matter of days. The relative popularity of the new personal computer helped spark the first video games and the golden age of arcade games. Space Invaders, a breakout hit of 1978, saw huge success and remains popular among vintage game enthusiasts. The continuous development of digital technology replaced the previous analog systems of data collection, archiving, and record keeping. A direct result of the digitization process was the creation of a new job, the data entry clerk, which focused on converting analog data into digital form and was a career opener for thousands of people in the United States alone.
During the 1980s the personal computer was not only present in many homes but also made its way into public schools, libraries, hospitals, government and local organizations, and businesses. It helped increase productivity, created new job opportunities, and built an entire economic sector in developed nations.
Computers were pivotal in the development of ATMs, CGI, industrial robots, electronic music, television, video games, and a plethora of other devices and services. From the mid-1980s, millions of people in the United States began buying home computers, especially after Apple, Commodore, and Tandy launched affordable models. The Commodore 64 is still regarded as the best-selling computer of all time, with more than 17 million units sold between 1982 and 1994.
According to the United States Census Bureau, 8.2 percent of all US households had a home computer in 1984, rising to 22.9 percent among middle- and upper-middle-class families. Just five years later, in 1989, 16 percent of US households had a home computer. The computer was even more prevalent in the business world; many companies were fully dependent on the new digital technology by the late 1980s.
Communication technology grew exponentially in the 1980s, especially after Motorola launched the DynaTAC in 1983, considered the first mobile phone commercially available to the public. Although mobile phones were available in the mid-1980s, they used analog technology; the first digital mobile phone network, known as 2G, was launched eight years later, in 1991, in Finland. It initially covered small areas of the country and sparked a surge in mobile phone sales.
The 1980s also saw the beginnings of the Compact Disc (CD). Developed jointly by Sony and Philips, the new digital disc quickly became the preferred way to store music and data. CD players, hi-fi systems, and CD-ROM drives flooded the market, and many households embraced the new technology. The first digital cameras were launched in 1988 in Japan and 1990 in the United States, and by the early 2000s they had eclipsed traditional film photography in popularity. The development of digital ink and paint also changed how animated films were produced: Disney's The Little Mermaid was among the first animated features to use it.
After Tim Berners-Lee invented the World Wide Web in 1989, the 1990s were set to start on the right foot. In 1991 the Web began its rise in popularity, slowly at first because access was largely limited to research organizations and universities. It gained momentum after Marc Andreessen and Eric Bina released the first widely used graphical browser, Mosaic, in 1993, capable of displaying inline images alongside text. Although primitive by today's standards, Mosaic was the basis for the more evolved Netscape Navigator and Internet Explorer. By 1996 websites had become common for businesses, and many organizations relied on the Internet to connect with customers and the public. By 1999 roughly half of US households had Internet access, and almost every country was reachable through the World Wide Web. A mass Internet culture was established by the early 2000s, and a huge variety of Internet products and services became available as a result.
The early 2000s were crucial for the development of mobile communication. Mobile phones evolved spectacularly and were now able to send and receive short text messages (SMS), which became a cultural phenomenon. Phones could also be used to play games, listen to the radio, and take photos. Other devices were popular during the 2000s, such as personal digital assistants (PDAs), which allowed users to connect to the Internet and send and receive e-mail while on the move.
During the 2000s the Digital Revolution reached even poor nations, where people began to acquire personal computers and mobile phones. This was especially true in remote areas of Africa and Asia, where fixed phone lines were rare and mobile phones were more practical. Internet use grew exponentially, surpassing 1 billion users worldwide, while mobile phone subscriptions reached 3 billion, covering almost half of the world's population.
In the last five years, the Digital Revolution has entered a new stage marked by the interconnection of many types of devices. The Internet is now widespread even in poorer nations, with experts estimating that more than 3 billion people can easily get connected; mobile access accounts for more than 40 percent of these connections. Interpersonal communication has changed profoundly, with many users preferring social networking, mobile telephony, and web chat. Mobile devices that connect easily to the Internet, such as tablets, smartphones, and smartwatches, are expected to exceed PC usage.
Cloud computing, a somewhat niche technology in previous years, has gained momentum over the last five years and has been adopted by many businesses and even home users. It allows users to share computing resources and exploit economies of scale, much like an electricity grid. The system focuses on maximizing the effectiveness of shared resources while creating a fast, reliable solution for the individual user: capacity can be allocated as needed, dynamically reallocated among users on demand, and used intelligently. Cloud computing is expected to grow at a rate of 50 percent per annum in the coming years, changing the way we work with computers.
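The economy-of-scale idea behind cloud computing can be illustrated with a small thought experiment. The sketch below is a deliberately simplified toy model, not any real cloud provider's API: a fixed pool of capacity is split among users according to their current demand, instead of each user owning hardware sized for their individual peak. All names and numbers are illustrative assumptions.

```python
# Toy model of the shared-resource principle behind cloud computing:
# one shared pool is reallocated as demand shifts, rather than every
# user provisioning dedicated hardware for their own worst case.

def allocate(pool_capacity, demands):
    """Split a shared pool among users in proportion to current demand."""
    total = sum(demands.values())
    if total <= pool_capacity:
        # The pool can satisfy everyone outright.
        return dict(demands)
    # Oversubscribed: scale every allocation down proportionally.
    factor = pool_capacity / total
    return {user: d * factor for user, d in demands.items()}

# Dedicated hardware must be sized for each user's individual peak;
# a shared pool only needs to cover the *combined* peak, which is
# smaller whenever the users' peaks do not coincide.
peaks = {"alice": 80, "bob": 60, "carol": 70}
dedicated_capacity = sum(peaks.values())  # 210 units of hardware
shared_capacity = 120                     # highest combined load observed

# At a moment when only Alice and Carol are busy, the pool simply
# reallocates; no one needed to own 210 units of idle equipment.
now = allocate(shared_capacity, {"alice": 80, "carol": 30})
```

Because demand is pooled, the hypothetical provider here covers three users' peaks of 210 units with only 120 units of shared capacity, which is the same logic that lets an electricity grid serve many homes without wiring a power plant to each one.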