Information Technology
Background
Developed in Asia and widely used during the Middle Ages, the abacus can be considered the origin of modern computing devices. An abacus, composed of strings and beads representing numerical values, can be used for arithmetic.
French philosopher and mathematician Blaise Pascal invented the world's first digital calculator in the 17th century. His machine was based on a system of rotating drums controlled with a ratchet linkage. In honor of his early contributions to computer technology, the programming language Pascal was named after him in the 1970s. The German philosopher and mathematician Gottfried Wilhelm von Leibniz later improved Pascal's design, creating a version similar to a modern handheld calculator. It never became available commercially, however.
The first significant automated data-processing techniques were applied to making fabric patterns, not calculating numbers. French weaver Joseph-Marie Jacquard introduced a punch-card weaving system at the 1801 World's Fair. His system was straightforward: punched cards controlled the pattern applied to the cloth as it was woven. The introduction of these looms, which symbolized the replacement of workers by machines, caused riots.
After proposing in 1822 that it might be possible to compute table entries using a steam-driven machine, Charles Babbage rethought his idea and in 1833 designed the analytical engine, which contained the basic components of the modern computer. This earned him the title of father of the computer. He was aided greatly by Augusta Ada King, Countess of Lovelace, daughter of the poet Lord Byron, who is recognized as the world's first programmer. U.S. inventor and statistician Herman Hollerith put the punched-card system to use for the 1890 census. He discovered that perforated cards could be read electrically by machines; each perforation could stand for a piece of information that the machine could sort and manipulate. Hollerith's firm later became part of the Computing-Tabulating-Recording Company, which was renamed International Business Machines (IBM) in 1924. IBM is still an IT industry leader today, and it remains on the cutting edge of technology. Some of its newest projects focus on blockchain technology, data analytics, artificial intelligence, and other emerging fields.
In the mid-1940s, punched cards were also used on the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania. Its inventors, J. Presper Eckert and John Mauchly, built ENIAC for the U.S. Army as the world's first all-electronic, general-purpose computer; the machine was enormous and relied on more than 18,000 vacuum tubes. In 1949, they introduced the Binary Automatic Computer (BINAC), which used magnetic tape, and they then developed the Universal Automatic Computer (UNIVAC I) for the U.S. Census Bureau. UNIVAC I was the first digital computer to handle both numerical data and alphabetical information quickly and efficiently. In 1954, IBM introduced the 650 EDPM, the first mass-produced commercial computer, which was programmed using symbolic notation.
By the late 1950s, the transistor, invented 10 years earlier, had made the second generation of computers possible. Transistors replaced the bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient.
The integrated circuits of the late 1960s introduced the solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.
The next important advances included large-scale integration and microprocessor chips. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets.
In the mid-1970s, Steve Wozniak and Steve Jobs started Apple out of their garage. Their vision was to bring computers into every home in America and, eventually, the world. Toward that end, they developed a user-friendly computer offered at a reasonable price. User-friendliness was essential, since many people without computer skills would have to adapt to the computer system. Their eventual product, the Macintosh computer, was the first to give on-screen instructions in everyday language and to successfully use a graphical interface. In addition, Apple popularized the mouse, which allows users to point and click on screen icons to enter commands instead of typing them in one by one.
IBM and the manufacturers that copied its designs were quick to enter the personal computer (PC) market once they recognized the tremendous sales potential of the device. The result was a friendly debate among computer users over which is better: Macs or PCs. Regardless of personal preference, the two incompatible systems often led to problems when people tried to share information across formats. Software designers have since developed ways to make file conversions easier and software more interchangeable.
One major trend of the last few decades has been the downsizing of computer systems: large mainframe computers have increasingly been replaced by client-server architecture, or networking. Networks give users greater computing flexibility and access to an ever-growing amount of data.
The second major recent trend has been the rapid growth of the Internet and World Wide Web. Initially developed for the U.S. Department of Defense, the Internet is composed of numerous networks connected to each other around the world. Not surprisingly, this massive network has revolutionized information sharing. It’s used for real-time video conferencing, e-mail services, online research, social networking, e-commerce, online education, entertainment, and many other purposes. The World Wide Web usually refers to the body of information that is available for retrieval online, while the Internet generally refers to the back-end network system plus its various services. In recent years, Internet use on handheld and tablet devices and through wireless networks has transformed people's access to technology. As of September 2020, there were more than 1.8 billion Web sites and more than 4.6 billion Internet users, according to InternetLiveStats.com. Approximately 8 percent of users lived in North America.
Hardware companies are continually striving to make faster and better microprocessors and memory chips. Advances in hardware technology have led directly to advances in software applications. As the developer of Windows, Microsoft has been the leader in the software industry. Windows is a user-friendly, visual-based operating system. (An operating system is the interface between the user, the programs stored on the hardware, and the hardware itself.) The disk operating system (DOS) was one of the earliest operating systems; although it is still used, it requires more computer knowledge than newer systems. The Windows and Mac operating systems allow users to point and click on icons and menus with a mouse to tell the computer what to do, instead of having to type in specific commands by hand, as DOS requires.
Intel and Motorola have been the innovators in microprocessor design, striving for faster and more efficient processors. Such innovations allow computer manufacturers to make smaller, lighter, and quicker computers, laptops, and handheld models. As processors get faster and memory increases, computers can process more sophisticated and complicated software programming.
Two fast-growing trends are cloud computing and mobile computing. Cloud computing allows computer users to store applications and data in the “cloud,” or cyberspace, on the Internet, accessing them only as needed from a compatible tablet, handheld, or notebook computer. The International Data Corporation (IDC), a market research, analysis, and advisory firm, reports that the worldwide public cloud services market reached $233.4 billion in 2019, up from $160 billion in 2018 and $45.7 billion in 2013. The market is projected to grow at a compound annual growth rate of 22.5 percent through 2022. Mobile computing has led to a boom in smartphones and other handheld devices that use Wi-Fi technology to access the Internet, cloud content, and programs from anywhere with a Wi-Fi signal. In 2020, 51.5 percent of the global online population accessed the Internet from their mobile phones, according to Statista, an Internet statistics firm. This percentage is expected to grow to 72.6 percent in 2025, according to a report by the World Advertising Research Center, using data from the mobile trade body GSMA. These trends are key factors driving the evolution of computing devices and the Internet today.
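As a rough illustration of what a 22.5 percent compound annual growth rate implies, the short Python sketch below projects the public cloud services market forward from IDC's 2019 figure cited above. The compounding formula and variable names are illustrative assumptions for this example, not IDC's own model.

```python
# Minimal sketch: projecting a market size with a compound annual growth rate (CAGR).
# Figures from the text: $233.4 billion in 2019, growing at roughly 22.5 percent per year
# through 2022. The standard projection future = base * (1 + rate) ** years is used here
# purely for illustration.

BASE_YEAR = 2019
BASE_VALUE_BILLIONS = 233.4  # worldwide public cloud services market, 2019 (IDC)
CAGR = 0.225                 # 22.5 percent per year

def project(year: int) -> float:
    """Return the projected market size, in billions of dollars, for a given year."""
    years_elapsed = year - BASE_YEAR
    return BASE_VALUE_BILLIONS * (1 + CAGR) ** years_elapsed

for year in range(2019, 2023):
    print(f"{year}: ${project(year):.1f} billion (projected)")
```

Under these assumptions, the projection passes roughly $400 billion by 2022; the actual market, of course, depends on many factors this simple compounding formula ignores.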
Other major IT trends include the growing use of the following technologies:
- Blockchain: a distributed ledger database (similar to a relational database) that maintains a continuously growing list of records that cannot be altered, except after agreement by all parties in the chain.
- Artificial Intelligence: the concept that machines can be programmed to perform functions and tasks in a “smart” manner that mimics human decision-making processes.
- Machine Learning: a method of data analysis that incorporates artificial intelligence to help computers study data, identify patterns or pursue other strategic goals, and make decisions with minimal or no human intervention.
- Quantum Computing: a type of advanced computing in which quantum computers are used to solve challenges of massive size and complexity that cannot be solved by the computing power of traditional computers. “Quantum computers could spur the development of new breakthroughs i