The history of computer software is closely tied to the history of the computer itself; demand for software did not surge until personal computers were introduced. Before that, one of the most important milestones in the industry came in 1954, when IBM shipped the first mass-produced computer, the IBM 650, which was programmed using symbolic notation. In the late 1950s the transistor, invented roughly a decade earlier, made the second generation of computers possible. Transistors replaced bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient. The integrated circuits of the late 1960s introduced solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.
The next important advances included large-scale integration and microprocessing chips. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets. All of this led to the eventual development and introduction of the personal computer in the late 1970s.
Today the software industry comprises many facets: personal computer (PC) applications; operating systems for both stand-alone and networked systems; management tools for networks; enterprise software that enables efficient management of large corporations’ production, sales, and information systems; software applications and operating systems for mainframe computers; customized software for specific industry management; and applications for mobile computing devices. Packaged software is written for mass distribution, rather than for the specific needs of a particular user. Broad categories of packaged applications include operating systems, utilities, applications, and programming languages.
Operating systems control the basic functions of a computer or network. Utilities perform support functions such as backup or virus protection. Programming software is used to develop the sets of instructions that make up all other types of software. The software most familiar to the majority of computer users is application software, which includes the word processing, spreadsheet, and e-mail packages used in business; the games and reference software used at home; and the subject- or skill-based software used in schools.
Although software has always been an important part of computer history, the software market did not truly take off until the advent of the personal computer. IBM debuted its personal computer in 1981, and Osborne released the first portable computer that same year. Apple popularized the graphical user interface (GUI) and the mouse, two innovations that would have an enormous impact on the industry. The GUI made computers much easier for the average person to use: previously, software had been the province of advanced computer users, but the GUI allowed nontechnical people to work with computers easily.
User-friendliness propelled computers to become the devices that have revolutionized the way we do business and, indeed, the way we live. “In 1983,” according to the IEEE Computer Society, “software development exploded with the introduction of the PC. Standard applications included not only spreadsheets and word processors, but graphics packages and communications systems.” Needless to say, the software industry has boomed since then.