Software

software, instructions that tell a computer what to do. Software comprises the entire set of programs, procedures, and routines associated with the operation of a computer system. The term was coined to differentiate these instructions from hardware—i.e., the physical components of a computer system. A set of instructions that directs a computer’s hardware to perform a task is called a program, or software program.

The two main types of software are system software and application software. System software controls a computer’s internal functioning, chiefly through an operating system, and also controls such peripherals as monitors, printers, and storage devices. Application software, by contrast, directs the computer to execute commands given by the user and may be said to include any program that processes data for a user. Application software thus includes word processors, spreadsheets, database management, inventory and payroll programs, and many other “applications.” A third software category is network software, which coordinates communication between the computers linked in a network.

Software is typically stored on an external long-term memory device, such as a hard drive or magnetic diskette. When the program is in use, the computer reads it from the storage device and temporarily places the instructions in random access memory (RAM). The process of storing and then performing the instructions is called “running,” or “executing,” a program. By contrast, software programs and procedures that are permanently stored in a computer’s memory using a read-only (ROM) technology are called firmware, or “hard software.”

multiprocessing, in computing, a mode of operation in which two or more processors in a computer simultaneously process two or more different portions of the same program (set of instructions). Multiprocessing is typically carried out by two or more microprocessors, each of which is in effect a central processing unit (CPU) on a single tiny chip. Supercomputers typically combine millions of such microprocessors to interpret and execute instructions.

The primary advantage of a multiprocessor computer is speed, and thus the ability to manage larger amounts of information. Because each processor in such a system is assigned to perform a specific function, it can perform its task, pass the instruction set on to the next processor, and begin working on a new set of instructions. For example, different processors may be used to manage memory storage, data communications, or arithmetic functions. Or a larger primary processor might use smaller secondary processors to conduct miscellaneous housekeeping duties, such as memory management. Multiprocessor systems first appeared in large computers known as mainframes, before their costs declined enough to warrant inclusion in personal computers (PCs).
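As a rough illustration of dividing one job among several processors (not an example drawn from the text above), the sketch below splits a single summation across worker processes using Python’s standard multiprocessing module; the task and the figures in it are invented for the example.

```python
# A minimal sketch of multiprocessing: one job (summing a range of integers)
# is divided into slices, and each slice is handled by a separate worker
# process, ideally running on its own processor core.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """Sum the integers in [start, stop): one slice of the larger job."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    workers = cpu_count()                        # one worker per available core
    step = n // workers
    slices = [(i * step, (i + 1) * step) for i in range(workers)]
    slices[-1] = (slices[-1][0], n)              # last slice absorbs any remainder

    with Pool(workers) as pool:                  # each core sums its own slice
        total = sum(pool.map(partial_sum, slices))

    print(total == n * (n - 1) // 2)             # True: matches the closed-form sum
```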

Personal computers had long relied on increasing clock speeds, measured in megahertz (MHz) or gigahertz (GHz), which correspond roughly to the number of computations the CPU can perform per second, in order to handle ever more complex tasks. But as gains in clock speed became difficult to sustain, in part because of overheating in the microprocessor circuitry, another approach developed in which specialized processors were used for tasks such as video display. These video processors typically come on modular units known as video cards, or graphics accelerator cards. The best cards, which are needed to play the most graphics-intensive electronic games on personal computers, often cost more than a bargain PC.

It must be noted, however, that simply adding more processors does not guarantee significant gains in computing power; the difficulty of programming for them remains. While programmers and computer programming languages have developed some proficiency in dividing work among multiple processors, parcelling out instructions among processors becomes more difficult the more processors there are. Power consumption also grows quickly with multiple processors, so many computers improve performance by using a multicore architecture, in which multiple processors, or cores, are placed on the same chip.
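The limit described in this paragraph is commonly quantified by Amdahl’s law, which the text does not name: the fraction of a program that must run serially caps the achievable speedup no matter how many processors are added. A small illustrative calculation, with a 10 percent serial fraction chosen arbitrarily:

```python
# Illustrative only: Amdahl's law gives the best-case speedup of a program
# when a fixed fraction of its work cannot be parallelized. The 10% serial
# fraction used below is an arbitrary example value.

def speedup(processors: int, serial_fraction: float) -> float:
    """Best-case speedup with the given processor count and serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

for p in (2, 4, 8, 16, 64):
    print(f"{p:>3} processors -> {speedup(p, 0.10):.1f}x speedup")

# With 10% serial work, even 64 processors yield only about an 8.8x speedup,
# which is why multicore designs also depend on keeping the serial portion small.
```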

Sketchpad, the first interactive computer-graphics program. Sketchpad originated as American engineer Ivan Sutherland’s doctoral thesis project in the early 1960s and was one of the first graphical user interfaces. The program allowed users to visualize and control program functions and became a foundation for computer graphics, computer operating system interfaces, and software applications that are used in many facets of modern technology.

In 1961 Massachusetts Institute of Technology (MIT) graduate student Sutherland developed a primitive application, Sketchpad, that would run on the TX-2, an early transistorized computer at MIT’s Lincoln Laboratory. The TX-2 had twice the memory capacity of the largest commercial machines and impressive programmable capabilities. The computer possessed 320 KB (kilobytes) of memory and powered a 23-cm (9-inch) cathode-ray tube (CRT) display. Sketchpad displayed graphics on the CRT display, and a light pen was used to manipulate the line objects, much like a modern computer mouse. Various computer switches controlled aspects of the graphics such as size and ratio. In 1963 Sutherland published his doctoral thesis, “Sketchpad: A Man-Machine Graphical Communication System.”

Sketchpad’s process for drawing lines and shapes was quite complicated. The system relied heavily on electronics, using pulses exchanged between the photoelectric cell of the light pen and the electron gun of the CRT. The timing of the pulses was used to display a cursor representing the light pen’s position on the screen, thus converting the computer screen into a sketchpad upon which objects could be drawn.

How objects in Sketchpad could be visualized and modelled on a screen became the foundation for modern graphical computing used in advertising, business, entertainment, architecture, and Web design. In 1964 Sutherland collaborated with David Evans at the University of Utah in Salt Lake City to initiate one of the first educational computer graphics labs. Sketchpad also led to the advanced development of other imaging software, such as computer-aided design programs used by engineers.

tablet computer, a computer intermediate in size between a laptop computer and a smartphone. Early tablet computers used either a keyboard or a stylus to input information, but these methods were subsequently displaced by touch screens.

The precursors to the tablet computer were devices such as the Stylator (1957) and the RAND Tablet (1961) that used a stylus for input into a larger computer. In 1968 Alan Kay, a graduate student at the University of Utah, promoted his vision of a small, powerful tablet-style computer that he later called the Dynabook; however, Kay never actually built a Dynabook. The first true tablet computers were Cambridge Research’s Z88 and Linus Technologies’ Write-Top, which were introduced in 1987. The Z88 accepted input through a keyboard that was part of the main tablet unit, while the Write-Top accepted input through a stylus. Weighing 0.9 kg (2 pounds), the Z88 was much more portable than the Write-Top, which weighed 4 kg (9 pounds) because it came with an internal floppy disk drive.

Many other models followed the Z88 and the Write-Top, but tablet computers languished in sales until 2010 when Apple Inc. unveiled the iPad, a touch-screen device with a display that measured 24.6 cm (9.7 inches) diagonally. It was about 1.2 cm (0.5 inches) thick and weighed about 0.7 kg (1.5 pounds). The iPad was operated with the same set of finger gestures that were used on Apple’s iPhone. The touch screen was capable of displaying high-definition video. The iPad also had such applications as iTunes built in and could run all applications that were available for the iPhone. In partnership with several major publishers, Apple developed for the iPad its own e-book application, iBooks, as well as an iBook store accessible through the Internet.

Other tablet computers such as the Samsung Galaxy Tab, the Motorola Xoom, and the HP TouchPad followed on the heels of the iPad. The tablet computer market exploded from a mere 2 million sold worldwide in 2009 to 20 million in 2010. Smaller devices such as the Apple iPad mini and Amazon Kindle Fire also appeared, as well as “phablets,” devices such as the Samsung Galaxy Note that were midway in size between a small tablet and a smartphone. Sales of tablet computers peaked in 2014 with 233 million sold and declined thereafter, the decline being attributed to consumers not replacing tablets as often as smartphones.

Tim Berners-Lee, in full Sir Tim Berners-Lee, (born June 8, 1955, London, England), British computer scientist, generally credited as the inventor of the World Wide Web. In 2004 he was awarded a knighthood by Queen Elizabeth II of the United Kingdom and the inaugural Millennium Technology Prize (€1 million) by the Finnish Technology Award Foundation.

While working as a consultant at CERN (the European Organization for Nuclear Research, near Geneva) in 1980, Berners-Lee developed a program for himself, called Enquire, that could store information in files that contained connections (“links”) both within and among separate files—a technique that became known as hypertext. After leaving CERN, Berners-Lee worked for Image Computer Systems Ltd., located in Ferndown, Dorset, where he designed a variety of computer systems. In 1984 he returned to CERN to work on the design of the laboratory’s computer network, developing procedures that allowed diverse computers to communicate with one another and researchers to control remote machines. In 1989 Berners-Lee drew up a proposal for creating a global hypertext document system that would make use of the Internet. His goal was to provide researchers with the ability to share their results, techniques, and practices without having to exchange e-mails constantly. Instead, researchers would place such information “online,” where their peers could immediately retrieve it anytime, day or night. Berners-Lee wrote the software for the first Web server (the central repository for the files to be shared) and the first Web client, or “browser” (the program to access and display files retrieved from the server), between October 1990 and the summer of 1991. The first “killer application” of the Web at CERN was the laboratory’s telephone directory—a mundane beginning for one of the technological wonders of the computer age.
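The division of labour described above (a server holding files, a client fetching and displaying them) can be sketched in miniature with modern tools. The example below uses only Python’s standard library, and its details (port 8000, serving the current directory on localhost) are invented for illustration; it is not a reconstruction of Berners-Lee’s original server or browser.

```python
# Illustration only: a tiny "Web server" sharing files from the current
# directory, and a "client" (the role a browser plays) fetching a document
# from it over HTTP.
import http.server
import threading
import urllib.request

PORT = 8000  # arbitrary local port chosen for this example

# The server: listens for HTTP requests and serves files from the current directory.
server = http.server.HTTPServer(("localhost", PORT),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client: requests a document and reads the reply, which a real browser
# would then render for the user.
with urllib.request.urlopen(f"http://localhost:{PORT}/") as response:
    print(response.status)          # 200 if the request succeeded
    print(response.read()[:80])     # first bytes of the retrieved document

server.shutdown()
```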

From 1991 to 1993 Berners-Lee evangelized the Web. In 1994 he established the World Wide Web (W3) Consortium at the Massachusetts Institute of Technology’s Laboratory for Computer Science in the United States. The consortium, in consultation with others, oversees the Web and the development of standards. In 1999 Berners-Lee became the first holder of the 3Com Founders chair at the Laboratory for Computer Science. His numerous other honours included the National Academy of Engineering’s prestigious Charles Stark Draper Prize (2007). Berners-Lee was the author, along with Mark Fischetti, of Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (2000).

Adobe Flash, animation software produced by Adobe Systems Incorporated from 2005 to 2020.

The development of Adobe Flash software can be traced back to American software developer Jonathan Gay’s first experiments with writing programs on his Apple II computer in high school during the 1980s. Before long, Gay had written a graphics program for the Apple II using Pascal. Later, he teamed up with a local Macintosh users-group organizer, Charlie Jackson, who started a Macintosh software company called Silicon Beach Software. At Silicon Beach Software, Gay combined animation and digital sound to create the Macintosh electronic game Airborne! Eventually, in his drive to create animation software compatible with both Apple Inc.’s Macintosh and Microsoft Corporation’s Windows, he produced SmartSketch, a program in which users could draw on the computer screen with an electronic pen. SmartSketch became the first product of Gay’s own software company, FutureWave Software, founded in the mid-1990s.

As the Internet grew in popularity, FutureWave added two-dimensional animation features to SmartSketch that let Internet users display graphics and animation over the World Wide Web, and FutureSplash Animator was born. The program’s first success came when Microsoft used the software for its MSN website. Macromedia, Inc., bought the rights to FutureSplash Animator in 1996, creating Macromedia Flash, which became Adobe Flash after Adobe purchased Macromedia in 2005. Adobe Flash allowed users to create animation for use on the Internet, and Adobe’s Flash Player was one of the most widely distributed applications on the Internet in the early 21st century, with many websites using the software.

HTML5, the latest widely available standard of the HTML language used to encode websites, allowed animation to be created without Flash. In 2010 Apple co-founder Steve Jobs wrote “Thoughts on Flash,” in which he explained why Flash was not allowed on Apple’s mobile devices, the iPhone, the iPod, and the iPad. Jobs reproved Adobe for Flash’s poor security, frequent crashing, and voracious battery consumption. Through the 2010s many websites transitioned from Flash to HTML5. Adobe ended support for Flash on December 31, 2020.

Steve Jobs, in full Steven Paul Jobs, (born February 24, 1955, San Francisco, California, U.S.—died October 5, 2011, Palo Alto, California), cofounder of Apple Computer, Inc. (now Apple Inc.), and a charismatic pioneer of the personal computer era.

Founding of Apple

Jobs was raised by adoptive parents in Cupertino, California, located in what is now known as Silicon Valley. Though he was interested in engineering, his youthful passions were wide-ranging. He dropped out of Reed College, in Portland, Oregon, took a job at Atari Corporation as a video game designer in early 1974, and saved enough money for a pilgrimage to India to experience Buddhism.

Back in Silicon Valley in the autumn of 1974, Jobs reconnected with Stephen Wozniak, a former high school friend who was working for the Hewlett-Packard Company. When Wozniak told Jobs of his progress in designing his own computer logic board, Jobs suggested that they go into business together, which they did after Hewlett-Packard formally turned down Wozniak’s design in 1976. The Apple I, as they called the logic board, was built in the Jobs family garage with money they obtained by selling Jobs’s Volkswagen minibus and Wozniak’s programmable calculator.

Jobs was one of the first entrepreneurs to understand that the personal computer would appeal to a broad audience, at least if it did not appear to belong in a junior high school science fair. With Jobs’s encouragement, Wozniak designed an improved model, the Apple II, complete with a keyboard, and they arranged to have a sleek, moulded plastic case manufactured to enclose the unit.

Though Jobs had long, unkempt hair and eschewed business garb, he managed to obtain financing, distribution, and publicity for the company, Apple Computer, incorporated in 1977—the same year that the Apple II was completed. The machine was an immediate success, becoming synonymous with the boom in personal computers. In 1981 the company had a record-setting public stock offering, and in 1983 it made the quickest entrance (to that time) into the Fortune 500 list of America’s top companies. In 1983 the company recruited PepsiCo, Inc., president John Sculley to be its chief executive officer (CEO) and, implicitly, Jobs’s mentor in the fine points of running a large corporation. Jobs had convinced Sculley to accept the position by challenging him: “Do you want to sell sugar water for the rest of your life?” The line was shrewdly effective, but it also revealed Jobs’s own near-messianic belief in the computer revolution.

Insanely great

During that same period, Jobs was heading the most important project in the company’s history. In 1979 he led a small group of Apple engineers to a technology demonstration at the Xerox Corporation’s Palo Alto Research Center (PARC) to see how the graphical user interface could make computers easier to use and more efficient. Soon afterwards, Jobs left the engineering team that was designing Lisa, a business computer, to head a smaller group building a lower-cost computer. Both computers were redesigned to exploit and refine the PARC ideas, but Jobs was explicit in favouring the Macintosh, or Mac, as the new computer became known. Jobs coddled his engineers and referred to them as artists, but his style was uncompromising; at one point he demanded a redesign of an internal circuit board simply because he considered it unattractive. He would later be renowned for his insistence that the Macintosh be not merely great but “insanely great.” In January 1984 Jobs himself introduced the Macintosh in a brilliantly choreographed demonstration that was the centrepiece of an extraordinary publicity campaign. It would later be pointed to as the archetype of “event marketing.”

However, the first Macs were underpowered and expensive, and they had few software applications—all of which resulted in disappointing sales. Apple steadily improved the machine, so that it eventually became the company’s lifeblood as well as the model for all subsequent computer interfaces. But Jobs’s apparent failure to correct the problem quickly led to tensions in the company, and in 1985 Sculley convinced Apple’s board of directors to remove the company’s famous co-founder.

NeXT and Pixar

Jobs quickly started another firm, NeXT Inc., designing powerful workstation computers for the education market. His funding partners included Texan entrepreneur Ross Perot and Canon Inc., a Japanese electronics company. Although the NeXT computer was notable for its engineering design, it was eclipsed by less costly computers from competitors such as Sun Microsystems, Inc. In the early 1990s Jobs focused the company on its innovative software system, NEXTSTEP.

Meanwhile, in 1986 Jobs acquired a controlling interest in Pixar, a computer graphics firm that had been founded as a division of Lucasfilm Ltd., the production company of Hollywood movie director George Lucas. Over the following decade, Jobs built Pixar into a major animation studio that, among other achievements, produced the first full-length feature film to be completely computer-animated, Toy Story, in 1995. Pixar’s public stock offering that year made Jobs, for the first time, a billionaire. He eventually sold the studio to the Disney Company in 2006.

Saving Apple

In late 1996 Apple, saddled by huge financial losses and on the verge of collapse, hired a new chief executive, semiconductor executive Gilbert Amelio. When Amelio learned that the company, following intense and prolonged research efforts, had failed to develop an acceptable replacement for the Macintosh’s ageing operating system (OS), he chose NEXTSTEP, buying Jobs’s company for more than $400 million—and bringing Jobs back to Apple as a consultant. However, Apple’s board of directors soon became disenchanted with Amelio’s inability to turn the company’s finances around and in June 1997 requested Apple’s prodigal co-founder to lead the company once again. Jobs quickly forged an alliance with Apple’s erstwhile foe, the Microsoft Corporation, scrapped Amelio’s Mac-clone agreements, and simplified the company’s product line. He also engineered an award-winning advertising campaign that urged potential customers to “think different” and buy Macintoshes. Just as important was what he did not do: he resisted the temptation to make machines that ran Microsoft’s Windows OS; nor did he, as some urged, spin off Apple as a software-only company. Jobs believed that Apple, as the only major personal computer maker with its own operating system, was in a unique position to innovate.

Innovate he did. In 1998 Jobs introduced the iMac, an egg-shaped, one-piece computer that offered high-speed processing at a relatively modest price and initiated a trend of high-fashion computers. (Subsequent models sported five different bright colours.) By the end of the year, the iMac was the nation’s highest-selling personal computer, and Jobs was able to announce consistent profits for the once-moribund company. The following year, he triumphed once more with the stylish iBook, a laptop computer built with students in mind, and the G4, a desktop computer sufficiently powerful that (so Apple boasted) it could not be exported under certain circumstances because it qualified as a supercomputer. Though Apple did not regain the industry dominance it once had, Steve Jobs saved his company, and in the process re-established himself as a master high-technology marketer and visionary.

Reinventing Apple

In 2001 Jobs started reinventing Apple for the 21st century. That was the year that Apple introduced iTunes, a computer program for playing music and for converting music to the compact MP3 digital format commonly used in computers and other digital devices. Later the same year, Apple began selling the iPod, a portable MP3 player, which quickly became the market leader. In 2003 Apple began selling downloadable copies of major record company songs in MP3 format over the Internet. By 2006 more than one billion songs and videos had been sold through Apple’s online iTunes Store. In recognition of the growing shift in the company’s business, Jobs officially changed the name of the company to Apple Inc. on January 9, 2007.

In 2007 Jobs took the company into the telecommunications business with the introduction of the touch-screen iPhone, a mobile telephone with capabilities for playing MP3s and videos and for accessing the Internet. Later that year, Apple introduced the iPod Touch, a portable MP3 and gaming device that included built-in Wi-Fi and an iPhone-like touch screen. Bolstered by the use of the iTunes Store to sell Apple and third-party software, the iPhone and iPod Touch soon boasted more games than any other portable gaming system. Jobs announced in 2008 that future releases of the iPhone and iPod Touch would offer improved game functionality. In an ironic development, Apple, which had not supported game developers in its early years out of fear of its computers not being taken seriously as business machines, was now staking a claim to a greater role in the gaming business to go along with its move into telecommunications.

Health issues

In 2003 Jobs was diagnosed with a rare form of pancreatic cancer. He put off surgery for about nine months while he tried alternative medicine approaches. In 2004 he underwent a major reconstructive surgery known as the Whipple operation. During the procedure, part of the pancreas, a portion of the bile duct, the gallbladder, and the duodenum were removed, after which what was left of the pancreas, the bile duct, and the intestine was reconnected to direct the gastrointestinal secretions back into the stomach. Following a short recovery, Jobs returned to running Apple.

Throughout 2008 Jobs lost significant weight, which produced considerable speculation that his cancer was back. (The average survival rate for patients who underwent Whipple operations was only 20 per cent at five years.) Perhaps more than those of any other large corporation, Apple’s stock market shares were tied to the health of its CEO, which led to demands by investors for full disclosure of his health—especially as the first reasons given for his weight loss seemed insufficient to explain his sickly appearance. On January 9, 2009, Jobs released a statement that he was suffering from a hormonal imbalance for which he was being treated and that he would continue his corporate duties. Less than a week later, however, he announced that he was taking an immediate leave of absence through the end of June in order to recover his health. Having removed himself, at least temporarily, from the corporate structure, Jobs resumed his previous stance that his health was a private matter and refused to disclose any more details.

In June 2009 the Wall Street Journal reported that Jobs had received a liver transplant the previous April. Not disclosed was whether the pancreatic cancer he had been treated for previously had spread to his liver. The operation was performed in Tennessee, where the average waiting period for a liver transplant was 48 days, as opposed to the national average of 306 days. Jobs came back to work on June 29, 2009, fulfilling his pledge to return before the end of June. In January 2011, however, Jobs took another medical leave of absence. In August he resigned as CEO but became chairman. He died two months later.

In 2022 Jobs was posthumously awarded the Presidential Medal of Freedom.
