A Short History of Modern Computing, 1945-2022
April 1945 John von Neumann’s “First Draft of a Report on the EDVAC,” often called the founding document of modern computing, defines “the stored program concept.”
December 1945 ENIAC, the first electronic, general-purpose programmable computer, runs hydrogen bomb calculations. It was developed at the Moore School of Electrical Engineering, University of Pennsylvania, under a U.S. Army contract.
September 1947 The Association for Computing Machinery (ACM) is founded.
1947 Statistician John W. Tukey coins the term “bit” to designate a binary digit, a unit of information stored in a computer.
November 1947 At Bell Labs, Walter H. Brattain and John A. Bardeen, under the direction of William B. Shockley, discover the transistor effect, developing and demonstrating a point-contact germanium transistor, later leading to small, low-power electronic devices and eventually low-cost integrated circuits.
1949 The bar code is conceived when 27-year-old Norman Joseph Woodland draws four lines in the sand on a Miami beach. In June 1974, a Universal Product Code (UPC) label was used to ring up purchases at a supermarket for the first time.
March 1951 The U.S. Census Bureau purchases the Univac, developed by the Eckert-Mauchly Computer Corporation (acquired by Remington Rand in 1950), establishing the commercial market for computers in the United States. The Univac’s main advantage was its use of magnetic tape in place of labor-intensive punched-card processing. By 1954, 20 Univac computers had been sold, at around a million dollars each.
June 1955 The National Security Agency signs a contract with Philco for the development of SOLO, the first general-purpose transistorized computer to operate in the U.S.
August 1955 The term “artificial intelligence” is coined in a proposal for a summer workshop submitted by John McCarthy (Dartmouth College), Marvin Minsky (Harvard University), Nathaniel Rochester (IBM), and Claude Shannon (Bell Telephone Laboratories). The workshop, which took place in July and August 1956, is generally considered the official birth date of the new field.
December 1955 Herbert Simon and Allen Newell develop the Logic Theorist, the first artificial intelligence program, which eventually would prove 38 of the first 52 theorems in Whitehead and Russell's Principia Mathematica.
September 1956 IBM announces the 305 RAMAC and the 650 RAMAC (Random Access Memory Accounting) which incorporated the 350 Disk Storage Unit, the first computer storage system based on magnetic disks. It came with fifty 24-inch disks and a total capacity of 5 megabytes, weighed 1 ton, and could be leased for $3,200 per month.
April 1957 IBM introduces the programming language Fortran (from “formula translation”).
1958 An international gathering of computer experts defines Algol (for “algorithmic language”), which later becomes “the seed around which computer science began to crystalize as an academic discipline.”
January 1960 Control Data Corporation delivers the CDC 1604 to the U.S. Navy, establishing the market for supercomputers.
1961 The National Machine Accountants Association (NMAA), established in 1949, is renamed The Data Processing Management Association (DPMA). In 1997, DPMA was renamed the Association of Information Technology Professionals (AITP).
1966 The Artificial Intelligence Center of the Stanford Research Institute (SRI) starts developing SHAKEY, the first mobile intelligent robot.
February 1966 Robert Taylor becomes director of the Information Processing Techniques Office (IPTO) at the U.S. Defense Department’s Advanced Research Projects Agency (ARPA). He proposes to his boss the ARPAnet, a network that would connect the projects ARPA was sponsoring at different universities. At the time, each project had its own specialized terminal, computer system, and unique set of user commands.
December 1968 Douglas Engelbart demonstrates interactive computer programs controlled by a mouse and connected via a live microwave link to a remote computer in a presentation to the Fall Joint Computer Conference. It became known as “the mother of all demos.”
April 1969 Steve Crocker submits RFC 1, the first “Request for Comments,” which became the primary mechanism for the collaborative and open development of the Internet.
1970 IBM’s Edgar F. Codd publishes “A Relational Model of Data for Large Shared Data Banks.” Relational databases will become the dominant approach to data management by the end of the 1980s.
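Codd’s core idea can be illustrated with a minimal sketch, assuming nothing beyond Python’s built-in sqlite3 module: data lives in relations (tables), and a query states declaratively what to retrieve rather than how to navigate pointers between stored records. The table and column names below are invented for illustration, not taken from Codd’s paper.

```python
# A minimal, hypothetical sketch of the relational idea: data stored as
# relations (tables) and retrieved declaratively with a query language,
# here via Python's built-in sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER)")
conn.executemany("INSERT INTO departments VALUES (?, ?)",
                 [(1, "Research"), (2, "Sales")])
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [(1, "Codd", 1), (2, "Bachman", 2)])

# The query states *what* to retrieve (a join on a shared key),
# not *how* to traverse records.
rows = conn.execute(
    "SELECT e.name, d.name FROM employees e JOIN departments d ON e.dept_id = d.id"
).fetchall()
print(rows)  # [('Codd', 'Research'), ('Bachman', 'Sales')]
```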
1971 Bob Thomas at BBN creates the first computer virus, an experimental self-replicating program called Creeper which copied itself to computers connected to the ARPANET and displayed the message "I'm the creeper, catch me if you can!"
1972 DIALOG, the first interactive, online search system, providing access to large text-based databases while allowing iterative refinement of results, is offered commercially.
1973 Charles Bachman, developer of the Integrated Data Store (IDS), delivers his Turing Award lecture “The Programmer as Navigator,” arguing for “a shift from a computer-centered to the database-centered point of view,” a Copernican revolution driven by database management systems.
April 1981 The first successful portable computer, the Osborne 1, is released. It weighed 24.5 pounds, and its $1,795 purchase price included the WordStar word processor and the SuperCalc spreadsheet program.
January 1984 Apple Computer’s Steve Jobs introduces the Macintosh, the first commercially successful desktop personal computer to feature a graphical user interface, a built-in screen, and a mouse. And the Mac said: “Never trust a computer you cannot lift.”
May 1985 Quantum Computer Services, an online services company, is launched, offering Quantum Link, a dedicated online service for Commodore computers. It will later evolve into America Online (AOL), the most popular online service in the early 1990s.
1986 The first driverless car, a Mercedes-Benz van equipped with cameras and sensors, built at Bundeswehr University Munich under the direction of Ernst Dickmanns, drives at up to 55 mph on empty streets.
March 1989 Tim Berners-Lee circulates “Information Management: A Proposal” at CERN, outlining a global hypertext system, which he launched in December 1990 as the World Wide Web.
September 1991 Xerox PARC’s Mark Weiser publishes “The Computer in the 21st Century” in Scientific American, using the terms “ubiquitous computing” and “embodied virtuality” to describe his vision of how “specialized elements of hardware and software, connected by wires, radio waves and infrared, will be so ubiquitous that no one will notice their presence.”
September 1991 22-year-old Linus Torvalds posts Linux, an operating system kernel, online; it later evolved into a family of open-source Unix-like operating systems.
January 1993 Marc Andreessen announces version 0.5 of the NCSA X Mosaic Web browser, which he developed with Eric Bina at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Andreessen will go on to co-found Mosaic Communications (later Netscape Communications), which released the first version of the Netscape Navigator browser in November 1994.
April 1993 CERN declares the Web protocol and code free to all users.
August 1993 Apple launches the Newton, a “personal digital assistant.”
November 1993 The video camera monitoring the Trojan Room coffee pot at the University of Cambridge’s Computer Laboratory is connected to the Web, becoming the first Webcam. What had entertained a few locally connected people becomes a worldwide show, drawing 1 million hits by 1996.
1994 Steve Mann develops a wearable wireless Webcam, considered the first example of lifelogging.
October 1994 HotWired is the first website to sell banner ads in large quantities to a wide range of major corporate advertisers.
May 1995 Sun releases Java, a programming language intended to let programmers “write once, run anywhere.” It was originally developed by James Gosling and others at Sun to allow interactive applications to be downloaded to digital cable television boxes. It became widely popular when Netscape used it to allow Web page designers to add animation, movement, and interactivity to their pages.
October 1995 The Pew Research Center finds that 14% of U.S. adults are now online, most using dial-up modem connections, but only 3% of online users have ever signed on to the World Wide Web. 42% of U.S. adults had never heard of the Internet, and an additional 21% knew only that it had something to do with computers.
December 1995 MIT’s Nicholas Negroponte and Neil Gershenfeld write in “Wearable Computing” in Wired: “For hardware and software to comfortably follow you around, they must merge into softwear… The difference in time between loony ideas and shipped products is shrinking so fast that it's now, oh, about a week.”
November 1996 The digital video disc (DVD) format, an extension of CD technology, is launched in Japan with the first major releases from Warner Home Video arriving a month later. DVD players became the fastest-adopted consumer devices in American history.
1998 The first Google index has 26 million Web pages. It reaches one billion in 2000 and one trillion in 2008.
1998 The first working digital video recorder (DVR) prototype is developed in Stanford University’s Computer Science Department. The consumer digital video recorders ReplayTV and TiVo were launched the next year.
May 1999 VMware delivers its first product, VMware Workstation. Its product line supporting virtual machines on Intel-based servers later played a key role in the proliferation of cloud computing.
August 2000 According to the U.S. Census Bureau, 51% of U.S. households have one or more computers, up from 8.2% in 1984 and 22.8% in 1993. 41.5% of households have access to the Internet, up from 18% in 1997.
October 2001 Apple introduces the iPod, promising “a thousand songs in your pocket.” The first pocket-sized digital music players (e.g., the Diamond Rio) had been introduced three years earlier. 110 million iPods had been sold by 2007.
April 2003 Apple opens its digital media store, the iTunes Store, which two years later becomes the world’s largest music retailer.
2004 Jeffrey Dean and Sanjay Ghemawat publish “MapReduce: Simplified Data Processing on Large Clusters,” describing Google’s programming model for processing and generating big data sets with a parallel, distributed algorithm running on a cluster of computers.
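As a rough illustration of the programming model the paper describes, here is a minimal, single-process word-count sketch, assuming nothing about Google’s actual distributed implementation (the function names and sample data are invented for illustration): the user supplies a map function that emits key/value pairs and a reduce function that merges the values for each key, while the framework handles grouping the pairs and distributing the work across machines.

```python
# Illustrative toy imitation of the MapReduce programming model (word count),
# run in a single process; not Google's distributed implementation.
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # User-supplied "map" function: emit (key, value) pairs.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Framework step: group all emitted values by key before reducing.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # User-supplied "reduce" function: merge all values for one key.
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```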
January 2007 Apple introduces the iPhone and changes its name from “Apple Computer, Inc.” to "Apple Inc."
September 2008 The first smartphone running Google’s Android operating system, the HTC Dream, is announced.
December 2008 Randal E. Bryant, Randy H. Katz, and Edward D. Lazowska publish “Big-Data Computing: Creating Revolutionary Breakthroughs in Commerce, Science and Society,” arguing that “Big-data computing is perhaps the biggest innovation in computing in the last decade.”
2009 Google starts developing, in secret, a driverless car. In 2014, it became the first car to pass a U.S. state self-driving test, in Nevada.
February 2010 Siri, a voice-recognition “personal assistant” application, is released for the iPhone; Apple acquires its developer two months later and builds the technology into the iPhone in 2011. Over the next few years, similar “voice assistants” were released by Amazon (Alexa), Google (Google Assistant), and Microsoft (Cortana).
February 2010 Kenneth Cukier writes in The Economist special report “Data, Data Everywhere”: “…a new kind of professional has emerged, the data scientist, who combines the skills of software programmer, statistician and storyteller/artist to extract the nuggets of gold hidden under mountains of data.”
February 2011 Martin Hilbert and Priscila Lopez publish “The World’s Technological Capacity to Store, Communicate, and Compute Information” in Science. They estimate that in 1986, 99.2% of all storage capacity was analog, but in 2007, 94% of storage capacity was digital, a complete reversal of roles (in 2002, digital information storage surpassed non-digital for the first time).
October 2012 A convolutional neural network (popularly known as “artificial intelligence” in the following years) designed by researchers at the University of Toronto achieves an error rate of only 16% in the ImageNet Large Scale Visual Recognition Challenge, a significant improvement over the 25% error rate achieved by the best entry the year before.
December 2012 Annual e-commerce sales top $1 trillion worldwide for the first time.
March 2016 Google DeepMind's AlphaGo defeats Go champion Lee Sedol.
June 2018 OpenAI publishes "Improving Language Understanding by Generative Pre-Training," introducing the new Generative Pre-trained Transformer (GPT) approach to natural language processing, based on "semi-supervised" learning.
July 2020 OpenAI releases GPT-3. With 175 billion machine-learning parameters, it produces “human-like” text.
December 2021 The automotive industry accounts for around 15% of the global semiconductor market, with up to 3,000 chips in a single car. It is estimated that, because of the chip shortage, automakers worldwide could not produce about 11 million cars they had planned to make, costing the global auto industry about $210 billion in lost revenue in 2021.
April 2022 63% of the world’s total population, or 5 billion people, use the Internet, up by 200 million over the last year. 92.4% of Internet users go online with a mobile phone at least some of the time, and mobile phones account for more than half of the world’s Web traffic.