A history of computing
- Mark Skilton
- Jul 30, 2007
- 4 min read
Important dates and events
The first computing device was the abacus – a tool for processing data.
Definition of a computer
Collects data
Stores data
Manipulates data
Transmits data
Information processing = data → information → knowledge
Charles Babbage – designed the first programmable computer, the Analytical Engine, in the 1830s (following his earlier Difference Engine)
Components of a computer
Input devices
CPU
Primary storage (RAM)
Secondary storage
Output devices
Form factors have changed over time, e.g. input devices evolved from keyboard to mouse to mobile touchscreens, but in principle the basic definition of the computer has remained unchanged over the last 100 years.
1890 Herman Hollerith developed a mechanical tabulator based on punched cards in order to rapidly tabulate statistics from millions of pieces of data.
His company later combined with others to become International Business Machines (IBM) by the early 1900s.
In this period it would take 7 years to compute a census; by the time the tabulator machine was working it could be done in 6 weeks. A huge leap forward in speed and time reduction.
1940s: ENIAC (Electronic Numerical Integrator And Computer) was the first large-scale, electronic, digital computer capable of being reprogrammed to solve a full range of computing problems.
It could add 5,000 numbers or perform fourteen 10-digit multiplications in a second.
Its development was driven by the military's need to calculate artillery firing tables.
UNIVAC emerged in the 1950s, first as the name of a line of American mainframe computers and later as the name of the company that built them.
The 1940s also saw Bell Labs develop the transistor, which began the industry's drive towards smaller, faster, cheaper.

By the 1970’s the advent of the microprocessor created a rapid explosion in processing speed, size and cost reduction and started to follow the Moore’s law.

In 1977 the Apple II arrived, and Steve Jobs and Steve Wozniak kick-started the personal computer industry.

In 1981 IBM licensed DOS from a small company called Microsoft (Bill Gates).
Until only a few years ago, Windows 95 and 98 still had DOS underpinning them.
The key was that Microsoft licensed its OS to any hardware maker, while Apple did not, locking its OS into its own hardware. The result is that Microsoft is everywhere…
In 1984 the Apple Macintosh came out, using a graphical user interface (GUI) and mouse instead of the then-standard command line interface.
Faster, cheaper, smaller has really been the driver over the last 25 years. You can now buy a PC for under $200.
The Internet started in the 1980s. The first TCP/IP wide area network was operational by January 1, 1983, when the United States' National Science Foundation (NSF) constructed a university network backbone that would later become the NSFNet.
This was followed by the opening of the network to commercial interests in 1985. Important, separate networks that offered gateways into, then later merged with, the NSFNet include Usenet, BITNET and the various commercial and educational networks, such as X.25, Compuserve and JANET. Telenet (later called Sprintnet) was a large privately funded national computer network with free dial-up access in cities throughout the U.S. that had been in operation since the 1970s. This network eventually merged with the others in the 1990s as the TCP/IP protocol became increasingly popular. The ability of TCP/IP to work over these pre-existing communication networks, especially the international X.25 IPSS network, made growth much easier. Use of the term "Internet" to describe a single global TCP/IP network originated around this time.
The network gained a public face in the 1990s. On August 6, 1991, CERN, which straddles the border between France and Switzerland, publicized the new World Wide Web project, two years after British scientist Tim Berners-Lee had begun creating HTML, HTTP and the first few Web pages at CERN.
An early popular web browser was ViolaWWW based upon HyperCard. It was eventually replaced in popularity by the Mosaic web browser. In 1993 the National Center for Supercomputing Applications at the University of Illinois released version 1.0 of Mosaic, and by late 1994 there was growing public interest in the previously academic/technical Internet. By 1996 the word "Internet" was coming into common daily usage, frequently misused to refer to the World Wide Web.
Meanwhile, over the course of the decade, the Internet successfully accommodated the majority of previously existing public computer networks (although some networks, such as FidoNet, have remained separate). During the 1990s it was estimated that the Internet grew by 100% per year, with a brief period of explosive growth in 1996 and 1997. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as to the non-proprietary, open nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network.
As of June 10, 2007, 1.133 billion people used the Internet, according to Internet World Stats.
Factors driving the Internet's growth:
Adoption of the TCP/IP standard for sharing and distributing information, with IP addresses identifying each computer plugging into the network.
Ability to link from one computer to another or from one site to another on the WWW.
Ease of use of the browser and its GUI
Increasing popularity and growth of personal computers and the ease of connecting them to the network: a machine can discover the network itself, and wireless or cellular/mobile modem cards now mean you can go online anywhere.
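A small sketch of the IP addressing mentioned in the list above: every machine on a TCP/IP network gets a 32-bit IPv4 address, conventionally written as four dotted-decimal octets. Python's standard ipaddress module makes the equivalence easy to see (the 192.168.0.1 address is just an example):

```python
import ipaddress

# An IPv4 address is a 32-bit number, conventionally written as four
# dotted-decimal octets (one byte each).
addr = ipaddress.IPv4Address("192.168.0.1")

# The same address as a single 32-bit integer:
# 192*2**24 + 168*2**16 + 0*2**8 + 1
print(int(addr))                           # 3232235521

# And back again from the integer form.
print(ipaddress.IPv4Address(3232235521))   # 192.168.0.1
```

Because every address is just a number in a single global scheme, routers anywhere can forward packets toward it, which is exactly what let TCP/IP stitch so many pre-existing networks together.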
Overall, the trend over the last 100 years has been faster, smaller, cheaper, but it has really taken off in the last 25.
Web 2.0
