IBM 360 Computer by Don Deblod is licensed under CC BY 2.0

Here we look at some of the most important events and inventions in the history of computing, from the humble abacus to mainframes, terminal emulation software, the internet and social media. It's a brief and information-packed timeline of computer history.

2400 BC: The abacus was invented by the Babylonians. Not a computer, you say? Well, it may not be a computer in the electronic sense that we recognize today, but the abacus still performed computational functions and was the most advanced method of calculation for thousands of years after its invention.

1642: Blaise Pascal invents the mechanical calculator, which he named the Pascaline. The machines were so delicate that only Pascal himself could repair them, but they set in motion the development of mechanical calculators around the world.

1820: Charles Xavier Thomas de Colmar invents the arithmometer, which would become the first mass-produced mechanical calculator.

1848: George Boole develops Boolean logic, which will one day become the basis of binary computer design.
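Boole's insight maps directly onto binary hardware. As a minimal illustrative sketch (not part of the original timeline), a half adder -- the building block of binary addition in every digital circuit -- can be expressed with just two Boolean operations:

```python
def half_adder(a: bool, b: bool) -> tuple:
    """Add two single bits using only Boolean operations."""
    sum_bit = a != b   # XOR: 1 when exactly one input is 1
    carry = a and b    # AND: 1 only when both inputs are 1
    return int(sum_bit), int(carry)

# 1 + 1 in binary is 10: sum bit 0, carry bit 1
print(half_adder(True, True))   # (0, 1)
print(half_adder(True, False))  # (1, 0)
```

Chaining such gates is how a processor performs arithmetic on numbers of any width, which is why Boole's algebra of true and false became the foundation of computer design.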

1938: Konrad Zuse completes the Z1, the first programmable binary computer. It was mechanical rather than electronic, but its design anticipated the electronic machines that followed.

1944: The Harvard Mark I, designed by Howard Aiken and built and financed by IBM, becomes the second program-controlled machine.

1949: Maurice Wilkes and a team at Cambridge executed the first stored program on the Electronic Delay Storage Automatic Computer (EDSAC). Many consider this the birth of modern computing.

1951: The Whirlwind Computer -- the first computer that operated in real time -- was developed at the Massachusetts Institute of Technology (MIT).

1958: Jack Kilby invents the integrated circuit, otherwise known as the chip -- the technology that made all computers smaller and cheaper to produce.

1964: Mainframes and smaller minicomputers built around the integrated circuit begin to emerge. IBM introduces the System/360 this year, giving corporations a scalable computing solution, while minicomputers allow smaller enterprises to take advantage of computing technology.

1969: ARPANET, a system for decentralizing data in the event of a nuclear attack, is developed by the United States Department of Defense. It would form the basis of the modern internet.

1971: Intel releases the 4004, the first commercially available microprocessor, developed by a team at the company.

1975: Bill Gates and Paul Allen found Microsoft; a year later, Steve Jobs and Steve Wozniak found Apple. The Altair 8800, widely considered the first personal computer, is also introduced this year.

1980: IBM hires Microsoft to develop an operating system for its personal computer. Microsoft delivers MS-DOS, which becomes the main operating system for IBM PCs through the '80s and well into the '90s.

1983: Microsoft announces Windows (first released in 1985), which replaces the typed commands of traditional mainframe terminals and MS-DOS with a graphical interface navigated by mouse.

1990: Tim Berners-Lee and Robert Cailliau propose a hypertext system that would form the basis of the modern Web.

1995: Amazon, eBay and Hotmail are all launched, which will go on to become internet mainstays.

1998: The Google search engine company is founded by Larry Page and Sergey Brin.

2000: Despite predictions that the mainframe would become obsolete in the face of the decentralizing capacities of the internet, the world's biggest companies continue to rely on the technology. Traditional 'dumb' terminals have essentially been replaced by terminal emulation software, which can emulate a wide range of terminal types, such as the IBM 3270.

2003: Web 2.0 platforms allowing user-generated content begin gaining traction.

2004: Facebook is incorporated, which would go on to become the world's largest social media website and change the way human beings interact.

2007: Apple introduces the iPhone, the smartphone that popularized the multi-touch interface.