The abacus is the earliest known computing device, used to help ancient traders gain an edge over partners still counting by hand. The oldest known complex computing device, the Antikythera mechanism, dates back to around 87 B.C.; this gear-operated contraption was used by the Greeks to calculate astronomical positions and navigate the seas.
Computing took another leap in 1843, when English mathematician Ada Lovelace wrote the first computer algorithm, in collaboration with Charles Babbage, who had designed the Analytical Engine, the first programmable computer.
But the modern computing-machine era began with Alan Turing's conception of the Turing machine and the invention of the transistor by three Bell Labs scientists, which made modern-style computing possible and earned them the 1956 Nobel Prize in Physics.
The first programmable digital computer was built in 1941 by Konrad Zuse. Two decades later, in 1963, Douglas Engelbart invented the computer mouse.
For decades, computing technology was exclusive to the government and the military; later, academic institutions came online, and in 1976 Steve Wozniak built the circuit board for the Apple I, making home computing practical.
On the connectivity side, Tim Berners-Lee created the World Wide Web in 1990, and Marc Andreessen released the Mosaic web browser three years later.
The Windows operating system that many use today was released by Microsoft in 1985 as version 1.0 and has gone through numerous improvements, the most current being Windows 10.
With wearable computers, embeddable chips, smart appliances, and other advances in progress and on the horizon, the journey toward building smarter, faster, and more capable computers is clearly just beginning.