When was the First Computer Made?

It is interesting to know that the term ‘computer’ originally referred to a human occupation: people whose job was to perform calculations on a continuous basis, producing navigational tables, tide charts, and planetary positions for astronomical needs.
Human error, boredom, and the comparatively slow pace of manual work led inventors to mechanize the process and eventually build machines with brains – computers! The need for fast, accurate calculation drove the invention of tools such as Napier’s bones for logarithms, the slide rule, the calculating clock, Pascal’s Pascaline, and the Jacquard loom with its punched cards, which can be considered the forefathers of today’s computers.

Any account of the history and evolution of computers would be incomplete without mention of Charles Babbage, who is considered the ‘Father of Computers’. As far back as 1822, he was designing a steam-driven calculating machine the size of a room, which he called the ‘Difference Engine’.

The Difference Engine
The Difference Engine project, though heavily funded by the British government, never saw the light of day. Yet, in pursuit of a machine for more complex calculations, Babbage went on to design the ‘Analytical Engine’, which had parts analogous to the memory and the central processing unit of today’s systems. The Hollerith desk was later invented in the U.S. to meet the need to tabulate census data – the 1880 census had taken years to process by hand, and Herman Hollerith’s punched-card machines were first used for the 1890 census – combining several of the earlier calculating tools.
Z1 Computer
In the 1930s and 1940s, several attempts were made to build machines that could compute numbers and solve problems, beginning with the Z1 Computer in 1936.
Z Machines
The Z1’s creator, Konrad Zuse, went on to build a series of electromechanical “Z machines,” culminating in the Z3 of 1941 – the first working machine of its kind, featuring binary arithmetic, including floating-point arithmetic, and a measure of programmability. Zuse also started the first computer start-up company, established in 1946.
The ABC, or Atanasoff-Berry Computer, of 1942 and the Harvard Mark I of 1944 also contributed to the evolution of computers as we know them today. But it was not until ENIAC that the public got its first real feel for computers.
ENIAC / Electronic Numerical Integrator And Computer
The Birth of ENIAC
It was during World War II that the need arose to compute artillery firing tables for the United States Army’s Ballistic Research Laboratory, which gave birth to the giant electronic brain – ENIAC. The design and construction of this machine were financed by the United States Army under a contract signed on June 5, 1943; work progressed in secret at the University of Pennsylvania’s Moore School of Electrical Engineering under the name ‘Project PX’.
Announcement of ENIAC
Three years of dedicated work by John Mauchly, J. Presper Eckert, and their team finally bore fruit when ENIAC was announced to the public on February 14, 1946; it was formally accepted by the Army in July 1946. ENIAC was, however, shut down on November 9, 1946 for an upgrade, restarted on July 29, 1947, and remained operational until October 2, 1955. It was designated an IEEE Milestone in 1987.
Properties of ENIAC
ENIAC was massive in physical size compared to a modern PC. It contained approximately 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints; the machine weighed about 30 short tons (27 t) and consumed around 150 kW of power. ENIAC was reliable and programmable, though nowhere near as easy or user-friendly as modern-day computers.
First Ever Desktop Computer
The first desktop-sized computer system designed specifically for personal use was launched in 1974 by Micro Instrumentation and Telemetry Systems (MITS). Called the Altair, it sold for a little under $400 and met with an overwhelming response.
Many entrepreneurial firms sensed the market demand for such computers and began producing and selling their own machines. In 1977, Tandy Corporation (RadioShack) became the first major electronics firm to manufacture and sell personal computers with features like a keyboard and a cathode-ray tube (CRT) display.

Introduction of “Apple” to The World
In 1976, Steve Jobs and Steve Wozniak began working on a homemade microprocessor computer board, the Apple I. In early 1977, the two formed a company called Apple Computer, Inc. and introduced the world to the Apple II, one of the first complete personal computers, with a keyboard and color graphics capability.

History of Microprocessor

The evolution of the microprocessor has been one of the greatest achievements of our civilization. In common usage, the terms ‘CPU’ and ‘microprocessor’ often denote the same device. Like every genuine engineering marvel, the microprocessor evolved through a series of improvements over the 20th century. A brief history of the device, along with its functioning, is described below.

Working of a Processor

☞ The microprocessor is the central processing unit, which coordinates all the functions of a computer. It generates timing signals, and sends and receives data to and from every peripheral used inside or outside the computer.

☞ The commands required to do this are fed into the device as variations in current, which are converted into meaningful instructions using Boolean logic.
☞ Its functions fall into two categories: arithmetic/logical operations and control.

☞ The arithmetic and logic unit (ALU) and the control unit handle these functions respectively. Information is communicated over bundles of wires called buses.

☞ The address bus carries the ‘address’ of the location with which communication is desired, while the data bus carries the data that is being exchanged.
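The points above can be sketched as a toy fetch-decode-execute loop. This is a minimal illustrative sketch, not any real instruction set: the opcodes, the tuple instruction format, and the memory layout are all invented for the example, with a list index standing in for the address bus and the values moved in and out standing in for the data bus.

```python
# Hypothetical mini-CPU: memory index = "address bus", value moved = "data bus".
MEMORY = [0] * 32

# Invented opcodes for this sketch, not a real instruction set.
LOAD, ADD, STORE, HALT = "LOAD", "ADD", "STORE", "HALT"

def run(program, memory):
    acc = 0   # accumulator register inside the ALU
    pc = 0    # program counter, maintained by the control unit
    while True:
        opcode, addr = program[pc]   # fetch: the control unit reads the next instruction
        pc += 1
        if opcode == LOAD:           # data travels from memory[addr] into the CPU
            acc = memory[addr]
        elif opcode == ADD:          # the ALU performs the arithmetic
            acc += memory[addr]
        elif opcode == STORE:        # data travels from the CPU back to memory[addr]
            memory[addr] = acc
        elif opcode == HALT:
            return memory

MEMORY[0], MEMORY[1] = 2, 3
run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], MEMORY)
print(MEMORY[2])  # the sum 2 + 3 has been stored back to memory
```

In a real processor the program and data travel over the physical address and data buses as binary signals; here Python tuples and a list merely stand in for them.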

Types of Microprocessors
◆ CISC (Complex Instruction Set Computers)
◆ RISC (Reduced Instruction Set Computers)
◆ VLIW (Very Long Instruction Word Computers)
◆ Superscalar processors
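The CISC/RISC distinction in the list above can be illustrated with a toy sketch (hypothetical operations, not real ISAs): a CISC-style machine might add two memory operands in one complex instruction, while a RISC-style machine decomposes the same work into simple load, operate, and store steps on registers.

```python
# Hypothetical illustration of instruction-set style, not a real architecture.
memory = {"x": 2, "y": 3, "z": 0}
regs = {}

def cisc_add(dst, a, b):
    # CISC style: one complex memory-to-memory instruction does everything.
    memory[dst] = memory[a] + memory[b]

def risc_add():
    # RISC style: several simple instructions, arithmetic only on registers.
    regs["r1"] = memory["x"]              # LOAD
    regs["r2"] = memory["y"]              # LOAD
    regs["r3"] = regs["r1"] + regs["r2"]  # ADD (register-to-register)
    memory["z"] = regs["r3"]              # STORE

cisc_add("z", "x", "y")
print(memory["z"])  # same result either way; the difference is how the
risc_add()          # work is split across instructions
print(memory["z"])
```

Both paths compute the same sum; the design trade-off is between fewer, more complex instructions (CISC) and more, simpler instructions that are easier to pipeline (RISC).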
Types of Specialized Processors
◆ General Purpose Processor (GPP)
◆ Special Purpose Processor (SPP)
◆ Application-Specific Integrated Circuit (ASIC)
◆ Digital Signal Processor (DSP)
History and Evolution
The First Stage
The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the vacuum tube, the bulky component used in the computers of the early years. Shockley, Brattain, and Bardeen are credited with this invention and were awarded the Nobel Prize for it. Soon, it was found that the function of this large component could easily be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a very crucial achievement and brought about a revolution in the use of computers.

Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. Around the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

The Second Stage
ICs proved beyond doubt that complex functions could be integrated on a single chip with high speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961. Later developments allowed ever more complex functions to be added to a single chip. The stage was set for a single circuit to control all of a computer’s functions.
Finally, Intel Corporation’s Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The Third Stage
The work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff realized that the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality. This led to the design of the Intel 4004, the world’s first microprocessor. Next in line was the 8-bit 8008 microprocessor, developed by Intel in 1972 to perform more complex functions than the 4004.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many.
Earlier, their use was limited to large organizations and universities. With the advent of microprocessors, the use of computers trickled down to the common man.

Further Developments
▪ The next processor in line was Intel’s 8080, with an 8-bit data bus and a 16-bit address bus. It became one of the most popular microprocessors of all time.
▪ Very soon, Motorola developed its own 6800 in competition with Intel’s 8080.
▪ Faggin left Intel and formed his own firm, Zilog, which launched a new microprocessor, the Z80, in 1976 that was far superior to those earlier designs.
▪ Similarly, a breakaway group from Motorola designed the 6502, a derivative of the 6800. Such efforts continued, with various modifications to the base designs.
▪ At first, the use of microprocessors was limited to task-specific operations in company projects, such as in the automobile sector. The ‘personal computer’ was still a distant dream, and microprocessors had yet to come into personal use.
▪ 16-bit microprocessors became a commercial success in the 1980s, one of the first popular ones being Texas Instruments’ TMS9900.
▪ Intel developed the 8086, which still serves as the base model for the latest advancements in the microprocessor family. It was a largely complete processor, integrating all the required features.
▪ Motorola’s 68000 was one of the first microprocessors to make extensive use of microcode in its instruction set. These designs were later extended to 32-bit architectures.
▪ Similarly, many players like Zilog, IBM, and Apple succeeded in getting their own products to market. However, Intel held a commanding position in the market right through the microprocessor era.
▪ The 1990s saw large-scale application of microprocessors in the personal computers of Apple, IBM, and Microsoft-based vendors. The decade witnessed a revolution in the use of computers, which by then had become a household entity.
▪ This growth was complemented by highly sophisticated developments in the commercial use of microprocessors. In 1993, Intel brought out its ‘Pentium’ processor, one of the most popular processor lines to date.
▪ It was followed by a series of excellent processors in the Pentium family, leading into the 21st century, with quad-core technology among the latest in commercial use.
▪ These processors have opened up a whole new world of diverse applications, and supercomputing power has become commonplace owing to this remarkable development in microprocessors.
Certainly, these little chips will go down in history, but they will continue to reign in the future as an ingenious creation of the human mind.