What does ASCII Stand For?

ASCII stands for American Standard Code for Information Interchange. It is a form of character encoding based on the English alphabet. ASCII codes represent text in computers and in communication equipment that handles text.

ASCII characters were developed from telegraphic codes. Work on the ASCII standard began in 1960, and the first edition was published in 1963. The standard was subsequently updated in 1967 and 1986. The committee developing the ASCII character set considered a shift-key mechanism that would have allowed 6-bit representations of character symbols: certain codes would have determined how the codes that followed were interpreted. However, the shift function was discarded from the design and a seven-bit code was adopted instead, which also left room for a parity bit when characters were carried in eight-bit units. Robert Bemer, a computer scientist at IBM, was instrumental in developing several features that were added to ASCII in its revised versions.

The ASCII character set consists of 33 non-printing characters, 94 printable characters, and the space character. The first 32 codes (along with code 127) are reserved for control characters such as the null character, the start-of-text and end-of-text markers, the line feed, the shift-in and shift-out characters, and the device control characters. The remaining codes are allotted to printable character symbols. ASCII thus provides a mapping between digital bit patterns and characters, allowing devices to exchange text with one another.
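To make this mapping concrete, here is a minimal Python sketch (illustrative only; the sample code points are arbitrary) that prints a few ASCII values in binary alongside their characters and notes which of them are control codes:

```python
# Illustrative only: inspect a few ASCII code points and their characters.
SAMPLE_CODES = [0, 10, 32, 65, 97, 126]    # NUL, line feed, space, 'A', 'a', '~'

for code in SAMPLE_CODES:
    char = chr(code)                       # map the code point to its character
    is_control = code < 32 or code == 127  # the 33 non-printing control codes
    label = "control" if is_control else "printable"
    shown = repr(char) if is_control else char
    print(f"{code:>3}  {code:07b}  {label:9}  {shown}")
```

Running it shows, for example, that code 65 maps to the printable letter A, while code 10 is the line feed control character.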

History of Macintosh Computers

Apple Inc., a famous name in the computer industry, is the company that develops and markets personal computers under the Macintosh brand, better known as the Mac. The Macintosh 128K, released on January 24, 1984, was a commercial success. It was the first commercially successful personal computer to ship with a mouse and a graphical user interface. Over the years, Apple Inc. evolved, and today it is a giant in the computer business.

History

Jef Raskin, an American human-computer interface expert and Apple employee, came up with the idea of building an affordable and easy-to-use computer. In 1979, Raskin began putting together a team to bring his idea to reality. He soon assembled a group that included Bill Atkinson, a member of the Lisa team, Burrell Smith, a service technician, and others, and they set to work on Raskin's idea. The first Macintosh board the team developed had 64 KB of RAM, used a Motorola microprocessor, and featured a black-and-white bitmap display.

By the end of 1980, Smith had created a board that ran at a higher speed, carried more RAM, and supported a wider display. Steve Jobs, impressed by this design, began to take an interest in the project, and his ideas heavily influenced the design of the final Macintosh. Jobs resigned from Apple in 1985.

The following years witnessed the development of desktop publishing and of applications such as Macromedia FreeHand, Adobe Photoshop, and Adobe Illustrator, which helped expand the desktop publishing market. It was also during these years that the Mac's shortcomings were exposed to users: it had no hard disk drive and little memory. In 1986, Apple released the Macintosh Plus, which offered notable features such as a parallel SCSI interface, a megabyte of expandable RAM, and support for attaching peripheral devices. The Macintosh Plus was produced until 1990, making it the longest-lived Macintosh.

In 1987, Apple introduced HyperCard and MultiFinder, which brought multitasking to the Macintosh. That year Apple also released the Macintosh II and the Macintosh SE; the SE followed the Snow White design language and supported the Apple Desktop Bus mouse and keyboard.

Claris, a computer software company formed as a spin-off from Apple Computer in 1987, brought the Pro series to market. Its products included MacPaint, MacDraw Pro, and others. By the early 1990s, Claris had become immensely popular. Claris released ClarisWorks, which later came to be known as AppleWorks.

In 1991, Apple released System 7, a 32-bit rewrite of the Macintosh operating system. It soon introduced the Macintosh Quadra 700 and 900, both using the Motorola 68040 processor, and established the Apple Industrial Design Group to shape the design of future products. The year 1991 also saw the launch of the PowerBook range. In the following year, Apple started selling the Performa, its low-end Mac line. In 1994, Apple adopted the RISC PowerPC architecture developed by the alliance of Apple Computer, IBM, and Motorola. The new product line was a huge success.

Apple has always faced fierce competition from Intel and Microsoft. After the return of Steve Jobs, Apple never looked back. It introduced an all-in-one Macintosh, the iMac, which was a great success, and in 1999 it released the iBook, its first consumer laptop. The Mac Mini, launched in 2005, remains the least expensive Mac to date. Mac OS 9 evolved into the Unix-based Mac OS X, which arrived in 2001. The Mac OS remains one of the most popular operating systems to date.

The glorious history of Macintosh computers convinces us of their bright future.

Evolution of Supercomputers

Bigger, faster, stronger and better – man seems to thrive on superlatives, especially in the realm of technology. With mobiles and computers, the trend is toward speedier and smaller. Then there are supercomputers, the brains or "Einsteins" of the computing world: faster, more powerful, and far larger than their everyday counterparts. Supercomputers have a wide range of uses in complex, data-intensive applications. Where did it all begin? To answer that question, read on for a brief history of supercomputers.

The Beginning of the Supercomputer Age – The 1960s

Livermore Advanced Research Computer
In 1960, the UNIVAC LARC (Livermore Advanced Research Computer) was unveiled. It cannot be considered the first supercomputer, since its configuration was not as powerful as expected, but it is regarded as the first attempt at building such a machine. It was built by Remington Rand, and at the time of its debut it was the fastest computer around. Its features included:

1. Two Central Processing Units (CPUs) and one I/O (input/output) processor.
2. A core memory of 8 banks, storing 20,000 words.
3. A memory access time of 8 microseconds and a cycle time of 4 microseconds.

1961 saw the creation of the IBM 7030, or Stretch. In the race to build and sell the first supercomputer, IBM had designs and plans but lost the first contract to the LARC. Fearing that the LARC would emerge as the ultimate winner, IBM promised a very powerful machine and set high expectations that it ultimately could not live up to. The 7030 was compared to an earlier IBM model, the IBM 7090, a mainframe computer released in 1959: its computational speed was projected to be 100 times that of the 7090, but the finished machine was only about 30 times faster.

Its selling price was greatly reduced, few units were sold, and the machine was a major embarrassment for IBM. Yet it contributed greatly to key computing concepts such as multiprogramming, memory interleaving and protection, the 8-bit byte, and instruction pipelining. IBM implemented these concepts in later models, spawning successful machines for business and scientific use. The same concepts live on in modern microprocessors such as the Intel Pentium and the Motorola/IBM PowerPC.

CDC 6600
What marks the beginning of a species' or object's evolution? One success that surpasses the others and becomes the prototype from which future generations are formed. By that measure, the evolution of supercomputers began with the CDC 6600. It was designed in 1964 by Seymour Cray, the man regarded as the creator of supercomputers, for the Control Data Corporation. A few of the 6600's features are listed below:

1. One CPU for arithmetic and logical operations, with simpler peripheral (I/O) processors handling other tasks.
2. Introduced the Reduced Instruction Set Computer (RISC) concept: the main CPU's instruction set was small, the processors could work in parallel, and the clock speed was very fast (10 MHz).
3. Introduced logical address translation.

The 6600 had a performance figure of 1 MFLOPS (10^6 floating-point operations per second), making it roughly 3 times faster than the Stretch, and it reigned supreme as the world's fastest computer until 1969.

Timeline of Supercomputer Evolution

1969 The CDC 7600 was released. With a clock speed of 36.4 MHz and a pipelined scalar architecture, it outperformed the 6600 by roughly 10 times, with a performance figure of 10 MFLOPS.

1972 Seymour Cray left CDC to form his own computing firm, Cray Research.

1974 CDC released the STAR-100, a supercomputer with a vector processor. It had a performance speed of 100 MFLOPS.

1976 The Cray-1 was unveiled: a vector-processor machine with a clock speed of 80 MHz and a performance figure of 160 MFLOPS. It was a 64-bit system with its own operating system, assembler, and FORTRAN compiler.

1982 The Cray X-MP was unveiled. Designed by Steve Chen, it used a shared-memory parallel vector architecture. Its clock speed was 105 MHz, or a 9.5-nanosecond cycle time. This was the first multiprocessor supercomputer.

1985 The Cray-2 was born. It moved beyond MFLOPS into GFLOPS territory (1 GFLOPS = 1,000 MFLOPS), with a performance figure of 1.9 GFLOPS. It had 4 to 8 processors in a completely new design, with pipelining and a high-latency memory.

1990 The Fujitsu Numerical Wind Tunnel was created. It had a vector-parallel architecture and a sustained performance of 100 GFLOPS, with a clock cycle time of 9.5 nanoseconds. It had 166 vector processors, each with a peak speed of 1.7 GFLOPS.

1996 The Hitachi SR2201 used a distributed-memory parallel system to attain a performance of 600 GFLOPS from 2,048 processors.

1997 Intel and Sandia National Laboratories jointly created ASCI Red. This mesh-based machine was designed for massively parallel processing and had 9,298 Pentium II processors. Its performance touched 1.34 TFLOPS, making it the first supercomputer to reach that level, and it remained the fastest of its kind until the year 2000. It was also a very scalable supercomputer, built from processors of the sort found in ordinary home computers.

2004 The Earth Simulator, built by NEC, was designed to simulate the world's climate on land, at sea, and in the atmosphere. It was built from nodes of 8 vector processors each, and its performance factor was 131 TFLOPS.

2005 The first machine in the IBM Blue Gene supercomputer series was the Blue Gene/L, which started out with a peak performance of 280 TFLOPS. There are 4 main Blue Gene projects and 27 supercomputers using the architecture, which employs approximately 60,000 processors.

2008 The IBM Roadrunner is a hybrid supercomputer, with two different processor architectures working in tandem. It runs Red Hat Enterprise Linux and Fedora as its operating systems, and its peak performance is 1.456 PFLOPS.

2010 Tianhe-I was a record breaker in many ways: it was the first Chinese supercomputer to top the TOP500 list. Its performance figure of 2.566 PFLOPS made it the fastest supercomputer until 2011.

2011 The reigning champion among supercomputers is the K computer, a Japanese machine that reaches performance rates of 8.162 PFLOPS. It uses 68,544 8-core processors, and its construction is still being completed.
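To put these units in perspective, the short Python sketch below (using only figures quoted in the entries above) converts a few of the cited performance numbers into raw floating-point operations per second and compares them with the CDC 6600:

```python
# Compare performance figures quoted in this article, converted to FLOP/s.
PREFIX = {"MFLOPS": 1e6, "GFLOPS": 1e9, "TFLOPS": 1e12, "PFLOPS": 1e15}

systems = [
    ("CDC 6600 (1964)",   1.0,   "MFLOPS"),
    ("Cray-1 (1976)",     160.0, "MFLOPS"),
    ("ASCI Red (1997)",   1.34,  "TFLOPS"),
    ("K computer (2011)", 8.162, "PFLOPS"),
]

baseline = 1.0 * PREFIX["MFLOPS"]          # the CDC 6600's 1 MFLOPS
for name, value, unit in systems:
    flops = value * PREFIX[unit]
    print(f"{name:18}  {flops:.3e} FLOP/s  ({flops / baseline:,.0f}x the CDC 6600)")
```

By this measure the K computer is roughly eight billion times faster than the machine that started the evolution.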

It is clear from this march of time that the configuration and strengths of each supercomputer model have fed into a better and more advanced successor. Another point worth noting is that the supercomputer of yesteryear is slower than the desktop of today!

Linux: History and Introduction

Linux history
Linux is one of the most widely used operating systems and is free software that supports open source development. Originally designed for Intel 80386 microprocessors, Linux now runs on a wide variety of computer architectures.

A Brief History

Unix was the third operating system in a line that began with CTSS and continued with MULTICS. A team of programmers led by Prof. Fernando J. Corbato at the MIT Computation Center wrote CTSS, the first operating system to support time-sharing. AT&T joined the MULTICS project but withdrew as the project fell behind its deadlines. Ken Thompson, Dennis Ritchie, and Brian Kernighan at Bell Labs then used ideas from the MULTICS project to develop the first version of Unix.

MINIX was a Unix-like system released by Andrew Tanenbaum. Its source code was made available to users, but there were restrictions on modifying and distributing the software. On August 25, 1991, Linus Torvalds, a second-year computer science student at the University of Helsinki, announced that he was going to write an operating system. Intending to replace MINIX, Torvalds started writing the Linux kernel, and with that announcement a success story began. Linux initially depended on the MINIX user space, but once the kernel was released under the GNU GPL, the GNU developers worked toward integrating Linux with the GNU components.

An Introduction to the Linux Operating System

The Unix-like operating system built around the Linux kernel is known as the Linux operating system. Linus Torvalds started writing the kernel in 1991, and around 250 programmers soon contributed to its code. Richard Stallman, the American software developer who founded the GNU project, created the GNU General Public License, under which Linux is distributed. The utilities and libraries of a Linux system come from the GNU operating system.

By the term 'free software', we mean that Linux can be copied and redistributed, in altered or unaltered form, with few restrictions. Each recipient of the software is entitled to its human-readable source code and to a license granting permission to modify it. In other words, distributing Linux implies distributing a free software license to its recipients. Linux also supports open source development, meaning that all of its underlying source code can be freely used, modified, and distributed.

A Linux distribution is a project that manages a collection of Linux software and the installation of the OS. It packages the system software and application software along with the initial installation and configuration details. There are around 300 different Linux distributions; the most prominent include Red Hat, Fedora, and Mandrake. Fedora Core, a rapidly updated distribution, succeeded the ninth version of Red Hat Linux. Most Linux distributions support a diverse range of programming languages, including Perl, Python, Ruby, and other dynamic languages, as well as a number of Java virtual machines and development kits and C++ compilers.

Linux is a freely available OS based on the Linux kernel and is an inexpensive, effective alternative to other UNIX programs and utilities. Its open source licensing enables any programmer to modify its code. Linux supports a multi-tasking, multi-user environment as well as copy-on-write functionality. The monolithic Linux kernel handles process control, networking, and the file system, and device drivers are integrated into the kernel. The Linux operating system comes with libraries, compilers, text editors, a Unix shell, and a windowing system, and it supports both command-line and graphical user interfaces. It is widely used in servers as well as in desktop computers, supercomputers, video game consoles, and embedded systems. I have always enjoyed working on the Linux platform, have you?
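As a small, Linux-specific illustration of the process-control and copy-on-write behavior mentioned above, the following Python sketch (Unix/Linux only) forks a child process; both processes start from the same memory image, and a page is copied only when one of them writes to it:

```python
# Linux/Unix only: demonstrate process creation with fork().
# After fork(), parent and child share memory pages copy-on-write, so the
# child's modification below never shows up in the parent's copy.
import os

value = "original"

pid = os.fork()
if pid == 0:                       # child process
    value = "changed in child"     # write triggers a private copy of the page
    print(f"child  (pid {os.getpid()}): value = {value}")
    os._exit(0)
else:                              # parent process
    os.waitpid(pid, 0)             # wait for the child to finish
    print(f"parent (pid {os.getpid()}): value = {value}")
```

The parent still prints "original", showing that each process ends up with its own view of memory.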

When was the First Computer Made?

It is interesting to know that the term 'computer' once referred to people whose job was to perform calculations on a continuous basis: navigational tables, tide charts, and planetary positions for astronomical needs.
Human error, boredom, and the comparatively slow pace of manual work led inventors to mechanize the process and eventually come up with machines that had brains – computers! The need for fast, accurate calculation drove the invention of tools such as Napier's bones for logarithms, the slide rule, the calculating clock, Pascal's Pascaline, and the punched-card loom, which can be considered the forefathers of today's computers.

Any discussion of the history and evolution of computers would be incomplete without mention of Charles Babbage, who is considered the 'Father of Computers'. As far back as 1822, he was building a room-sized, steam-driven calculating machine that he called the 'Difference Engine'.

The Difference Engine
The Difference Engine project, though heavily funded by the British government, never saw the light of day. Yet, in pursuit of a better machine for more complex calculations, Babbage conceived the 'Analytical Engine', which had components analogous to the memory and central processing unit of today's systems. The Hollerith desk was later invented in the U.S. after the 1880 census exposed the need for faster tabulation; it combined several of the earlier calculating tools.
Z1 Computer
In the 1930s and 1940s there were attempts to build machines for computing numbers and solving problems, beginning with the Z1 computer in 1936.
Z Machines
Konrad Zuse, the builder of the Z1, continued his line of electromechanical "Z machines" with the Z3 in 1941, the first working machine to feature binary arithmetic, including floating-point arithmetic, and a measure of programmability. Zuse also founded the first computer start-up company, established in 1946.
The ABC
The ABC (Atanasoff-Berry Computer) of 1942 and the Harvard Mark I of 1944 also contributed to the evolution of computers as we know them today. However, it was not until ENIAC that the public got its first real feel of computers.
ENIAC / Electronic Numerical Integrator And Computer
The Birth of ENIAC
It was during World War II that the need to compute artillery firing tables for the United States Army's Ballistic Research Laboratory arose, and this gave birth to the giant electronic brain – ENIAC. The design and construction of the machine were financed by the United States Army, with the contract signed on June 5, 1943; work progressed secretly at the University of Pennsylvania's Moore School of Electrical Engineering under the name 'Project PX'.
ENIAC in Operation
Three years of dedicated work by John Mauchly, J. Presper Eckert, and their team finally bore fruit when ENIAC was announced to the public on February 14, 1946; it was formally accepted by the Army in July 1946. ENIAC was shut down on November 9, 1946 for an upgrade and restarted on July 29, 1947. It remained operational until October 2, 1955, and was named an IEEE Milestone in 1987.
Properties of ENIAC
ENIAC was massive compared with modern PCs. It contained approximately 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints; it weighed about 30 short tons (27 t) and consumed around 150 kW of power. ENIAC was reliable and programmable, though not nearly as easy or user-friendly as modern-day computers.
First Ever Desktop Computer
Altair
The first desktop-sized computer system designed specifically for personal use was launched in 1974 by Micro Instrumentation and Telemetry Systems (MITS). Called the Altair, it sold for a little less than $400 to an overwhelming response.
Many entrepreneurial firms sensed the market demand for such computers and began producing and selling their own machines. In 1977, Tandy Corporation (RadioShack) became the first major electronics firm to manufacture and sell personal computers, with features such as a keyboard and a cathode-ray tube (CRT) display.

Introduction of “Apple” to The World
In 1976, Steve Jobs and Steve Wozniak began working on a homemade microprocessor computer board, the Apple I. In early 1977, the two formed a company called Apple Computer, Inc. and introduced the Apple II, one of the first complete, mass-produced personal computers, with a built-in keyboard and color graphics capability.

History of Microprocessor

The evolution of the microprocessor has been one of the greatest achievements of our civilization. The terms 'CPU' and 'microprocessor' are often used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor evolved through a series of improvements over the 20th century. A brief history of the device, along with how it works, is given below.

Working of a Processor

☞ The microprocessor is the central processing unit of a computer; it coordinates all of the computer's functions. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer.

☞ The commands required to do this are fed into the device as electrical signal variations, which are interpreted as meaningful instructions through Boolean logic circuitry.
☞ Its work falls into two categories: processing (arithmetic and logic) and control.

☞ The arithmetic and logic unit (ALU) and the control unit handle these functions respectively. Information is communicated over bundles of wires called buses.

☞ The address bus carries the ‘address’ of the location with which communication is desired, while the data bus carries the data that is being exchanged.
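As a rough, purely illustrative sketch of these roles (the three-instruction machine below is invented for this example and does not model any real processor), the following Python loop treats a list as memory, uses an index as the value on the 'address bus', and moves words over a simulated 'data bus' while a minimal control unit steps through the program:

```python
# Toy fetch/decode/execute loop: addresses select memory cells, data moves
# between memory and the accumulator, and the ALU performs the arithmetic.
memory = [
    ("LOAD", 6),    # address 0: load the word stored at address 6
    ("ADD", 7),     # address 1: add the word stored at address 7
    ("HALT", None), # address 2: stop
    None, None, None,
    40,             # address 6: data
    2,              # address 7: data
]

accumulator = 0
program_counter = 0                               # value placed on the address bus

while True:
    opcode, operand = memory[program_counter]     # fetch the instruction (data bus)
    program_counter += 1
    if opcode == "LOAD":
        accumulator = memory[operand]             # read the data word at that address
    elif opcode == "ADD":
        accumulator += memory[operand]            # the ALU adds the fetched word
    elif opcode == "HALT":
        break

print("Result in accumulator:", accumulator)      # prints 42
```

Real processors do the same thing in hardware: the control unit sequences the fetch and decode steps, while the buses carry addresses and data between the CPU and memory.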

Types of Microprocessors
◆ CISC (Complex Instruction Set Computers)
◆ RISC (Reduced Instruction Set Computers)
◆ VLIW (Very Long Instruction Word Computers)
◆ Superscalar processors
Types of Specialized Processors
◆ General Purpose Processor (GPP)
◆ Special Purpose Processor (SPP)
◆ Application-Specific Integrated Circuit (ASIC)
◆ Digital Signal Processor (DSP)
History and Evolution
The First Stage
The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the bulky components used in the computers of the early years. Shockley, Brattain, and Bardeen are credited with the invention and were awarded the Nobel Prize for it. Soon, it was found that the function of such a large component could easily be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement and brought about a revolution in the use of computers.

Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At around the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

The Second Stage
ICs proved beyond doubt that complex functions could be integrated on a single chip with high speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961. Later developments allowed ever more complex functions to be placed on a single chip, and the stage was set for a single controlling circuit for all of a computer's functions.
Finally, Intel Corporation's Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The Third Stage
Work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff felt that the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality. This led to the design of the Intel 4004, the world's first microprocessor. Next in line was the 8-bit 8008 microprocessor, developed by Intel in 1972 to perform more complex functions than the 4004 could.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many. Earlier, computer use had been limited to large organizations and universities; with the advent of microprocessors, it trickled down to the common man.

Further Developments
▪ The next processor in line was Intel's 8080, with an 8-bit data bus and a 16-bit address bus. It was among the most popular microprocessors of all time.
▪ Very soon, Motorola developed its own 6800 in competition with Intel's 8080.
▪ Faggin left Intel and formed his own firm, Zilog, which launched the Z80 microprocessor in 1976; it was far superior to the previous two.
▪ Similarly, a group that broke away from Motorola designed the 6502, a derivative of the 6800. Such efforts continued, with various modifications to the base designs.
▪ At this stage, the use of microprocessors was limited to task-specific operations in industrial projects such as the automobile sector. The concept of a 'personal computer' was still a distant dream, and microprocessors had yet to come into personal use.
▪ The 16-bit microprocessors became a commercial success in the 1980s, with the first popular one being the TMS9900 from Texas Instruments.
▪ Intel developed the 8086, whose architecture still serves as the base for the x86 microprocessor family. It was largely a complete processor, integrating all the required features.
▪ Motorola's 68000 was among the first microprocessors to use microcoding in its instruction set. These designs were later developed into 32-bit architectures.
▪ Similarly, many players like Zilog, IBM, and Apple were successful in getting their own products in the market. However, Intel had a commanding position in the market right through the microprocessor era.
▪ The 1990s saw large-scale application of microprocessors in the personal computer applications developed by Apple, IBM, and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become a household entity.
▪ This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its 'Pentium' processor, one of the most popular processor lines to date.
▪ It was followed by a series of excellent processors of the Pentium family, leading into the 21st century. The latest one in commercial use is the Pentium Quad Core technology.
▪ They have opened up a whole new world of diverse applications. Supercomputers have become common, owing to this amazing development in microprocessors.
Certainly, these little chips will go down in history, and they will continue to reign in the future as an ingenious creation of the human mind.