History of Computer Motherboards

A motherboard is the most vital component of a computer, connecting almost all of its peripherals. Without it, a computer would be little more than an empty tin box. Here, we trace the motherboard's evolution through the decades.
Sourabh Gupta
Last Updated: Apr 9, 2018
A motherboard is a complex printed circuit board (PCB) that serves as the central component of many electronic systems, particularly computers. It is alternatively known as a mainboard, system board, or logic board (the term used by Apple). It is a platform that provides the electrical connections through which the other components of a computer communicate, and it also houses the central processing unit (CPU), often referred to as the brain of the computer.
Motherboards are also present in mobile phones, clocks, stopwatches, and similar devices. They hold many of a computer's essential components, such as the microprocessor, main memory, and the microprocessor's supporting chipset, which provides an interface between the CPU and external components.
This vital piece of technology revolutionized the way computer systems were later designed. Earlier versions were less efficient and less dependable. Today's motherboards typically contain the following parts:
  • Expansion card slots.
  • In many computers, a CPU soldered directly to the motherboard.
  • Logic and connectors that support input devices.
  • Power connectors that draw electricity from the computer's power supply to run the expansion cards, memory, CPU, and chipset.
  • An integrated sound card.
  • Slots or sockets that allow one or more microprocessors to be installed.
  • A clock generator, a vital component that produces the system clock signal used to synchronize the various components.
  • Non-volatile memory chips that contain the system's BIOS or firmware.
  • Integrated graphics support with 2D and 3D capabilities.
  • USB controllers that can support about 12 USB ports.
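Many of the identifiers baked into a motherboard's firmware can be read back in software. As a minimal sketch (assuming a Linux system, where the kernel exposes DMI data under `/sys/class/dmi/id/`; the field names chosen here are a small illustrative subset), the following reads the board's vendor, model, and BIOS version:

```python
# Sketch: read motherboard (DMI) identifiers on Linux.
# Assumes /sys/class/dmi/id/ exists; on other systems the
# function simply returns an empty dict.
import os


def read_board_info(base="/sys/class/dmi/id"):
    """Return available motherboard identifiers as a dict."""
    fields = ["board_vendor", "board_name", "board_version", "bios_version"]
    info = {}
    for field in fields:
        path = os.path.join(base, field)
        if os.path.isfile(path):
            try:
                with open(path) as f:
                    info[field] = f.read().strip()
            except PermissionError:
                # Some DMI files are root-readable only; skip them.
                pass
    return info


if __name__ == "__main__":
    for key, value in read_board_info().items():
        print(f"{key}: {value}")
```

On most Linux desktops this prints something like the board vendor and model; on systems without the DMI interface it prints nothing and fails gracefully.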
History of Motherboards
Before the invention of microprocessors, computers were built as mainframes whose components were connected by a backplane with countless slots for connecting wires. In older designs, wires were needed to connect the card connector pins, but these became a thing of the past with the invention of PCBs. The CPU, memory, and other peripherals could then all be housed on a single printed circuit board.
During the late 1980s and 1990s, it became more economical to move an increasing number of peripheral functions onto the PCB. Hence, single integrated circuits (ICs) capable of supporting low-speed peripherals such as serial ports, the mouse, and the keyboard were included on the motherboard. By the late 1990s, motherboards had become multifaceted platforms integrating audio, video, storage, and networking functions. Higher-end features for 3D gaming and graphics cards were also included later.
Micronics, Mylex, AMI, DTK, Orchid Technology, and Elitegroup were among the early pioneers of motherboard manufacturing, a field that companies like Apple and IBM soon took over. These later entrants offered sophisticated, top-grade devices with upgraded features and performance superior to the prevailing products.
Timeline of Various Computer Components
1967: The first floppy disk is created by IBM.
1970: Intel releases the 1103, the first commercially available dynamic random-access memory (DRAM) chip.
1971: Intel releases the first microprocessor, the 4004.
1972: The invention of the compact disc.
1974: The 8080 microprocessor is released by Intel.
1976: Apple Computer is founded by Steve Wozniak and Steve Jobs, and the Apple I is introduced: a computer sold as a bare motherboard, to which the user added a keyboard and display.
1977: The first commercial network, ARCNET, is developed, and the Apple II takes the market by storm as the first personal computer to integrate color graphics.
1980: IBM contracts Microsoft, founded by Bill Gates and Paul Allen, to supply the operating system for its upcoming PC, which becomes DOS. In the same year, Microsoft licenses UNIX and starts to develop a PC version called XENIX.
1987: Elitegroup Computer Systems Co. Ltd. is established in Taiwan and goes on to become the largest supplier of motherboards in the world.
1989: AsusTek, one of Taiwan's top companies, starts manufacturing graphic cards.
1993: First International Computer Inc. becomes the largest motherboard manufacturer in the world.
1997: Intel Corp. plans to extend its dominance in microprocessors by manufacturing motherboards as well.
2000: ATI Technologies Inc. announces new graphics card technology, an advancement in computer graphics.
2007: AsusTek becomes the world's largest maker of computer motherboards.
Since the motherboard's inception, technology has grown in leaps and bounds to meet modern needs with faster, lighter, and more capable systems. This continual reinvention and advancement has been made possible by the geniuses behind each technological breakthrough. Every passing year brings more innovations to market, with manufacturers battling to outdo one another. In time, we may well witness yet another revolutionary change in the face of technology within this lifetime.