While many people assume that digital circuits are a relatively new concept, the practice and science of digital circuit design actually dates back to the early 1930s.
The history of digital circuits, and just how far digital circuit design has come in the last century, is fascinating: a relatively simple concept has grown into an element used in virtually every application and industry today.
The Early Stages
The foundations of digital circuit design were laid in the early 1700s by Gottfried Wilhelm Leibniz, who developed the first working binary number system. George Boole built on this in the mid-19th century with his algebra of logic, and in 1886 Charles Sanders Peirce first hypothesized the actual use of a digital circuit.
In a letter, Peirce described how logical operations could be carried out by electrical circuits, what would now be called switching circuits, allowing a machine to be controlled without the need for manual operation.
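To make Peirce's insight concrete, here is a minimal sketch in Python (purely illustrative; Peirce was describing physical switches and relays, not software, and the function names are ours) of how Boolean operations map onto switching circuits:

```python
# A closed switch passes current (True); an open switch blocks it (False).

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (logical AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (logical OR)."""
    return a or b

# A hypothetical lamp circuit: lit when (s1 AND s2) OR s3 conducts.
for s1 in (False, True):
    for s2 in (False, True):
        for s3 in (False, True):
            lamp = parallel(series(s1, s2), s3)
            print(f"s1={s1} s2={s2} s3={s3} -> lamp={'on' if lamp else 'off'}")
```

The point of the sketch is that wiring (series vs. parallel) and logic (AND vs. OR) are interchangeable descriptions of the same circuit, which is exactly the equivalence Peirce anticipated.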
Actual Implementation
The first practical need for digital circuit design arose between the late 1930s and the early 1940s, when an electromechanical computer was designed by a German civil engineer, Konrad Zuse. Known as the Z3, this computer was funded in part by the German government, yet interestingly enough it was also seen as unimportant at the time.
At about the same time in the United States, George Stibitz of Bell Laboratories and Howard Aiken of Harvard began working on large calculator-type devices that used mechanical switches and relays. This set the stage for the later development of circuits that could operate on both live inputs and stored data.
The transistor, invented in 1947, eventually replaced the vacuum tubes and telephone relays used in the computers built on the Z3 design. Later, in 1958, Jack St. Clair Kilby invented the integrated circuit, the central development that made it possible to fit these control devices into smaller and smaller designs.
What is now considered the hallmark of modern digital circuit design, the actual “computer on a chip,” was completed in 1971 by Ted Hoff in his work for Intel. What was once thought of as a major accomplishment is now a very basic design option, but it laid the groundwork for the possibilities of today and the future.