Title | : | The Chip: How Two Americans Invented the Microchip and Launched a Revolution |
Author | : | T.R. Reid |
Rating | : | |
ISBN-10 | : | 0375758283 |
ISBN-13 | : | 9780375758287 |
Language | : | English |
Format Type | : | Paperback |
Number of Pages | : | 336 |
Publication | : | First published January 1, 1984 |
The Chip: How Two Americans Invented the Microchip and Launched a Revolution Reviews
-
Technophobes might as well move on to the next review. I loved this book. It explained in clear, precise language how innumerable barriers were overcome by innovative and insightfully brilliant individuals to create a device that revolutionized our lives. I've always been fascinated by electronics, built my own radios and earned an amateur radio license in 7th grade, just because the subject and theory of how electrons move around to perform useful functions is intriguing. Reid has captured much of that fascination and translated it into a great story.
Before integrated circuits could be produced, the transistor had to be invented. Before that time, switching mechanisms required a vacuum tube to control, amplify, and switch the flow of electrons through a circuit. It was the discovery that some semiconductor materials could be doped to have an excess of positive charges or negative charges that provided the breakthrough. A strip of germanium could be doped at each end with differing charges, leaving a junction in the middle. The junction worked like a turnstile that could control the flow of current when connected to a battery. Variations in current across these junctions, connected in the transistor formation, could rectify (allow current to flow in only one direction) and amplify. That's all that's needed to make a radio (I'm oversimplifying, obviously) and hundreds of other devices. Transistors required vastly less current than vacuum tubes, were almost infinitely stable, were cheap, and gave off little heat.
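The turnstile behavior described above is conventionally modeled by the Shockley ideal-diode equation; here is a minimal Python sketch of that one-way conduction (the saturation current and thermal voltage below are illustrative assumptions, not figures from the book):

import math

# Shockley ideal-diode equation: I = I_s * (exp(V / V_T) - 1)
# I_s (saturation current) and V_T (thermal voltage, ~26 mV at room
# temperature) are assumed values for illustration.
def junction_current(v_bias, i_sat=1e-12, v_thermal=0.026):
    """Current through an idealized P-N junction at a given bias voltage."""
    return i_sat * (math.exp(v_bias / v_thermal) - 1)

# Forward bias: the "turnstile" swings open and current flows freely.
print(f"+0.6 V forward bias: {junction_current(0.6):.2e} A")   # ~1e-2 A
# Reverse bias: only a vanishing leakage current sneaks through.
print(f"-0.6 V reverse bias: {junction_current(-0.6):.2e} A")  # ~-1e-12 A

Run both ways, the asymmetry is the rectification the reviewer mentions: milliamps in one direction, picoamps in the other.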
But transistors required thousands of connections to incoming wires in order to make a useful circuit, and as demands for more complex circuitry arose, the wiring became impossibly complex. This interconnection problem became a huge barrier that could have prevented the effective utilization of the advantages of the transistor.
"You read everything. . . You accumulate all this trivia, and you hope that someday maybe a millionth of it will be useful," remembers Jack Kilby, one of the inventors of the integrated circuit. He also insists that he is not a scientist but an engineer. "A scientist is motivated by knowledge; he basically wants to explain something. An engineer's drive is to solve problems, to make something work. . . . Reid has elegantly interwoven the biographies of Jack Kilby and Robert Noyce. One of the delights of the book was learning how the two inventors thought, how they proceeded, and why they went in the directions they did.
Robert Noyce, co-founder of Intel, had developed a process to make transistors in arrays on a silicon wafer. They cut apart the transistors and then hired "thousands of women with tweezers to pick them up and try to wire them together. It just seemed so stupid." He, too, realized the tyranny of interconnection numbers. What they both came up with was the "Monolithic Idea": the notion that an entire circuit could be designed and produced on a single silicon chip.
Obviously, there is little suspense in the story, but Reid captures and holds our attention. Both men accomplished the same feat at about the same time, approaching it from different directions. Kilby showed how the transistors could be placed on a single wafer and Noyce showed how the chips and circuits could be manufactured. Every transistor radio used the patent Kilby was awarded for his work. In so doing, he turned the future that Orwell had predicted in 1984 on its head. Instead of a monolithic centralization of power in the hands of a few computer elite who controlled all the computing power, "the mass distribution of microelectronics had spawned a massive decentralization of computing power. In the real 1984, millions of ordinary people could match the governmental or corporate computer bit for bit. In the real 1984, the stereotypical computer user had become a Little Brother seated at the keyboard to write his seventh-grade science report."
The social impact was enormous. Slide rules that had been ubiquitous were completely eliminated in just a few years by the handheld calculator, which has become so cheap it is often given away in promotions. The Japanese gained virtual control over the memory chip industry because of the way they handled their work force. Americans had a monopoly until the 1973 recession. American companies typically lay off workers to save money during downturns; the Japanese try to keep their work force employed. This meant that when the demand for chips exploded, Americans did not have the capacity to produce enough to meet it. The Japanese, having trained workers available, met that demand and produced at such volume that prices stayed low enough to inhibit any competition. That, and their emphasis on high quality, gained them 42% of the world market by 1980. The "Anderson Bombshell" report of 1980 (Anderson was a manager at Hewlett-Packard), which showed that Japanese chips were far more reliable than those made in the United States, helped seal their market share.
It took winning the Nobel Prize for Kilby to be recognized in the United States (Japan, a nation that honors its engineers, had awarded Noyce and Kilby numerous accolades over the years). The final irony remains that "our media-soaked society, with its insatiable appetite for important, or at least interesting, personalities, has somehow managed to overlook a pair of genuine national heroes - two Americans who had a good idea that has improved the daily lot of the world." -
From start to finish, "The Chip" was a markedly insightful, thorough, broad-sweeping, and satisfying account of the inception of the microprocessor revolution. The author -- T.R. Reid, a journalist and technical writer -- provides an intriguing account of the seminal steps taken by the technology industry of the mid-20th century that brought humanity from relying on vacuum tubes as the core component of electronics (radios, computers, etc.) to semiconductor "chips" composed of billions of transistors that govern our daily habits and way of life far more than we might realize. The book's primary focus is on the solid-state physics and engineering advancements that led to the solution of the Tyranny of Numbers problem faced by engineers trying to build more complex circuits out of the revolutionary new electronic component, the transistor; two engineers, Jack Kilby and Robert Noyce, astoundingly conceived of a solution to the problem within 6 months of one another, approaching it from two different perspectives and almost simultaneously settling on "The Monolithic Idea".
Throughout, Reid does not hesitate to provide the reader with sufficient descriptions of the underlying technologies at each stage of humanity's technological progression towards microchips as they exist today. Perhaps not for the technology-averse reader, this book is a wonderful chronicle of potentially the most important series of scientific and engineering advancements of the 20th century -- advancements that led to the ubiquity of such devices as the smartphone, the personal computer, and all of the micro-electronics that now permeate the fabric of society in the 21st century.
It all started with the vacuum tube, and De Forest's insight that the "Edison Effect" (i.e., thermionic emission, the transmission of electric current through a vacuum from filament to conductor) could be leveraged such that an alternating current through a filament could induce a direct current in a nearby metal plate. With the insertion of a positively or negatively charged "grid" (or "screen") placed between the filament and the metal plate, De Forest showed that small variances in the charge applied to the grid could produce large variances in the current induced in the metal plate. Thus, the first vacuum tube amplifier was born, laying the foundations for the radio sets and televisions that proliferated through American living rooms during the early and mid 20th century. Perhaps an equally important discovery was the fact that the current induced in the metal plate could be "switched" on or off thousands of times per second; this property of vacuum tubes provided the foundation for building the world's first computers (such as the ENIAC). Though state of the art at the time, the rather crude technology had many drawbacks: the large, power-hungry vacuum tubes had filaments that frequently burned out and generated a large amount of residual heat, making maintenance of contraptions based on the technology quite unwieldy.
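To make the grid's leverage concrete, here is a toy linear model of a triode stage in Python; the gain and bias numbers are assumptions chosen for illustration, not values from the book:

# Toy triode model: plate current = quiescent bias + gain * grid voltage,
# clipped at zero (the tube cannot conduct backwards).
# GAIN_MA_PER_V and BIAS_MA are illustrative assumptions.
GAIN_MA_PER_V = 20.0   # milliamps of plate current per volt on the grid
BIAS_MA = 5.0          # quiescent plate current with the grid at 0 V

def plate_current_ma(grid_volts):
    """Plate current for a given grid voltage in this toy model."""
    return max(0.0, BIAS_MA + GAIN_MA_PER_V * grid_volts)

# A faint 0.1 V wiggle on the grid swings the plate current by 2 mA:
for vg in (-0.2, -0.1, 0.0, 0.1, 0.2):
    print(f"grid {vg:+.1f} V -> plate {plate_current_ma(vg):.1f} mA")

Small swings in, large swings out: that is the amplification De Forest's grid made possible.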
Luckily for the world, Bardeen, Brattain, and Shockley invented the transistor in 1947 at Bell Labs, using a doped semiconductor crystal to provide an electronic switching and amplifying device that improved upon the vacuum tube in virtually every way: it generated orders of magnitude less heat, provided the same current amplification, could switch on and off much more quickly, and was the size of a pencil eraser (orders of magnitude smaller than existing vacuum tubes!). Subsequently, construction of far more complex circuits was enabled, and within a few years of the invention of the transistor, the industry had all but abandoned the vacuum tubes that were so recently the foundation of electronics. It seemed like the industry had found its way out of the technological rut of the time, and for the next 5 years or so it made quick strides in the miniaturization of most existing electronic circuit components. However, as much as the transistor served as the critical component of these smaller, more energy-efficient electronic circuits, there was another problem electronics engineers would come to face just several years after the semiconductor transistor became ubiquitous. Pleasingly, the author goes into sufficient detail regarding the solid-state physics that begot the silicon transistor; I greatly appreciated the explanations of the utility of semiconductors in conjunction with the boron/arsenic doping process that allows for such current switching and electronic signal amplification.
As engineers attempted to build the complex circuits they could conceive, they quickly ran into the seemingly insurmountable problem of manually wiring together the electronic components (transistors, resistors, capacitors, etc.) of their increasingly complex circuit designs. Although the transistor theoretically permitted more complex and reliable circuit construction, the way in which the components had to be tediously wired together (by hand!) brought the industry advancements that fell out of the invention of the transistor to a standstill. In the late 1950s, a solution to the Tyranny of Numbers was the prime directive of many great minds (and electronics companies), and whoever found such a solution would reap the benefits of a several-years head start on the production of circuits with the degree of complexity the industry desired. As it turned out, two engineers, Jack Kilby at Texas Instruments (TI) and Robert Noyce of Fairchild Semiconductor, almost simultaneously happened upon an ingenious solution (in fact, the same solution) to the numerical tyranny that plagued the industry: The Monolithic Idea.
In 1958, both Jack Kilby and Robert Noyce conceived of a solution to the Tyranny of Numbers: if transistors could be made of silicon, then why couldn't other circuit components? Kilby and Noyce both had the idea that resistors, capacitors, and other critical electronic circuit components could be fabricated on semiconductor wafers too. Without going into the solid-state physics that made such an idea possible, the two engineers arrived at the solution from two different perspectives. Kilby concocted his solution in an ad-hoc, piecemeal fashion, imprecisely (but functionally!) building an entire circuit into a single semiconductor slab with "flying wires" coming out at all angles. Conversely, Noyce approached it from the perspective of manufacturing tons of transistors on a single silicon wafer (the silicon transistor "fab" process): if a thin layer of silicon dioxide was laid across the top of the N/P-doped silicon "three-layer cake", one could insert wires through the top layer to connect transistors composed of the underlying three layers to other transistors in the same wafer. Furthermore, instead of using external wires as connections, one could print thin metal connections between the components atop the silicon-dioxide layer as part of the manufacturing process! Though both engineers had the insight to construct other crucial circuit components out of silicon, Kilby beat Noyce to the punch by about 6 months; however, Noyce conceived of the variant of the idea that integrated seamlessly into the existing manufacturing process of silicon wafers and transistors. Thus, the following set of chapters, which cover the resulting legal battle over patent awards between TI and Fairchild, speedy advancements in the production of micro-electronics, the influence such a discovery had on the notion of personal computing, and intriguing stories such as Kilby's invention of the pocket calculator at TI (in the mid 1960s), are all deeply intriguing and captivating.
In the latter part of the book, the author takes the liberty of diverging from the purely technological account of history and begins to frame the microprocessor revolution within the context of broader societal issues. For instance, one of the core reasons why micro-electronics development and production took off as quickly as it did was the US government's interest in putting a man on the moon; the US government poured hundreds of millions of dollars into the technology companies that were leading the charge on silicon chip fabrication processes, and this funding can be considered one of the core reasons such rapid engineering progress was made. Arguably, without the USA's interest in shrinking computers such that one could be launched into orbit with the computational resources (which grew as transistors shrank) to guide such a mission, the pervasiveness of micro-electronics could be decades behind where it is today. Reid also goes on a brief tangent into the international competition of the electronics markets, focusing mostly on the effective production and market tactics of Japanese electronics manufacturers and the engineering difficulties inherent in the production of micro-electronics. Though this topic was not the reason I picked up the book, these chapters covering Japan's repeated successful entry into the international micro-electronics markets (starting with radio and television) and seemingly inevitable dominance captured my attention as well as the rest of the book; I found that Reid's coupling of the aforementioned advancements in microchip engineering and manufacturing with characterizations of the international political and economic dynamics of the time allowed me to conceive of such a technological and cultural revolution more clearly.
In my opinion, "The Chip" is a remarkable account of a series of seminal inventions and advancements in electronics, the star of which is the silicon microchip. With digressions into other foundational concepts that paved the way for the micro-electronics revolution -- such as Shannon's insight into using Boolean logic to construct circuits that execute computations -- Reid presents an eloquent, candid, and entertaining account of the most relevant scientific and engineering discoveries with sufficient technical depth, keeping the reader stimulated and engaged. Complete with comprehensive explanations of the underlying physics, engineering, computational, and manufacturing concepts needed to viscerally grok the magnitude of the creativity of the ground-breaking discoveries that led to micro-electronics' penetration into modern society, Reid characterizes the very life-blood of our contemporary society: there is virtually no aspect of modern society that the microprocessor revolution hasn't significantly altered or become the foundation of since its inception. If there's a book out there that presents the incredible story of the microprocessor revolution of the mid 20th century in such breadth and depth without sacrificing readability, I'd be astounded. -
The Chip is a humanistic look at one of the key inventions of the 20th century: the microchip, which undergirds every digital change to our lives. Thanks to chips, "just put a computer in it" has been a solution to almost every engineering problem, and the cause of a similar number of engineering problems.
In the 1950s, the electronics industry was carrying a blade with no handle. The silicon transistor had opened up vast possibilities by replacing large, power-hungry, and unreliable vacuum tubes. But the new solid state circuits were still built the same way, by wiring together discrete components like resistors, capacitors, and transistors, and the labor cost of hand wiring all these components was stalling future growth. Worse, as the complexity of circuits increased, their reliability went way down, a fatal flaw for aerospace and military applications.
Kilby at Texas Instruments and Noyce at Fairchild Semiconductor hit on the key idea at roughly the same time. If you could lay down resistors, capacitors, and wires inside silicon, you could make a circuit as a monolithic unit. Kilby was first by several months, but Noyce figured out how to get the leads between the chip and the world laid down, which is a very important step. Doing everything in silicon is counter-intuitive; in terms of raw materials it's comparable to building a boxcar out of solid gold, but the advantage of not having to wire together components is incredible. Cue the digital revolution that we know, though from the perspective of decades on, the revolution was slower than we remember. The first few years of production went entirely to the military. The consumer product that blew the world open was the pocket calculator, which came out in 1971, more than a decade after the invention of the chip.
Reid follows the rise of Japanese firms in high tech, as well as the divergent careers of Noyce and Kilby. Noyce went on to become the patriarch of Silicon Valley and a billionaire investor. Kilby kept inventing, though never with the same success. He was finally awarded the Nobel Prize in 2000, but neither of the two are household names despite their impact as inventors.
Reid also makes some odd choices in the technical explanations. There's a lot on Boolean algebra and binary logic, which is key to how chips work, and precisely nothing on photolithography, which is key to how they're made. This is an older book, which is beneficial because there's nothing like interviews with your subjects to get the right feeling across, and Noyce and Kilby are no longer available for interviews. -
A wonderful history of integrated circuits and microprocessors and the men behind the technology.
5 stars -
Can you name the inventors of the microprocessor? I couldn't, in spite of the fact that I have a career that wouldn't even exist without the invention. So because of that, I'm glad I read this book, which focuses on the inventors (Jack Kilby and Robert Noyce fwiw).
However, the book is frustrating in a lot of ways. It is neither a biography of the two inventors nor a technical text, but it sort of attempts to be both. There's a chapter explaining how microprocessors work at a fairly technical level, a chapter that is probably tedious for anyone with a basic understanding of the subject (it was for me) and completely useless for someone who isn't grounded in the concepts. If you really want that, check out Code: The Hidden Language of Computer Hardware and Software. There's a chapter that talks about how Japanese manufacturing was able to supplant US manufacturing in other areas.
As you might sense, this is not a terribly focused book.
Here's a passage I did enjoy quite a bit: "In a sense, this distinction between basic and directed research encompasses the difference between science and engineering. Scientists, on the whole, are driven by the thirst for knowledge; their motivation, as the Nobel laureate Richard Feynman put it, is 'the joy of finding things out.' Engineers, in contrast, are solution-driven. Their joy is making things work."
-
This is the book I was looking for. The semiconductor industry was a black box for me. Now at least I get what is going on. I would have preferred technicalities instead of human drama in some parts of the book though.
-
a careful, deliberate way of thinking.
exuded the easy self-assurance of a jet pilot, Noyce had an unbounded curiosity that led him, at one time or another, to take up hobbies ranging from madrigal singing to flying seaplanes. His doctorate was in physics, and his technical specialty was photolithography, an exotic process for printing circuit boards that required state-of-the-art knowledge of photography, chemistry, and circuit design. Like Jack Kilby, Noyce preferred to direct his powerful intelligence at specific problems that needed solving.
Unlike the quiet, introverted Kilby, who does his best work alone, thinking carefully through a problem, Noyce was an outgoing, loquacious, impulsive inventor who needed somebody to listen to his ideas and point out the ones that couldn’t possibly work. That winter, Noyce’s main sounding board was his friend Gordon Moore, a thoughtful, cautious physical chemist who was another cofounder of Fairchild Semiconductor.
tubes kept burning out in the middle of its computations.
The warmth and the soft glow of the tubes also attracted moths, which would fly through ENIAC’s innards and cause short circuits. Ever since, the process of fixing computer problems has been known as debugging.
The transistor, in contrast, was a breakthrough that ordinary people could use. The transistorized portable radio, introduced just in time for Christmas 1954, almost instantly became the most popular new product in retail history. ($49.95)
There are certain standard components—nouns, verbs, adjectives in a sentence; resistors, capacitors, diodes, and transistors in a circuit—each with its own function. A resistor is a nozzle that restricts the flow of electricity, giving the circuit designer precise control of the current flow at any point. The volume control on a TV set is really a resistance control. Adjusting the volume adjusts a resistor; the nozzle tightens, restricting the flow of current to the speaker and thus reducing the sound level. A capacitor is a sponge that absorbs electrical energy and releases it, gradually or all at once, as needed. A capacitor inside a camera soaks up power from a small battery and then dumps it out in a sudden burst forceful enough to fire the flashbulb. If you have to wait until the indicator light on your camera blinks to tell you that the flash is ready to use, you’re really waiting for the capacitor inside to soak up enough energy to make the thing flash. A diode is a dam that blocks current under some conditions and opens it to let electricity flow when the conditions change. An electric eye is a beam of light focused on a diode. A burglar who steps through the beam blocks the light to the diode, opening the dam to let current flow through to a noisy alarm. A transistor is a faucet. It can turn current flow on and off—and thus send digital signals pouring through the circuitry of a computer—or turn up the flow to amplify the sound coming from a radio.
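That capacitor-as-sponge analogy maps directly onto the standard RC charging curve, V(t) = V_supply * (1 - e^(-t/RC)); here is a short Python sketch, with component values I have assumed to give a flash-like charge time:

import math

# RC charging: the capacitor "sponge" soaks up charge through a resistor.
# All component values here are illustrative assumptions.
V_SUPPLY = 3.0     # volts, a small camera battery
R_OHMS = 500.0     # charging-path resistance
C_FARADS = 120e-6  # flash capacitor, 120 microfarads

def capacitor_voltage(t_seconds):
    """Voltage on the capacitor after charging for t seconds."""
    return V_SUPPLY * (1 - math.exp(-t_seconds / (R_OHMS * C_FARADS)))

# The "flash ready" light is really a threshold test on this curve.
for t in (0.05, 0.1, 0.2, 0.3):
    print(f"after {t:.2f} s: {capacitor_voltage(t):.2f} V")

The wait for the ready light is just the climb up this exponential curve.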
‘the tyranny of numbers.’
On the assembly lines, the women who soldered circuits together—it was almost entirely women’s work, because male hands were considered too big, too clumsy, and too expensive for such intricate and time-consuming tasks—now had to pick up miniature components and minute lengths of wire with tweezers and join them under a magnifying glass with a soldering tool the size of a toothpick.
To enhance reliability, the designers tried redundancy, like a car built with two front axles just in case one should snap in half on the road.
A kid playing Super Zaxxon in the arcade needs to destroy an enemy base; to do it, he pushes the “Fire” button. The machine has to work through dozens of separate yes-or-no steps just to figure out that the button was pushed. At a billion times per second—completing one step of the problem every nanosecond—they become the foundation of a revolution that has swept the world.
The wires in an electric circuit tend to slow things down. The transistors in a computer switch on and off in response to electronic signals. A pulse of electricity moving through a wire reaches the transistor, and the transistor switches on; another pulse comes along, and the transistor switches off. No matter how quickly the transistor itself can switch, it cannot do so until the pulse arrives telling it what to do.
To increase computing speed, it was necessary to reduce the distance the messenger pulses had to travel— that is, to make the circuits smaller. But smaller circuits meant decreased capacity. The result was a paradox.
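The arithmetic behind that paradox is simple to sketch: a pulse in a wire travels at a large fraction of the speed of light (the two-thirds-of-c figure below is a common rule of thumb I am assuming, not a number from the book), so at nanosecond step times the wire length itself becomes the budget:

# Delay for a pulse to cross a wire, assuming it travels at roughly
# two-thirds the speed of light (an assumed rule-of-thumb figure).
SIGNAL_SPEED_M_PER_S = 2e8

def wire_delay_ns(length_m):
    """Nanoseconds for a pulse to traverse a wire of the given length."""
    return length_m / SIGNAL_SPEED_M_PER_S * 1e9

# A metre of chassis wiring eats five nanosecond-scale clock steps;
# a tenth of a millimetre on a chip is practically free.
for length_m in (1.0, 0.01, 0.0001):
    print(f"{length_m * 1000:8.1f} mm -> {wire_delay_ns(length_m):.4f} ns")

Shrinking the circuit shrinks the delay, which is exactly why the pressure to miniaturize never let up.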
Some of the most crucial inventions and discoveries of the modern world have come about through basic research—that is, work that was not directed toward any particular use. Albert Einstein’s picture of the universe, Alexander Fleming’s discovery of penicillin, Niels Bohr’s blueprint of the atomic nucleus, the Watson-Crick “double helix” model of DNA—all these have had enormous practical implications, but they all came out of basic research. There are just as many basic tools of modern life—the electric light, the telephone, vitamin pills, the Internet—that resulted from a clearly focused effort to solve a particular problem. In a sense, this distinction between basic and directed research encompasses the difference between science and engineering. Scientists, on the whole, are driven by the thirst for knowledge; their motivation, as the Nobel laureate Richard Feynman put it, is “the joy of finding things out.” Engineers, in contrast, are solution driven. Their joy is making things work.
“Integrated circuits are the crude oil of the eighties.”
Among the latter is a humorous, or perhaps quasi-humorous, principle sometimes referred to as “the law of the most famous.” Briefly put, this natural law holds that whenever a group of investigators makes an important discovery, the most famous member of the group will get all the credit.
The work at Menlo Park led, fourteen years later, to the experiment known as “the zero hour of modern physics”—the discovery of the electron— and from there, along a more or less straight line, to wireless telegraphy, radio, television, and the first generation of digital computers.
“Well, I’m not a scientist,” the Wizard of Menlo Park said. “I measure everything I do by the size of the silver dollar. If it don’t come up to that standard then I know it’s no good.”
to build a better life for his fellow man—and get rich in the process. It was an archetypal American picture. Edison set out at the age of twelve to make his fortune. He sold snacks on the Detroit–Port Huron train. He started a newspaper called Paul Pry. By his thirty-fifth birthday, Edison was a millionaire, a leader of industry, and probably the best-known man on earth. When he announced early in 1878 that he might try to perfect an electric light, illuminating-gas stocks plummeted on Wall Street.
Struggling to find an efficient filament for his incandescent light, Edison decided to try everything on earth until something worked. He made a filament from dried grass, but that went haywire. He tried aluminum, platinum, tungsten, tree bark, cat’s gut, horse’s hoof, man’s beard, and some 6,000 vegetable growths before finding a solution in a carbonized length of cotton thread.
The mystery of electricity had prompted a number of contradictory hypotheses. Early researchers had postulated that electricity was a fluid (which is why we still talk today of “current” and “flow”).
most fertile era in physics since Isaac Newton’s day
Already scientists had measured the mass of the smallest object in the universe—the hydrogen atom, weighing about .0000000000000000000000017 gram
He was British to the core. In his memoirs he notes with great pride that twenty-seven of his students (including his son) were elected to the Royal Society; as an aside, he mentions that seven of them (including his son) also picked up Nobel Prizes. J.J.'s own Nobel Prize, in 1906, seems to have satisfied him less than the knighthood he received two years later. When he died, at eighty-four, in 1940, he was buried in Westminster Abbey near the grave of Isaac Newton.
Hard-working, highly disciplined, extremely demanding of himself and those around him, Fleming was determined that everything about his lectures should be perfect—he rehearsed with a stopwatch so that every word and gesture would come at the right second
By marking where the returning beam came from, and measuring how long its round trip had taken, the British defenders could tell their fighters where to intercept the enemy.
scientific work, one experimental and one theoretical
The P-N junction works like the turnstile you pass through when you enter the subway or a stadium: you can go through easily in one direction but not the other -
An outstanding book. I want to keep exploring microelectronics. A super interesting field.
-
Okay, Shockley & Co. gave us the magnificent transistor in 1947, but how did we get from there to the general-purpose, spreadsheet-wrangling CPUs we have today? You can think of a transistor as a hose with an electric clamp. The clamp lets water flow through the hose only while electricity goes to the clamp. If you stop sending electricity to the clamp, it pinches shut and the flow of water stops.
With the electric-clamp, we can now build circuits. Imagine a hose with two clamps next to each other, A and B:
----A----B----
If we send current through A but not B, then put water through the hose, we won't get any water out:
====A====B----
But if we send current through A _and_ B, we'll get water out the other side:
====A====B====
That's called an "AND" gate: a signal only passes through if both the 'clamps' (transistors) have power. We can construct other simple gates, like OR gates and NAND gates, the same way. Claude Shannon, one of the modern heroes of computing (later of Bell Labs), wrote a master's thesis showing how we can build general-purpose machines by turning any problem into a combination of Boolean functions. That means we can solve any problem by putting enough transistors together, because we can form those gates with transistors. You can multiply numbers, add them together, and so on, all with transistors.
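Shannon's reduction can be played out in a few lines of Python: take one transistor circuit, the NAND gate, as the only primitive and compose everything else from it (a sketch of the idea; the function names are mine, not the book's):

# Build Boolean logic from a single primitive, the NAND gate, which in
# hardware is just a couple of 'clamps' (transistors) on one hose.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: int, b: int) -> tuple:
    """Add two one-bit numbers: the first rung on the ladder to arithmetic."""
    return xor(a, b), and_(a, b)  # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")

Chain enough of these together and you get adders, multipliers, and eventually a CPU, which is Shannon's point in miniature.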
This worked well for creating computers for a while, but as transistors shrank and we kept wanting them to get faster, a bottleneck crept in: wiring all those transistors together without error became near-impossible past a certain count. That threshold was reached by the late 1950s. The problem was called "The Tyranny of Numbers": circuit engineers couldn't fathom how we'd put more transistors into our machines without error.
However, around this time, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) both independently came up with the idea of the integrated circuit: instead of wiring individual transistors together, fabricate them, along with their interconnections, on a single piece of silicon. This meant no more tiny wiring. It meant much faster transistors, because they could get even smaller. The Tyranny of Numbers was a solved problem. Of course, since two people came up with it around the same time, legal hell ensued. That part is less interesting.
Robert Noyce, Gordon Moore, Andy Grove, and others left Fairchild Semiconductor, where they had worked on integrated circuits, and co-founded Intel. There, they made another magnificent leap into the world of microprocessors. Before this time, each customer wanted their own custom-designed chip. But now chips started becoming available that were general-purpose, like the Intel 8080.
And there you have it: the stage for the Computer Age was set!
Lovely book, especially on the heels of the one on Bell Labs. -
The Chip recounts a fascinating story of two relatively unknown men who changed the course of modern civilization... really. Although working for different companies, many miles apart, they simultaneously came up with the monolithic idea, the basic blueprint for the modern microchip. This concept overcame the last remaining limit on the advancement of processing power, known as the tyranny of numbers.
The book also shows the importance of government support in new industries as the only way to overcome the chicken-and-egg problem. Libertarians, who still don't seem to understand that even necessary investments may not be made by companies if they don't see a quick return, should tell me how much longer it would have taken the microchip to become as common as apple pie, or whether it would have survived at all. -
The definitive account of the micro-processor and how it has shaped almost all aspects of life. I'm sad that I have finished this book. What a read!!!
-
first half contains satisfying history and explanation of the transistor and microchip, with some focus on the personalities and the problem-solving mindset of engineers that distinguishes them from scientists. as the book zooms out in the second half to later developments of the microchip industry it is less interesting, more skim-worthy. it is curious that jack kilby is so well-known in japan, and it does seem like a credit to them, as the celebrity of gates/bezos/musk etc is surely more to do with the dazzle of their wealth than anything of substance. the book was originally published in 1984 but only occasionally shows its age: eg, tvs no longer have cathode ray tubes.
-
Call it 3.5 stars. The first two-thirds of the book, which covers the progression from vacuum tubes to transistors to microchips, is quite interesting. The last third however drifts into subjects like the inner workings of a pocket calculator, the invention of television, and the rise of Japan in the 1970s and 1980s, all of which are only vaguely related to the main topic and which feel very much like padding. There's a better book waiting to be written about this subject.
-
I loved this book! A fascinating story about the dawn of computing in Texas and a sleepy little suburb of San Francisco, and all the amazing stories and characters that eventually made Silicon Valley. T.R. Reid tells an epic story, from vacuum tubes and the tyranny of numbers to silicon wafers and the space program, in an easy, laid-back style, describing it in layman's terms so it never becomes too technical or tedious.
-
First of all, the subjects of this book--Kilby and Noyce--stood the world on its head. And very few people have ever heard of them. Reid did his best to change that. He couldn't, but the failure is ours, not his.
Second, Reid is a marvelous writer with a striking ability to render technical subjects understandable.
It's an old book. But you need to read it. -
This book fascinated me! After my more than 20 years working in the semiconductor industry, T.R. Reid explained in a satisfying way just how these components I've been making actually work! And who knew that I could understand Boolean logic! A little dated at this point, but the historical facets are nonetheless intact. I'd love to read an updated version!
-
Popular science writing at its finest. Accessible, forthright, curious. The author didn’t know anything about this subject before writing it and so has a humility and a wonder at the topics of integrated circuits and transistors that anyone will understand and appreciate. Would recommend to anyone who uses a computer aka everyone.
-
Five stars if you are into technology. Excellent biography of two important men and the beginning of the electronics era. I liked the deep dive into their personalities as well as the review of the industry in general.
It really provides an excellent perspective of what we take for granted today.
Very well written. -
Fantastic book. It did a great job of building up each achievement that led to the chip and beyond. So many good nuggets. While it was packed with information, I found the writing to have a playful, enjoyable style. A lot of paragraphs end with "now the stage was set for…", and the next paragraph begins with "…except it wasn't."