Title | : | Code: The Hidden Language of Computer Hardware and Software |
Author | : | Charles Petzold |
Rating | : | |
ISBN | : | 9780735611313 |
ISBN-10 | : | 0735611319
Language | : | English |
Format Type | : | Paperback |
Number of Pages | : | 400 pages |
Code: The Hidden Language of Computer Hardware and Software Reviews
-
I'll be honest: I only read this book because it was quoted as a must-read by Joel Spolsky in a Stack Exchange answer about how to go about learning programming and finding out if you want to, or should, be a programmer. I was a little hesitant due to the year of release; being at least some 11 years old, that's a lot of time in the tech world. Ultimately, though, that doesn't matter. I defy any developer/programmer/system builder to read this book and not blitz through it, lapping it up. Yes, if you've done some schooling in computing or computer science, you may be happy with much of the content, but you'll surely find things you've either not thought about before in much depth or that just weren't explained in quite the elegant way that Petzold does. For me, whether it was due to age, experience, or just maturity through both, I found it filled gaps in my memory, and indeed gaps in student course material.

Petzold opens up the world of computing through a concise, linear storytelling format. Starting with a basis in Morse code and Braille, through the telegraph system, barcodes, Boolean logic, and circuits with memory, to von Neumann machines, adding peripherals, I/O devices, and GUI interfaces, we just about catch up to the modern era with talk of HTTP and the World Wide Web, having pretty much built the systems (or simplified versions thereof) we're discussing in the incremental circuit and system diagrams along the way.

Admittedly, there are some rather 'of their time' phrases and facts that raise a smile (low resolutions, high costs for 'small' HD storage sizes, usage of cassette tapes by consumers), but this is all still valid information when taken in the context of the time of writing.

If you are a developer/programmer, you're not going to go into work having had an epiphany of how better to do things, but you may have a new-found respect for what you're doing and the many, many ingenious shoulders you are standing upon.
-
My opinion on this book is really divided: on the one hand I enjoyed some chapters, on the other hand I hardly managed to restrain myself from flipping through others. Basically, this book designs and builds a basic computer by introducing, in each chapter, a concept or a technology used inside computers. It was written from 1987 to 1999; consequently, one shouldn't expect any description of the newest technologies.

It starts really slowly with the first chapters, but then things get more and more complicated. One of the things that bothers me with this book is the difference in complexity between chapters. Some chapters can be easily understood by a junior school or high school student, while some of the later chapters bring back bad memories of electronic circuits from my engineering school years. For example, a whole chapter is dedicated to explaining how to communicate with your neighbour using a flashlight, and another chapter tackles the same issue with light bulbs and electrical wires, whereas all the gates, or all the flip-flops, are dealt with in a single chapter. I admit I have never been either fond of or good at electrokinetics, but I confess I didn't try to understand how all the electronic circuits of these later chapters work. I guess these chapters mostly interest hard-core computer enthusiasts, but don't they already know this stuff?

Besides, a few chapters are a little boring: a whole chapter to describe every opcode of the Intel 8080, come on! Does the decimal system really deserve a whole chapter? In my opinion, decimal and alternative number systems should have been presented in a single chapter instead of two.

Moreover, the huge difference in complexity leads to some contradiction. The binary number system is so well described that a high school student can easily understand it, and binary addition and subtraction are very detailed, but multiplication is done with a simple, inefficient loop. In my opinion, it would have been opportune to present at least a more efficient version based on the binary representation of the multiplicand, as well as to introduce exponentiation by squaring, aka square-and-multiply or binary exponentiation (a sketch follows this review).

Additionally, I think that Charles Petzold tries to explain in too much detail how each part works, so that readers with less technical knowledge can understand, but in the end I guess these readers get lost or confused by so many details anyway, whereas a few technical references are missing. For instance, both the von Neumann and Harvard architectures are described, but I don't recall the terms ever being mentioned.

Nevertheless, I really liked it when the author gives historical anecdotes or references. The chapters I enjoyed the most are the ones where Charles Petzold gives readers some background history to introduce a concept or technology: for instance, Morse's and Braille's codes, Bell's telegraph, the invention of telegraph relays, and the evolution of transistors, chips, and programming languages.

In the end, I find it a bit contradictory that most of this book's interesting chapters are the less technical ones. Moreover, due to the large difference in knowledge required to understand the chapters, I don't think anyone will understand, or find interesting, every chapter.
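For the curious, the kind of routines I have in mind look roughly like this in Python (my own sketch, not anything from the book): shift-and-add multiplication driven by the bits of the multiplier, and square-and-multiply exponentiation.

def binary_multiply(a, b):
    # Shift-and-add: one addition per set bit of the multiplier,
    # instead of one addition per unit of its value.
    result = 0
    while b:
        if b & 1:
            result += a      # this bit of the multiplier is set
        a <<= 1              # shift the multiplicand left
        b >>= 1              # move on to the next multiplier bit
    return result

def binary_power(x, n):
    # Square-and-multiply: square x at each step, multiplying it
    # into the result whenever the corresponding bit of n is set.
    result = 1
    while n:
        if n & 1:
            result *= x
        x *= x
        n >>= 1
    return result

print(binary_multiply(76, 201))   # -> 15276
print(binary_power(3, 13))        # -> 1594323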
-
Raise your hand if you think metaphors and analogies should be used sparingly. I'll raise my hand with you. This book is for us.

After reading this book, I can see behind the pixels on my computer screen; I know what I'm really looking at. So many layers of abstraction are removed by learning about how logic gates can be arranged as processors and RAM, how code is simply a representation of those microscopic switches being flipped, and how pixels are simply a graphical interpretation of the state of particular switches. Moreover, I also have a little bit of an understanding of the historical evolutions these inventions and conventions went through: not just how computers work, but why they work that way and how they came to be.

The book was tougher to grasp than I thought it would be. I do not have an extensive background in electronics or programming. Although it started off easily, it became progressively more complicated, except for the last chapter or two. Of course, this was to be expected, as the book began with the basic building blocks of a computer and built progressively more complicated systems from those initial components. However, the problem wasn't really a result of the subject matter but of the writing style, which seemed to grow terse in later chapters. I was left with the impression that the author felt he was running out of space, which I'm sure he was; it must be difficult to keep a book with such a vast scope to a manageable size and prevent it from turning into a reference manual. I would characterize this book as grueling, but that might be because I was obstinate in making sure I fully understood every detail of every page. There were a few pages that I had to pore over repeatedly until I reached a eureka moment. A few explanatory sentences here and there would have alleviated this, but ultimately drawing my own conclusions was very rewarding. The book seemed to recover from its gradually adopted terseness with an appreciated but sudden reference to the first chapter in the very last sentence. Someone less focused and inclined to skim might find this book to be a bit lighter reading, but it still only took me a few days to read the whole thing.

I was surprised to see that the book did not really cover how transistors work at the electron level, which leaves what I consider to be a major gap in any understanding of how modern computers based on integrated circuits work. The text says that transistors are functionally equivalent to electromechanical relays or vacuum tubes and work similarly, but hardly any more than that. This missing knowledge is something that would have been appreciated and wouldn't have taken up much space. It seems like an especially glaring omission when juxtaposed with the inclusion of a few pages on EBCDIC, an obsolete alternative to ASCII text codes descended from paper punch cards.

Despite these minor gripes, this is a really great book, and I highly recommend it to anyone who has the interest and persistence to get through it. It teaches and ties together many mathematical and electrical concepts, and the payoff for the reader is a new perspective on computing. Despite being first published in 1999, it hardly seems dated at all, probably because it's really a history book, and most of the computing history it covers happened in the 1980s and earlier. All computing history after that is basically just increasingly complex variations on those simpler foundations. A sequel would be welcome.

PS: I think I've discovered a typo in the assembly code program on page 322. It seems to me that there should be an additional AND A,0Fh after the four lines of RRC and before the first CALL NibbleToAscii line. If I'm wrong, would anyone mind explaining why? And if I'm correct, would anyone mind giving me peace of mind by confirming this? Thanks.
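For anyone puzzling over the same passage, here is a rough Python rendition of the logic as I understand it (my own reconstruction; nibble_to_ascii is a made-up stand-in for the book's NibbleToAscii routine). RRC rotates rather than shifts, so the bits pushed out of the bottom wrap around into the top, which is exactly why the extra AND A,0Fh seems necessary:

def nibble_to_ascii(n):
    # Made-up stand-in for NibbleToAscii: maps 0-15 to '0'-'9' / 'A'-'F'.
    return "0123456789ABCDEF"[n]

def rrc(a):
    # Emulates the 8080 RRC instruction on an 8-bit accumulator:
    # rotate right by one, with bit 0 wrapping around into bit 7.
    return ((a >> 1) | ((a & 1) << 7)) & 0xFF

def byte_to_hex(b):
    a = b
    for _ in range(4):
        a = rrc(a)       # four rotates move the high nibble down...
    a &= 0x0F            # ...but the old low nibble has wrapped into the
                         # top four bits, so it must be masked off here
    return nibble_to_ascii(a) + nibble_to_ascii(b & 0x0F)

print(byte_to_hex(0x5B))  # -> "5B"; without the mask, a would be 0xB5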
-
"Electricity is like nothing else in this universe, and we must confront it on its own terms." That sentence, casually buried near the beginning of the book, exemplifies the engineer's muse: a striving to become aware of the inhuman and how it operates, and to find means of creating a socket for human enterprise, something to extend the fallible chassis of our flesh.

The first two-thirds or so of this book follows a double track. One track covers the ways in which meaning may be encoded into messages; the other weaves repetitions of a relatively simple device — the telegraph relay — into machines that marshal electricity into the forms of logic and memory. These two tracks eventually coincide at the device we know as a computer. Though it would be impractical to build a computer from telegraph relays, the machines we use today perform the same tricks with electricity that were possible in the 19th century.

The last third of the book is concerned with the makeup, and successive improvements in implementation, of the devices that embody the marriage of electricity and meaning. For someone like me, accustomed to the elves of the internet bringing me regular helpings of news, porn, and status updates from the virtual smörgåsbord, it was interesting to see how they have been made so much easier to use since the era of assembly code and text terminals.

Regarding electricity, that prime mover of the information age: it has struck me that electricity is the stuff minerals dream with, and we may have subjected an alien order to the vagaries of our desire without being prepared to one day pay the price. We live, all of us, in an era of debt, making allowances for even a future of cities submerged and massive conflicts fostered by drought. When it finally comes time to pay off our mineral deficit, will it be our dreams — that which makes us human — that are ultimately forfeit?
-
Every single person in tech should read this book. Or if you're just interested in tech. Or if you just want a basic appreciation of one of the most important technologies in human history—the computer. This book contains the best, most accessible explanation I've seen of how computers work, from hardware to software. The author manages to cover a huge range of topics—electricity, circuits, relays, binary, logic gates, microprocessors, code, and much more—while doing a remarkable job of gradually building up your mental model using lots of analogies, diagrams, and examples, so just about everyone should be able to understand the majority of the book and gain a deep appreciation of what's really happening every time you use your laptop or smartphone, or read this review online. I wish I'd had this book back in high school and college. I've been coding for 20 years and I still found a vast array of insights in the book. Some of the topics I knew already, and this book helped me appreciate them more; others I knew poorly and now understand with better clarity; still others were totally new. A small sampling of the insights:

Current is the number of electrons flowing past a point per second. Voltage is a measure of potential energy. Resistance is how much the substance through which electricity is flowing resists the passage of those electrons. The water-pipes analogy is great: current is similar to the amount of water flowing through a pipe; voltage is similar to the water pressure; resistance is similar to the width of the pipe. I took an E&M physics course in college, and while I learned all the current/voltage/etc. equations, I never got this simple, intuitive understanding of what it actually means.

We use base 10 because we have 10 fingers; a digit, after all, is just a finger (so obvious when you actually take a second to think about it). Had we been born with 8 fingers, like most cartoon characters, we'd probably use base 8 math. Computers use base 2 because building circuitry based on two states—the presence or absence of voltage, on and off, 1 or 0—is much easier than circuitry based on ten states.

The notation we use in math is essential. It's not about looking pretty or not, but actually making the math easier or harder. For example, addition and subtraction are easy in Roman numerals, but multiplication and division are much harder. Arabic numerals make multiplication and division much easier, especially as they introduce a 0. Sometimes in math you switch to different coordinate systems or different geometries to make solving a problem easier, so it's no surprise that programming languages have the same properties: while any language can in theory solve the same problems as any other, in practice some languages make certain problems much easier than others.

This book does a superb job of showing how logic gates (AND, OR, etc.) can be built from simple physical circuits—e.g., from relays, which are much easier to imagine and think about than, for example, transistors—and how easy it is to do math with simple logic gates. I remember learning this back in college, but it still amazes me every time I see it, and with the crystal-clear examples in this book, I found myself smiling when I could picture a simple physical circuit of relays that could do arithmetic just by entering numbers with switches and passing some electricity through the system: e.g., to add, you have a sum and a carry, where the sum is an XOR and the carry is an AND (see the sketch at the end of this review).

The explanation of circuits that can remember (e.g., the memory in your computer) was superb, and something I don't remember learning at all in college (how ironic). I love the idea that circuits with memory (e.g., latches) work based on a feedback mechanism: the output of the circuit is fed back into the same circuit, so if it gets into one state (e.g., on, because electricity is flowing through it), that feedback mechanism keeps it in that state (e.g., by continuing the flow of electricity through it), effectively remembering the value. And all of this is possible because it takes a finite amount of time for electricity to travel through a circuit and for that circuit to switch state.

The opcodes in a CPU consist of an operation to perform (e.g., load) and an address. You can write assembly code to express the opcodes, but each assembly instruction is just a human-friendly way to represent an exactly equivalent binary string (e.g., 32 or 64 binary digits in modern CPUs). You can enter these opcodes manually (e.g., via switches on a board that control on and off), and each instruction becomes a high or low voltage. These high and low voltages pass through the physical circuitry of the CPU, which consists of logic gates. Based purely on the layout of these logic gates, voltage comes out the other end, triggering new actions: they may result in low and high voltages in a memory chip that then remembers the information (store) or returns information that was previously remembered (load); they may result in low and high voltages being passed to a video adapter that, based on the layout of its own logic gates, results in an image being drawn on a screen; or they may result in low and high voltages being fed back into the CPU itself, resulting in it reading another opcode (e.g., perhaps from ROM or a hard drive rather than physical switches) and repeating the whole process again. This is my lame attempt at describing, end to end, how software affects hardware and results in something happening in the real world, solely based on the physical layout of a bunch of circuits with electricity passing through them. I think there is something magical about the fact that the shape of an object is what makes it possible to send emails, watch movies, listen to music, and browse the Internet. But then again, the shape of DNA molecules, plus the laws of physics, is what makes all of life possible too. And of course, you can't help but wonder what sort of opcodes and logic gates are used in your brain, as your very consciousness consists entirely of electricity passing through the physical shape of your neurons and the connections between them.

There are a few places where the book seems to go into a little too much detail—e.g., going over all the opcodes of a specific Intel CPU—and a few places where it seems to skip over all the important details—e.g., the final chapter on modern software and the web—but overall, I have not found another book anywhere that provides as complete a picture of how a computer works. Given the ubiquity of computers today, I'd recommend this book to just about everyone. It'll make you appreciate just how simple computers really are—and how that simplicity can be used to create something truly magical.

As always, I've saved a few of my favorite quotes from the book:

"A computer processor does moronically simple things—it moves a byte from memory to register, adds a byte to another byte, moves the result back to memory. The only reason anything substantial gets completed is that these operations occur very quickly. To quote Robert Noyce, 'After you become reconciled to the nanosecond, computer operations are conceptually fairly simple.'"

"The first person to write the first assembler had to hand-assemble the program, of course. A person who writes a new (perhaps improved) assembler for the same computer can write it in assembly language and then use the first assembler to assemble it. Once the new assembler is assembled, it can assemble itself."
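To make the sum-is-XOR, carry-is-AND point concrete, here is a quick Python sketch of my own (not code from the book): a half adder chained into a ripple-carry adder, the way a row of relay circuits would add.

def half_adder(a, b):
    # Add two bits: the sum is an XOR, the carry is an AND.
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    # Two half adders plus an OR handle the incoming carry.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add8(x, y):
    # Ripple-carry addition of two 8-bit numbers, one bit at a time,
    # each stage feeding its carry into the next.
    carry, total = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total

print(add8(100, 55))   # -> 155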
-
If you work with computers and haven't read this book, you are lame.
-
This is a great book. Surprisingly interesting.

While the subject matter is not a new thing to me, far from it, the way the author goes about telling the story of how modern computers came to life is exciting, engaging, and fun. He starts with Morse and Braille, talks about the principles of mathematics and information, explains the critical concept of switches, and finally moves into the world of circuit boards and binary data, culminating in the ALU. After that, he discusses the idea of analytical and computational engines and machines developed through the late 19th and early 20th century, before we finally start seeing the modern computer around the 1940s, with Turing and von Neumann laying down the foundations of what we know and use today.

The book is really cool because it's also a nostalgic trip down memory lane. Charles mentions the famous Bell Labs; the legendary Shannon, Ritchie, Noyce, and Moore; UNIX; the C language; and other people and concepts without which we would not be sitting here writing reviews on Goodreads. Or we might, but the fundamentals of the computing devices would be completely different.

Computers sound like magic, but the thing is, they are a culmination of 150 years of electric progress, 200 years of data/information progress, and about 350 years of math progress. The first boards, the first programs, the first assembler, and the first compiler were all written by hand. Control signals are still essentially the same, and if you look at a typical x86 Intel processor, the legacy support for machine instructions goes back to the first microprocessor. The problem is, when you condense the centuries of hard work into a cool whirring appliance, it does feel like magic.

The author wrote the book in the late 80s and then revised it in the late 90s, so some of the stuff may look quaint to us, like the mention of floppy disks, VGA displays, and such. But then he also shows uncanny foresight around overall information exchange, because the information principles are universal, and he correctly predicted that Moore's Law would taper out around 2015.

He also cheated a little. He described the flip-flop as a perpetuum mobile, which can be sort of excused, and he also skimped on the concepts of oscillators and transistors and did not mention capacitors. But then, those are fairly complex, and I guess it's not really possible to cover them without going deep into the fields of physics and electrical engineering. Excusable, because the book is compelling and delightful.

Even if you have a PhD in Physics from a top university, or have done computer science all your life, or can rap in ASM and name all LoTR characters by heart, this is still a good read. Do not feel like you'd be high-schooling yourself with silly analogies. Far from it. This is a splendid combo of history, technology, mathematics, information, and nostalgia.

Highly recommended.

\x49 \x67 \x6F \x72
-
I have been an IT professional for 20 years, but I never knew what the switches on the front panel of the Altair computer were for. I do now.

In fact, because of this book, I know many things about how a computer really works that I never did before. I think this book is great for anyone except electrical engineers, who would be bored. Having some background in computers probably makes this book easier to get through, but Petzold assumes nothing and starts from scratch. He does a good job of making potentially dry subjects fairly interesting.

I think an update to this book would be great, because the discussion of 1999 capacity and pricing makes the book feel dated. Also, the last chapter seemed rushed and not as well focused as the rest of the book.

So if you want to know how any computer really works, read this book.
-
What a ride! A book about computers “without pictures of trains carrying a cargo of zeroes and ones”—the absolute no-nonsense book on the internals of the computer. From circuits with a battery, switch, and bulb, to logic gates, to a thorough description of the Intel 8080. A great way to fill blanks in my computer knowledge.

The book takes the approach of constructing the computer “on the paper and in our minds”. That's great when you're at least a little familiar with the topic, maybe not so much when trying to discover completely unknown territory, but the author goes to great lengths to walk through everything step by step, e.g. the various gates, binary subtraction, memory handling, etc.

In a way, this is the perfect book on the topic. If you know a better one, I want to read it.
-
It is a great book. It demystified some thoughts I had about software architecture.