Sunday, September 19, 2010

Critical Analysis: Computers - The Life Story of a Technology (pgs. 1-83)

Computers starts off a bit slow, introducing the concept of mathematical theories throughout mankind's history. From cavemen to the Romans to the modern day, Ferro speaks of the different, independent number systems developed over the course of history. As it progresses, however, the book becomes far more interesting. Ferro mentions a few of the different mechanical means of applying mathematics invented in the 17th and 18th centuries. Eventually he brings up Charles Babbage, at which point the book turns away from the fundamental mathematical concepts behind computers and toward the development of what would eventually become the modern-day computer. Until Babbage's time, “computers” were literally people who computed numbers. Babbage, however, sought to eliminate as much human intervention from the mathematical process as possible. The Analytical Engine is a prime example of this: it is, in many ways, an abstract blueprint for the computer as we have known it throughout the 20th century and today. In Ferro’s words, it “is considered the first realizable design for a general-purpose computer.” (pg. 17)

As the book enters the Second World War era, computing technology picks up at a dramatically faster rate. IBM becomes the largest player in this newly fleshed-out field, going from a “punchcard” company to a digital one. In the years during and after the war, the “second generation” of computer hardware is created, characterized by the invention of the transistor. The transistor was, according to Ferro, instrumental in “rapidly replacing vacuum tubes in computers . . . because transistors were much smaller, generated less heat, and were more reliable.” (pg. 52) Two important “inventions” are discussed in this section: the introduction of high-level programming languages (FORTRAN and COBOL being the main players) and the idea of the Turing Test for assessing artificial intelligence.

The first half of the book finishes by discussing Jack Kilby and the invention of the integrated circuit. Room- and building-sized computers were reaching the end of their lives as Kilby sought to alleviate their bulk. At the same time, it was becoming impractical to wire ever more transistors into circuits by hand, as “The limits of making electronics by hand became apparent.” (pg. 66) Kilby introduced and patented the idea of placing thousands of transistors on a single piece of silicon, and in doing so birthed the modern-day (third-generation) computer.

I have noticed that many of my classes this semester are introducing the same concepts. I’m learning the history of electronics (The Electronic Century, Nebeker - 01:512:395), the fundamentals of computing and programming, and many similar ideas all at once. I’d consider this an advantage, because the overlap between courses possibly means less overall study time :D. It is also interesting because it allows me to look at a single idea (such as, perhaps, the introduction of the transistor) from many angles. The rise of computers and computing technology changed society and communication forever, on both the micro and the macro level. For example, airlines were able to use computers and punchcards to significantly streamline their operations while simultaneously serving thousands more customers. The US Census Bureau saved enormous amounts of money and time thanks to developing computer technology. Thanks to companies like Fairchild Semiconductor and IBM, America became the center of this developing realm of technology, and with it grew into an even greater world superpower.

-kevin
