Valley of Genius is a history of Silicon Valley told by the people who made it. Since the work is based on interviews, I expected it to read as actual interviews, with dialog between the author and each "genius." Instead, the author is removed from the chapters entirely, and the text consists of the participants being quoted. This writing style took some time to accept. Initially, I wanted to know exactly who each person was and their role in the narration, but given the number of people involved, that would significantly detract from the story being told. Eventually I stopped bothering with the names, looking them up only when two (or more) speakers referred to each other. And this is the best aspect of the book: the different individuals effectively debating one another within the dialog; otherwise each chapter is just narration. That said, my concern is that context has been lost from the quotes, as the author explicitly states that the interviews were spliced together.
All in all, I enjoyed the book, but that only merits 3.5 out of 5 stars.
(A free copy of the book was provided through Goodreads.)
Tuesday, November 10, 2015
CSE Distinguished Lecture - Professor David Patterson - Instruction Sets Want to Be Free: The Case for RISC-V
As with every other talk on computer architecture, we first need to revisit history: only by knowing where we came from can we envision where to go.
History of ISA:
The IBM System/360 was proposed to unify IBM's diverse lines of mainframes. Slowly, ISAs started adding more instructions to support more features (see below).
The Intel 8086 was a crash ISA-design program to cover for Intel's original replacement ISA (the iAPX 432), which was running late. IBM then wanted to adopt the Motorola 68000 for its PC, but that chip was also late, so the IBM PC used the 8088 instead.
In the 1980s, a study found that if compiled code used only simple instructions, programs ran faster than when the full instruction set was used. So why not design a simple ISA?
RISC (Reduced Instruction Set Computing) was that ISA. The processor is simpler and faster. Secretly, all processors are now RISC internally; for example, Intel and AMD processors translate their x86 instructions into internal RISC-like micro-operations.
Perhaps several simple instructions could execute together, so the architecture could be simplified further, with the compiler finding these groups ahead of time rather than the processor spending time and energy while the program runs. This is the idea behind VLIW (Very Long Instruction Word), where multiple simple operations are packed into one long instruction (a toy sketch below).
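As a toy illustration (mine, not the lecture's), here are two independent C operations that a VLIW compiler could pack into a single wide instruction; the two-slot bundle in the comment is hypothetical notation.

```c
/* Two independent operations: there is no data dependence between
 * them, so a VLIW compiler can schedule both into one wide
 * instruction, e.g. a hypothetical two-slot bundle:
 *
 *   { add r1, r2, r3  |  mul r4, r5, r6 }
 *
 * A conventional superscalar processor must rediscover this
 * independence at run time; VLIW moves that work to compile time. */
void independent_ops(int *x, int *y, int a, int b, int c, int d) {
    *x = a + b;  /* slot 1 of the bundle */
    *y = c * d;  /* slot 2 of the bundle */
}
```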
Open ISA:
Computer architecture is reaching limits such that processor gains will soon come from custom, dedicated logic. Yet IP issues limit the ability to do research on ISAs; researchers are forced to write simulators that may or may not mimic actual hardware behavior.
Proprietary ISAs continue to grow, at roughly two instructions per month. This growth provides a marketing reason to purchase the new cores, beyond the architectural improvements alone.
Instead, let's develop a modular ISA using many of the best ideas from existing designs. For example, provide atomic instructions in both styles, fetch-and-op as well as load-linked / store-conditional (sketched below), and a compressed instruction format so common instructions can use 16 bits rather than 32 (as ARM's Thumb does).
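As a rough sketch of the two atomic styles (my example, not from the talk), here is minimal C using the GCC/Clang __atomic builtins; on a RISC-V target with the A extension, the first typically lowers to a single amoadd.w, while the compare-and-swap loop in the second compiles to a load-reserved / store-conditional (lr.w / sc.w) sequence.

```c
#include <stdint.h>

/* Style 1: fetch-and-op. A single hardware instruction on RISC-V
 * (amoadd.w) atomically adds and returns the old value. */
static int32_t counter_increment(int32_t *counter) {
    return __atomic_fetch_add(counter, 1, __ATOMIC_SEQ_CST);
}

/* Style 2: load-linked / store-conditional, expressed here as a
 * compare-exchange retry loop. On RISC-V this becomes an
 * lr.w / sc.w loop that retries if another hart intervened. */
static int32_t counter_increment_llsc(int32_t *counter) {
    int32_t old = __atomic_load_n(counter, __ATOMIC_RELAXED);
    /* Retry until no other thread modified *counter between our
     * read (the "load-linked") and our write (the
     * "store-conditional"). */
    while (!__atomic_compare_exchange_n(counter, &old, old + 1,
                                        /*weak=*/1,
                                        __ATOMIC_SEQ_CST,
                                        __ATOMIC_RELAXED)) {
        /* 'old' now holds the current value; the loop retries. */
    }
    return old;
}
```

The fetch-and-op form is denser and cannot livelock, while LL/SC generalizes to arbitrary read-modify-write sequences, which is why an ISA might want both.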
RISC-V has support for the standard open-source software stack: compilers (gcc, LLVM), Linux, etc. It also provides synthesizable core designs, simulators, and so on.
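As a quick, hypothetical smoke test of that toolchain support (the toolchain and emulator invocations below are my assumptions, not from the talk; exact package names vary by distribution):

```c
/* hello_riscv.c -- a trivial cross-compilation smoke test.
 *
 * Assuming a Linux-targeting RISC-V cross toolchain and QEMU's
 * user-mode emulator are installed:
 *   riscv64-linux-gnu-gcc -O2 -static hello_riscv.c -o hello_riscv
 *   qemu-riscv64 ./hello_riscv
 */
#include <stdio.h>

int main(void) {
    printf("Hello from a RISC-V binary!\n");
    return 0;
}
```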
Labels: architecture, CISC, history, open source, presentation, RISC, VLIW
Monday, September 23, 2013
Book Review: Turing's Cathedral
Recently, I read Turing's Cathedral: The Origins of the Digital Universe (Vintage), an interesting work of history that tells of the development of some of the first computers in the United States. Its particular focus is the founding of the Institute for Advanced Study (IAS) and John von Neumann's time there. The von Neumann architecture is something I regularly conceptualize and use in teaching, so it was interesting to read how this architectural model was developed, and why.
In contrast, a significant portion of the book is written as a history of the Institute itself. Given that the Institute provided access to the records that form much of the source material, it is understandable that the author's focus would be so directed; however, it contributes to the book's misleading focus.
Perhaps the greater slight is that a work titled "Turing's Cathedral" features Alan Turing in only a small part of the writing. Instead, the greater focus is on his work and how it fit into the research of the time, research that eventually led von Neumann to explore using electronic digital computers to solve the US military's problems. Von Neumann, like many European scientists, had left his homeland ahead of Hitler, and these scientists supported work leading to Germany's defeat.
The grand development that von Neumann introduced was making a computer programmable. Beyond being merely reconfigurable, the machine built by the project he led at the IAS could store both data and the coded instructions telling the device what to do. Consider that for the next fifty years, programs would be constrained by having to store their instructions in memory, which was often a very limited resource.
So John von Neumann stars in a book titled for Turing, a book that devotes a third of its pages to references. It is a good, interesting work that could have been improved by an editor's scissors: trimming the writing down to the core material about computers, and setting aside many of the well-researched chapters, to attain the focus it lacks.