Digital Apollo
David A. Mindell, MIT Press, 2008
I may have missed some of this book, because I tried to read it in three evenings, and started falling asleep after midnight. With less exhaustion and an open schedule, I would have done my damnedest to read it straight through in one sitting - it is that good. You can look at the rest of this site to see what I keep busy with.
The core of this book is the development of the Apollo guidance computer, the first integrated circuit computer to fly in orbit, with a focus on the decisions that expanded its role to the central planning and scheduling tool for the Apollo flights. The computer was an evolution of the Polaris SLBM guidance computer, and both were developed by the MIT Instrumentation Laboratory, renamed the Charles Stark Draper Laboratory after its founder in 1970, and spun off as a non-profit corporation in 1973.
The Apollo guidance computers were heavy, and used thousands of Fairchild μL914 RTL dual NOR gates. This resonates personally - though most of my high-school experiments used TTL dual inline packages, I still have some round-can 914's in my parts bins, although not the expensive space-grade ceramic flat-packs used by the Apollo Block 2 computers. I was surprised to learn that the chips for the flight computers, and for the many more computers used on the ground for software development and by mission control backroom engineers, consumed 60% of Fairchild's production.
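Everything in the machine - registers, counters, the sequencer - was built up from that one gate type. As a toy illustration (my own sketch, not the AGC's actual logic equations), here is a set/reset latch made from one cross-coupled dual-NOR package, modeled in Python:

{{{
# Illustrative only: the AGC built all of its logic from NOR gates like
# the uL914 (two 2-input NORs per can). Cross-coupling the two gates in
# one package gives a set/reset latch, the most basic storage element.

def nor(a, b):
    """2-input NOR: output is 1 only when both inputs are 0."""
    return 0 if (a or b) else 1

def rs_latch(s, r, q):
    """One settle of a cross-coupled NOR latch; returns the new Q.
    s=1 sets Q, r=1 resets Q, s=r=0 holds the previous value."""
    for _ in range(2):          # let the feedback loop settle
        q_bar = nor(s, q)
        q = nor(r, q_bar)
    return q

q = 0
q = rs_latch(1, 0, q)   # set   -> q == 1
q = rs_latch(0, 0, q)   # hold  -> q == 1
q = rs_latch(0, 1, q)   # reset -> q == 0
print(q)
}}}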
The computers used a cubic foot of magnetic core read-only memory, with cores (half inch diameter?) threaded by dozens of wires, acting as magnetic AND gates to generate a voltage blip on a sense wire. These were strung on long bundles of wire, with special wire weaves creating the equivalent of a dozen bits of pattern memory per core. The strings of cores were called ropes, and represented 72 kilobytes of program code, generated with low-level languages and paper tape, and fed to special rope assembly machines (operated by "little old ladies") at subcontractor Raytheon.
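The way I picture a rope read, in a deliberately over-simplified Python model (one word per core here, where the real ropes packed many words per core, and none of the wiring detail comes from the book):

{{{
# Over-simplified core rope model: each address selects one core; bit i
# of the stored word is 1 if sense wire i is threaded *through* that
# core, 0 if it bypasses it. Pulsing the selected core puts a blip only
# on the wires threaded through it. (The real ropes stored many words
# per core; this one-word-per-core version just shows the principle.)

WORD_BITS = 16

def weave(words):
    """For each core, record which sense wires must thread through it."""
    return [{bit for bit in range(WORD_BITS) if (word >> bit) & 1}
            for word in words]

def read(rope, address):
    """'Pulse' one core and reassemble the word from the sense-wire blips."""
    return sum(1 << bit for bit in rope[address])

program = [0o30001, 0o54321, 0o00000, 0o77777]   # arbitrary example words
rope = weave(program)
assert all(read(rope, a) == w for a, w in enumerate(program))
}}}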
'''Notes from Wikipedia''' - 16-bit words, 4 Kbytes of writable core, 4,100 single NOR3s in Block 1, 2,800 dual NOR3s in Block 2. In 1968, Noyce and Moore left Fairchild to found Intel.
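Those figures line up with the memory sizes above, assuming the commonly quoted Block 2 capacities of 36,864 fixed (rope) words and 2,048 erasable words of 16 bits each; a quick check:

{{{
# Quick consistency check on the memory figures, assuming the commonly
# quoted Block 2 capacities (36,864 fixed words, 2,048 erasable words,
# 16 bits per word).

WORD_BYTES = 2            # 16-bit words
FIXED_WORDS = 36_864      # rope (read-only) memory
ERASABLE_WORDS = 2_048    # writable core

print(f"rope:     {FIXED_WORDS * WORD_BYTES / 1024:.0f} KB")    # ~72 KB
print(f"erasable: {ERASABLE_WORDS * WORD_BYTES / 1024:.0f} KB") # 4 KB
}}}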
No schematics (though those may be available on the NASA technical report server), but enough description of the engineering and software that I can extrapolate how they worked. The focus of the book is not on the hardware, but on the interaction of these new machines with mission design, astronaut training and attitudes, and the performance of the missions.
As usual, the media presented a distorted history, and the astronauts themselves changed the story to justify their intervention in landings that could have been performed almost completely by software. The astronauts had an essential role to play. The maps of the landing places were inexact and undetailed. Close up, there were many rocks and small craters to avoid, and the commander pilots needed to visually locate safer places to land than the preprogrammed targets. However, the descent engine kicked up so much dust below 200 feet that they couldn't see where they were landing, so it would have been better to use the software they had to designate a new landing position, and let the computer do the steering. The commander took the stick for all six landings, with control movements interpreted by the computer, but the automated landing sequence was disabled.
The commander looked through a small window - during much of the descent, the window was pointed away from the landing spot. If I were designing things, I would have used telescopic optics and widely spread movable mirrors to bring the view to the commander's eyes, allowing the surface to be examined from a much greater distance, using gratings and parallax to estimate distance and surface slope more accurately. In 2014 we would use video displays and hundreds of tiny cameras - no need for heavy hull-penetrating windows.
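The geometry is simple enough to sketch with made-up numbers: two apertures a few meters apart see the same boulder at slightly different angles, and the range falls out of that angular difference (a rough sketch with an assumed baseline and parallax angle, not anything proposed in the book):

{{{
# Rough parallax sketch, with invented numbers: two widely separated
# apertures (baseline b) see the same surface feature with a small
# angular difference p; the range follows from the triangle.

import math

def range_from_parallax(baseline_m, parallax_rad):
    return baseline_m / (2.0 * math.tan(parallax_rad / 2.0))

baseline = 4.0                   # assumed spread between the two mirrors, meters
parallax = math.radians(0.05)    # assumed measured angular difference
print(f"range ~ {range_from_parallax(baseline, parallax):.0f} m")
# Slope falls out the same way: range two nearby features and compare
# their heights over their horizontal separation.
}}}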
But these landings were, as the author points out in the last chapter, expeditions. Science gathers data; expeditions gather human experience. Expeditions are expensive in money, and sometimes in human life. The book "Wheels Stop" about the Space Shuttle program tells us about many more almost-disasters, in addition to the two that destroyed shuttles and killed crews. Using insights from both books, it seems part of the genesis of the shuttle disasters was a design philosophy that tried to give flight control back to pilot-astronauts, and the really bad design decisions (wings???) that resulted.
While later landings came closer to target, Apollo 11 overshot by [[http://www.apogeerockets.com/downloads/Newsletter276.pdf | six kilometers ]] out of a 400,000 kilometer journey. The Curiosity lander travelled 1000 times farther to Mars, but landed only 2 kilometers from the center of the bullseye. Robots can do a very good job.
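Put proportionally (a quick back-of-the-envelope using the rounded figures above), the difference is even starker:

{{{
# Back-of-the-envelope: miss distance as a fraction of trip length,
# using the rounded figures from the text.

apollo_miss_km, apollo_trip_km = 6, 400_000
curiosity_miss_km, curiosity_trip_km = 2, 400_000_000   # ~1000x farther

print(f"Apollo 11: {apollo_miss_km / apollo_trip_km * 1e6:.0f} ppm of the trip")
print(f"Curiosity: {curiosity_miss_km / curiosity_trip_km * 1e6:.3f} ppm of the trip")
}}}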
The book ends powerfully, describing the 100th annual dinner of the Explorers Club in 2004. Some of the world's greatest explorers graced the stage: Bertrand Piccard (balloonist), Edmund Hillary, Buzz Aldrin. The last speaker was Dr. Steven Squyres of Cornell, who gave a spellbinding description of his group's robotic exploration of Mars, as engaging as the physical explorers who preceded him.
=== The Curiosity Mars Rover ===
At a banquet at the [[ http://isdc.nss.org/2012 | 2012 ISDC]], I watched Adam Steltzner, manager for Curiosity's Entry, Descent, and Landing (EDL) team, give a spellbinding description of that landing, not as a scientist but as an engineer who led the brilliant team that designed that astounding system. Their unprecedented task - build a physics-accurate Mars model in a supercomputer cluster, and land a model of Curiosity in that computer. The physical landing on Mars years later did what the computer model had done hundreds of times before. I am a chip designer - we do the same thing, simulate and build, using accurate models of incredibly complex systems to achieve first silicon success. It is thrilling to see this same approach work on Mars, and with the complex CPUs my friends at Intel design.
Collect scientific data, model it accurately, test and calibrate the models with radical but inexpensive physical tests, then build hugely complicated systems from hugely complicated computer models. It works.
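In the same spirit, here is a toy Monte Carlo of "land it in the computer hundreds of times first" - the dispersion model is entirely invented, nothing like the real physics-accurate EDL simulation, but it shows the shape of the method:

{{{
# Toy Monte Carlo in the spirit of "land it in the computer first":
# run the same landing hundreds of times with randomly perturbed inputs
# and look at the scatter of outcomes. The dispersion model is invented.

import random
import statistics

def simulate_landing(rng):
    """Return the miss distance (km) for one randomized landing attempt."""
    entry_error = rng.gauss(0.0, 0.8)    # assumed downrange error at entry, km
    wind_drift = rng.gauss(0.0, 0.5)     # assumed drift during descent, km
    guidance_fix = -0.7 * entry_error    # assumed closed-loop correction
    return abs(entry_error + wind_drift + guidance_fix)

rng = random.Random(2012)
misses = [simulate_landing(rng) for _ in range(500)]
print(f"median miss {statistics.median(misses):.2f} km, "
      f"worst {max(misses):.2f} km")
}}}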
=== Robotic Submersibles ===
The Oregon Museum of Science and Industry sponsors "Science Pub" in Oregon theater-pubs. Scientists and engineers present their research to a slightly buzzed audience of 300 science enthusiasts. In 2012, Stephen Hammond, chief scientist of NOAA's Marine Science Center in Newport, Oregon, presented a talk about their robotic submersible research; rather than slides, we had high definition color video from a submersible observing a volcanic event, live by satellite from the bottom of the Tonga trench in the south Pacific. We had a two-way hookup, and could question the scientists operating the robot - who were on the ship overhead, and in Newport, and in other centers around the country, all connected by the internet. There was a noticeable speed of light delay through the satellite, but we rapidly accommodated that. With the 60 foot projection display, it was like we were at the bottom of the ocean, with the scientists, looking out through a big window. We had a better view than through an astronaut's visor.
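That lag is easy to estimate; assuming a geostationary relay (which satellite NOAA actually used is my guess), the numbers come out close to what we experienced:

{{{
# Rough estimate of the light delay, assuming a geostationary relay
# (the choice of satellite is my assumption, and the slant range is
# approximate).

C_KM_S = 299_792.458        # speed of light, km/s
SLANT_RANGE_KM = 38_000     # ground site to a geostationary satellite, roughly

one_way = 2 * SLANT_RANGE_KM / C_KM_S    # ship -> satellite -> theater
print(f"one way ~ {one_way * 1000:.0f} ms, "
      f"question-and-answer lag ~ {2 * one_way * 1000:.0f} ms")
}}}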
The age of robotic experiment does not take humans "out of the loop" - it brings ordinary people into intimate contact with extreme environments and the wizards who virtually live there. This is as experiential as walking inside a bulky space suit on the moon - the robots are just another kind of suit, one we can all afford to wear. Exquisitely trained astronauts experience thrilling wonders while slavishly plodding through their checklists. It is good that a handful of our fellow humans do such things, occasionally. But making space meaningful to the other 7 billion of us requires direct participation through ever-better robots with predictive/adaptive control and terabit communication. In 1492, the western hemisphere was visited by three tiny wooden ships from Spain. We can appreciate their accomplishment and share their experience without travelling the same way.
=== David Mindell Video ===
A one hour [[https://www.youtube.com/watch?v=MG_-1099UM8 | Google Talk ]] about these issues, and Dr. Mindell's next book.
Also, a half hour [[https://www.youtube.com/watch?v=YIBhPsyYCiM | Science Reporter ]] from NASA (made by MIT) that shows assembly and operation of the Apollo computer. The video shows the assembly of a Block 1 computer, built around 4300 RTL 2-input NOR gates in metal cans. The video is undated, "from the mid 60s", but the film segments were probably filmed before 1965.