Monday, April 28, 2014
Eight-inch floppies still in use...
Eight-inch floppies still in use ... in our ICBM silos. They're older than the young men and women stationed there. Does anyone still make those things? All I could find were a few on sale, as rare antiques :)
California leads the way ...
California leads the way ... in attracting businesses to other states! Some big employers are picking up and moving out, often to Texas. What big businesses? Well, how about Toyota America?
Vintage stewardesses...
Vintage stewardesses... Back in prehistoric times, before the oh-so-politically-correct term “flight attendants” entered our lexicon, there were “stewardesses”. Generally they were attractive young women. Oh, how sexist! But rather nice, actually :)
Americans who haven't traveled outside this country might not realize that in much of the world, such stewardesses are still the norm. That's especially true in eastern Europe and Asia...
Binary fractions bit Donald Knuth!
Binary fractions bit Donald Knuth! At infrequent intervals, Donald Knuth publishes notes on TeX updates. Even though I haven't used TeX for years, I always read these because, well, Knuth. This one has a passage that tickled my fancy, as it's about one of my own pet peeves: the use of binary fractions (including binary floating point numbers). It seems that Donald Knuth, one of the pantheon of computer science gods, was bitten by them, too:
I made the foolish mistake of using binary fractions internally, while providing approximate decimal equivalents in the user interface. I should have defined a scaled point to be 1/100000 of a printer's point, thereby making internal and external representations coincide. This anomaly, which is discussed further in [5], is the only real regret that I have today about TeX's original design.

The gods are fallible, too :)
Companies I worked for have been burned by this issue a surprising number of times. The most memorable case I ran into was a company that implemented an electronic stock and option trading platform that used binary floating point math to hold monetary values. Big, big problems resulted. To fix it, I had to do battle with a dozen or so senior engineers – very experienced and generally competent folks – who had a great deal of trouble accepting the simple fact that binary floating point numbers are incapable of exactly representing a great many decimal fractions...
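If you want to see the problem for yourself, here's a quick illustration in Python (the trading platform itself wasn't written in Python – this is just the easiest way to show it). The point is simply that 0.10 has no exact binary floating point representation, so sums of it drift, while a decimal type or plain integer cents stay exact:

```python
from decimal import Decimal

# 0.10 cannot be represented exactly in binary floating point;
# printing extra digits exposes the stored approximation.
print(f"{0.10:.20f}")                      # roughly 0.10000000000000000555

# Accumulating a $0.10 price a million times drifts away from 100000.00...
print(sum([0.10] * 1_000_000))             # close to, but not exactly, 100000.0

# ...while a decimal type (or plain integer cents) stays exact.
print(sum([Decimal("0.10")] * 1_000_000))  # 100000.00
print(sum([10] * 1_000_000) / 100)         # 100000.0, using integer cents
```

That last line is the moral of the story for monetary software: hold money as an integer count of the smallest unit you care about, or in a decimal type, and only format it as dollars and cents at the edges.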
History of computers in space...
History of computers in space ... on NASA's pages. Lots of good stuff for technology history geeks here...
Nutrition “science”...
Nutrition “science” ... is modern, politically-correct snake oil. The more it is seriously examined, the less valid it gets...
The IPCC report is worthless...
The IPCC report is worthless... That's not really news, if you've been listening. The latest revelation just adds to the pile of evidence...
The future of orthopedic casts?
The future of orthopedic casts? 3D printing and 3D scanning are going to change a lot of things...
“Sometimes, it makes my blood boil...”
“Sometimes, it makes my blood boil...” I'll say! It's hard to believe this lawsuit could succeed, even in Canada. I sure hope it doesn't...
The first practical RAM...
The first computers I ever worked with (Univac CP-642A) used exactly this kind of magnetic core bit plane memory – 30 planes, each with 32,768 bits. They occupied about one cubic yard of the computer, packed not only with the bit planes themselves, but also with lots of analog and digital circuitry. They required constant adjustment (read and write current levels, pulse length and timing, and sense line gain), a tedious and involved procedure that was part art and part science. In the U.S. Navy school that taught me how computers worked, and how to repair them, I also learned about older technologies: storage tubes and mercury delay line memories. Weird stuff, by today's standards – but exotic, bleeding-edge stuff back then...
The first microcomputer I built used a 1702 UV-erasable EPROM, with 256 bytes of ROM – more than the biggest mainframe ROM I'd worked on. Those chips, as I recall, cost about $50 then – which seemed dirt cheap to me. My first digital design project was a programmer for those 1702 EPROMs, so that I could enter two hexadecimal digits for each byte on a keypad and automatically program it. Before I built this programmer, the only way I could “burn” code into a 1702 was to mail it off to a friend in San Diego who had a programmer, along with a hand-typed listing of the code I wanted. When my ship was floating around in the Indian Ocean, the round trip could take several months! I had a big incentive to build that programmer :)
Another gray-haired programmer ...
Another gray-haired programmer ... describing what his early days as a programmer were like. Most programmers my age seem to have similar experiences.
My own programming beginnings were much different, primarily because I had no access to programming tools (compilers, assemblers, linkers, debuggers, etc.). In the early '70s I began programming on Univac “mainframe” computers owned by the U.S. Navy. These were in a lab used for training repair technicians. The only way (initially) to load a program was by using the switches and lights on the front panel to load the program's machine code, one 30-bit word at a time. I wrote that code on lined paper, assembly mnemonics on the left, octal machine code (hand-assembled!) on the right. When I modified code, to avoid re-assembling all that by hand, I'd put a jump instruction at the first modified line of code, to jump to a patch that implemented the change, and then jump back. After a few rounds of debugging, my code could be quite a mess :)
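For anyone who has never had to patch code this way, here's a toy simulation of the trick in Python – the instruction set is made up for illustration, not the Univac's real one – showing how one bad word becomes a jump out to a patch area and back:

```python
# A made-up little machine: memory maps addresses to (opcode, operand).
memory = {
    100: ("LOAD", "A"),
    101: ("ADD", "B"),       # suppose this instruction turns out to be wrong
    102: ("STORE", "C"),
    103: ("HALT", None),
}

PATCH = 200                  # some unused region of memory

# The patch: overwrite the bad word with a jump into free memory...
memory[101] = ("JMP", PATCH)
# ...put the corrected code there...
memory[PATCH] = ("SUB", "B")
# ...and jump back to the word after the one we replaced.
memory[PATCH + 1] = ("JMP", 102)

def run(memory, pc=100):
    """Trace execution so the detour through the patch area is visible."""
    while True:
        op, arg = memory[pc]
        print(f"{pc:4d}: {op} {arg if arg is not None else ''}")
        if op == "HALT":
            break
        pc = arg if op == "JMP" else pc + 1

run(memory)
```

The running program never notices the detour – which is exactly why, after a few rounds of such patches, the code on paper and the code in memory stopped resembling each other.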
Later, out of sheer desperation, I developed my own software engineering tools. At the time, I had no idea that better tools existed elsewhere; I was just trying to save myself from the tedium of manual machine code entry. I developed tools that were the rough equivalents of monitors, loaders, assemblers, debuggers, and linkers, though I didn't know them by those names.
I went through the same sequence when I first started programming microcomputers, in 1975 – though by then I knew of the existence of commercial tools. However, those tools generally ran on minicomputers (like the PDP-11) that I had no access to and couldn't possibly afford to buy. Even the software tools were out of reach financially. So once again I wrote my own tools, for several microcomputers (especially the Motorola 6800, RCA 1802, Intel 8080, and Zilog Z80).
Today very few people would consider writing their own software development tools. For one thing, excellent tools are readily available on the Internet for free, thus removing all the things that might force one into writing one's own tools. It's also true, though, that the tools have become far more complex as the CPUs have become more and more powerful. In 1975, one could write a functional assembler in a few hundred lines of code over a week or two. A really nice compiler might be three or four times that effort. A nice assembler for today's Intel chips would probably take several years of work.
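To make that “few hundred lines” claim concrete, here's a minimal sketch in Python (for readability – obviously not what one would have written in 1975) of the classic two-pass scheme such an assembler used; the instruction set and encodings here are invented purely for illustration:

```python
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "JMP": 0x4, "HALT": 0x0}

def assemble(source):
    """Two passes: pass 1 assigns an address to every label; pass 2 encodes
    one 16-bit word per instruction (4-bit opcode, 12-bit operand)."""
    # Pass 1: build the symbol table and collect the instruction lines.
    symbols, instructions, addr = {}, [], 0
    for raw in source.splitlines():
        line = raw.split(";")[0].strip()     # drop comments and blank lines
        if not line:
            continue
        if line.endswith(":"):               # a label on a line of its own
            symbols[line[:-1]] = addr
        else:
            instructions.append(line)
            addr += 1

    # Pass 2: every label is now known, so forward references resolve.
    words = []
    for line in instructions:
        parts = line.split()
        opcode = OPCODES[parts[0]]
        operand = 0
        if len(parts) > 1:
            arg = parts[1]
            operand = symbols[arg] if arg in symbols else int(arg, 8)
        words.append((opcode << 12) | (operand & 0xFFF))
    return words

program = """
        LOAD  100        ; operands are octal addresses or labels
        ADD   101
        STORE 102
        JMP   done
done:
        HALT
"""
for address, word in enumerate(assemble(program)):
    print(f"{address:03o}: {word:04X}")
```

A real assembler of that era also handled ORG and EQU directives, expressions, listings, and error reporting, but the skeleton above is the whole idea – which is why a working one really could be only a few hundred lines.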
I've often noted that having had the experience of writing my own tools gives me a much different perspective on programming than many younger programmers have. Mostly this manifests in having a reasonably good understanding of how those tools work, whereas for many perfectly competent programmers, the mechanism by which their (say) Java source code actually gets executed is a complete mystery – and one which seldom needs to be explored...
A British foreshadowing...
A British foreshadowing... Many American liberal academics (“liberal” is almost redundant in that phrase) point to Britain's less-than-absolute free speech as a model for the United States. They'd like to enshrine a new right: the right to not be offended. That British way has led to an absurdity that I find appalling, though I'm sure many on the left would beg to differ.
In Winchester, England, a British political candidate stood up to speak. He quoted the great British leader Winston Churchill – and was arrested. Really.
That's where we're headed. Elements of this kind of speech nannyism are already in place on American college campuses, where such a speech would be similarly banned. One more progressive on the Supreme Court and such arrests are likely to become possible here...
Cliven Bundy...
Cliven Bundy... I haven't said anything about this incident because I was completely uncertain what the heck was going on. Some time has gone by now, and it looks to me like most of the salient facts are public. They are:
- Cliven Bundy, a Nevada rancher, ran his cattle on BLM land
- the BLM charges a fee for this privilege, which Bundy didn't pay for 10 years
- Bundy refuses to recognize federal ownership of that BLM land
- Bundy claims “ancestral rights” to graze on that land, a privilege non-existent under long-standing U.S. law
- Bundy made (and continues to make) overtly racist comments in a public setting, instantly causing his (mostly libertarian or conservative) supporters to backpedal like mad
- the Desert Tortoise, listed as endangered, is native to that land
- Harry Reid and his relatives have no personal interest in the land Bundy was using, breathless Internet reports and Harry Reid's unsavoriness notwithstanding
- the BLM decided to take action to evict Bundy from its land; the reasons for its timing are unclear
- the BLM used massive, near-military force to seize Bundy's cattle
- the BLM unilaterally suspended ordinary civil rights (such as the freedom of speech) in the process of its eviction