Scientists and floating point math... An article I read this morning makes a valiant attempt to demystify floating point math for scientists. I'm not sure it succeeds :) Reading it reminded me of an experience I had back in the early '80s.
A San Diego company had themselves a real problem. They'd contracted with an independent consulting engineer (very common back then for anything involving microcomputers) to develop a TTY modem that ran on a single chip microcomputer. This was to be part of a device to allow deaf people to use telephones. Telephone companies had recently been mandated to provide such devices for deaf customers, and this company was trying to win the bid to supply the local telephone company with thousands of them.
The problem was that the engineer they hired had quit in a huff, and claimed to have accidentally destroyed the source code for the firmware on the chip. All the company had was a single mostly-working chip, from which the object code could be read. They wanted someone to reverse-engineer the code, deliver the reconstructed source code, and (ideally) to fix the problems.
I took that contract, but only on an hourly rate basis, as I had no idea what I was going to find. The company didn't like that much, but they also had no alternatives.
The firmware had originally been written in assembly language, as most performant things were back then. That was a mixed blessing: it was easy to recover source code from the object code, minus any symbol names – but quite challenging to actually understand what that code did and to assign meaningful names. I spent several weeks totally immersed in it.
A big part of the code turned out to be a floating point package, one that the previous engineer had apparently written himself. I had just written one of these a few years prior (as part of Tarbell Basic), and I'd also written one while in the Navy, so I had some familiarity with how to do this. This particular package had some trigonometric functions in it that were used by the modem software – and those trig functions were written in a way that caused cumulative errors of exactly the kind described in the linked article.
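For a flavor of the problem (this is just an illustration, not the original firmware's algorithm), here's a common pattern that produces exactly this kind of cumulative error: generating successive sine samples by rotating a vector one small step at a time, so each step's rounding error is carried into every step that follows:

```c
#include <math.h>
#include <stdio.h>

// Illustration only (not the original firmware's algorithm): generate a sine
// wave by rotating a unit vector one small step at a time. Every rotation
// carries the previous step's rounding error forward, so in single precision
// both the amplitude and the phase drift noticeably after enough steps.
int main(void) {
    const double PI    = 3.14159265358979323846;
    const double step  = 2.0 * PI / 360.0;     // one-degree steps
    const long   steps = 1000000;              // a million of them

    float c = (float)cos(step), s = (float)sin(step);
    float x = 1.0f, y = 0.0f;                  // unit vector at angle 0

    for (long n = 0; n < steps; n++) {
        float nx = x * c - y * s;              // rotate by one step
        float ny = x * s + y * c;
        x = nx;
        y = ny;
    }

    double angle = steps * step;
    printf("accumulated sin = %.6f   true sin = %.6f   |vector| = %.6f\n",
           y, sin(angle), sqrt((double)x * x + (double)y * y));
    return 0;
}
```

Compute each sample independently instead, and the drift simply can't happen.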
So I went back to the company that hired me and told them that I'd found a significant problem, and it was one that could be fixed. Naturally, they wanted to know what the problem was. While explaining it to the engineering team, I discovered that the source of the trig function algorithm was a scientist who worked for the company: he had handed that algorithm to the former programmer, who implemented it as asked. So the source of the problem wasn't the programmer, but rather this scientist (a physics guy who knew about digital signal processing). Next thing I know, I'm lecturing this scientist on the details of how floating point worked, and he really, really didn't want to hear it :) In the end, the only way I could convince him was by constructing simple test cases and showing him the results.
There was a happy ending to all this, though. Once I convinced the scientist of the errors consequent to his algorithm, he was willing to listen to some alternatives. One of them was a well-known approach that avoided the cumulative error problem and also was many, many times faster than his algorithm: a simple polynomial approximation. I didn't invent this; I got it from a book written in the '50s by someone with the appropriate degree. Even the scientist was happy with this one!
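Here's a sketch of the polynomial idea (these aren't the coefficients from that job, just the shape of the technique): reduce the angle to a small range using the sine function's symmetries, then evaluate a short polynomial in Horner form. Each call stands alone, so there's nothing to accumulate, and the whole thing is a handful of multiplies and adds:

```c
#include <math.h>
#include <stdio.h>

// A sketch of the polynomial approach (not the exact coefficients from that
// job): reduce the angle to [-pi/2, pi/2] using sin's symmetries, then
// evaluate a short odd polynomial in Horner form. Each call stands alone,
// so there is no state in which error can accumulate.
static double poly_sin(double x) {
    const double PI = 3.14159265358979323846;

    // Range reduction: fold x into [-PI, PI], then into [-PI/2, PI/2].
    x = fmod(x, 2.0 * PI);
    if (x >  PI) x -= 2.0 * PI;
    if (x < -PI) x += 2.0 * PI;
    if (x >  PI / 2.0) x =  PI - x;
    if (x < -PI / 2.0) x = -PI - x;

    // Truncated series x - x^3/3! + x^5/5! - x^7/7! + x^9/9! in Horner form.
    // (A production version would use minimax coefficients for tighter error.)
    double x2 = x * x;
    return x * (1.0 + x2 * (-1.0 / 6.0 + x2 * (1.0 / 120.0
             + x2 * (-1.0 / 5040.0 + x2 * (1.0 / 362880.0)))));
}

int main(void) {
    // Spot-check against the C library's sine.
    for (double a = 0.0; a <= 6.0; a += 1.0)
        printf("x = %.1f   poly = %.9f   libm = %.9f\n", a, poly_sin(a), sin(a));
    return 0;
}
```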
Geek: fast inverse square root hack... This has long been one of my all-time favorite hacks, mainly because it looks so incredibly unlikely at first blush (a magic constant of 0x5f3759df? really?), but then turns out to have a solid reason why it works. I've read explanations before, but none so clear as this one – and I love the generalization of it to other powers besides 0.5...
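For anyone who hasn't run into it, here's the classic form of the hack (my own sketch, not the linked article's code). The trick works because the bit pattern of an IEEE 754 float is, roughly, a scaled and biased logarithm of its value: shifting those bits right by one halves the logarithm (a square root), subtracting from a magic constant negates it (an inverse) while correcting the bias, and one Newton-Raphson step polishes the result:

```c
#include <stdint.h>
#include <string.h>

// Classic fast inverse square root, as popularized by Quake III Arena.
// A sketch for illustration only; on modern hardware, 1.0f / sqrtf(x)
// is both faster and more accurate.
float fast_rsqrt(float x) {
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);           // reinterpret the float's bits as an integer
    i = 0x5f3759df - (i >> 1);          // crude estimate of -0.5 * log2(x), bias corrected
    float y;
    memcpy(&y, &i, sizeof y);           // back to float: a rough guess at 1/sqrt(x)
    y = y * (1.5f - half * y * y);      // one Newton-Raphson iteration refines the guess
    return y;
}
```

The generalization to other powers falls out of the same observation: for x^p, the integer-domain estimate is roughly p times the bit pattern plus (1 - p) times a bias constant, and p = -1/2 gives the shift-and-subtract form above.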
A habit of ignorance... Way back in 1972, I had recently enlisted in the U.S. Navy and was attending a series of schools at the Mare Island Naval Base near Vallejo, California. The first training I underwent was for basic computer repair, and that training started with a dose of digital logic. One of the first things we were told was that if we flunked out of this school – as about a third of the attendees would – we would be reassigned to another school on the same base: the PBR (river patrol boat) school. These were plywood boats, lightly armed, that patrolled Vietnamese rivers – and their crews had the highest casualty rates of all the armed forces fighting in Vietnam. That concentrated my mind wonderfully, to steal a turn of phrase from Samuel Johnson.
One consequence is that I paid serious attention to the classroom work, even when it didn't appeal to me – something I had never done before. I also started observing the students that did well and those who did not, the better to guide myself. There was an “Aha!” moment for me one day that I can still remember vividly. The instructor posed a problem on a blackboard (with chalk!) and asked who had an idea how to attack it. The couple of high-performing students I was observing looked puzzled and kept their hands down. Nearly everyone else raised their hand – and not a one of them had any clue how to tackle the problem posed. The “Aha!” was that the best students were the ones most comfortable with being ignorant. How interesting!
That started me on what became a lifelong (and by now, pretty much reflexive) habit: to frankly acknowledge my own ignorance, both to myself and to others. I don't always succeed at this, as my friends will attest :) I'm better at doing so on technical subjects than on others. I've learned that recognizing my own ignorance is a prerequisite to motivating myself to address that ignorance. In other words, a key step in acquiring new technical knowledge is to recognize that I need that knowledge. That might seem like a little thing, but it's played a very large role in my career. There are an infinite number of interesting technical subjects that I'm ignorant of, and therefore no end of things for me to learn. I've never been comfortable when not learning; in fact, any situation that doesn't require me to learn something new is one that I find boring. I've also learned that admitting ignorance does not lower the degree of respect that others have for my technical ability – in fact, it may do the opposite. Because people know that I will readily admit to not knowing something, they tend to believe (not necessarily with justification :) that if I am not claiming ignorance, I must have some idea what I'm talking about.
Because of this context, I found this article quite interesting...
“Maybe the Vietnamese should send us some advisers.” That line, from a column by Glenn Reynolds (aka “Instapundit”) certainly caught my eye. I'm old enough to remember when Kennedy sent American military advisers to Vietnam, and I was in the U.S. Navy, taking South Vietnamese refugees on board our ship after the fall of Saigon. That line has an emotional context for me.
So what's Mr. Reynolds talking about? He's riffing on a study showing that just 70% of Americans believe people are generally better off under capitalism, with its attendant inequalities – but 95% of Vietnamese believe that. Maybe we should take on some Vietnamese advisers!
The column makes some interesting points, and is well worth the read...