One of my biggest fears about financial security in retirement ... is the progressive/Democratic push to find ways to tax the assets I (and zillions of other retirees) depend on. Here's Hillary's plan, which is quite modest compared with what Sanders would try to do. I don't think even Hillary's plan is likely to pass a Congress where either house is Republican-controlled – but on the other hand Republicans have approved a bunch of other things I never thought they would. So I just have to live in fear.
The progressives are relentless on this general topic, though. They need money to finance their social engineering projects, and retiree assets (including pension funds) are one of the more obvious targets. They're going to keep grinding away at it, I'm sure, just as they have on so many other topics...
Monday, July 20, 2015
Some day...
Some day ... some sweet, fine day ... the level of stupid on the Internet may decline to match the level of stupid in the general population. But today is not that day...
Linus downplays AI dangers...
But he says it much more entertainingly than I did:
“I just don’t see the situation where you suddenly have some existential crisis because your dishwasher is starting to discuss Sartre with you.”

Ah, Linus. Don't ever change, please!
Am I a “Real Programmer”?
I ran across this article in my morning reading, and the further I got into it the more it piqued my interest. It discusses a story about a programmer from the late '50s and early '60s named Mel Kaye. It seems that Mel did all sorts of things that modern programmers, working in high-level languages atop operating systems that hide the bare metal, think of as closely akin to magic, certainly incomprehensible to mere mortals.
The surprise for me: I did most of the things that Mel did! Ergo, I must be a “Real Programmer”!
Mel wrote in machine language (no compilers, no assemblers), writing the machine code down in hexadecimal. I did all of my programming before 1978 in machine language, writing the code down, on paper, in octal. My choice of octal was dictated by the machines I was writing code for (primarily Univac 1206s, 1218s, and 1230s). I did so because no programming tools (compilers, assemblers) were available to me, and I didn't even know such tools existed. In 1976 I “invented” something that would be recognizable as an assembler, and was astonished when I discovered (from a Univac tech) that such tools were already available :)
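For readers who've never hand-assembled a program, a tiny illustration of why octal was the natural notation: word sizes on these machines were multiples of three bits, so a word written on paper reads directly as octal digits. The sketch below assumes an 18-bit word and an invented two-field instruction layout; it is not the actual Univac format.

```c
/* Illustrative only: the field layout is invented, not the actual
 * Univac instruction format. An assumed 18-bit word is exactly six
 * octal digits: here, two for the opcode and four for the address. */
#include <stdio.h>

int main(void) {
    unsigned word = 0501234;              /* six octal digits = 18 bits */
    unsigned op   = (word >> 12) & 077;   /* top two octal digits       */
    unsigned addr = word & 07777;         /* bottom four octal digits   */
    printf("word %06o -> op %02o, addr %04o\n", word, op, addr);
    return 0;
}
```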
Mel wrote code that was self-modifying. So did I. So did everybody writing code for Univac computers – self-modification was actually a machine language intrinsic. On those machines, to call a subroutine one would use the “call” instruction, which stored the return address in the first location of the subroutine. Geek aside: yes, that means there was no stack, and recursion was ... difficult. On the other hand, in those bad old days core memory was an incredibly scarce commodity – and though modern programmers might be completely unaware of this, recursive programming is quite a memory hog. One can always accomplish the same thing with a loop using far less memory.

Further, Mel wrote code that used tricky math on self-modifying code. So did I. In the mid-'70s I wrote a floating point math package that made extensive use of this particular trick in the tightest parts of the inner loops, squeezing out every clock cycle that I could. That wasn't remarkable back then – it was the ordinary way of optimizing code, and the computer instruction sets had features designed to make this sort of exploit possible. Today self-modifying code is considered a kind of heresy; before the late '70s it was absolutely the norm in any high-performance system. We ran into this a lot in the Navy, as the performance we needed in those realtime systems was pushing the edge of what was even possible on the hardware of the day.
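A minimal sketch of that calling convention, simulated in C on an invented toy machine (the opcodes, word format, and addresses are all made up, not any real Univac instruction set):

```c
/* Minimal sketch of the "store the return address in the
 * subroutine's first word" call convention. Everything here
 * is invented for illustration. */
#include <stdio.h>

enum { OP_CALL, OP_RET, OP_WORK, OP_HALT };

typedef struct { int op; int arg; } Word;

int main(void) {
    Word mem[64] = {{0, 0}};

    /* Subroutine at address 10. Word 10 is reserved to hold the
     * return address; the body starts at word 11. */
    mem[11] = (Word){OP_WORK, 42};   /* "do some work"              */
    mem[12] = (Word){OP_RET,  10};   /* return through the slot     */

    /* Main program. */
    mem[0] = (Word){OP_CALL, 10};
    mem[1] = (Word){OP_HALT, 0};

    for (int pc = 0; ; ) {
        Word w = mem[pc];
        switch (w.op) {
        case OP_CALL:
            mem[w.arg].arg = pc + 1;  /* self-modification: CALL
                                         writes the return address
                                         into the subroutine itself */
            pc = w.arg + 1;           /* then enters past the slot  */
            break;
        case OP_RET:
            pc = mem[w.arg].arg;      /* indirect jump via the slot */
            break;
        case OP_WORK:
            printf("work: %d\n", w.arg);
            pc++;
            break;
        default:
            return 0;                 /* OP_HALT */
        }
    }
}
```

The recursion problem falls straight out of this scheme: a nested call to the same subroutine overwrites the saved return address in word 10, so without a stack there is nowhere to keep the outer caller's address.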
Mel wrote code that was hand-optimized for the timing of a magnetic drum rotation. I did this, and a lot of it, on a drum machine that was used for training in the technical school I attended. Even the U.S. Navy wasn't actually using drum memory any more by the '70s :) I even did the exact same trick Mel did, using the drum's latency to add delays when printing to a Friden Flexowriter. I was fascinated by the optimization problem for programs executed from drums, and I developed a set of paper forms to make that problem easier. One particular trick that Mel did (optimizing the parts most frequently executed) was one that I focused on as well. Of course I did, as it's the obvious way to get more performance out of a program.
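The drum-timing problem is easy to show in miniature: instruction fetch waits for the right word to rotate under the read head, so the trick is to place each instruction where the head will be just as the previous one finishes executing; miss the window and you wait a full extra revolution. A toy sketch, with all the word counts and timings invented:

```c
/* Toy model of drum placement (all numbers invented, not from any
 * real machine). With N words per track, if an instruction takes
 * t word-times to execute, the ideal home for the next instruction
 * is the word passing under the head when execution finishes. */
#include <stdio.h>

#define WORDS_PER_TRACK 64

static int best_next_addr(int addr, int exec_word_times) {
    /* +1: the word after the current one is already moving under
     * the head while the instruction executes. */
    return (addr + 1 + exec_word_times) % WORDS_PER_TRACK;
}

int main(void) {
    int exec_times[] = {3, 1, 5, 2};  /* invented timings, in word-times */
    int pc = 0;
    for (int i = 0; i < 4; i++) {
        int next = best_next_addr(pc, exec_times[i]);
        printf("instr at %2d (t=%d): place next instr at %2d\n",
               pc, exec_times[i], next);
        pc = next;
    }
    return 0;
}
```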
Mel wrote a blackjack program. I wrote a chess program (for multiple computers). I've no idea about the relative complexity of the two games, as I don't even know how to play blackjack. Writing a program for multiple computers, though, is definitely more difficult than a program for a single computer. The multiple computers I wrote for weren't connected with a modern network, either – instead, they were each connected point-to-point to four other computers with “inter-computer channels” in an 8-by-8 array (64 computers) with the edges wrapped around. There was no operating system, so I wrote a system of message forwarding that allowed any participating computer to send a message to any other.
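The post doesn't describe the actual forwarding rule, but on a four-neighbor wrapped array one natural choice is dimension-order routing: hop along the shorter wrap-around direction in x until aligned with the destination, then do the same in y. A hypothetical sketch:

```c
/* Hypothetical sketch of routing on an 8x8 wrapped array: each node
 * links to its four neighbors (N/S/E/W) and the edges wrap around.
 * This is one plausible rule, not the original system's. */
#include <stdio.h>

#define N 8

/* Shortest signed step (-1, 0, +1) from a toward b on a ring of N. */
static int step(int a, int b) {
    int d = ((b - a) % N + N) % N;     /* forward distance, 0..N-1 */
    if (d == 0) return 0;
    return (d <= N / 2) ? 1 : -1;      /* go the shorter way round */
}

int main(void) {
    int x = 0, y = 0;                  /* source node      */
    int dx = 6, dy = 3;                /* destination node */
    while (x != dx || y != dy) {
        if (x != dx) x = (x + step(x, dx) + N) % N;  /* x first */
        else         y = (y + step(y, dy) + N) % N;  /* then y  */
        printf("hop to node (%d,%d)\n", x, y);
    }
    return 0;
}
```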
So I'm thinking that I qualify as a “Real Programmer” – and so do a lot of other engineers who worked in the '60s and '70s. I can't be the only one who survived that! :)