Sunday, September 28, 2008

Apple Sounds Song

Here's a song, of sorts, made entirely of those annoying Apple alert sounds.

Personally, I prefer this (gasp!) Windows one, but they are working with much more modern sounds, so I guess that's not surprising.

More info about the Apple one is on The Unofficial Apple Weblog. Thanks to extrapepperoni for bringing that to my attention.

Tuesday, September 23, 2008

'Til Death or BlackBerry Do You Part

A new study commissioned by Sheraton hotels in honor of Global Out of Office Day (which should clearly be renamed) found that most people feel technology gives them more quality time and flexibility with family and friends (84%), and 77% say their PDA helps them enjoy life more. Yet a whopping 87% of professionals bring their PDA into the bedroom, and in what may or may not be a related finding, more than one-third of those surveyed (35%) say that if forced to choose, they'd pick their PDA over their spouse. While some might say that speaks volumes about the state of so-called CrackBerry syndrome, I say dump your spouse already -- there's no love there.

Fortunately, there is an answer!

Friday, September 19, 2008

Talk Like a Pirate Day

Dear god, don't say Arrrr today! Today is Talk Like a Pirate Day, so whatever you do, do not talk like a pirate, or everyone will know what a big stinking dork you are. If you already knew it was Talk Like a Pirate Day, God help you.

Sunday, September 14, 2008

It's not about 1s and 0s

It seems like a lot of people think computers are all about 1s and 0s, and that if you understand 1s and 0s you'll somehow understand computers. People comment about this, and I've been asked how my work relates music to 1s and 0s, even though I basically never think about 1s and 0s when I'm working (I usually think about things in different terms). Today I heard an interview on the Leonard Lopate Show with some clever folks. They got asked a question along the lines of "How does a microchip know anything? Like how does it know that 3 * 3 is 9?" and the answer seemed to somewhat stump the clever folks, who stumbled for a moment and then said, to paraphrase: well, it starts with the transistors, which are like switches (they had explained that a bit already) where on is like 1 and off is like 0, and you can build up logical relationships using the transistors... [he gave some examples]... until eventually you can build up to things like that. Huh? To be fair to the clever people, this seemingly simple question is actually a very hard question, and it's not the type of question they are used to answering, so they were kind of put on the spot. Moreover, they probably hadn't thought about things like that since they were 8 years old, and as clever 8-year-olds I'm sure it made intuitive sense to them, but if someone who has no idea what they were talking about is going to take home any message from that explanation at all, it's that somehow it all boils down to the 1s and 0s.

It does not all boil down to the 1s and the 0s. Computer science has almost nothing to do with 1s and 0s, and the reason the answers get so confusing is that for some reason everyone thinks that when you want to break things down for laypeople, you break it down with 1s and freaking 0s. 1s and 0s are in the appendix of computer science: an accident of efficiency, convenience, and history, and that's about all. Now, I'm a programmer, not a computer scientist, but in every good programmer is a bit of a computer scientist, and I'd like to take a stab at explaining this in a different way. Without 1s and 0s. So, here's my version of how microchips and computer science really work (as a bonus, I'll even show you how 1s and 0s fit in, so you understand when people try to do you the "favor" of dumbing it down):

Over the years computers have gotten smarter. They started out "hand-wired" (or, in the case of microchips, designed) to do exactly what they were "told" to do. The people who designed and built them understood everything there was to know about them, because there wasn't that much to know (beyond the mind-boggling complexity of the quantum physics underneath it all). But as time marched on they got more complex, to the point where they needed division of labor: one person to design the machine and another to "program" it, say. So the two had a meeting and agreed on how things would interact, but the programmer didn't care how the designer made it happen, and the designer didn't care how it was used, as long as they both stuck to the prearranged agreement. And that's really all computer science is: division of labor, only we call it "abstraction", and as computers get more complex, there's more and more of it. Computer scientists talk about the "machine layers". It's called "abstraction" because, to the programmer, the hardware is just an "abstract" thingamajigger that does stuff, like add and store numbers. The programmer doesn't need to know how, and that's the key to the power of computers: people can build on each other's work without having to understand how the previous person did their work. The person designing the hardware can even come up with faster or different ways of performing the same operations, and as long as it meets the agreement, everything keeps working.
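If that "agreement" business sounds too abstract (ha), here's a little sketch of it in Python. This is purely my own illustration, with made-up names like Adder, NaiveAdder, and FastAdder: the agreement is the add method, the programmer's code relies only on that agreement, and the "designer" is free to swap in a completely different implementation behind it without the programmer's code changing at all.

    class Adder:
        """The agreement: give me two numbers and I'll give you their sum.
        How that happens is none of the caller's business."""
        def add(self, a, b):
            raise NotImplementedError

    class NaiveAdder(Adder):
        """One "designer's" implementation: count up one at a time
        (only for nonnegative b, but it honors the agreement)."""
        def add(self, a, b):
            result = a
            for _ in range(b):
                result += 1
            return result

    class FastAdder(Adder):
        """A later, faster implementation. The agreement hasn't changed,
        so the programmer's code below never has to change either."""
        def add(self, a, b):
            return a + b

    def programmers_code(adder):
        # The programmer builds on the agreement without caring how it's met.
        return adder.add(40, 2)

    print(programmers_code(NaiveAdder()))  # 42
    print(programmers_code(FastAdder()))   # 42 -- same answer, different machinery

Same answer both ways, which is the whole point: the programmer built on the agreement, not on the machinery behind it.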

Most users recognize the hardware, the operating system, and the software as three important layers. Computer scientists subdivide each of these three layers quite a bit more. These layers and sub-layers are really ways of dividing up labor so that everyone can agree on how things are supposed to work. The layers, of course, are purely conceptual. When you are using a computer, you are typically using all the layers to some degree, but if you want to understand a computer, or change how it behaves, it is convenient to think about one layer at a time, or to think about how the layers interact.

Now, the lowest layer, supposedly, is the 1s and 0s, but you're not really doing anyone any favors by explaining that, because what has a 1 or a 0 got to do with bigger numbers, or with something useful like, say, text? That's why in computer science (as opposed to dumbed-down computer science) we call the lowest layer the "digital logic level", which sounds scary, but it's something we can actually explain (we can't explain 1s and 0s). Digital logic is simple combinations of switches, memory, and other devices made out of transistors, resistors, and capacitors which, combined, can perform simple operations such as addition, subtraction, and even multiplication and division, as well as storage of data. An important fact about the digital logic layer is that it must exist in hardware. That is, you can't "reprogram" or change it later, short of destroying it with, say, a sledgehammer. It just so happens that all modern computers use binary (simply a base-2 number system, in the same way that our familiar number system is base 10) to encode numbers, text, and all other information as sequences of 1s and 0s, because it is fast and convenient to do so, and for no other reason. There have been computers built that are fundamentally modern in design (though quite old in terms of computational power) that used base 10, but you won't find any of these outside a museum.
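To make "simple combinations of switches" a bit less hand-wavy, here's a rough sketch in Python (an analogy I cooked up, not a circuit diagram): each gate is just a function of on/off values, a "full adder" is a handful of gates wired together, and chaining full adders adds whole numbers. The 1s and 0s only show up at the edges, as the encoding.

    def AND(a, b): return a and b
    def OR(a, b):  return a or b
    def XOR(a, b): return a != b

    def full_adder(a, b, carry_in):
        """One bit of addition, made of nothing but gates:
        returns (sum_bit, carry_out)."""
        partial = XOR(a, b)
        sum_bit = XOR(partial, carry_in)
        carry_out = OR(AND(a, b), AND(partial, carry_in))
        return sum_bit, carry_out

    def add_bits(x_bits, y_bits):
        """Chain full adders to add two equal-length lists of bits
        (least significant bit first), the way an adder circuit does."""
        result, carry = [], False
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)  # the final carry is one more output bit
        return result

    def to_bits(n, width):
        # Encode a number as on/off signals (this is where "binary" sneaks in).
        return [bool((n >> i) & 1) for i in range(width)]

    def from_bits(bits):
        # Decode the on/off signals back into an ordinary number.
        return sum(1 << i for i, bit in enumerate(bits) if bit)

    print(from_bits(add_bits(to_bits(3, 4), to_bits(6, 4))))  # 9

Notice that the binary encoding lives entirely in to_bits and from_bits; the interesting part, the part that "knows" how to add, is the wiring of the gates.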

So the answer to "How does a microchip know anything? Like how does it know that 3 * 3 is 9?" is simply this: it is possible to build hardware that knows the answers to simple arithmetical problems, but you must build your way up from very basic building blocks. Because this type of simple knowledge must be implemented at the digital logic level, it must exist in hardware. You can't program it in later. (Actually, I suspect early computers did not have multiplication in hardware, only addition, but that's really beside the point.)
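Just to show what "building your way up from very basic building blocks" can look like, here's one more hypothetical Python sketch, not how any particular chip actually does it: multiplication assembled out of nothing but an adder and some shifting, the classic shift-and-add idea.

    def add(a, b):
        # Stand-in for the adder the hardware already gives us.
        return a + b

    def multiply(a, b):
        """Multiply two nonnegative integers using nothing but the adder
        and shifting -- the classic shift-and-add scheme."""
        result = 0
        while b:
            if b & 1:                     # if the lowest bit of b is set...
                result = add(result, a)   # ...add in the current shifted copy of a
            a <<= 1                       # double a (shift left)
            b >>= 1                       # drop b's lowest bit (shift right)
        return result

    print(multiply(3, 3))  # 9

And that's roughly how a machine ends up "knowing" that 3 * 3 is 9, with nothing mystical going on.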