Wednesday, March 30, 2005

Johnnie Cochran

ABC News: Defense Attorney Johnnie Cochran Jr. Dies

Scott, Hiren: My condolences to the legal community.

Monday, March 28, 2005

eXtreme Napping

The title is just to tick myself off. I hate it when I'm marketed at by lazy ad execs who think that putting the word extreme in front of a product will make me want to buy it.

Mrs. O and I got home from church yesterday, ate lunch, and took a 3 hour nap. It was ridiculous. Woke up at 6:30, knowing full well our sleep schedule for the night was well hosed.

I must have slept last night, but it sure didn't feel like it. I was either dreaming or hallucinating. Now I'm going to have to rely on coffee to make it through the day.

Sunday, March 27, 2005

In which I miss yet another trend

It seems the latest thing is to make your desktop background look transparent. Who knew?

Friday, March 25, 2005

On Autoeponyms II

Danny came up with obfuscation. It helps if you don't already know what it means.

How about:
  • terse
  • word
  • prodigious
  • sibilant
  • hiss
  • legible
  • iamb (depends on how you pronounce it)

On Good Friday

I've been getting a lot of reading done in my daily 2 hours in the car. Simply Audio Books provides a great Netflix-like service for books-on-CD rentals that I've been using for almost a year now. The latest book I've been going through is called What If 2 edited by Robert Cowley. It's a collection of essays by scholars regarding various "close calls" in history, extrapolating on what might have happened to the course of human events if various things had failed to occur. Examples:
  • Harold Godwinson defeats William at the Battle of Hastings: Scandinavian culture flourishes throughout Europe, Latin culture recedes, and I'd be blogging in Danish or its linguistic heir.
  • China beats Columbus to the New World: Apparently the Chinese had a pretty good exploratory fleet going on during the Ming Dynasty, and they had this admiral named Zheng He who was very good at acquiring tributes to the emperor from other nations in exchange for silks and other Chinese goods. His wanderings were brought to a halt after a change in administration, so he never got a chance to head out into the Pacific, but the essay theorizes that if Zheng He had made it to America (probably via Alaska) he would have gotten the same sort of tribute out of the Iroquois Confederacy. Don't quite see how, since they mostly settled in eastern North America...
  • Neville Chamberlain recommends Lord Halifax as his replacement instead of Churchill: Great Britain makes peace with Germany, so there is no British/American partnership after Pearl Harbor. Stalin eventually defeats Hitler and conquers Europe. America, ignoring Europe, focuses its full attention on the Pacific theater. However, there being no help from Britain, the Manhattan Project fails to materialize and America has no option but to get into a war of attrition with Japan. The Soviet Union gets the bomb. I blog in Russian.
Another counterfactual history example posed by Carlos Eire of Yale is: what if Pontius Pilate had ignored the wishes of the crowd at the end of Passover and had let Jesus go free instead of Barabbas? Eire points out that from Pilate's perspective, Barabbas would have been a far more obvious candidate for execution than Jesus. Barabbas was a member of a Jewish faction that sought to overthrow the Roman occupation of Judea, had staged an insurrection, and had murdered at least one person. Jesus, on the other hand, hadn't done anything except say some things that the locals didn't like. In fact, the reason the crowd that night was so hung up on having Him killed was that He wasn't trying to lead an insurrection against Rome. Pilate even had pressure from his own wife to let Jesus be, when she said, "Have nothing to do with that just man, for I have suffered many things today in a dream because of Him" (Matthew 27:19). Pilate himself saw no reason to kill Jesus, saying to the people, "You have brought this man to me as one who misleads the people. And indeed, having examined Him in your presence, I have found no fault in this man concerning those things of which you accuse him. Neither did Herod, for I sent you back to him, and indeed nothing deserving of death has been done by Him" (Luke 23:14-15).

However, as a Christian, I reject the supposition that the final decision by Pilate was a close call. Without the crucifixion and subsequent resurrection, there can be no salvation by faith, and humanity would be doomed. Why? Others have said it much more eloquently than I, but bottom line, being in heaven with God requires perfection on our part. None of us are perfect. We've sinned. It requires sacrifice to take away that sin. A lot of sacrifice. See the Old Testament for details. Jesus' crucifixion was that sacrifice for our sins. Since He was the only person in the history of humanity who had never sinned, only His sacrifice had sufficient weight to pay for the sins of us all. See the New Testament for details. To take advantage of it, all we have to do is recognize that it happened, that we are in need of it, and that Jesus is our Lord. "And now what are you waiting for? Get up, be baptized, and wash your sins away, calling on His name" (Acts 22:16).

In spite of the huge case against Barabbas' release, Pilate acquiesced to the will of the people anyway and had my Lord tortured and killed. Clearly God had a plan here, and I'm eternally thankful that He did.

For the record, Eire goes on to theorize that if Jesus hadn't been killed that day, He would have continued doing what He had been doing for the past 3 years, eventually being forced to watch the destruction of the Jewish temple, and dying of old age with His disciple and best friend John at His side. Christianity as we know it today would be even more splintered than it tragically is, existing as various hybridizations of Judaism mixed with Jesus' teachings. Without that seminal cornerstone event, however, it would be very empty indeed.

Saturday, March 19, 2005

Of Dates and Time III

Sorry, I couldn't leave this alone. This is the last one. Probably.

As I thought about it more, I came to 3 small things we could do that are feasible and practical to help make life easier and make talking about dates and time more precise. Here they are:
  • Switch to the 24-hour clock. AM and PM are stupid, no two ways about it. I didn't even bother to research why and how they came into existence except that it had something to do with the Egyptians. A sample of the idiocy: This is the only system in which 1 falls after 12. 12 noon is really 12 PM, followed by 1 PM. Most legal documents that have time clauses (from what I understand; back me up, lawyer types) deal with 12:01 AM since it's still unclear to most people whether 12:00 AM belongs to the previous or the following day. This system is so full of oddities and ambiguities that it should be sacked immediately.
  • Stop Daylight Saving Time. I've often said that the day you lose an hour of sleep for no good reason is the worst day of the year. Consider the practical implications of Daylight Saving Time. Say you record a transaction of some sort in a database at 0130 on the day we fall back. When did it really happen? The first 0130 or the second one, after 0200 magically became 0100 again? The only compelling argument I've heard in support of Daylight Saving Time is that we don't want kids standing at school bus stops in the dark. This is not an insurmountable problem. Either light the bus stops, or move the school schedules ahead so that it will always be daylight an hour before school starts.
  • Move to Greenwich Mean Time. This move, of course, depends on the implementation of the previous two suggestions. I live in the Eastern time zone, and my parents live in Central time. I hate having to ask, "Is that 6:00 your time or our time?" My in-laws live in Indiana, which makes time arrangements even wackier since they already don't honor Daylight Saving Time. If we all moved to GMT (take Eastern standard time and add 5 hours) then we would never need to ask again. Plane schedules would be simpler, everything would be simpler. Let's get on this.
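To make the fall-back ambiguity concrete, here's a sketch in modern Python (the zoneinfo module and the fold flag didn't exist in 2005; the zone name America/New_York is my stand-in for Eastern time):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # Python 3.9+, needs a tz database on the system

eastern = ZoneInfo("America/New_York")

# In 2005, US clocks fell back on October 30 at 0200 -> 0100, so every
# wall-clock reading from 0100 to 0159 happened twice that morning.
first_0130 = datetime(2005, 10, 30, 1, 30, tzinfo=eastern, fold=0)   # still on daylight time
second_0130 = datetime(2005, 10, 30, 1, 30, tzinfo=eastern, fold=1)  # back on standard time

# Same wall-clock reading, two different moments an hour apart.
gap = second_0130.astimezone(timezone.utc) - first_0130.astimezone(timezone.utc)
print(gap)  # 1:00:00
```

A timestamp recorded as plain local time that morning can't distinguish the two; a GMT timestamp can.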

On autoeponyms

I love fiddling around with words. One set of words that has recently been brought to my attention is the class of self-applicable words, or words that are themselves what they mean. Dan calls these autoeponyms, or cognitive onomatopœia. Examples:
These are hard to think of. Anybody got any more?

And here's a good one. Let's coin a new word:

antiautoeponym (n) \AN-tE-o-t&-ep-uh-nim\ A word that does not describe or apply to itself.

Is this word an autoeponym? If it is, then by its own definition, it is not. Similarly, if it is not, then by definition it is. This is yet another incarnation of the Liar's Paradox attributed to Epimenides; logicians know the word-based version, with "heterological" playing the part of antiautoeponym, as the Grelling-Nelson paradox.

That reminds me of a joke: What happens if you strap a piece of buttered bread to the back of a cat and toss the cat off the balcony? It lands on Bertrand Russell.

Friday, March 18, 2005

madbean.com: 1111111111


In other time-related news, today marked a significant event in the history of time_t. The number hit 1111111111 today at 1:58:31 GMT. Please join me in celebrating this joyous event.
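If you want to check the arithmetic, a few lines of Python (not something I was running in 2005, but the math is the same) will do it:

```python
from datetime import datetime, timezone

# time_t counts seconds since the epoch: midnight UTC on January 1, 1970.
rollover = datetime.fromtimestamp(1111111111, tz=timezone.utc)
print(rollover.strftime("%Y-%m-%d %H:%M:%S"))  # 2005-03-18 01:58:31
```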

Tuesday, March 15, 2005

The Y2K+38 Problem

Remember Y2K? That was chump change compared to what's coming on January 19, 2038. For a long time up until the end of the twentieth century (even now on certain Unix-based systems) the time and date were stored and manipulated in C and C++ code using a 32-bit signed integer called time_t. This integer stores the number of seconds that have passed since what is called the epoch, which marked midnight on January 1, 1970. 32-bit signed integers can hold numbers up to 2^31 - 1 (the remaining bit is, more or less, the sign bit), or about 2 billion and change.

The problem, which will occur at 1/19/2038 03:14:07, is that we'll run out of seconds, and time_t will try to roll over. And it won't roll over to 1970; since time_t is signed, it'll think the time is suddenly 2^31 seconds before 1970, which works out to the evening of December 13, 1901.
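The rollover is easy to simulate; a sketch in Python, faking the 32-bit signed wraparound rather than using actual C:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647: the largest value a signed 32-bit time_t can hold

last_tick = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
# One second later the counter wraps around to -2^31,
# i.e. 2^31 seconds *before* the epoch.
wrapped_tick = datetime.fromtimestamp(INT32_MAX + 1 - 2**32, tz=timezone.utc)

print(last_tick)     # 2038-01-19 03:14:07+00:00
print(wrapped_tick)  # 1901-12-13 20:45:52+00:00
```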

It's not as simple as Y2K was, either. The problem isn't just that machines will think it's near the tragic end of the McKinley administration. Most programs aren't set up to handle the case of time being negative, and horrible bugs will arise that no programmer would have thought of preparing for. After all, a programmer careful enough to handle time suddenly going negative probably wouldn't have used time_t in the first place.

Most modern systems now use 64-bit numbers to store the time (in milliseconds), with the 32-bit numbers being held over from the 90s, when 32-bit computing was the best we had. 2^63 milliseconds after 1970 lands nearly 300 million years from now, so I'm less concerned about that. But if the last 50 years of computing have taught us anything, it's that a piece of software can be safely declared dead only after it's been taken off every system it was ever installed on, the installation media and all source code have been destroyed, and all of the programmers involved in writing it have been taken out into the street and shot. Credit to Ted Neward for that bit of insight.

Of Dates and Time, cont.

I'm a bit embarrassed. I left my previous post with a cliffhanger. After several paragraphs of decrying the calendar system used by virtually all of the Western World, I end with the question, "What should we do about it?"

In the harsh light of morning, I find that there's almost nothing that can be done about it. Many proposals have been laid down for how we could reform the timekeeping system, ranging from the quasi-practical to the inane.

The main problem distills to two things. The first is that the human body, for whatever reason, is used to an astronomical day for sleep purposes. This is not an insurmountable problem; look at the folks who live in the extreme Arctic and Antarctic with their 6-month periods of darkness and daylight. We are used to (on average) 8 hours of sleep and 16 hours of wakefulness. Again, there are also people who need more sleep and people who need less sleep. It's more likely that we'll abolish daylight saving time (something else I'm in favor of) than that we'll institute calendar reform.

The more significant problem and the real deal breaker is one of money. Our calendaring is so ingrained in the infrastructure of our lives that extricating it would be unbelievably cost-prohibitive. Think of the cost that was involved in merely solving the Y2K problem a few years ago, raise that cost a few orders of magnitude, and you start to understand what I mean. Every clock will suddenly be obsolete. Every database record that stores a timestamp, every calendar, and almost every piece of software will have to be either patched or rewritten.

The computers themselves will largely be safe, although the operating systems will need to be patched. There's an excellent article called A Brief History of DateTime by Verity Stob detailing the various methods computers have used to keep track of the time over the years. These days, the most common method is to keep a 64-bit number that counts the milliseconds that have passed since January 1, 1970.

Aside: There will be another Y2K-like emergency on January 19, 2038 at 3:14 AM. More about that later.

The thing that started me on this rant is the fact that, computationally, it is very expensive to turn that millisecond count back into a calendar date and time of day. Your computer can afford the ~10 milliseconds it takes every minute to write down the date and time in the lower-right corner of your taskbar, but in other situations, this is an operation that is too costly to perform.
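For perspective, the time-of-day half of the job is cheap modular arithmetic; it's the calendar half (months of varying lengths, leap-year rules, time zones) that piles up the work. A sketch in Python, assuming a UTC millisecond count (the function name is my own):

```python
MS_PER_DAY = 86_400_000

def time_of_day_utc(ms_since_epoch):
    """Extract (hours, minutes, seconds, millis) from an epoch millisecond count."""
    ms = ms_since_epoch % MS_PER_DAY      # throw away the whole days
    hours, ms = divmod(ms, 3_600_000)
    minutes, ms = divmod(ms, 60_000)
    seconds, millis = divmod(ms, 1000)
    return hours, minutes, seconds, millis

# 1111111111 seconds into the epoch, expressed in milliseconds:
print(time_of_day_utc(1_111_111_111_000))  # (1, 58, 31, 0)
```

Recovering the date part from the whole days is where the Julian/Gregorian mess of this post comes back to bite you.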

But what can we do about it? Not a thing.

Friday, March 11, 2005

Of Dates and Time

First, a little background:

We live on a planet we call Earth. Long ago we as a society came to recognize the convention of breaking up our lives into a series of time units based on astronomical events, at the time having no other reliable means of marking it. The shortest of these we now call the day, and this time unit was once congruent with the amount of time it takes for our planet to execute one complete rotation. The next we call the month, which was once the same amount of time it took the moon to complete one revolution around our planet. You know where I'm going with this: the longest one is the year, which used to be the amount of time it took for our planet to complete one revolution around the sun. For good measure, the ancient Israelites added another concept to the list: the week, which has nothing to do with astronomy.

In all of these cases except the week, I noted above that the actual amount of time related to them has changed. This came about as we became aware, to our dismay, that the astronomical month, day, and year are not, and never have been, on speaking terms. The moon spins merrily around us, blissfully and inconsiderately unaware that the approximately twenty-seven days, seven hours, forty-three minutes, and twelve seconds it takes to complete an orbit completely threw off our happy notion that we could divide the year up cleanly into twelve of these sections. Many attempts to correct this anomaly were devised. The Romans actually used to occasionally throw in an extra month after February called Mercedonius, which would only take place every four years or so.

This became cumbersome. Fine, we said, we can live with this if we have to. No longer will the moon be relied upon for timing purposes. Unfortunately, by this time (Julius Caesar had about a year to live) the month as a handy concept to keep in your back pocket was too prevalent to dispose of entirely, so along came the Julian calendar, which, among other things, defined that the months would have varying numbers of days.

But what about those days? Again to our chagrin, we made the discovery that the earth's rotation and revolution have little to do with each other. Surely, we said, the earth and the sun can get together and come to some sort of integral ratio of days to years, can't they? But it wasn't to be. Caesar and Co. determined that there were about three hundred sixty-five and a quarter days to every year, and so instituted, along with their wacky month system, a leap year, during which good old February (or the "leftover month") would have an extra day every four years so that we'd stay synched up. To get the world lined up to start this Julian calendar, 46 BC had 445 days in it to make up for centuries of accumulated measuring error. This year, quite appropriately, was called Annus Confusionis, the "Year of Confusion."

Aside: The Romans also had a crazy naming convention for the days within a month. The Kalends was the name for the first day of the month, every month. The Nones was the fifth or the seventh day of the month, depending on the month (March, May, July, and October had the seventh, and don't ask me why). Finally, the Ides marked either the thirteenth or the fifteenth day of the month, so that the number of days between the Nones and the Ides was always the same. So the Ides of March was March 15, while the Ides of April was April 13. They referred to the days in between by counting down to the next milestone. Ante diem VI Nonas Martias was March 2, or "six days prior to the Nones of March."

Clearly, this system was too nasty to keep up with for long. Eventually, as the technology to measure time improved, we discovered that a mere extra day every four years was not accurate enough, as the year was actually about 365.2425 days long. So in 1582, Pope Gregory XIII instituted the Gregorian calendar, which made further corrections to the increasingly ridiculous leap year concept. Now leap years happen every four years except for years that are divisible by 100 and not divisible by 400. An interesting side effect of this ruling is that by papal decree, the day after October 4, 1582, was October 15, 1582. This was to curb the inaccuracies caused by 1600 years of the Julian calendar.
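The full Gregorian rule fits in one line of code; a sketch in Python:

```python
def is_leap(year):
    """Gregorian leap year: every 4th year, except centuries not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1900 was skipped under the new rule; 2000 made the cut.
print([y for y in (1900, 1996, 2000, 2005) if is_leap(y)])  # [1996, 2000]
```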

This is the system we have today, although we now know that the Gregorian calendar too is wrong. The year is not 365.2425 days long: it is about 365.242375 days long. This means that we will accumulate an extra day every 8000 years or so. Also, given that the earth's rotation is gradually slowing down, the astronomical day is slowly drifting away from the length of time dictated by the International Bureau of Weights and Measures.

What should we do about this? Stay tuned.

Wednesday, March 09, 2005

In which I try to remember what to write about

In general, I don't mind having a terrible memory that much. It's mostly a blessing. I write down the important stuff that I need to remember. I tend to forget the bad things that happen to me, and that's good because painful memories should not be dwelt upon, and I tend to forget the good things that happen to me, and that's also good because when people remind me of them later on, I can be pleasantly surprised that they happened in the first place. Oh sure, my memory has gotten me in trouble a few times, and it irks my wife to no end, but I think I come out ahead most of the time.

One of the problems with it is that I think of all these great things to write about in the car and while trying to fall asleep, and then forget them as soon as I sit down at the keyboard.

Things to write about:
Mark Twain
metafying words
words that are what they mean

In which online ads become dead to you

It seems that Scott is actually reading this stuff, so hi!

Firefox by itself is hands down the best browser experience I've ever had, but installing the following 3 plugins took it from mere browser excellence to browser nirvana:
  • Googlebar: Google's browser toolbar (last I checked) only supports IE, but this plugin gives you the same experience, down to the search term buttons that you can click to find those terms in the page.
  • Mouse Gestures: Opera apparently has these built in, but this plugin works great. Install it and you can navigate around your history just by holding down the right mouse button and dragging left and right. I thought this was a stupid idea when I first heard about it, but now I can't live without it. I frequently find myself trying to use gestures in other history-enabled applications like Explorer and Eclipse and get frustrated when I realize they aren't available.
  • Adblock: The best for last. You can instruct Firefox to not download or display any files that match a certain URL pattern. For instance, if you restrict *doubleclick.net* then you will never again have to look at another Doubleclick ad. It just doesn't display it, and it even fills in the page context around it so there aren't any gaping holes. It's beautiful. Haven't had to look at an online ad in a year (except for text-based ads, like Google's, which aren't nearly as irritating).
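Patterns like *doubleclick.net* are ordinary shell-style globs; here's a sketch of the matching in Python (fnmatch is my stand-in to show the idea, not what Adblock actually uses internally):

```python
from fnmatch import fnmatch

# A hypothetical blocklist of glob patterns.
blocklist = ["*doubleclick.net*"]

def is_blocked(url):
    """Return True if the URL matches any pattern on the blocklist."""
    return any(fnmatch(url, pattern) for pattern in blocklist)

print(is_blocked("http://ad.doubleclick.net/banner.gif"))     # True
print(is_blocked("http://www.google.com/search?q=firefox"))   # False
```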

Friday, March 04, 2005

In which we eat red herrings

Java memory leaks can be much more insidious than anyone has so far led me to believe. The conventional wisdom goes like this:

"Well, the garbage collector is going to take care of it for you. Nothing to worry about, just make sure you aren't holding on to references unless you need them. Watch your static collections and singletons, and you can't go wrong."

We ran into a nice situation where we had a process that ran continuously and upon request, would open a Swing dialog to do some work. You fiddle around in the dialog, close it, and it goes back to sleep until it's needed again.

Using JProbe, we dug into the running process and saw that there were no differences in object counts between iterations.

"Great," we thought, "we're all set." Closed the bug. Then during verification time, we saw that in Task Manager the memory consumption was increasing with every iteration.

"This sucks in a mighty, mighty way," we said. Reopened the bug. Using Process Explorer from Sysinternals (great tool, download immediately), we saw that the handle count for the process kept going up. So although the object counts were constant, more and more native memory kept being allocated. Finally, we tracked it down to a place where we were creating JDialog objects and never calling dispose() on them.

The moral of the story, children, is that memory management is always important, whether there's a garbage collector involved or not. Heed it well.
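The same lesson holds outside Java. A Python analogue (my own sketch, not the code from the story): the garbage collector will reclaim the wrapper objects eventually, but the underlying OS handle is released only when you close it explicitly, which is the moral equivalent of calling dispose().

```python
import os
import tempfile

# Create a scratch file to open handles against.
fd, path = tempfile.mkstemp()
os.close(fd)

# Each open() acquires a real OS file handle, not just a Python object.
handles = [open(path) for _ in range(5)]

# Dropping the references would leave the handles to the whims of the
# garbage collector; releasing them explicitly is the dispose() analogue.
for f in handles:
    f.close()

print(all(f.closed for f in handles))  # True
os.unlink(path)
```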

Thursday, March 03, 2005

In which the honeymoon ends for Mozilla

Hackers never cease to amaze me. I suppose if you throw several thousand minds against any wall, sooner or later one of them will finally stick, but it appears that Mozilla Firefox has released its first security update since it went 1.0. Quite a clever trick, too. It seems that Firefox supports URLs expressed in the full range of Unicode characters. As a result, phishers can misdirect users by exploiting the fact that though the Cyrillic letter a (shouldn't that be alpha?) looks identical to the Latin version, it's a different Unicode character.