Sunday, June 19, 2005
But the movie is coming out Sept. 30. I'll most likely wind up waiting for it to come out on DVD (and will buy the DVD) as watching movies in the movie theaters just isn't fun anymore.
Tuesday, June 14, 2005
BetaNews | IBM Turns to Open Source Development (via Slashdot)
What was really interesting, however, wasn't the boxes but the OS: an entirely new platform that provided a very effective multimedia system. The backbone was a system composed of a large number of threads and clean APIs.
Be died when Apple didn't buy them (they bought NeXT instead) and Gateway and Dell couldn't ship BeOS for x86 on their systems due to their contracts with Microsoft (monopoly anyone?).
It looks like a German company has finally brought back the ideals of the BeOS:
yellowTAB - Makers of ZETA . Whoopie. Gateway and Dell still can't ship it.
Sunday, June 12, 2005
This is much better than my Yahoo! search ranking, which was number 3 for the term "ready to assemble bookcase". I used to get over 60% of my web traffic from that term at Yahoo!. It looks like they finally updated their index and I'm no longer listed on the first page. This is good, as those poor visitors were looking at a failed home theater project rather than shopping for inexpensive bookcases.
The caller launched into an opening statement about how great this year has been and how the party has come together, and how wonderful it was that Senator Tom Daschle was voted out of office. The caller asked me if I agreed.
When I replied that I didn't think the past year had been that great, the caller asked "why not?". I've never heard a telemarketer hang up so quickly as when I said "I'm not sure I'm a Republican anymore."
Apparently the Republican party doesn't have any interest in keeping its prior members. Instead it is only interested in those who haven't started to question the party's policy decisions. At least they didn't keep me from my movie about Bill Clinton's presidential campaign.
Thursday, June 9, 2005
With the expanded PATRIOT Act powers recently being approved by the Senate one can't help but wonder if Lucas timed Star Wars III's release a little too closely with our own loss of liberties.
Then there are those who lived through McCarthy-ism who say we're still better off today than we were in the 1950s, so it's still A-OK in the U.S. of A. That's fine and good, but we are now worse off than we were in the 1990s. Why are we going backwards, even if it is by a small amount?
Oh that's right - the Saudi oil fields may be drying up, so it's time to use America's troops to invade Iraq and secure future oil revenues for our friends in the House of Saud.
Wednesday, June 8, 2005
Here's my new photo catalog for Brown Bear Theater. I'll create the others soon. I may also post some wedding photos!
I think I only previously saw season 1, and I think I missed season 2 entirely. Fortunately they sell the DVDs. I might have to pick them up.
Unfortunately my wife doesn't get their humor and hates it when I watch Red Vs Blue when she is home. Her loss.
Monday, June 6, 2005
Buying right before the switch is a bad investment as your equipment will become obsolete at least 2x faster than at any other point in its history. Buying right after the switch is a bad idea as the equipment and OS tend to not be as stable due to their immaturity. sigh
MacNN has coverage of the keynote from Steve Jobs.
Too bad I purchased a Macintosh and helped to prop up Apple as they pulled the rug out from under me for a second time.
Where is that fully open PC platform with great laptop hardware? That plus FreeBSD or Linux looks like the only sure bet towards long-term support.
I just purchased a 15" PowerBook in late December. I was hoping it would last me the next 3-5 years.
Last time I purchased a Macintosh computer was in 1994. Apple had just introduced the PowerPC line, but I didn't have the funds for one of the new PowerMacs - so I purchased a Quadra. The Quadras were based on the 68040 processor, and were the last computers shipped by Apple using that CPU. Apple had just started the transition to a new processor platform and I got left behind from day one.
In 1997 when it was time for me to upgrade I went with an SGI O2, as SGI had been on the MIPS architecture for years and showed no signs of leaving it.
A year or so later SGI announced they were eliminating the MIPS processors from their lineup and moving to Intel processors. This didn't work out very well for SGI, and I was stuck with a MIPS based SGI. SGI is just barely alive. I'm surprised they haven't closed their doors yet thanks to that (and other) disasters.
The SGI/Intel event prompted me to move to x86/Linux for several years.
But now I'm back on the PowerPC based Macintosh, as I just got fed up with the GUI environment on Linux not working well, and most laptop hardware is crap (when compared to Apple's current offerings).
So I guess it's just about time for Apple to switch architectures on me. Again. At least I purchased Tiger.
Saturday, June 4, 2005
LaTeX source document residing in my Subversion repository which is formatted into HTML by latex2html. I have written a small wrapper Perl script which cleans the output from latex2html into slightly more readable HTML, adds CSS classes to facilitate presentation in Movable Type, then posts the entry into Movable Type through the Perl API Net::MovableType.
What's also very cool is that graphics drawn in LaTeX code will be automatically converted by latex2html and uploaded into the archive directory. Therefore I can do equations or graphics in LaTeX and they will appear correctly on the website. For example, here is the fraction “1 over 2” as an equation:
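(The rendered image is produced by latex2html; the LaTeX source behind it is just a displayed fraction:)

```latex
\begin{displaymath}
  \frac{1}{2}
\end{displaymath}
```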
Nifty, isn't it?
What's great is that I no longer need to worry about HTML formatting, or really any other presentation issue, while I am writing an article. I can just focus on the content. This is perhaps the greatest strength of LaTeX and why I can't seem to give it up as a means of presenting my thoughts.
I won't really use the posttex script for small entries such as quick links, etc. as it is too much work to write up a LaTeX source file and check it into SVN. But I do plan on writing some longer entries at least once a week from now on, and these will all be written in LaTeX. If space on my webserver doesn't prove to be an issue I may also offer PDF versions of the longer articles.
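For the curious, the cleanup step can be sketched in a few lines. This is Python rather than my actual Perl, and the function name and the specific patterns are illustrative only, not the real rules the wrapper uses:

```python
import re

def clean_latex2html(html: str) -> str:
    """Tidy latex2html output for posting to Movable Type.
    The patterns below are illustrative; the real wrapper is a Perl script."""
    # Drop the navigation bar latex2html emits at the top of each page.
    html = re.sub(r'<!--Navigation Panel-->.*?<!--End of Navigation Panel-->',
                  '', html, flags=re.DOTALL)
    # Drop the HTML comments the converter leaves behind.
    html = re.sub(r'<!--.*?-->', '', html, flags=re.DOTALL)
    # Add a CSS class to paragraphs so the weblog stylesheet can hook in.
    html = html.replace('<P>', '<p class="article">')
    return html
```

The cleaned fragment is then what gets handed to Net::MovableType for posting.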
Today though Sara and I were out shopping for cars. It was 89 degrees out and we were getting thirsty. While doing a U-turn in a shopping center plaza (can't make a left turn across a very busy 4 lane road in that part of town) she suggested we stop at McDonalds so she could get a milkshake. The milkshake idea sounded good, right? It promised to be cold. I was hot. The sign at the drive-thru said "2 smalls for $3.00". How wrong could you go?
Couple of hours later my stomach is killing me and I suddenly remember why I don't get milkshakes there anymore. I can be such a moron sometimes. I should have just gotten a small soft-drink instead.
Friday, June 3, 2005
I plan on hooking the script up to Subversion and having it automatically publish tex article files when they get created or get updated. This should make it very easy for me to write up an article, commit it to SVN and let it move its way up into my webspace without me really having to think about it.
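The hook's job is mostly filtering the commit for article sources. A rough sketch in Python (`changed_tex_files` is a name of my own invention, and it assumes `svnlook changed`-style output of "status path" lines):

```python
def changed_tex_files(svnlook_output: str):
    """Parse `svnlook changed` output (lines like 'U   articles/foo.tex')
    and return the .tex paths that need republishing. A sketch of the
    filtering a post-commit hook would do, not a finished hook."""
    paths = []
    for line in svnlook_output.splitlines():
        parts = line.split(None, 1)  # status flag, then the path
        if len(parts) == 2 and parts[1].endswith('.tex'):
            paths.append(parts[1])
    return paths
```

Each returned path would then be fed through latex2html and the posting wrapper.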
What's the point in paying the Compaq price premiums and support contracts if the business is still doing the support itself? My best guess is the business doesn't want to hire the quality of staff it needs to maintain its own computers, so it hires slightly lower quality staff who can replace a Compaq power supply via the pretty pictures in the Compaq instruction manual. We're just paying Compaq to give us pretty manuals instead of paying for some staff training.
At every other company that I've worked at in the past we've supported our own computer equipment. We generally had less downtime and were able to afford faster disk drives, CPUs, more memory, etc. for the same cash amount. As my earlier note today points out, spending a few more dollars on your disks can really decrease how much time your developers waste on a daily basis. (Yes, I'm waiting for a build to finish right now; it's still going. sigh).
I guess losing almost 1/4 of my day (every day) to waiting for a cheap 5400 RPM IDE drive is worth the $500 the company saved by not purchasing me a faster drive for my system.
When will the bean counters at large companies realize that developers have high disk IO demands (when compared to other business-type users, like most Word/Excel users) and actually purchase systems which can keep up with our work?
Wednesday, June 1, 2005
As a developer I have found PVCS Version Manager to be useless and a huge time sink. It is a horrible tool. Its retail price is more expensive than Perforce for a development team of 20 people. It lacks the most basic features. It's garbage and should be pulled from the market.
Developers can't be given the security rights necessary to create new files; consequently I have to write new code then send a bug report for each file to our configuration management team, who creates the file for me in the version control system. Ditto for deletes. This creates a headache for me if I send a bug report to have the file created but then find and fix a bug in the file before it has been added to Version Manager. (Given that the average turn-around on new file creation is 48 hours, this is rather common.) I have no idea which files I need to now check out of Version Manager to check in bug fixes and which files are OK as-is.
Did I mention this is even worse than it would appear as Version Manager offers no way to compare your working copy of the source code to a version in the repository? Sure it offers a single file difference between your working file and the repository version of the file, but what it can't do is give me the differences (if any) for all 8000+ files which make up our small product. Most of them don't typically have differences, but how do I know which ones have them if Version Manager can't tell me?
Did I mention that when another developer requests a file to be deleted from the repository and our configuration management team follows through with the bug report that was sent to them, Version Manager won't remove it from my working directory? Yeah... what's up with that? It will happily leave all deleted content in my working directory until the cows come home, pigs fly, or I delete all source from my local machine and refetch it all from the version control system. I think we've lost some 100 man hours over the past 8 months to this problem alone.
Writing up the bug reports for file creations and file deletions is also a huge time sink. There's no automated process to do this, so developers are writing down on pen and paper the names of the files they are creating as they create them; then a week or so later writing out bug reports for each one and submitting them to the configuration management team. We've estimated most developers are spending about 30 minutes to an hour each week doing this activity. Over the span of 8 months this is 240 man hours (0.75 hours * 10 developers * 32 weeks).
We also can't have multiple developers editing the same files at the same time. If I'm adding a new constant at the bottom of a file and Bob is adding a new constant near the top of the file (at least 50+ lines from where I'm adding my new constant) I have to either wait for Bob to finish or make the change on my local system, then days later figure out my file is missing Bob's change and redo Bob's change manually. Between schedule slippage of developers having to wait for a file to be released for them to modify it, or wasted man-hours redoing work which had previously been done (but couldn't be automatically merged by Version Manager as it lacks this ability) each developer has lost ~30 minutes each week for 8 months. That's 160 man hours.
I've seen developers spend an entire day because they can't get their local working copy to compile anymore. This typically happens because they have a partial update from Version Manager. A typical example: a developer has modified X.java (but hasn't checked it in yet); another developer has modified A.java, B.java, C.java, ... E.java, and checked all of those files in. The changes in A-E are dependent on each other, but the dependencies aren't immediately obvious. The developer working on X.java realizes they also need to modify E.java and check it out for modification... now they have part of the A-E change, but not all of it, and suddenly they can't compile or test. Because they can't find out how their local directory differs from the version control tool, they aren't willing to just delete the directory and start over (for fear of losing their change in X.java and having to start from scratch). But they also can't find out that the change in E.java depends on A.java through D.java, as Version Manager can't tell them these files were modified at the same time. This has cost the project at least 8 hours of a developer's time once every couple of weeks over 8 months: 128 man hours.
By default Version Manager will set the modification date of source code files you are getting from the repository to be the modification date of the file when it was checked in by the last author. This is an [sarcastic]awesome feature[/sarcastic] when you are using Ant as a build tool, as Ant makes compilation decisions based on the *.java file being newer (has a later modification date) than the *.class file. If I compile A.java this morning, but Bob then checks in a new version of A.java which he wrote yesterday (and hasn't modified since) and I then get the new version from Version Manager, Ant won't recompile it as A.java is still older than A.class (the source has yesterday's date on it, while the class has today's date).
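The failure mode is easy to demonstrate with a toy timestamp check. This is a sketch of Ant's date comparison in Python, not Ant itself, and `needs_rebuild` is my own name for it:

```python
import os
import tempfile
import time

def needs_rebuild(src, cls):
    """Ant-style staleness check: recompile only when the source file's
    modification date is newer than the class file's."""
    if not os.path.exists(cls):
        return True
    return os.path.getmtime(src) > os.path.getmtime(cls)

# Reproduce the problem: a freshly fetched A.java carrying yesterday's
# check-in date looks *older* than this morning's A.class.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, 'A.java')
    cls = os.path.join(d, 'A.class')
    open(src, 'w').close()
    open(cls, 'w').close()
    yesterday = time.time() - 86400
    os.utime(src, (yesterday, yesterday))  # Version Manager kept the old date
    print(needs_rebuild(src, cls))         # prints False: Ant skips the file
```

A checkout tool that stamped the file with the fetch time instead would make the check return True here.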
After discussing this with our local configuration management team they told me the Ant build system was broken; build systems which drive off modification dates on files can't be trusted for incremental builds and another solution should be used. In classic form they couldn't offer up any alternatives. I agree that file modification dates are problematic, but I've rarely had trouble when working only on a local filesystem (NFS without network time servers is a whole other story). Every other version control system that I've used in the past (except ClearCase) always sets the modification date of the source files to the date/time the file was gotten from the version control system, ensuring that a date/time based build system will recompile it properly. (ClearCase gets away with not doing so by providing ClearMake, a build process which drives off the version information stored by ClearCase.)
So I was forced to add a task to Ant which compares file contents by MD5 checksum, then touches the files to move their modification date forward to the current date/time prior to running any other tasks (such as javac). This was 4 hours of development time wasted, and now our build process takes 5 minutes instead of 2 minutes. Given that I try to do about 10-15 builds per day, I'm losing half an hour each day just waiting for the build process to update the file modification dates. Other developers are in the same boat.
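The idea behind that task, sketched in Python (the real thing is an Ant task; `touch_changed` and the JSON state file are illustrative choices of mine, not how the task actually stores its checksums):

```python
import hashlib
import json
import os

def touch_changed(files, state_path):
    """Compare each file's MD5 against the checksum recorded on the last
    run and bump the mtime of files whose contents actually changed, so a
    date-driven build tool (like Ant's javac) recompiles them."""
    try:
        with open(state_path) as f:
            old = json.load(f)
    except FileNotFoundError:
        old = {}
    new = {}
    for path in files:
        with open(path, 'rb') as f:
            new[path] = hashlib.md5(f.read()).hexdigest()
        if old.get(path) != new[path]:
            os.utime(path, None)  # move the mtime forward to "now"
    with open(state_path, 'w') as f:
        json.dump(new, f)
    return new
```

Run before javac, it guarantees that content changes always look "newer" than their compiled classes.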
Because we can't do any reasonable differencing to determine which files we need to write bug reports for configuration management to modify for us, I've had to add special Ant tasks which scan for all files which aren't read-only in a directory and copy them off to a holding area. The holding area serves as a place for the developer to run 'diff' from on the command line or from within the Eclipse IDE to determine what they need to send to our configuration management team. We've spent at least 1.5 person-days on this little set of scripts; 10 developers actively working with them (and producing on average 8 new holding areas per day - 1 per hour) has cost us nearly 20 GB of disk space on our file server. It has also entirely bypassed the security system of Version Manager, as now anyone with access to our file server can read the source code. (A wider user base has access to the file server than to PVCS Version Manager.)
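The holding-area scan boils down to "copy everything that isn't read-only." A Python sketch of the logic (the real version is a set of Ant tasks; `copy_writable` is my name, and it mirrors the post's not-read-only test via the owner-write permission bit):

```python
import os
import shutil
import stat

def copy_writable(src_root, holding_area):
    """Copy every non-read-only file under src_root into the holding
    area, preserving relative paths, so a plain recursive diff (or
    Eclipse's compare) can find local edits later."""
    copied = []
    for dirpath, _dirs, names in os.walk(src_root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_mode & stat.S_IWUSR:  # writable = locally edited
                rel = os.path.relpath(path, src_root)
                dest = os.path.join(holding_area, rel)
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.copy2(path, dest)
                copied.append(rel)
    return copied
```

Each run produces one more snapshot directory, which is exactly why the holding areas eat disk space so quickly.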
With a (burdened) developer cost of ~$70/hour we're talking about $92,960 (1,328 man hours) of wasted developer resources. Unfortunately management doesn't see these hours as a real cost, but the project is 2 months behind right now. With 10 people working on the project we're behind by at least 1 month thanks to our choice of version control tools. That's just the last 8 months. We've had this tool for years.
Early on in this project we tried to branch our source code base. Version Manager can't handle branches. The vendor claims it can, but it really can't. Our local configuration management team was initially unable to setup a single branch of development properly. Developers were forced to spend a couple of weeks to get our configuration management team to correct the system; for the next couple of months we kept finding files which had been branched improperly. This likely cost us another 200 man hours between developers and configuration management: $14,000.
We could have purchased Perforce, ClearCase, BitKeeper, etc. for a fraction of the cost of the time we have thus far wasted, and we would have only been 1 month behind, not 2 (the other month's loss is more likely due to a few ill-defined requirements which were incorrectly implemented early in the project). We can't ship this project late; we're contractually obligated to deliver it on time. If we don't, our customer will face a major revenue loss at a time when it can least afford one.
I miss the days of CVS. CVS, for all its warts, just freaking works. It could have saved us nearly $100,000 in man-hours in the past 8 months, kept the project one month less behind schedule, and saved a whole lotta frustration.
To the PVCS VM developers: Have you even freaking looked at CVS? Perforce? ClearCase? Aegis? I realize some of these products are commercial and may be difficult for you to do competitive analysis on, but CVS is the de facto standard in open source development and it's under the GPL. So long as you aren't stealing source code from it, it is perfectly acceptable to study. Now that Subversion, darcs, and GNU arch are all stable you might want to also look at those. Oh, and don't forget about git, which the Linux kernel team wrote in 1 month because of the whole BitKeeper fiasco. And let's not forget about BitKeeper, which is currently (in my opinion) the best version control tool available on the market.
Joel's right on target - you can learn a lot about someone just by looking at what they have read. The key here is what they have read of course, not what is on their bookshelf. Heck, I have a stack at least 3 feet tall of books I have received from publishers that I haven't even had time to read yet. I suspect only half of them will be any good anyway.
I stumbled on Joel's book review list and he's got some good things to say about some of these books, many of which I haven't read yet. In particular, Peopleware: Productive Projects and Teams. I just might have to obtain a copy, read it, and weep, because I'm sure my workplace ignores every good part of this book.
Fortunately I'm a better programmer than I am graphic designer.
But we just had to go see Star Wars Episode III when it came out. At midnight on opening day no less. $20 later I have two tickets in hand; another $10 and we have a gallon of soda and a feed bag of stale popcorn to share between the two of us. We could have fed a family of four off just the soda and popcorn (just not a very healthy meal).
The sound was too loud; the seats weren't comfortable (not anything like our recliners at home!); they keep the lights up too bright (when the screen is black the room should be black dangit!). *sigh* And let's not even talk about the Christie digital projectors which are being installed around here to show "The Twenty", a 20 minute TV infomercial we were subjected to watching because we arrived a little bit earlier than the posted start time. The Christie DLPs just don't have enough resolution to avoid a visible screen-door effect from any seat in the theater except the last two rows, which are some of the worst seats in the room. There's nothing more entertaining than counting the number of pixels which make up the "T" in "The Twenty", or the "N" in "NBC", or better, a popular actress' nose. My wife hates going with me now as I wouldn't shut up about it. At least they showed the movie on film and not the DLP. I would have left and demanded a refund if it was on the DLP.
With DVDs at $15 each during the first week or two after release and a Netflix membership at $15/month, there's no reason for me to go to the theaters anymore. It's just not worth it. Even if I really am interested in seeing the movie, it will look better on DVD at home and I will certainly be able to enjoy it more.
I've wasted many hours sitting in here watching movies. Unfortunately my DVD collection isn't large enough to really make good use of it; no matter what you thought about Star Wars II: Attack of the Clones the movie does get boring after a while! That's why Netflix rocks. We just recently rejoined after over a year without a membership.
On an 8 foot wide screen DVDs look great, but cable TV looks like garbage. So I'm going to cancel my cable subscription and just use Netflix. There's very little we watch on TV anyway that won't be on DVD within 12 months, so I might as well just wait and see it on DVD on my 8' wide screen.
I was planning on quickly hacking together a blog engine based on latex2html and some XSLT to construct article entries from LaTeX source. I'm very much against writing HTML by hand and I'm such a vi nut that I really hate using any text editor other than vi... but MovableType nicely offers up a free version, so I thought I should at least give it a try. At $69 or whatever for the personal edition it may be a good buy if I really start to use it.
Of course now I realize that with my content stored in a database on the web server I can't automatically include it in my nightly backup routine of my fileserver at home. *sigh* I guess this means I now have to rig up a crontab on my home fileserver to SSH into the web server, dump the database, then drag the file back prior to my normal filesystem backups starting up. Being your own sysadmin is such a time-consuming task... at least I don't also maintain the webserver; my long-time friend Andrew at Plexpod does that for me! :-)
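That cron job could be sketched roughly as follows. Python for clarity, and note the assumptions: the function names, host, and database names are mine, and I'm assuming the weblog data lives in MySQL with mysqldump on the server's PATH, which may not match the actual hosting setup:

```python
import subprocess

def dump_command(host, db):
    """Build the ssh command line that runs mysqldump on the web server
    and streams the dump back over the connection to stdout.
    (Host, database name, and the MySQL assumption are illustrative.)"""
    return ['ssh', host, 'mysqldump', '--single-transaction', db]

def backup_weblog(host, db, dump_path):
    """Run the dump and land it in a file the nightly filesystem
    backup already covers."""
    with open(dump_path, 'wb') as out:
        subprocess.run(dump_command(host, db), stdout=out, check=True)
```

Scheduled from cron a little before the normal backup window, the dump file just rides along with the rest of the filesystem backup.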