Thursday, December 28, 2006

Microsoft bribes bloggers

Microsoft bribes bloggers with free laptops valued at $2,299.

I don't know if I could be bribed that way to write pro-Microsoft posts. I doubt it, but if Microsoft wants to try, hey, go for it :-)

Tuesday, December 26, 2006

Postgresql - shockingly fast and scalable

I'm a postgresql nut. I have a bias against mysql because I think it's a great database for simple problems, and I don't have any database problems that simple.

Don't get me wrong, I recommend using the right tool for the right job, and no doubt there are some jobs where mysql is the right tool. I just don't get those kinds of projects.

I would work with Oracle or MS SQL Server or IBM DB2 if I had the opportunity. However, when I work with a client, I recommend postgresql first, and only after they've rejected that (because they really want to spend money) would I look at proprietary databases. That hasn't happened yet; everyone faints with sticker shock. I know some people who use Microsoft solutions whenever they can, but that's because it's what they know, and I don't think the cost of the software factors in, since they just pirate it.

In any case, there's a great essay on how one guy used postgres and got incredible performance. As he says:


I’ve pushed postgres on performance. And rather than finding it slow, I’ve found it shockingly fast and scalable. The traditional reward for a job well done is usually another, harder, job. But I’m not worried. I’m using Postgres. The worst that’ll happen is that I’ll get an excuse to actually get new hardware.


He used partitioning. I need to do that sometime. Maybe now, with version 8.2 out. Or possibly when 8.3 comes out. I'm concerned about some optimizations for queries that hit multiple tables (e.g., 160x30 tables). When people on the pgsql-general mailing list start talking about how great the optimizations on partitioned tables are, it'll be time to switch. For now, fat tables with appropriate indices and autovacuum/autoanalyze are good, and when queries still take too long (I have 300+GB of data at $DAYJOB) then I just build materialized views which are populated by triggers. Everything is good so far and I don't see any need to optimize further for another few years. Maybe when the data hits 1 terabyte (in 2-3 years) or when some individual tables hit 100GB I'll need to look into partitioning.
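
For what it's worth, the materialized-view-by-trigger trick looks roughly like this. This is a sketch with made-up table and column names (a fat base table "sales" with columns sold_on and amount), handling inserts only; updates and deletes would need their own trigger logic:

```sql
-- The summary table plays the role of the materialized view; a trigger
-- on the fat base table keeps it current on every insert.
CREATE TABLE daily_totals (
    day   date PRIMARY KEY,
    total numeric NOT NULL DEFAULT 0
);

CREATE OR REPLACE FUNCTION maintain_daily_totals() RETURNS trigger AS $$
BEGIN
    UPDATE daily_totals SET total = total + NEW.amount
     WHERE day = NEW.sold_on;
    IF NOT FOUND THEN
        INSERT INTO daily_totals (day, total) VALUES (NEW.sold_on, NEW.amount);
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER daily_totals_trg AFTER INSERT ON sales
    FOR EACH ROW EXECUTE PROCEDURE maintain_daily_totals();
```

Reports then query tiny daily_totals instead of scanning the whole fat table, which is the whole point.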

Sunday, December 24, 2006

The PRIME networking truths

From the twelve networking truths (RFC 1925), the prime-numbered ones:

(2) No matter how hard you push and no matter what the priority,
you can't increase the speed of light.

(3) With sufficient thrust, pigs fly just fine. However, this is
not necessarily a good idea. It is hard to be sure where they
are going to land, and it could be dangerous sitting under them
as they fly overhead.

(5) It is always possible to agglutinate multiple separate problems
into a single complex interdependent solution. In most cases
this is a bad idea.

(7) It is always something.

(11) Every old idea will be proposed again with a different name and
a different presentation, regardless of whether it works.

Saturday, December 23, 2006

why does schneier blog squid?

I've always wondered why Bruce Schneier has regular squid entries on his blog. A quick google search (i.e., a lazy google search) doesn't provide anything canonical. There's a comments page with some riffs on why schneier blogs squid though.

I thought it was just a riff on squid ink and security by obscurity. But on second thought, no, that doesn't work well either.

Thursday, December 21, 2006

php problems

There's a list of problems with the PHP language.

I agree with a lot of it. But I still use PHP more than any other language just now. That may change again, but for now, there isn't really any language I'd rather work with when doing web development or command line programs.

At some point I'll find the next big thing, but for now, PHP, warts and all, is what I prefer to work in. I'll need to review those problems and comment on one or another of them, sometime.

Woohoo!

I was informed that my day job is giving out a bonus equivalent to a month's pay, due tomorrow (21 Dec). That's cool. I'm very happy with work, and the team I work with, and I don't have problems at all with the organization, even if, as it grows, it ossifies in some ways.

But the bonus is good. We'll save most of it, but we'll keep around 1/3 for a vacation trip to Camiguin in May, for 2 dives a day, every day, for a week :-). I can't wait.

Hmmm, by then my brother Tim and his family might be back in Cagayan de Oro already. We'll bring them all along! It'll be the first cousins week for Timmy (who shares a name with his uncle :-).

I don't think we'll do that

There's an article on toilet training 3-month-old babies, and using no diapers while doing it. Well, no diapers for number 2, anyway.

I don't think we have the courage or the adventure in us to try that. It's interesting though. Maybe I'll try to convince sol to try. Or at least, to read the article. She's so tired from taking care of the baby that she doesn't have time to check her mail, even with me helping as much as I can.

I'm told it gets better. I sure hope so, and the sooner the better.

DIY geekery

Knives! Magnets from old hard drives!!! Alright, I don't know if the knives are geeky, but the magnets. from. old. hard. drives. sure are :-).

The article woke me up because I also happen to have a thing for knives (and glue, and string, and tape, and fire, with which one can fix anything :-).

Tuesday, December 19, 2006

Fun, Interesting and Long

Richard Feynman and the Thinking Machines Connection Machine.

Great article. Long, interesting, fun.


In the meantime, we were having a lot of trouble explaining to people what we were doing with cellular automata. Eyes tended to glaze over when we started talking about state transition diagrams and finite state machines. Finally Feynman told us to explain it like this,

"We have noticed in nature that the behavior of a fluid depends very little on the nature of the individual particles in that fluid. For example, the flow of sand is very similar to the flow of water or the flow of a pile of ball bearings. We have therefore taken advantage of this fact to invent a type of imaginary particle that is especially simple for us to simulate. This particle is a perfect ball bearing that can move at a single speed in one of six directions. The flow of these particles on a large enough scale is very similar to the flow of natural fluids."

This was a typical Richard Feynman explanation. On the one hand, it infuriated the experts who had worked on the problem because it neglected to even mention all of the clever problems that they had solved. On the other hand, it delighted the listeners since they could walk away from it with a real understanding of the phenomenon and how it was connected to physical reality.

Saturday, December 16, 2006

Great talk on designing a good API

Josh Bloch's great presentation (video talk) on designing good APIs

It's a long talk and I'm only at the very beginning of it. Sounds good so far though.

Thursday, December 14, 2006

reddit.com not resolving

I'm an internet addict and I regularly go to Reddit for my daily fix. I can't go there just now though. DNS isn't resolving. I tried to go to a ping gateway, but the hostname wouldn't resolve there either. There's probably something wrong with their DNS server just now. I don't see any mentions of what might be wrong though. But I only wonder since I'm starting to miss my fix (heheh, not really, there's always Digg).

Old laptop, old distro

I had a problem with the right LCD hinge of my previous laptop, a Winbook J4 733E Pro. I sent the laptop back to be repaired and it's back and working perfectly. That laptop had Mandriva 2005 on it. In the meantime, I've been using a toshiba laptop with ubuntu dapper drake on it. I'm now trying to decide which laptop to stick with.

Ubuntu won't install on the Winbook. It fails around 80% of the way through the install progress bar. I'd forgotten that ubuntu wouldn't install here. So I've installed Mandriva 2006 (I don't have 2007 yet since I'd stepped away from Mandriva for the duration). It's now (slowly) downloading upgrades, along with new packages that I didn't install from the DVD but which have since been updated.

I like the winbook and can deal with its minuses: mainly that it looks lived in (it doesn't look brand new by any means), plus the weight, and the fact that it runs hotter than the toshiba, since that's a desktop P4 chip in there, not a mobile P4. The Toshiba certainly is prettier, but pretty is actually a minus with me. A laptop (or a motorcycle, or shoes, etc.) that doesn't look lived in doesn't feel as good or as comfortable.

On the other hand, I don't know how to get the wifi on the winbook working. This was never a problem before since I was always wired. But now that we've got wifi at home, if the wifi doesn't work, then I can't use the laptop (well, I could buy a PCMCIA card I guess, or a USB wifi device that linux supports). That's probably the biggest problem. I was getting used to the crypto /home filesystem on ubuntu and I'll need to figure out how to get that working in Mandriva.

While trying to get Ubuntu installed on the winbook (I swapped the toshiba and winbook drives due to drive size) I might have made a mistake with the ubuntu gui installer and trashed my encrypted /home partition. I'm keeping the partition around for a while (lots of space on this drive) and I'll try to recover the data later, if that's possible.

All in all, it's going to be an eventful distro-switching Christmas experience. If I get tired of all this, well, I'll probably just switch back to the Toshiba, although I need to figure out some things first. Among other things, the toshiba turns itself off when I run anything extremely CPU intensive (like a chess game) for a long time (say 5 minutes). This doesn't happen in Windows (I'm told), so the current conjecture is that the toshiba has special thermal control functionality that its windows drivers know about but Linux doesn't, so Linux can't make the fans spin faster, or do whatever else is needed to keep the laptop from turning itself off.

Damn, I might have to sell both of these laptops and get a lightweight, less powerful, non-desktop-replacement laptop for everyday use. I've got two other lightweight winbooks for just that purpose (everything works perfectly except for sound, but that's easy; I've gotten that working before, I just don't remember which module to load). But I like the power of the toshiba and the Winbook J4, so I'll try to get them both working first before giving up, restoring windows on them, and selling them at a bit of a loss.

Tuesday, December 12, 2006

Cancelled Interview

I had a candidate for a software development position scheduled for an interview today (Monday). She emailed early that she would not be able to make it because she had already received an offer from another IT company.

This was fine. It was a good break for her, and I was glad to have her email me early to inform me of the change in plans (shows good sense, consideration and a certain maturity).

I thought she made the wrong call though. Currently, in the Philippines, it's a geek's market. There is more demand than supply for good developers. I think that even for mediocre developers there is more demand than supply, although I don't want to have anything to do with that class of developer.

Therefore, I think that software developer candidates (particularly the really strong ones, but even those with no experience but who came from good schools) should not take the first offer they get. They should be willing to interview for a month or two (if things don't work out, there are always call center jobs available) and they should look at factors other than the pay. How much challenge will there be on the job? How many different languages/platforms/frameworks will be used on the job? Does the company have a commitment to helping make its developers as mature and strong as they can be?

For developers just out of school, how much will they learn in the first 3-5 years at the company? If they're going to be pigeonholed into some specialization (e.g., SAP, Siebel, PeopleSoft, or only one language or database), then that's the wrong company. The first 3-5 years out of school are a great opportunity to expand the developer's mind. Pigeonholing limits the developer and (except for those who actively search out new ways of doing things, a minority in everything) stunts his or her growth. Perhaps (likely, certainly, for most developers) forever.

Sunday, December 10, 2006

Exceeding Gmail's 2GB limit

Brajeshwar posts his observations on what happens when you exceed the 2GB gmail storage limit.

Mildly interesting but, frankly, a bit dumb. If you're approaching the limit, take countermeasures. In the comments someone suggests downloading to thunderbird or outlook. I strongly recommend that no one trust any web-based system with all their email. Gmail and yahoo (and particularly hotmail) have been known to lose email. Yahoo and Hotmail will expire an email address if the account is not checked for a certain period (a month or two), and apparently MySpace will take profile names whenever they find a commercial reason to do so ("Bones"). Gmail doesn't have that weakness, but there are recent comments that Gmail seems to be slowing down. The company isn't going to go away, and it's as good a web-based brand as any, but trusting gmail completely is a bug (and trusting hotmail or yahoo even just a little is a stupidity and not merely a bug).

So back up your email. And why just think about removing spam? Go ahead and do it. Unless you're collecting spam for whatever reason, in which case, forward your spam to another email address or (if that can't be done, I haven't tried it, frankly) forward your legitimate email to another gmail address. Or forward all emails above a certain size to another email address.
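
If anyone wants to do the download with fetchmail instead of thunderbird or outlook, a ~/.fetchmailrc might look something like this. Gmail's POP access has to be enabled in Settings first, the address and password are obviously placeholders, and I haven't verified this exact setup:

```
poll pop.gmail.com
    protocol pop3
    username "someone@gmail.com"
    password "not-my-real-password"
    ssl
    keep    # leave mail on the server while testing; remove this to really drain it
```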

Asking gmail for more space may help. I think it's just obnoxious whining though. Work around your problems first. If you've got 2GB of email, there is something wrong with how you're using gmail. And frankly, if gmail actually gives you more space so that you can abuse the service some more, there is something wrong with that decision too. But it's their network, they can do with it whatever they want :-).

Saturday, December 09, 2006

internet cafe computers

I was at SM Megamall today, at the Villman internet cafe which is located in the middle of the promenade among all the computer shops.

I wanted to post this (or a similar) article from there, but I didn't. No internet cafe computers can be trusted with my password (not even for throwaway accounts like my hotmail spamcatcher account), and certainly I wasn't going to type my blogger password there.

The computer was supposed to be locked down. After the internet cafe support guy enabled sol's terminal and mine, I saw that I could select from a few categories (I chose internet). And once a category was chosen it let me select internet explorer (strangely enough, windows explorer was an option, so I might have been able to run an internet game, or solitaire, or whatever).

I can't stand explorer though (although I haven't seen IE7, which is supposed to have tabbed browsing; that might make it usable instead of instantly execrated). And anyway, explorer was a bit weird because I couldn't right-click on links (if I can't open a tab, I want to at least be able to open a link in a new window). Sure, I could have done Alt-F (or something else to get to a new window), but that's too much work. So I surfed on over to mozilla.org and found a link to download and install firefox. I was surprised that it installed. Pleasantly surprised, since at least I'd be able to use a competent browser instead of the crap that comes with XP, but also worried, since clearly I could just as well have downloaded any trojan or virus or spyware I wanted and it would probably have installed.

Possibly that computer had antivirus, but I doubt that. Too much expense for a cafe computer. Or if it did, then it would have been pirated (actually, I assume everything on that box was pirated, but maybe it wasn't, if installed in an SM). Anyway, that is a seriously incompetent setup. Maybe they reinstall everything from a ghost (or similar) image every day, but anyone could still install keyloggers and such and at least harvest a bunch of people's passwords all day.

If Villman can't even get this right (and they're big enough to know how to do things right), who *does* get things right? Maybe Netopia?

Sunday, December 03, 2006

15-20 minutes

Bill Gates took 90 minutes to solve Petals Around the Rose; I took a bit less. I doubt if I'm smarter. For one thing, I don't have the memory to do the memorizing trick he did to partly solve it early on. I also had the advantage of not having to wait for the next roll; I just read through the article, looking at the rolls and the answers.

The name of the game also helps. And I think maybe I might have seen the story (or the game) before. Maybe in Scientific American or something similar, related to or discussing Gödel, Escher, Bach.

Actually, I didn't time myself. I might have taken less than 15 minutes (that's about my threshold for puzzles, if something takes longer to solve than that, then I need to be paid to pay attention). But it certainly wasn't less than 5 minutes.

Wednesday, November 29, 2006

firefox gone wild

I saw this entry in reddit.com, the tag was "Firefox Gone Wild".

Heh, gone wild, indeed. Ian Sison says he has a lot of firefox extensions. I sure don't have as many as he does. I'm sure he hasn't gone as wild as that guy though :-).

Monday, November 27, 2006

ubuntu toshiba mousepad fun

I'm currently using a toshiba satellite A75 S-231 while waiting for my winbook to have its LCD hinge repaired. The Winbook is on the way back, I should have it in two days or so (along with a bunch of other goodies, a vonage unit and a long range wireless phone to attach the vonage to).

When I get the winbook back I'm going to have to decide which laptop to sell. Both are good boxes, both are *heavy* boxes. These are not lightweight low power boxes, they're desktop replacements.

I've been having problems with the mousepad on this toshiba for a while. There would be occasions when moving my finger over it just would not move the mouse. On the other hand, right now, at 2AM, the mousepad is beautiful. Not as beautifully smooth as on my wife's smaller winbook, nor on my brother-in-law's incredibly sweet fujitsu lifebook, but smooth. For a while, I'd just not be able to get the mouse to move at all (or if I could, it would involve using the bird finger or the ring finger). I don't know why it's suddenly working so well.

My best guesses are:

  1. ubuntu upgrades - ubuntu dapper drake said that it had 200 updates to install (I reinstalled because I didn't have the patience to figure out what was wrong when X wouldn't run). It might be that some of those upgrades improved the Synaptics touchpad drivers (it wouldn't have been the kernel though; I was already running the most recent ubuntu 2.6.15-27-i686 kernel back when X stopped working).

  2. humidity - if the problem recurs tomorrow, then it probably has to do with environmental factors. I found that wiping my index finger would sometimes help (but not always), pressing down a bit harder would sometimes help (but less often than wiping). Using the bird finger or the ring finger almost always worked, but not that smoothly and not predictably (not clear which finger would work).


I hope it was one of those 200 updates though. This is a lovely box and it'll be cheap for the value when I sell it (3GHz CPU, 1GB RAM, 100GB disk, everything except the winmodem works in ubuntu [well, maybe the winmodem works in ubuntu, I doubt it, but I've not tested since I don't need the modem anymore]).

If the mousepad works perfectly for the next week, it's going to be a tough call, which laptop to sell.

Sunday, November 26, 2006

Misread

When I saw the title on Reddit and followed the link to TechEBlog's article on Controlling your XBox with a PSP I read it as controlling the xbox with ESP.

Crepe, ESP would have been cool!
:-)

Saturday, November 25, 2006

Linutop!

The Linutop looks incredible. It's a very small, low power, diskless, Linux desktop. Primarily meant to be a cheap web surfing platform, this could hit Microsoft in the fundament when partnered with web based office productivity applications already available.

I tried out Google Spreadsheets yesterday and it was surprisingly usable. It was slow, but certainly usable. I already use gmail as my main email system. At work we use BaseCamp. And almost all of our business systems are web-based too (except Point of Sale systems, although we saw a very impressive demo of a web based point of sale system from Imperium technologies a few months ago).

For an increasingly web-based world, the Linutop is a great box. It's got a slow CPU in there and not a lot of RAM, so a year or two after launch, I wouldn't be surprised if the first iteration were to become very cheap, and these boxes would then be perfect as LTSP clients (which we also run at work, and which we will be pushing out slowly, one department at a time, so that maybe in ten years only 10% of our employees will still be on Windows).

Friday, November 24, 2006

Ituloy ang Sulong SEO

I'm helping out a friend, Al, with his Ituloy AngSulong web site. Apparently there's some sort of SEO contest and he needs some backlinks.

Al is a good guy to work with, knows his SEO, knows his website design, I was teaching him PHP at one point. I wouldn't be surprised if, at some point, he leaves his day job and gets web design and optimization work on his own. I'm more conservative than that, I think that having one stable income and one contract income in a household is better than having either two stable incomes or two contractual incomes. But in order to do that, he's either going to have to work double time, or marry his girlfriend. I think it's time, Al :-).

Thursday, November 23, 2006

wow, ubuntu installer

I had problems with my ubuntu installation on my laptop. X wouldn't start anymore. I don't want to try to fix it, I'll just reinstall after backing up relevant parts of /.

I had forgotten how cool the ubuntu installer is. Installation runs inside the live CD gnome session, so while the OS is installing, I'm able to connect to the internet and surf. I could also have played any of the games on the live CD. I remember an older distribution that allowed you to play tetris while the OS was installing from CDs. This is much cooler than that.

I don't know about Vista, but none of the Windows versions up to XP can do that. In more than just the obvious way, installing Windows is a total waste of time.

too bad, no postgresql

I was excited when I saw a link to site5.com's bargain basement web hosting pricing. I'd love to have cheap hosting, that way, even if I had nothing to put there, it wouldn't be too expensive.

Unfortunately, there's no postgresql in those hosting plans. Mysql is good enough for most people, and it's improving, sure, but I can't stand it and will never use it while a better free database is available, and postgresql is that better database. I haven't looked into firebird yet; I haven't really needed to, since postgresql does everything I need and continues to make great progress. I've heard some good things about firebird though, so I may look at it sometime soon, at least at the feature set. It's going to be hard to set up a comparison with postgresql though: my database at work is upwards of 300GB now, and it's going to get larger as I import more of the company's data into it.

Sunday, November 19, 2006

Linux IMAP email client that does auto-expunge?

I access my work email via IMAP. At work I have evolution with a whole bunch of filters to move mailing list and non-work related email to local folders. That way, the inbox is for work-related email only.

The problem is, evolution doesn't auto-expunge the inbox. So when I run evolution at home (locally, not over X or over VNC) I can still see the emails which the work evolution has already moved to folders.

I'm going to have to check on either command-line (fetchmail or similar) or graphical (perhaps sylpheed or thunderbird) email software that I can configure to periodically check IMAP, move non-work-related email to folders (or an alternate email address) that can be managed separately, and actually expunge the moved email (not just mark it deleted). Of course, I've been thinking about this long enough that I really should have done it months ago. The problem is, it's not very inconvenient to just vnc to my desktop at work, start evolution and manually expunge with CTRL-E. The degree of inconvenience is directly related to the likelihood of getting motivated enough to actually look for a solution. If the itch doesn't itch enough, nothing will be scratched :-).
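
A quick sketch of what that periodic cleanup tool might look like, using Python's standard imaplib. The hostname, the side folder, and the notion of "work-related" below are all placeholders, and I haven't run this against my actual server:

```python
# Sketch of a periodic IMAP cleanup: move non-work mail out of INBOX,
# then actually expunge it (not just mark it deleted, which is the part
# evolution wasn't doing for me).  Names here are made up.
import imaplib

def is_work_related(from_header, work_domains=("example-dayjob.com",)):
    """Crude test: mail counts as work if it comes from a work domain."""
    return any(d in from_header.lower() for d in work_domains)

def clean_inbox(host, user, password, dest="Lists"):
    conn = imaplib.IMAP4_SSL(host)
    conn.login(user, password)
    conn.select("INBOX")
    typ, data = conn.search(None, "ALL")
    for num in data[0].split():
        typ, parts = conn.fetch(num, "(BODY[HEADER.FIELDS (FROM)])")
        from_header = parts[0][1].decode("utf-8", "replace")
        if not is_work_related(from_header):
            conn.copy(num, dest)                 # file it in the side folder
            conn.store(num, "+FLAGS", r"\Deleted")
    conn.expunge()                               # really remove the moved mail
    conn.logout()
```

Run from cron every few minutes, that would keep the inbox work-only from any client.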

powered by performancing firefox

Performancing

I don't remember where I found a link to Performancing for Firefox. It was only recently, certainly within the last two weeks or so. I think I'll switch to using it all the time now. The editor isn't as fast as the one on blogger itself, but it's very convenient. It allows me to post to any of my blogs (well, for now there's just this one and my other me, but that could change as virtual personalities multiply) from just the one panel that pops up in the browser itself.

I'm still learning the ropes, but I've learned one good thing already: the default mode was Rich (text-only) editing. I've switched to editing the HTML source since I'm used to doing that already anyway (despite a very basic knowledge of HTML). It took me a few iterations of trying to embed the fainting goats youtube link, though, to figure out that I needed to switch to editing the source (the " icon at top left).

I haven't played around with this enough to see if there's a way to save drafts *in*the*browser*. That way, I could start out posts in performancing even when not connected to the net. With fast internet almost wherever I go now, though, that's going to be a rare requirement. (My friends diggi and barb at camiguin action geckos tour and adventure tell me that there's PHP1000 eat-all-you-can internet from Smart now on Camiguin. It's probably very slow, but it's 24x7, so slow doesn't matter much. They were even able to download and view the youtube video of my newborn son, which was completely impossible previously with the local telco's dialup internet service.) But I'm sure there'll be the occasional trip to Sagada and Banaue, where there won't be ubiquitous internet for another year or so, I guess.

This reminds me, I've got lots of pictures to upload to flickr or picasa so that they can go on to my other blog. I need to fix those old photoblogs too, to point at flickr. Right now they point at geocities sites that I used to store the images. That's inconvenient, since it's necessary to touch the site every month or so, so that yahoo doesn't throw away the content for lack of movement.


Saturday, November 18, 2006

Netgear WGR614v2 firmware update

We recently got DSL and wifi at home. I've kept the wifi open (no WEP, no WPA) so I can see if someone will notice and leech off it. I haven't seen anyone riding on the wifi signal yet, so I'll probably keep it open continuously.

We have a Netgear WGR614v2 wifi router. We saw quite a bit of instability with this. Once or twice it was because the PLDT DSL link just went away for 2 or 3 hours after 10PM. That hasn't happened often enough yet to complain to PLDT about it. The Netgear router, though, does go nuts (to coin a technical term) pretty often. Every day, sometimes twice a day, it just stops responding to ping and stops routing. It's still on, I can control it when wired, and the wireless link itself is good. It just doesn't reply on the wifi interface.

Last night I downloaded and installed the latest firmware update on the wifi router. It's been stable so far, and I'm hoping it'll remain that way. It probably will, it was probably fixed in the first firmware update :-). The firmware update also adds WPA support and some other things. I'm not going to use WPA, but it's good to know that it's there if I need it. It'll also be useful to test ubuntu's WPA support, just so I know how that's done.
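
When I do get around to testing WPA under ubuntu, I expect it'll involve a wpa_supplicant.conf something like this. The SSID and passphrase are placeholders and the interface name varies by card and driver, so this is untested guesswork on my part:

```
# /etc/wpa_supplicant.conf -- minimal WPA-PSK setup
network={
    ssid="home-wifi"
    psk="some-long-passphrase"
    key_mgmt=WPA-PSK
}
```

Then something like `wpa_supplicant -i eth1 -c /etc/wpa_supplicant.conf` (possibly with a -D driver option, depending on the card), and dhclient on the interface once it associates.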

Saturday, November 11, 2006

Encrypted filesystems and US border laptop searches

So I'm concerned about having my laptop legally stolen at U.S. borders, or sort-of-legally stolen by TSA [or totally illegally stolen by baggage handlers] when I check it in.  Ignoring the two other cases, I considered putting my data on an encrypted partition on the laptop.  That's not so hard these days, with linux (and windows too, except I don't use windows).  But that doesn't fix the first case either.  If anything, it would just raise red flags for TSA and U.S. customs, and they'd confiscate the laptop and probably detain me until I told them the passphrase.  And then I'd go to jail since the passphrase might be something very uncomplimentary to the TSA.

So, no, encrypting /home wouldn't help.  It's probably still a good idea for another use case, where I get held up and the laptop gets stolen.  At least there would be no leaks of proprietary information.  So I may do that anyway.  But it won't help at U.S. borders.

Alright, I've convinced myself.  Now I just need to backup the data so I can wipe out /home and setup the encrypted partition :-).
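
The plan, roughly. The device name is a placeholder and I haven't run through these exact commands yet; note that luksFormat destroys whatever is on the partition, hence the backup first:

```shell
# Assuming /home is its own partition, /dev/hda3 here (check with df first).
tar czf /root/home-backup.tar.gz /home      # 1. back everything up
umount /home
cryptsetup luksFormat /dev/hda3             # 2. wipes the old /home, asks for the passphrase
cryptsetup luksOpen /dev/hda3 home_crypt    # 3. unlock as /dev/mapper/home_crypt
mkfs.ext3 /dev/mapper/home_crypt            # 4. fresh filesystem on the encrypted device
mount /dev/mapper/home_crypt /home
tar xzf /root/home-backup.tar.gz -C /       # 5. restore
```

Plus an entry in /etc/crypttab and /etc/fstab so it gets unlocked and mounted at boot.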


Friday, November 10, 2006

Apparently, laptops may be legally seized and searched at U.S. borders; they may even be kept and not returned.

“My laptop was initially searched by one person, but he called for backup” when he saw the blueprints. “It seemed they were convinced I was sent to plant bombs in those nonexistent buildings.” He said he hasn’t seen the laptop since.


It probably happens very rarely, perhaps randomly (although there was the case of someone who had his laptop searched because there was a report of child pornography or similar on him).

One more reason to avoid going to the U.S., I guess. Not a decisive reason, my family and I aren't going to the U.S. until at least after George W Bush is out of office, and probably not even then, unless the TSA and the Department of Homeland Security have either been dismantled or become accountable for what they do. As it is, I'm not risking my laptop in checked luggage (although possibly that restriction is lifted now) and I'm not risking losing it and never getting it back just because it's impossible to get anything out of TSA or Homeland Security (since the judiciary and police will bend over backwards even if there is no case).

There's no important information on my laptop (all my mail is on gmail, and the CIA already has all of my mail, and even if they don't, the NSA does), but I'm not some rich american who can afford to buy a laptop every month. So I'm just not going where they can take my laptop and never return it (and because I've been so critical of George W Bush's incompetent administration, I'm not going to be a random sample, they'll do it on purpose).

security bank ATM bugs

I'm not very happy with Security Bank ATMs. I don't have much choice in the matter since my company has my salary go into a Security Bank ATM account. But I'm definitely not impressed with their ATM software or their commitment to quality, if any.

Long ago I saw a Security Bank ATM bug where, if I were to try to withdraw, say PHP 11,000, the ATM would say that it was out of 100 peso bills (I think, or 500 peso bills) and that it could not process the request. The actual error is that Security Bank has a cap of PHP 10,000 on individual withdrawals. The error message is wrong though.

During that same session, I also saw a strange bug. I was at the Security Bank branch at Timog and Tomas Morato in Quezon City, and the ATM said that it was in Baguio. I don't know what the deal was there. Maybe they moved the Baguio ATM and physically installed it at Timog and Morato. Or maybe they took the install disks or CDs from Baguio and used them elsewhere. Whichever it was, the lack of quality control is disturbing in a bank.

Just yesterday (8 Nov 2006), I was at the Security Bank on Shaw Blvd (right by Jose Rizal University). The old gentleman ahead of me took a long time to process his transaction. He was confused, though: he seemed to think either that he could withdraw PHP 1432.57 (or something like that), or he might have been trying to do a merchant payment. He was trying to do all this at Security Bank even though he was a Metrobank customer. So, OK, he was just confused. I pointed him in the direction of the Metrobank branch a bit up Shaw Blvd.

When it was my turn at the ATM, I was concerned because the screen was blank except for some ASCII at the top (which had an A:\, and some random characters, some of them Greek). So I thought maybe the ATM was not working. When I pressed a button though (ENTER), a text menu came up asking me which telephone company I wanted to pay bills for. That's bug #2: why was I being asked to choose a telephone company when I hadn't even inserted my ATM card yet?

I wasn't interested in paying any telco bills, so I pressed CANCEL and it went back to the blank screen with the ASCII rubbish at the top. I wasn't too confident about this, but the bank staff were still inside (although the bank was closed). I figured if something went wrong, I'd just knock on the door and tell them the problem.

So I fed the ATM my card and was able to withdraw money. When the transaction ended, the screen went blank again, with the ASCII rubbish at the top.

I pointed this out to the guards, but they said there was nothing wrong and they had just reinstalled from CDs (or floppy, I couldn't quite understand that part). I guess the guards had been *told* (perhaps by the manager, or some IT guy, or whoever was in charge of the care and feeding of the ATM) that it was OK. In which case, whoever that person was had no clue.

The clearly visible bugs in the ATM detract from the confidence that depositors or account holders have in the COMPETENCE of the bank. If they can't fix the bugs in their ATMs, how confident should I be in their financial and accounting competence? The fact that they *KNOW* about these confidence-sapping bugs in their software and don't care enough to at least hide them from view (by taking the damn ATM offline until the glitches are fixed) weakens my confidence in them some more.

With banks (and with other companies too, but especially with banks, hospitals and air traffic control systems, where lives or money may be lost) there must be no tolerance of error. Even if the branch manager knows that the ATM will continue to function correctly, the customer doesn't know that.

Monday, October 30, 2006

Damn - I'm in trouble for googling svn

There's an article about how Google is in bed with the CIA. I hope it's not true. As it happens, my own sense is that there's probably a lot of overreaching going on there (the interviewee also thinks that the U.S. government probably had a hand in creating 9/11. I mean, that's possible, and Cheney and his gang are evil, but probably not *that* evil).

In any case, if this stuff is true, I'm in sooo much trouble. I keep forgetting (actually, I keep not remembering to write it down or look in my overflowing bookmarks) the URL to Subversion. So I keep googling for it. (There's another problem: I keep saying "googling" with no (tm) or whatever it is that will appease the lawyers, mainly because I don't know what the protocol is there. I bet I'm going to receive a cease and desist email any day now, from a company that's in cahoots with the CIA. I'm *sooo* going to be waterboarded.)

Ho-hum. I do wonder how much of that is true. Maybe a lot. But I don't worry too much about it. I don't intend to come anywhere near U.S. territory until this insanity ends (it's probably going to be a long wait). I'm certainly not guilty of anything (well, except for saying that Bush is an incompetent nincompoop and that his vice president and advisers are evil; that might be a crime now, and I'm not really eager to test it). I could explain what Subversion version control is in so much detail it would be torture on the torturers, I'm that enthusiastic about it, but there's just insanity and paranoia there. Best to leave them alone until they decompress and are more likely to leave the innocent alone.

Saturday, October 28, 2006

Upgrading ubuntu without reinstall

I see that there's a step-by-step guide (with pictures, even) to upgrading Ubuntu Dapper Drake to Edgy Eft.

I'm going to try that one of these days. But first I think I'll watch someone else do it at work :-). I've been burned quite a few times upgrading distros (mostly Mandrake and Mandriva) upward. It's possible to do, but a lot of things can break in the middle (with Mandrake and Mandriva, urpmi being so dependent on perl, there are lots of things in a perl upgrade that could break urpmi, which would break the whole thing).

Yeah, I think I'll watch someone do it first. I wouldn't be surprised if it goes smoothly. A lot of things go smoothly with ubuntu (although I still need to get sound working on my wife's laptop, I got that working when I had a similar laptop more than a year ago, but I've forgotten the magic incantations necessary).

A* pathfinding fun

On one of the games programming feed aggregators I monitor (or rather, that I have Bloglines monitor), there was a recent post pointing to an A* pathfinding discussion for beginners.

I haven't finished it yet, but it's sufficiently fun that I'm blogging it already :-). I doubt if I'll actually implement it. But I like knowing about random things not directly related to work. It might come up in an interview. If it doesn't, I *might* bring it up in an interview :-). And, anyway, a problem in the future might come up that looks like a pathfinding problem. It's useful to know things that are close to, but not exactly in, one's normal problem domain.
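The core of A* fits in a page. Here's a minimal Python sketch of the idea the tutorial is teaching (the grid, function names and Manhattan-distance heuristic are my choices, not the tutorial's):

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 2D grid of 0 (open) / 1 (wall).

    4-way movement, unit step cost, Manhattan-distance heuristic.
    Returns the path as a list of (row, col) tuples, or None.
    """
    def h(p):  # admissible heuristic: never overestimates
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # (f = g + h, g, node)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]                # walk parents back to start
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    came_from[nxt] = node
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))   # routes around the wall row
```

The heap always pops the node with the lowest g + h, which is what distinguishes A* from plain Dijkstra (h = 0) or greedy best-first (g ignored).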

At work, my company happens to have How To Solve It: Modern Heuristics and I read that slowly, mostly on breaks. Not because I necessarily understand everything I read, but it's interesting, expands the mind, and even if I don't understand, it's useful to know that this or that technique exists so that, if something comes up in the future whose shape looks like it's solvable by the familiar technique, I know which book to look to. Or at least, which wikipedia article to follow :-).

Tuesday, October 17, 2006

wifi insecurity

By today I should have set up the wifi router to use WEP (I don't see where I can set up WPA on the Netgear, which is a bit strange since it *is* 802.11g capable, so it's *new*).

I've decided against doing that for now though. I'll just make sure all computers in the house are firewalled off, and I'll keep the wifi router open. There's not that much bandwidth to share, but I'm interested in:

A. *Whether* someone will piggyback on the wifi at all (we're
in a lower middle class neighborhood and it'll be interesting
to see if anyone even notices that there's some free wifi
available).

B. If anyone *does* notice the free wifi and decides to piggyback,
I'll just sniff the network and snoop a bit :-). I don't care
about any of their data, I won't crack their windows box or
anything (although I may portscan, you come in the door without
knocking, I'm going to frisk your donkey), I'll just see what kind
of traffic they generate.

C. If they start doing peer-to-peer downloading, well, I'll block
them just to see if they know enough to change their MAC address
to get back in :-).

Monday, October 16, 2006

PLDT mydsl, wifi, woohoo!

I was mistaken in my previous post (http://monotremetech.blogspot.com/2006/10/pldt-mydsl.html) about PLDT myDSL not working with Linux. It *does* work. The problem I encountered was that when requesting an IP address over DHCP, there is a delay between when the first computer (which got an IP) stops using the internet and when the second computer (which could not get an IP) finally gets one. I haven't done much more experimentation than that. I got the same problem with the computers reversed: when I was able to get an IP via DHCP on my Linux laptop, then switched to the Windows computer and tried to get an IP, it wouldn't get one until many minutes later.

My guess is that this is related to the DHCP lease time. PLDT probably just won't provide another IP to a computer behind the DSL modem until the most recent DHCP lease provided to another computer behind that same modem has expired and has not been renewed.
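Just to make the hypothesis concrete, here's a toy Python model of a DHCP server that refuses to hand out a second lease until the first expires unrenewed. The lease time and address are made up; I have no idea what PLDT actually uses:

```python
# Toy model of the hypothesis: the upstream DHCP server hands out one
# lease at a time, so a second machine (different MAC) gets nothing
# until the first lease expires unrenewed. LEASE_TIME and the address
# are invented for the example.
LEASE_TIME = 600  # seconds; a guess, not PLDT's actual setting

class OneLeaseServer:
    def __init__(self):
        self.mac = None      # MAC currently holding the lease
        self.expires = 0     # when that lease runs out

    def request(self, mac, now):
        # Grant if nobody holds the lease, the same MAC is renewing,
        # or the old lease has expired. Otherwise stay silent.
        if self.mac in (None, mac) or now >= self.expires:
            self.mac = mac
            self.expires = now + LEASE_TIME
            return "192.168.1.100"   # the single address on offer
        return None                   # the DISCOVER goes unanswered

server = OneLeaseServer()
first = server.request("laptop", now=0)      # laptop gets the address
second = server.request("desktop", now=60)   # None: lease still held
later = server.request("desktop", now=700)   # works: lease expired
```

That "many minutes later" delay would then just be however much of the lease time was left when the first machine went quiet.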

In any case, I've got a new NetGear WGR614 v2 wifi router courtesy of Jijo Sevilla and after a painless (because I didn't try to set any security, I'll do that tomorrow or the day after) setup, my laptop (Ubuntu linux), my brother-in-law's laptop (Windows XP) and my brother-in-law's desktop computer (Windows XP) are now all able to go online, although the desktop computer is wired.

I'm typing this from the bedroom, but I'm going to have to turn this computer off soon. It's just nuts to be surfing continuously for the whole weekend. There hasn't been enough quality time with my family and I haven't even done work-related stuff that I was going to do over the VPN over the weekend ;-).

Tuesday, October 10, 2006

pldt mydsl

Sol and I moved recently. We loved our apartment, but with the baby coming, it makes sense to move in with her family, for help at birthing, with the baby, and to consolidate costs.

I miss Destiny Internet. I had hoped that I'd be able to bring it with me to Sta Mesa, but they don't have cable internet available there. So we decided to go with PLDT myDSL, the PHP 999 low-end version. Installation was pretty quick and the bandwidth is as expected (actually, it's a bit faster than advertised, which is all to the good).

There are some problems though. They don't use PPPoE; instead, it looks like all that's necessary is to get an IP via DHCP and you're golden. This works with my brother-in-law's Windows XP desktop and laptop. It doesn't work with my Linux laptop though (dhclient eth0 just doesn't succeed). I'm wondering what I need to do to get this done (apart from buying hardware, although we might get a Linksys wifi router so the two laptops in the house can use the link too).

In the process of trying to decide what router to use, my brother-in-law called myDSL support and asked them what routers work with myDSL. He got a singularly stupid reply from the tech support person there. It seems that (since we just got the connection yesterday) we're not in their database yet as being connected. Paperwork on the way from their DSL installation contractors I guess. But because we're not in their database, they won't help us.

Idiots.

If I were connected to ZPDee or Destiny or Globelines or some other internet provider, they wouldn't help me switch to PLDT because they wouldn't tell me whether the router I've already got will work with their system? That would already be a lost sale. But maybe it's policy; they don't want to help anyone share the internet connection. It's still a loss though. People are going to share their connections anyway. I certainly am. Putting a speed bump in the way just pisses off paying customers. That's a stupidity worth a firing.

Sunday, October 01, 2006

milenyo overhang

We just had what must have been the typhoon of the decade pass through Manila. Electricity went down within an hour or two of Milenyo making contact with Metro Manila. We stayed home (people told me of their cars shaking in the wind, cars being hit by flying billboards, and people blown off their motorcycles by the wind).

Electricity didn't come back until Saturday night, and by Friday night we were fed up with the heat and the uncertainty of getting electricity back (plus not being able to charge cellphones, see the news on TV, etc. There's only so much AM radio one can stand).

So we decided to go to Roxas Boulevard and stay in a hotel. That worked out really well. The Aloha Hotel (which doesn't have a lot of stars but is reasonably comfortable and reasonably priced) had air conditioning, hot and cold water, and a great Szechuan restaurant on the ground floor.

One night later, the power is back in Sta Mesa and we're home.

I haven't been online (on this laptop) since Thursday and am at Robinson's Galleria waiting for Sol and her sister to come out of their Victory Christian Fellowship Sunday service. It's always a pain finding good places to sit down with a laptop. There's Starbucks, where they let you charge your battery (extra points for that), but I don't like coffee enough to pay $2 for it.

Fortunately, nearby is Cafe Mediterranean. I'm having a Greek salad, olives and ice cold San Miguel Beer. I'm also posting this blog article. This, I think, is where I'll wait in the future. I spend more than I would at Starbucks, but I like the food, and they just don't serve beer at Starbucks. Now if only those two women over there would finish their meal so I can take over their table and plug in to the wall socket right by them.

Saturday, September 16, 2006

Matt Cutts SEO tips

I don't know if these SEO tips by Matt Cutts are good since I haven't read them yet. Posting them here though so I can find them again.

Google Browser Sync is very convenient, but it's led to a bookmarks list that is so long I'm just never going to go back to any of those links, probably :-).

Friday, September 08, 2006

catching up

I've spent some time working on requirements for immigration to New Zealand, and I got sick in the process (1.5 days off from work this week, recovering from fever, head cold and coughing, the last being a bit of a lifesaver, strangely enough). And work has been pretty intense too, when I was able to go.

Now I need to catch up. For one thing, my RSS feed reader tells me that I've missed *5* days of The Daily WTF. I'm not going to read everything since it's sol's birthday tomorrow. But I've certainly got to catch up with the WTF's of the week :-).

Saturday, September 02, 2006

A Gamer's Manifesto

I'm not really a gamer, but I had to point at A Gamer's Manifesto (partly so I'm reminded), because it's so funny. In the same vein, there are Weird Laws of the United States, which is funny but in whose veracity I have no confidence (and no data). And there's also Schadenfreudian Slips, which is just plain fun :-).

Thursday, August 31, 2006

Erlang concurrency discussion

Joe Armstrong has a good and interesting discussion of concurrency in Erlang. One point being, if the multiple/concurrent processes don't share memory but instead only send messages to each other, concurrency is easy and robust. Certainly, it's race conditions and incorrect locking of shared resources that makes concurrency difficult in languages which share memory.
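Armstrong's point carries over to any language if you keep the discipline. Here's a sketch in Python, with a thread standing in for an Erlang lightweight process; the point is the no-shared-state, messages-only discipline, not the concurrency primitive:

```python
# Share-nothing concurrency sketch: a "process" (a Python thread here,
# standing in for an Erlang lightweight process) that never touches
# shared state and communicates only through message queues.
import threading
import queue

def worker(inbox, outbox):
    total = 0                          # state private to this worker
    for n in iter(inbox.get, None):    # None is the shutdown message
        total += n
    outbox.put(total)                  # reply is also just a message

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()
for n in range(10):
    inbox.put(n)                       # send work as messages
inbox.put(None)                        # ask the worker to finish
result = outbox.get()                  # 45; no locks, nothing shared
t.join()
```

Because the only interaction is through the queues, there is nothing to lock and no race on the worker's `total`, which is exactly the robustness Armstrong is describing.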

Recently too, Joel had a discussion of MapReduce, which, apparently, is something Google uses to parallelize massively. That's related to Armstrong's article, since the concurrent processes don't share memory either; they just work independently, allowing massive concurrency that scales pretty much linearly with resources.
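A toy word count shows the map/reduce shape (this is just the general structure, nothing like Google's actual implementation; all the names are mine):

```python
# Toy word count in map/reduce shape: mappers emit (word, 1) pairs from
# their own chunk without sharing state, a shuffle groups pairs by key,
# and reducers sum each group. Each phase could run on separate machines
# because no phase shares memory with another.
from collections import defaultdict

def map_phase(chunk):
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

chunks = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
counts = reduce_phase(shuffle(pairs))   # {'the': 3, 'fox': 2, ...}
```

Each mapper only ever sees its own chunk and each reducer only its own key's values, which is why the scheme parallelizes so cleanly.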

I saw a blog post that said that Google doesn't work quite that way (and that discussed how MapReduce actually works in terms of, I think, the C++ API). I've lost that link though. I don't know if I ever bookmarked it, and if I did, well, I work on around 5 computers. So it's somewhere, but I don't know where. I could use Google Browser Sync, I guess. But I'm not convinced; I'm a bit paranoid about it (mainly about keeping passwords in there). I'm told that that's optional (a feature I'd expect from such an excellent company as Google, certainly). But I'm still thinking about it. Oh, I'm sure I'll capitulate, eventually. But there's going to be at least one night's sleep before I do so.

Alright. And that's quite enough randomness already.

Monday, August 28, 2006

Libraries, books, Wow

I don't quite care for the title, but yeah, those *are* Lovely pictures of books and libraries and stacks. I would love to visit each of those at least once. For a year each :-).

Sunday, August 27, 2006

per-site color preferences in mozilla would be nice

I was surfing over to phpPatterns and I couldn't read it. It was in some sort of white-on-green color scheme that was hard for me to read. At least it wasn't white (or light yellow) on black. But it's still not pleasant to read.

On the other hand, I *really* want to be visiting there regularly. I didn't want to have a separate firefox profile just to view that one site (or those sites which are hard to read due to their color schemes). I might still do that, but I'd rather avoid it if possible. So I thought I'd just go into Preferences|Content|Colors and force all pages to display in black on white.

That works OK. It messes with the tabbed style on, for instance, Blogger, so that the line between the editor form and the Edit Html and Compose tabs is not there. I'm sure it messes with lots of other stylish pages too. I think I'm liking this though. I'll stick with it. If I start missing the styles on other pages I may go back, and then I guess I'll read phpPatterns in my RSS reader instead. I don't think that works right though, since I'd need to follow links, and those links wouldn't be in Bloglines but would go straight to phpPatterns, and the problem re-emerges.

It would be nice if site color preferences in Firefox were configurable per-site, so that when someone's page goes unreadable, I could force the colors, but just for that page.

Thursday, August 24, 2006

Re-engineering: Assertions versus Unit Tests

I'm working on re-engineering a whole lot of badly designed code at work. The code has been mostly working for 4 to 5 years (some of it seems to have been working since 2000 even, if I can go by the author:date comment at the top). Part of the problem is that many things have changed in the business since any particular piece of code was written. Another part is that the database design was optimized for uploading (easy to identify all the new records for the day to be uploaded), but it is incredibly anti-optimized for querying. There are around 24 basic transactional tables (it could probably be whittled down to around 12-15; that's another issue, related things should be together in one table) and every day, an additional 24 are created with the date (in mmddyyyy format, not even yyyymmdd) tacked on.

So creating a multi-day report involves running the same query over multiple tables and processing the data on the client side, or it involves a union over many tables.
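For illustration, here's a hypothetical sketch of generating that union. Only the mmddyyyy suffix convention comes from the actual schema; the base table name and date range are invented for the example:

```python
# Hypothetical sketch: build the multi-day UNION query over the
# per-day tables. Only the mmddyyyy suffix pattern comes from the
# schema described above; "sales" and the dates are invented.
from datetime import date, timedelta

def union_query(base_table, start, end):
    selects = []
    d = start
    while d <= end:
        suffix = d.strftime("%m%d%Y")   # mmddyyyy, alas, not yyyymmdd
        selects.append(f"SELECT * FROM {base_table}{suffix}")
        d += timedelta(days=1)
    return "\nUNION ALL\n".join(selects)

sql = union_query("sales", date(2006, 8, 1), date(2006, 8, 3))
# three SELECTs over sales08012006 .. sales08032006, joined by UNION ALL
```

UNION ALL rather than UNION, since rows from different days can't be duplicates of each other and the distinct-sort would be wasted work. Note also that because the date lives in the table *name*, the query string itself has to change every day, which is much of why this design is so painful to report against.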

The first part of the re-engineering involves creating suitable abstractions over this mess (and no, it's not fixing the table structures to be more sane, that comes later, there's too much other code that depends on the insanity). The abstractions will allow us to provide a more reasonable view of the DATA (hiding the database details) to higher (reporting) levels. This is at some cost in CPU processing though. We use PHP, and it's just not very efficient when shoveling a lot of data around while transforming it.

I'm working on basic classes that will interact with the database (everything else interacts with these classes, no higher level code will touch the database directly). I'm finding a problem though. The code aggressively checks parameters and (for anything complex), invariants inside the methods. Checking is done through assertions. At the same time though, I developed a UnitTest class (a simpler replacement for PHPUnit and similar). And when I develop unit tests, I find myself wanting to write tests that make sure the assertions are working. But the assertions can't be tested if assert.bail=1, and the assertions would get much longer if I wanted them to return failure codes to the unit testing framework (because assertions should be simple, just make the program fail immediately).

I would also like to know *which* assertion failed (my parameter checking is pretty paranoid, so there are many checks upon method entry even if there's only one parameter, and there may even be checks for object validity when there are no parameters at all).

ifdef-ing (well, if (defined(...))-ing) the assertions out when unit testing doesn't seem to be the right thing to do either. But I haven't really thought that through yet. Maybe I'll try that and see what falls out.

I wonder how others have resolved assert versus unit test (actually, assert versus the goal of complete test coverage, or close to it). Hmmm, I'll google that over the weekend, maybe.
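In the meantime, one idea, sketched here in Python rather than PHP: have the checks raise identifiable exceptions instead of bailing outright. Production keeps the fail-fast behavior by leaving the exception uncaught, while unit tests can catch it and assert *which* check fired. All the names here are invented for the example:

```python
# Sketch: contract checks that raise an identifiable exception rather
# than halting the interpreter. Uncaught, it still fails fast; caught
# in a unit test, it reports exactly which check fired.
class ContractError(AssertionError):
    def __init__(self, check):
        super().__init__(check)
        self.check = check          # name of the failed check

def require(condition, check):
    if not condition:
        raise ContractError(check)

def withdraw(balance, amount):
    require(amount > 0, "amount_positive")          # paranoid entry checks
    require(amount <= balance, "sufficient_funds")
    return balance - amount

# A unit test can now verify the checks themselves:
try:
    withdraw(100, -5)
    failed = None                   # should not get here
except ContractError as e:
    failed = e.check                # "amount_positive"
```

The checks stay one-liners, and the unit test framework gets a real failure object to inspect instead of a dead process.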

Saturday, August 19, 2006

Jerry Pournelle's Byte column -- moved

It's too bad that Jerry Pournelle's contract with Byte was not renewed. I've always enjoyed his columns (well, back when I could read them in dead-tree Byte, and for a while, on the free online Byte). It's good to see that he's keeping the format though, and maintaining the site for himself at Chaos Manor Reviews.

I'd like to be able to subscribe, but that's not possible (the subscription cost is not inconsequential in the third world, even on my pretty good IT salary). I don't have a credit card, and even if I did, I'm not sure the transaction would be honored. Philippine credit card transactions are tricky.

I do hope that he keeps enough subscribers to make maintaining the whole Chaos Manor system worth his while. There are always good discussions there. I don't go there as often as I used to (due to Digg and Reddit), but it's always educational to swing by every few weeks to see what's being discussed.

Sunday, August 13, 2006

ubuntu wifi

I'm at Bo's coffee club in the Robinson's Galleria mall. I would put links there but frankly, I can't find the official websites with some very half-hearted googling. I'm running ubuntu on a toshiba laptop with an Atheros AR5212 802.11abg wifi card and ubuntu detected the wifi card flawlessly.

I also installed Ubuntu on a Winbook laptop with an Intel wifi chipset (it was a 1.5GHz Centrino system) and that was detected directly too. I'm not sure how Ubuntu did that (with the Intel system), since the Intel firmware is required for it. I wasn't paying very much attention, just installing stuff and downloading packages from the network. Maybe it auto-downloaded the firmware upon detection. Or maybe it came with EasyUbuntu.

That probably cements my transition to Ubuntu from Mandriva. The main factor in the transition is the rate of development. Ubuntu had working svk packages long before Mandriva did (I never could get svk working correctly with Mandriva, although the packages are there). And I never could get wifi working with Mandriva (although, to be fair, that was me being lazy; if I were younger and had more time and patience, I'd have figured it out long ago).

I never have been a linux fanboy. I'm a pragmatist. Whatever works better, that's what I use. Windows doesn't work for me since it's impossible to keep secure and, frankly, I can't afford the MS-Office license (Windows itself I could afford, barely). Now, I *do* get a valid and legal MS-Office license with the laptops I purchase from the U.S., but they have that pesky "legal only in the U.S.A." clause, so even if they're legal there, I can't use them in the rest of the world. At one point RedHat was the distribution to use (and before that, Slackware). But now RedHat is moving in the direction of enterprise systems and I don't want to be on Fedora (where I would need to be if I wanted svk, etc).

I moved to Mandrake/Mandriva because of the superior package management, but now the speed of development isn't quite right for me. Ubuntu feels right. Although I wouldn't be surprised if I were to move on to something a bit faster later on.

Ah, but this post was not supposed to be about reminiscence. Instead, I was going to celebrate finally getting wifi working. I'm sitting in Bo's coffee club and for the price of a Cafe Latte I'm able to connect to Robinson's free wifi. The DNS is very slow (one reason why I don't have links above, the other being mentioned above), but surfing to this or that site, once it's been resolved, is pretty good. The DHCP server provides two IPs (the first of which does not reply to ping). That's a bit incompetent. But removing it from the list doesn't speed up DNS queries much (or at all, as far as I can tell). Once DNS resolves though, things are pretty good.

Connection to my office VPN is pretty fast (faster than at home), so I guess they use either meridian or PLDT (my home connection via Destiny Cable Internet takes a roundtrip overseas, so the VPN connection is reasonable, but not this fast).

OK. nmap -v says that the first DNS server *is* up, just not replying to ping.

I'm very happy to get wifi up and running, finally. Previously I didn't care very much. I still don't, really, since fast internet is available at work and at home. But it's a great convenience to be able to connect to wifi networks.

Next thing to learn, I guess, is how to connect to closed networks. It's been 4 years since I last had anything to do with that and it was a pain then (even with windows). I'm sure it's much easier now. Haven't done it at all though, yet. So there's going to be some learning involved.

Friday, July 21, 2006

switching svn URLs with rsync

I've got an svn working copy of close to 300MB. I'm across the internet from the source and I didn't want to check the whole thing out again. I had originally checked it out with svn+ssh:// but I wanted to switch to svn://. I needed to switch because svn+ssh takes longer than plain svn://: the ssh part needs to reconnect with every svn command, and I didn't want to pay the 2-3 seconds of ssh negotiation each time. There might be a way to use a persistent ssh session, but that seemed like more trouble than benefit.

I thought maybe svn switch might do the trick, but some trial and error (and actually reading the free svn online book) showed that switch doesn't work the way I thought. While trying stuff out I also looked at --relocate. There's an example of svn switch --relocate here, but that doesn't do what I need either. I'm not moving anything to a branch or elsewhere in the repository; I actually want to change the protocol used to communicate with the repository.

Finally, I fell back on rsync. I use rsync a LOT; it solves a lot of problems that I run across at work (and in the occasional consulting job). For this task, I first did an svn co svn:// checkout *at* the same computer as the repository. This was just so I'd have a working copy that used svn:// (normally, I'd use file:/// there).

Once I had a checked-out copy using svn://, I rsynced that with my remote copy (which used svn+ssh://). There was probably something slightly wrong with what I did: it correctly synced the subdirectory .svn directories to use the new protocol, but the root didn't get changed. So I rsynced that .svn directory separately.

Total bandwidth cost of the rsync was around 4MB received and 2MB sent. Since the whole checked out working copy is 327MB, that's quite a savings. It actually took less time to do the rsync than my previous attempt, which was to tar and compress the remotely checked out svn:// working copy. I was then going to scp or rsync the archive over. When the archive blew up to 60MB though (and still going), I gave up on that. I certainly didn't want to copy over 60MB if there was some other way.

Wednesday, July 12, 2006

Switching to ubuntu -- but with icewm

My desktop computer (inherited from my brother-in-law) always had problems with Mandriva 2006 installed on it. These were mainly graphics problems: in drakconf, rpmdrake, gaim (some KDE programs too, so it wasn't necessarily just GTK), icons in windows would display to the left of where they were supposed to show. This wasn't a big deal at all, but it was irritating. And there was the svk problem (which I solved, as below, by installing from CPAN, but it was another thing to worry about).

I thought I'd try some other distribution. I'd installed Ubuntu and Kubuntu at work. There were problems (mainly because we didn't understand what we were doing). Ubuntu and Kubuntu apparently come on one CD only; I'm not sure if they come on DVD or multiple CDs. In any case, at work both installed great-looking desktop environments, but then we spent some time figuring out how to get postgresql-8, php-5, devel tools, servers, etc. on there. Now, after looking at the Ubuntu website, it looks like there are different install CDs for desktop and server use.

At work, I didn't really spend enough time to figure out how to do all of that, so we just ignored Ubuntu and we now use debian (with some old versions of software, see above, installed from source or from debian-unstable) for the server and Mandriva or Novell Linux for the desktop.

On my home desktop, though (I've got Mandriva 2006 on DVD, but the computer doesn't have a DVD reader), I first installed debian (via network install, just for kicks). That worked out pretty well, but I decided not to stay with debian-stable because the versions of software on there were just too old for me. I tried debian-unstable, but I gave up on that since I got confused by some things (some things were also, well, unstable, but that was just me not knowing what I was doing and getting confused).

I had upgraded to debian-unstable by just changing the apt sources and then upgrading everything automatically. That worked well enough (better than the similar technique would have in Mandriva), but I didn't like something about the whole thing, so I upgraded once more (same technique) to ubuntu. That worked pretty well and, while I had packages from three different versions of debian, I didn't really have any problems. The icons problem went away and the whole thing was much more responsive than it had been under Mandriva.

For the future, I'll be leaning toward ubuntu for my desktop computers. I haven't made up my mind yet about servers.

I don't like Gnome or KDE (preferring, instead, icewm), so the first thing I do after I've gotten a good install (well, I've only done this twice so far, but I'll be doing it a lot more, I bet) is to change the apt sources and install icewm, rxvt, etc, and remove as much of gnome and KDE as possible. I keep individual gnome or KDE applications though (quickly coming to mind are evolution, kuickshow and k3b).

I doubt if I'll go to debian. The distro moves too slowly (postgres-7.4 instead of postgres-8, for instance), and I'll never get anyone to agree to install debian-unstable, just because of that name :-). Ubuntu, though, seems to have just the right amount of development/upgrading velocity. I liked the Mandrake velocity, I don't think I'd be comfortable with gentoo, I *know* I don't want to be using Fedora.

Long ago I switched to mandrake because of urpmi (and later, rpmdrake), and also because of the rate of development. I didn't really need everything that the Mandrake *drake software gave me, but urpmi's automatic dependency handling was great, and the rate of development felt just right.

All of that is still there, so I don't *really* need to switch distributions. The kicker though is that I use svk for working with version-controlled source code when I'm not connected to the internet. And I haven't yet gotten svk to work from the Mandriva packages that are available. Not sure why that is, it's probably just pilot error, but, on the other hand, debian and ubuntu make it dead easy to install svk.

On Mandriva, I usually install svk from CPAN, which is a major pain (that's another rant right there). CPAN has some major problems finding the right protocols and servers to use for installing things; last time I did that, I had to figure out what it was trying to do and do it manually, putting the downloaded files where it expected to find them. Again, probably pilot error, or just not enough deep knowledge of how to configure CPAN installations. But if Ubuntu makes something dead easy, I'm switching, as long as nothing else I absolutely need is missing.

Saturday, June 24, 2006

byte.com -- accessible

So I still think Byte.com has a dumb login/logout interface, but at least now it's letting me in. Previously, I would type in my login/pass (the one they had emailed me) and that wouldn't work. That was after the point when the ddj page would say that I needed to login but wouldn't give me the form to log into.

Today (about a week after I first encountered the problem), the site finally showed me a login screen so I could login with the login/pass that had been emailed to me. I think it might have shown me a page before and I couldn't use it because the password was wrong. Anyway, there was a link to have it email me another password, so I clicked on that sometime last week, logged in today, and that works now.

I would still be annoyed, except I've almost forgotten the problem now. I'll be annoyed next time there's some non-deterministic behavior on that site that makes me unable to login again, or do something similarly obvious, but perhaps implemented in a non-obvious way over there.

Maybe things got fixed earlier than a week ago, but I wouldn't know; I was in Banaue with my wife, mom, sister and her entire family. That was a great trip. I wish I could afford to do that (timewise, since it's really not that expensive a trip) more than once a year or so.

Sunday, June 18, 2006

byte.com's incredibly bad interface

So I pointed out that byte.com is now free. That's great. A few days after that post though, I'm seeing the downside. That web site has an incredibly bad interface, at least with respect to logins and logouts.

1. There is no logout button or link, or even any indication, as far as I can tell, that one *is* logged in or logged out.

2. There is no login form either. There's a form to subscribe to the free
newsletter, but no login form.

So I guess the only way to login is to click on a protected article (any article, really, and hope that it's protected, since I'm able to view some articles but not others, and I sure can't tell why there's a distinction, except maybe the age of the article). If it brings me to a page that says it's protected and I should login, then I login.

That's what happened when I first tried it out for free a few days ago. It didn't even ask for registration details, just wanted an email address and then it let me in. Then it emailed me with my login and (random) password.

That's all great so far. Except now, since it won't let me logout, I keep getting a page that says I should register. Except I've *already* registered. I think maybe the authentication credentials are wrong (after all, it just let me in that first time without me having to first check my email, maybe it didn't insert my randomly generated password in my session yet). But now I can't find out where to logout so I can login properly.

I *could* throw away all my cookies and saved passwords, but I'm not about to do that for one free site that has bugs in its authentication system.

Of course, I might be unfair here. Maybe the 1-year-old article I'm trying to read is just protected and I need to pay for it (similar to how the NYTimes has things set up, and why I never go to the NYTimes site anymore, nor even link to it), but the page it brings me to sure doesn't seem to say that. It won't even let me login, instead asking me to register again, while remembering who I am and pre-filling the form with information I already posted before. Strangely enough, it gets the country wrong: I remember registering as being from the Philippines, but it thinks I'm in Anguilla, which might be a firefox bug since I often register as being from Anguilla on other sites, e.g., the Washington Post.

So, OK, maybe I'm just confused. But if that's the case, they still need to fix their site. If they can get me confused, I've got to wonder what their paying customers think of the ease of use of that site.

Thursday, June 15, 2006

evolution outbound filters and mangling the Sender

Related to my previous post, I'd always thought that it would be great if the Gnome Evolution email client were capable of mangling the Sender in the email based on user-specified conditions. If that were possible, then I'd be able to setup an outgoing filter so that if at least one of the recipients were those mailing lists then it would change the Sender to my mailing list address.

There would still be corner cases where that wouldn't work (but where an elaboration of it might work, e.g., instead of mangling the sender directly, it might create a new email with the sender being corrected, but leaving the original email intact).

I could probably do something like that with the "Pipe to Program" action, with some sed magic thrown in. But, no. It *is* a useful inconvenience and I'm too lazy to figure it out. Hehe, long ago when I was wishing for that feature, I already knew I'd be able to do it with Pipe to Program and sed. I was too lazy then too :-).

A useful inconvenience

I have several email addresses, and one of them I use for subscribing to mailing lists. I never use gmail for reading mailing lists (instead I download the mail via fetchmail and use evolution to check the local POP3 server). This is mainly because I just don't have the time to monitor mailing lists anymore.

Evolution uses my default email address for sending mail out (unless I change the From address explicitly). That isn't the same as my mailing lists address. That's inconvenient, since often I'll send out a reply to a list post and it'll bounce back because I forgot to change the From address.

I find, though, that the inconvenience is useful. It's an additional brake on emails that I might not actually want to send out yet (they need editing, the tone is too strong, the tone is too weak, the point is missing, whatever). It's too bad though. Sometimes I'll write a long, informative, interesting (maybe) post, and then I'll delete it when it bounces back because it doesn't pass muster. But that's the nature of useless posts. They're a waste of time, created in the heat of the moment, starting as a rant and ending as a flame, or the other way around, moving from a bang to a whimper, or just wandering around, pointless and dumb. If they're a waste anyway, better not to pollute the lists (and the impressionable) with them, or feed the trolls.

byte.com is free!

I read Jerry Pournelle's Chaos Manor proto-blog regularly. Not as much as I used to, I've been busy, but regularly. This week he says that Byte.com is now free. It used to be free, and for many years I read it online. Before that, I was in the U.S. and I actually had a subscription (along with subscriptions to Scientific American, Dr. Dobb's and various C/C++ magazines).

Byte became payware for a while though, so I stopped reading Jerry's column there, instead switching to his main blog. Now that it's free again, I'm going to have to read several years of articles :-). Jerry is always fun, so I'll read all of that. The articles though, I'll read only if they're relevant and not too dated.

It's too bad that Moshe Bar's column ended, I enjoyed those, although they were quite a bit over my head (probably still are, since I'm not as hard core as I'd like to be).

Thursday, June 08, 2006

devshed rss

I use Bloglines as my RSS reader (too lazy to find standalone RSS readers for Linux). I had DevShed in there, but I find that it doesn't seem to be a good RSS citizen, or maybe I'm just doing something wrong.

With other feeds, as soon as I even click on the folder in bloglines, the articles are marked read and when I go to bloglines again, I don't see those articles as unread anymore. Devshed articles aren't like that. The same articles keep coming up as new even though I've read them already. Not sure what that's about. I've already removed it from my list of feeds and I don't care to add them again to debug the problem. My guess is, though, that they allow comments to update the feed. Most other sites (e.g., digg) don't.

It's really no big deal, but even the little effort required to look there and notice that I've read the articles already is too much for me. I'd rather not see anything there anymore than have to go through tomorrow seeing the same articles about how to optimize mysql or how to implement a socket server in php.

Creative Zen mp3 player and gnomad2

We got a Creative Zen Xtra MP3 Player as a gift from my brother. It was to be just my birthday gift, but it's so perfect for just now (Sol is pregnant; for a bit she was on bed rest, very bored), and Sol's birthday is near enough to mine (which is still months away) that we thought it was the perfect gift for both of us for the rest of the year :-).

Unfortunately, the Zen doesn't look like just a USB external storage device, it uses some other protocol. I didn't want to be dual-booting into Windows (at some point I'm going to remove Windows from this laptop completely, I kept it around for the Zen) everytime I need to move files onto the MP3 player or to modify playlists and such.

Fortunately, Gnomad2 implements the MTP protocol and allows file copying and manipulation of playlists and such. It didn't work for me the first few times I tried it, but after a week of looking at it off and on, I finally got it working (devfs was the charm). It might be possible to run it without devfs, but the RPM I installed didn't seem to have the nomad.usermap file in it, so I just boot into a devfs-capable kernel when I need to copy files. And since I don't want to dual-boot, well, I just always boot into a devfs-capable kernel these days :-).

Things to remember, and the point of this post: DEVFS.
Also: less laziness and maybe building from source and actually reading the documentation might make DEVFS unnecessary. Unfortunately, the laziness is systemic.

Thursday, June 01, 2006

Frameworks and new development

At work the IT team is re-implementing/improving the software system for managing the business. The head office software and database schemas are going to be replaced entirely, and the remote site software is going to be partly re-implemented, but we're keeping the remote database schema as is for now. Fixing *that* monster is probably for next year.

It's exciting to be rebuilding the system since the current system is so much of a mess.

After a week and a half of checking out various php MVC frameworks, we've settled on Symfony. We looked at a lot of frameworks (including one that was developed by the contractor we're working with). Many were cut because it didn't look like they could serve our needs: they had one or two vital features missing. Even Symfony was almost cut, except we figured out a way to get around its assumption that primary keys are serial (so, never editable, in its generated CRUD scaffolding). CakePHP was cut because it doesn't seem to support compound primary keys. I really like Prado, but it seems to have a sufficiently high learning curve that I'd have trouble training new developers to work with it. I'll use it on some personal project at some point, but we won't use it at work.

Some other frameworks (Seagull, QCodo, etc) we tested for a bit, but when we encountered bugs or couldn't make them work with our system (postgresql, pretty complex database structure) we gave up on them.

The framework will be a help, but I'm more excited about the great work we'll be doing with architecting the business logic and database layers so that we will be able to more easily fix/modify the database without needing to modify higher layers. Or if the change propagates that high, that we'll be able to easily determine which database service providing classes/methods need to be changed and which reports need to be modified because of that.

Doxygen (or PHPDocumentor) will be a big help with all this too, and subversion/svk, of course.

We're also architecting how much of the business logic should sit inside the database. I'm aware that some people (a lot in the Oracle camp, and in MS SQL Server too) like to put all their business logic in the database (some to the point of not allowing access to tables directly, instead allowing access only to functions which are the ones that access the tables). I'm not on that side of the issue.

The old database went too far the other way, having pretty much no primary keys, no check constraints, no referential integrity constraints or (in many cases) even not null constraints.

I'm leaning toward something in the middle, where the database has enough business logic (constraints, primary keys, referential integrity, and some triggers for complex constraints) to protect itself from program bugs and invalid inputs, rejecting transactions which fail those constraints. The user-level programs will check their inputs and reject invalid inputs, etc., but the database will reject anything that gets through the user-level programs.

This will involve some cost due to doing some of the work in both places (the programs and the database constraints need to be in sync about what is or is not invalid; I'm not too concerned about the execution cost, since good database design and insightful use of indexes will mitigate that), but I think it's worthwhile. Otherwise, the database is at the mercy of buggy programs.
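In DDL terms, the middle ground looks something like this. A sketch only; the table and column names are made up for illustration, not from our actual schema:

```sql
-- enough self-defense that a buggy program can't sneak bad data in
CREATE TABLE invoice (
    invoice_id  integer PRIMARY KEY,                   -- no more key-less tables
    customer_id integer NOT NULL REFERENCES customer,  -- referential integrity
    amount      numeric(12,2) NOT NULL
                CHECK (amount >= 0),                   -- simple business rule
    status      text NOT NULL
                CHECK (status IN ('open','paid','void'))
);
```

The programs still validate inputs first; the constraints are the backstop.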

Friday, May 26, 2006

Slow computers

My brother-in-law, my wife's younger brother, is blind. He's a good programmer though, great brains in there. He uses MS-Windows with JAWS (screen reader software). What he can do with windows is amazing, considering that it's so graphical.

He used to use a 400Mhz PIII with 192MB of RAM. I've inherited that PC, since I found a much faster used PC (an AMD 1.7Ghz box) and gave that to him to use.

I run linux on the slow box, with icewm for the window manager (I use icewm everywhere, even on the fastest computers because I can't stand the slowness of Gnome and KDE). It runs everything reasonably well except Firefox. It's great to use Firefox on fast boxes, but it's dog slow on a slow box. I've got to look at Opera for that box.

I don't remember running openoffice on that box. Maybe I have; I don't use very complex files though, so I wouldn't have been stressing openoffice anyway, even if I ran it. I'm sure it's slow, but it's probably usable. I tend to use gnumeric for spreadsheets, and I just don't use a word processor that much. So on a slow, memory-constrained computer, careful choice of office software can make the computer usable.

The PIII box used to run windows 2000. It was reasonably usable, but I'm not sure how well it compared with linux. It was probably about the same speed, or maybe a little slower (because of the antivirus reading everything before handing data off to applications reading the disk). On the other hand, it was *far* less capable than my current setup (apache, php, C, C++, Java, ruby, scheme, guile, postgres all running on it, plus all the client tools I use).

I've got to find an alternative to firefox though. Or wait until someone comes up with a comprehensive howto on how to minimize the memory and CPU that firefox uses.

Friday, May 05, 2006

philvbdev

I subscribed to the philvbdev yahoogroup to see if I could hire VB developers off there. The company I work for is hiring. After I posted though, I started getting all these emails from Indian freshers. There must be a network (or a bot) where newbie South Indian software developers get email addresses and the email addresses spread like wildfire. It's bad enough that if anyone sends me anything with "fresher" in the subject, I'm not going to see that email at all, it'll be automatically deleted [not even spending any time in the purgatory that is my spam mbox or trash].

It's a waste of time for them to be emailing me because we need the developers for local employment and we're not going to be paying anyone to come over from a foreign country to work here (with the attendant problems with visas and such).

I joined the group because I thought it was a forum for philippine VB developers. It looks like it's been abandoned by filipinos though. Very few posts are about VB problems or solutions. Mostly what I see (what I saw, I've since unsubscribed) are spam posts about certifications, certification questions and training, etc.

Wednesday, March 29, 2006

laptop computer speed

My cousin-in-law asked me last week what I'd choose between a laptop with a 1.xGhz CPU and 512MB of RAM, and one with a 2.xGhz CPU and 256MB of RAM. I said I'd choose the first.

I've been monitoring CPU use on the laptop I currently use (icewm makes that really easy, with CPU and network monitors on the taskbar), and I just don't see a lot of CPU peaking. There are occasional peaks when a scheduled rsync job runs (copying data over a vpn from the office to my laptop, cheap remote backup), and when i run xboard+gnuchess. When I test one or another program I'm developing I may see some peaking, and sometimes when I run vmware (not a lot of that though).

Mainly, though, my CPU usage stays pretty low.

I recently finally built a kernel that works well with my mobile AMD CPU, and now I'm able to set the CPU maximum speed over a large range (.66Ghz to 1.7Ghz, plus 6 other speeds in between). I've also got the cpufreq scaling_governor set to ondemand. The CPU pretty much ALWAYS stays at the slowest speed, except when running xboard+gnuchess or, very occasionally, when something CPU-intensive runs. When an application needs more CPU, the CPU speed ramps up immediately, and when it's no longer needed, the CPU slows back down to .66Ghz.
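The whole thing is visible through sysfs, so checking what the governor is doing is easy. A small sketch (these paths are the standard cpufreq sysfs interface; the files only exist on kernels built with cpufreq support, hence the fallback message):

```shell
# where the cpufreq knobs for the first CPU live
cpu=/sys/devices/system/cpu/cpu0/cpufreq

# show the governor, current speed, and available speeds, if present
for f in scaling_governor scaling_cur_freq scaling_available_frequencies; do
  if [ -r "$cpu/$f" ]; then
    printf '%s: %s\n' "$f" "$(cat "$cpu/$f")"
  else
    printf '%s: not available on this kernel\n' "$f"
  fi
done

# switching governors is one write (as root):
#   echo ondemand > "$cpu/scaling_governor"
```

With the ondemand governor, the current speed sits at the lowest frequency almost all the time, which is the point.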

From the fact that the current frequency (scaling_cur_freq) pretty much stays at the slowest speed except for very short periods, my opinion on which laptop to buy is reinforced. The icewm monitor also displays related information (e.g., how much IO is occurring), and I find that the laptop slows down more due to waiting for IO (well, laptop drives are slow) than due to needing more CPU. So more RAM will help with the IO requirements (the OS can cache more of the disk), speeding up the general experience, more than a higher CPU speed would.

Actually, his original question had to do with a celeron (or some other internal-cache-challenged CPU) versus a full CPU at the same or a higher frequency. I think that RAM is still more important, generally, than CPU speed.

Of course, if the main use of the computer is going to be photoshop or corel draw effects or CAD and 3D rendering, then I'd go with all the CPU power I could get PLUS all the RAM I could get. But for general use, even for a software developer like me, any 1.3Ghz box with more memory is probably going to be a better deal than a higher-frequency box with half the memory.

Wednesday, March 08, 2006

mozilla extensions

I've been wary/skeptical for a long time about mozilla/firefox extensions. Firefox is a memory hog as it is, and there are some hard to isolate weird CPU hogging things in there too. So, while I love the browser, I go out of my way to NOT make it any more memory/cpu hungry. And in general, for complex things that have weird behaviors, I prefer to keep things as simple as possible so that problem isolation is easier, or at least that the chances of tickling a bug are lower.

Lately though, and strangely enough because I saw a link to an article that discussed how to make firefox eat less memory and use less CPU, I followed another link and decided to install a few very conservatively chosen extensions. I'll add links at some point; for now, just discussion.

NoScript is great. I have it block all javascript by default. And then, for sites that need scripts, I turn javascript on either temporarily (current session) or permanently. Very nice since most of the time I don't want javascript and it just slows things down anyway.

AdBlock works pretty well. Some ads get through and it's not convenient to point them out to AdBlock, but all in all, it's great. I wish I could *unblock* google ads. They're not intrusive, no moving graphics and crap like that, and they're generally relevant, although I rarely click. I'll figure out how to do that at some point.

Tab Mix Plus is great. There are lots of features there, most of which I don't know how to use. The main thing for me is that it has tab memory like Opera has. That is, it's possible to configure it so that when the browser starts, it will auto-load the tabs that were there when the browser was last closed. It also saves tab status while browsing, so that firefox crashes (rare, but not completely eradicated) don't completely wipe me out. If I've downloaded 15 tabs to view, then tab memory will bring them all back up when I restart firefox. It seems to work better than Opera's memory, but I haven't used Opera in 2 years. I'm sure Opera has made their tab memory much more robust by now, although it was already a great and wonderful feature when I used it back then.

PDF download is convenient since I've never been able to stand acrobat reader displaying itself in the browser and then taking so long to start. Not that I have acrobat reader on this computer anymore, or on any computer I use, but the extra control over PDF downloads is nice. This is a relatively low-profit item for me though; I may drop it.

Sunday, March 05, 2006

Work hard, crash

I've been working very hard for about six months now, since I took over management of the development team at work. Previously I'd been a consultant, project based. That was fun. Lots of work, but no management responsibilities.

The company insisted on hiring me on though and after some resistance, I accepted. Things are still fun, particularly when i get to code, but there's too damn much coordination and meeting. All very necessary, but it sure reduces time available for programming.

After six months of this though, reality catches up. I got a very bad sore throat, some other stress-related infections, and just general weakness (due to the combined infections, probably). This happens to me once in a while. After a sustained period of late nights, not enough sleep, and stress, my body crashes for a short period. If I don't get enough of a break, my mind snaps too and I quit ;-). Not going to happen with this job, but it's sure happened before.

Five days of rest have made for an improvement. Not enough though, still going to be weak tomorrow. Damn. I'll rest again on Wednesday.

Tuesday, February 21, 2006

php+postgresql+cursors

I ran into a problem recently because PHP was dying when processing a query that returned some 20 million rows. PHP was running out of memory.

After some googling and running the program several more times (with some runs involving an increased memory limit for PHP), it became clear that PHP's postgres support code loads the whole result set into RAM (feeding it to the PHP code row by row when it calls pg_fetch_row or pg_fetch_array) rather than using cursors so that it can load one row at a time. I don't know if ADODB or PEAR do better. If they're still using the underlying postgres-specific functions though, then they're still going to have the same problem.

The solution is to use cursors explicitly. There are some caveats.
1. Using a cursor requires using a transaction. This isn't onerous, just something to remember.

2. It's necessary to check after the FETCH to see if there are no more rows. There's probably a prettier way, but I just go with an infinite while loop that breaks when there are no more rows.

// assume that $conn is the db connection and that $q is the query; the
// following was typed straight into blogger, might contain minor errors.

pg_query($conn, "BEGIN WORK");

// cursors only work inside a transaction, so declare it after BEGIN
$cres = pg_query($conn, "DECLARE my_cursor CURSOR FOR $q FOR READ ONLY");
if (!$cres) {
    die(pg_last_error($conn));
}

// the cursor exists; fetch rows from it one at a time
while (true) {
    $res = pg_query($conn, "FETCH my_cursor");
    if (!$res) {
        die(pg_last_error($conn));
    }

    // quit the loop when the cursor runs out of rows
    if (pg_num_rows($res) == 0) {
        break;
    }

    // there's a row, get it and process it
    $row = pg_fetch_array($res);
    process_row($row);

    pg_free_result($res);
}

pg_query($conn, "CLOSE my_cursor");
pg_free_result($cres);

pg_query($conn, "END WORK");


Fetching rows one at a time, above, is for illustration only. Of course, for efficiency, one would fetch many rows at a time (e.g., FETCH 100 FROM my_cursor) and loop through them.

Saturday, February 18, 2006

Title Wrong - ridiculously

So in my previous post the title says that java studio is ridiculously slow, and then I forgot to mention that fact in the rest of the article. On Saturdays I lose my brains.

Anyway, when my colleague was showing me the class diagrams that he developed in Java Studio, I couldn't stand the fact that scrolling across the diagram space, zooming in, and dragging items from one place to another was slow. It wasn't really unusably slow, but it was irritating to anyone used to how fast dragging things in Konqueror (itself slow enough that I never use it, even on a fast box, but a good comparison because, despite it being slow, Java Studio is slower) is.

Maybe things will get better on my own computer though. I'm still waiting for the Sun Developer Network site to let me in before I can download and test Java Studio Creator. Maybe I'll run it under strace, in case that can show what's making it slow (I doubt it'll help, but it's a weekend, I can spend the time looking at strace :-).

Java Studio - slow - ridiculously

A colleague at work was using Sun's Java Studio Creator for some prototyping. He was running it on the LTSP server and I noticed that it would sometimes eat 100% of the CPU (well, 50% of the CPU since it's a dual-CPU box, the culprit to be mentioned further on doesn't seem to be a multi-thread issue, at least) continuously, while it was doing nothing (everyone had left for home, no one was working in Java Studio anymore).

That seems like a stupidity, and I thought I'd download it to my own computer so I could see if the behavior would manifest there also.

I've subscribed to the Sun Developer Network though, and I can't log in to where I can download Java Studio. They haven't sent me a confirmation email yet, so I can't log in. Seems like some slowness over there at Sun, probably due to java. Damn, I'm getting disillusioned with working in Java.

Friday, February 17, 2006

First rule of debugging

My wife and I came home from Tagaytay and there was no internet. The cable modem had the receive light blinking and the send and online lights off. OK, I had to get to work and didn't care enough to fix it right then (mistake, Sol wanted to work at home, but it's a short walk to work, so no huge deal). When I got home, the cable modem was still down. I texted the destiny cable help line and went to bed. In the morning, same thing.

I texted destiny again and they called me (but I didn't answer since my phone was on silent+vibrate).

And then I looked at the cable modem and realized that the cable wasn't connected. In removing the laptop power cable for the trip to Tagaytay, I had moved the cable enough that it disconnected from the modem.

I had violated the first rule of debugging: it's *MY* bug. Almost all bugs are my bugs. The first thing to do is check if I did something wrong before reporting to anyone else that they have a problem with their system.

No big deal, but useful to be reminded.

Tuesday, February 07, 2006

cpufreq on an AMD 2400+ - strange

I rebuilt my 2.6.12-15mdk kernel with the Pentium III/Celeron/Coppermine ... processor family, because when I built it with either AMD K6 or K7, it was just not showing the right CPU frequencies to choose from in cpufreq.

Or maybe it was, but I didn't like the choices as they all seemed too high.

cat /proc/cpuinfo says (among other, snipped, things):

vendor_id : AuthenticAMD
cpu family : 6
model : 10
model name : mobile AMD Athlon(tm) XP-M 2400+
stepping : 0

Maybe there's a problem with the detection of the CPU frequencies. Or maybe the frequencies it shows are right (unlikely, since the lowest frequency is around 1.3Ghz or so, and the highest is much higher than 2.4Ghz, even though the CPU only goes to around 1.7Ghz in actual clocks).

Anyway, no big deal. I've got a working kernel, it does swsusp, it does cpufreq, it does most of what I need. The only thing I'd really like now would be if I could get the damn ALSA sound working. When I load the ALSA sound driver for this VIA82cxxx, I get no sound, and when I load the OSS driver I can't change the volume in kmix. I can get around that by using headphones with volume control, but I just want the ALSA sound driver to work right :-).

I *did* get it working at some point, or maybe that was with the Mandriva 2005 kernel, 2.6.12 something.
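For next time I poke at it, the usual checklist. A sketch only: snd-via82xx is, if I remember right, the ALSA module name for these VIA chips, and silence is often just the mixer channels starting out muted:

```
# load the ALSA driver for the VIA onboard sound
sudo modprobe snd-via82xx

# ALSA channels often start muted at zero volume;
# unmute Master and PCM (the 'm' key toggles mute)
alsamixer
```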

Hmmm, might be time to build a generic kernel now.