Sunday, September 30, 2007

OLPC -- Inspiring

There's a great post on OLPC uptake and how it's changing the school experience in Arahuay, Peru.


This is very inspiring. If all (or even half of all) social/economic development projects were to end up 1/10th as effective as OLPC-Peru/Arahuay has been, the world would be in incredibly good shape.
Antonio is repeating second grade, but:


The first day Antonio came to school with his XO laptop, instead of playing during recess, he kept on exploring the XO. He quickly became one of the most acquainted with it and empowered to help his classmates by telling them how to do this or that. The change was suddenly and highly noticeable. His teacher was amazed of how he had become more focused doing the class work and was helping his classmates with the computer activities.

And Diego's story is hard but heartwarming.


Diego is in second grade. He had stopped coming to school. However, he did come to pick up his XO laptop the day we handed them out. His teacher explained to me, that he has no father, and that his mother went to Lima, the capital, to work. And now, Diego and his two brothers are living by themselves, but a kind neighbor gives them their meals.

After that quote, there's a picture of Diego helping a first grader use the OLPC.


Emilio is a hero:


“Once, when the teacher was explaining to the children how to look up some new words in the on-line Diccionario de la Real Academia Española [the best Spanish dictionary, costly and practically out of the children's reach in book form]. Emilio understood the procedure and quickly looked up the entire list of words while the teachers and the other students, together, slowly went through the process. The Internet connection went down. The teacher made the best of the situation and explained what the Internet was and about the satellite connection, for them to understand what was going on, and ended by saying, "We will have to wait a little bit until the connection is reestablished." To which Emilio replied, "No need, teacher. I got all the words, and everybody can copy them from me.”

But not just any hero. Anyone can shoot other people or order peons to shoot other peons. Emilio is a hero of the mind.


It's pretty clear that the OLPC is inspiring kids to stay in school. At best, they get interested in school and using the OLPC. At worst, they'll stay in school because if they don't, the OLPC will be taken from them.

Saturday, September 29, 2007

high speed!

I've been very disappointed in my external USB laptop drive enclosure. Whenever I'd run my rdiff-backup or just a regular rsync backup (with --progress, because speed is what I'm nuts about) I'd get pissed off because the speed would only stay at around 1-1.2MB per second.
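
For concreteness, those backup runs look roughly like this (the source and destination paths here are made up, not my actual ones):

rsync -av --progress /home/me/ /media/usbdrive/home-mirror/
# or, for incremental backups that keep history:
rdiff-backup /home/me /media/usbdrive/home-backup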

Today though, I hit 5-5.5MB per second. I figure it was because I read some random advice (I think on the Philippine Linux users group mailing list) about plugging both USB jacks into the laptop because sometimes the power coming down one cable isn't enough (or the cable is substandard).

I'm a Psych major, so I never actually took any courses on electricity or electrical engineering (the computer science and computer engineering people do). I'm very glad though that, in fact, plugging in the second cable seems to have sped up the transfer rate. I may go buy a higher quality cable as well, since that's supposed to increase the transfer rate too. For now though, ohhhh, 5+MBps is SOOoooo much better than 1+MBps.

It's now fast enough that backups will be a joy, or only a minor nuisance, rather than the big production they used to be. (I've got tens of gigabytes of data to transfer; it takes a while to rdiff-backup all of that, even if nothing's changed and all rdiff-backup has to do is read through the files to see if any bits have flipped.)

[Update]
I don't think anymore that it was the second cable (or the power issue). I think I probably got a kernel update that finally supports fast USB on this laptop. I experimented tonight, copying the data with both cables, and then with one cable. The speeds are the same. I'm still very glad to be getting 2.5-20MBps; all those speeds are a lot faster than the 1.0-1.2MBps I'd get previously. But it's not the cable. That's too bad, since waiting for the kernel to get fixed for a particular laptop chipset is harder and less deterministic than just plugging in the other cable :-).
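
If I'd wanted to check the kernel theory, something like this would show whether the enclosure is actually sitting on a USB 2.0 (EHCI) controller rather than a USB 1.1 one (just a sketch of what I'd look for, not an exact recipe):

# ehci_hcd means USB 2.0 speeds; uhci/ohci means USB 1.1 speeds
dmesg | grep -i -e ehci -e uhci -e ohci
# show which bus/port the external drive is actually attached to
lsusb -t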

I'll keep the other cable plugged in, though. I'm sure it'll help with stability if the external drive gets enough power instead of sometimes getting too little.

Saturday, September 22, 2007

Wrestling with pure eeevveeeel

Jeremy Miller has a whiny rant on a project he's working on.

The whiny part isn't from me; it's from his comments section.

I've been wrestling with pure evil too. It's not Java, and it's a whole program that's too long, not a function. There's only one function in there; it's called doublenum, and it doesn't arithmetically double its parameter (which would at least be descriptive). Instead it doubles its LENGTH. Sometimes. By prepending a zero if it's only one digit wide.

So, applying some items from Jeremy's rant:


  1. Very long method [check, although it's not even a method, it's just PHP code that's not in a function, so everything is global]

  2. Variables are reused for different things throughout the method [check, but at least variable names aren't totally bogus, although it does use a class whose name is prefixed with the initials of its author]

  3. Lots of if/then exception cases [check - in fact, since there are about 8 common cases, the whole program is a bunch of nested if/thens with one code path being one of the common cases, with exactly the same actions copy-pasted all over the place, plus deeply nested if/then exception cases peppered through the code (copy-pasted again), and at the end, just one of those common cases will be executed]

  4. Deeply nested Arrowhead code [ check, except it's several arrowheads ]

  5. All data stored in Hashtable's -- and it's this data that's getting bamboozled somewhere [check - plus parallel PHP arrays (integer indexed), plus parallel (sort of) associative arrays where the keys are concatenated strings of the subkeys, plus some two dimensional arrays where the concatenated string keys previously mentioned are split up and make up the two dimensions]

  6. Numbers that come out wrong - [ check, or, actually, a really obscure bug where orders get their order quantity zeroed ]



And for extra bogus points:

  1. $r=7-($cday-$x);
    $mt = sprintf("%2d",chop(`date +%m --date '$r days' `));
    $dy = sprintf("%2d",chop(`date +%d --date '$r days' `));
    $yr = sprintf("%2d",chop(`date +%Y --date '$r days' `));

    What moron calls a system function THREE TIMES to get the pieces of a date? If they didn't know anything about the time functions in PHP (or Perl; this crap is all over the Perl code these people wrote too), they could at least have called date once and then extracted the fields out of there. But then, if they were too dumb to learn how to use the time functions (or even just to learn that they exist), I suppose it's too much to expect that they would figure that out. The function names aren't helpful, but that's not as much a bogosity as forking to get the time, three times. Even selecting the date from the database would be better than this stupidity, and if they couldn't wrap their brains around that, at least fork only once:

    date +%m-%d-%Y --date '1 day'

  2. Oh, and that doublenum function? The only function defined in the whole program? It's actually inline in the code TWICE. But PHP doesn't do static analysis worth anything. Instead, if the two function definitions are in perfectly separate code paths (it's never possible for both function definitions to be executed in the same run), then PHP sees no problem with that. PHP only points out function redefinition errors when the function definition is actually executed, e.g., no error here:

    function f()
    {
        return 0;
    }

    if (false)
    {
        function f()
        {
            return 31415;
        }
    }

My mistake with this abortion is that I've been trying to live with it, fixing it incrementally, when fixes needed to be done. I should have thrown this away long ago though, and just rewritten it so that it wouldn't be so bogus.

I was avoiding doing that because the horrendous date logic was totally obscured by the implementation, so it wasn't possible to extract what the date logic was supposed to do just from the source code (not possible for me anyway; I would get angry too early and give up). Now I'm going to have to understand the date logic and rewrite it all. Or, if that can't be done, use the current horrendous implementation as a test for proving that my implementation is "correct" (or at least that it complies with the current implementation; since I can't understand the current implementation, I'm not currently capable of proving *IT* correct, I can only take it as a given).

I'd better not meet any of the previous implementors in this or any future world. They're going to KNOW what I think of them, and then I'll hire them so I can fire them one second after.

Thursday, September 20, 2007

screen multitasking

I've known about screen for a few years. I never did use it until recently though. When I tried it out a few years ago I came away with the impression that it wasn't worth the trouble since I'd have to learn another set of keystrokes. When I tried it back then, I didn't like Ctrl-A or any of the other keystrokes that came with it (all prefixed with Ctrl-A).

Of course the activating keystroke (Ctrl-A) can be configured to be something else, but the rest of the keystrokes still seemed like unnecessary cruft to learn.
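
For what it's worth, changing the prefix is a one-line tweak in ~/.screenrc (Ctrl-B here is just an example, not a recommendation):

# in ~/.screenrc: use Ctrl-B as the command prefix instead of Ctrl-A
escape ^Bb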

I recently looked at screen again though and I've fallen in love with it. I don't use it as efficiently as I might. I still have a problem with the keystrokes, so I learn the minimum that I need (Ctrl-Ad, Ctrl-AA, Ctrl-A[, Ctrl-A], Ctrl-An, Ctrl-Ap, Ctrl-A", Ctrl-AX) and leave the other commands alone until I find that I need them. I still only have limited space in my brain for keystrokes, so I economize.

The problem with screen now is that I've got 8 processes running in screen at work (I'm working remotely because my baby is sick and I need to be home to help take care of him). I can barely multi-task 3-4 processes, and now I'm running 8. Of course, they're all long-running database processes, and once they're set up they run independently until they complete. I could then move on to another set of 3-4 things to do. I avoid doing that too, though, since when any task in the first set finishes, I'll have forgotten what its purpose was :-). And frankly, some of these tasks are going to take hours. I can't afford to keep adding sets of 3-4 tasks until I can't count them anymore :-).

I do find screen very useful though, mainly as a replacement for nohup. Now I can start a screen session remotely, run the program in screen, give it any interactive commands it might need (hard with nohup) and then leave it running. Previously, I'd have problems with running a program over ssh (when the vpn would burp, I might lose the session, and I certainly couldn't control it remotely anymore). And nohup has problems with interactivity. Screen is a great help there since I get interactivity, and if ssh breaks because of network issues, I just ssh there again and re-attach to the screen.
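
The nohup-replacement workflow is basically this (the session and program names are just examples):

ssh remotebox
screen -S dbjob          # start a named session on the remote box
./long_running_process   # answer its interactive prompts here
# Ctrl-Ad to detach, then log out; the process keeps running
# later, even after the vpn burps:
ssh remotebox
screen -r dbjob          # re-attach and check on it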

I didn't know about -x back then either, when I was first looking at screen. I like working with 3 or 4 actual terminals on my monitor. That's one reason I don't like KDE: it's necessary to manually detach a tab. In gnome, I can just turn off the menus (in fact, I use aterm now, but for a while I was using gnome-terminal) and have terminal windows automatically open in new windows rather than tabs (when I do Ctrl-Shift-N in one terminal it creates a new terminal as a separate window).

For screen, I just ssh to the remote box, run screen -x, do that 3 or 4 times, and then have each terminal go to a different screen sub-session (Ctrl-A" or Ctrl-An a few times).
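
Concretely, the multi-terminal setup is just this, repeated in each local terminal window (remotebox is a placeholder):

ssh remotebox
screen -x    # attach to the already-running session without detaching the other terminals
# then in each attached terminal, Ctrl-An / Ctrl-Ap to switch to a different window,
# or Ctrl-A" to pick one from the window list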

I've even used screen for discussing/team-programming. The several people in the discussion all connect to the same screen session and then discuss, view, and test code while all looking at the same editor sessions, debug screens, etc. That works for command-line work, or just basic code slinging. Of course, testing in a browser or in some graphical environment is harder. I've avoided those complexities for now, though of course vnc is waiting in the wings for when I need it.

Tuesday, September 11, 2007

robinson's free wifi - going downhill

I used to love sitting in the Cafe Mediterranean in Robinson's Galleria because of the good food, cold beer, and free wifi. I've also used it downstairs, by Bo's cafe. At the time, the wifi signal itself was good. The DNS was slow, as was the actual connection, but I didn't mind since I wasn't downloading huge files anyway. It was a pleasant place to surf and have coffee or a snack.

The wifi at Robinson's Pioneer was always bad: one could use it, but it was intermittent and slow when available.

I recently went to both Pioneer and Galleria and checked out the wifi situation (at Cafe Med at Galleria, and at Old Spaghetti House at Pioneer). The wifi is much worse than it used to be. Possibly the hotspot has been moved at Galleria, but when I was last there, the signal wasn't available at all at Cafe Med. There were some secured networks, but no free wifi. I didn't go to Bo's to test there. It wasn't important enough.

At Pioneer, Old Spaghetti House charges a ridiculous PHP 50.00 per half-hour just for plugging in to the wall. And the wifi was very slow and unstable when I was there, and it's completely down now (right now, my wife is meeting a fellow developer there). We liked the coffee and dessert at Old Spaghetti House, but the cost of plugging in is sufficiently stupid that I believe I'm going to tell anyone interested about how Old Spaghetti House is extremely inhospitable to laptop warriors. I think it's great that they have wall plugs at all. A more reasonable price or policy would be cool. For now though, absent either, the policy is just dumb and I'm not going to give them any of my money again. Although if my wife insists on going there, she can pay for lunch. The creme brulee WAS very good.

ionice - redux

I love ionice enough to give it three exclamation points, but I'd been having some problems with it. Mainly, I was optimizing too much. I was using -c 3 on the assumption that everything else would run much better whenever svn update or git-svn rebase was taking too much IO bandwidth.

This was a mistake. I would have problems when the ionice -c 3 process was running: -c 3 (idle priority) requires root access, so it needs to be run under sudo or directly as root (sudo su, a root login, or /etc/crontab), and any files the process created would end up owned by root.

It took me more than a month of frustration (and actually stopping the use of ionice, or doing a chown -R at the end) to finally ask Google :-). I must not have been all *that* frustrated.

There's an explanation here of why ionice -c 3 requires root access.

So finally I'm convinced, and will now run large svn updates (or my huge rdiff-backup based backup system to a slow external USB drive that only runs at USB 1.0 speeds) under ionice -c 2 -n 7. A few quick tests show that performance of the laptop (no locking of X, etc.) is about as good as with -c 3, but there's no need to run under sudo, so no problems with file ownership or permissions later.
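
So a backup or update run now looks roughly like this (the paths are placeholders):

# best-effort class, lowest priority within it -- no root needed
ionice -c 2 -n 7 rdiff-backup /home/me /media/usbdrive/home-backup
ionice -c 2 -n 7 svn update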

Thursday, September 06, 2007

Downloading OpenSolaris sucks

After a good email from David Meyer on the PLUG mailing list, I decided to download OpenSolaris and test it out on vmware.

I chose to use the Sun Download Manager last night. I'm not going to do that again. The download was chugging along slowly but steadily, but today it stopped. The Sun Download Manager has a clunky, unintuitive interface and apparently some weird logic in there such that only one file can download at a time, there's a maximum number of retries, and after that a file will never be downloadable again. Possibly the internet went up and down today and it hit the maximum 10 retries, so it stopped trying to download. I changed the retries to 20 though (the maximum; why is there a maximum??? If I want to keep downloading forever on an expensive and slow GPRS link, why not let me do it?) and it STILL wouldn't download anymore.

When I got back this evening it was still downloading, but at 5kBps. Since I can normally get up to 80-120kBps on my DSL connection at home, I wasn't going to wait 70 hours or whatever it predicted the wait time to be.

Since I couldn't make it work anymore, I just nuked the whole Sun Download Manager thing and am currently downloading the files via the browser. I'm downloading all three files at the same time and they're all coming down at 30-80kBps. That should get faster as the night deepens.

This sucks though. I should get better speeds from Sun Microsystems. I'll keep trying to download, but the piece of crap that is the Sun Download Manager doesn't inspire any confidence in me. Likely OpenSolaris will be very good; it's a very mature piece of work. I'll see if I get comfortable with it. But my expectations aren't very high just now. I hope I'm disappointed in my expectations.

Saturday, September 01, 2007

svn mv and atomicity - lesson learned

I was moving some directories around in $DAYJOB's svn repository and I wasted a lot of time doing it. The problem was that I didn't realize that svn mv requires both the source and destination parent directories to be committed at the same time.

I needed backward compatibility, so what I was doing was similar to:

svn mv {parentdir/srcdir} {elsewhere/destdir}
svn commit {parentdir} -m "some message"
ln -s {elsewhere/destdir} {parentdir/srcdir}
svn add {parentdir/srcdir}
svn commit {parentdir/srcdir} -m "some message"

svn commit {elsewhere/destdir}

Usually I'd do that for a few directories before bunching up the commits of elsewhere.

I was then getting svn errors because the source wasn't around anymore. When doing svn mv, it's necessary to commit the A (in the destination directory) and the D (in the source directory) in the same commit; otherwise there will be a lot of backfilling and rework necessary. What I had to do was find the revision at which the given directories were deleted from the parentdir, copy them back with

svn cp -r [rev] [URL] [destdir]

(specifying a revision one less than the one they were deleted in), and commit, bringing them back. Then I did the svn mv operation again, but this time committing both at the same time and only doing the "ln -s" at the end, in its own commit. (It can't be done along with the svn mv because I want the link to have the same name as the source file, but the source file hasn't been deleted yet; the ln -s and svn add of the link need to be done after the source file's D operation has already been committed.)
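
So the sequence that actually works looks like this (same placeholder names as above, and assuming both parent directories are in the same working copy):

svn mv {parentdir/srcdir} {elsewhere/destdir}
# commit both parents together so the A and the D land in the same revision
svn commit {parentdir} {elsewhere} -m "move srcdir to destdir"
# only now create the backward-compatibility link, in its own commit
ln -s {elsewhere/destdir} {parentdir/srcdir}
svn add {parentdir/srcdir}
svn commit {parentdir/srcdir} -m "compatibility link for srcdir"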

The links (for backward compatibility) are making a regular svn update fail. This is because svn update needs to create the link, but the original file is still there. svn doesn't replay the commit history, so it doesn't have the opportunity to delete directory entries which are no longer the same type (formerly directories, now links). I was going to just check out the whole tree, but I think I'll wait on that. Instead I'll first just remove all the links (find . -type l | xargs rm -f) and then svn update, since the only same-name files in there are links inserted for backward compatibility.