Thursday, December 21, 2006

I submitted my PhD thesis, and all I got was this crappy balloon

Well, it's not really all that crappy ... the balloon is a nice happy gesture to mark the occasion. I even got to pick the colour. It took me far too long to write and submit this thing; it's a relief to not have to look at it for a few months. My thesis, entitled "The structure of outer mitochondrial protein import receptors", may well be the first Creative Commons licensed thesis submitted in Australia (although I doubt it). Once it's been examined (hopefully I pass), I'll release it online and allow everyone to poke holes and rip it to shreds (or they can poke at the associated peer-reviewed publication instead ... unfortunately it's probably not Open Access).

Afterthought: One thing that slowed down the final submission was the bloody LaTeX typesetting. I'm a LaTeX novice, and while I really like the final result, LaTeX is an abomination (much like Perl).

Update, 15th October, 2007.

I've finally got around to submitting the final post-examination version of my thesis to the University of Melbourne ePrints server. You can get a PDF copy of my thesis here. I used the xmpincl LaTeX package to embed XMP Creative Commons licensing data into the final PDF version generated by pdflatex. I probably didn't get the format of the licensing XML exactly right, but I'm sure it will be good enough that search engines can (or will one day) determine the correct licensing for the work.
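For reference, the preamble side of it is tiny (reconstructed from memory — the metadata file name is my own choice, and you should check the xmpincl documentation for the exact packet format it expects):

```latex
% Embed the XMP packet stored in cc-metadata.xmp (pass the name without
% the .xmp extension). The .xmp file itself holds the RDF you get from
% the Creative Commons license chooser, i.e. an <rdf:RDF> block naming
% the licence URL.
\usepackage{xmpincl}
\includexmp{cc-metadata}
```

pdflatex then carries the packet through into the PDF's metadata, where licence-aware tools can find it.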

Thursday, December 07, 2006

I have bags under my eyes ... and a Wii

We stayed up late to collect a pre-ordered Wii at JB Hifi in the city last night. The midnight release 'party' was really funny with lots of Nintendo fanboys hanging around playing their Nintendo DS handhelds in the queue.

So I've had about 2 hours to play it (Wii Sports Bowling, Wii Play ping pong, shooting, pose) and it's really fun.

There were a few small glitches getting connected to my Linksys WRT54G wireless router. Initially I was getting an "Error 51130" when trying to connect to the network. I needed to use the full WEP key in hex, not the passphrase, to get it to connect to the router. Also, I needed to change the router's wireless mode from "G-only" to "Mixed" ... presumably the Wii uses 802.11b, not 802.11g ?

During the initial update I then got an "Error 32002" (after 5 mins or so of downloading) .. to solve this one I just tried again and it worked.

Also, the User Agreement "click-wrap" for the Wii network sucks ... nothing new really ... Nintendo owns your soul etc (WiiConnect24 can update YOUR machine with THEIR hacks [er .. patches] to get around your useful software modifications at any time, no questions asked). Anyhow ... for the moment I'm forgetting all this nastiness and just enjoying playing it ... in the future I may have to disconnect it from the network in order to run Linux on it ...

Now, I think I can fit in one quick game before work ... :)

Wednesday, November 29, 2006

First Online EMBL PhD Symposium

This looks interesting ... the First Online EMBL PhD Symposium, a sort of 'online' conference for the life sciences. Everybody with a scientific background is invited to participate. Registration is free.

The programme (Career Development Session, Omics Session / Systems Biology, Scientific Communication 2.0 and Participant's Contributions) and speakers list makes it look sort of like a "Biology 2.0" conference.

Apart from the (possible) IRC sessions, hopefully the fact that everything is stored as video/audio + comments on their content management system means the 'inconvenient' timezone in Australia won't limit my participation too much.

(via the worldwide bioinformatics cabal :), Neil via Pedro, Roland and Stew)

Wednesday, November 22, 2006

International Genetically Engineered Machine competition videos

The 2006 iGEM Jamboree (International Genetically Engineered Machine competition) happened at the start of this month. This is a synthetic biology 'competition' where teams of talented undergraduates from around the world engineer an organism for a specific purpose ... like E. coli that produce mint or banana smell, or form simple logic gates that could potentially be used to make a 'biological computer'.

They are encouraged to use BioBricks from the Registry of Standard Biological Parts, which at the moment essentially comprises a series of well-characterized DNA constructs (promoters, repressors, selection markers, lots of fluorescent protein coding sequences, etc) with standardized restriction sites that can be mixed and matched to produce new and interesting behaviours in bacteria, yeast or mammalian cells. BioBricks are sent out to teams in 96-well format, so everyone has a good basic set of starting components.
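The mix-and-match idea is easy to caricature in code. Here's a toy sketch — nothing below is a real Registry part; the names, sequences and the scar string are all placeholders — of why standardized flanking sites matter: joining two parts yields something that is itself a joinable part, so composites can be built up indefinitely.

```python
class Part:
    """A DNA insert flanked (conceptually) by the standard prefix/suffix sites."""

    def __init__(self, name, insert):
        self.name, self.insert = name, insert

    def join(self, other, scar="TACTAG"):
        # Ligating the compatible overhangs between two parts leaves a short
        # 'scar' that neither flanking enzyme can recut, so the composite is
        # again a standard part and can be joined further (idempotent assembly).
        # The scar string here is a placeholder, not the exact RFC scar.
        return Part(self.name + "+" + other.name, self.insert + scar + other.insert)

# Hypothetical parts with made-up sequences:
promoter = Part("BBa_promoter", "TTGACA")
rbs = Part("BBa_rbs", "AGGAGG")
gfp = Part("BBa_gfp", "ATGGTG")

device = promoter.join(rbs).join(gfp)
print(device.name)  # BBa_promoter+BBa_rbs+BBa_gfp
```

The point is the closure property, not the biochemistry: `device` has exactly the same interface as the parts it was built from.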

Videos of the student presentations have finally turned up on Google Video. (Unfortunately, the videos only show the speakers, not the slides for the presentation ... which makes some parts pretty hard to follow).

I watched the presentation by the University of Arizona team. They printed bacteria onto paper using a stock-standard inkjet printer, with the ink simply removed from the cartridges and replaced with a solution of bacteria. They could then transfer this to agar plates to grow in whatever pattern they printed. Very simple, but inkjet hardware hacking crossed with molecular biology is just plain cool. As a side discovery, they noticed some weird fractal patterns in colonies under the confocal microscope, apparently based on variation in the fluorescent protein expression level of cells in a single colony.

I wonder how much interest there would be from undergrads (and their supervising academics) to start an Australian iGEM team for 2007 ? Funding would also be a tricky issue, as always.

Tuesday, November 14, 2006

Protein structure sculpture

Check out these amazing protein structure sculptures by Julian Voss-Andreae. The GFP (Green Fluorescent Protein) in shiny steel is particularly striking.

He has even provided instructions on how to construct your own [pdf] ... Before the advent of molecular graphics on computers, making physical models similar to this was what crystallographers (and Linus Pauling) did to build protein models.

I've gotta find time to make one of these ... the question is, do I make something of personal significance, or a choose a structure that is actually a little more challenging ?

Friday, November 03, 2006

Sheik Taj Din al-Hilaly compares men to puddy cats ...

Okay, okay, so this news is a few days stale, but the outrage has only just hit me ....

In a sermon delivered in Arabic (but thankfully translated by The Australian for the sake of some racial-hatred-arousing journalism) Sheik Taj Din al-Hilaly has compared men to cats which have no free will and cannot help but eat uncovered meat left out on the street.

As a red-blooded Aussie male, I'm extremely offended at being compared to such a prissy, weak, rotting-meat-on-the-street-eating animal. Maybe something more masculine like a rhinoceros or a Great White Shark would have been a more thoughtful comparison on the Sheik's part ... even a "big cat" such as a lion wouldn't be so bad.

But a feral, flea-bitten moggy eating uncovered meat on the street ! This guy should aim to improve on his metaphors.

Thursday, November 02, 2006

The SDS-PAGE Hall of Shame

For the uninitiated .. SDS-PAGE is a method that biochemists use to separate mixtures of proteins (and sometimes other biomolecules, like short pieces of DNA). It's a really useful technique, and most of the time it works perfectly, giving a nice little 'ladder' of bands with large proteins at the top and the smallest ones at the bottom.
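That 'ladder' is more than decoration: migration distance is roughly linear in log(molecular weight), so you can estimate an unknown band's size by fitting a standard curve to the marker bands. A minimal sketch of that calculation — the ladder values below are made up for illustration, loosely modelled on a common low-range marker:

```python
import math

# Hypothetical marker ladder: (molecular weight in kDa, migration distance in cm).
ladder = [(97.0, 1.2), (66.0, 2.0), (45.0, 3.1),
          (30.0, 4.4), (20.1, 5.8), (14.4, 7.0)]

def fit_log_linear(standards):
    """Least-squares fit of log10(MW) against migration distance."""
    xs = [d for _, d in standards]
    ys = [math.log10(mw) for mw, _ in standards]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def estimate_mw(distance, standards):
    """Interpolate an unknown band's size (kDa) from the standard curve."""
    slope, intercept = fit_log_linear(standards)
    return 10 ** (slope * distance + intercept)

# A band that ran between the 45 and 30 kDa markers:
print(round(estimate_mw(3.7, ladder), 1))
```

In practice people often just eyeball this against the ladder, but the fit is what densitometry software does under the hood.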

Occasionally, something goes wrong ... enter the SDS-PAGE "Hall of Shame".

This is a hilarious gallery of botched SDS-PAGE gels, which doubles as a useful trouble-shooting guide.

Over the years, despite trying my best to avoid it, I've occasionally run gels which have suffered from most of these problems. This one resembles the gel I ran yesterday ... I was in a hurry and turned the voltage up too high. Normally I'd get away with it, but this time the cooling wasn't adequate. It doesn't pay to rush these things.

(For the non-scientists: SDS-PAGE is an acronym for sodium dodecyl sulphate polyacrylamide gel electrophoresis ... sorry you asked ?)

Wednesday, November 01, 2006

Amarok 1.4.4 on Ubuntu Dapper

A new version of Amarok, my favorite music player for Linux, has been released.

This version boasts numerous bug fixes, and a nice interface to the Magnatune music store. Magnatune is cool since the full-length tracks are under a Creative Commons license and are free to listen to. If you decide to support an artist you enjoy, you can buy downloads and choose how much you wish to pay. The artist splits the profits 50:50 with Magnatune, and you get uncrippled MP3/FLAC/Ogg files, which can be re-downloaded at any time if you lose them somehow. Magnatune operates like an enlightened version of a traditional record label: they only select "high quality" artists, so they don't push loads of dross from self-promoting artists that suck like the old (RIP) did. "Brad Sucks" is (non-exclusively) on the Magnatune label, but his music doesn't suck.

Hopefully in the future Amarok will include some generic API to interface with other enlightened music stores and repositories of Creative Commons music, so that Magnatune doesn't get accused of monopolising :). For instance, I'd like to be able to add say, ccMixter and maybe IUMA in addition to Magnatune. An open web services API for music stores would make this possible, and while I haven't looked "under the hood" of the new Amarok-Magnatune browsing feature yet, I suspect this is what they have already created.

Anyway, there doesn't seem to be a backported version of Amarok 1.4.4 in the Ubuntu / Kubuntu Dapper repositories (yet). There are some Edgy Eft packages, but I don't want to upgrade to Edgy at the moment.

Instead, I've compiled my own and have made some deb packages, using the official deb source packages. I haven't tested this version heavily yet, but it seems to work. I had to override one dependency, since it complained that the Dapper "Common Debian Build System" (cdbs package) was not recent enough .... hopefully this was a safe thing to do.
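For anyone who'd rather reproduce the build than trust my debs, the steps went roughly like this. This is a sketch from memory — `apt-get build-dep` and `dpkg-buildpackage` are the standard tools, but you'll still need to hand-edit debian/control to relax the cdbs version check as described above, and the guard just keeps the sketch inert if the source tree isn't actually unpacked:

```shell
#!/bin/sh
# Rough rebuild sketch; assumes you've already fetched the official source
# package (e.g. with 'apt-get source amarok' against an Edgy deb-src line).
set -e
if [ -d amarok-1.4.4 ]; then            # only proceed if the source tree exists
    cd amarok-1.4.4
    # Edit debian/control here to relax the cdbs version requirement, then:
    sudo apt-get build-dep amarok       # install the build dependencies
    dpkg-buildpackage -rfakeroot -b -uc # build unsigned binary .debs into ..
fi
```
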

You can download my packaged versions here:

amarok_1.4.4-0ubuntu1_i386.deb (link fixed .. Thanks victor !!)



Install them by typing:

$ sudo dpkg -i amarok_1.4.4-0ubuntu1_i386.deb amarok-xine_1.4.4-0ubuntu1_i386.deb amarok-engines_1.4.4-0ubuntu1_i386.deb
Yah, I should probably GPG sign these and try to get them included in Dapper backports or something .... but no time to do the job properly at the moment.

Update - if you have trouble with some missing dependencies, this may help:

$ sudo apt-get install ruby python-qt3 kdelibs4c2a libifp4 libnjb5 libpq4 libqt3-mt libtunepimp3 libvisual-0.4-0 libxine-main1

Hopefully that catches most of the dependencies that are likely to be missing, particularly for those running Ubuntu Dapper and not Kubuntu Dapper.

Friday, October 27, 2006

Firefox 2.0 installation and tweaks

I guess you've heard ... Firefox 2.0 is out.

I resisted running the earlier Firefox 2.0 release candidates, but now that the official 2.0 release is out, I thought I'd give it a go. In reality, there are no dramatically new features between the 1.5.x and 2.0 releases, but I was more interested in the claim that 2.0 was faster and more stable than the (already pretty good) 1.5.x versions. And for the record, I'd say it is faster and more stable in my hands.

I thought I'd give a quick rundown of the installation and some tweaks, if anything for my own future reference.

I'm running Ubuntu Dapper 6.06, and since Firefox 2.0 hasn't been backported to the Ubuntu Dapper 6.06 repositories yet (and may never be), and I'm not prepared to upgrade to Edgy Eft 6.10 (yet), I had to find another option for getting my Firefox 2.0 goodness.

After backing up my ~/.mozilla directory ($ cp -r ~/.mozilla ~/.mozilla.), I installed Swiftfox, a processor-type optimized build of Firefox (downloaded the appropriate deb and ran $ sudo dpkg -i swiftfox_2.0-1_pentium-m.deb). After installing, 'Swiftfox' was added under 'Internet' in the Applications menu. As far as I can tell, apart from the compile-time optimisations and changing the word 'Firefox' to 'Swiftfox' in a few places, there is no other difference to the vanilla Mozilla Firefox 2.0 releases.

So far, it's been working nicely, but there were a few little steps to migrate my old Firefox config and extensions to Swiftfox 2.0.

Rescuing disabled extensions:
Most of my Firefox configuration was carried over correctly, however a few extensions that have not (and may never be) updated for Firefox/Swiftfox 2.0 were automatically disabled.
The trick around this is to install the 'Nightly Tester Tools' extension. Once you have restarted Swiftfox/Firefox, you can go to Tools->Add-ons and then right-click on any disabled Extension and select 'Make compatible'. I'd suggest doing it one extension at a time, restarting and testing, since things can go stupid if an extension is truly not compatible.

Many extensions work fine, despite not being designed for anything higher than Firefox 1.5.x (eg NeedleSearch and SwitchProxy to name a few). However, not every extension behaved correctly when forced to run (eg. TabMixPlus), and so after having some strange behaviour, I re-disabled those. A lot of extensions tend to be fixed and updated a week or two after a major Firefox release, so there is hope that the broken ones will become available for Firefox 2.0 soon.

I also used this as an opportunity to disable or uninstall any extension I wasn't using. I find it often makes Firefox run faster, use less memory, (leak less memory :P), and in a very small way it makes your browser more secure by not having lots of random unmaintained extensions hanging around.

Here are a few things I like to tweak in a new Firefox installation, and some brand new options to tweak in Firefox/Swiftfox 2.0. (All these can be accessed by typing about:config in the address bar, and many were pinched from here).

Firefox 2.0 already restores all tabs and forms after a crash, but it can also be set to restore tabs back to their original state after a normal shutdown. This is about the only feature of the TabMixPlus extension that mattered to me ... it's great that it is now part of Firefox proper !

To enable auto-restore all the time, change: browser.startup.page = 3

(this one can also be found under Edit->Preferences->Main->When Swiftfox starts->"Show my tabs and windows from last time")

You can fit a few more visible tabs along the top by setting:

browser.tabs.tabMinWidth = 75

I hate those stupid close buttons on the tabs; I'm forever closing tabs accidentally. Technically, it seems like putting the close button on the tab is actually better user interface design, but old habits die hard, so I put it back on the far right hand side in the style of Firefox 1.5.x with:

browser.tabs.closeButtons = 3

I got rid of the 'Go' button at the end of the url bar. I don't need it, since there is a big, highly accessible 'Go' button on my keyboard already, labelled 'Enter'.

browser.urlbar.hideGoButton = true

I also use some HTTP pipelining and rendering tweaks. Apparently pipelining can break some sites such as Myspace ... but in the case of Myspace, how could you tell if the site was broken or not ?

network.http.pipelining = true
network.http.proxy.pipelining = true
network.http.pipelining.maxrequests = 8
(don't set maxrequests higher, Firefox won't allow more anyway since it adheres to the HTTP spec)
nglayout.initialpaint.delay = 0
(may need to experiment with this one)

I'm a cookie snob, so I disable third-party cookies (which doesn't have a GUI configuration option in Firefox 2.0 like it did in 1.5.x ... it's gotta be some conspiracy):

network.cookie.cookieBehavior = 1

Finally, turn off link prefetching. This is a 'feature' whereby the browser follows links which authors have marked to be prefetched in the current page, potentially wasting your bandwidth and CPU time on the chance that you will click on one of the pages that gets "precached". With it turned on, prefetched pages are likely to load much faster. Personally, I don't trust that it won't be abused by some page authors and I like to have more control of what my browser fetches. So, without going off into a paranoid rant ... I turn it off.

network.prefetch-next = false
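All of the above can be consolidated into a user.js file in your profile directory so the tweaks survive upgrades and profile fiddling. This is my own consolidation of the prefs listed in this post (the `browser.startup.page` name is the pref behind the "Show my tabs and windows from last time" option; double-check the names in about:config before trusting me):

```javascript
// user.js -- place in your Firefox/Swiftfox profile directory
// (e.g. ~/.mozilla/firefox/<profile>/user.js); applied at every startup.
user_pref("browser.startup.page", 3);                // restore tabs after a normal shutdown
user_pref("browser.tabs.tabMinWidth", 75);           // fit more tabs across the tab bar
user_pref("browser.tabs.closeButtons", 3);           // single close button, 1.5.x style
user_pref("browser.urlbar.hideGoButton", true);      // the 'Enter' key is my Go button
user_pref("network.http.pipelining", true);
user_pref("network.http.proxy.pipelining", true);
user_pref("network.http.pipelining.maxrequests", 8); // 8 is the cap anyway
user_pref("nglayout.initialpaint.delay", 0);         // may need experimenting
user_pref("network.cookie.cookieBehavior", 1);       // no third-party cookies
user_pref("network.prefetch-next", false);           // no link prefetching
```
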

Well, that's enough tweaking for today. I'll leave the obligatory "Must have Firefox Extensions" post for another day.

Saturday, October 21, 2006

Bioinformatics data (non-)formats

(spurred on by my own comment here)

Anyone know if the Clustal alignment file format (eg ClustalW output) has any strict definition somewhere ?

Some Googling suggests it has never been "formally" described .. eg, from the ClustalX help:

"CLUSTAL format output is a self explanatory alignment format. It shows the sequences aligned in blocks. It can be read in again at a later date to (for example) calculate a phylogenetic tree or add a new sequence with a profile alignment."
Well, it is fairly self explanatory, and as a result there are lots of parsers around for Clustal format alignment data, and lots of programs that claim to output alignments in "Clustal format". I say claim, since many programs output Clustal alignments with different headers to the original ClustalW program (eg "MUSCLE" instead of "CLUSTAL") ... and some parsers don't handle that very gracefully (eg Biopython's Bio.Clustalw).

Unfortunately, these 'pseudo-Clustal' formats aren't going away, and so it is probably up to the parsers to be a little more flexible. Fortunately, the variation is usually only in the header on the first line of the file, so it should be trivial to fix the Biopython parser so that it is more forgiving. One idea would be to simply add an optional keyword flag like "ignore_header = True" to the Bio.Clustalw.parse_file() function. This way, something like:
alignment = Bio.Clustalw.parse_file(my_muscle_align_file, alphabet=IUPAC.protein, ignore_header=True)
should happily slurp up most variations on the Clustal format.
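To make the idea concrete, here's a minimal header-tolerant parser in plain Python — this is my own sketch of the behaviour I'm proposing, not the actual Bio.Clustalw implementation, and the flag is spelled `strict_header` rather than the `ignore_header` suggested above so the lenient behaviour is the default:

```python
import re

def parse_clustal(text, strict_header=False):
    """Parse a Clustal-style alignment into {seq_id: sequence}.

    By default the first line is accepted whatever program name it carries
    (CLUSTAL, MUSCLE, ...); set strict_header=True to demand 'CLUSTAL'.
    """
    lines = text.splitlines()
    if not lines:
        raise ValueError("empty alignment")
    if strict_header and not lines[0].startswith("CLUSTAL"):
        raise ValueError("not a CLUSTAL header: %r" % lines[0])
    seqs, order = {}, []
    for line in lines[1:]:
        # Sequence lines: identifier, whitespace, residues. Conservation lines
        # start with whitespace (only '*', ':', '.'), so they won't match.
        m = re.match(r"^(\S+)\s+([A-Za-z\-\.]+)", line)
        if m:
            name, chunk = m.groups()
            if name not in seqs:
                order.append(name)
                seqs[name] = ""
            seqs[name] += chunk
    return {name: seqs[name] for name in order}

aln = parse_clustal("MUSCLE (3.6) multiple sequence alignment\n\n"
                    "seqA  MKV-LT\nseqB  MKVALT\n      ** .**\n")
print(aln["seqA"])  # MKV-LT
```

A real patch would of course need to cope with sequence numbering columns and interleaved blocks, but the header tolerance is the part Bio.Clustalw is missing.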

Eventually I'll get this to the Biopython mailing list (I'll probably write a proper patch first).

Friday, October 06, 2006

Ways of seeing the world ...

As blogged by several others ... the Science Magazine Science and Engineering Visualization Challenge winners have been announced.

I'm a big fan of innovative visualization ... sometimes it can be the difference between seeing something meaningful in data, or just seeing noise. I was initially disappointed at the large number of finalists that are purely 'educational' in nature, rather than providing novel representations of 'raw' data. But the National Science Foundation site explains it: "The spirit of the competition is for communicating science, engineering and technology for education and journalistic purposes.". It's important to have these types of events pitched so that the 'general public' (i.e. non-scientists, or scientists of vastly different fields) can get something out of it ... after all, they are often indirectly providing the funds for a lot of the research, and in some cases a cool image is all they get for their tax dollars. Nonetheless, I'd like to see a competition dedicated to innovative visualization of new experimental or statistical results, with no opening for purely 'textbook' style educational compositions (I bet there's one or two out there ... comments anyone ?).

Also, congratulations go to one of the (tied) 1st place winners in the non-interactive multimedia section, Drew Berry and François Tétaz at The Walter and Eliza Hall Institute (WEHI) and Jeremy Pickett-Heaps at the University of Melbourne. It's nice to see some local Aussies getting some recognition.

Friday, September 29, 2006

Combio 2006, last day roundup

Yesterday I breezed into Brisbane for the last day of the Combio 2006 meeting, to catch some talks, and make a showing to accept an award from the ASBMB.

Neil Saunders has been posting summaries of this meeting in Brisbane on his blog, so I thought I'd give my take on the last day too.

Here are my highlights:

In David Clapham's talk on transient receptor potential (TRP) ion channels, I learnt that menthol feels cold because it binds to and activates a TRP channel involved in cold sensing. Think about that next time you taste that cool minty freshness. (I woke up at 4 am to fly to Brisbane. The brain wasn't really kicking over just yet).

In the "Molecular Basis of Disease and Drug Design" session, K. Krause gave a very honest and entertaining talk on what he termed his "Night Science". ("Day Science" is the stuff that works out nicely, shows logical progression with no nasty inconsistencies or loose ends and gets talked about at plenary lectures. "Night Science" is the stuff that doesn't work out as well as we'd like ... it's confusing, there are loose ends and inconsistencies, despite carefully doing all appropriate controls. Not to be confused with "Bad Science"). Krause and his group were unlucky enough to find that a lead compound discovered through an in silico screen, which initially appeared to be a great inhibitor of alanine racemase, turned out to in fact be a potent inhibitor of another enzyme in their coupled assay. It wasn't inhibiting their target well at all (doh!).

There were actually a few examples of somewhat disturbing results from in silico screens in this session, and I've seen similar examples a few times before. Researchers do an in silico screen, and find some top-ranking hits, one or two of which are also good inhibitors in an assay. The co-crystal structure is solved, and reveals that the compound is not actually binding in anything like the conformation that the computational docking predicted (sometimes not even the same site). What is going on here ? Is it just the fact that in twenty random compounds one will turn out to be a weak inhibitor ? Unlikely, since then high-throughput real-world screens would have a much higher hit rate. Is it that the computational docking is half right, fitting one high-affinity fragment of the compound well, while the other non-binding or weakly binding half doesn't matter ? Probably more likely, but it still doesn't explain the cases where the compound binds in a completely unpredicted site. Food for thought: maybe many docking scoring functions for small molecules are just good at selecting generally sticky molecules ... (I don't do this kind of work directly, so I'm really an ignoramus on the issue).

I also went to the "Cancer - Emerging Drug Targets" session. Andrew Scott from the Ludwig Institute for Cancer Research presented some really encouraging results of early clinical trials for an EGFR antibody, and Michelle Haber of the Children's Cancer Institute Australia presented some results from two cell-based assays, where 'high-throughput' screens have identified some inhibitors of the N-myc oncogene, and a drug efflux pump (MRP) inhibitor. I'd never really thought about it, but apparently those pesky cancer cells upregulate this efflux channel and actively pump out anti-cancer drugs, in a similar way to some parasites that become multi-drug resistant.

In the final plenary lecture, Nick Proudfoot told us about his work on transcriptional termination. It's still too early for the textbooks, but it looks like transcriptional terminators bind at the termination site and near the promoter regions in a lot of cases, turning genes into physical 'loops'. Whether this helps the RNA polymerase jump from the end of a gene straight back to the start to make the next mRNA transcript is still not proven, but it's an attractive model.

Combio is always a bit of an eclectic mix, but if you take it in the right frame of mind it can be good fun, and a nice way to broaden the scientific horizons a little. Needless to say, I slept like a log after all that.

Blogger templates ...

So, I've been fiddling with the template of this blog for the last couple of weeks. I think I've stabilised on a look that I'm happy enough with ... so now it's time to start posting for real.

Let's see if I can keep this up (and keep it interesting+useful).