
Saturday, October 27, 2007

Qutemol using Cedega

Pawel over at Freelancing Science recently highlighted Qutemol, a nice-looking molecular viewer that does real-time ambient occlusion rendering. There isn't any official Linux version, but I found that the Windows version runs okay on Linux using Cedega (a version of Wine that has better DirectX support, especially for games). Since Cedega is based on the open source Wine code, you can compile your own command line version ... but it's a good idea to buy a maintenance subscription from Transgaming and support its further development, if you can afford it.

Here's a screenshot of Qutemol running under Cedega on Ubuntu Gutsy Gibbon, just to prove it.


No, it's not Photoshopped ... (or GIMPed) ... :)

Wednesday, September 12, 2007

ARIA version 2.2 released

I don't usually post about NMR (Nuclear Magnetic Resonance) and structural biology related stuff, but I've always intended to. In this post I'm pulling out all the stops on specialist lingo and assumed background knowledge, so hopefully it isn't too incomprehensible to the non-structural biology crowd :).

ARIA version 2.2 has been released in the last few weeks. ARIA is an automated NOE assignment and structure calculation package, which (in theory) takes some of the pain and slowness out of producing protein (and DNA and/or RNA) structures from Nuclear Magnetic Resonance data. I'll say up front: I haven't tried this version yet, but some of the improvements look exciting.

Here are two new features worth noting ... followed by what I think it all means:


  • The assignment method has been improved with the introduction of a network-anchoring analysis (Herrmann et al., 2002) for filtering of the initial assignments.
  • The integration of the CCPN has been completed. The imported CCPN distance constraints lists can enter the ARIA process for calibration, violation analysis and network-anchoring analysis. The final constraint lists can be exported as well.

In the past I have done some quick and dirty tests comparing the quality of protein structures produced using ARIA 2.1 vs. Peter Güntert's CYANA 1.0.7 and 2.1, using the exact same NMR peak input lists (with slightly noisy data containing a number of incorrectly picked peaks). CYANA always won hands down, assigning more NOE crosspeaks correctly and producing an ensemble of model structures with much lower RMSD and generally better protein structure quality scores (i.e. using pretty much any decent pairwise pseudo-energy potential, and Procheck). Also, ARIA produced 'knotted' structures which were almost certainly incorrect, while CYANA did not. Other postdocs and students in my former lab had done similar independent tests with ARIA 1.2 vs. CYANA 1.0.7, and had come to similar conclusions.

The disclaimer: it should be noted here that assessing the quality of an ensemble of NMR structure coordinates can be problematic, and is really the topic of another long post (and probably tens if not hundreds of peer-reviewed journal articles). So saying "CYANA version X is better than ARIA version X" based on the RMSD of the final calculated ensemble is a bit unfair ... in fact, using the RMSD of the ensemble to gauge structure quality is just plain wrong in this context. In my (unpublished, non-peer-reviewed) tests, it is possible that ARIA was producing high-RMSD but essentially 'correct' structures, while CYANA was producing tightly defined but 'incorrect' structures, but I doubt it. The gap between the output of each program was wide enough to suggest that under real-world conditions, where the input peak list contained a number of 'noise' peaks, ARIA was failing to give a set of consistent solutions (probably due to a lack of NOE assignments), while CYANA was giving a set of tightly defined structures (which may or may not have represented the 'correct' solution). Other evaluations (protein structure quality measures, Procheck, comparison to known structures of similar proteins) indicated that the CYANA structures were not grossly 'incorrect', so I'd say CYANA was just giving a better defined (i.e. lower ensemble RMSD) set of plausible solutions.
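To make concrete what "RMSD of the ensemble" actually measures, here's a minimal Python sketch (names and toy coordinates are mine, purely for illustration, and no superposition step is performed). The key point: mean pairwise RMSD measures how tightly the models cluster (precision), and says nothing about whether they are correct (accuracy).

```python
import math

def rmsd(a, b):
    """Plain coordinate RMSD between two equal-length lists of (x, y, z)
    points. No superposition is done, so models are assumed pre-aligned."""
    assert len(a) == len(b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(a, b))
    return math.sqrt(sq / len(a))

def mean_pairwise_rmsd(ensemble):
    """Average RMSD over all model pairs: a measure of how tightly the
    ensemble is defined, not of whether the structures are right."""
    n = len(ensemble)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(rmsd(ensemble[i], ensemble[j]) for i, j in pairs) / len(pairs)

# A toy "ensemble" of two 3-atom models that differ by a small shift:
tight = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
         [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0), (0.1, 1.0, 0.0)]]
print(mean_pairwise_rmsd(tight))  # small value: a tightly defined ensemble
```

A systematically wrong ensemble could score just as "tightly" here, which is exactly why low ensemble RMSD alone proves nothing about correctness.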

My gut feeling is that ARIA 2.2 will perform much better than past versions, due to one key feature that has been 'borrowed' from CYANA: the introduction of a network-anchoring analysis. In a nutshell, network-anchoring scores weight distance constraints (or NOE assignments) based on how 'connected' each constraint is within the graph formed by the other constraints. In effect, a single isolated constraint pulling two residues on opposite sides of a protein together is down-weighted, while if multiple constraints link those residues (or their neighboring residues), those constraints are considered more trustworthy and hence weighted more heavily. For better or worse (usually better), this score simulates what a human NMR spectroscopist would do when assigning NOE crosspeaks manually ... usually two residues in contact will show multiple NOE crosspeaks connecting them, involving several different nuclei, whereas a single lonely NOE between two nuclei that are distant from each other in the primary protein sequence is heavily scrutinized and regarded with suspicion, since it is likely to be mis-assigned. I'm very keen to test ARIA 2.2 on my old data set and see if I'm actually right (I may be able to try it with network anchoring turned on and off, and see just what sort of contribution that score is making).
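The idea can be sketched with a toy scoring function (this is my own illustrative simplification, not the actual Herrmann et al. algorithm or ARIA's implementation): score each constraint between residues i and j by counting how many other constraints link the same residue pair or their sequence neighbours.

```python
def network_anchor_scores(constraints, window=1):
    """Toy network-anchoring score. Each constraint is a (i, j) pair of
    residue numbers. A constraint is 'supported' by another constraint
    that links residues within `window` positions of i and j. Isolated
    long-range constraints score 0; well-networked ones score high."""
    def supports(c, d):
        (i, j), (k, l) = c, d
        near = lambda a, b: abs(a - b) <= window
        # Allow either orientation of the supporting constraint.
        return (near(i, k) and near(j, l)) or (near(i, l) and near(j, k))

    return {c: sum(1 for d in constraints if d != c and supports(c, d))
            for c in constraints}

# Three mutually supporting contacts around residues 3-4 and 40-41,
# plus one lonely long-range constraint between residues 20 and 55:
constraints = [(3, 40), (4, 40), (3, 41), (20, 55)]
scores = network_anchor_scores(constraints)
print(scores)  # the clustered constraints score 2 each; (20, 55) scores 0
```

A real implementation would use these scores to weight (or reject) candidate NOE assignments during the iterative assignment/calculation cycles, rather than just printing them.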

Another completed feature, the integration between ARIA and the CCPN libraries/analysis package, should also be a big plus. I haven't used the CCPN analysis software yet, but a few years ago I wrote some code to help make CYANA and the Sparky NMR assignment program work together better. The result was functional, but very hackish (and I'm probably the only person in the world who understands how it was intended to be used, since I still haven't got around to writing any documentation. Naughty, naughty). CCPN + ARIA may turn out to be a better option for spectral analysis and structure calculation in the future than my currently preferred Sparky + CYANA combination.

I'm really itching to find a good reason to do an NMR structure project now ... back to work !!

Thursday, June 28, 2007

Google Desktop for Linux Released

Google Desktop for Linux has been officially released. It's a real, honest-to-god native Linux application, and doesn't use Wine like the Linux version of Picasa does.

I've just installed it on Ubuntu Feisty Fawn from the Google Linux software repositories, and while it has currently only indexed about 1% of my files, my initial tests suggest it is pretty slick ... a quick Ctrl-Ctrl, and up pops the search box. Apart from all the things I'd expect, like indexing the content of PDF files, directories like "/usr/man" are included on the default path list, so I presume it also looks inside man pages. One problem I've noticed so far in my very quick testing is that it seems not to follow symlinks to directories, and won't let me add them as paths to index. The effect is that my "/home/perry/documents", which is actually a symlink to a larger partition, does not get indexed unless I add it to the path list under its real path.
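The workaround, then, is to resolve the symlink to its real path before adding it to the index list. A small self-contained Python demonstration (the directory names here are invented for the demo; os.path.realpath does the same job as `readlink -f` on the command line):

```python
import os
import tempfile

# Build a throwaway directory containing a symlink, then resolve it.
base = tempfile.mkdtemp()
target = os.path.join(base, "real_documents")   # the actual directory
os.mkdir(target)
link = os.path.join(base, "documents")          # symlink pointing at it
os.symlink(target, link)

# realpath follows the symlink chain and returns the canonical path,
# which is what you'd paste into the list of paths to index.
print(os.path.realpath(link))
```

In my case, running `readlink -f /home/perry/documents` (or the Python equivalent above) gives the path on the larger partition, and adding that path gets the directory indexed.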

While there are already similar offerings for Gnome (eg Beagle) and KDE (eg Kat), my gut feeling is that Google Desktop will be my preferred option for the moment. Maybe one day we will get lucky, and Google will even make it FOSS (not holding my breath though).

Thursday, January 18, 2007

Flash Player 9 for Linux released : Quick install for Ubuntu Dapper

Adobe Flash Player 9 is finally out of beta! No more feeling like a second-class netizen on "flashy" sites!

Here's how I installed it on Ubuntu Dapper (the package is for Debian Sarge, but seems to work fine):

Download flashplugin-nonfree.

Use right-click, "Save Link As ..." and save it to /tmp.

$ cd /tmp

$ sudo dpkg -i flashplugin-nonfree_9.0.21.78.4~bpo1_i386.deb

(you'll be prompted for your password, and once you provide it, the install should happen)

You can check if it worked by typing about:plugins into the URL box in Firefox. You should see something like "Shockwave Flash 9.0 d78" on that page.

Now go view some F-F-F-Flash cartoons :) (I don't think Homestar Runner requires Flash 9, but it's the only Flash site I use on any regular basis)

Wednesday, January 10, 2007

Changing "Illustration" to "Figure" in OpenOffice Writer

I've decided to try to use OpenOffice Writer properly ... to take advantage of some of its more powerful features, rather than just using it as a text editor with formatting.

For drafting manuscripts of scientific papers, pictures/photos/illustrations etc. are usually referred to as "Figures"; however, when inserting a picture via "Insert -> Picture -> From File ...", the default behavior of OpenOffice is to use the caption "Illustration". This will not do.

From the OpenOffice Writer Guide, Chapter 8 [pdf], here is how to get it to use "Figure" by default:

• Open the "Tools -> Options -> OpenOffice.org Writer -> AutoCaption" dialog box.

• Under "Add captions automatically when inserting", select OpenOffice.org Writer Picture, and make sure its checkbox is ticked.

• In the Category drop-down list, enter the name you want used, eg, Figure, by overwriting the sequence name already in the list (it will probably say "Illustration" before you overwrite it). I also like my Figure label bold, so I selected "Strong Emphasis" from the "Character Style" drop-down box. Press OK to save the changes.

Now you can insert a picture using "Insert -> Picture -> From File ..." and the label should be "Figure", not "Illustration". The picture comes in its own frame, and you can edit the figure legend directly in the document.

Hmmm ... LaTeX is not looking so bad again ...

Friday, October 27, 2006

Firefox 2.0 installation and tweaks

I guess you've heard ... Firefox 2.0 is out.

I resisted running the earlier Firefox 2.0 release candidates, but now that the official 2.0 release is out, I thought I'd give it a go. In reality, there are no dramatically new features between the 1.5.x and 2.0 releases, but I was more interested in the claim that 2.0 was faster and more stable than the (already pretty good) 1.5.x versions. And for the record, I'd say it is faster and more stable in my hands.

I thought I'd give a quick rundown of the installation and some tweaks, if nothing else, for my own future reference.

Installation:
I'm running Ubuntu Dapper 6.06, and since Firefox 2.0 hasn't been backported to the Dapper repositories yet (and may never be), and I'm not prepared to upgrade to Edgy Eft 6.10 (yet), I had to find another option for getting my Firefox 2.0 goodness.

After backing up my ~/.mozilla directory ($ cp -r ~/.mozilla ~/.mozilla.1.5.0.7), I installed Swiftfox, a processor-optimized build of Firefox (I downloaded the appropriate deb and ran $ sudo dpkg -i swiftfox_2.0-1_pentium-m.deb). After installing, 'Swiftfox' was added under 'Internet' in the Applications menu. As far as I can tell, apart from the compile-time optimisations and changing the word 'Firefox' to 'Swiftfox' in a few places, there is no difference from the vanilla Mozilla Firefox 2.0 releases.

So far, it's been working nicely, but there were a few little steps to migrate my old Firefox 1.5.0.7 config and extensions to Swiftfox 2.0.

Rescuing disabled extensions:
Most of my Firefox configuration was carried over correctly; however, a few extensions that have not been (and may never be) updated for Firefox/Swiftfox 2.0 were automatically disabled.
The trick around this is to install the 'Nightly Tester Tools' extension. Once you have restarted Swiftfox/Firefox, you can go to Tools -> Add-ons, right-click on any disabled extension and select 'Make compatible'. I'd suggest doing it one extension at a time, restarting and testing, since things can go stupid if an extension is truly not compatible.

Many extensions work fine despite not being designed for anything higher than Firefox 1.5.x (eg NeedleSearch and SwitchProxy, to name a couple). However, not every extension behaved correctly when forced to run (eg TabMixPlus), and so after seeing some strange behaviour, I re-disabled those. A lot of extensions tend to be fixed and updated a week or two after a major Firefox release, so there is hope that the broken ones will become available for Firefox 2.0 soon.

I also used this as an opportunity to disable or uninstall any extension I wasn't using. I find it often makes Firefox run faster and use less memory (leak less memory :P), and in a very small way it makes your browser more secure by not having lots of random unmaintained extensions hanging around.

Tweaks:
Here are a few things I like to tweak in a new Firefox installation, plus some brand new options to tweak in Firefox/Swiftfox 2.0. (All of these can be accessed by typing about:config in the address bar, and many were pinched from here.)

Firefox 2.0 already restores all tabs and forms after a crash, but it can also be set to restore tabs back to their original state after a normal shutdown. This is about the only feature of the TabMixPlus extension that mattered to me ... it's great that it is now part of Firefox proper !

To enable auto-restore all the time, change:

browser.startup.page = 3

(this one can also be found under Edit->Preferences->Main->When Swiftfox starts->"Show my tabs and windows from last time")

You can fit a few more visible tabs along the top by setting:

browser.tabs.tabMinWidth = 75

I hate those stupid close buttons on the tabs; I'm forever closing tabs accidentally. Technically, it seems like putting the close button on the tab is actually better user interface design, but old habits die hard, so I put it back on the far right-hand side, in the style of Firefox 1.5.x, with:

browser.tabs.closeButtons = 3

I got rid of the 'Go' button at the end of the url bar. I don't need it, since there is a big, highly accessible 'Go' button on my keyboard already, labelled 'Enter'.

browser.urlbar.hideGoButton = true

I also use some HTTP pipelining and rendering tweaks. Apparently pipelining can break some sites such as Myspace.com ... but in the case of Myspace, how could you tell if the site was broken or not ?

network.http.pipelining = true
network.http.proxy.pipelining = true
network.http.pipelining.maxrequests = 8
(don't set maxrequests any higher; Firefox won't allow more anyway, since it adheres to the HTTP spec)
nglayout.initialpaint.delay = 0
(may need to experiment with this one)

I'm a cookie snob, so I disable third-party cookies (which doesn't have a GUI configuration option in Firefox 2.0 like it did in 1.5.x ... it's gotta be some conspiracy):

network.cookie.cookieBehavior = 1

Finally, turn off link prefetching. This is a 'feature' whereby the browser follows links that page authors have marked for prefetching, potentially wasting your bandwidth and CPU time on the chance that you will click on one of the "precached" pages. With it turned on, prefetched pages are likely to load much faster, but personally, I don't trust that it won't be abused by some page authors, and I like to have more control over what my browser fetches. So, without going off into a paranoid rant ... I turn it off.

network.prefetch-next = false
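As an aside, all of the tweaks above can be collected into a user.js file in your profile directory (something like ~/.mozilla/firefox/xxxxxxxx.default/user.js; the random prefix varies per profile), which re-applies them on every startup. A sketch of mine, mirroring the settings in this post:

```js
user_pref("browser.startup.page", 3);
user_pref("browser.tabs.tabMinWidth", 75);
user_pref("browser.tabs.closeButtons", 3);
user_pref("browser.urlbar.hideGoButton", true);
user_pref("network.http.pipelining", true);
user_pref("network.http.proxy.pipelining", true);
user_pref("network.http.pipelining.maxrequests", 8);
user_pref("nglayout.initialpaint.delay", 0);
user_pref("network.cookie.cookieBehavior", 1);
user_pref("network.prefetch-next", false);
```

Handy if, like me, you end up reinstalling or migrating profiles and don't want to click through about:config all over again.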

Well, that's enough tweaking for today. I'll leave the obligatory "Must have Firefox Extensions" post for another day.