I know very little about computational genetics/biology but it sounds interesting.
On the other hand, much bioinformatics software solves a specific scientific question and is usually written by people with a mostly non-computational background. They use higher-level languages such as Python/Perl/R, and often don't have the expertise or time to implement their code for GPUs.
However, now that machine learning and deep neural network approaches are being picked up by the field, the workloads might change, and there are also frameworks that make it easier to leverage GPUs (TensorFlow, etc.)
That's an interesting thought; has anyone ever attempted to get 'regular' programmers interested in this stuff as a 'game'/code-golf kind of thing?
(Too many) years ago, one of the programming channels I was active in got distracted for three weeks while everyone tried to come up with the fastest way to search a 10 MB random string for a substring, not in the theoretical sense but in the actual how-fast-can-this-be-done sense. That was the point I found out that Delphi (which was my tool of choice at the time) had relatively slow string functions in its 'standard' library, and I ended up writing KMP in assembly or something equally insane. I got my ass handed to me by someone who'd written a bunch of books on C, but eh, it was damn good fun. It was also one of the first times I realized just how fast machines (back then) had gotten, and just how slow 'standard' (but very flexible) libraries could be.
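For anyone who hasn't run into it, the KMP (Knuth-Morris-Pratt) trick from the story above can be sketched in a few lines of Python (the original war story involved Delphi/assembly; this is just the algorithm, not a tuned implementation):

```python
def kmp_search(text, pattern):
    """Knuth-Morris-Pratt: return the first index of pattern in text, or -1.

    Precomputes a failure table so the text pointer never moves backwards,
    giving O(len(text) + len(pattern)) worst-case time, unlike naive search.
    """
    if not pattern:
        return 0
    # fail[i] = length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, reusing the table on mismatches instead of
    # backing up the text pointer.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1
```

The irony, of course, is that a well-optimized naive search in a fast standard library often beats a high-level KMP in practice on random data, which was rather the point of the exercise.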
Obviously the total scope of rewriting researchers' code would probably be far, far beyond that, but if they could define the parts of their code they know are slow, along with some sample data, I know a few programmers who would find that an interesting challenge.
Thanks for the response.
(More than a decade ago, I struggled to build a Beowulf cluster and barely succeeded; I'm just amazed at how far both the hardware and the software tools have come...)
In other areas of comp bio, though, I think GPUs are finding use: protein folding, molecular dynamics, and also super-resolution microscopy (STORM and the like). I think GPUs will become increasingly important.
Also, whole cell simulations?
You are also right that some comp bio areas (cryo-EM, protein folding, molecular dynamics) are well suited to GPUs.
One of the nice things about HN is that you get to look outside your own bubble (I mostly do line-of-business/SME work, so this stuff isn't just outside my wheelhouse, it's on the other side of the ocean).
GPUs excel at problems where you can apply exactly the same logic to lots of data in parallel. CPUs handle branching cases, where each operation requires a lot of decisions, much better.
Sufficiently large FPGA chips could accelerate certain parts of the workflow, if not the whole thing, since they're extremely good at branching in parallel. This is why early FPGA Bitcoin implementations blew the doors off any GPU solution: each round of the SHA hashing process can be pipelined, running in parallel on sequentially ordered data if you organize it correctly.
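For context on what that workload actually is: Bitcoin's proof of work is SHA-256 applied twice to a block header, varying a nonce until the digest falls below a target. A toy Python sketch (function names and the simplified "leading zero bytes" difficulty are mine; real miners hash the 80-byte header, and FPGA/ASIC designs unroll the 64 SHA-256 rounds into a hardware pipeline so one nonce's result emerges per clock cycle):

```python
import hashlib
import struct

def btc_hash(header_bytes):
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(header_bytes).digest()).digest()

def mine_toy(prefix, difficulty_zero_bytes=1):
    """Toy miner: append a 4-byte little-endian nonce and increment it
    until the double-SHA256 digest starts with the required number of
    zero bytes.  Each trial is independent, which is exactly why the
    work parallelizes so well across hardware."""
    nonce = 0
    target = b"\x00" * difficulty_zero_bytes
    while True:
        header = prefix + struct.pack("<I", nonce)
        if btc_hash(header).startswith(target):
            return nonce
        nonce += 1
```

The point isn't the Python (which is hopelessly slow for this), but that every nonce trial runs the identical fixed sequence of rounds on independent data, which suits a deep hardware pipeline perfectly.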
FPGAs run hot, don't have many transistors, have limited clock rates, and are a pain to program.
So yeah, a "sufficiently large" chip, a "sufficiently fast" clock, and a "sufficiently well-written" app could theoretically do well. The problem is that in the real world they aren't, and developers aren't targeting them.
Your user name: a fan of the cre-lox system, or the enzyme itself?
Cool uid!
In my past life, I've used flp/frt & cre/lox; and studied mismatch repair enzymes. And topoisomerases.. :)