World Community Grid Forums
Thread Status: Active | Total posts in this thread: 2
Former Member (Cruncher) | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
As I understand it, this project has worked on, or is working on, a first-principles (a priori) approach to the protein folding problem. In other words, various possibilities are tried to find minimized energy states and so forth, with knowledge of the rules but not the 3D structure of the protein. I know several NMR specialists who work the opposite way: they start with protein samples (which can take quite a while to collect), put them into a giant NMR system, and attempt to determine the structure that way. My question is, how does your research intersect with theirs, and vice versa? Does your solution make their work obsolete? Probably not. If not, why not? At some point, do the NMR people and the ISB people have to collaborate to find differences or mistakes? What is the bigger picture of the proteome problem? There seem to be a lot of teams still taking many different approaches.
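For illustration only (this is not the project's actual code), the first-principles approach described above amounts to searching conformation space for low-energy states using only a scoring function, never the known 3D structure. A minimal sketch, with a hypothetical `toy_energy` function standing in for a real force field:

```python
import math
import random

def toy_energy(angles):
    # Hypothetical stand-in for a real molecular force field:
    # penalize deviation from an arbitrary set of dihedral angles.
    # A real scoring function would model bonds, sterics, solvation, etc.
    native = [0.5, -1.2, 2.0]
    return sum((a - n) ** 2 for a, n in zip(angles, native))

def sample_conformations(n_trials=2000, seed=42):
    # Try many random conformations and keep the lowest-energy one,
    # knowing only the "rules" (the energy function), not the structure.
    rng = random.Random(seed)
    best_angles, best_energy = None, math.inf
    for _ in range(n_trials):
        angles = [rng.uniform(-math.pi, math.pi) for _ in range(3)]
        energy = toy_energy(angles)
        if energy < best_energy:
            best_angles, best_energy = angles, energy
    return best_angles, best_energy

if __name__ == "__main__":
    angles, energy = sample_conformations()
    print(f"lowest energy found: {energy:.3f}")
```

Real projects use far more sophisticated sampling (fragment assembly, Monte Carlo moves, gradient-based minimization), but the principle is the same: the computer proposes candidate structures and ranks them by energy, which is exactly what NMR or crystallography would later confirm or refute.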
Former Member (Cruncher) | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
The ISB has a huge NMR section plus crystallographers (if my spelling is correct). Computers are cheap, as witnessed by the 90 genomes we have done. The point is that our results are cheap, but they are useful only if they are actually used by scientists. Eventually most (or even all) will be replaced by laboratory measurements, but that might not happen for decades. Of course, the problem with lab measurements is that they are only good for the particular environment that the lab equipment requires the protein to be in. Computers might prove more flexible in the long run, as long as lab measurements are used to verify the programs.