Why Computers Are Only Just Emerging as a Tool for Scientists

Surprise! Only a small percentage of scientists use computers for their research.

Give me a moment, I can explain.

It turns out that even though a lot of folks carry a powerful computer in their pockets (their smartphone), the use of computers in basic scientific research has so far been fairly limited. Of course, scientists use computers to collate data in databases and spreadsheets, to write reports, and so on, but I am referring to the use of computers to simulate scientific phenomena.

Experiments have been the foundation of science since time immemorial. In the last few thousand years, scientists started laying theoretical foundations to explain what experimentation showed them or to explain observed phenomena.


About 50 years ago, scientists used computers for the first time to simulate physical phenomena. They started with nuclear reaction simulations and weather simulations – two physical phenomena that were difficult to perform experiments on or to observe closely.

For decades, national governments invested in ever more powerful supercomputers to simulate physical phenomena and chemical reactions, but computerized simulations remained limited to a small percentage of scientists who had access to these big supercomputers.

Computing as the third pillar of science

In the last few years, something dramatic happened that changed science forever. The advent of inexpensive and very powerful desktop supercomputers based on GPU accelerators has given every scientist the ability to run simulations that are detailed enough to mimic real physical and chemical phenomena. A GPU-accelerated PC today is as powerful as the fastest supercomputer just 10-12 years ago.

This performance addresses one of the key reasons scientists didn’t use computer simulations broadly before: simulations were not detailed enough to mimic real “wet lab” experiments. Higher performance means more detailed simulations with much larger datasets that account for more environmental conditions.

For example, computational biochemists can now take a library of known drug compounds, say 1 million drug candidates, and simulate which of them suppress a target virus or bacterium. This can narrow the pool down to, say, 1,000 promising candidates, which biochemists can then take to a wet lab and test in physical experiments to study their efficacy against the pathogen. These simulations were either not possible before or were limited to those with access to the world’s most powerful supercomputers.
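As a toy illustration of the screening step described above, the sketch below scores a large candidate library with a stand-in scoring function and keeps only the top hits for wet-lab follow-up. The function `simulated_binding_score` is hypothetical: in a real pipeline it would be an expensive GPU-accelerated docking or molecular-dynamics simulation, not a random number.

```python
import random

def simulated_binding_score(candidate_id):
    # Stand-in for an expensive GPU docking/MD simulation; here we
    # just return a pseudo-random score in [0, 1) for illustration.
    return random.random()

def screen(library, keep=1000):
    # Score every candidate, then keep the strongest predicted binders.
    scored = [(simulated_binding_score(c), c) for c in library]
    scored.sort(reverse=True)  # highest scores first
    return [c for _, c in scored[:keep]]

# 1 million hypothetical candidates, narrowed to 1,000 for the wet lab.
library = [f"compound-{i}" for i in range(1_000_000)]
hits = screen(library, keep=1000)
print(len(hits))  # 1000
```

The point is the funnel shape, not the scoring: the simulation does the cheap, massively parallel first pass, and only the survivors reach the expensive physical experiments.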

Scientific research is inherently iterative. Scientists have a theory or notion. They run an experiment to test it, look at the results, modify the experiment, run it again, and iterate until they reach an insight. Now, with powerful desktop GPU supercomputers, scientists can simulate their hypotheses very, very fast.

I believe that GPU-accelerated computers have fundamentally accelerated the pace of scientific research, by making computing the third pillar of scientific research along with experimentation and theory.

If you want to learn more about how GPUs are revolutionizing scientific discovery, be sure to attend our GPU Technology Conference this March. 

Tell us how computing is becoming the third pillar of science in your field in the comments section below.


Similar Stories

  • Berhanuz

    “A GPU-accelerated PC today is as powerful as the fastest supercomputer just 10-12 years ago.” This is absolutely true, and design engineers can get feedback on their concept designs within hours and streamline design directions accordingly. The remaining challenge is creating such awareness across industries.

  • Sumit Gupta

    That’s right, Berhanuz. When I was a graduate student, I used to run jobs on servers in a cluster I built and maintained myself. These were hardware chip simulation jobs.

    Now I can get more performance from a desktop PC than the entire 16 node cluster we had back then!

    Awareness is key, especially in industry. Research folks always pick up these advances fast. But if HPC gets adopted widely in industry, we can have a true impact on the economy.

  • http://www.facebook.com/profile.php?id=1046783461 Yo Ann

    Even though you are speaking of more scientific stuff, I experience the same thing in illustration. My workflow has always been the same: get a concept/idea and detail it in writing, then do some mental visualization, then go to paper and make a ton of thumbnails until I find the scenery exactly as I imagined it.
    Now, I instead go directly into 3D software after the mental visualization. Why? Thanks to the quality of real-time rendering, the power of modeling, and the quality of lighting and cast shadows, I can test many more possibilities than I could by drawing. I can change the camera and the focal length in no time, which would otherwise have meant a new drawing from scratch.
    This is productivity. Of course I still love to draw on paper; it is just another experience.
    When I saw the possibilities of NVIDIA Maximus, I suddenly got excited. I want all my software to be heavily GPU-accelerated to improve this experience even more. I want my Viewport 2.0 in Maya to become a Viewport 7.0 where I can add a ton of real-time lighting effects and state-of-the-art materials.
    So yes, even in art the revolution is in progress. I only hope that software developers will provide it as fast as possible. (Go call ZBrush, please ^^)

  • Sagar Rawal

    This is absolutely true…and thanks to the low cost of entry to get started with HPC, especially with the rise of Cloud Computing, adoption rates in industry should increase drastically.

  • http://computerstories.net/ Rowan Gonzalez

    Computer use can only intensify with time, I guess, also for scientists. Perhaps one day we will even become part of them entirely.

  • http://www.facebook.com/people/Sergey-Shevchenko/100002087985653 Sergey Shevchenko

    For example, I once did some calculations graphically and saw that electrons can be trapped inside the nuclei of atoms. Protons are heavy, so they can form a solid wall around the electrons. And when these walls break down, energy is released and a new, smaller nucleus is formed. Without the program, I would not have known that.

  • Ezrad Lionel

    Tell me more about how Moore’s cores fascinate you.

  • http://www.facebook.com/mihir.khandekar73 Mihir Khandekar

    The thing about computers is that they provide us with a virtual reality. No matter how serious the consequences of a given experiment might be, if you can ask a computer to simulate the experiment, the worst result you will get is FAIL.
    This gives them an edge over physical experiments, since you can conduct as many as you possibly can.

  • Atanu Roy

    I wonder whether your fast chips at lower cost will bridge the ‘digital divide’ and have a big social impact.

  • http://twitter.com/Elia_Attardo Elia Attardo

    In specific research areas like computational electromagnetics, the use of supercomputers is mandatory. Modeling an antenna, or even studying the scattering properties of an object, can be nearly impossible without the right tools. In this field, I’m seeing increasing interest in the use of GPUs to accelerate these problems. A couple of years ago, you would have had to wait days before getting the results of your simulation.