ScienceWise - Jul/Aug 2008

Beyond the test tube

Supercomputer quantum chemistry in the twenty-first century

Chemistry is the study of whether or not substances react and what they produce if they do. Chemists know how different chemicals interact from their experiments, but what if they could predict how they react simply by running a computer program? If they could, it would reduce the amount of reagents needed and make experimental chemistry far more efficient. This dream has largely been realised over the last 50 years through major advances in computational chemistry; however, problems still remain, and work performed in the ARC Centre of Excellence for Free Radical Chemistry and Biotechnology is helping to solve them.

The reactivity of chemical compounds is governed by energies: how much energy is released or required for a reaction to take place, and how large an energy barrier lies between the reactants and the products. These energies can be calculated on a computer and used to predict how fast a reaction proceeds and whether it will occur at all. However, the energies must be determined very accurately for these predictions to have any relevance in the real world.
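As a concrete illustration (the formula itself is not given in the article, but it is the standard way of expressing the idea), the link between the barrier height and the reaction rate is captured by the well-known Arrhenius equation,

k = A \exp(-E_a / RT)

where k is the rate constant, E_a is the activation energy (the height of the barrier between reactants and products), T is the temperature, R is the gas constant and A is a pre-exponential factor. Because the barrier sits inside an exponential, an error of just a few kilojoules per mole in a computed barrier can change the predicted rate at room temperature by roughly a factor of ten.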

There are many different ways of working out these energies, but the best methods are based on quantum mechanics. As objects get smaller and smaller, the physical laws that normally govern how things behave start to break down. The “Newtonian” mechanics that describes how cricket balls, cars and bicycles move no longer holds for the electrons in molecules; they do not even have definite positions and speeds, but instead have a wave function Ψ that defines their properties.

This wave function, as well as the total energy of the molecule, is determined by solving the Schrödinger equation. This equation can only be solved exactly using pencil and paper for some very simple cases such as the hydrogen atom; everything else needs to be treated approximately using computers.
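For readers who want to see it, the time-independent Schrödinger equation can be written compactly as

\hat{H} \Psi = E \Psi

where \hat{H} is the Hamiltonian operator (which encodes the kinetic energies of the electrons and nuclei and the electrostatic forces between them), \Psi is the wave function and E is the total energy. Because \Psi for a molecule depends on the coordinates of every electron at once, exact pencil-and-paper solutions are only possible for the simplest cases, such as one-electron atoms.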

The better the approximation, the better the results. However, the most accurate quantum mechanical methods (termed "ab initio" methods) are far too time-consuming for systems with more than a few atoms, even with supercomputers. Molecules of relevance in biology or polymer science are often much, much bigger than this, and an enormous amount of research has been directed towards finding ways of doing the quantum mechanics that capture the essentials but run far faster than the full ab initio treatment. This allows chemists to predict and rationalise the products they obtain, and to determine how fast reactions occur, for large compounds that are relevant to real-world applications.
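The gulf in cost can be made concrete with a rough back-of-the-envelope estimate. The short sketch below is not from the article; it simply assumes the textbook scaling of a high-accuracy coupled-cluster calculation (cost growing roughly as the seventh power of system size) against conventional DFT (roughly the third power), to show how quickly accurate calculations become impractical as molecules grow.

# Rough illustration only: relative cost of quantum chemistry methods as
# molecules grow.  The scaling exponents are standard textbook estimates
# (coupled cluster ~N^7, conventional DFT ~N^3), not figures from the article.

def relative_cost(n_atoms, exponent, reference_atoms=5):
    """Cost relative to a small five-atom reference molecule."""
    return (n_atoms / reference_atoms) ** exponent

for n in (5, 10, 20, 40):
    cc = relative_cost(n, 7)    # high-accuracy ab initio (e.g. coupled cluster)
    dft = relative_cost(n, 3)   # approximate density functional theory
    print(f"{n:>3} atoms: ab initio ~{cc:>10,.0f}x,  DFT ~{dft:>6,.0f}x")

Doubling the number of atoms makes the DFT calculation about eight times slower, but the high-accuracy calculation more than a hundred times slower, which is why the latter runs out of steam so quickly.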

One major advance in this area is Density Functional Theory (DFT). In 1964, Hohenberg and Kohn showed that instead of solving the Schrödinger equation, the energy of a system could be obtained from the “electron density” ρ, the probability of finding an electron at each point in a molecule. This is a much simpler mathematical object than the wave function, so the calculations are intrinsically much quicker than the best of the approximate quantum-mechanical methods, and the results are, in principle, just as good. The problem is that no one knows the true relationship between the electron density and the energy. Even the way to go about approximating the true relationship is unknown. So chemists and physicists are forced to use a trial-and-error approach, designing new methods based on a series of assumptions. If a new method works, then the assumptions the method is based on are probably good ones, and the assumptions can then be refined to generate better methods.
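To give a feel for what such a calculation looks like in practice, here is a minimal sketch using the open-source PySCF package, a modern tool chosen purely for illustration; it is not mentioned in the article and is not necessarily what the researchers use. It computes the energy of a water molecule twice: once with a simple ab initio wave-function method, and once with DFT, where the only extra choice is which approximate density functional to trust.

# Minimal illustration with the open-source PySCF package (an assumption for
# this sketch, not the software described in the article).  Both calculations
# return a total energy; the DFT result depends on the chosen functional.
from pyscf import gto, scf, dft

# A water molecule, with atomic positions in Angstroms.
mol = gto.M(atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",
            basis="cc-pvdz")

# Ab initio (wave-function based): Hartree-Fock, the simplest such method.
hf = scf.RHF(mol).run()

# Density functional theory: same molecule, but the energy now comes from an
# approximate functional of the electron density (here B3LYP).
ks = dft.RKS(mol)
ks.xc = "b3lyp"
ks.run()

print(f"Hartree-Fock energy: {hf.e_tot:.6f} Hartree")
print(f"B3LYP (DFT) energy:  {ks.e_tot:.6f} Hartree")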

Unfortunately, while DFT gives excellent results in many cases, even the best of these methods do not work well in all situations.

Dr Michelle Coote and her team at the Australian National University, part of the ARC Centre of Excellence for Free Radical Chemistry and Biotechnology, are addressing this problem in collaboration with Peter Gill and Andrew Gilbert, also at the Australian National University.

They are trying to determine when and why DFT methods fail, and whether or not the problems can be eliminated. This will allow chemists to make more accurate predictions on large molecules of relevance in biology and polymer science, reduce the computer resources required for smaller molecules, and help design better DFT methods that work for a broader range of problems.

Until the failures of DFT have been properly characterised, Michelle Coote's research group will use state-of-the-art ab initio methods to investigate reactions of interest in polymer science and biology. Although these methods are extremely time-consuming, reactions involving quite large molecules with up to 20 "heavy" atoms (atoms other than hydrogen) can be treated very accurately, provided the calculations are run on a supercomputer with an extremely large amount of memory. Together with many other chemists, the Coote group uses the Australian Partnership for Advanced Computing National Facility, the fastest computer in Australia. It has 1680 processors, 3.56 terabytes of memory and a peak speed of more than 11 trillion calculations per second. It was the 26th-fastest computer in the world when it was completed in 2005, and is currently ranked 200th.

At present, massive computers such as these are required to perform accurate calculations on all but the smallest of molecules, and those with more than 20 "heavy" atoms are a significant challenge. Many molecules of interest in biology and polymer science cannot be investigated accurately for this reason. DFT methods allow much larger molecules to be studied; however, they are not yet accurate enough for general use, and until their failings have been characterised they must be used with caution.

Computational chemistry is an exciting science that offers much insight into chemical reactivity, and yields information that is complementary to experimental data. Unfortunately, only small- to medium-sized molecules can be treated accurately at present, but the boundaries are continually being pushed, and it will not be long before accurate techniques are available that can be applied to large molecules as well.

 

Reproduced with thanks to Science Teachers’ Association of NSW
