In nanotechnology, the position of a single atom can make all the difference—whether a material functions as a semiconductor or an insulator, whether it triggers a vital chemical process or stops it cold. The ability to define every atom in a nanoparticle precisely would permit full control of the properties and behavior of a nanomaterial. But deep-down atomic imaging techniques, such as electron microscopy and scanning tunneling microscopy, are not enough for nanoengineering, because they do not provide the precise mathematical coordinates of every atom that nanotechnologists need.
“Beautiful pictures of nanostructures capture the imagination, but if a picture is worth 1,000 words, then a table, filled with accurate atomic coordinates, is worth 1,000 pictures,” says Simon Billinge, who studies what he has dubbed the nanostructure problem at Columbia University and Brookhaven National Laboratory. Billinge and his like-minded colleagues instead are looking to combine methods and use conventional techniques in novel ways.
Defining the exact atomic structure of everyday solids, as opposed to nanostructured ones, is relatively easy, because they feature what physicists call long-range, or crystalline, order: a regular, repeating structure that does not change much over atomic or molecular scales.
Scientists have traditionally examined such materials by crystallography, which relies on scattering techniques: a beam of x-rays or neutrons shines on a sample of material, and the atoms scatter and reflect the beam, forming patterns called Bragg diffraction peaks (after Sir William Henry Bragg and his son William Lawrence, who described the phenomenon in 1913). The Bragg peaks, which are related to the spacing between atomic layers, provide details from which the ordered atomic structure of the substance can be mathematically determined. This powerful method has revealed how the atoms of many substances—from cosmic dust to our own DNA—are put together.
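The relation behind those peaks is Bragg's law, n·λ = 2d·sin θ: a peak appears only where waves reflected from successive atomic layers, a spacing d apart, interfere constructively. A minimal sketch of that relation (the function name and the example numbers are illustrative, not from the article):

```python
import math

def bragg_angle_deg(wavelength_nm, d_spacing_nm, order=1):
    """Angle theta (degrees) satisfying Bragg's law: n * lambda = 2 * d * sin(theta)."""
    s = order * wavelength_nm / (2.0 * d_spacing_nm)
    if not 0.0 < s <= 1.0:
        raise ValueError("no diffraction peak exists for these parameters")
    return math.degrees(math.asin(s))

# Illustrative numbers: copper K-alpha x-rays (~0.154 nm) on layers 0.2 nm apart
theta = bragg_angle_deg(0.154, 0.2)  # first-order peak, roughly 22.6 degrees
```

Because d appears inside the sine, each peak position encodes one interlayer spacing, which is why a crystal's full set of sharp peaks can be inverted into an atomic structure.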
But crystallography does not provide the resolution needed for the nanoscale, where structural differences occur over much shorter distances. When a nanomaterial is examined with traditional crystallography, “the Bragg peaks essentially broaden out and completely overlap, and you can no longer differentiate them from each other,” Billinge explains. “The algorithms that were developed for crystallography fail,” he adds, and investigators cannot tell where each atom lies. Without precise structural data, nanotechnology fabrication remains a game of approximations and best guesses.
Because a simple, one-size-fits-all solution is nowhere on the horizon, researchers are using a combination of various imaging techniques and mathematical methods to tame the nanostructure problem. Such a multifaceted strategy builds accurate and useful models from different sets of data, in what is called complex modeling.
Billinge has combined crystallography with an approach that has long been used to examine noncrystalline substances, such as glasses and liquids. It makes use of the so-called pair distribution function (PDF), which describes the probability of finding one atom at a certain distance from another and provides statistical data from which structure can be computed. “The PDF technique is the realization of the fact that there’s all this information in between the Bragg peaks,” says Stephen Streiffer, acting director of the Center for Nanoscale Materials at Argonne National Laboratory.
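At its core, a PDF is a histogram of interatomic distances: for every pair of atoms, record how far apart they sit, then tally how often each separation occurs. A minimal sketch of that idea for a handful of atoms (the function names and the toy square geometry are illustrative, not from the article; real PDF analysis works from scattering data, not known coordinates):

```python
import numpy as np

def pair_distances(coords):
    """All unique interatomic distances for an array of (x, y, z) coordinates."""
    coords = np.asarray(coords, dtype=float)
    diffs = coords[:, None, :] - coords[None, :, :]   # pairwise displacement vectors
    dist = np.sqrt((diffs ** 2).sum(axis=-1))         # pairwise distance matrix
    iu = np.triu_indices(len(coords), k=1)            # each pair counted once
    return dist[iu]

def pdf_histogram(coords, r_max=5.0, bins=50):
    """Crude pair distribution: counts of pair distances out to r_max."""
    hist, edges = np.histogram(pair_distances(coords), bins=bins, range=(0.0, r_max))
    return hist, edges

# Toy cluster: four atoms at the corners of a unit square.
# Its "PDF" has just two peaks: four side lengths of 1 and two diagonals of sqrt(2).
square = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
r = pair_distances(square)
hist, edges = pdf_histogram(square)
```

Solving the nanostructure problem runs this logic in reverse: starting from the measured distance statistics, algorithms search for an arrangement of atoms whose pair distances reproduce them.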
In 2006 Billinge and his colleagues validated the PDF strategy by computing from first principles the soccer-ball structure of the carbon 60, or buckyball, molecule. Since then, they have developed further algorithms to reconstruct other nanoscale structures.
Although ingenious algorithms are indispensable, Streiffer says that imaging techniques must also continue to improve. “The holy grail of x-ray microscopy right now,” he observes, “is to be able to put a single nano-object into an x-ray beam and know not only the nanoscopic shape but the position and chemical identity of every atom that makes up that nanoscopic structure.” Matthias Bode, also at Argonne’s center, notes that spectroscopic methods—the study of materials based on the light they absorb or emit—will be another weapon in the imaging arsenal. “Usually what you want to do in nanoscience is correlate structure with some kind of property that acts on the nanoscale,” he explains, adding that spectroscopy would enable investigators “to correlate, say, the size or shape of the particle to specific electronic or magnetic properties.”
Taming the nanostructure problem will be the key to achieving the ultimate goal of nanotechnology: custom-designing nanomaterials for specific functions. “We’re obviously very far away from that,” Billinge admits. Still, he insists, “it’s a rich and exciting problem, and I’m kind of glad it’s not solved. It gives me something exciting to do.”
Note: This article was originally published with the title, "Big Little Problem".