For years researchers have held out hope that graphene would take up the mantle in the electronics industry when silicon hits its limits as the material of choice for making devices smaller, faster and cheaper. Yet turning graphene's promise into reality has been difficult, to say the least, in part because of the inherent challenge of working with a substance one atom thick.

Methods of cutting graphene into usable pieces tend to leave frayed edges that diminish the material's effectiveness as a conductor. Now a team of researchers at Georgia Institute of Technology led by Walter de Heer claims to have made a significant advance in that area by developing a technique for creating nanometer-scale graphene ribbons without rough edges. (A nanometer is one billionth of a meter.)

Graphene has, of course, made headlines throughout the scientific world this week, thanks to the awarding of the Nobel Prize in Physics to two researchers at the University of Manchester in England who in 2004 pioneered a way of isolating graphene by repeatedly cleaving graphite with adhesive tape. The Nobel Prize committee recognized Andre Geim and Konstantin Novoselov "for groundbreaking experiments regarding the two-dimensional material graphene."

Unlike Geim and Novoselov, de Heer and his team have in the past created graphene sheets by heating a silicon carbide surface to 1,500 degrees Celsius until a layer of graphene formed. The graphene was then cut to a particular size and shape using an electron beam. "This was a serious problem because cutting graphene leaves rough edges that destroy a lot of graphene's good properties, making it less conductive," says de Heer, regents' professor in Georgia Tech's School of Physics.

De Heer's new approach, described October 3 in Nature Nanotechnology, is to etch patterns into the silicon carbide and then heat that surface until graphene forms within the etched patterns. (Scientific American is part of Nature Publishing Group.) In this way graphene forms in specific shapes and sizes without the need for cutting. "The whole philosophy has changed," he says. "We're not starting with an infinite sheet of graphene; we're growing it where we want to grow it."

The researchers claim to have used the technique to fabricate a densely packed array of 10,000 top-gated graphene transistors on a 0.24-square-centimeter chip, a step toward their ultimate goal of creating graphene components that can be integrated with silicon for new generations of electronics. Such integration would be a key milestone toward making microprocessors able to operate at terahertz speeds, 1,000 times faster than today's chips (whose speeds are clocked at billions of hertz). Another goal is to reduce heat generation as an increasing number of transistors are packed onto each chip. Such advances would continue to validate Moore's law even as silicon circuits reach their miniaturization limit. "In principle, graphene can overcome silicon's limitation," de Heer says. "If we completely succeed, [only] time will tell."
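The figures above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch, not part of the researchers' work; it simply confirms the "1,000 times faster" comparison between terahertz- and gigahertz-class chips, and derives the implied transistor density from the reported array.

```python
# Speed comparison: a 1 THz chip vs. a ~1 GHz chip, in cycles per second.
gigahertz = 1e9
terahertz = 1e12
speedup = terahertz / gigahertz
print(speedup)  # 1000.0 -- matches the "1,000 times faster" figure

# Implied density of the reported array: 10,000 transistors on 0.24 cm^2.
transistors = 10_000
area_cm2 = 0.24
density = transistors / area_cm2
print(round(density))  # roughly 41,700 transistors per square centimeter
```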

Graphene and silicon will be able to coexist in much the same way that airplanes and freight ships are both used for transporting cargo. "They move at different speeds, but both are important because they have different costs," de Heer says. "I think a similar thing will happen in electronics."

De Heer is also quick to acknowledge that, although the study of graphene dates back to the 1970s, the field still has a long way to go. He and his team are now investigating how the ribbons they created will perform over time and to what degree their new approach improves on cutting pieces of graphene out of larger sheets.

With so many open questions about graphene's viability, de Heer says he was surprised that the Nobel selection committee recognized graphene at this time. The technology has tremendous potential, but only a fraction of that potential has been realized to date. "It's a little early," he says. "If you ask me the bottom line—What has graphene accomplished?—it's still trying to find its way."