Aug 20, 2010 08:24 GMT  ·  By
In recent years, more and more circuits have had to be thrown away because they do not meet the timing-, power- and lifetime-related specifications manufacturers need

A team of experts from a group of universities is about to embark on a new research initiative that will approach the challenges of nanotechnology from a software perspective.

The work is incredibly important, given the role that scientists and industry representatives have in store for nanoscale devices and components.

In the near future, nanotechnology will become an integral part of almost all technological processes, and will have incredibly diverse applications in electronics, medicine, environmental science and so on.

But there are still challenges associated with producing viable nanoparticles, or circuits and chips at the nanoscale, particularly those having to do with the variability that exists in production processes.

At this point, manufacturing errors, aging-related wear-out, and varying operating environments are some of the factors that affect the functioning of nanoscale systems to a great extent.

Six universities in the United States, including the University of California, Los Angeles (UCLA), decided to pool their resources to address this issue.

They did so by putting together a team of computer scientists and electrical engineers that will “deal with the downside of nanoscale computer components by rethinking and enhancing the role that software can play in a new class of computing machines that are adaptive and highly energy efficient,” officials at UCLA say.

The research initiative is called “Variability-Aware Software for Efficient Computing with Nanoscale Devices,” and it has recently been awarded a $10 million grant by the US National Science Foundation (NSF).

The money, which will be disbursed over a period of five years, was awarded through the NSF Expeditions in Computing program.

The program sponsors studies that “promise significant advances in the computing frontier and great benefit to society.”

“We envision a world where system components – led by proactive software – routinely monitor, predict and adapt to the variability in manufactured computing systems,” says Rajesh Gupta.

The expert is a professor of computer science and engineering at the University of California, San Diego (UCSD) Jacobs School of Engineering, and the director of the Variability Expedition at the university.

“Changing the way software interacts with hardware offers the best hope for perpetuating the fundamental gains in computing performance at lower cost of the past 40 years,” he adds.

“If this project is successful and the breakthroughs are transferred to industry, we will have contributed to the continued expansion and reach of the semiconductor and computing industries,” concludes expert Subhasish Mitra, from Stanford University.