Which technologies get better faster
MIT News Office, Boston MA (SPX) May 23, 2011 - Some forms of technology - think, for example, of computer chips - are on a fast track to constant improvements, while others evolve much more slowly. Now, a new study by researchers at MIT and other institutions shows that it may be possible to predict which technologies are likeliest to advance rapidly, and therefore may be worth more investment in research and resources.

In a nutshell, the researchers found that the greater a technology's complexity, the more slowly it changes and improves over time. They devised a way of mathematically modeling complexity, breaking a system down into its individual components and then mapping all the interconnections between these components.

"It gives you a way to think about how the structure of the technology affects the rate of improvement," says Jessika Trancik, assistant professor of engineering systems at MIT. Trancik wrote the paper with James McNerney, a graduate student at Boston University (BU); Santa Fe Institute Professor Doyne Farmer; and BU physics professor Sid Redner. It appears online this week in the Proceedings of the National Academy of Sciences.

The team was inspired by the complexity of energy-related technologies ranging from tiny transistors to huge coal-fired power plants. They have tracked how these technologies improve over time, either through reduced cost or better performance, and in this paper they develop a model to compare that progress with the complexity of the design and the degree of connectivity among its different components.

The authors say the approach they devised for comparing technologies could, for example, help policymakers mitigate climate change: By predicting which low-carbon technologies are likeliest to improve rapidly, their strategy could help identify the most effective areas to concentrate research funding. The analysis makes it possible to pick technologies "not just so they will work well today, but ones that will be subject to rapid development in the future," Trancik says.

Besides the importance of overall design complexity in slowing the rate of improvement, the researchers also found that certain patterns of interconnection can create bottlenecks, causing the pace of improvement to come in fits and starts rather than at a steady rate.

"In this paper, we develop a theory that shows why we see the rates of improvement that we see," Trancik says. Now that they have developed the theory, she and her colleagues are moving on to empirical analysis of many different technologies to gauge how effective the model is in practice. "We're doing a lot of work on analyzing large data sets" on different products and processes, she says.

For now, she suggests, the method is most useful for comparing two different technologies "whose components are similar, but whose design complexity is different." For example, the analysis could be used to compare different approaches to next-generation solar photovoltaic cells, she says. The method can also be applied to processes, such as improving the design of supply chains or infrastructure systems. "It can be applied at many different scales," she says.

Koen Frenken, professor of economics of innovation and technological change at Eindhoven University of Technology in the Netherlands, says this paper "provides a long-awaited theory" for the well-known phenomenon of learning curves. "It has remained a puzzle why the rates at which humans learn differ so markedly among technologies.
This paper provides an explanation by looking at the complexity of technology, using a clever way to model design complexity." Frenken adds, "The paper opens up new avenues for research. For example, one can verify their theory experimentally by having human subjects solve problems with different degrees of complexity." In addition, he says, "The implications for firms and policymakers [are] that R and D should not only be spent on invention of new technologies, but also on simplifying existing technologies so that humans will learn faster how to improve these technologies."

Ultimately, the kind of analysis developed in this paper could become part of the design process - allowing engineers to "design for rapid innovation," Trancik says, by using these principles to determine "how you set up the architecture of your system."
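To make the component-and-interconnection idea concrete, here is a minimal sketch in Python. It is an illustration, not the model from the PNAS paper: it assumes a design can be represented as a directed graph of components and design dependencies, and the component names and the averaged coupling score are hypothetical stand-ins for whatever complexity measure an analyst would actually use. The point it captures is the article's qualitative one: a design whose parts are more densely interconnected scores as more complex, and by the study's logic would be expected to improve more slowly.

# Minimal sketch (illustrative only, not the paper's exact method):
# model a technology as components plus the dependencies between them,
# and compare two designs by a simple connectivity measure.

from collections import defaultdict

def build_dependency_graph(edges):
    """edges: iterable of (component, depends_on) pairs.
    Returns a mapping from each component to the set of components
    that must be revisited when it changes."""
    graph = defaultdict(set)
    for component, depends_on in edges:
        graph[component].add(depends_on)
        graph.setdefault(depends_on, set())  # register isolated endpoints too
    return graph

def complexity_score(graph):
    """Average number of couplings per component - a crude proxy
    for design complexity in the spirit of the article: denser
    interconnection suggests slower improvement."""
    if not graph:
        return 0.0
    return sum(len(deps) for deps in graph.values()) / len(graph)

# Two hypothetical layouts of the same device (names are made up).
modular_design = build_dependency_graph([
    ("cell", "substrate"),
    ("wiring", "cell"),
])
integrated_design = build_dependency_graph([
    ("cell", "substrate"),
    ("cell", "wiring"),
    ("wiring", "cell"),
    ("wiring", "substrate"),
    ("substrate", "wiring"),
])

print(complexity_score(modular_design))     # lower score: loosely coupled parts
print(complexity_score(integrated_design))  # higher score: densely coupled parts

Running the sketch prints a lower score for the modular layout and a higher one for the integrated layout, which is the kind of comparison the article describes between technologies "whose components are similar, but whose design complexity is different."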