
Q&A: Neil Thompson on computing power and innovation | MIT News


Moore’s Law is the famous prognostication by Intel co-founder Gordon Moore that the number of transistors on a microchip would double every year or two. This prediction has mostly been met or exceeded since the 1970s: computing power has roughly doubled every two years, while better and faster microchips have become less expensive.

This rapid increase in computing power has fueled innovation for decades, but in the early 21st century researchers began sounding the alarm that Moore’s Law was slowing down. With standard silicon technology, there are physical limits to how small transistors can get and how many can be squeezed onto an affordable microchip.

Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, and his research team set out to quantify the importance of more powerful computers for improving outcomes across society. In a new working paper, they analyzed five areas where computation is critical, including weather forecasting, oil exploration, and protein folding (important for drug discovery). The working paper was co-authored by research assistants Gabriel F. Manso and Shuning Ge.

They found that between 49 and 94 percent of the improvements in these areas can be explained by computing power. For example, in weather forecasting, increasing computer power by a factor of 10 improves three-day forecasts by one-third of a degree.

But computer progress is slowing, which could have far-reaching impacts on the economy and society. Thompson spoke with MIT News about this research and the implications of the end of Moore’s Law.

Q: How did you approach this analysis and quantify the impact of computing on different domains?

A: Quantifying the effect of computing on real-world outcomes is tricky. The most common way to look at computing power, and the growth of IT in general, is to study how much companies spend on it and see how that correlates with outcomes. But spending is a difficult measure to use because it only partly reflects the value of the computing power being purchased. For example, today’s computer chip may cost the same as last year’s, but it is much more powerful. Economists try to adjust for that quality change, but it is hard to pin down exactly what that number should be. For our project, we measured computing power more directly, for example by looking at the capabilities of the systems used when protein folding was done for the first time using deep learning. By looking directly at capabilities, we can get more precise measurements and thus better estimates of how computing power influences performance.
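To make the idea of relating measured computing power to performance concrete, here is a minimal sketch, not the authors' actual code or data, that fits a simple log-linear relationship on invented numbers. The variable names and values are illustrative assumptions only.

```python
# Minimal sketch: relating task performance to directly measured computing power.
# The numbers below are synthetic placeholders, not data from the working paper.
import numpy as np

# Hypothetical yearly observations: compute of the system used (FLOP/s)
# and a performance score for the task (higher is better).
compute = np.array([1e12, 1e13, 1e14, 1e15, 1e16, 1e17])
performance = np.array([10.0, 13.1, 16.0, 19.2, 21.9, 25.1])

# Fit performance = a + b * log10(compute): b estimates how much
# performance improves for each tenfold increase in computing power.
log_compute = np.log10(compute)
b, a = np.polyfit(log_compute, performance, 1)

# R^2 indicates how much of the variation in performance is associated
# with computing power (the paper reports 49-94 percent across domains).
predicted = a + b * log_compute
ss_res = np.sum((performance - predicted) ** 2)
ss_tot = np.sum((performance - performance.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"Gain per 10x compute: {b:.2f} performance units")
print(f"Share of variation explained: {r_squared:.2%}")
```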

Q: How are more powerful computers improving weather forecasting, oil exploration, and protein folding?

A: The short answer is that increases in computing power have had an enormous effect in these areas. In weather forecasting, we found that there has been a trillionfold increase in the amount of computing power used for these models. That puts into perspective how much computing power has increased, and also how we have harnessed it. This is not someone simply taking an old program and putting it on a faster computer; rather, users have to constantly redesign their algorithms to take advantage of 10 or 100 times more computing power. There is still a lot of human ingenuity that has to go into improving performance, but what our results show is that much of that ingenuity is focused on how to harness ever more powerful computing engines.

Oil exploration is an interesting case because it gets harder over time: as the easy wells are drilled, what remains is more difficult. Oil companies fight that trend with some of the world’s largest supercomputers, using them to interpret seismic data and map the underground geology. This helps them do a better job of drilling in exactly the right place.

Using computing to do better protein folding has been a long-standing goal because it is essential for understanding the three-dimensional shapes of these molecules, which in turn determine how they interact with other molecules. In recent years, the AlphaFold systems have made remarkable breakthroughs in this area. What our analysis shows is that these improvements are well predicted by the massive increases in the computing power they use.

Q: What are some of the biggest challenges in conducting this analysis?

A: When looking at two trends that grow over time, in this case performance and computing power, one of the most important challenges is disentangling how much of the relationship between them is causation and how much is just correlation. We can answer that question, partially, because in the areas we studied companies invest huge amounts of money, so they do a lot of testing. In weather modeling, for example, they don’t just spend tens of millions of dollars on new machines and then hope they work. They do an evaluation and find that running a model for twice as long improves performance. Then they buy a system powerful enough to do that calculation in a shorter time so they can use it operationally. That gives us a lot of confidence. But there are also other ways we can see the causality. For example, we see that there were several big jumps in the computing power used by NOAA (the National Oceanic and Atmospheric Administration) for weather forecasting. And when they purchased a bigger computer and had it installed all at once, performance really jumped.
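As a rough illustration of that "big jump" argument, the sketch below uses an invented timeline (not NOAA's actual records) to check whether years with a large step increase in installed computing power coincide with larger-than-usual gains in forecast skill.

```python
# Illustrative sketch only: invented timeline and numbers, not NOAA data.
# Idea: when compute jumps discretely (a new machine is installed all at once),
# causation predicts that performance should jump at the same point in time.
years = [2000, 2001, 2002, 2003, 2004, 2005, 2006]
compute = [1.0, 1.1, 1.2, 8.0, 8.5, 9.0, 60.0]      # relative computing power
skill = [70.0, 70.5, 71.0, 74.0, 74.4, 74.9, 78.0]  # forecast skill score

# Flag years where compute more than tripled relative to the prior year,
# then compare the skill change in those years with ordinary years.
jump_gains, other_gains = [], []
for i in range(1, len(years)):
    gain = skill[i] - skill[i - 1]
    if compute[i] / compute[i - 1] > 3:
        jump_gains.append(gain)
    else:
        other_gains.append(gain)

print("Avg skill gain in compute-jump years:", sum(jump_gains) / len(jump_gains))
print("Avg skill gain in other years:       ", sum(other_gains) / len(other_gains))
```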

Q: Would these improvements be possible without exponential increases in computing power?

A: That’s a difficult question because there are many different inputs: human capital, traditional capital, and also computing power. All three change over time. One might say that if you have a trillionfold increase in computing power, surely that has the biggest effect. And that’s a good intuition, but you also have to account for diminishing marginal returns. For example, if you go from having no computer to having one computer, that’s a huge change. But if you go from having 100 computers to having 101, that extra one doesn’t provide nearly as much gain. So there are two competing forces: big increases in computation on one side, but decreasing marginal benefits on the other. Our research shows that, even though we already have tons of computing power, it is growing so quickly that it explains much of the performance improvement in these areas.
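A minimal numerical illustration of those two competing forces, assuming purely for illustration that performance grows with the logarithm of the number of computers (not the paper's estimated model):

```python
# Illustration only: assumes performance ~ log(1 + computers), a standard
# diminishing-returns shape chosen for this example, not the paper's model.
import math

def performance(computers: float) -> float:
    return math.log1p(computers)

# Going from 0 to 1 computer is a large gain...
gain_first = performance(1) - performance(0)
# ...while going from 100 to 101 adds very little.
gain_marginal = performance(101) - performance(100)
# But an exponential (trillionfold) increase still moves performance a lot.
gain_trillionfold = performance(1e12) - performance(1)

print(f"0 -> 1 computer:      +{gain_first:.3f}")
print(f"100 -> 101 computers: +{gain_marginal:.3f}")
print(f"1 -> 1e12 computers:  +{gain_trillionfold:.3f}")
```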

Q: What are the implications of the slowdown in Moore’s Law?

A: The implications are very worrying. As computing improves, it powers better weather forecasting and the other areas we studied, but it also improves countless other areas that we didn’t measure and that are nonetheless critical parts of our economy and society. If that engine of progress slows down, all of those follow-on effects will slow down as well.

Some may disagree, arguing that there are many ways to innovate: if one pathway slows down, others will compensate. To some degree that is true. For example, we are already seeing increased interest in designing specialized computer chips as a way to make up for the end of Moore’s Law. But the problem is the magnitude of these effects. The gains from Moore’s Law were so large that, in many application areas, other sources of innovation will not be able to compensate.



