It is often said that atomic-level protein simulation is costly, which is why multi-scale modeling was devised. However, I have yet to understand why it is so expensive.
Suppose we want to simulate the folding of titin, which at roughly 38,000 amino acids is the longest known protein, on the 442-petaflop Fugaku supercomputer. In what sense is this simulation impossible on Fugaku?
At what level exactly are you trying to understand this? Molecular simulations are expensive for many reasons, but here are two that should be intuitive. First, there is a large number of molecular interactions to compute: in the worst case every atom interacts with every other, so the number of pairs grows quadratically with system size. Second, the conformational space to be explored is vast. Even the tiniest movement of the tiniest part of a large molecule requires a new energy calculation. There are many movements to be tested, and the energy gradient is not always informative, since there are numerous local minima.
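To put rough numbers on this, here is a back-of-envelope sketch in Python. Every constant in it (atoms per residue, solvent factor, folding time, flops per atom-step, achievable steps per second) is an illustrative assumption of mine, not a measured figure:

```python
# Back-of-envelope feasibility estimate for all-atom MD of full-length
# titin on Fugaku. All constants are rough, illustrative guesses.

N_RESIDUES = 38_000        # from the question
ATOMS_PER_RESIDUE = 8      # typical heavy-atom count per residue (assumption)
SOLVENT_FACTOR = 10        # explicit water often multiplies atom count ~10x (assumption)
n_atoms = N_RESIDUES * ATOMS_PER_RESIDUE * SOLVENT_FACTOR   # ~3e6 atoms

TIMESTEP_S = 2e-15         # standard ~2 fs MD integration step
FOLDING_TIME_S = 1.0       # pure guess: a 38,000-residue chain takes seconds or more to fold
n_steps = FOLDING_TIME_S / TIMESTEP_S                       # ~5e14 steps

FLOPS_PER_ATOM_STEP = 5e3  # rough per-atom, per-step cost with cutoffs + PME (assumption)
total_flops = n_atoms * FLOPS_PER_ATOM_STEP * n_steps       # ~8e24 flops

FUGAKU_PEAK = 442e15       # ~442 PFLOP/s peak
ideal_days = total_flops / FUGAKU_PEAK / 86_400
print(f"{n_atoms:.1e} atoms, {n_steps:.1e} steps, {total_flops:.1e} flops")
print(f"Wall time at 100% of peak: {ideal_days:.0f} days")  # ~200 days

# Raw flops are not even the binding constraint. Each 2 fs step depends on
# the previous one, so network latency caps steps per wall-clock second no
# matter how many nodes you add; even purpose-built MD machines sustain
# only on the order of 1e5 steps/s, and that on far smaller systems.
MAX_STEPS_PER_SEC = 1e5    # optimistic ceiling (assumption)
years = n_steps / MAX_STEPS_PER_SEC / 86_400 / 365
print(f"Latency-bound wall time: {years:.0f} years")        # ~160 years
```

And note that this buys a single trajectory; because of the vast conformational space, one trajectory would not be enough anyway.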
It is like being dropped into the middle of the Atlantic Ocean and told to swim to London, without the benefit of knowing which way is east or west. On top of that, after each stroke you'd have to count to three before making another. Finally, even if you reached New York City, or Rio de Janeiro, or Lisbon, or Dakar, the only feedback you would get is "not London". You would get no real direction toward London, just as in protein folding we don't know a priori what the expected energy minimum should be. How long would it take to swim to London under those circumstances?