Michio Kaku is wrong

The renowned theoretical physicist argues that no computer could ever simulate the roughly 10^80 atoms in the observable universe (yes, that's a 1 followed by 80 zeros)

But why?

Considering Moore's law (the observation that the number of transistors on a microchip doubles roughly every two years, driving an exponential increase in computing power) and the pace of progress we're seeing right now, I find his outright dismissal very strange.
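To put that exponential in perspective, here's a back-of-the-envelope sketch in Python. It assumes today's fastest machines sit around 10^18 operations per second (roughly exascale) and asks how many Moore's-law doublings it would take to reach 10^80 operations per second, one per atom in the observable universe. Both figures are illustrative assumptions, not precise measurements.

```python
import math

# Illustrative assumptions, not precise measurements:
CURRENT_OPS_PER_SEC = 1e18    # roughly exascale, the scale of today's top supercomputers
TARGET_OPS_PER_SEC = 1e80     # one operation per atom in the observable universe, per second
DOUBLING_PERIOD_YEARS = 2     # Moore's law: compute doubles roughly every two years

# Solve CURRENT * 2^n = TARGET for n, the number of doublings
doublings = math.log2(TARGET_OPS_PER_SEC / CURRENT_OPS_PER_SEC)
years = doublings * DOUBLING_PERIOD_YEARS

print(f"Doublings needed: {doublings:.0f}")        # ~206
print(f"Years at Moore's-law pace: {years:.0f}")   # ~412
```

About two hundred doublings, or roughly four centuries at the historical pace. A long time for us; a rounding error for a civilization that keeps compounding.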

Exponential technological development aside, the interconnectedness of quantum physics and spirituality points directly to the possibility of a programmable reality.

See, the Planck scale (a set of natural units derived from Planck's constant, a fundamental value in quantum mechanics, together with the speed of light and the gravitational constant) suggests that at the smallest scale, reality may not be continuous but composed of discrete, indivisible units. Spacetime may essentially be quantized, much like pixels in a video game. And when we consider that alongside quantum superposition (the idea that particles exist in a combination of all possible states until measured), we begin to understand the fluidity and dynamism of the universe in a logical way.
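To make the "pixel" analogy concrete: the candidate pixel size, the Planck length, falls straight out of three constants via ℓ_P = √(ħG/c³). A minimal Python sketch (CODATA constant values, truncated for readability):

```python
import math

# Fundamental constants in SI units (CODATA values, truncated):
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

# Planck length: the scale at which smooth, continuous spacetime
# is expected to stop making sense
planck_length = math.sqrt(hbar * G / c**3)

print(f"Planck length: {planck_length:.3e} m")   # ~1.616e-35 m
```

If spacetime is quantized, that's the resolution of the grid: about 10^-35 meters, some twenty orders of magnitude smaller than a proton.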

These concepts form the scientific backbone of the law of attraction, the idea that our conscious intentions play an active role in shaping reality. Even at the quantum level, the act of observation influences particles, and we, the observers, collapse a range of potential states into specific outcomes.
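For what "collapsing potential states into outcomes" looks like in the textbook formalism, here's a toy qubit in Python: an equal superposition of two states, measured many times. The Born rule fixes the probabilities, and each individual measurement yields one definite result; the 10,000-shot count and the seed are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(seed=42)   # seed chosen arbitrarily, for reproducibility

# A qubit in equal superposition: (|0> + |1>) / sqrt(2)
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: the probability of each outcome is the squared amplitude
probs = np.abs(state) ** 2    # -> [0.5, 0.5]

# Each measurement collapses the superposition to a single definite outcome
outcomes = rng.choice([0, 1], size=10_000, p=probs)

print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
print(f"Observed: 0 -> {np.mean(outcomes == 0):.3f}, 1 -> {np.mean(outcomes == 1):.3f}")
```

A spread of possibilities in, one definite outcome out, every single time someone looks.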

You can view reality as the product of random chance, or you can look for the patterns; either way, you arrive at the same paradoxical conclusion: that unpredictability is governed by some deeper force.

Thus, we begin to see existence not as a fixed, external experience but as a flexible, adaptable, responsive system. So what would stop a post-singularity civilization from developing the computational power required to simulate realities down to the quantum level?
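Here's a sense of scale for that question: simulating quantum systems by brute force means storing 2^n complex amplitudes for n entangled two-level particles. A short sketch (assuming 16 bytes per amplitude, i.e. two 64-bit floats) shows how fast that explodes:

```python
import math

BYTES_PER_AMPLITUDE = 16   # two 64-bit floats per complex amplitude (an assumption)

for n in (10, 50, 100, 300):
    # A full state vector over n two-level systems holds 2^n amplitudes
    log10_bytes = n * math.log10(2) + math.log10(BYTES_PER_AMPLITUDE)
    print(f"{n:>3} particles: ~10^{log10_bytes:.0f} bytes of memory")
```

By around 300 entangled particles, the naive approach already needs more bytes than there are atoms in the universe. So a simulating civilization presumably wouldn't brute-force it; it would render detail only where an observer is looking, which is exactly the behavior the measurement story above describes.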