Consider that light follows the path minimising travel time between two points A and B (Fermat's principle: a straight line in empty space, or a curve where gravity bends it). It's as though every possible path were identified, and the quickest one actually used (a single virtual path actualised). Now add to the gedankenexperiment a block of a special material, sitting off the direct path. This special material bends space in as complex a manner as possible - perhaps a cubic metre of foam made of tiny black holes would do - and make the foam a three-dimensional fractal, so that finding the optimal path through it is genuinely hard. Then do two things. First, calculate the fraction of possible paths that would pass through the material; second, calculate how difficult all possible paths through the material are to compute (using some universal measure of computation). Then release some light at source A, and record what happens when it arrives at destination B.
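The "identify every possible path, use the quickest" picture can be made concrete with a toy computation. The sketch below brute-forces Fermat's principle for light crossing between two media of different speeds: it enumerates candidate crossing points, keeps the one with least travel time, and checks that the winner satisfies Snell's law. All the numbers (speeds, endpoints) are illustrative choices, not anything from the thought experiment itself.

```python
# Toy illustration of Fermat's principle: light travelling from A in
# medium 1 (speed v1) to B in medium 2 (speed v2) crosses the boundary
# at the point x that minimises total travel TIME, not total distance.
import math

def travel_time(x, v1=1.0, v2=0.5, a=(0.0, 1.0), b=(2.0, -1.0)):
    """Time for the two-segment path A -> (x, 0) -> B."""
    t1 = math.hypot(x - a[0], a[1]) / v1  # leg in medium 1
    t2 = math.hypot(b[0] - x, b[1]) / v2  # leg in medium 2
    return t1 + t2

# Enumerate candidate crossing points and keep the fastest: a
# brute-force version of "each possible path identified, the
# quickest actually used".
candidates = [i / 1000 for i in range(2001)]
best_x = min(candidates, key=travel_time)

# Sanity check: at the optimum, sin(theta1)/v1 == sin(theta2)/v2
# (Snell's law falls out of the minimisation).
sin1 = best_x / math.hypot(best_x, 1.0)
sin2 = (2.0 - best_x) / math.hypot(2.0 - best_x, 1.0)
print(best_x, sin1 / 1.0, sin2 / 0.5)
```

The grid search stands in for whatever mechanism "tries" the paths; the point is only that the least-time path, once found, obeys the familiar refraction law.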
What might happen is that the light takes a tiny, tiny time longer to reach B than it did when the block wasn't there. That tiny extra time is a measure of the raw computing power of the universe itself: how long the universe takes to compute the paths through the black-hole-foam cube and discover that none of them is shorter than the straight line. And possibly, just possibly, what will happen is that the front of the beam will take longer but all subsequent light in the ray won't take longer at all, because the path will already have been calculated - so the head of the beam will be slightly, yet measurably, brighter. That is, if the universe has also implemented caching.
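The caching conjecture has a direct software analogue: the first query pays the full search cost, and identical later queries are served from a cache. A minimal sketch, in which the "expensive search over paths through the foam" is a stand-in function and not any real physics:

```python
# Sketch of the caching conjecture: the first ray through the "foam"
# pays the full path-search cost; subsequent identical queries hit
# the cache and cost nothing.
from functools import lru_cache

calls = 0  # counts how many times the expensive search actually runs

@lru_cache(maxsize=None)
def shortest_path_through_foam(seed: int) -> float:
    """Pretend-expensive search over many candidate paths."""
    global calls
    calls += 1
    # Stand-in for a hard optimisation over candidate path lengths.
    return min((seed * k) % 997 / 100.0 for k in range(1, 100000))

first = shortest_path_through_foam(42)   # full computation runs here
second = shortest_path_through_foam(42)  # served from the cache
print(calls)  # the expensive search ran exactly once
```

In the analogy, `calls` staying at 1 is the head of the beam paying the computation cost once, with every later photon reusing the answer.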