It has to be admitted that if light travels 186,000 miles/sec., changing the distance from 10' to 1' is only the difference between roughly 1/100,000,000th of a second and 1/1,000,000,000th of a second. Even increasing the distance to 100' means the signal still needs only about 1/10,000,000th of a second to travel between transmitter and receiver.
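To put numbers on that, here's a quick back-of-the-envelope sketch (pure arithmetic, nothing camera- or trigger-specific; the function name is just for illustration):

```python
# One-way radio propagation delay: t = distance / speed_of_light.
# 186,000 miles/sec * 5,280 ft/mile ~= 982,080,000 ft/sec,
# i.e. light covers roughly one foot per nanosecond.
SPEED_OF_LIGHT_FT_PER_S = 186_000 * 5_280

def delay_ns(distance_ft: float) -> float:
    """One-way propagation delay in nanoseconds for a given distance in feet."""
    return distance_ft / SPEED_OF_LIGHT_FT_PER_S * 1e9

for d in (1, 10, 100):
    print(f"{d:>3} ft -> {delay_ns(d):.2f} ns one way")
```

Even at 100 feet the delay is around a hundred nanoseconds, five orders of magnitude shorter than a 1/200 s shutter, which is why raw travel time can't explain the sync problem.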
If there is an issue caused by short distance, it is most likely that the increase in signal strength overwhelmed the receiver's dynamic range and drove it into saturation.
So needing to slow the shutter from 1/200 to 1/160 or 1/125 is caused by A + C, not B.
Talk to an RF engineer about propagation anomalies. It's not the time, it's the integrity of the signal. A and C have more or less known delays, given the firmware design and protocol-translation algorithms. The unknown is the radio wave in the air. At least that's what RF engineers have repeatedly told me when I've asked about similar technology changes in telecommunications systems. Wi-Fi in one's own house is an example we all deal with.