Wiki says coherence bandwidth is 1/D, where D is the channel delay spread. Coherence time is 0.423/fm, where fm is the Doppler frequency shift and equals vf/c (v is the moving speed, f is the carrier frequency, and c is the speed of light). So how does coherence bandwidth * coherence time become the number of symbols during which the channel is static? Symbol rate is not in the picture at all?
ARao, good question, and here is my understanding. As you said, coherence time comes from the Doppler shift, and it tells you that during this time the channel is effectively static. Coherence bandwidth comes from the multipath effect (delay spread), and it says that if the signal bandwidth (roughly the symbol rate) is no higher than this, the channel fading is flat fading rather than frequency-selective fading.
Back to your question about coherence interval = coherence time * coherence bandwidth: I think what it means is that if the coherence time is, say, 1 ms and the symbol rate is 200 kHz (equal to the coherence bandwidth), then the 200 symbols (1 ms * 200 kHz) suffer neither time-selective fading (due to Doppler shift) nor frequency-selective fading (due to multipath delay spread). So the symbol rate is implicitly in the picture: the definition assumes you signal at a rate equal to the coherence bandwidth, i.e. about one symbol per 1/Bc seconds.
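To make the arithmetic concrete, here is a small sketch of the calculation. The numbers (v = 30 m/s, f = 2 GHz carrier, 5 us delay spread) are my own example values, not from the thread; only the formulas fm = vf/c, Tc = 0.423/fm, and Bc = 1/D are from the discussion above.

```python
C = 3e8  # speed of light, m/s

def doppler_shift(v, f):
    """Maximum Doppler shift fm = v * f / c."""
    return v * f / C

def coherence_time(fm):
    """Coherence time Tc = 0.423 / fm (rule-of-thumb formula from the question)."""
    return 0.423 / fm

def coherence_bandwidth(delay_spread):
    """Coherence bandwidth Bc = 1 / D."""
    return 1.0 / delay_spread

# Assumed example scenario: 30 m/s mobile, 2 GHz carrier, 5 us delay spread.
fm = doppler_shift(30.0, 2e9)        # 200 Hz Doppler shift
Tc = coherence_time(fm)              # ~2.1 ms coherence time
Bc = coherence_bandwidth(5e-6)       # 200 kHz coherence bandwidth

# If we signal at a symbol rate equal to Bc (one symbol per 1/Bc seconds),
# the number of symbols that fit inside one coherence time is Tc * Bc --
# the "coherence interval" in symbols.
n_symbols = Tc * Bc

print(f"fm = {fm:.0f} Hz, Tc = {Tc*1e3:.2f} ms, Bc = {Bc/1e3:.0f} kHz")
print(f"coherence interval = Tc * Bc = {n_symbols:.0f} symbols")
```

With these numbers the channel stays roughly static for about 423 symbols, which is exactly the Tc * Bc product, so the symbol-rate assumption is what turns a time-bandwidth product into a symbol count.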