Post date: Feb 26, 2014 11:08:13 PM
A time delay does not affect the magnitude plot of a signal, which makes sense because it's the same signal, just shifted in time. However, the phase of the time-delayed system keeps falling in proportion to [delay × frequency]: a delay of T seconds contributes a phase shift of -ωT radians at angular frequency ω.
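Below is a minimal NumPy sketch (my own addition, not part of the linked page) that evaluates the frequency response of a pure T-second delay, H(jω) = exp(-jωT): the magnitude is 1 at every frequency, while the phase is -ωT and so falls linearly with frequency.

import numpy as np

# A pure delay of T seconds has frequency response H(jw) = exp(-j*w*T):
# unit magnitude everywhere, phase falling linearly as -w*T.
T = 1.25e-3                          # delay in seconds (1.25 ms, as in the example below)
f = np.array([100.0, 200.0, 400.0])  # test frequencies in Hz
w = 2 * np.pi * f                    # angular frequencies in rad/s

H = np.exp(-1j * w * T)              # frequency response of the delay

print(np.abs(H))            # [1. 1. 1.]         magnitude is unchanged
print(np.degrees(-w * T))   # [-45. -90. -180.]  phase lag grows with frequency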
An intuitive explanation of this is provided here:
http://lpsa.swarthmore.edu/BackGround/TimeDelay/TimeDelay.html
I am copying it here just in case this material is taken off the web.
Aside: An intuitive explanation of linear phase
To understand why the phase shift associated with a time delay is linearly proportional to the frequency of the signal, consider three sine waves (at 100, 200 and 400 Hz), each delayed by 1.25 ms.
You can see from the graphs that the phase shift associated with a delay of 1.25 ms is:
-45° for the 100 Hz signal. Since the 100 Hz signal has a 10 ms period, the delay corresponds to 1/8 of a period, or 45°.
-90° for the 200 Hz signal. Since the period of the 200 Hz signal is only 5 ms, the delay corresponds to 1/4 of a period (90°).
-180° for the 400 Hz signal. Since the period of the 400 Hz signal is only 2.5 ms, the delay corresponds to 1/2 of a period (180°).
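Those three numbers follow directly from the ratio of the delay to each signal's period; here is a short Python check (again my own sketch, not from the linked page):

# Express the 1.25 ms delay as a fraction of each period, then as degrees.
delay = 1.25e-3  # seconds

for freq in (100.0, 200.0, 400.0):
    period = 1.0 / freq
    fraction = delay / period        # fraction of one full cycle
    phase_deg = -360.0 * fraction    # one cycle = 360 degrees; a delay is a phase lag
    print(f"{freq:.0f} Hz: period {period * 1e3:.2f} ms, "
          f"{fraction:g} of a cycle, phase {phase_deg:.0f} deg")

This prints -45, -90 and -180 degrees for 100, 200 and 400 Hz respectively, matching the values read off the graphs.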