Elliott Sound Products

Doppler distortion in loudspeakers - Real or Imaginary?

© 2004 - Rod Elliott (ESP)
Page Created 20 Aug 2004
Updated 09 Dec 2004


Preamble

In order to save everyone time (including and especially me), I must make a couple of points before anything else is discussed ...

Please make sure that you read and understand the above before you decide it is necessary to send me an e-mail pointing out things that I have already acknowledged either above or in the body of the article itself.  Should you get the impression that I am sick of e-mails pointing out that the Doppler effect involves a change of phase, or telling me that 'Doppler effect' is the best way to describe the loudspeaker distortion, you are quite correct.  I am well aware of the phase effect, and will point out that had the effect been called Phase Modulation Distortion from the outset, this article (and probably hundreds of others) would never have been written.


To say that this article has stirred up a hornets' nest is putting it mildly.  Since the initial article was published, there has been a great deal of correspondence, both for and against what I have written.  Many people seem to have missed the point, so to make absolutely sure that it is not missed, I will make it right at the beginning ...

Frequency shift in a loudspeaker is real - it does happen, and the amplitude (frequency deviation) of the shift is identical to that predicted by 'conventional' Doppler theory.  The peak phase shift occurs at the extremes of the cone movement as one should expect, and indeed, there is zero phase shift at the low frequency zero crossing - the cone's rest position.  The rate of change of phase is greatest at the zero crossing point, and this corresponds to the maximum frequency shift.

In the previous version of this article, I was under the impression that there was no phase variance at the zero crossing point; however, after some correspondence and some further (rather tedious) tests, it can be shown that the peak frequency shift does indeed occur at the point of peak cone velocity.  This is not readily apparent on the waveforms shown below, because the resolution required is so great that it could not be achieved using my oscilloscope.  As it transpires (and in hindsight - always an exact science), this makes sense, since the peak rate of change of phase (and therefore frequency) must occur during the transition from one phase angle to another.  My apologies to anyone who may have been misled.

The Doppler effect is generally applied to moving vehicles (etc.) and the most common explanation is that the wavefront is 'compressed' as the object approaches the listener, and 'stretched' as it heads away.  Another way to look at this is that the phase of the signal is constantly changing, resulting in the familiar frequency shift.  The main point here is that the effect lasts for some relatively considerable period of time, and the observed frequency shift is directly related to the velocity of the approaching (or receding) sound source.

Consider a sound source approaching at 30m/s (108km/h).  The sound source is generating a frequency of 1kHz, so as it approaches, the listener will hear not 1kHz, but about 1095Hz - a clearly audible change.  As the sound source recedes, the perceived frequency will be 912Hz - again, readily audible (see [ 4 ] for the calculator that I used).  As each successive compression or rarefaction of the wavefront reaches the listener, the phase is effectively advanced (or retarded) by approximately 108us or 39°, thus creating 'new' frequencies that are proportional to velocity and the original frequency.  I shall leave it to the reader to analyse this further if desired, but suffice to say that this phenomenon has been known for a very long time.
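
As a quick check on those numbers, the standard textbook relations can be evaluated directly.  The short sketch below (Python, purely illustrative, assuming c = 343m/s) covers both the moving-source and moving-observer cases - the figures quoted above can be found among the results, and which value a given calculator returns depends on which relation it applies:

    c = 343.0      # assumed speed of sound, m/s
    v = 30.0       # relative velocity, m/s
    f = 1000.0     # source frequency, Hz

    # Source moving, observer stationary
    print("Source approaching:  ", f * c / (c - v))    # ~1095.8 Hz
    print("Source receding:     ", f * c / (c + v))    # ~919.6 Hz

    # Observer moving, source stationary
    print("Observer approaching:", f * (c + v) / c)    # ~1087.5 Hz
    print("Observer receding:   ", f * (c - v) / c)    # ~912.5 Hz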

While the principle is well known, the 'phase method' of describing it is not - it is far more common to see the explanation deal with the compressed or expanded wavefront (and wavelength).  There is no anomaly here - both descriptions are equally valid, and explain exactly the same phenomenon.

Based on the measurements shown below, it is obvious that the conventionally explained Doppler effect does not occur with a loudspeaker - the periods where the cone is moving towards or away from the listener are too short, and the cone is not moving through the medium (air).  It could also be stated that it is not the cone that is making the sound, but rather the movement of the cone.  Several correspondents have complained that what amounts to 'movement of the movement' of the cone cannot possibly be seen to be a moving sound source per se.  This can create a philosophical conundrum if one delves into it deeply enough, but fortunately this is not necessary.  The phase shift model explains this, and it is obviously impossible for the cone to move without causing phase shift.

In short, the term 'Doppler distortion' (for a loudspeaker) is completely real, but is also a simple misnomer (IMO) - the frequency shift exists, but it can be shown that the shift in phase is static as well as dynamic.  Naturally, the same can be said for any sound source that moves with respect to the listener, regardless of velocity.  Minimising cone excursions for any driver that must reproduce higher frequencies is the key to obtaining better sound reproduction - this much is very clear.


Introduction

There seems to be a new area of debate on the Net, and it can be found in newsgroups, forum sites and websites in general.  For a long time, the existence of so-called Doppler distortion in loudspeakers was taken for granted by most (including the author), but there are now many challenges to this claim - some are well reasoned and have considerable merit, while others rest on nothing more than the challenger's refusal to believe that the effect exists at all.

Doppler distortion in loudspeakers is mentioned in so many articles, web sites and references that for years it was considered a simple fact.  A great many authors have applied their (not inconsiderable) mathematical skills to the subject, and it is fairly obvious that many (most) of the published results are based on the assumption of cone velocity, rather than absolute position.

The effect of Doppler shift in loudspeakers is believed by many to be audible, and it is (relatively) easy to measure the frequency spectrum and see sidebands that seem to show modulation consistent with frequency modulation.  Indeed, the existence of (velocity based) Doppler distortion in loudspeakers was originally established many years ago [ 1 ] and is maintained to this day.

At this point, I would like to thank Siegfried Linkwitz [ 2 ] and Art Ludwig [ 10 ] for their input to this article.  My initial published data caused both gentlemen a great deal of work (which I'm sure they could have well done without), but in the spirit of international co-operation they have been very helpful, and contributed a great deal to this (final?) version of the article.

In e-mail correspondence with Siegfried Linkwitz, he said ...

The considerable effort you put into this difficult measurement should not go unnoticed and lead to a deeper understanding of the Doppler effect.  I certainly had never thought about and made the connection between the now obvious phase shift of the high frequency tone with cone displacement and the Doppler effect.  In the same way that the frequency shift is more easy to observe under certain conditions, so is the phase modulation in the case of a loudspeaker.  Both are just different descriptions of the identical physical phenomenon to which Doppler's name has become attached.

Siegfried also said that "I have not seen the associated PM pointed out in text books, but it follows from FM or PM modulation theory".  This is really the crux of what this article is about - no textbooks describe the phase change (and the resulting phase/frequency modulation) as the cone moves closer to or further away from the listener.  This leads to a general misunderstanding as to the real cause of the effect, and obscures other effects that could have been extrapolated had this been more widely known.  More to come on this aspect shortly, but rest assured that it will not make a large difference to the way we design (or listen to) loudspeakers - it simply explains other phenomena that were not seen or predicted previously.

This paper has been a long time coming, and has had a considerable amount of input from others - both 'believers' and 'non-believers'.  Indeed, it was a non-believer who first took me to task because I referred to 'Doppler distortion' in one of my other articles, and it took some effort on his part for me to see that there was something amiss.  Since he prefers to remain anonymous, he will not be named, but suffice to say that his insistence was such that I had to work very hard to prove him wrong.  I initially thought that I had failed, but it turns out that the effect was real from the very beginning - it had simply been explained in a way that made no logical sense (to me, and presumably to him).  Papers such as that by John Kreskovsky [ 3 ] are very convincing, but unfortunately work from the wrong premise (i.e. velocity rather than phase) in the first place.  Note that the page had been updated at some stage, and some of the original claims (as I recall) were no longer there, or had been modified.  In fact, the page has since disappeared and I've not been able to find it again.

The original experiments I performed - well over a year ago at the time of writing - were inconclusive, and despite a lot of effort, it was not possible to determine exactly what was happening.  Even during tests to try to locate evidence of a frequency shift, while there was some evidence that a shift existed, the resolution of the equipment I had at the time was insufficient to be certain.  In addition, I did what I suspect many others have done before me, and removed the low frequency component from the captured audio signal - this was a mistake, because that information is needed to be able to see exactly where the shift occurs.

This study is as much theoretical as practical - the theory was needed to predict an outcome, and to examine other possibilities.  I must thank those who have listened to my rants and hypotheses and supplied additional ideas ... it was one simple point that brought the whole issue into focus, and allowed this page to (finally) become reality ... phase!


1.0 - The Doppler Effect

The Doppler effect is very real, and has been heard by everyone at some stage.  The classic example is the siren or horn on a moving vehicle, which rises in pitch as the vehicle heads towards you, and falls in pitch as it heads away.  There are countless references and descriptions in books and on the Web, and I shall not even try to cite references except one - [ 4 ].  A Web search will tell you everything you ever needed to know (plus a lot more).  Doppler shift requires that the sound source is moving with respect to the listener, which means that the listener is either stationary, or moving at a different speed from the originating sound source.  An ambulance driver or passenger does not hear any Doppler shift, because s/he is moving at the same speed as the siren, so there is no relative difference in velocity.

For the Doppler effect to exist, there must be relative movement through the medium between the sound source and the listener.  It is generally assumed that the medium itself is not moving, or is moving at a lower velocity than the sound source.  Should the medium be moving as well, this may increase or decrease the effect from the listener's perspective (information is rather scarce on this point).

That the phenomenon of the Doppler effect is real is not in question - what is in question is whether the same effect occurs in a loudspeaker reproducing more than a single tone simultaneously.  Furthermore, we should clarify the term 'distortion', since the word is normally applied to a non-linear function.  Should the effect be demonstrated to exist in a loudspeaker, then it is obvious that it will appear in an ideal (or theoretically perfect) driver as well, so there is no non-linearity.  On this basis, the effect probably should not be called 'distortion' (although it must be said that anything that adds frequencies that did not exist in the original recording is distortion, but that is probably a philosophical debate rather than one to be considered by the engineering fraternity).

The first thing to consider is the maximum velocity of a loudspeaker cone, when driven by a signal of any given frequency.  This is generally much lower than we might imagine, and the velocity may be calculated by ...

Vp = 2π × fL × Xp   [ 2 ]

where Vp is the peak cone velocity, fL is the low frequency and Xp is the cone displacement (the 10mm total travel is used in the calculations here).  For example, a cone that is moving ±5mm (10mm total) at a frequency of 50Hz will have a peak velocity of 3.14m/s (11.3km/h) - this is certainly not fast, and in itself would account for a rather small frequency shift.  In a system where the (conventional, velocity based) Doppler effect does change the frequency, the change is given by ...

ΔfH = 2π × fL × fH × Xp / c   [ 2 ]

where fH is the high frequency (modulated) tone and c is the velocity of sound (nominally 343m/s).  Using the velocity calculated earlier, we get a shift of 9.1Hz for a high frequency tone of 1kHz (less than 1%).  Bear in mind though, that the rate of change needs to be maintained for a reasonable period of time before the frequency shift will become apparent, and 10ms (the time it takes a 50Hz signal to swing from maximum positive to maximum negative) is probably insufficient to cause an audible frequency shift.  Our hearing is such that it needs a reasonable number of cycles (which varies with frequency) before we can identify the pitch of a tone.
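
To put numbers on this, the two formulae can be evaluated directly, using the same values as the text (50Hz, 10mm of travel and a 1kHz tone).  A minimal sketch (Python, illustrative only):

    import math

    c  = 343.0    # speed of sound, m/s
    fL = 50.0     # low (modulating) frequency, Hz
    fH = 1000.0   # high (modulated) frequency, Hz
    Xp = 0.010    # cone displacement as used in the text (10mm total travel), metres

    Vp  = 2 * math.pi * fL * Xp              # peak cone velocity
    dfH = 2 * math.pi * fL * fH * Xp / c     # predicted frequency shift

    print("Peak cone velocity: %.2f m/s (%.1f km/h)" % (Vp, Vp * 3.6))   # 3.14m/s, 11.3km/h
    print("Frequency shift: %.2f Hz" % dfH)                              # ~9.16Hz (quoted as 9.1Hz above)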

Remember that for the (conventionally explained) Doppler effect to exist, the 'carrier' (vehicle, train, etc.) or listener must be moving through the medium, but a loudspeaker cone does no such thing.  The medium (air) will be pushed outwards and sucked inwards by the low frequency movement of the cone, so there is no (or very little) relative movement between them - the medium is moving with the cone! What is heard (and most commonly incorrectly attributed to Doppler effect) is a combination of things (in descending order of importance) ...

Intermodulation distortion
Amplitude modulation
Phase shift

Of these, the only one that comes close to frequency modulation (as predicted by much existing theory and practice) is phase shift, and this is caused by the relative position of the cone at any instant in time in relation to the listener.  That the shift is small is obvious, and equally obviously it depends on the peak to peak travel of the cone.  Assuming a large cone movement for the low frequency signal of (say) 10mm in each direction, this represents a complete cycle (360°) shift at 34,500Hz, or 36° at 3,450Hz (for example).  It must be considered that any loudspeaker that is expected to have anywhere near 20mm travel at low frequencies, and is expected to reproduce high frequencies as well, will almost certainly generate considerable intermodulation of the higher frequencies.

Interestingly, the phase shift observed will be exactly the same at DC as at any low frequency that causes the same displacement - even moving the whole box relative to the microphone will generate the same amount of (static) phase shift.  This is simply a function of the velocity of sound in air, and the wavelength of the high frequency tone.  A 10mm change in the position of the box will cause a 36° phase shift (of a 3,450Hz tone) in exactly the same way that 10mm of cone movement will - this is a simple physical relationship.
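
The same relationship is easily calculated for any displacement and frequency - a minimal sketch (Python, illustrative only):

    import math

    c = 343.0   # speed of sound, m/s

    def phase_shift_deg(displacement_m, freq_hz):
        # static phase shift caused by moving the source closer to (or further from) the mic
        return 360.0 * displacement_m / (c / freq_hz)

    print(phase_shift_deg(0.010, 3450))    # ~36 degrees for a 10mm change at 3,450Hz
    print(phase_shift_deg(0.010, 34500))   # ~360 degrees (a full cycle) at 34,500Hz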

An interesting effect of amplitude modulation (within the range we can expect from a loudspeaker) is that when you hear it, it sounds like frequency modulation.  Again, this is a well known effect, and references abound.  The effect is caused by the way our hearing works, and this may be the predominant effect that is heard by listeners who have 'proven' that Doppler distortion exists because they have 'heard' it.  In general, the pitch seems to reduce slightly as amplitude is increased for frequencies below around 2kHz, while for frequencies above 2kHz the pitch increases with increased amplitude [ 5 ].

The majority of studies (on websites or elsewhere) that have shown that Doppler distortion does exist, have used a spectrum analyser (or FFT - Fast Fourier Transform) to show the sidebands so generated.  The spectrum analyser is completely the wrong tool to use for this, as it only shows frequency information - the effects we want to capture are in the time domain, and are best seen with an oscilloscope.

The theory of Doppler shift is velocity based (although as noted above, it can also be explained by phase) - the higher the (relative) velocity between source and listener, the greater the frequency shift.  This is extremely difficult to even attempt to measure, since the peak velocity of the cone is so low.  With phase shift, velocity is not relevant - only the variation of distance between source and listener determines the shift, and it can be shown that the peak phase shift coincides with the peak excursion of the cone (positive and negative).

In order to dispense with the idea of loudspeaker Doppler distortion, we must find the flaw in the seemingly intuitive and rather persuasive argument that has been used historically to 'prove' that Doppler distortion does in fact exist.  Simply making the point that the sound source must move through the medium with respect to the listener is not sufficient, and it has been argued that the cone does in fact do just that.  That this action would violate the laws of gaseous matter when subjected to a force is (apparently) insufficient explanation, so further proof is needed.

That there is a shift is not in question - what is in question is the exact nature of the shift, and where it occurs on the low frequency waveform.  If the shift observed is indeed in frequency, and caused by the Doppler effect, then it must occur at the point where cone velocity is at its peak - at the zero crossing point of the waveform.  However, if the shift is simply one of phase, then the peak (phase) shift will occur at the extremes of excursion, where the cone is effectively stationary.  So which is it?  Indeed, are the two effectively identical?

One of the biggest difficulties with measuring any of these effects is the fact that the delays (and associated phase shifts) are very small.  If we assume that the effect is indeed Doppler induced and that the deviation is 9Hz as predicted for our demonstration system, then the delay is only (approx.) ±28us.  Measuring this in conjunction with ambient noise, distortion, and instrument resolution limitations is not easy, since the waveform period is 1ms - the absolute maximum shift possible is ±2.8%, but is likely to be a lot less.  While these things are not a limitation with a simulator or mathematical approach, real life makes it a lot harder for us.
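
The ±28us figure follows from standard FM/PM theory - the peak phase deviation (modulation index) is the frequency deviation divided by the modulating frequency, and that phase, expressed as time at the carrier frequency, gives the delay.  A minimal sketch of the arithmetic (Python, illustrative only):

    import math

    fH   = 1000.0   # high frequency (carrier) tone, Hz
    fmod = 50.0     # low (modulating) frequency, Hz
    dev  = 9.0      # assumed peak frequency deviation, Hz

    beta  = dev / fmod                   # peak phase deviation (modulation index), radians
    delay = beta / (2 * math.pi * fH)    # equivalent peak time shift, seconds

    print("Modulation index: %.2f rad" % beta)              # 0.18
    print("Peak time shift:  %.1f us" % (delay * 1e6))      # ~28.6us
    print("As %% of period:  %.1f %%" % (delay * fH * 100)) # ~2.8-2.9% of the 1ms period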


1.1 - Frequency Modulation Characteristics

It is useful to look closely at the effects of frequency modulation, so that we know what we are looking for.  A lack of understanding at this level must result in a flawed conclusion, so a thorough understanding of what we should find is imperative.  If we find exactly the signal characteristics as described below, then the original theory is proven - this is not open to conjecture, since we are dealing with simple facts.

However, should we get results that are quite different, then the observed phenomenon cannot be frequency modulation, and is therefore not related to the Doppler effect.

The signal frequencies used for the physical tests performed later will be different from those used in the theoretical discussion that follows, for reasons that (I hope) will become clear later, but driver total excursion will be as close to ±5mm (10mm total travel) as possible.

For the theoretical examination, the low frequency will be 50Hz, the high frequency 1kHz and the peak to peak cone excursion shall be 10mm (±5mm), therefore frequency shift will be 9Hz (as predicted by the formulae above).  The high frequency tone will be modulated at the 50Hz rate, to obtain a maximum of 1009Hz and a minimum of 991Hz.  This represents a Modulation Index of 0.18 [ 6 ].  Let's do a quick examination of wavelength, based on the formula above that predicts a frequency change of ±9Hz.

W = c / f

where W is wavelength in metres, c is the velocity of sound (nominally 343m/s) and f is frequency in Hertz.

W = 343 / 1000 = 0.343m = 343mm
W = 343 / 991 = 0.346m = 346mm
W = 343 / 1009 = 0.340m = 340mm

The total change in wavelength is therefore approximately 6.2mm.  We could analyse this data forever (or until we became bored), but that is probably enough for now.

fig 1.1
Figure 1.1 - Frequency Spectrum of 1kHz, Modulated by 9Hz at 50Hz Rate

As can be seen from the spectrum display (from a synthesised FM waveform), a frequency modulated (FM) waveform modulated as described has multiple sidebands - although only a limited number is shown for clarity, they are in principle infinite in number and diminish into the noise floor quite rapidly.  It is easily determined that the fundamental is at 1kHz (at a level of 0dB), with the first set of sidebands at -20dB (950Hz and 1050Hz).  The next set is at -48dB, at frequencies of 900Hz and 1100Hz.  The final set is measured at -78dB, at frequencies of 850Hz and 1150Hz.  After that, the sidebands are at -111dB or lower - well into the noise floor.  A pure sinewave predictably shows a single peak at 1kHz - no surprises there, and it is not shown.
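
Those sideband levels are not arbitrary - for sinusoidal modulation, standard FM theory gives the level of the Nth sideband pair (relative to the carrier) as Jn(β)/J0(β), where Jn is a Bessel function of the first kind and β is the modulation index.  A minimal sketch (Python with SciPy, illustrative only) reproduces the figures quoted above:

    import math
    from scipy.special import jv    # Bessel function of the first kind

    beta = 9.0 / 50.0               # modulation index (9Hz deviation at a 50Hz rate)

    for n in range(1, 5):
        level_db = 20 * math.log10(abs(jv(n, beta) / jv(0, beta)))
        print("Sideband pair %d: %.0f dB" % (n, level_db))

    # Prints roughly -21, -48, -78 and -111dB relative to the carrier - in good
    # agreement with the levels measured from the synthesised waveform above.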

The above assumes that the shift will actually be 9Hz - it has already been explained that this will almost certainly not be the case.  In a listening test on a 1kHz tone, modulated by the full 9Hz at a 50Hz rate, it is possible to hear the FM as a slight roughness in the tone ... however, using headphones (to eliminate standing waves, room reflections, etc.), the roughness is barely audible.  If we assume that the frequency shift is only 4.5Hz - probably still unrealistic, but useful for our purposes - the signal through headphones sounds almost as clean as a pure sinewave.
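
For anyone wanting to repeat the listening test, the FM tone is easy to synthesise.  The sketch below (Python with NumPy/SciPy, purely illustrative - the filename and level are arbitrary) writes a few seconds of a 1kHz tone, frequency modulated at a 50Hz rate with a selectable peak deviation:

    import numpy as np
    from scipy.io import wavfile

    fs, seconds = 44100, 5.0
    fH, fmod, dev = 1000.0, 50.0, 9.0   # carrier, modulation rate, peak deviation (set dev = 4.5 for the second test)

    t = np.arange(int(fs * seconds)) / fs
    beta = dev / fmod                   # modulation index
    tone = np.sin(2 * np.pi * fH * t + beta * np.sin(2 * np.pi * fmod * t))

    wavfile.write("fm_test_tone.wav", fs, (0.5 * tone * 32767).astype(np.int16))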


1.2 - Phase Modulation Characteristics

Phase modulation [ 7 ] is actually very similar to frequency modulation - so much so that some narrow-band FM communications systems actually use PM (Phase Modulation) because it is easier to implement.  Unfortunately, while it may well be easier for radio frequency signals, it is harder to synthesise than FM in audio, and no sound editor that I have access to is capable of creating a PM signal.  The end result is sufficiently similar though (audibly speaking) that it is almost immaterial.

What is certainly not immaterial is the spectrum of a PM signal, so I was forced to build a phase modulator (using my Simulator and a standard analogue phase shifting circuit) that would operate satisfactorily at the modulation frequency of 50Hz.  This proved irksome, but the results were gratifying, as it is now possible to display a spectrum analysis of a PM waveform, with (almost) the same characteristics as the FM signal referred to in section 1.1.  This is an important step - without it, we do not have a useful reference and may be left with conjecture.  Imagine the embarrassment if it had transpired that PM has a spectrum similar to intermodulation.

For the sake of this exercise, the spectrum of the PM signal is not shown - it is essentially identical to that shown above for FM.  For all intents and purposes there is no difference whatsoever.  Although there actually is a (very) subtle difference, it is not visible until it is below the -120dB level, so is irrelevant in practical terms.
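
For anyone without an analogue phase modulator, the same comparison can be made numerically.  The sketch below (Python, illustrative only) generates an FM signal and a PM signal with the same peak phase deviation and compares their sideband levels - for a single sinusoidal modulating tone the spectra are identical, the two differing only in the phase of the modulating term:

    import numpy as np

    fs = 44100
    t = np.arange(fs) / fs                  # one second of signal (1Hz FFT resolution)
    fH, fmod, beta = 1000.0, 50.0, 0.18     # carrier, modulation rate, peak phase deviation (rad)

    # FM: the phase deviation is the integral of the frequency deviation (a sine becomes a -cosine)
    fm = np.sin(2 * np.pi * fH * t - beta * np.cos(2 * np.pi * fmod * t))
    # PM: the phase deviation follows the modulating waveform directly
    pm = np.sin(2 * np.pi * fH * t + beta * np.sin(2 * np.pi * fmod * t))

    for name, sig in (("FM", fm), ("PM", pm)):
        spectrum = 20 * np.log10(np.abs(np.fft.rfft(sig)) + 1e-12)
        peak = spectrum.max()
        print(name, [int(round(spectrum[f] - peak)) for f in (900, 950, 1000, 1050, 1100)])
    # Both print the same sideband levels (approximately -48, -21, 0, -21, -48 dB).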

fig 1.2.1
Figure 1.2.1 - Phase Modulator as used to Generate Phase Modulation

For reference, Figure 1.2.1 shows the schematic for the phase modulator that I used to make the comparisons.  This was created in Simetrix [ 8 ] and is not too different from a real-life working circuit.


1.3 - Test Methodology

The next stage is to determine a test method that will show the effect - this is not a simple task, because we know that any shift will be small, and is invariably accompanied by amplitude modulation and intermodulation.  Because of this, a spectrum analyser is useless, as it will show intermodulation sidebands that will completely mask any frequency variation.  The instrument of choice is an oscilloscope, because that is capable of displaying information in the time domain, where the problem occurs.

If the (velocity based) Doppler theory is correct, we should expect to see a set of waveforms similar to that shown in Figure 1.3, where a low frequency tone is mixed with a high frequency tone, and fed to a loudspeaker.  This signal is also fed to one channel of the oscilloscope, and the other is used to display the audio waveform reproduced by the loudspeaker and picked up by a microphone.  The gains of the two are carefully matched so that the waveforms are perfectly overlaid - any discrepancy should be visible.  This can only be done using a digital storage oscilloscope, because the waveform must be captured and expanded so that any shift is visible.

fig 1.3
Figure 1.3 - Expected Oscilloscope Display for Doppler Modulation of the High Frequency

The red trace shows the unmodulated carrier riding on the LF waveform (i.e. the electrical signal sent to the loudspeaker), and the blue trace shows what we should expect to see on the oscilloscope from the audio signal picked up by the microphone.  Note that the frequency deviation has been exaggerated for clarity - if it were that easy to see there would be no argument.  It is quite apparent that with FM, the modulation is at its maximum at the LF zero crossing (maximum cone velocity), and is at minimum (signals in phase) at the LF signal peaks.

If the shift is a genuine Doppler shift, then we will see the signal compressed or stretched at the zero crossing points of the LF waveform, and at the peaks of the LF waveform the HF signal should be in phase, since LF cone movement is effectively zero (for a moment in time, at least).

On the other hand, if the shift is one of phase, then the maximum variation between electrical and acoustical signals will occur at the peak of the LF waveform (positive or negative), as shown in Figure 1.4 below.  This stage of theoretical analysis is very important - it is far easier to figure out what one should look for in advance than it is to try to figure out what one is looking at with no background.

fig 1.4
Figure 1.4 - Expected Oscilloscope Display for Phase Modulation of the High Frequency

The red trace again shows the unmodulated carrier riding on the LF waveform (the electrical signal), and as before the blue trace shows what we should expect to see on the oscilloscope from the audio signal.  Note that the phase modulation too has been exaggerated for clarity.  You can see easily that for phase modulation, the maximum variation is at the LF waveform peaks, where the distance from the cone's rest position is at its greatest.  There is no modulation at the zero position, despite this being the point of maximum velocity for the LF signal.

This alone should start to give the astute reader a hint - the cone is definitely moving closer to and farther away from the microphone, so there must be a variation in the relative phase.  After all, the microphone position had to be adjusted to ensure that the acoustic and electrical high frequency signals were in phase during the setup phase.  If you are anything like me, at this point you'll be saying 'Of course! This must be so.' (or words to that effect).

Figure 1.5 shows a schematic of the test setup.  Several things are critical to get a usable result, and these are described below ...

fig 1.5
Figure 1.5 - Test Setup Circuit Diagram

The 50k pot allows the exact 'blend' of low and high frequency signals to be set for the reference trace (Channel 1 on the oscilloscope).  By varying the levels from the two oscillators, the loudspeaker drive signal is changed to suit requirements, and the mic preamp also has a gain control so the acoustic signal level can be exactly matched to the reference - in both phase and amplitude.  The 560nF cap was needed to rotate the LF phase enough to be able to use a sensible frequency.

It is important to understand the exact test setup used to perform the test.  First, the speaker is excited by the HF tone only, and the microphone gain and physical position adjusted so that the electrical and acoustic signals are exactly in phase.  This is the reference signal, and it is imperative that this is done very accurately, or the end result will be nonsense.  The HF signal is then removed, and the LF signal is applied - because of the use of a close microphone, there will be little LF phase shift ... but the speaker must be operated at (or near) resonance, or the reactive load will cause the electrical and acoustic waveforms to be out of phase.

This can be used to our advantage - by changing the low frequency slightly, we can advance or retard the phase just enough to make sure the signals are again perfectly aligned.  By operating the woofer near resonance, the best relationship between cone excursion and power is possible - setup and adjustment take some time, and a smoking woofer voicecoil is not helpful.  Likewise, for the high frequency signal, it is advantageous to use the highest frequency possible, so that any shift that occurs is maximised, and becomes easier to see on the oscilloscope.  Because of the relatively short wavelength, mic positioning will correct all phasing errors (the voicecoil is inductive, and causes phase shift at high frequencies).  Also note that the woofer will not be a linear piston at such a high frequency, so measured phase variations may not agree 100% with those calculated.

One might imagine that the test would be easier if the low frequency were filtered out, since that allows a better resolution of the HF, but then we would lose the essential information that shows us where the LF zero crossing points and waveform peaks are - these are the parts of the waveform where we need to see what is happening, and we need the LF data.

As a point of interest, it must be stated that the microphone's diaphragm does not in any way influence the test results, since it is an electret capacitor mic.  These microphones respond to pressure, and diaphragm movement is virtually nil - it is probably around the same as the movement of the human ear-drum, so any thoughts that this may introduce an 'anti-Doppler' effect are simply untrue.  In comparison to cone movement, the travel is infinitesimal, and can safely be ignored.

Not so the woofer however - it proved necessary to wedge the woofer (sitting in a plastic crate filled with fibreglass) with a piece of wood from the ceiling to stop it from walking around the workshop floor! With an applied frequency of 36.8Hz (required to obtain the correct phase relationship and sufficient travel), there was no way it wanted to stay put.  Even the high power 'digital' amp I used proved to be a problem as it kept cutting out, and I finished up using one of my P101 MOSFET amps - this was more than up to the task, and didn't get above warm, despite the load.


2.0 - Test Results

This is almost an anti-climax, since it should be fairly obvious by now what we will see.  The results do not show a great deal of phase shift, but it is visible.  It was necessary to adjust the signal levels slightly to get the best alignment, since, as you can see in Figure 2.1, there is significant LF waveform distortion.

fig 2.1
Figure 2.1 - The Composite Waveforms - Signal and Acoustic

As with all the following plots, the electrical signal is in red, and the acoustic signal in blue.  It is obvious that any phase shift in the acoustic signal is not visible in the above, so the following three graphs show the expanded waveform at different points.  The oscilloscope used is a Tiepie PC based instrument, and for this work was operated at a sample rate of 2.5Ms/s (2.5 million samples/ second) at 12 bits resolution, and with a record size of 130,972 samples.  It exceeds the resolution available from my normal bench oscilloscope by a wide margin.

Amplitude modulation is easily seen, and was also quite audible during the test - even with earmuffs.  Look at the peaks of the waveform, and you can see that there is a big difference between the electrical and acoustic signal HF amplitudes - in a perfect woofer, these would be identical.  Likewise, the low frequency signal would not be distorted (and this was from a high excursion subwoofer).  Although the waveform looks a little like the power amp was clipping, this is not the case - the distortion seen is from the loudspeaker itself.

fig 2.2
Figure 2.2 - Signals at LF Zero Crossing

It is quite apparent that there is no phase shift at the LF zero-crossing point, even though this is the area where cone velocity is highest.  While not really visible here (much higher resolution is needed to be able to measure it), there is a change to the period of the reproduced waveform - unfortunately, it only amounts to a pixel at best in the chart, and is easily missed.  If the periodic time of a waveform changes, so does its frequency.  For a 1kHz signal (having a period of 1000us), we can expect to see a change of only a few microseconds in the period (typically less than 10us, or 1%).  This is a challenge to measure, as should be obvious.
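
The size of the change being looked for is easily estimated.  Using the earlier theoretical figures (a 1kHz tone with 9Hz of deviation) purely for illustration:

    f0, dev = 1000.0, 9.0            # nominal frequency and assumed peak deviation, Hz

    t_nominal = 1e6 / f0             # 1000us
    t_shifted = 1e6 / (f0 + dev)     # ~991.1us at the maximum instantaneous frequency

    print("Change in period: %.1f us (%.2f %%)" % (t_nominal - t_shifted,
          100 * (t_nominal - t_shifted) / t_nominal))    # ~8.9us, a little under 1%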

fig 2.3
Figure 2.3 - Signals at LF Positive Peak

Here, we can see that the HF acoustic signal is arriving just before the electrical signal - the cone is closer to the microphone, so the signal arrives earlier.  Look at the waveform peaks - not the zero-crossing points.  The difference is discernible, but is still quite small.  This is quite possibly the first time you have ever seen this effect, and it's all captured from a real test.

fig 2.4
Figure 2.4 - Signals at LF Negative Peak

While not quite as pronounced as with the previous example, a small amount of shift is visible, since the cone is now further from the microphone than before.  Even using a 4.65kHz signal, the wavelength is still about 74mm, so the cone travel I was able to achieve without excessive distortion makes the variations smaller than desirable.  The variation is there though, as predicted.

It is important to point out that the only manipulation of these graphs was to reverse them, because the microphone I used inverted the signal (so the positive and negative excursions were back to front).  Otherwise, the graphs are as they were saved from the PC oscilloscope, but the text (inserted when the graph is saved) was removed to minimise file size.


Conclusion

Upon testing, it is easily seen that the maximum phase shift of the HF signal occurs at the peak of the LF waveform.  There is little or no discernible (phase) shift at the LF zero crossing, so the effect is phase shift, caused by the cone being closer (or further away) from the observer/listener at any point in time.  Yes, there is an effective frequency shift, but (and although this is an apparently minor point, it is very important) the shift is not caused by the Doppler effect per se (i.e. the 'conventional' velocity based interpretation), so cannot (or should not) correctly be called 'Doppler distortion'.  Exactly the same phase shift can be seen simply by disconnecting the LF signal, and manually moving the cone or microphone (slowly) by the same amount, or by using DC or an extremely low frequency to achieve the same result.  This cannot cause the Doppler effect, as should be obvious.  What it does is shift the phase, and if the phase shift is fast enough, then the phase modulation will 'create' frequency modulation - exactly as predicted.

It is fair to state that 'Doppler distortion' in loudspeakers exists, but the term is misleading.

It is equally fair to state that Phase Modulation in loudspeakers also exists, and is the correct definition of the effect.

Using the predictions that may now be obtained based on phase modulation, it can be demonstrated that the phase shift is independent of the LF modulation frequency, and depends only on the peak-to-peak amplitude.  Nonetheless, the FM component of the phase modulation that is produced is a direct function of the LF modulation frequency.  The two theories are not at odds with each other, but describe the same effect from a different perspective.  The phase modulation model is more intuitive, and eliminates any of the arguments about the sound source moving through the medium etc.  While it is unlikely that any of this will make the arguments go away, the procedures described here can be reproduced by anyone with the right equipment.  Indeed, a complete test is not even needed to be able to see clearly that changing the mic position with respect to the sound source will change the phase - everything else can be calculated from there.

Further analysis reveals that the traditional predictive formulae that have been used work - they predict exactly the same frequency shift as is obtained by analysis of the phase modulation.  What was not taken into consideration was the fact that the observed FM was really PM all along (a subtle difference).  Since the peak cone travel was always used as part of the equation - as it must be to determine instantaneous velocity - the analysis turns out to be identical regardless of the method used.  When the rate of change of phase is considered (where maximum rate of change implies maximum frequency shift), the peak frequency variation coincides with the maximum rate of change of phase, so the maximum frequency shift is also seen at this point.
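
This equivalence is easy to verify numerically.  In the sketch below (Python, illustrative only, using the same 50Hz / 1kHz / 10mm figures as earlier), the phase of the high frequency tone is made to track the cone displacement, the phase is differentiated to obtain the instantaneous frequency deviation, and the peak value is compared with the traditional Doppler prediction:

    import numpy as np

    c, fL, fH, Xp = 343.0, 50.0, 1000.0, 0.010    # figures used earlier in the article

    t   = np.linspace(0, 1.0 / fL, 100001)        # one low frequency cycle
    x   = Xp * np.sin(2 * np.pi * fL * t)         # cone displacement
    phi = 2 * np.pi * fH * x / c                  # phase shift of the HF tone due to that displacement

    inst_dev = np.gradient(phi, t) / (2 * np.pi)  # instantaneous frequency deviation, Hz

    print("Peak deviation from the phase model: %.2f Hz" % inst_dev.max())
    print("Traditional Doppler prediction:      %.2f Hz" % (2 * np.pi * fL * fH * Xp / c))
    # Both give ~9.16Hz - the phase (PM) and velocity (FM) descriptions are the same phenomenon.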

Note: By examining the phase model described herein, and upon realising that it is identical to the conventionally explained Doppler effect, it is now obvious that there is no requirement for the sound source to move through the medium at all.  While this is a common argument by the non-believers, it fails to stand up to scrutiny.  All that is required is that the sound source moves with respect to the listener - whether (or not) it moves through the medium is immaterial.  While this may not be apparent from the traditional descriptions and explanations of the effect, the phase model makes the situation quite clear.

Very much unlike many (most?) of the other examinations of the subject, this test procedure was performed on a real speaker, and the results of these real-life tests are shown above.  There is sufficient information in this paper to allow anyone else to duplicate the results, provided the necessary equipment is available.  This is encouraged - the more people who understand the physics that cause the effect the better, and I would be delighted to hear from others who have applied this (or any other) test method to demonstrate the mechanism involved.

This is not merely a theoretical discussion of the effect, but contains the unadulterated results of the test procedure described.  At the risk of offending those who believe that there are things in audio that no instrument other than well trained ears can detect, this is proof that a properly designed test method can be devised, and a theory thus proven or disproved.

The effect is very small (to the point of being virtually inaudible by itself), and is usually swamped (or masked if you prefer) by amplitude modulation and intermodulation distortion, so could be considered immaterial in any typical loudspeaker system.  If people really want to describe the effect as a distortion - not an unreasonable assumption, since it does exist - then it should be re-named.  The correct term is (in my opinion) Phase Modulation Distortion (PMD), and I suggest that the term 'Doppler distortion' be dropped from usage, since it is too easy for people to misinterpret.

Even PMD is possibly technically incorrect, since the effect is perfectly linear, and will occur in a perfect loudspeaker.  Regardless, it is probable that people will want to call it distortion.  Linear or not, it still adds something to the signal that was not there in the first place, and it is not unreasonable to call that distortion.

Many readers will be glad to see that there is no mathematical proof of PMD, nor have I attempted to devise a formula to allow you to calculate the shift of any given frequency with a known LF cone displacement.  So many others have already done this that there is no reason for me to add more on the topic. 

Siegfried Linkwitz is one who has done exactly that - the mathematical proof (and equivalency) may be seen on his site, and his page now covers the topic from both the traditional and 'new' perspectives.

The methods that may be used to minimise PMD are exactly the same as those used to minimise intermodulation distortion - primarily, reduce the excursion of the mid-bass driver.  This may be done by using a crossover and a subwoofer, or by choosing an alignment that reduces the low frequency excursion to the absolute minimum.  Naturally, a 3-way system will outperform a 2-way in this respect, since the midrange driver's excursion will be minimal with no bass content.
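
The reason a crossover helps so much is that, in a driver's piston range, the cone excursion needed for a given acoustic output rises by a factor of four for each octave the frequency falls (the familiar 1/f² relationship).  A rough sketch of that relationship (Python, illustrative only - the 100Hz reference point is arbitrary):

    # Relative cone excursion needed for constant acoustic output (piston band approximation)
    for f in (200, 100, 50, 25):
        rel = (100.0 / f) ** 2        # excursion relative to that needed at 100Hz
        print("%3d Hz: %5.2f times the excursion needed at 100Hz" % (f, rel))
    # Halving the frequency quadruples the required excursion, so crossing the mid-bass
    # driver over to a subwoofer removes the largest excursions (and most of the PM).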

Finally, it should be understood that this is a purely physical phenomenon, and short of extensive DSP (Digital Signal Processing) nothing can be done to prevent it if a driver is expected to handle bass and high frequencies simultaneously.  While DSP is quite capable of applying a delay correction to remove the effect, the frequency shift is so small that the benefits are of dubious value.

If you really want to hear Doppler shift in a loudspeaker, get hold of a Leslie rotating speaker cabinet [ 9 ].  The high frequency rotating horn (in particular) satisfies all the criteria for Doppler effect - the sound source moves through the medium, and has sufficient diameter (and speed) to create a spectacular sound effect.  There is also considerable amplitude modulation, and interesting phase effects as well.  That you cannot expect to get this from a conventional loudspeaker is readily apparent, since Don Leslie would have not had to go to all that trouble if it were otherwise.


References
1   Paul Klipsch - CE Hall of Fame (link broken)
2   Issues in loudspeaker design - 1 (section 'J') - Siegfried Linkwitz
3   Doppler shifts in loudspeaker. Fact or fiction? - John Kreskovsky (website no longer exists)
4   HyperPhysics (Georgia State University) - very useful and detailed explanation of the Doppler effect
5   HyperPhysics - Effect of Loudness Changes on Perceived Pitch
6   Types of Modulation - Dennis J. Ramsey (website no longer exists)
7   Phase Modulation - Integrated Publishing
8   Simetrix - one of the best simulators I have ever used
9   Unearthing ... The Leslie Cabinet - Clifford A. Henricksen, Community Light & Sound (1981)
10  Piston Vibrating in a Tube - Art Ludwig

Other useful reading material
1   Doppler Distortion - Piston Vibrating in a Tube Including the Effect of Excursion - Art Ludwig
2   Sound - Clint Sprott, Physics Department, University of Wisconsin
3   Christian Andreas Doppler - biography of the man himself

 

Copyright Notice. This article, including but not limited to all text and diagrams, is the intellectual property of Rod Elliott, and is Copyright © 2004.  Reproduction or re-publication by any means whatsoever, whether electronic, mechanical or electro-mechanical, is strictly prohibited under International Copyright laws.  The author (Rod Elliott) grants the reader the right to use this information for personal use only, and further allows that one (1) copy may be made for reference.  Commercial use is prohibited without express written authorisation from Rod Elliott.
Page created and copyright © 20 August 2004