## Thursday, July 3, 2014

### Decibels vs. Distance

This is a random mathematical musing.

Since sound intensity drops as a function of (the square of) distance away from the sound source, I was curious as I fell asleep last night how it would look to graph decibels (a logarithmic quantity) as a function of distance from the source.

Intuitively, you'd expect decibels to still be negatively correlated with distance. As distance increases, decibels should drop. But would they drop linearly? Rapidly? It's basically wrapping a logarithmic curve on top of a rational curve, or in other words, a real-life function composition!

I did a bit of digging.  This website has a clear visual for why sound intensity drops as a square of the distance. (Basically, for non-physics readers, you're spreading the loudness over a greater surface area, if you consider the sound source as sending vibrations in all spatial directions, spherically outwards.) So, if you wrap the functional definition of Decibels around that, you get this function that maps x, the distance away from source, to d(x), the Decibels measured at that distance:

d(x) = 10 log10(a^2/x^2) + b, where log10 is the base-10 logarithm (decibels are defined in base 10) and (a, b) is a point with a known decibel value. For example, if at 4 feet away from the source the loudness is 20 decibels, then (a, b) = (4, 20).
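As a quick sanity check, here's that function as a few lines of Python, using the (a, b) = (4, 20) example from above (the defaults are just that example, not anything canonical):

```python
import math

def decibels(x, a=4.0, b=20.0):
    """Decibel level at distance x, given a known reading of b dB at distance a."""
    return 10 * math.log10(a**2 / x**2) + b

print(decibels(4))   # 20.0 -- recovers the known reading at the reference distance
print(decibels(8))   # about 13.98 -- roughly 6 dB quieter at double the distance
```

Plugging in the reference distance gives back 20 dB, and doubling it already hints at the 6-dB rule discussed next.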

The "common" (i.e., Googlable) observation about this graph is that every time you double the distance from the source (for example, going from x = 4 to x = 8, or x = 32 to x = 64), the decibel value drops by about 6 decibels. The graph looks like an upside-down log graph, which makes sense if you apply log rules to d(x) to decompose it into a constant part and a part that varies with x.
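The log-rule decomposition makes the 6-dB-per-doubling rule easy to see numerically: d(x) = (b + 20·log10(a)) − 20·log10(x), so each doubling subtracts 20·log10(2) ≈ 6.02 dB, regardless of where you start. A small sketch (same (4, 20) example as before):

```python
import math

a, b = 4.0, 20.0

def d(x):
    return 10 * math.log10(a**2 / x**2) + b

# The drop per doubling is the same constant no matter the starting distance:
for x in [4, 8, 16, 32]:
    print(x, "->", round(d(2 * x) - d(x), 4))   # always about -6.0206

# ...because d(x) = (b + 20*log10(a)) - 20*log10(x),
# and doubling x subtracts 20*log10(2):
print(20 * math.log10(2))   # about 6.0206
```

That constant drop per doubling is exactly why the curve looks linear when distance is plotted on a log scale.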

Another interesting feature of this graph that raised questions for me was the end behavior. On the right side, the function dips below the x-axis, which indicates that after 40 feet, this particular sound source can no longer be heard (0 dB is the threshold of hearing). On the left side, however, it wigs me out that the decibel level should approach infinity as you get closer and closer to the sound source. Is that because I never really understood decibels before? Mathematically, I can see how as the surface area approaches zero, the intensity of the sound approaches infinity, but somehow it's hard to wrap my mind around the fact that the measurable decibel level would also go to infinity as x --> 0.
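Both ends of the graph can be checked from the formula. Setting d(x) = 0 and solving gives x = a·10^(b/20), which for the (4, 20) example lands exactly on the 40-foot crossing; and pushing x toward zero shows the unbounded left side. A sketch:

```python
import math

a, b = 4.0, 20.0

def d(x):
    return 10 * math.log10(a**2 / x**2) + b

# Right end: where does the curve cross 0 dB (the threshold of hearing)?
#   10*log10(a^2/x^2) + b = 0  =>  x = a * 10**(b/20)
threshold = a * 10**(b / 20)
print(threshold)   # 40.0 feet for (a, b) = (4, 20)

# Left end: the model grows without bound as x -> 0:
for x in [1, 0.1, 0.01]:
    print(x, round(d(x), 2))   # about 32.04, 52.04, 72.04
```

(Of course, a real source has nonzero size, so the model breaks down before x actually reaches 0; the blow-up is a feature of the point-source idealization, not of real sound.)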

Now, can we corroborate this with some kind of experiment?