For a current hobby project I need an ultrasonic distance sensor. It works by sending a short pulse of ultrasound and measuring the time t it takes to be reflected back from the first obstacle. A nice introduction can be found here. The distance d traveled by the sound can then be calculated from a measurement of t: multiply the speed of sound v_\mathrm{sound}=340.29\frac{\mathrm m}{\mathrm s} by t. Divide by two to get the distance between the emitter and the reflecting object (the sound has to travel the distance twice to get there and back again).
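In formula form, with t the measured round-trip time:

    \[d_\mathrm{object}=\frac{v_\mathrm{sound}\,t}{2}\]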


I want to measure distances with an accuracy \Delta d of a few centimeters, so the time measurement needs to be accurate to

    \[\Delta t<\frac{2\,\Delta d}{v_\mathrm{sound}}\]

For one centimeter this amounts to about 60 \mu\mathrm s. This timing accuracy would be easy to achieve on a microcontroller, where you can use hardware interrupts and have cycle-by-cycle accuracy of your timing (so at 1 MHz you can get an accuracy of 1 microsecond). On a Linux system your timing accuracy suffers from the fact that your measurement competes with other software for the CPU. Especially with an interpreted language like Python, timing can get far worse than 60 \mu\mathrm s. The first solution I tried was this Python code. No problem on an idle system, but as soon as you start recording HD video in parallel, your distance measurement gets quite noisy: it can jump by as much as half a meter from measurement to measurement.
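To put numbers on the timing budget, here is a quick side calculation (not part of the linked script; the function names are made up for illustration):

```python
# Speed of sound in air, matching the value used above (m/s)
V_SOUND = 340.29

def required_timing_accuracy(distance_accuracy_m):
    """Maximum allowed timing error for a given distance accuracy.
    The factor of two comes from the pulse traveling there and back."""
    return 2.0 * distance_accuracy_m / V_SOUND

def distance_from_echo_time(t_s):
    """Distance to the obstacle from the measured round-trip time."""
    return V_SOUND * t_s / 2.0

print("timing accuracy for 1 cm: %.1f microseconds"
      % (required_timing_accuracy(0.01) * 1e6))      # 58.8 microseconds
print("error from 1 ms of scheduling jitter: %.2f m"
      % distance_from_echo_time(1e-3))               # 0.17 m
```

This also shows why a busy system hurts so much: a single millisecond of scheduling delay already corresponds to about 17 cm of apparent distance.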

For better timing accuracy one has to minimize delays between the incoming signal on the hardware pin and the recording of the time. With this goal I tried to modify the script to get an interrupt-based version. (An interrupt is a way of telling your machine that, when something happens, it should automatically run a piece of code, allowing for decent timing even in the presence of foreground processes.)
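The core idea can be sketched in a hardware-independent way (this is an illustration, not the actual script; the class and method names are made up, and in the real measurement the two methods would be registered as GPIO edge-interrupt callbacks, e.g. via RPi.GPIO's add_event_detect):

```python
import time

V_SOUND = 340.29  # speed of sound in m/s

class EchoTimer(object):
    """Takes a timestamp on each edge of the echo pulse.

    Registered as interrupt callbacks, these methods run as soon as the
    edge arrives instead of whenever a polling loop happens to notice it.
    """
    def __init__(self):
        self._rise = None
        self.distance = None  # last measured distance in meters

    def on_rising_edge(self, channel=None):
        # echo pin went high: the pulse is on its way
        self._rise = time.time()

    def on_falling_edge(self, channel=None):
        # echo pin went low: the reflection has arrived
        if self._rise is not None:
            round_trip = time.time() - self._rise
            self.distance = V_SOUND * round_trip / 2.0
```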

The result you can find here. It is object-oriented and built such that you initialize the distance sensor once and from then on it automatically updates the current distance at regular intervals. It even calculates the standard deviation for you as an estimate of whether the measurement was noisy. (When the timing gets garbled due to high load, you get a noisy measurement result.) An example of how it can be used can be found in the main loop of the script.
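The averaging and noise estimate themselves are simple; something along these lines keeps a window of recent measurements and exposes a mean and standard deviation (a sketch with made-up names, not the actual class):

```python
import math
from collections import deque

class RollingStats(object):
    """Mean and standard deviation over the last n measurements."""

    def __init__(self, n=10):
        self._values = deque(maxlen=n)  # old values fall out automatically

    def add(self, value):
        self._values.append(value)

    @property
    def distance(self):
        return sum(self._values) / len(self._values)

    @property
    def deviation(self):
        m = self.distance
        return math.sqrt(sum((v - m) ** 2 for v in self._values)
                         / len(self._values))
```

A large deviation then flags a measurement run that was disturbed by load.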

You use the class like this:

import distanceMeter
#here is your code
#whenever you need the current distance you get it from the object
print "current distance is %0.3f, deviation %0.3f" % (distanceSensor.distance, distanceSensor.deviation)

#as the sensor runs in a thread you should tear it down properly when you are done

Still, this is not the end of it: even with the interrupt method I continue to get noisy distance data when the measurement code competes with other IO-heavy software on the Pi, so stay tuned for an update with a C-based measurement module.

Update: The post about the C-based module is here.