Calibration

Ashley Gittins edited this page May 25, 2024 · 1 revision

Calibration isn't strictly required, because Bermuda compares the relative distances between proxies, so the absolute distance value doesn't matter when deciding between areas — calibration won't have any effect there. The exception is the max_radius setting, which may consider a device too far away from an area to be counted as within it.

This will change when we have support for per-device calibration, at which point it will be quite helpful to be able to calibrate each receiver's sensitivity. Also, when we have full trilateration, per-transmitter calibration may turn out to be useful in 2D, and will almost certainly be relevant if we ever gain 3D support.

Anyhow, the current way to calibrate is:

  • Place a transmitter 1 metre (just over 39 inches) from a scanner (bluetooth proxy)
  • In the attributes section of the transmitter's Distance sensor, watch the "Area rssi" value. Get a feel for what you consider to be an average. This will be your "Reference power at 1m" value.
  • Now move the transmitter away some distance (and measure that distance). Having a clear line-of-sight between the transmitter and the scanner is a good idea.
  • Now watch the "Distance" value. You want it to average around the right distance (but err towards a higher value, since a shorter measured distance is statistically less likely). That sentence is deliberately coy, since RF is a black art and I am not an ordained sorcerer. Also, some reflections might be in phase, but most will not be.
  • If the distance measured is always too high, decrease your attenuation figure. If it's too short, increase it.
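The two values you are tuning in the steps above — "Reference power at 1m" and attenuation — are the parameters of the standard log-distance path-loss model, the usual way an RSSI reading is converted to a distance estimate. A minimal sketch of that model (the function name and default values here are illustrative, not Bermuda's actual code):

```python
def rssi_to_distance(
    rssi: float,
    ref_power: float = -55.0,   # expected RSSI at 1 m ("Reference power at 1m")
    attenuation: float = 3.0,   # environmental path-loss exponent (free space is ~2)
) -> float:
    """Estimate distance in metres from a single RSSI reading,
    using the log-distance path-loss model:
        rssi = ref_power - 10 * attenuation * log10(distance)
    solved for distance."""
    return 10 ** ((ref_power - rssi) / (10 * attenuation))

# At exactly the reference power, the estimate is 1 metre:
print(rssi_to_distance(-55.0))  # 1.0
# A weaker signal maps to a larger distance:
print(rssi_to_distance(-85.0))  # 10.0
```

This is why a single noisy RSSI sample swings the distance estimate so much: the RSSI sits inside an exponent, so a few dB of multipath fading can easily double or halve the result. Averaging over readings, as the steps above suggest, is essential.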

Repeat this procedure until you decide nothing works, the universe is pure chaos and it's time to give up.