Error detection and correction

RDS uses a cyclic code to acquire block synchronization and to detect bit errors caused by noise. The scheme is powerful but will not detect 100% of errors; this is a design tradeoff.
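To make the detection step concrete, below is a minimal sketch, not redsea's actual implementation, of how a receiver can check a 26-bit block (a 16-bit information word followed by a 10-bit checkword). The block is treated as a polynomial over GF(2) and divided by the RDS generator polynomial g(x) = x^10 + x^8 + x^7 + x^5 + x^4 + x^3 + 1; once the block's offset word has been removed, a nonzero remainder means the block contains detectable errors. Function and constant names here are illustrative.

```cpp
#include <cstdint>

// x^10 mod g(x) = x^8 + x^7 + x^5 + x^4 + x^3 + 1, used to reduce the
// running remainder one bit at a time.
constexpr uint16_t kPolyRemainder = 0b0110111001;

// Remainder of a 26-bit block, interpreted as a polynomial over GF(2),
// divided by g(x). A zero remainder means no errors were detected.
uint16_t Syndrome(uint32_t block26) {
  uint16_t reg = 0;
  for (int i = 25; i >= 0; --i) {
    const bool bit      = (block26 >> i) & 1u;
    const bool overflow = (reg >> 9) & 1u;   // x^9 term about to become x^10
    reg = static_cast<uint16_t>(((reg << 1) | bit) & 0x3FF);
    if (overflow) reg ^= kPolyRemainder;     // reduce the x^10 term modulo g(x)
  }
  return reg;
}

// The 10-bit offset word (A, B, C, C', D; values are listed in the standard)
// is XORed onto the checkword at the transmitter, so it has to be removed
// before the divisibility check. The offset words also provide block
// synchronization: a decoder knows it is aligned when error-free blocks with
// the expected offset words keep appearing at 26-bit intervals.
bool HasDetectableErrors(uint32_t block26, uint16_t offset_word) {
  return Syndrome(block26 ^ offset_word) != 0;
}
```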

RDS can even correct small error bursts of 1 to 2 bits, so we don't have to reject all erroneous blocks. This enables us to receive more blocks in the presence of noise but, as a tradeoff, also lets some more errors through.
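As an illustration of the idea only: a burst spanning at most two bits is either a single flipped bit or two adjacent flipped bits, so there are just 51 candidate patterns in a 26-bit block, and a decoder can simply try each one and accept the first that makes the remainder zero. The brute-force sketch below reuses the hypothetical Syndrome() helper from above; this is not how redsea implements it (real decoders typically use a shift-register error-trapping technique), but the effect is the same.

```cpp
#include <cstdint>
#include <optional>

uint16_t Syndrome(uint32_t block26);  // polynomial division by g(x), from the sketch above

// Try to correct an error burst spanning at most two bits. Returns the
// corrected 26-bit block, or std::nullopt if no such burst explains the
// syndrome and the block should be rejected.
std::optional<uint32_t> CorrectShortBurst(uint32_t block26, uint16_t offset_word) {
  const uint32_t codeword = block26 ^ offset_word;  // remove the offset word
  if (Syndrome(codeword) == 0)
    return block26;                                 // no detectable errors

  for (int pos = 0; pos < 26; ++pos) {
    // Candidate bursts: one flipped bit (0b1) or two adjacent flipped bits (0b11).
    for (uint32_t pattern : {0x1u, 0x3u}) {
      const uint32_t burst = pattern << pos;
      if (burst >> 26)
        continue;                                   // burst must fit inside the block
      if (Syndrome(codeword ^ burst) == 0)
        return block26 ^ burst;                     // flip the erroneous bits back
    }
  }
  return std::nullopt;                              // too noisy; give up
}
```

This also shows where the extra undetected errors come from: a heavily corrupted block can, by chance, be one short burst away from some other valid codeword, in which case the "correction" silently produces wrong data.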

If you prefer to see fewer errors in noisy conditions (at the cost of fewer correct blocks) you can, counterintuitively, disable error correction with --no-fec. This makes redsea always reject erroneous blocks instead of correcting them; redsea will 'give up' when it's too noisy. Note that even this is not 100% error-free when noise is very bad; more on that below.

What the literature says

As stated in the US RBDS Standard (1998, Annex B), the error correction code:

  • Detects all single and double bit errors in a block.
  • Detects any single error burst spanning 10 bits or less.
  • Detects about 99.8% of bursts spanning 11 bits and about 99.9% of all longer bursts.

The code is also an optimal burst error correcting code and is capable of correcting any single burst of span 5 bits or less.

However, during redsea's development the full 5-bit error correction has not proven reliable in practice. Following another source (Kopitz & Marks 1999: "RDS: The Radio Data System", p. 224), redsea's error correction functionality has been limited:

the use of the full error-correcting capability greatly increases the undetected error rate and thus also reduces the reliability [...] the error-correction system should be enabled, but should be restricted by attempting to correct bursts of errors spanning one or two bits.

The statistics

The plot below shows how enabling vs. disabling forward error correction, or FEC, affects the amount of correct and incorrect data coming through redsea 1.0-SNAPSHOT (y axis) in worsening conditions (x axis). When FEC is disabled, so few errors get through that the dark red bars are barely visible; the rate is on the order of 0.2% at worst.

There's more tabulated data in this discussion.

This test used purely random noise, even though the standard specifically talks about burst errors, so it might not reflect real-world conditions in that sense. Error correction was limited to bursts of at most two bits.

[Plot: FEC statistics]

See Benchmark results for the impact of SNR.