# Microphone Calibration
RUBAT’s calibration factor is a single number — Pa per ADC unit — that maps the raw normalised audio samples (range −1 to +1) to physical sound pressure in Pascals. Once entered, the live waveform display switches to dB SPL re 20 µPa and the Auto-mode threshold is interpreted in the same absolute scale.
Every audio sample x arriving from the ADC (in the range −1 … +1) is converted as follows:

$$p_{\text{Pa}} = C \cdot x \quad \text{[Pa]}$$

$$L_{\text{SPL}} = 20 \log_{10}\!\left(\frac{p_{\text{Pa}}}{20 \times 10^{-6}}\right) \quad \text{[dB SPL re 20 µPa]}$$
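As a minimal sketch of this conversion (Python here, for illustration only; the function name is invented, and `C` stands for the calibration factor in Pa per ADC unit):

```python
import math

def to_db_spl(x_rms, C):
    """Convert an RMS level in normalised ADC units to dB SPL re 20 uPa."""
    p_pa = C * x_rms                      # pressure in Pascals
    return 20 * math.log10(p_pa / 20e-6)  # dB SPL re 20 uPa

# Example: with C = 10 Pa/unit, an RMS level of 0.1 ADC units is 1 Pa RMS,
# i.e. the 94 dB SPL reference level of a standard calibrator.
print(round(to_db_spl(0.1, 10.0), 1))  # → 94.0
```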
## What is a “Pa per ADC unit”?
A modern audio interface presents samples normalised so that full-scale digital = 1.0 (and −1.0). The physical voltage that produces that full-scale reading is the interface’s full-scale input voltage, $V_{\text{FS}}$.
The microphone element produces a voltage proportional to acoustic pressure. Given microphone sensitivity $S_m$ (V/Pa) and total preamp gain $G$ (linear, unitless):
\[C = \frac{V_{\text{FS}}}{S_m \cdot G} \quad \text{[Pa/unit]}\]

Three practical routes to obtain $C$ are described below — choose the one that fits your equipment.
## Method 1 — From manufacturer datasheets (no extra hardware)
You can calculate $C$ entirely from published specifications with no acoustic reference source.
What you need:
- Microphone sensitivity (datasheet) — usually given as mV/Pa or dBV/Pa (re 1 V/Pa)
- Audio interface full-scale input level — usually given as dBu or dBV
- Preamp gain setting
### Step-by-step

**Step 1: convert the microphone sensitivity to V/Pa.** The datasheet will give a value such as −34 dBV/Pa or 20 mV/Pa. Convert to V/Pa: $$S_m \,[\text{V/Pa}] = 10^{S_{dBV}/20}$$ Example: −34 dBV/Pa → $10^{-34/20} = 0.0200$ V/Pa = 20 mV/Pa.

**Step 2: find the interface's full-scale peak voltage.** Interface specs often give a maximum input level in dBu (re 0.775 V RMS) or dBV (re 1 V RMS): $$V_{\text{FS, peak}} = \sqrt{2} \cdot 0.775 \cdot 10^{V_{dBu}/20} \quad \text{(if given in dBu)}$$ $$V_{\text{FS, peak}} = \sqrt{2} \cdot 10^{V_{dBV}/20} \quad \text{(if given in dBV)}$$ Example: +18 dBu full scale → $0.775 \times 10^{18/20} = 6.16$ V RMS, so $V_{\text{FS, peak}} = \sqrt{2} \times 6.16 = 8.71$ V.

Note: RUBAT's ADC unit corresponds to peak amplitude. If your interface datasheet gives an RMS full-scale figure, multiply by $\sqrt{2}$ to get peak.

**Step 3: convert the preamp gain to a linear factor.** If the gain knob is set to, e.g., +30 dB: $$G = 10^{30/20} \approx 31.6$$ If gain is set to 0 dB (unity), then $G = 1$.

**Step 4: combine.** $$C = \frac{V_{\text{FS, peak}}}{S_m \cdot G}$$ Example: $8.71 / (0.0200 \times 31.6) \approx 13.8$ Pa per ADC unit.
Microphone sensitivities are measured at 1 kHz under controlled conditions. The actual sensitivity of a given unit may differ by a few dB from the nominal value, and gain knob tracking on analogue preamps is rarely exact. Treat this method as a good first estimate — validate it acoustically whenever possible.
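The datasheet calculation can be scripted end to end. A sketch (Python for illustration; the function name is invented, and the inputs are the example figures from the text: −34 dBV/Pa mic, +18 dBu full-scale interface, +30 dB gain):

```python
import math

def calib_from_datasheet(sens_dbv_per_pa, fs_dbu, gain_db):
    """Estimate C (Pa per ADC unit) from published specs (Method 1)."""
    S_m = 10 ** (sens_dbv_per_pa / 20)                 # mic sensitivity, V/Pa
    V_fs = math.sqrt(2) * 0.775 * 10 ** (fs_dbu / 20)  # full-scale input, V peak
    G = 10 ** (gain_db / 20)                           # preamp gain, linear
    return V_fs / (S_m * G)

print(round(calib_from_datasheet(-34, 18, 30), 2))  # → 13.8
```

As the text warns, treat the result as a first estimate: unit-to-unit sensitivity spread and gain-knob tracking both introduce errors of a few dB.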
## Method 2 — Real-time calibration with a reference sound source (most accurate)
A sound level calibrator (e.g., a pistonphone or IEC 60942 Class 1 calibrator) emits a pure tone at a precisely known SPL and frequency — typically 94 dB SPL at 1 kHz or 114 dB SPL at 1 kHz. This is the standard field method.
Fit the calibrator over the microphone, record a few seconds of the tone as vector `y`, then:

```matlab
x_rms = rms(y);    % ADC units (RMS of the recorded 1 kHz calibrator tone)
C = 1.0 / x_rms;   % Pa per ADC unit, for a 94 dB SPL (= 1 Pa RMS) calibrator
```
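A self-contained simulation of Method 2 (Python for illustration; the "true" factor of 13.8 Pa/unit is an arbitrary assumed value used to synthesise the recording, and the sample rate and tone length are likewise assumptions):

```python
import math

# Synthesise one second of a 94 dB SPL (1 Pa RMS) calibrator tone at 1 kHz,
# as it would appear in normalised ADC units for a mic with true_C Pa/unit.
true_C = 13.8
n, fs, f = 48000, 48000, 1000.0
y = [(math.sqrt(2) / true_C) * math.sin(2 * math.pi * f * i / fs) for i in range(n)]

x_rms = math.sqrt(sum(s * s for s in y) / n)  # RMS of the tone in ADC units
C = 1.0 / x_rms                               # calibrator emits exactly 1 Pa RMS
print(round(C, 2))                            # → 13.8 (recovers the true factor)
```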
## Method 3 — Two-channel comparison with a reference microphone (no calibrator needed)
If you have one calibrated microphone with a known factor $C_{\text{ref}}$ but want to calibrate a second (test) microphone, you can derive the test factor by recording the same sound source on both channels simultaneously.
With both channels recorded into matrix `y`:

```matlab
x_ref  = rms(y(:, ch_ref));        % reference mic channel
x_test = rms(y(:, ch_test));       % test mic channel
C_test = C_ref * x_ref / x_test;   % same pressure on both channels
```
This method assumes both microphones occupy acoustically identical positions — valid in a free-field if capsule separation ≪ wavelength, but not valid near a reflecting wall or at high frequencies where even a few centimetres of separation introduces a phase difference. For ultrasonic work (>20 kHz), co-locate capsules to within ~1 cm.
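The transfer step can be sketched as follows (Python for illustration; the function name and toy data are invented — the test channel is made twice as sensitive, so its factor comes out at half the reference's):

```python
import math

def transfer_calibration(y_ref, y_test, C_ref):
    """Derive the test mic's factor from a simultaneous two-channel recording."""
    rms = lambda ch: math.sqrt(sum(s * s for s in ch) / len(ch))
    # Both mics see the same pressure: C_ref * rms_ref == C_test * rms_test
    return C_ref * rms(y_ref) / rms(y_test)

ref  = [0.1 * math.sin(i / 10) for i in range(1000)]   # reference channel
test = [0.2 * math.sin(i / 10) for i in range(1000)]   # test channel, 2x as sensitive
print(round(transfer_calibration(ref, test, 10.0), 6))  # → 5.0
```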
## Quick reference — dB SPL ↔ Pa
| dB SPL | Pa (RMS) | Typical source |
|---|---|---|
| 20 dB | 0.0002 Pa | Rustling leaves |
| 60 dB | 0.02 Pa | Normal conversation |
| 94 dB | 1 Pa | Standard calibrator (Class 1) |
| 114 dB | 10 Pa | Standard calibrator (high-level) |
| 120 dB | 20 Pa | Threshold of discomfort |
| 140 dB | 200 Pa | Jet engine at 30 m |
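The table rows follow directly from the definition of dB SPL re 20 µPa; a quick sketch (Python for illustration, helper names invented):

```python
import math

def pa_to_db_spl(p_rms):
    """RMS pressure in Pa -> dB SPL re 20 uPa."""
    return 20 * math.log10(p_rms / 20e-6)

def db_spl_to_pa(db):
    """dB SPL re 20 uPa -> RMS pressure in Pa."""
    return 20e-6 * 10 ** (db / 20)

print(round(pa_to_db_spl(1.0)))       # → 94  (standard calibrator level)
print(round(db_spl_to_pa(120), 6))    # → 20.0 Pa (threshold of discomfort)
```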
## Entering the value in RUBAT
The Calibration field in the recording panel accepts a value in Pa per ADC unit. Enter 0 to disable (dBFS display). The field is available at all times — you can update it while the stream is stopped or running without restarting.
| | Uncalibrated (C = 0) | Calibrated (C > 0) |
|---|---|---|
| Waveform Y axis | dBFS (0 dB = full scale) | dB SPL re 20 µPa |
| Waveform range | −120 dBFS … auto | 0 … 140 dB SPL … auto |
| Auto threshold | Relative amplitude | Absolute dB SPL |