EEG Signal Quality Validation in Mobile Apps: What to Check Before Recording
A user puts on their EEG headset, opens your app, and taps "Start Session." Thirty minutes later they review their data and find that half the recording is unusable because an electrode was sitting on top of hair, not scalp. They blame your app, not their headset placement. In consumer neurotechnology, the mobile app is the user's primary interface for understanding signal quality, and most apps do a poor job of it.
This post covers the specific signal quality checks we implement in EEG companion apps at DEVSFLOW, including real-time impedance validation, artifact detection algorithms that run efficiently on mobile hardware, and the UX decisions around when to warn users versus when to automatically reject data.
Impedance Checks: The First Line of Defense
Electrode impedance is the single most important indicator of signal quality. High impedance means poor electrical contact between the electrode and the scalp, which results in higher noise floors, increased susceptibility to 50/60 Hz power line interference, and attenuated brain signals. For dry electrode consumer headsets, acceptable impedance is generally below 200 kilohms. For wet (gel-based) electrodes used in research contexts, the threshold drops to 5 to 10 kilohms.
Most consumer EEG chipsets (ADS1299, ADS1294R, NeuroSky TGAM) provide built-in impedance measurement by injecting a small AC test signal at a known frequency (typically 31.25 Hz or 7.8 Hz) and measuring the resulting voltage. Your app receives this impedance data either as a raw measurement or as a pre-classified quality level (good, moderate, poor) depending on the firmware.
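For headsets that report only the raw lead-off voltage, converting it to an impedance value is a one-line application of Ohm's law. A minimal sketch; the 6 nA test current used here is one of the configurable lead-off current settings on the ADS1299 and is an illustrative assumption, not something your headset necessarily uses:

```python
def impedance_from_test_signal(v_rms_volts, i_rms_amps=6e-9):
    """Estimate electrode impedance from the measured RMS voltage of the
    injected AC test signal. Assumes a known test current (6 nA here,
    one configurable ADS1299 lead-off setting); Z = V / I."""
    return v_rms_volts / i_rms_amps

# 1.2 mV measured across the electrode at 6 nA works out to 200 kilohms
z_ohms = impedance_from_test_signal(1.2e-3)
```

In practice you would average the measured voltage over several cycles of the test frequency before converting, since a single-cycle estimate is noisy.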
When the headset provides raw impedance values, your app should display per-channel impedance on a head map visualization. Color-code each electrode position: green below 50 kilohms, yellow from 50 to 150 kilohms, red above 150 kilohms. These thresholds work well for dry electrodes. For wet electrode systems, shift everything down by an order of magnitude.
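The color-coding reduces to a small pure function the head map view can call per channel. A sketch using the dry-electrode thresholds above, with a wet-electrode mode that scales them down by an order of magnitude:

```python
def impedance_tier(z_ohms, dry=True):
    """Map a per-channel impedance to a traffic-light tier:
    green below 50 kilohms, yellow 50 to 150 kilohms, red above
    150 kilohms for dry electrodes; divide thresholds by 10 for wet."""
    scale = 1.0 if dry else 0.1
    if z_ohms < 50e3 * scale:
        return "green"
    if z_ohms <= 150e3 * scale:
        return "yellow"
    return "red"
```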
The impedance check should run as a mandatory pre-recording step. Do not let users skip it. In our experience with the RE-AK Nucleus-Kit, adding a mandatory 15-second impedance check before each session reduced support tickets about "bad data" by over 60 percent.
Real-Time Artifact Detection
Even with good electrode contact, EEG signals are constantly contaminated by non-neural electrical activity. The three most common artifacts in consumer EEG are eye blinks, muscle tension (EMG), and motion artifacts from the headset shifting on the scalp.
Eye Blink Detection
Eye blinks produce large-amplitude (100 to 300 microvolt) deflections in frontal channels (Fp1, Fp2) with a characteristic waveform lasting 200 to 400 milliseconds. Detection is straightforward: apply a bandpass filter from 1 to 10 Hz on frontal channels and trigger when the amplitude exceeds 100 microvolts with a rise time under 200 milliseconds. On mobile, a simple threshold detector on the filtered signal runs with negligible CPU cost.
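The threshold-plus-rise-time logic can be sketched as follows. The function assumes the input has already been bandpass filtered to 1 to 10 Hz and converted to microvolts; the 20-percent-of-threshold baseline used to measure rise time is our own illustrative choice:

```python
def detect_blinks(filtered_uv, fs=250.0, thresh_uv=100.0, max_rise_s=0.2):
    """Scan a 1-10 Hz bandpass-filtered frontal-channel signal and return
    the sample indices where blink-like deflections begin. A blink is
    counted when the signal crosses thresh_uv and reached that level
    within max_rise_s of last being below 20 percent of the threshold."""
    max_rise = int(max_rise_s * fs)
    events, i, n = [], 0, len(filtered_uv)
    while i < n:
        if filtered_uv[i] >= thresh_uv:
            # walk back to the last sample at or below 20% of threshold
            j = i
            while j > 0 and filtered_uv[j] > 0.2 * thresh_uv:
                j -= 1
            if i - j <= max_rise:
                events.append(j)
            # skip past this deflection before looking for the next one
            while i < n and filtered_uv[i] >= 0.2 * thresh_uv:
                i += 1
        else:
            i += 1
    return events
```

Counting `len(events)` over a rolling 60-second buffer gives the blinks-per-minute rate used below.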
Eye blinks are normal and expected. Your app should not treat them as errors. Instead, count them per minute and flag epochs that contain more than 15 blinks per minute as potentially compromised for frontal analysis. For posterior channels (O1, O2, Pz), eye blinks have minimal impact and can typically be ignored.
Muscle Artifact (EMG) Detection
Muscle activity from jaw clenching, forehead tension, or neck movement produces broadband high-frequency contamination, typically above 20 Hz. In the EEG frequency domain, this manifests as elevated power in the beta (13 to 30 Hz) and gamma (30 to 100 Hz) bands that is clearly non-neural in origin.
Detect muscle artifacts by computing the ratio of high-frequency power (30 to 50 Hz) to total power (1 to 50 Hz) in sliding 2-second windows. If this ratio exceeds 0.4 on temporal channels (T3, T4, T5, T6), muscle contamination is likely dominant. On mobile, compute this with a 256-point FFT through Apple's Accelerate framework or an equivalent library on Android; it is fast enough for real-time processing even on older devices.
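Here is an illustrative version of that power-ratio check. It uses a direct DFT for readability; a production mobile implementation would use a real FFT (Accelerate on iOS or an equivalent on Android):

```python
import math

def band_power(x, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi] Hz via a direct DFT over the window.
    A readable sketch; use an FFT in production code."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / (n * n)
    return total

def emg_ratio(window, fs=250.0):
    """Ratio of 30-50 Hz power to 1-50 Hz power in one 2-second window.
    Values above ~0.4 on temporal channels suggest muscle contamination."""
    hi = band_power(window, fs, 30.0, 50.0)
    total = band_power(window, fs, 1.0, 50.0)
    return hi / total if total > 0 else 0.0
```

A pure 40 Hz tone drives the ratio toward 1.0, while a clean 10 Hz alpha rhythm keeps it near zero, which is exactly the separation the 0.4 cutoff exploits.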
Motion Artifacts
Motion artifacts produce sudden, large-amplitude (500+ microvolt) shifts that are often rail-to-rail on the ADC. They are easy to detect: flag any sample that exceeds 80 percent of the ADC's dynamic range. For a signed 24-bit ADC like the ADS1299, full scale is 2^23 counts, so this means flagging values above roughly +/- 6.7 million counts.
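As code, the check is a single comparison per sample. A sketch for a signed 24-bit converter:

```python
def railing_mask(samples, adc_bits=24, frac=0.8):
    """Flag samples whose magnitude exceeds frac of the ADC's full-scale
    range. For a signed 24-bit converter full scale is 2**23 counts, so
    the defaults flag anything beyond about +/- 6.7 million counts."""
    limit = frac * (2 ** (adc_bits - 1))
    return [abs(s) > limit for s in samples]
```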
Signal-to-Noise Ratio Assessment
Beyond individual artifact detection, your app should compute a running signal-to-noise ratio (SNR) estimate for each channel. The simplest effective approach for EEG is to compute the ratio of power in the alpha band (8 to 13 Hz) to power in a noise reference band (e.g., 55 to 65 Hz, which captures power line interference).
For consumer EEG during resting-state recording with eyes closed, a healthy SNR on posterior channels should show alpha power at least 3 to 5 times greater than the noise floor. If this ratio is below 2, the electrode likely has poor contact or the headset is poorly positioned.
We compute SNR using Welch's method with 1-second Hanning windows and 50 percent overlap, updated every 500 milliseconds. On iOS, this maps cleanly to vDSP_desamp and vDSP_fft_zrip from the Accelerate framework. On Android, we use a custom implementation backed by FloatBuffer to avoid garbage collection pressure during real-time processing.
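A simplified, platform-neutral sketch of that computation, using a direct DFT instead of an optimized FFT and plain lists instead of preallocated buffers, looks like this:

```python
import math

def _band_power(x, fs, f_lo, f_hi):
    # Direct DFT band power for one window (an FFT in production code).
    n = len(x)
    p = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            p += (re * re + im * im) / (n * n)
    return p

def alpha_snr(signal, fs=250.0):
    """Alpha-band SNR over 1-second Hann windows with 50 percent overlap:
    summed power in 8-13 Hz divided by summed power in the 55-65 Hz
    noise reference band (power line interference)."""
    n = int(fs)              # 1-second window
    hop = n // 2             # 50 percent overlap
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * t / (n - 1)) for t in range(n)]
    alpha, noise = 0.0, 0.0
    for start in range(0, len(signal) - n + 1, hop):
        w = [signal[start + t] * hann[t] for t in range(n)]
        alpha += _band_power(w, fs, 8.0, 13.0)
        noise += _band_power(w, fs, 55.0, 65.0)
    return alpha / noise if noise > 0 else float("inf")
```

For a resting eyes-closed posterior channel with strong alpha and modest line noise, this ratio comfortably clears the 3 to 5 range described above.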
Real-Time Feedback UI Design
Raw signal quality metrics are meaningless to most users. Your app needs to translate impedance values, artifact counts, and SNR into clear, actionable feedback. Through user testing across multiple headset products, we have settled on a three-tier system:
- Green (Good): Impedance below threshold, SNR above 3, fewer than 5 artifacts per minute. The user sees "Signal quality: Good" and can start recording.
- Yellow (Adjust): Impedance marginal or SNR between 2 and 3, or moderate artifact rate. The user sees a specific instruction like "Adjust electrode at position Fp1" or "Try to relax your jaw muscles."
- Red (Not Ready): One or more channels have impedance above threshold, or SNR below 2, or the ADC is railing. Recording is blocked with a clear explanation of what to fix.
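The tier logic itself is a handful of comparisons once the metrics are in hand. A sketch combining the thresholds above, using the dry-electrode impedance values; the artifact cutoff for yellow is an illustrative choice derived from the green bullet's "fewer than 5 per minute":

```python
def quality_tier(max_impedance_ohms, snr, artifacts_per_min, railing,
                 dry=True):
    """Combine per-channel metrics into one traffic-light tier, following
    the three bullets above (dry-electrode impedance thresholds; wet
    systems scale the impedance limits down by 10x)."""
    z_red = 150e3 if dry else 15e3
    z_green = 50e3 if dry else 5e3
    if railing or max_impedance_ohms > z_red or snr < 2:
        return "red"
    if max_impedance_ohms > z_green or snr < 3 or artifacts_per_min >= 5:
        return "yellow"
    return "green"
```

Note that red conditions are checked first so that a railing ADC can never be masked by otherwise-good impedance and SNR.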
The critical design decision is whether to block recording or merely warn. For consumer wellness apps, we recommend blocking only on red conditions and warning on yellow. For research-grade data collection, block on both red and yellow. This distinction should be configurable in your app's settings or set by the study protocol.
When to Auto-Reject vs. Warn
During an active recording session, signal quality can degrade. An electrode might shift, the user might start moving, or environmental noise might increase. Your app needs a strategy for handling degradation mid-session.
We use a two-threshold approach. If signal quality drops to yellow for more than 10 consecutive seconds, show a non-intrusive banner warning. If signal quality drops to red for more than 5 consecutive seconds, pause the recording and prompt the user to readjust. For research protocols, automatically mark degraded epochs in the recording metadata so downstream analysis pipelines can exclude them without manual review.
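The two-threshold logic amounts to a pair of consecutive-seconds counters. A sketch that replays one quality tier per second and emits the resulting action:

```python
def degradation_actions(tiers_per_second, warn_after=10, pause_after=5):
    """Replay a stream of one-per-second quality tiers and return the
    action for each second: 'ok', 'warn' (yellow for more than warn_after
    consecutive seconds), or 'pause' (red for more than pause_after
    consecutive seconds). Counters reset whenever the condition clears."""
    actions, yellow_run, red_run = [], 0, 0
    for tier in tiers_per_second:
        yellow_run = yellow_run + 1 if tier == "yellow" else 0
        red_run = red_run + 1 if tier == "red" else 0
        if red_run > pause_after:
            actions.append("pause")
        elif yellow_run > warn_after:
            actions.append("warn")
        else:
            actions.append("ok")
    return actions
```

In a real app the same counters would live in the quality-monitoring thread and drive the banner and pause prompts directly, with the per-second tier histories written into the session metadata.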
Never silently discard data during recording. Even noisy data may contain useful information after offline artifact removal with ICA (Independent Component Analysis) or ASR (Artifact Subspace Reconstruction). Your app should flag bad epochs, not delete them. Store quality metrics alongside the raw data so that analysis software can make informed decisions about what to include.
Implementation Considerations for Mobile
All of these checks need to run in real time without draining the battery or introducing latency in the data pipeline. On a modern iPhone or mid-range Android device, the computational cost of impedance display, blink detection, EMG detection, motion detection, and running FFT-based SNR is well under 5 percent of a single CPU core at 250 Hz sampling with 8 channels.
Keep quality computations on a dedicated background thread, separate from both the BLE receive thread and the UI thread. Update the UI at 2 Hz for quality indicators. Updating faster than that provides no benefit to the user and wastes CPU cycles on view rendering.
Building an EEG companion app that gives users confidence in their data starts with proper signal validation. If your team needs help implementing real-time quality checks for a neurotechnology product, talk to DEVSFLOW Neuro. We build the mobile layer for headsets and wearables that people actually trust.