EEG Mobile App Development: A Technical Guide for Biosignal Hardware Teams
Building a mobile app for an EEG headset is not the same as building a fitness tracker companion app. The data rates are higher, the timing requirements are stricter, the consequences of a dropped packet are real (you lose research data or clinical signal), and the platform-level Bluetooth stack will fight you at every step. This guide is for hardware teams and CTOs at neurotechnology companies who are either building their first mobile companion or wondering why their current app keeps disconnecting in the middle of a recording session.
We have built and integrated mobile applications for EEG, fEMG, EDA, HR/HRV, and other biosignal devices, including the Nucleus-Kit platform from RE-AK Technologies and product engineering work with CLEIO. The lessons in this guide come from shipping in production, not from reading the BLE specification.
Why Biosignal Apps Fail Differently
Most mobile apps are forgiving. A delayed message, a dropped frame, or a brief disconnection is invisible to the user. Biosignal apps are unforgiving. A 100-millisecond gap in EEG data is a visible artifact. A dropped BLE connection during a recording session can invalidate an entire research run. A clock drift of 50 milliseconds between EEG and EMG channels can render the synchronized analysis useless.
The constraints that define this domain:
- High-frequency continuous streaming. EEG at 250 Hz across 8 channels is 2,000 samples per second. At 24-bit resolution, that is roughly 48 kbps of payload, before BLE overhead. Higher channel counts and higher sampling rates push this well past 100 kbps sustained.
- Strict timing accuracy. Research and clinical use cases require timestamps accurate to within milliseconds. Multi-sensor fusion requires the timestamps to be aligned across devices.
- Zero data loss tolerance. Most consumer apps can drop a Bluetooth packet without consequence. A biosignal app cannot. Lost data must be detected, logged, and ideally recovered.
- Long sessions. Recording sessions of 30 minutes to several hours are common. The BLE connection has to survive the entire session, including phone screen-off, app backgrounding, and brief radio interference.
- Hardware variation. The same app may need to talk to a custom headset, an off-the-shelf wearable, and a research-grade device with completely different firmware behavior.
BLE on Android vs iOS: Two Different Worlds
Bluetooth Low Energy on iOS (CoreBluetooth) is consistent and predictable. Bluetooth Low Energy on Android is a multi-vendor patchwork of Bluedroid, Fluoride, OEM customizations, and hardware-dependent quirks. The same code that runs flawlessly on a Pixel 9 can produce silent failures on a Samsung A23.
The biggest differences that affect biosignal apps:
- iOS background BLE works reliably with proper background mode declarations. Android background BLE is at the mercy of OEM battery savers (Samsung Sleeping Apps, MIUI, EMUI), which kill foreground services unpredictably.
- iOS handles connection parameter negotiation transparently. Android exposes more knobs, which is both a blessing and a curse. Get the connection interval wrong and your throughput drops by an order of magnitude.
- iOS GATT operations are queued internally. Android requires you to build your own queue. Issuing a characteristic read while a write is in progress on Android leads to silent failures.
- The infamous error 133 (GATT_ERROR) on Android has no clean equivalent on iOS. It is a catch-all that can mean stale bonding, too many connections, or a race condition in the stack.
For the deep technical breakdown of these Android pitfalls, see our guide to undocumented BLE behavior on Android. For iOS background reconnection patterns, see our piece on iOS BLE reconnection in the background.
The BLE Bandwidth Math
Before you commit to a streaming architecture, do the bandwidth math. BLE 4.2 has a theoretical maximum throughput of around 305 kbps. BLE 5.0 with data length extension and the 2 Mbps PHY can reach about 1.4 Mbps in practice. Real-world throughput, factoring in connection interval, MTU, and platform overhead, is usually about half the theoretical figure.
For an 8-channel EEG at 250 Hz with 24-bit samples, the raw payload is:
- 8 channels × 250 Hz × 3 bytes = 6,000 bytes/second = 48 kbps
- Add packet overhead (timestamps, sequence numbers, channel metadata) and you are closer to 70 kbps sustained
- BLE 4.2 with default 23-byte MTU and 30 ms connection interval cannot keep up. You will see backpressure, dropped samples, and connection timeouts
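The arithmetic above is worth scripting as a sanity check before committing to a radio. A minimal sketch; the channel count, sampling rate, and overhead factor are illustrative parameters, not properties of any particular device:

```python
def eeg_payload_kbps(channels: int, sample_rate_hz: int, bits_per_sample: int,
                     overhead_factor: float = 1.0) -> float:
    """Streaming payload in kilobits per second, before BLE framing.

    overhead_factor approximates per-packet headers, timestamps, and
    sequence numbers on top of the raw samples.
    """
    bytes_per_second = channels * sample_rate_hz * (bits_per_sample // 8)
    return bytes_per_second * 8 * overhead_factor / 1000

# 8-channel EEG at 250 Hz with 24-bit samples
raw = eeg_payload_kbps(8, 250, 24)            # 48.0 kbps raw payload
framed = eeg_payload_kbps(8, 250, 24, 1.45)   # ~70 kbps with protocol overhead
```

Run this against every device configuration you plan to support; if the framed figure approaches half your radio's theoretical throughput, BLE is going to be tight.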
The fixes that matter:
- Request a higher MTU after connection. On Android, call requestMtu(247) immediately. On iOS, the negotiation is automatic but the result varies by device.
- Request a short connection interval. Default is often 30 to 50 ms. For high-throughput streaming, 7.5 to 15 ms gives much better throughput at the cost of battery.
- Use BLE 5.0 with the 2 Mbps PHY where supported. The change is one API call (setPreferredPhy), but the throughput gain is real.
- Pack multiple samples per BLE notification. Sending one notification per sample is wasteful. Buffer 10 to 20 samples at the firmware level and send them as a single notification with a header.
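The sample-packing idea can be sketched as a framing function. The field layout here (2-byte sequence number, 1-byte sample count, 3 bytes per 24-bit sample) is hypothetical; every real firmware protocol defines its own:

```python
import struct

def pack_notification(seq: int, samples: list[int]) -> bytes:
    """Pack a batch of 24-bit signed samples into one BLE notification.

    Hypothetical layout: 2-byte little-endian sequence number,
    1-byte sample count, then 3 bytes per sample.
    """
    header = struct.pack("<HB", seq & 0xFFFF, len(samples))
    body = b"".join(s.to_bytes(3, "little", signed=True) for s in samples)
    return header + body

def unpack_notification(payload: bytes) -> tuple[int, list[int]]:
    """Phone-side inverse of pack_notification."""
    seq, count = struct.unpack_from("<HB", payload, 0)
    samples = [int.from_bytes(payload[3 + 3 * i: 6 + 3 * i], "little", signed=True)
               for i in range(count)]
    return seq, samples
```

With this layout, 20 samples fit in 63 bytes, comfortably inside a 247-byte MTU, turning 250 notifications per second per channel batch into roughly 12.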
For higher channel counts (32+ channel EEG), BLE may not be enough. Consider Wi-Fi Direct, USB-C tethering, or a custom radio. Make this decision before the product is built, not after.
Multi-Sensor Synchronization
Many neurotechnology platforms record multiple modalities simultaneously: EEG plus EMG, EDA, HR/HRV, GPS, and sometimes video. The Nucleus-Kit platform we worked on combines all of these. Synchronizing them accurately is one of the hardest problems in this space.
The naive approach is to timestamp every sample with the phone's system clock as it arrives. This fails for two reasons. First, BLE notification delivery is bursty. Samples can arrive in batches of 5 to 10 with the same wall-clock time. Second, different sensors connect over different channels (some BLE, some via the phone's internal sensors), each with its own latency.
The patterns that work:
- Each sensor includes its own monotonic sample counter in the BLE payload. The phone uses this counter, not the arrival time, as the canonical timestamp.
- At session start, perform a clock alignment handshake. The phone sends its current time to each sensor. Each sensor records its local clock at that moment. After the session, you can map every sensor sample back to a unified timeline.
- For sub-millisecond accuracy, hardware-level sync (a shared trigger line, an external clock signal) is the only reliable approach. BLE alone cannot guarantee this.
- Validate sync after every session by looking at known events (a tap, a button press, a flash) across modalities and checking that they line up.
For a deeper technical breakdown, see our piece on multi-sensor BLE synchronization.
Background BLE on iOS: The Watchdog Problem
If your use case requires recording data while the phone is in the user's pocket and the screen is off, iOS makes this difficult on purpose. Apple does not want apps to drain battery silently. The protections they have built make sense for most apps but create real engineering challenges for biosignal recording.
The key behaviors to plan for:
- Declare bluetooth-central in UIBackgroundModes. Without this, your app cannot maintain a BLE connection in the background at all.
- Use State Preservation and Restoration. iOS can suspend your app and re-launch it later when a BLE event arrives. If you do not implement state restoration, the connection is lost permanently.
- The system reserves the right to terminate your app if it consumes too much background CPU or memory. Do as little processing in background as possible. Stream data to disk, defer analysis until foreground.
- BLE scanning behavior is heavily restricted in background. You can scan for known peripheral UUIDs but with much longer latency. Plan for 30+ second reconnection times after a brief disconnect in the background.
- Test on multiple iOS versions. iOS 17 and iOS 18 changed background BLE behavior in subtle ways. What works on one may degrade on the other.
For the full background BLE pattern, see our guide on iOS background BLE for wearables.
Data Integrity and Zero-Loss Streaming
"Zero data loss" is not a marketing line for biosignal apps. It is a technical requirement. Researchers and clinicians need to know that the EEG segment from minute 12 to minute 14 is complete and uncorrupted, or that any gaps are explicitly flagged.
The architecture that delivers this:
- Every BLE notification carries a sequence number. The phone tracks the expected next sequence and detects gaps the moment they occur.
- When a gap is detected, the phone requests a retransmission from the device buffer. Modern firmware should hold a few seconds of recent data in a circular buffer for exactly this purpose.
- If the gap cannot be filled, it is logged with start time, end time, and reason. Downstream analysis can mark this segment as invalid rather than silently using bad data.
- Data is written to disk in append-only files with checksums per chunk. A crash mid-recording does not corrupt the entire session.
- At session end, the file is sealed with a session summary including duration, sample counts per channel, and gap log.
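The first and third points above can be sketched as a small detector. This is a minimal model, assuming 16-bit wrapping sequence numbers; real implementations also record wall-clock gap boundaries and trigger the retransmission request:

```python
class GapDetector:
    """Track notification sequence numbers and log gaps the moment they occur.

    Assumes 16-bit sequence numbers that wrap around.
    """
    def __init__(self, seq_modulus: int = 1 << 16):
        self.modulus = seq_modulus
        self.expected = None
        self.gaps = []          # list of (first_missing_seq, missing_count)

    def on_notification(self, seq: int) -> int:
        """Return how many notifications were missed before this one."""
        missed = 0
        if self.expected is not None and seq != self.expected:
            missed = (seq - self.expected) % self.modulus
            self.gaps.append((self.expected, missed))
        self.expected = (seq + 1) % self.modulus
        return missed

det = GapDetector()
for s in [0, 1, 2, 5, 6]:
    det.on_notification(s)
# det.gaps == [(3, 2)]  -> notifications 3 and 4 were lost
```

The gap list is exactly what goes into the session's gap log, and each entry is what the retransmission request would reference against the device's circular buffer.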
For specific implementation patterns, see our breakdown of EEG data streaming over BLE with zero loss.
Battery and Thermal Management
BLE streaming at high throughput is energy-intensive on both the device and the phone. A poorly designed companion app can drain a phone battery in 90 minutes. The wearable side is even more constrained: a coin cell or small lithium-polymer battery has to last hours of continuous recording.
On the phone side:
- Avoid keeping the screen on during recording. Use a foreground service notification instead.
- Pause non-essential UI updates. If the user is not looking at the screen, do not run live waveform rendering.
- Batch disk writes. Instead of writing every sample, accumulate and write every 250 to 500 ms.
- Defer post-processing until session end and the device is charging.
On the wearable side, this is mostly firmware territory, but the app should help:
- Negotiate the longest connection interval that meets timing requirements. A 30 ms interval saves significant battery vs 7.5 ms.
- Disable BLE features the app does not need (some firmware advertises richer characteristics than the app actually uses).
- Surface battery level to the user in the app, with low-battery warnings before recording starts.
For a focused guide on this topic, see battery optimization for biosignal recording sessions.
SaMD and FDA Considerations
If your device or app provides clinical insights, diagnostic information, or treatment recommendations, you may be operating as Software as a Medical Device (SaMD). The FDA, Health Canada, and equivalent regulators in other jurisdictions have specific requirements for software development, verification, and validation in this space.
This is not a regulatory tutorial, but the engineering implications matter from day one:
- Design history file (DHF) requirements mean every architectural decision needs to be documented. Retrofitting documentation later is much harder than maintaining it as you go.
- Software of Unknown Provenance (SOUP) tracking applies to every third-party library. Choose dependencies you can justify under audit.
- Verification testing must cover the requirements you specified. Vague requirements lead to expensive test cycles. Be precise.
- Cybersecurity controls are now scrutinized. Authentication, encryption, and update mechanisms need to be designed in, not bolted on.
- The line between "wellness app" and "medical device" is narrower than many founders realize. Any clinical claim, including a soft one in marketing, can pull you across that line.
For a deeper look, see our guide on FDA SaMD considerations for EEG companion apps.
What We Learned from RE-AK and CLEIO
Our work with neurotechnology clients has reinforced a few patterns that generalize across this space:
- Hardware teams underestimate mobile complexity. They expect "just receive the BLE data and show it on a screen." The reality is two to three times more work than expected, mostly in handling edge cases.
- The first version is rarely the final version. BLE protocols evolve as you discover what the firmware actually delivers vs what the spec says. Plan for at least one protocol revision after first integration.
- Test on the actual phones your customers will use. A research-grade headset that works perfectly on a Pixel 9 may be unusable on a 4-year-old Android device that a clinical site already owns.
- Reliability beats features. Clinicians and researchers will tolerate a sparse UI if the data is rock-solid. They will abandon a beautiful app that drops data.
- Embed engineers who understand both sides. The best results come when the same person can read the firmware, debug the BLE traffic, and write the mobile code. Hand-offs between teams introduce errors.
Frequently Asked Questions
How long does it take to build a mobile companion app for an EEG device?
For a single-platform MVP that handles connection, basic streaming, session recording, and minimal visualization, expect 3 to 4 months with one senior mobile engineer who understands BLE. A dual-platform production-grade app with multi-sensor sync, background recording, cloud upload, and clinical-grade reliability takes 9 to 12 months with a small team. Anyone quoting 6 weeks has not built one.
Should we build native or cross-platform?
For BLE-heavy biosignal apps, we recommend native Swift and Kotlin. Cross-platform frameworks can work for the UI, but the BLE stack is where most of the complexity lives, and that is best handled at the native layer. React Native and Flutter both have BLE plugins, but they wrap platform APIs that already have quirks. Adding a wrapper on top tends to surface more bugs, not fewer.
Can we use Web Bluetooth instead of a native app?
Web Bluetooth is supported in Chrome on desktop and Android, but not on iOS. For most clinical and research use cases, missing iOS is a deal-breaker. Web Bluetooth also has weaker support for background operation, longer-running sessions, and complex GATT operations. It is good for prototyping and demos. It is not yet ready for production biosignal apps.
How do we handle firmware updates from the app?
BLE firmware updates (sometimes called Device Firmware Update or DFU) are a separate protocol on top of the standard GATT connection. Nordic devices use the Nordic DFU protocol. Other vendors have their own. Implement DFU as a separate module from the data streaming code. Most teams underestimate the testing burden: you need to handle interrupted updates, downgrade scenarios, and recovery from corrupted firmware.
How do we validate that our recorded data is scientifically usable?
Beyond engineering tests, you need a parallel recording validation. Record the same signal on both your device and a known reference device (a research-grade EEG amplifier, for example). Compare the signals offline using domain-specific metrics: spectral content, signal-to-noise ratio, electrode impedance correlation. Researchers will not trust your data until you can show this comparison.
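One of those metrics, spectral content, can be compared with a short script. This is a validation sketch using a naive DFT (adequate for short clips; real pipelines use a proper Welch PSD estimate from a signal-processing library); the band edges are the conventional EEG alpha and beta ranges:

```python
import cmath
import math

def band_power(signal: list[float], fs: float, f_lo: float, f_hi: float) -> float:
    """Power in a frequency band via a naive DFT. O(n^2), fine for short clips."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power / n

# A 2-second clip of a pure 10 Hz tone sampled at 250 Hz should put
# essentially all its energy in the alpha band (8-12 Hz).
fs = 250.0
sig = [math.sin(2 * math.pi * 10 * i / fs) for i in range(500)]
alpha = band_power(sig, fs, 8.0, 12.0)
beta = band_power(sig, fs, 13.0, 30.0)
```

For device-versus-reference validation, compute band powers on time-aligned clips from both recordings and check that their ratios agree within a tolerance you have justified for your use case.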
DEVSFLOW Neuro builds mobile companion apps for EEG headsets, biometric wearables, and neurotechnology devices. If your firmware is ready and you need a mobile app that does not lose data, let's talk about your project.