How to Stream EEG Data Over BLE Without Dropping Packets on Android

Bluetooth Low Energy is the default transport layer for consumer and research-grade EEG headsets. On paper, BLE 5.0 and above can handle the throughput requirements of multi-channel EEG at 250 Hz or higher. In practice, Android's fragmented BLE stack makes reliable streaming a genuine engineering challenge. Dropped packets translate directly into corrupted EEG epochs, unusable spectral features, and frustrated users who see artifacts in their data.

This post covers the specific techniques we use at DEVSFLOW to achieve zero-drop EEG streaming over BLE on Android. These are not theoretical recommendations. They come from shipping production firmware and companion apps for devices sampling at 250 Hz and 500 Hz across 8 to 32 channels.

Understanding the Throughput Requirement

Before tuning anything, you need to know exactly how many bytes per second your device produces. A typical 8-channel EEG headset sampling at 250 Hz with 24-bit resolution generates roughly 6,000 bytes per second of raw sample data. Add packet headers, sequence numbers, and timestamps, and you are looking at 7,000 to 8,000 bytes per second. That is well within BLE's theoretical capacity, but only if every layer of the stack cooperates.

At 32 channels and 500 Hz with 24-bit samples, the requirement jumps to approximately 48,000 bytes per second. This pushes BLE hard and leaves very little margin for retransmissions or scheduling delays. Every configuration parameter matters at this data rate.
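A quick sanity check of the numbers above (a sketch; it assumes 24-bit samples are packed into whole bytes with no padding, and the function name is illustrative):

```java
// Sketch: raw EEG sample throughput for the configurations discussed above.
public class EegThroughput {
    // bitsPerSample is rounded up to whole bytes (24-bit -> 3 bytes).
    static int rawBytesPerSec(int channels, int sampleRateHz, int bitsPerSample) {
        int bytesPerSample = (bitsPerSample + 7) / 8;
        return channels * sampleRateHz * bytesPerSample;
    }

    public static void main(String[] args) {
        System.out.println(rawBytesPerSec(8, 250, 24));   // 6000 bytes/s
        System.out.println(rawBytesPerSec(32, 500, 24));  // 48000 bytes/s
    }
}
```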

MTU Negotiation: Get It Right on the First Connection

The default BLE MTU is 23 bytes, which gives you only 20 bytes of usable payload per ATT notification. At 250 Hz with 8 channels, you would need roughly 300 notifications per second at that size. That is not sustainable on most Android devices.

The fix is straightforward: request a larger MTU immediately after connection. On Android, call BluetoothGatt.requestMtu() with your desired size. We recommend requesting 247 bytes, which yields 244 bytes of usable payload. Most modern Android devices and BLE chipsets (nRF52, nRF5340, ESP32-S3) support this without issue.
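The payload arithmetic can be sketched as follows (the 3-byte overhead is the ATT notification opcode plus attribute handle; function names and the 8-channel frame layout are illustrative):

```java
// Sketch: how many full multi-channel sample frames fit in one ATT
// notification for a given negotiated MTU.
public class NotificationSizing {
    // ATT notification overhead: 1-byte opcode + 2-byte attribute handle.
    static int usablePayload(int mtu) {
        return mtu - 3;
    }

    // e.g. 8 channels x 3 bytes = 24 bytes per sample frame
    static int framesPerNotification(int mtu, int channels, int bytesPerSample) {
        return usablePayload(mtu) / (channels * bytesPerSample);
    }

    public static void main(String[] args) {
        System.out.println(usablePayload(247));                // 244
        System.out.println(framesPerNotification(247, 8, 3));  // 10 frames per notification
        System.out.println(framesPerNotification(23, 8, 3));   // 0 -- default MTU can't fit one frame
    }
}
```

At MTU 247, one notification carries 10 full 8-channel frames, so the 6,000 bytes per second from the earlier example needs only about 25 notifications per second instead of 300.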

There are two critical details that many developers miss:

  1. MTU negotiation is asynchronous. Do not start streaming until the onMtuChanged callback fires; notifications sent before then are sized against the old MTU.
  2. The negotiated MTU is the minimum of what both sides support. If your peripheral firmware advertises a smaller ATT MTU, your 247-byte request silently degrades, so always read the value reported in onMtuChanged rather than assuming the request succeeded.

Connection Interval Tuning

The connection interval determines how frequently the central and peripheral exchange data. Android's default connection parameters are optimized for power savings, not throughput. You will typically get a connection interval of 30 ms to 50 ms, which limits you to roughly 20 to 33 connection events per second; each event is an opportunity to send one or more notifications.

For high-throughput EEG streaming, request a connection interval of 7.5 ms to 15 ms. On the peripheral side, set your preferred connection parameters in the GAP configuration. On Android, use BluetoothGatt.requestConnectionPriority(CONNECTION_PRIORITY_HIGH) immediately after MTU negotiation completes.

Be aware of a subtle problem: Android does not guarantee it will honor your requested interval. The actual interval depends on the Bluetooth controller, other active BLE connections, and the device manufacturer's power management policies. Samsung devices in particular are known to override connection parameters aggressively when battery saver modes are active. Your app must detect suboptimal intervals and warn the user, or better yet, adapt its buffering strategy dynamically.

Measuring the Actual Connection Interval

You cannot read the active connection interval directly from the Android API. Instead, measure it empirically by timestamping incoming notifications and computing the inter-arrival time distribution. If your median inter-arrival time exceeds 20 ms when you requested 7.5 ms, the controller is not honoring your request. Log this metric in production builds so you can correlate it with packet loss reports.
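One way to sketch this measurement (plain Java for illustration; on Android the timestamps would come from something like SystemClock.elapsedRealtimeNanos() captured in the notification callback):

```java
import java.util.Arrays;

// Sketch: estimate the effective connection interval from notification
// arrival timestamps, modeled here as plain millisecond values.
public class IntervalEstimator {
    static double medianInterArrivalMs(double[] arrivalMs) {
        double[] deltas = new double[arrivalMs.length - 1];
        for (int i = 1; i < arrivalMs.length; i++) {
            deltas[i - 1] = arrivalMs[i] - arrivalMs[i - 1];
        }
        Arrays.sort(deltas);
        int n = deltas.length;
        return n % 2 == 1 ? deltas[n / 2]
                          : (deltas[n / 2 - 1] + deltas[n / 2]) / 2.0;
    }

    public static void main(String[] args) {
        // Events arriving ~30 ms apart despite a 7.5 ms request.
        double[] arrivals = {0, 30, 60, 90, 120};
        System.out.println(medianInterArrivalMs(arrivals)); // 30.0 -- request not honored
    }
}
```

The median is deliberately used instead of the mean: BLE delivery is bursty, and a few long outliers would otherwise mask an interval that is fine most of the time.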

Buffering Strategies on the Peripheral

Even with optimal MTU and connection interval settings, there will be moments when the BLE link cannot drain data as fast as the ADC produces it. Your firmware must buffer samples and handle backpressure gracefully.

We use a double-buffered ring architecture on the peripheral. One buffer accumulates samples from the ADC interrupt handler. When a buffer is full (sized to match one BLE notification payload), it is enqueued for transmission. The BLE stack pulls from this queue on each connection event. If the queue depth exceeds a threshold (typically 4 to 6 packets), the firmware inserts a sequence gap marker so the mobile app knows samples were dropped and can zero-fill or interpolate rather than silently concatenating non-contiguous data.

The key insight: never block the ADC interrupt to wait for BLE. If BLE cannot keep up, it is better to drop old samples and maintain timing integrity than to stall sampling and create unpredictable timing jitter. EEG analysis algorithms are far more tolerant of known gaps than of unknown timing errors.
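The drop-oldest policy above can be modeled like this (a Java sketch for illustration; production firmware would implement this in C on the MCU, and the dropped count would be embedded in a packet header as the gap marker):

```java
import java.util.ArrayDeque;

// Sketch: bounded transmit queue with drop-oldest backpressure.
// When the queue is saturated, old packets are discarded and counted so the
// app can zero-fill the gap instead of concatenating non-contiguous data.
public class TxQueue {
    static final int MAX_DEPTH = 6;          // threshold from the text: 4-6 packets
    final ArrayDeque<byte[]> queue = new ArrayDeque<>();
    int droppedSinceLastGapMarker = 0;

    // Called when a sample buffer fills; never blocks the ADC path.
    void enqueue(byte[] packet) {
        if (queue.size() >= MAX_DEPTH) {
            queue.removeFirst();             // drop oldest, keep sampling on time
            droppedSinceLastGapMarker++;
        }
        queue.addLast(packet);
    }

    // Called on each connection event; returns null if nothing to send.
    byte[] dequeue() { return queue.pollFirst(); }

    public static void main(String[] args) {
        TxQueue q = new TxQueue();
        for (int i = 0; i < 8; i++) q.enqueue(new byte[]{(byte) i});
        System.out.println(q.queue.size() + " queued, "
                + q.droppedSinceLastGapMarker + " dropped"); // 6 queued, 2 dropped
    }
}
```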

Android-Side Receive Pipeline

The onCharacteristicChanged callback runs on the Binder thread in Android's Bluetooth stack. If you do any processing in this callback, you risk blocking subsequent notifications. Every millisecond you spend here is a millisecond the next notification has to wait.

Our recommended architecture:

  1. In onCharacteristicChanged, copy the raw byte array into a pre-allocated ByteBuffer from a pool and enqueue it onto a lock-free ring buffer. Do nothing else in this callback.
  2. A dedicated processing thread drains the ring buffer, validates sequence numbers, reassembles multi-packet samples if needed, and writes to a memory-mapped file for persistence.
  3. A separate thread handles real-time display updates at a decimated rate (typically 30 Hz for UI rendering).

Avoid allocating objects in the callback path. On older Android versions (8.x and 9.x), garbage collection pauses in the Binder thread can cause notification loss. Pre-allocate your buffer pool at connection time.
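The callback-side half of this pipeline can be sketched as follows (Android types omitted; ArrayBlockingQueue stands in for the lock-free ring for clarity, and buffer lengths are elided to keep the sketch short):

```java
import java.util.concurrent.ArrayBlockingQueue;

// Sketch: allocation-free receive path. A fixed pool of byte[] buffers is
// created at connection time; the BLE callback borrows one, copies the
// notification payload in, and hands it to the processing thread.
public class ReceivePipeline {
    final ArrayBlockingQueue<byte[]> pool;      // free buffers
    final ArrayBlockingQueue<byte[]> pending;   // filled, awaiting processing

    ReceivePipeline(int poolSize, int bufferSize) {
        pool = new ArrayBlockingQueue<>(poolSize);
        pending = new ArrayBlockingQueue<>(poolSize);
        for (int i = 0; i < poolSize; i++) pool.add(new byte[bufferSize]);
    }

    // Called from the Binder-thread callback: copy and enqueue, nothing else.
    boolean onNotification(byte[] payload, int length) {
        byte[] buf = pool.poll();
        if (buf == null) return false;          // pool exhausted: count as a drop
        System.arraycopy(payload, 0, buf, 0, length);
        return pending.offer(buf);
    }

    // Called from the processing thread once a buffer has been consumed.
    void recycle(byte[] buf) { pool.offer(buf); }
}
```

Note that a full pool is reported, not waited on: blocking the Binder thread to wait for a free buffer would recreate exactly the stall this design avoids.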

Dealing with Android Fragmentation

The hardest part of BLE on Android is not the protocol. It is the inconsistency across manufacturers and OS versions: connection parameter overrides like the Samsung battery-saver behavior described above, GC pauses on Android 8.x and 9.x that can stall the Binder thread, and widely varying 2.4 GHz coexistence behavior. The defensive techniques below exist precisely because you cannot assume any single device's behavior generalizes.

Sequence Numbers and Error Recovery

Every notification payload from your peripheral must include a monotonically increasing sequence number. On the mobile side, check for gaps on every received packet. When you detect a gap, record the number of missing packets and the timestamp. This data is essential for two purposes: real-time interpolation of missing samples, and post-hoc quality assessment of recorded sessions.
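A minimal gap check looks like this (a sketch assuming a 16-bit wrapping sequence counter, a common choice for small BLE payloads; the actual payload format is up to your firmware):

```java
// Sketch: per-packet sequence gap detection with wrap-around handling.
public class GapDetector {
    int expected = -1;      // -1 until the first packet arrives
    long totalMissing = 0;

    // Returns the number of packets missing before this one (0 if contiguous).
    int onPacket(int seq) {
        int missing = 0;
        if (expected >= 0) {
            missing = (seq - expected) & 0xFFFF;   // wrap-safe modular distance
        }
        expected = (seq + 1) & 0xFFFF;
        totalMissing += missing;
        return missing;
    }

    public static void main(String[] args) {
        GapDetector g = new GapDetector();
        System.out.println(g.onPacket(5));  // 0 -- first packet, nothing expected yet
        System.out.println(g.onPacket(6));  // 0 -- contiguous
        System.out.println(g.onPacket(9));  // 2 -- packets 7 and 8 were lost
    }
}
```

In production the caller would also record a timestamp alongside each nonzero return value, per the two purposes above.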

For EEG applications where clinical or research validity matters, we also embed a microsecond-resolution timestamp from the peripheral's clock in every Nth packet (typically every 50th). This allows the mobile app to reconstruct precise sample timing even if BLE delivery is bursty, which it always is.
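The reconstruction reduces to linear interpolation between anchors (a sketch; the anchor spacing and clock values below are invented for the example, and a real implementation would also map the peripheral clock onto the phone clock):

```java
// Sketch: reconstruct per-sample timestamps from periodic peripheral anchors.
// Each anchor carries (sampleIndex, peripheralMicros); between anchors, timing
// is linear in sample index, so the effective sample period can be measured
// rather than assumed from the nominal rate.
public class TimestampReconstructor {
    // Effective microseconds per sample between two anchors.
    static double periodMicros(long idx0, long t0, long idx1, long t1) {
        return (double) (t1 - t0) / (idx1 - idx0);
    }

    static double sampleTimeMicros(long idx, long anchorIdx, long anchorMicros,
                                   double periodMicros) {
        return anchorMicros + (idx - anchorIdx) * periodMicros;
    }

    public static void main(String[] args) {
        // Anchors 500 samples apart on a nominal 250 Hz device.
        double p = periodMicros(0, 1_000_000, 500, 3_000_000);
        System.out.println(p);                                      // 4000.0 us/sample
        System.out.println(sampleTimeMicros(250, 0, 1_000_000, p)); // 2000000.0
    }
}
```

Measuring the period this way also catches ADC crystals that run slightly off their nominal rate, which matters once sessions run for tens of minutes.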

Testing Under Realistic Conditions

Lab testing with a peripheral sitting 30 cm from a phone in a quiet RF environment tells you almost nothing. Test in environments where Bluetooth congestion is real: offices with dozens of BLE devices, conference venues, and homes with active Wi-Fi routers on 2.4 GHz. BLE and Wi-Fi share the 2.4 GHz band, and coexistence issues are a primary cause of packet loss in the field.

We maintain a test suite that runs continuous 30-minute streaming sessions on a matrix of 12 Android devices spanning budget, mid-range, and flagship tiers. Any configuration that drops more than 0.1% of packets on any device in the matrix does not ship.

Reliable BLE streaming for EEG is hard to get right across the Android ecosystem. If your team is building a neurotech product and needs this level of expertise, talk to DEVSFLOW Neuro. We help neurotech companies ship firmware and mobile apps that work in the real world.