Feature Requests

[expo-cli] Allow for remote debugging
I'm using macOS to run the Expo dev server and connecting to it over SSH from VS Code on a Windows machine to work on the app. Building, coding, and running all work fine; the only problem I've hit so far is debugging. Pressing J in the CLI launches the debugger only on the host machine, which is expected, but expo-cli also seems to block debugging connections that do NOT come from the host machine, specifically in these two lines:

https://github.com/expo/expo/blob/cb7c6ddfef26b492d5fe772e5152fdee996c5e0c/packages/%40expo/cli/src/start/server/metro/debugging/createDebugMiddleware.ts#L67
https://github.com/expo/expo/blob/cb7c6ddfef26b492d5fe772e5152fdee996c5e0c/packages/%40expo/cli/src/start/server/metro/debugging/createDebugMiddleware.ts#L77

By changing them to not terminate the socket connection when the client isn't the host machine, and to return the middleware function respectively, I can reach http://macOS-IP:8081/json, get the debug metadata, install the @react-native/debugger-shell package on my Windows machine, and launch the debugger from there using the data from that metadata JSON. I even built a separate Windows application just to list the devices and launch the debugger from that metadata using this method.

Since all of this is hardcoded into expo-cli, I was wondering if there could be an option to either allow external connections to fetch the JSON data (so the standalone debugger can be launched externally), or to change both expo-cli to allow that and the standalone debugger frontend to select which app and device to debug. I believe this behavior changed fairly recently: I just upgraded the app I'm working on from an older Expo and React Native version to the newest Expo version, and I don't think expo-cli blocked external connections before.
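For context, a minimal sketch of the external-debugger flow described above: fetch Metro's `/json` target list from the host and pick an entry to hand to a standalone frontend such as @react-native/debugger-shell. The `DebugTarget` shape mirrors the Chrome-DevTools-style `/json` payload, and `pickDebugTarget` / `listTargets` are hypothetical helpers, not part of expo-cli:

```typescript
// Approximate shape of one entry from Metro's CDP-style /json endpoint.
interface DebugTarget {
  id: string;
  title: string;
  description?: string;
  webSocketDebuggerUrl: string;
}

// Hypothetical helper: pick the first target that exposes a debugger URL
// we could hand to a standalone debugger frontend.
function pickDebugTarget(targets: DebugTarget[]): DebugTarget | undefined {
  return targets.find((t) => t.webSocketDebuggerUrl.length > 0);
}

// Hypothetical helper, run from the external (Windows) machine,
// pointing at the macOS host running the dev server.
async function listTargets(host: string): Promise<DebugTarget[]> {
  const res = await fetch(`http://${host}:8081/json`);
  return (await res.json()) as DebugTarget[];
}
```

This only works today if expo-cli is patched to stop rejecting non-host connections, as described above.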
expo-audio: Add real-time PCM buffer streaming support for microphone recording
## Summary
expo-audio currently lacks the ability to stream raw PCM buffers from the microphone in real time. This is a critical missing feature for use cases like live speech recognition, audio analysis, waveform visualization, and WebSocket audio streaming.

## Current Behavior
expo-audio supports basic recording and playback, and provides PCM samples for playback visualization — but there is no way to receive a continuous stream of raw PCM buffers from the microphone during recording.

## Expected Behavior
Developers should be able to receive a real-time callback with raw PCM Float32Array buffers as audio is being captured from the microphone.

## Proposed API
```ts
const { startRecording, stopRecording } = useAudioRecorder({
  sampleRate: 16000,
  channels: 1,
  onPCMData: (buffer: Float32Array, sampleRate: number) => {
    // send to a WebSocket, run through an ML model, visualize a waveform, etc.
  },
});
```

## Proof of Demand
I built and published expo-audio-stream-pcm (https://www.npmjs.com/package/expo-audio-stream-pcm) specifically to fill this gap. The library already has a working native implementation in Kotlin (Android) that delivers real-time PCM buffers via a callback.

Other community packages solving the same problem:
- @siteed/expo-audio-studio
- expo-audio-stream (mykin-ai)
- @speechmatics/expo-two-way-audio

The existence of multiple community packages solving the same problem is strong evidence this belongs in the official SDK.

## Use Cases
- Live speech-to-text (OpenAI Whisper, Google Speech, etc.)
- Real-time audio waveform visualization
- WebSocket audio streaming to a backend
- On-device ML audio analysis

## Willingness to Contribute
I am willing to submit a PR implementing this feature, porting my existing Kotlin implementation into expo-audio and adding the corresponding iOS (Swift) implementation to match.
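To illustrate what consumers of such a callback would typically do with each buffer, here is a hedged sketch under the proposed API's assumptions (the `onPCMData` signature above is a proposal, not a shipped API; `rmsLevel` and `floatTo16BitPCM` are hypothetical helpers):

```typescript
// Hypothetical helper: root-mean-square level of one PCM buffer,
// useful for driving a level meter or waveform bar in real time.
function rmsLevel(buffer: Float32Array): number {
  let sum = 0;
  for (let i = 0; i < buffer.length; i++) {
    sum += buffer[i] * buffer[i];
  }
  return buffer.length === 0 ? 0 : Math.sqrt(sum / buffer.length);
}

// Hypothetical helper: convert float PCM in [-1, 1] to signed 16-bit
// integers, the format many streaming speech-to-text backends expect.
function floatTo16BitPCM(buffer: Float32Array): Int16Array {
  const out = new Int16Array(buffer.length);
  for (let i = 0; i < buffer.length; i++) {
    const s = Math.max(-1, Math.min(1, buffer[i])); // clamp before scaling
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}
```

An `onPCMData` callback could then forward `floatTo16BitPCM(buffer)` over a WebSocket and feed `rmsLevel(buffer)` to a visualization component.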
[expo-audio] iOS: Expose MPNowPlayingInfoPropertyIsLiveStream for live/radio streams
## Summary
expo-audio does not expose MPNowPlayingInfoPropertyIsLiveStream, which means live radio and streaming apps cannot show the native iOS LIVE indicator on the lock screen and in Control Center. Instead, iOS renders a broken scrubber with no duration, which is incorrect UX for live streams.

## Environment
- Expo SDK: 55
- expo-audio: ~55.0.8
- Platform: iOS
- Use case: live radio stream (no duration / infinite stream)

## Expected behavior
When playing a live stream, the lock screen and Control Center should show:
- a LIVE badge instead of a progress scrubber
- no elapsed time or duration display

This is the standard iOS behavior for any app streaming live audio (Apple Podcasts, Spotify, RadioApp, etc.).

## Actual behavior
- The lock screen shows a scrubber with no duration (stuck at 0 or broken)
- No LIVE indicator is shown
- There is no API in expo-audio to signal to iOS that the stream is live

## Requested API
Add an isLiveStream boolean field to updateLockScreenMetadata (or equivalent):

```ts
AudioPlayer.updateLockScreenMetadata({
  title: "Station Name",
  artist: "Live Radio",
  isLiveStream: true, // new field
});
```

On the native iOS side this should map to:

```swift
nowPlayingInfo[MPNowPlayingInfoPropertyIsLiveStream] = true
// Also clear duration and elapsed time for correctness:
nowPlayingInfo.removeValue(forKey: MPMediaItemPropertyPlaybackDuration)
nowPlayingInfo.removeValue(forKey: MPNowPlayingInfoPropertyElapsedPlaybackTime)
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
```

## Workaround
There is currently no workaround within expo-audio. Achieving this requires either a native module or dropping down to react-native-track-player, which supports isLiveStream out of the box.

## References
- Apple docs: https://developer.apple.com/documentation/mediaplayer/mpnowplayinginfopropertyislivestream
- react-native-track-player implementation: https://rntp.dev/docs/api/objects/track#islivestream
- Similar request in expo-av: expo/expo#12606
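A hedged sketch of how an app could construct metadata around the proposed flag, so a live stream never carries a duration (matching the Swift mapping the issue requests; `LockScreenMetadata` and `buildNowPlayingMetadata` are hypothetical names, and only `isLiveStream` itself is what this issue proposes adding to expo-audio):

```typescript
// Hypothetical shape: existing lock-screen fields plus the proposed flag.
interface LockScreenMetadata {
  title: string;
  artist?: string;
  isLiveStream?: boolean;   // proposed field
  durationSeconds?: number; // must be omitted for live streams
}

// Hypothetical helper: for live streams, set the flag and drop any
// duration so iOS can show the LIVE badge instead of a scrubber.
function buildNowPlayingMetadata(opts: {
  title: string;
  artist?: string;
  live: boolean;
  durationSeconds?: number;
}): LockScreenMetadata {
  const meta: LockScreenMetadata = { title: opts.title, artist: opts.artist };
  if (opts.live) {
    meta.isLiveStream = true; // maps to MPNowPlayingInfoPropertyIsLiveStream
  } else if (opts.durationSeconds != null) {
    meta.durationSeconds = opts.durationSeconds;
  }
  return meta;
}
```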