Description
Summary
After leaving a LiveKit session and awaiting room.disconnect(), app memory remains high. Instruments (Leaks) and the Debug Memory Graph report leaked LiveKit internals related to data channels. I perform a full teardown, switched strong track/publication references to weak as recommended in the docs, and verified that my view model deinits. Still, several LiveKit objects remain alive and memory grows by ~30 MB on each reconnect cycle.
Environment
- LiveKit iOS SDK: 2.4.0 and 2.9.0 (updated today to check if it fixes the leaks)
- iOS / iPadOS: 18.6.2
- App stack: SwiftUI + ARKit. I publish AR frames via a custom ARVideoCapturer to a BufferCapturer → LocalVideoTrack(source: .screenShareVideo).
Memory profile
- Start/Idle (no room): ~80 MB
- In room (AR video + mic): ~800 MB
- After leaving the room: ~550 MB
- After every subsequent rejoin: +30 MB retained
(e.g., 1st leave 550 MB → 2nd leave 580 MB → 3rd leave 610 MB, …)
Switching all LiveKit track/publication references to weak recovered ~130 MB (from ~680 MB to ~550 MB), but the main issue persists.
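For reference, the numbers above can be reproduced with a footprint helper along these lines (a minimal sketch; phys_footprint from task_vm_info tracks Xcode's memory gauge closely):

import Darwin

// Sketch: read the process physical footprint (roughly what Xcode's memory gauge shows).
func currentFootprintMB() -> Double {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<integer_t>.size
    )
    let result = withUnsafeMutablePointer(to: &info) {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), $0, &count)
        }
    }
    guard result == KERN_SUCCESS else { return -1 }
    return Double(info.phys_footprint) / 1_048_576
}

// e.g. print("footprint after leave: \(currentFootprintMB()) MB")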
Leaked objects reported by Debug Memory Graph
- DataChannelPair
- LiveKit.AsyncCompleter
- LiveKit.MulticastDelegate<LiveKit.DataChannelDelegate>
- LiveKit.StateSync<LiveKit.DataChannelPair.(State in _5E1FD0D2F6306987E45F26575B668A19)>
- LiveKit.StateSync<LiveKit.MulticastDelegate<LiveKit.DataChannelDelegate>.(State in _5E24999AC9573707969E43B46E111503)>
- LiveKit.TTLDictionary<Swift.String, Swift.UInt32>
- MutexWrapper (retained via LiveKit.AsyncCompleter and LiveKit.MulticastDelegate<…> paths)
- Foundation objects (retained via LiveKit.MulticastDelegate<…> paths)
- Plus many other LiveKit.StateSync<…> entries, but those are not flagged as leaking.
Note: After my leave flow, the app-side objects do deinit (verified with logs): SessionViewModel, its Room, ARView, ARVideoCapturer, and all local tracks/publications.
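The deinit checks are plain logs of this form (illustrative):

deinit {
    // Fires once the view model is actually released after disconnect()
    print("SessionViewModel deinit")
}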
Expected
- After await room.disconnect() and releasing app-side references, internal LiveKit data-channel structures should be released.
- Memory should return near the pre-join baseline.
Actual
- DataChannelPair, AsyncCompleter, MulticastDelegate<DataChannelDelegate>, and StateSync<…> remain in memory; Instruments flags them as leaks.
- Memory remains elevated (~550 MB after the first leave) and increases by ~30 MB per rejoin.
App-side code
Properties (LiveKit references kept weak per docs)
final class SessionViewModel: NSObject, ObservableObject, Identifiable {
    @Published var room = Room()

    // Tracks & publications are weak per LiveKit recommendation
    private weak var sendingVideoTrack: LocalVideoTrack?
    private weak var sendingUserVideoTrack: LocalVideoTrack?
    private weak var sendingUserAudioTrack: LocalAudioTrack?
    private weak var sendingVideoTrackPublication: LocalTrackPublication?

    // Capturer bridging ARView snapshots to BufferCapturer
    var videoCapturer: ARVideoCapturer?

    // AR
    var arView: ARView?

    // …
}
connect() (publish user audio)
func connect() async throws {
    let serverURL = ...
    let sessionAccessToken = ...

    room.add(delegate: self)
    try await room.connect(url: serverURL, token: sessionAccessToken)

    do {
        let audioTrack = LocalAudioTrack.createTrack(name: "UserAudio")
        try await room.localParticipant.publish(audioTrack: audioTrack)
        self.sendingUserAudioTrack = audioTrack
    } catch {
        throw SessionError.noAudioDeviceAvailable
    }
}
startSendARVideo() (create buffer track, start custom capturer, publish)
@MainActor
func startSendARVideo() async throws {
    let captureOptions = BufferCaptureOptions(dimensions: .h720_169, fps: 30)
    let videoTrack = LocalVideoTrack.createBufferTrack(
        name: "ARVideo",
        source: .screenShareVideo,
        options: captureOptions
    )
    self.sendingVideoTrack = videoTrack

    guard let bufferedCapturer = videoTrack.capturer as? BufferCapturer else {
        throw SessionError.noBufferCapturerAvailable
    }

    let capturer = ARVideoCapturer(bufferedCapturer: bufferedCapturer)
    capturer.isCoachingActiveProvider = { [weak self] in self?.isCoachingActive ?? false }
    capturer.startCapture(of: arView)
    self.videoCapturer = capturer

    let screenShareEncoding = VideoEncoding(maxBitrate: 3_000_000, maxFps: 30)
    let publishOptions = VideoPublishOptions(
        name: "ARVideo",
        encoding: nil,
        screenShareEncoding: screenShareEncoding,
        simulcast: false,
        simulcastLayers: [],
        screenShareSimulcastLayers: [],
        preferredCodec: nil,
        preferredBackupCodec: nil,
        degradationPreference: .maintainFramerate,
        streamName: nil
    )

    let pub = try await room.localParticipant.publish(videoTrack: videoTrack, options: publishOptions)
    self.sendingVideoTrackPublication = pub
}
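For context, the bridge inside ARVideoCapturer boils down to forwarding CVPixelBuffers into the BufferCapturer. A simplified sketch of that pattern (hypothetical ARFrameBridge, not my full class; it delivers frames via ARSessionDelegate and assumes BufferCapturer has a capture(_:) overload that accepts a CVPixelBuffer):

import ARKit
import LiveKit
import RealityKit

// Simplified sketch of the ARKit → BufferCapturer bridge (illustrative only).
final class ARFrameBridge: NSObject, ARSessionDelegate {
    private weak var bufferCapturer: BufferCapturer?

    init(bufferedCapturer: BufferCapturer) {
        self.bufferCapturer = bufferedCapturer
    }

    func startCapture(of arView: ARView?) {
        arView?.session.delegate = self
    }

    func stopCapture() {
        bufferCapturer = nil
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Assumption: BufferCapturer exposes a capture(_:) overload taking a CVPixelBuffer.
        bufferCapturer?.capture(frame.capturedImage)
    }
}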
stopSendARVideo() (unpublish + explicit stop)
@MainActor
func stopSendARVideo() async throws {
    if let sendingVideoTrackPublication {
        try await room.localParticipant.unpublish(publication: sendingVideoTrackPublication)
    }

    videoCapturer?.stopCapture()
    sendingVideoTrackPublication = nil

    if let videoTrack = sendingVideoTrack {
        try await videoTrack.stop()
        sendingVideoTrack = nil
    }
}
disconnect() (teardown, then replace room)
@MainActor
func disconnect() async throws {
    // Unpublish & stop capture
    try await stopSendARVideo()

    // Clean up LiveKit
    await room.localParticipant.unpublishAll()
    room.remove(delegate: self)
    await room.disconnect()

    // Pause AR
    if let arView { arView.session.pause() }

    // Clean up tracks
    sendingVideoTrackPublication = nil
    try await sendingVideoTrack?.stop()
    sendingVideoTrack = nil
    try await sendingUserVideoTrack?.stop()
    sendingUserVideoTrack = nil
    try await sendingUserAudioTrack?.stop()
    sendingUserAudioTrack = nil

    // Clean up capturer + ARView
    videoCapturer?.stopCapture()
    videoCapturer = nil
    arView = nil
}
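To rule out an Instruments false positive, a weak probe on the old Room shows whether it (and, transitively, its data-channel internals) really deallocates. A minimal sketch, assuming the caller swaps in a fresh Room() right after disconnect():

// Sketch: verify whether the old Room instance is actually released after teardown.
@MainActor
func leaveAndProbe() async throws {
    weak var oldRoom: Room? = room       // non-owning reference to the current room

    try await disconnect()               // full teardown shown above
    room = Room()                        // drop the last strong app-side reference

    try? await Task.sleep(nanoseconds: 2_000_000_000)  // give async teardown time to finish
    print("old Room still alive after disconnect:", oldRoom != nil)
}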
Questions
Am I missing something that still needs to be released?