Some frequently asked questions about Larix mobile apps and SDK
Let's say you have a streaming URL rtmp://a.rtmp.youtube.com/live2 and a stream key abcd-efgh-abcd-efgh. To start streaming, create a connection with the following connection URL: rtmp://a.rtmp.youtube.com/live2/abcd-efgh-abcd-efgh. Enter it into the corresponding field and start streaming.
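As a minimal sketch, the connection URL is just the streaming URL with the stream key appended as the final path segment (the values below are the placeholder examples from above):

```java
public class ConnectionUrl {
    // Builds the full connection URL by appending the stream key
    // as the final path segment of the streaming URL.
    static String buildConnectionUrl(String streamUrl, String streamKey) {
        // Avoid a double slash if the streaming URL already ends with one
        if (streamUrl.endsWith("/")) {
            return streamUrl + streamKey;
        }
        return streamUrl + "/" + streamKey;
    }

    public static void main(String[] args) {
        String url = buildConnectionUrl("rtmp://a.rtmp.youtube.com/live2",
                                        "abcd-efgh-abcd-efgh");
        System.out.println(url); // rtmp://a.rtmp.youtube.com/live2/abcd-efgh-abcd-efgh
    }
}
```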
Take a look at the Stream From Mobile Using Larix Broadcaster article in the Restream.io help center.
RTMP has several authentication methods; the default one is Adobe authentication. Unfortunately, it is a proprietary technology that is subject to patent claims. To avoid any patent infringement, we do not support it.
Instead, we suggest using parameter-based authentication. Nimble Streamer supports parameters in the URL out of the box, and Wowza has ModuleSecureURLParams for this. Check this article for an example of parameter usage.
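A sketch of what parameter-based authentication looks like on the client side: the credentials are passed as query parameters appended to the connection URL. The parameter names below ("user", "pass") are illustrative assumptions; the actual names depend on how your server is configured.

```java
public class AuthUrl {
    // Appends user/password query parameters to a connection URL.
    // Parameter names are examples; match them to your server's configuration.
    static String withAuthParams(String connectionUrl, String user, String pass) {
        // Use '&' if the URL already has a query string, '?' otherwise
        String sep = connectionUrl.contains("?") ? "&" : "?";
        return connectionUrl + sep + "user=" + user + "&pass=" + pass;
    }

    public static void main(String[] args) {
        System.out.println(withAuthParams("rtmp://example.com/live/mystream",
                                          "alice", "s3cret"));
    }
}
```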
Another option is to use RTSP; its authentication is fully supported by both our SDK and all major media servers.
iOS supports the following video formats: H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1. Please refer to this article and this article for details.
On Android, please refer to this article on profile and this article on level. You need Android 5.0 for profile and Android 6.0 for level. Note that support for a given profile/level combination depends on the device's hardware.
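The version requirements above translate to API-level checks: setting the encoder profile needs API 21 (Android 5.0) and setting the level needs API 23 (Android 6.0). Below is a self-contained sketch of that gating; the real Android SDK call sites are shown only as comments, so this is an illustration, not the SDK's API.

```java
public class ProfileLevelSupport {
    static final int API_PROFILE = 21; // Android 5.0: MediaFormat.KEY_PROFILE becomes available
    static final int API_LEVEL = 23;   // Android 6.0: MediaFormat.KEY_LEVEL becomes available

    // Decides which encoder keys may be set on the given API level.
    // On a real device you would then call, for example:
    //   format.setInteger(MediaFormat.KEY_PROFILE, CodecProfileLevel.AVCProfileHigh);
    //   format.setInteger(MediaFormat.KEY_LEVEL, CodecProfileLevel.AVCLevel41);
    static String supportedKeys(int sdkInt) {
        if (sdkInt >= API_LEVEL) return "profile+level";
        if (sdkInt >= API_PROFILE) return "profile";
        return "none";
    }

    public static void main(String[] args) {
        System.out.println(supportedKeys(19)); // none
        System.out.println(supportedKeys(21)); // profile
        System.out.println(supportedKeys(23)); // profile+level
    }
}
```

Remember that even when the keys can be set, the actual profile/level combination must still be supported by the device's hardware encoder.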
On iOS, video recording is impossible in the background (the capture session will be interrupted by the OS, and there is no way to prevent this). Audio recording in the background would need additional investigation; let us know if you need it.
On Android, it's possible to run the streamer instance in a background service; Larix Screencaster works this way. However, it's bad practice to keep control of the camera in a background service. Applications should release the camera immediately in onPause() (please refer to this documentation). If you still want to try video streaming from a background service, you can implement your own camera capture and feed the streamer with camera images while in the background; refer to the Camera2Demo example in the SDK package.
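The lifecycle rule above can be sketched with a placeholder camera interface (the Camera type here is a stand-in, not the Android camera API): acquire the camera in onResume() and release it in onPause(), so the app never holds the camera while another app is in the foreground.

```java
public class CameraLifecycle {
    // Placeholder for the platform camera; not the real Android Camera API.
    interface Camera {
        void open();
        void release();
    }

    private final Camera camera;
    private boolean cameraOpen;

    CameraLifecycle(Camera camera) {
        this.camera = camera;
    }

    // Acquire the camera when the activity comes to the foreground.
    void onResume() {
        if (!cameraOpen) {
            camera.open();
            cameraOpen = true;
        }
    }

    // Release the camera immediately when the activity leaves the
    // foreground, so other applications can use it.
    void onPause() {
        if (cameraOpen) {
            camera.release();
            cameraOpen = false;
        }
    }

    boolean isCameraOpen() {
        return cameraOpen;
    }
}
```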
On Android we use pure Java.
On iOS, the Larix Broadcaster sample application is written in Swift 3 and uses a static streaming library. To use our library with Objective-C you only need the AVFoundation and CoreImage frameworks (CoreImage is used only to implement the live rotation feature). You can convert the AVFoundation-related Swift logic to Objective-C one-to-one; Apple has an example on this page.
On iOS, we rely on iOS File Sharing; see this article. Until iOS 11 is released, you'll need iTunes to download or delete individual files. If you want to customize the recording path, refer to Streamer.swift / startRecord() and adjust the code block below:
// Locate the app's Documents directory, which is exposed via iTunes File Sharing
let documents = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
// Build a timestamp-based file name, e.g. "MVI_20170115093045.mp4"
let df = DateFormatter()
df.dateFormat = "yyyyMMddHHmmss"
let fileName = "MVI_" + df.string(from: Date()) + ".mp4"
let fileUrl = documents.appendingPathComponent(fileName)
On iOS, you can apply any CoreImage filter to the outgoing video stream. You implement CoreImage filters directly, the same way you would apply them to a photo. Please refer to this Apple article.
On Android, it's possible to stream any picture; see Larix Screencaster as an example. You render to an OpenGL Surface, and its contents are encoded and streamed. It's also possible to implement custom post-processing of the camera image: get the preview from the camera, apply a filter using OpenGL, and finally render the result to the Surface.
No. iOS doesn't offer this capability to third-party apps.