FAQ for Larix mobile apps and SDK

Some frequently asked questions about Larix mobile apps and SDK

Q1: How do I stream to YouTube Live?

Let's say you have a streaming URL rtmp://a.rtmp.youtube.com/live2 and a stream key abcd-efgh-abcd-efgh. To start streaming, create a connection with the following connection URL: rtmp://a.rtmp.youtube.com/live2/abcd-efgh-abcd-efgh. Enter it into the corresponding field and start streaming.

Q2: How do I stream from Larix to Restream.io?

Take a look at the Stream From Mobile Using Larix Broadcaster article in the Restream.io help center.

Q3: How can I do authenticated streaming via RTMP?

RTMP has several authentication methods; the default one is Adobe authentication. Unfortunately, it's a proprietary technology which may be subject to patent claims, so to avoid any patent infringement we do not support it.
Instead, we recommend parameter-based authentication. Nimble Streamer supports parameters in the URL out-of-the-box, and Wowza has ModuleSecureURLParams for this. Check this article for an example of parameter usage.
Another option is to use RTSP: its authentication is fully supported both by our SDK and by all major media servers.
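As an illustration, parameter-based authentication typically means appending credentials as query parameters to the connection URL. The server address and the parameter names below (user, psw) are hypothetical; the actual names depend on your media server configuration:

```
rtmp://example.com/live/mystream?user=larix&psw=secret
```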

Q4: Can I set input gain (incoming audio volume)?

On iOS you can set input gain using the standard AVAudioSession API. Please refer to this article and this article.
Android doesn't provide an API to set mic input gain.
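A minimal sketch of the iOS approach using the standard AVAudioSession API; the helper function name is ours. Note that not every device allows changing input gain, so the code checks isInputGainSettable first:

```swift
import AVFoundation

// Sketch: set microphone input gain (0.0...1.0) via AVAudioSession.
func setMicInputGain(_ gain: Float) {
    let session = AVAudioSession.sharedInstance()
    // Some devices do not allow changing input gain at all.
    guard session.isInputGainSettable else {
        print("Input gain is not settable on this device")
        return
    }
    do {
        // Clamp to the valid 0.0...1.0 range before applying.
        try session.setInputGain(min(max(gain, 0.0), 1.0))
    } catch {
        print("Failed to set input gain: \(error)")
    }
}
```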

Q5: Can I set specific profile and level for output stream encoding?

iOS supports the following video encoding formats: H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1. Please refer to this article and this article for details.
On Android, please refer to this article on profile and this article on level. You need Android 5.0 for profile selection and Android 6.0 for level selection. Also note that profile/level combination support depends on the device's hardware.
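On the iOS side, here is a hedged sketch of requesting a specific H.264 profile/level through standard AVFoundation compression settings. How the Larix SDK itself exposes encoder settings may differ, so treat this only as an illustration of the underlying system API:

```swift
import AVFoundation

// Sketch: request H.264 High Profile Level 4.1 for an AVAssetWriter video input.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720,
    AVVideoCompressionPropertiesKey: [
        // Standard AVFoundation constant for High Profile, Level 4.1.
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High41
    ]
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
```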

Q6: Can I make my application perform streaming from the background?

In other words: can streaming continue when the app is closed, is in the background, or when the device is locked?

On iOS video recording is impossible in the background: the capture session will be interrupted by the OS, and there is no way to survive this. Audio recording in the background would need additional investigation; let us know if you need this.
On Android it's possible to put the streamer instance into a background service; Larix Screencaster works this way. However, keeping control over the camera in a background service is bad practice: applications should release the camera immediately in onPause() (please refer to this documentation). If you still want to try video streaming from a background service, you can implement your own camera capture and feed the streamer with camera frames while in the background; refer to the Camera2Demo example in the SDK package.

Q7: What languages do you use in your SDK?

On Android we use pure Java.
On iOS the Larix Broadcaster sample application is written in Swift with a static streaming library. To use our library from Objective-C you need only the AVFoundation and CoreImage frameworks (CoreImage is used only to implement the live rotation feature). You can convert the AVFoundation-related Swift logic to Objective-C one-to-one; Apple has an example on this page.

Q8: Where can I find saved files after video recording?

On iOS we rely on iOS File Sharing; see this article. Until iOS 11 is released, you'll need iTunes to download or delete individual files. If you want to customize the path, refer to Streamer.swift / startRecord() and adjust the code block below:

// Default recording location: the app's Documents directory
let documents = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
// Name files by timestamp, e.g. MVI_20240101120000.mp4
let df = DateFormatter()
df.dateFormat = "yyyyMMddHHmmss"
let fileName = "MVI_" + df.string(from: Date()) + ".mp4"
let fileUrl = documents.appendingPathComponent(fileName)

On Android we use DCIM/LarixBroadcaster on the internal storage. DCIM on the external storage is not supported due to an Android limitation. See this article for more details.

Q9: How can I apply image or text or animation overlay on the outgoing stream?

On iOS you can apply any CoreImage filter to the outgoing video stream. You implement CoreImage filters directly, the same way you would apply them to a photo. Please refer to this Apple article.
On Android it's possible to stream any picture; see Larix Screencaster as an example. You render onto an OpenGL Surface, and the result is encoded and streamed. It's also possible to implement custom camera image post-processing: get the preview from the camera, apply a filter using OpenGL, and render the result to a Surface.
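The iOS approach above can be sketched with a standard CIFilter applied per frame. The helper function below is hypothetical; in a real pipeline the CIImage would be created from the camera's pixel buffer before it goes to the encoder:

```swift
import CoreImage

// Sketch: apply a sepia-tone CoreImage filter to a single video frame.
func applySepia(to frame: CIImage) -> CIImage {
    // CISepiaTone is one of the built-in Core Image filters.
    guard let filter = CIFilter(name: "CISepiaTone") else { return frame }
    filter.setValue(frame, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    // Fall back to the unfiltered frame if the filter produces no output.
    return filter.outputImage ?? frame
}
```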

Q10: Does iOS SDK support flash light?

No. iOS doesn't expose this capability to third-party apps.

Mobile solution usage example

Take a look at the Mobile Solutions snapshot to see it used together with other Softvelum products in action.