FAQ for Larix apps and SDKs
First, check the detailed pages of the respective SDKs:
- Larix Broadcaster SDK for Android
- Larix Broadcaster SDK for iOS
- Larix Player SDK for Android
- Larix Player SDK for iOS
Q1: Is there a trial for your SDKs?
You can try our SDKs in action using the corresponding freeware applications available in the app stores. The SDKs include the source code of those apps.
Also take a look at the architecture overviews of Larix Broadcaster for Android and for iOS.
We also have a couple of tutorial apps; they are provided as part of the SDKs to show how to start development.
Q2: Are there any limitations on SDKs usage?
No. You may create any number of applications for publishing by your company, based on the purchased SDKs. If you unsubscribe, your apps will remain fully functional.
Q3: Can I use your SDK if my SDK subscription is canceled?
Yes, you can use the SDK and release your apps even if the subscription was canceled after one or more months of payments. However, you will not receive SDK updates, nor will you be able to get our technical support.
Q4: I was subscribed to an SDK before and then canceled. How can I get updates now?
Just subscribe again to the SDKs you're interested in.
Q5: Can you make a branded app for me and submit it in stores?
No. Our team is concentrated on the core product, so we recommend hiring mobile development professionals who can do custom branding for you.
This won't take much money or time, though. Just subscribe to the SDK you need and hand it to an integrator of your choice; they'll complete the job within a couple of days, depending on the level of customization.
You may also consider integrator companies which have experience with our products. They are not affiliated with Softvelum, but it's definitely worth contacting them.
Q6: Can I develop apps with your SDK for my customers?
You must have a separate SDK subscription for each of your customers.
Re-sale or re-distribution of the SDKs or their parts outside of the original subscriber organization is forbidden.
So yes, you can develop apps for other companies, but for each new customer you will have to purchase a separate subscription under the same terms as for your own development.
Q7: How do I know about SDK updates?
If you are subscribed to any SDK, you will get notifications about updates via email.
You may also check the SDK releases history page to see what we have at the moment.
Q8: How do I stream to YouTube Live, Facebook Live or my media server?
Visit the Larix documentation reference to see the full list of instructions for various platforms like YouTube, Facebook, Twitch, Restream.io, DaCast and others.
Q9: Can I set specific profile and level for output stream encoding?
iOS supports the following video formats: H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1. Please refer to this article and this article for details.
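For reference, on iOS the profile and level are applied through VideoToolbox. Here is a minimal sketch (the helper function is ours, and it assumes you already have a VTCompressionSession created elsewhere):

import VideoToolbox

// Hypothetical helper: request H.264 High Profile Level 4.1,
// one of the combinations listed above
func setHighProfile41(on session: VTCompressionSession) {
    let status = VTSessionSetProperty(session,
                                      key: kVTCompressionPropertyKey_ProfileLevel,
                                      value: kVTProfileLevel_H264_High_4_1)
    if status != noErr {
        print("Failed to set profile/level: \(status)")
    }
}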
On Android, please refer to this article on profile and this article on level. You need Android 5.0 for profile and Android 6.0 for level. Note that profile/level combination support depends on the device's hardware.
Q10: How does Larix handle bitrate setup?
On Android, you set the bitrate parameter by simply typing the value.
On iOS, the bitrate parameter has a predefined set of values, and by default it's selected based on the resolution.
In general, once you define a bitrate (or use the default one), the device encoder will use it as the target bitrate and Larix will publish the stream with that bitrate. If network conditions get worse, you will see frame loss at some point for RTMP and RTSP. SRT will try to compensate for that with its error recovery within the "latency" parameter period.
If you know your network will not be reliable, you can enable the Adaptive bitrate feature.
The Bitrate matches resolution option is also available in case you don't know the exact value. The following rules apply (see the sketch after this list):
- Bitrate is selected according to this table based on resolution:
["2160":4500, "1080":3000, "720":2000, "540":1500, "480":1000, "360":700, "288":500, "144":300]
- If you use HEVC, the bitrate is multiplied by 0.5.
- If you use 50 FPS or above, the bitrate is multiplied by 1.6.
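To make these rules concrete, here is a sketch of the computation in Swift (our own illustration, not the actual SDK code; we assume the table values are in Kbps and the keys are vertical resolutions):

// Hypothetical helper illustrating the "Bitrate matches resolution" rules above
func suggestedBitrateKbps(height: Int, isHEVC: Bool, fps: Double) -> Int {
    // Base bitrate (Kbps) by vertical resolution, per the table above
    let table: [(minHeight: Int, kbps: Int)] = [
        (2160, 4500), (1080, 3000), (720, 2000), (540, 1500),
        (480, 1000), (360, 700), (288, 500), (144, 300)
    ]
    var kbps = table.last!.kbps   // fall back to the lowest entry
    for row in table where height >= row.minHeight {
        kbps = row.kbps
        break
    }
    var result = Double(kbps)
    if isHEVC { result *= 0.5 }    // HEVC rule above
    if fps >= 50 { result *= 1.6 } // high frame rate rule above
    return Int(result)
}

For example, 1080p HEVC at 60 FPS gives 3000 * 0.5 * 1.6 = 2400 Kbps.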
Q11: How does adaptive bitrate (ABR) work?
Adaptive video bitrate is supported in 3 modes:
- Logarithmic descend - gracefully descends from the max bitrate step by step. It retries to raise back to the previous step every minute. Best fit for good networks.
- Ladder ascend - first cuts the bitrate by 2/3 and then increases it back to normal as much as possible. It retries to raise back to the previous steps after 15 seconds, then 1.5 and 5 minutes. Best fit for networks with big losses.
- Hybrid approach - calculates the percentage of actually delivered packets and decreases the target bitrate by that ratio. The minimum bitrate is 25% of the target. Larix tries to restore the bitrate every 60 seconds in 500 Kbps steps.
- Variable FPS can be used as an option; it will reduce the bitrate by decreasing FPS in addition to changing the bitrate value.
The trigger for switching to a lower bitrate or frame rate is the number of lost packets over a certain period of time.
For Logarithmic descend it's 4 packets over the last 10 seconds.
For Ladder ascend it equals "bitrate/300000" for the last 10 seconds, e.g. for 2 Mbps it's 6 packets.
Example for hybrid: you set the target bitrate to 6000 Kbps while the actual outgoing bitrate is 5000 Kbps. Due to network failures, the real delivery bitrate drops by 50% to 2500 Kbps, so the target bitrate is reduced by half to 3000 Kbps. A minute later Larix tries to restore it to 3500 Kbps.
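Here is a rough Swift sketch of that hybrid logic (our own illustration with our own names; the actual implementation lives in the StreamConditioner classes mentioned below):

// Hypothetical model of the hybrid ABR mode described above
struct HybridABR {
    let targetKbps: Double   // bitrate configured by the user
    var currentKbps: Double  // bitrate currently requested from the encoder

    // Called with delivery stats for the last measurement interval
    mutating func update(sentKbps: Double, deliveredKbps: Double) {
        guard sentKbps > 0 else { return }
        let deliveredRatio = deliveredKbps / sentKbps
        if deliveredRatio < 1.0 {
            // Decrease by the ratio of actually delivered data,
            // but never below 25% of the configured bitrate
            currentKbps = max(currentKbps * deliveredRatio, targetKbps * 0.25)
        }
    }

    // Called every 60 seconds to restore the bitrate in 500 Kbps steps
    mutating func tryRestore() {
        currentKbps = min(currentKbps + 500, targetKbps)
    }
}

With the numbers from the example above: update() turns 6000 Kbps into 6000 * 0.5 = 3000 Kbps, and the next tryRestore() raises it to 3500 Kbps.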
For RTMP and RTSP connections we count lost packets' stats ourselves.
On iOS 11+, the packets are not lost but are kept in the system buffer, so if ABR is not used, there will be an increase in delivery delay.
For SRT, packet loss is defined by the pktSndDrop property and depends on how SRT handled the loss in accordance with latency and other internal factors.
Can the ABR implementation be changed?
For iOS, you can check the ABR logic in StreamConditioner.swift.
For Android, it's defined across StreamConditionerBase.java, StreamConditionerMode1.java and StreamConditionerMode2.java.
Q12: Larix FPS is set to 25FPS, but my decoder shows 30FPS or no FPS data at all.
The short answer is: mobile encoders do not add proper SPS information into the content, which causes some decoders to get confused and fall back to a default value like "30".
On iOS we use the system encoder. In general, the frame rate it produces may be variable, and we cannot control it. At the moment we can only use the encoder setting which "recommends" a certain frame rate.
This is how it's described in Apple docs: "This is not used to control the frame rate; it is provided as a hint to the video encoder so that it can set up internal configuration before compression begins. The actual frame rate will depend on frame durations and may vary."
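That hint corresponds to the kVTCompressionPropertyKey_ExpectedFrameRate property. A minimal sketch of setting it (the helper function is ours, assuming an existing VTCompressionSession):

import Foundation
import VideoToolbox

// Hypothetical helper: recommend a frame rate to the encoder.
// This is only a hint; the actual frame rate may still vary.
func hintFrameRate(_ session: VTCompressionSession, fps: Int) {
    let status = VTSessionSetProperty(session,
                                      key: kVTCompressionPropertyKey_ExpectedFrameRate,
                                      value: NSNumber(value: fps))
    if status != noErr {
        print("Failed to set ExpectedFrameRate: \(status)")
    }
}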
The same applies to Android. There we can select a frame rate range from a pre-defined list of ranges supported by the encoder. Some ranges may contain just one value (30..30) - that's a "fixed frame rate"; some contain real ranges (1..25) - that's a "variable frame rate". But that's also only a recommendation.
A notice on 60 FPS support: most Android devices with 60 fps cameras do not expose this capability to third-party apps, so only the native camera app can use it. Even if your device supports 60 fps, most probably Larix won't be able to use it.
There's no way to set a definite FPS at the moment. The output content will not have proper frame rate information in SPS; the encoder just doesn't provide it, so we cannot provide it either.
Q13: Can I make my application perform streaming from the background?
In other words: how can we stream when the app is closed, is in the background, or when the device is locked?
On Android it's available as a non-default feature. To use it, go to the "Advanced options" menu, enable Developer options, then enable "Background streaming" and restart the app by double-tapping the Back button.
On iOS it's supported in the Larix Screencaster application, which you can use as an example.
Q14: Can I set input gain (incoming audio volume)?
Q15: Why doesn't Larix Screencaster provide audio from my apps?
Android 9 and earlier do not allow applications to capture audio from other applications. This is a security constraint, so Larix is not able to do that.
On Android 10, app audio recording is allowed from apps which support external recording. To use this option, choose Audio -> Sound settings -> Media sounds.
The iOS platform allows capturing the screen of the user's device but puts some limitations on audio: if you stream your screen, you can only use your microphone.
If the currently opened application supports ReplayKit, you'll be able to stream its sound.
Q16: What languages do you use in your SDK?
On Android we use pure Java.
On iOS, the Larix Broadcaster sample application is written in Swift with a static streaming library. To use our library with Objective-C you need only the AVFoundation and CoreImage frameworks (CoreImage is used only to implement the live rotation feature). You can convert the AVFoundation-related Swift logic to Objective-C one-to-one; Apple has an example on this page.
Q17: Where can I find saved files after video recording?
On iOS, recordings can be saved to iCloud Drive and the Photo Library.
If you want to customize the path via the SDK, refer to Streamer.swift / startRecord() and customize the code block below:
// Get the app's Documents directory
let documents = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
// Build a timestamped file name like "MVI_20250101120000.mp4"
let df = DateFormatter()
df.dateFormat = "yyyyMMddHHmmss"
let fileName = "MVI_" + df.string(from: Date()) + ".mp4"
let fileUrl = documents.appendingPathComponent(fileName)
On Android 8+ you can select any destination for recording, including an SD card. On earlier OS versions we use /DCIM/LarixBroadcaster/ on the internal storage due to an Android limitation.
Q18: How can I combine together the files with split recording?
You may install MP4Box and run a command like "MP4Box -add input_file1.mp4 -cat input_file2.mp4 output_file.mp4".
Q19: How can I apply image or text or animation overlay on the outgoing stream?
On iOS you can apply any CoreImage filter to the outgoing video stream. You implement CoreImage filters directly, the same way you would apply them to a photo; please refer to this Apple article. You can see an example of an overlay in StreamerSingleCam.swift of LarixSample - there is an internal function overlay() inside rotateAndEncode; uncomment its call to see the result.
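For instance, a simple logo overlay can be done with the CISourceOverCompositing filter (a minimal sketch with our own naming, not the sample's exact code):

import CoreImage

// Hypothetical helper: composite a logo over a camera frame
func applyOverlay(to frame: CIImage, logo: CIImage) -> CIImage {
    guard let filter = CIFilter(name: "CISourceOverCompositing") else { return frame }
    filter.setValue(logo, forKey: kCIInputImageKey)            // foreground
    filter.setValue(frame, forKey: kCIInputBackgroundImageKey) // background
    return filter.outputImage ?? frame
}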
On Android it's possible to stream any picture; check MainActivityGLES.java / mCameraLogo. You can find more extended examples in the Camera2demo and CameraFX sample applications from the Android SDK.
If you'd like to make an overlay based on a web page, you can convert the browser window into a bitmap and then make an overlay with it. On Android it's described here; on iOS, you'll need to convert a WebView into a bitmap.
Q20: My Android device has multiple rear and front cameras. How can I use them?
Larix shows all cameras which are available via the system API. Android 10+ provides extended capabilities for capturing from physical cameras, and Larix supports that. However, many manufacturers allow using the additional cameras only in their own apps, so if a device doesn't expose them, Larix won't be able to work with them.
Q21: What is the "Max buffer items" field for in Developer options?
This field defines the maximum number of items stored in the buffer before sending out. Each item is a video or audio frame, so for 1 second of a 30 fps stream you will have 30 frames of video and about 40 frames of audio, which is 70 items total. The bigger the value you set, the bigger the streaming latency you may get.
The default value is "300". The minimum value is "70".
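As a back-of-the-envelope estimate based on the numbers above (our own arithmetic, not an SDK value):

let itemsPerSecond = 70.0   // ~30 video + ~40 audio frames per second
let maxBufferItems = 300.0  // the default value
let worstCaseDelay = maxBufferItems / itemsPerSecond  // ≈ 4.3 seconds of extra latency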
Mobile solution usage example
Take a look at the Mobile solutions snapshot to see other Softvelum products usage in action.
If you still have questions, contact us.