Banuba AI Video Editor SDK allows you to quickly add short video functionality, along with optional AR filters and effects, to your mobile app. On this page, we explain how to integrate it into an Android app.
- Requirements
- Dependencies
- Video Editor SDK size
- Starting a free trial
- Supported media formats
- Camera recording video quality params
- Export video quality params
- Token
- Connecting with AR cloud
- What can you customize?
- FFmpeg build issue (Error compressed Native Libs)
- Integration
- Customization
- Disable Face AR SDK
- Configure export flow
- Configure masks and filters order
- Configure watermark
- Configure media content
- Configure audio content
- Configure audio browser
- Configure stickers content
- Configure the record button
- Configure camera timer
- Configure Cover preview screen
- Configure screens
- Configure additional Video Editor SDK features
- Check Video Editor SDK availability before opening
- Localization
- FAQ
- Third party libraries
This is what you need to run the AI Video Editor SDK:
- Java 1.8+
- Kotlin 1.4+
- Android Studio 4+
- Android OS 6.0 or higher with Camera 2 API
- OpenGL ES 3.0 (3.1 for Neural networks on GPU)
- Koin
- ExoPlayer
- Glide
- Kotlin Coroutines
- ffmpeg
- AndroidX libraries
- Banuba Face AR SDK (optional). The Video Editor SDK disables Face AR on devices with an armv7l CPU (8 cores) or an armv8 CPU working in 32-bit mode.
Please see the full list of dependencies used.
If you want to use the Video Editor SDK for a short video app like TikTok, the Face AR module would be useful for you, as it allows you to add masks and other AR effects. If you just need the video editing-related features, the AI Video Editor SDK can work on its own.
| Option | Size (MB) | Note |
|---|---|---|
| ✅ Face AR SDK | 74.3 | AR effect sizes are not included; an AR effect takes 1–3 MB on average. |
| ❌ Face AR SDK | 43.2 | No AR effects |
You can either include the filters in the app or have users download them from the AR cloud to decrease the app size.
You should start by getting a trial token. It grants you 14 days to freely play around with the AI Video Editor SDK and test its entire functionality the way you see fit.
There is nothing complicated about it: contact us or send an email to [email protected] and we will send the token to you. We can also send you a sample app so you can see how it works "under the hood".
| Audio | Video | Images |
|---|---|---|
| .aac, .mp3, .wav, .ogg, .m4a | .mp4, .mov | .jpg, .gif, .heic, .png, .nef, .cr2, .jpeg, .raf, .bmp |
| Recording speed | 360p (360 x 640) | 480p (480 x 854) | HD (720 x 1280) | FHD (1080 x 1920) |
|---|---|---|---|---|
| 1x (default) | 1200 | 2000 | 4000 | 6400 |
| 0.5x | 900 | 1500 | 3000 | 4800 |
| 2x | 1800 | 3000 | 6000 | 9600 |
| 3x | 2400 | 4000 | 8000 | 12800 |
Video Editor SDK classifies every device by its performance capabilities and uses the most suitable quality params for the exported video.
Nevertheless, it is possible to customize this with the `ExportParamsProvider` interface. Just pass the required video quality into the `ExportManager.Params.Builder` constructor. Check out an example where multiple video files are exported: the first and second with the most suitable quality params (defined by the `sizeProvider.provideOptimalExportVideoSize()` method), and the third with 360p quality (defined by the Video Editor SDK constant `VideoResolution.VGA360`).
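That flow can be sketched in a self-contained way. The types below are simplified stand-ins for the real SDK classes (`ExportParamsProvider`, `ExportManager.Params`, `VideoResolution`, and the size provider); consult the actual API for the exact signatures.

```kotlin
// Simplified stand-ins for the real SDK types -- illustration only.
enum class VideoResolution { VGA360, HD720, FHD1080 }

data class ExportParams(val resolution: VideoResolution, val fileName: String)

// Stand-in for the SDK component that picks the optimal resolution per device.
class SizeProvider {
    fun provideOptimalExportVideoSize(): VideoResolution = VideoResolution.HD720
}

interface ExportParamsProvider {
    fun provideExportParams(): List<ExportParams>
}

// Exports three files: two at the device-optimal resolution and one fixed 360p copy.
class CustomExportParamsProvider(private val sizeProvider: SizeProvider) : ExportParamsProvider {
    override fun provideExportParams(): List<ExportParams> {
        val optimal = sizeProvider.provideOptimalExportVideoSize()
        return listOf(
            ExportParams(optimal, "export_optimal_watermark"),
            ExportParams(optimal, "export_optimal"),
            ExportParams(VideoResolution.VGA360, "export_360p")
        )
    }
}
```

In the real SDK, the provider is supplied through DI so the export pipeline picks it up instead of the default one.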
See the default bitrate (kb/s) for exported video (without audio) in the table below:
| 360p (360 x 640) | 480p (480 x 854) | HD (720 x 1280) | FHD (1080 x 1920) |
|---|---|---|---|
| 1200 | 2000 | 4000 | 6400 |
We offer a free 14-day trial so you can thoroughly test and assess the Video Editor SDK functionality in your app. To get access to your trial, please get in touch with us by filling in the form on our website. Our sales managers will send you the trial token.
The video editor token should be put here.
You can also load the token from Firebase; check how to configure Firebase.
It is also possible to load the token from any remote server; check how to configure that case.
To decrease the app size, you can connect to our servers and pull AR filters from there. The effects will be downloaded whenever a user needs them. Please check out the step-by-step guide to configure AR Cloud in the Video Editor SDK.
We understand that you may want to brand the video editor to bring your own experience to the market. Therefore, we provide the following customization options:
✅ Use your branded icons. See details
✅ Use your branded colors. See details
✅ Change text styles, i.e. font and color. See details
✅ Localize and change text resources. Default locale is 🇺🇸
✅ Add the content you want, i.e. a number of videos with different resolutions and durations, or an audio file. See details
✅ Change the order of masks and filters. See details
❌ Change layout
❌ Change screen order
❗ We do custom UX/UI changes as a separate contract. Please contact us at [email protected].
❗ If you see the message "Error compressed Native Libs. Look documentation" while the Video Editor is running, do the following:

- Add `android.bundle.enableUncompressedNativeLibs=false` to `gradle.properties`:

```
android.bundle.enableUncompressedNativeLibs=false
```

- Add `android:extractNativeLibs="true"` to the `<application>` tag of `AndroidManifest.xml`:

```xml
<application
    ...
    android:extractNativeLibs="true"
    ...>
```
GitHub Packages are used to download the latest Video Editor SDK modules. You will also need them to receive new AI Video Editor SDK versions. GitHub Packages are set up for the trial.
Note: to get access and download the Video Editor SDK modules, you need to use the credentials (banubaRepoUser and banubaRepoPassword); see build.gradle for more details.
```groovy
...
allprojects {
    repositories {
        ...
        maven {
            name = "GitHubPackages"
            url = uri("https://maven.pkg.github.com/Banuba/banuba-ve-sdk")
            credentials {
                username = banubaRepoUser
                password = banubaRepoPassword
            }
        }
        maven {
            name = "ARCloudPackages"
            url = uri("https://maven.pkg.github.com/Banuba/banuba-ar")
            credentials {
                username = banubaRepoUser
                password = banubaRepoPassword
            }
        }
        ...
    }
}
```
Please specify the list of dependencies in your app/build.gradle file to integrate the AI Video Editor SDK:
```groovy
implementation "com.banuba.sdk:camera-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:camera-ui-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:core-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:core-ui-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-flow-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-timeline-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-ui-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-gallery-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-effects-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:effect-player-adapter:${banubaSdkVersion}"
implementation "com.banuba.sdk:ar-cloud:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-audio-browser-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:banuba-token-storage-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-export-sdk:${banubaSdkVersion}"
implementation "com.banuba.sdk:ve-playback-sdk:${banubaSdkVersion}"
```
To manage the main screens - camera, gallery, trimmer, editor, and export - you need to add the VideoCreationActivity to AndroidManifest.xml. Each screen is implemented as a Fragment.
```xml
<activity android:name="com.banuba.sdk.ve.flow.VideoCreationActivity"
    android:screenOrientation="portrait"
    android:theme="@style/CustomIntegrationAppTheme"
    android:windowSoftInputMode="adjustResize"
    tools:replace="android:theme" />
```
Once it’s done, you’ll be able to launch the video editor.
Note the CustomIntegrationAppTheme line in the code. Use this theme for changing icons, colors, and other screen elements to customize the app.
You can override the behavior of the video editor in your app with DI libraries and tools (we use Koin).
First, you need to create your own implementation of FlowEditorModule.
```kotlin
class VideoEditorKoinModule : FlowEditorModule() {

    override val effectPlayerManager: BeanDefinition<AREffectPlayerProvider> =
        single(override = true) {
            BanubaAREffectPlayerProvider(
                mediaSizeProvider = get(),
                token = androidContext().getString(R.string.video_editor_token)
            )
        }

    ...
}
```
You will need to override several properties to customize the video editor for your application. Please, take a look at the full example.
Once you’ve overridden the properties that you need, initialize the Koin module in your Application.onCreate method.
```kotlin
startKoin {
    androidContext(this@IntegrationApp)
    modules(VideoEditorKoinModule().module)
}
```
You can use Java in your Android project. In this case you can start Koin in this way:
```java
startKoin(new GlobalContext(), koinApplication -> {
    androidContext(koinApplication, this);
    koinApplication.modules(new VideoeditorKoinModuleKotlin().getModule());
    return null;
});
```
Please, find the full example of Java Application class.
There are several classes in the Video Editor SDK that allow you to modify its parameters and behavior:
- CameraConfig lets you setup camera specific parameters (min/max recording duration, flashlight, etc.).
- EditorConfig lets you modify editor, trimmer, and gallery screens.
- MusicEditorConfig allows you to change the audio editor screen, e.g. the number of timelines or tracks allowed.
- ObjectEditorConfig allows you to change the text and GIF editor screens, e.g. the number of timelines or effects allowed.
- MubertApiConfig is an optional config class, available only if you have plugged in the audio-browser-sdk module; it allows you to configure network requests for music tracks.
If you want to customize some of these classes, provide them with just those properties you need to change. For example, to change only max recording duration on the camera screen, provide the following instance:
```kotlin
single(override = true) {
    CameraConfig(
        maxRecordedTotalVideoDurationMs = 40_000
    )
}
```
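The configs rely on Kotlin default parameter values, which is why naming a single property is enough. Here is a simplified, self-contained stand-in illustrating the idea; the real CameraConfig has many more properties, and the default values below are assumptions.

```kotlin
// Simplified stand-in for CameraConfig -- the real class has many more
// properties, all with defaults; the values here are illustrative assumptions.
data class CameraConfig(
    val minRecordedTotalVideoDurationMs: Long = 3_000,
    val maxRecordedTotalVideoDurationMs: Long = 60_000
)

// Only the max duration is overridden; everything else keeps its default.
val config = CameraConfig(maxRecordedTotalVideoDurationMs = 40_000)
```

Because the class is a `data class` with defaults, every property you do not name keeps the SDK's built-in behavior.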
To start the Video Editor from the camera:
```kotlin
val videoCreationIntent = VideoCreationActivity.startFromCamera(
    context = context
)
startActivity(videoCreationIntent)
```
You can use Java in your Android project. In that case, create the intent to start the Video Editor from the camera this way:
```java
Intent videoCreationIntent = VideoCreationActivity.startFromCamera(
        context,
        Uri.EMPTY,
        null,
        null,
        null,
        CameraUIType.TYPE_1
);
startActivity(videoCreationIntent);
```
You can find more information about how to launch the Video Editor here.
You can use the AI Video Editor SDK without the Face AR SDK. Please make the following changes to do so.
Remove `BanubaEffectPlayerKoinModule().module` from the video editor Koin module:
```diff
 startKoin {
     androidContext(this@IntegrationApp)
     modules(
         AudioBrowserKoinModule().module,
         VideoEditorKoinModule().module,
-        BanubaEffectPlayerKoinModule().module
     )
 }
```
Also remove the dependency `com.banuba.sdk:effect-player-adapter` from app/build.gradle:
```diff
 implementation "com.banuba.sdk:ve-effects-sdk:${banubaSdkVersion}"
-implementation "com.banuba.sdk:effect-player-adapter:${banubaSdkVersion}"
 implementation "com.banuba.sdk:ar-cloud-sdk:${banubaSdkVersion}"
```
The Video Editor SDK exports recordings as .mp4 files. There are many ways you can customize this flow to better integrate it into your app.
To change the export output, start with the `ExportParamsProvider` interface. It contains one method, `provideExportParams()`, which returns a `List<ExportManager.Params>`. Each item on this list corresponds to one of the videos in the output and its configuration. Please check out the guide to configure ExportParams. See the example here.
The end result would be three files:
- An optimized video file (resolution is calculated automatically);
- The same file as above but without a watermark;
- A low-res version of the watermarked file.
By default, they are placed in the "export" directory of external storage. To change the target folder, you should provide a custom Uri instance named exportDir through DI.
Should you choose to export files in the background, you'd do well to customize the `ExportNotificationManager`. It lets you change the notifications for any export scenario (started, finished successfully, and failed).
❗ If you set `shouldClearSessionOnFinish` in `ExportFlowManager` to true, you should clear `VideoCreationActivity` from the backstack. Otherwise, a crash will occur.
By default, the masks and filters are listed in alphabetical order.
To change this, use an implementation of the `OrderProvider` interface.
```kotlin
class CustomMaskOrderProvider : OrderProvider {
    override fun provide(): List<String> = listOf("Background", "HeadphoneMusic", "AsaiLines")
}
```
This returns the list of masks in the required order. Note: the name of a mask is the name of the corresponding directory located in the assets/bnb-resources/effects directory, or received from AR Cloud. Example.
```kotlin
class CustomColorFilterOrderProvider : OrderProvider {
    override fun provide(): List<String> = listOf("egypt", "byers")
}
```
This returns the list of color filters in the required order. Note: the name of a color filter is the name of the corresponding file located in the assets/bnb-resources/luts directory. Example.
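To illustrate how an `OrderProvider` effectively reorders the available content, here is a self-contained sketch. The `OrderProvider` interface and the custom provider mirror the snippets above; the `applyOrder` helper is an illustrative assumption about what the SDK does internally (preferred names first, the rest alphabetically).

```kotlin
interface OrderProvider {
    fun provide(): List<String>
}

class CustomColorFilterOrderProvider : OrderProvider {
    override fun provide(): List<String> = listOf("egypt", "byers")
}

// Illustrative reordering: items named by the provider come first, in the
// given order; everything else follows in alphabetical order.
fun applyOrder(available: List<String>, provider: OrderProvider): List<String> {
    val preferred = provider.provide().filter { it in available }
    val rest = (available - preferred.toSet()).sorted()
    return preferred + rest
}
```

For example, with available filters `["byers", "chile", "egypt"]`, the provider above yields `["egypt", "byers", "chile"]`.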
The final step is to pass your custom `CustomMaskOrderProvider` and `CustomColorFilterOrderProvider` implementations in the DI to override the default implementations:
```kotlin
override val maskOrderProvider: BeanDefinition<OrderProvider> =
    single(named("maskOrderProvider"), override = true) {
        CustomMaskOrderProvider()
    }

override val colorFilterOrderProvider: BeanDefinition<OrderProvider> =
    single(named("colorFilterOrderProvider"), override = true) {
        CustomColorFilterOrderProvider()
    }
```
Note: pay attention that the `OrderProvider` definitions must be named "maskOrderProvider" and "colorFilterOrderProvider" for masks and filters, respectively.
To use a watermark, add the `WatermarkProvider` interface to your app. The image goes into the getWatermarkBitmap() method. Once you're done, override the watermarkProvider dependency in DI. See the example of adding a watermark here.
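A minimal self-contained sketch of the idea follows. A stand-in `Bitmap` class replaces `android.graphics.Bitmap` so the snippet runs outside Android; in a real app you would decode a drawable resource instead.

```kotlin
// Stand-in for android.graphics.Bitmap -- illustration only.
class Bitmap(val name: String)

// The provider shape described above: returns the image drawn over exports.
interface WatermarkProvider {
    fun getWatermarkBitmap(): Bitmap?
}

// Returns the branded watermark image; a real implementation would use
// something like BitmapFactory.decodeResource(resources, R.drawable.your_watermark).
class CustomWatermarkProvider : WatermarkProvider {
    override fun getWatermarkBitmap(): Bitmap? = Bitmap("brand_watermark")
}
```

Returning `null` is the natural way to express "no watermark" with this nullable return type.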
The AI Video Editor SDK comes with its own solution for media content (i.e. image and video) selection: the gallery screen. To use it as part of the SDK, just add a dependency to build.gradle:
```groovy
implementation "com.banuba.sdk:ve-gallery-sdk:1.0.16"
```
and put the new Koin module into the `startKoin` function:
```diff
 startKoin {
     androidContext(this@IntegrationApp)
     modules(
         // other Video Editor modules
+        GalleryKoinModule().module
     )
 }
```
The gallery provided by the SDK is fully customizable according to this guide.
You can also use your own implementation of the gallery by following this step-by-step guide.
Banuba Video Editor SDK can trim audio tracks, merge them, and apply them to a video. It doesn't include music or sounds of its own. However, it can be integrated with Mubert to get music from there (this requires an additional contract with them). Moreover, users can add audio files stored on the phone (e.g. a downloaded library).
Adding audio content is simple. See this step-by-step guide for code examples.
Check out the step-by-step guide to use the audio browser in your app.
The stickers in the AI Video Editor SDK are GIFs. Adding them is as simple as putting your personal Giphy API key into the stickersApiKey parameter in the videoeditor.json file.
GIPHY doesn't charge for their content. The one thing they do require is attribution; also, there must be no commercial aspect to the current version of the product (no advertisements, etc.). To use it, please add "Search GIPHY" text attribution to the search bar.
If you want to use the default record button provided by the Video Editor SDK with some color, size and animation customization, follow this guide.
If you want to fully change the look of the button and the animation on tap, you should provide your custom record button implementation. This is how it's done:

1. Create a custom view.
2. Implement the `CameraRecordingAnimationProvider` interface. The view created in step 1 should be provided through the `provideView()` method within this interface. Example.
3. Provide your `CameraRecordingAnimationProvider` implementation in the DI.
This allows your users to take pictures and videos after a delay. The timer is managed by the `CameraTimerStateProvider` interface. Every delay is represented by a TimerEntry object:
```kotlin
data class TimerEntry(
    val durationMs: Long,
    @DrawableRes val iconResId: Int
)
```
Besides the delay itself, you can customize the icon for it. See the example here.
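A self-contained sketch of a custom timer provider follows. TimerEntry is reproduced from above minus the Android `@DrawableRes` annotation; the exact `CameraTimerStateProvider` shape and the icon resource ids are assumptions.

```kotlin
// TimerEntry as shown above, minus the Android @DrawableRes annotation.
data class TimerEntry(val durationMs: Long, val iconResId: Int)

// Assumed shape of the SDK's timer provider interface.
interface CameraTimerStateProvider {
    val timerStates: List<TimerEntry>
}

// Offers "off", 3-second, and 10-second countdowns before recording starts.
class CustomTimerStateProvider : CameraTimerStateProvider {
    override val timerStates = listOf(
        TimerEntry(durationMs = 0, iconResId = 0),      // no delay
        TimerEntry(durationMs = 3_000, iconResId = 1),  // hypothetical icon ids
        TimerEntry(durationMs = 10_000, iconResId = 2)
    )
}
```

Each entry pairs a delay with the icon shown while that delay is selected, so adding a new countdown option is just one more list element.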
More advanced timer settings are available with Hands-Free feature.
If you want to manage the Cover preview screen, you need to override the CoverProvider property in DI:
```kotlin
override val coverProvider: BeanDefinition<CoverProvider> = single(override = true) {
    CoverProvider.EXTENDED
}
```
There are 3 modes:
```kotlin
enum class CoverProvider {
    SIMPLE,   // enable cover screen with simple UI
    EXTENDED, // enable cover screen with extended UI
    NONE      // disable cover screen
}
```
You can use the Android themes and styles to change the screens in the mobile Video Editor SDK. You can also change the language and text.
The AI Video Editor SDK includes the following screens:
- Camera screen
- Editor screen
- Gallery screen
- Trimmer screen
- Aspects screen
- Music Editor screen
- Timeline Editor screen
- Cover screen
- Alert Dialogs
- Picture in picture
The Video Editor has multiple entry points. Please check out the guide.
The SDK is protected by the token, so its presence is a vital part of the Video Editor launch. To check whether the SDK is ready to use, you can use the following property:
`VideoEditorUtils.isAvailable`
You can also check token expiration with the help of the
`VideoEditorUtils.isExpired`
property. See the FAQ page for more details about token expiration.
A few devices do not support the Video Editor. To check this, use the following property:
`VideoEditorUtils.isSupportsVideoEditor`
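Putting the three checks together, an app might gate the editor launch like this. `VideoEditorUtils` is mocked here as a hard-coded stand-in object so the snippet is self-contained; in the real SDK, these properties are computed from the token and the device.

```kotlin
// Stand-in for the SDK's VideoEditorUtils -- values hard-coded for illustration.
object VideoEditorUtils {
    val isAvailable: Boolean = true           // token is present and valid
    val isExpired: Boolean = false            // token has not expired
    val isSupportsVideoEditor: Boolean = true // device is capable
}

// Launch the editor only when all checks pass.
fun canOpenVideoEditor(): Boolean =
    VideoEditorUtils.isAvailable &&
        !VideoEditorUtils.isExpired &&
        VideoEditorUtils.isSupportsVideoEditor
```

Checking all three before calling `startActivity` avoids launching the editor into a crash or a blank screen on unsupported devices or with an expired token.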
To change any particular text in the Video Editor SDK, just provide your custom value for the string resource listed in the String resources section of every screen (check out an example of string resources on the editor screen). Keep the ResourceId the same and change only the related value.
To localize the Video Editor SDK, follow the official guide and provide string resources for every locale in your app with the same ResourceId and translated values.
View information about third party libraries