Background
In the past, I’ve created and even shared an example of how to create an MP4 file from a series of Bitmaps, here, based on here, and I’ve also published the code on GitHub, here.
It seems to work fine for a single image, like so:
@WorkerThread
private fun testImage() {
    Log.d("AppLog", "testImage")
    val startTime = System.currentTimeMillis()
    Log.d("AppLog", "start")
    val videoFile = File(ContextCompat.getExternalFilesDirs(this, null)[0], "image.mp4")
    if (videoFile.exists())
        videoFile.delete()
    videoFile.parentFile!!.mkdirs()
    val timeLapseEncoder = TimeLapseEncoder()
    val bitmap = BitmapFactory.decodeResource(resources, R.drawable.test)
    val width = bitmap.width
    val height = bitmap.height
    timeLapseEncoder.prepareForEncoding(videoFile.absolutePath, width, height)
    val frameDurationInMs = 1000
    timeLapseEncoder.encodeFrame(bitmap, frameDurationInMs)
    timeLapseEncoder.finishEncoding()
    val endTime = System.currentTimeMillis()
    Log.d("AppLog", "it took ${endTime - startTime} ms to convert a single image ($width x $height) to mp4")
}
The problem
When I encode multiple frames, even just 2, the encoder sometimes skips some of them, making the video shorter than expected.
For example, in the following scenario I encode 2 frames, each meant to be shown for 5 seconds, yet the output ends up 5 seconds long instead of 10, and the second frame is ignored entirely:
@WorkerThread
private fun testImages() {
    Log.d("AppLog", "testImages")
    val startTime = System.currentTimeMillis()
    Log.d("AppLog", "start")
    val videoFile = File(ContextCompat.getExternalFilesDirs(this, null)[0], "images.mp4")
    if (videoFile.exists())
        videoFile.delete()
    videoFile.parentFile!!.mkdirs()
    // Log.d("AppLog", "success creating parent?${videoFile.parentFile.exists()}")
    val timeLapseEncoder = TimeLapseEncoder()
    val bitmap = BitmapFactory.decodeResource(resources, R.drawable.frame1)
    val width = bitmap.width
    val height = bitmap.height
    timeLapseEncoder.prepareForEncoding(videoFile.absolutePath, width, height)
    val delay = 5000
    timeLapseEncoder.encodeFrame(bitmap, delay)
    val bitmap2 = BitmapFactory.decodeResource(resources, R.drawable.frame2)
    timeLapseEncoder.encodeFrame(bitmap2, delay)
    timeLapseEncoder.finishEncoding()
    val endTime = System.currentTimeMillis()
    Log.d("AppLog", "it took ${endTime - startTime} ms to convert 2 images ($width x $height) to ${videoFile.absolutePath} ${videoFile.exists()} ${videoFile.length()}")
}
What I’ve tried
I went over the code and debugged it, but everything seems fine…
The weird thing is that if I change the durations and add more frames, the output is correct. For example, one variant produced a 12-second video, where the first 6 seconds show one image and the remaining 6 seconds show the other.
I also tried encoding the equivalent of my original scenario, just split into more frames:
for (i in 0 until 500)
    timeLapseEncoder.encodeFrame(bitmap, 10)
val bitmap2 = BitmapFactory.decodeResource(resources, R.drawable.frame2)
for (i in 0 until 500)
    timeLapseEncoder.encodeFrame(bitmap2, 10)
This didn’t produce 5 seconds per image at all, even though 500 frames of 10 ms each should add up to 5 seconds per image…
I thought it might be an fps issue, but the code already sets it to 30 fps, which is reasonable and presumably above whatever minimum the MP4 format requires.
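One thing I noticed while reasoning about it (this is my assumption about how such encoders usually work, not something I verified against TimeLapseEncoder's internals): if each frame's presentation timestamp is the accumulated duration of the frames before it, then the last frame's timestamp marks where the stream ends, and its own duration never shows up in the file. A minimal sketch of that accumulation:

```kotlin
// Sketch (assumption): presentation timestamps computed as the running
// sum of the durations of all preceding frames, in microseconds.
fun presentationTimesUs(frameDurationsMs: List<Int>): List<Long> {
    var ptsUs = 0L
    return frameDurationsMs.map { durationMs ->
        val current = ptsUs
        ptsUs += durationMs * 1000L
        current
    }
}

fun main() {
    // Two frames of 5000 ms each give timestamps 0 and 5_000_000 µs.
    // If the muxer ends the stream at the last frame's timestamp, the
    // file is 5 s long and the second frame's duration is lost, which
    // would match the behavior I'm seeing.
    println(presentationTimesUs(listOf(5000, 5000)))
}
```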
The questions
- What’s wrong with how I used this?
- Is there perhaps a better way to create MP4 files from images, where you set the duration of each frame, one after another? A solution that doesn’t require a large library and doesn’t have a problematic license?
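For the second question, one direction I'm considering (just a sketch under my own assumptions, not an API that TimeLapseEncoder offers): with a fixed-fps encoder, a per-frame display duration can be expressed as a repeat count, i.e. the same bitmap submitted several times. At 30 fps, a 5000 ms frame becomes 150 identical frames:

```kotlin
// Sketch (assumption): express per-frame display durations as repeat
// counts for a fixed-fps encoder, rounding to the nearest whole frame.
fun repeatCounts(frameDurationsMs: List<Int>, fps: Int = 30): List<Int> =
    frameDurationsMs.map { ms -> (ms * fps + 500) / 1000 }

fun main() {
    // Two 5000 ms frames at 30 fps: submit each bitmap 150 times.
    println(repeatCounts(listOf(5000, 5000)))
}
```

Would that kind of approach be more reliable than passing explicit durations, or is there a lightweight library that handles this correctly?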