Decode Android's Hardware Encoded H264 Camera Feed Using FFmpeg In Real Time
Solution 1:
Have you tried using java.lang.Runtime?
String[] parameters = {"ffmpeg", "other", "args"};
Process program = Runtime.getRuntime().exec(parameters);
InputStream in = program.getInputStream();
OutputStream out = program.getOutputStream();
InputStream err = program.getErrorStream();
Then you write to the process's stdin (via out) and read from its stdout and stderr (via in and err). It's not a named pipe, but it should be better than going through a network interface.
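As a rough sketch of that idea (assuming an ffmpeg binary is present and executable on the device, which is itself a non-trivial assumption on Android; the path and arguments here are illustrative only):
import java.io.InputStream;
import java.io.OutputStream;

public class FfmpegPipeSketch {
    public static void main(String[] args) throws Exception {
        String[] cmd = {
            "ffmpeg",               // assumes an ffmpeg binary is reachable on the PATH
            "-i", "pipe:0",         // the encoded H264 stream arrives on stdin
            "-f", "rawvideo",
            "-pix_fmt", "yuv420p",
            "pipe:1"                // decoded raw frames leave on stdout
        };
        Process ffmpeg = Runtime.getRuntime().exec(cmd);
        OutputStream toFfmpeg = ffmpeg.getOutputStream(); // write encoded buffers here
        InputStream fromFfmpeg = ffmpeg.getInputStream(); // read decoded frames here
        InputStream errors = ffmpeg.getErrorStream();     // ffmpeg logs end up here

        // In a real app, one thread would write the camera's encoded buffers to
        // toFfmpeg while another drains fromFfmpeg; closing stdin ends the stream.
        toFfmpeg.close();
        ffmpeg.waitFor();
    }
}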
Solution 2:
A little bit late, but I think this is a good question and it doesn't have a good answer yet.
If you want to stream the camera and mic from an Android device, you have two main alternatives: a Java or an NDK implementation.
- Java implementation.
I'm only going to mention the idea, but basically it is to implement an RTSP server and the RTP protocol in Java, based on these standards: Real-Time Streaming Protocol Version 2.0 and RTP Payload Format for H.264 Video. This task will be long and hard, but if you are doing your PhD it could be nice to have a good RTSP Java library for Android.
- NDK implementation.
This alternative includes various solutions. The main idea is to use a powerful C or C++ library in our Android application, in this instance FFmpeg. This library can be compiled for Android and supports various architectures. The problem with this approach is that you may need to learn about the Android NDK, C, and C++ to accomplish it.
But there is an alternative: you can wrap the C library and use FFmpeg from Java. But how?
For example, by using FFmpeg Android, which has been compiled with x264, libass, fontconfig, freetype and fribidi and supports various architectures. But it is still hard to program: if you want to stream in real time, you need to deal with file descriptors and input/output streams.
The best alternative, from a Java programming point of view, is to use JavaCV. JavaCV uses wrappers for commonly used computer vision libraries (OpenCV, FFmpeg, etc.) and provides utility classes to make their functionality easier to use on the Java platform, including (of course) Android.
JavaCV also comes with hardware-accelerated full-screen image display (CanvasFrame and GLCanvasFrame), easy-to-use methods to execute code in parallel on multiple cores (Parallel), user-friendly geometric and color calibration of cameras and projectors (GeometricCalibrator, ProCamGeometricCalibrator, ProCamColorCalibrator), detection and matching of feature points (ObjectFinder), a set of classes that implement direct image alignment of projector-camera systems (mainly GNImageAligner, ProjectiveTransformer, ProjectiveColorTransformer, ProCamTransformer, and ReflectanceInitializer), a blob analysis package (Blobs), as well as miscellaneous functionality in the JavaCV class. Some of these classes also have an OpenCL and OpenGL counterpart, their names ending with CL or starting with GL, i.e.: JavaCVCL, GLCanvasFrame, etc.
But how can we use this solution?
Here we have a basic implementation to stream using UDP.
String streamURL = "udp://ip_destination:port";
recorder = new FFmpegFrameRecorder(streamURL, frameWidth, frameHeight, 1);
recorder.setInterleaved(false);
// video options //
recorder.setFormat("mpegts");
recorder.setVideoOption("tune", "zerolatency");
recorder.setVideoOption("preset", "ultrafast");
recorder.setVideoBitrate(5 * 1024 * 1024);
recorder.setFrameRate(30);
recorder.setSampleRate(AUDIO_SAMPLE_RATE);
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setAudioCodec(AV_CODEC_ID_AAC);
This part of the code shows how to initialize the FFmpegFrameRecorder object called recorder. This object will capture and encode the frames obtained from the camera and the samples obtained from the microphone.
If you want to show a preview in the same Android app, you also need to implement a CameraPreview class; this class converts the raw data served by the camera and creates both the preview and the Frame for the FFmpegFrameRecorder.
Remember to replace ip_destination with the IP of the PC or device where you want to send the stream. The port can be 8080, for example.
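Note that the configuration above is not enough by itself: the recorder also has to be started before the first frame and stopped when streaming ends. A minimal sketch of that lifecycle, reusing the recorder, startTime and recording fields from this example (start(), stop() and release() are FFmpegFrameRecorder's own methods):
private void startStreaming() throws FFmpegFrameRecorder.Exception {
    recorder.start();                        // opens the UDP output and the encoders
    startTime = System.currentTimeMillis();  // timestamps below are relative to this
    recording = true;
}

private void stopStreaming() throws FFmpegFrameRecorder.Exception {
    recording = false;
    recorder.stop();     // flushes and closes the muxer
    recorder.release();  // frees the native resources
}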
@Override
public Mat onCameraFrame(Mat mat)
{
    if (audioRecordRunnable == null) {
        startTime = System.currentTimeMillis();
        return mat;
    }
    if (recording && mat != null) {
        synchronized (semaphore) {
            try {
                Frame frame = converterToMat.convert(mat);
                long t = 1000 * (System.currentTimeMillis() - startTime);
                if (t > recorder.getTimestamp()) {
                    recorder.setTimestamp(t);
                }
                recorder.record(frame);
            } catch (FFmpegFrameRecorder.Exception e) {
                LogHelper.i(TAG, e.getMessage());
                e.printStackTrace();
            }
        }
    }
    return mat;
}
This shows the implementation of the onCameraFrame method, which gets the Mat (picture) from the camera, converts it to a Frame, and records it with the FFmpegFrameRecorder object.
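The converterToMat used above is not declared in the snippet; it is presumably one of JavaCV's OpenCVFrameConverter converters, created once as a field. A minimal sketch (assuming the bytedeco Mat class; a different converter would be needed if the Mat comes from the OpenCV Android SDK):
// Assumed declaration of the Mat-to-Frame converter used in onCameraFrame.
private final OpenCVFrameConverter.ToMat converterToMat = new OpenCVFrameConverter.ToMat();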
@Override
public void onSampleReady(ShortBuffer audioData)
{
    if (recorder == null) return;
    if (!recording || audioData == null) return;
    try {
        long t = 1000 * (System.currentTimeMillis() - startTime);
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
        }
        LogHelper.e(TAG, "audioData: " + audioData);
        recorder.recordSamples(audioData);
    } catch (FFmpegFrameRecorder.Exception e) {
        LogHelper.v(TAG, e.getMessage());
        e.printStackTrace();
    }
}
The same goes for the audio: audioData is a ShortBuffer object that will be recorded by the FFmpegFrameRecorder.
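The answer does not show where audioData comes from. A plausible sketch of the audioRecordRunnable that feeds onSampleReady, using Android's AudioRecord (the listener interface, buffer sizes and threading here are assumptions for illustration, not the author's code; the app also needs the RECORD_AUDIO permission):
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.nio.ShortBuffer;

// Hypothetical callback matching the onSampleReady(ShortBuffer) method shown above.
interface SampleListener {
    void onSampleReady(ShortBuffer audioData);
}

// Sketch: read PCM samples from the microphone and hand them to the listener.
class AudioRecordRunnable implements Runnable {
    private static final int SAMPLE_RATE = 44100; // must match recorder.setSampleRate(...)
    private final SampleListener listener;
    private volatile boolean running = true;

    AudioRecordRunnable(SampleListener listener) {
        this.listener = listener;
    }

    @Override
    public void run() {
        int bufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        short[] samples = new short[bufferSize / 2];
        audioRecord.startRecording();
        while (running) {
            int read = audioRecord.read(samples, 0, samples.length);
            if (read > 0) {
                // Wrap the PCM samples and pass them to the recording callback.
                listener.onSampleReady(ShortBuffer.wrap(samples, 0, read));
            }
        }
        audioRecord.stop();
        audioRecord.release();
    }

    void shutdown() {
        running = false;
    }
}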
On the destination PC or device, you can run the following command to receive the stream.
ffplay udp://ip_source:port
Here ip_source is the IP of the smartphone that is streaming the camera and mic, and the port must be the same one used on the phone (8080 in this example).
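ffplay buffers a fair amount by default; if latency matters, standard low-delay options can be added, for example:
ffplay -fflags nobuffer -flags low_delay -framedrop udp://ip_source:port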
I created a solution in my GitHub repository here: UDPAVStreamer.
Good luck