
Implementing a custom VideoCapturer for Android SDK



Using the Ant Media Android SDK, can I implement a custom VideoCapturer that takes a video feed from an outside source that is already encoded, such as H.264 or VP8? (I would also like to implement a capturer for pre-encoded audio in a similar manner.)
I have seen the reference listed below, where FileVideoCapturer is used as an example.

It appears that I am supposed to create a VideoFrame object using a VideoFrame.Buffer (not the Java NIO Buffer class), which seems to expect an I420-encoded buffer.
Is there a way to provide an H.264 frame instead?


3 answers to this question


Hi Guys,

I've just implemented a very simple RTSP puller for the Android SDK that sends the stream directly to Ant Media Server.

Here is the customized SDK -> https://drive.google.com/file/d/1pQTaB_6DtzSbxfbIO1pyfkZdrDeWKPwK/view?usp=sharing

Just change the SERVER_ADDRESS to your Ant Media Server address and the RTSP_URL to your IP camera's URL in MainActivity:


public static final String SERVER_ADDRESS = "ovh36.antmedia.io:5080";

public static final String RTSP_URL = "rtsp://admin:admin@";


It is working for me with the simple IP camera at my home.




A. Oguz




Sorry for my late response. Actually, answering this question is not easy for me; I should check it on my end first. I was waiting for some available time to do that, but I haven't found any so far, and it may still take some time. I will check it soon.

Maybe someone in the community has experience and can answer before me.


Hi Guys,

Thank you so much for the meeting yesterday. Let me give you some idea of how to do that.


1. Create a Virtual Video Capturer that actually does nothing. It just sends very small buffers to the WebRTC stack.

2. Create a Virtual Hardware Encoder into which you push the H.264 packets via a synchronized list. When its encode method is called by the WebRTC stack, just pop a packet from the synchronized list and give it back to the WebRTC stack as if the encoder had encoded the buffer.

3. Let the VideoEncoderFactory create an instance of the Virtual Hardware Encoder from step 2, and get the reference to it from the VideoEncoderFactory.

So whenever you have an H.264 packet from anywhere, just push it to the Virtual Hardware Encoder and trigger the Virtual Video Capturer so that the WebRTC stack runs properly.
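The heart of these three steps is a thread-safe pass-through queue. Here is a minimal, self-contained plain-Java sketch of that pattern; all class and method names are hypothetical placeholders, not part of the Ant Media SDK or WebRTC:

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Hypothetical sketch of the pass-through pattern in the three steps above.
// Packets from the external source are queued; the "virtual encoder" pops
// them when the WebRTC stack asks for encoded output.
class PassThroughEncoder {
    private final Queue<byte[]> pendingPackets = new ConcurrentLinkedQueue<>();

    // Called from your source thread whenever an H.264 packet arrives.
    void pushEncodedPacket(byte[] h264Packet) {
        pendingPackets.add(h264Packet);
    }

    // Called from encode(): returns the next packet as if it had just been
    // encoded, or null when nothing is pending.
    byte[] popEncodedPacket() {
        return pendingPackets.poll();
    }
}
```

A concurrent queue is used here so the source thread and the WebRTC encoder thread can touch it without extra locking; a synchronized LinkedList works just as well.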



1. You can use FileVideoCapturer as the base of your Virtual Video Capturer. There is no need to read any content from any file; just remove most of the code in FileVideoCapturer. Its only task: whenever you have an H.264 packet from your source, initialize a very small video frame and call the CapturerObserver's onFrameCaptured:

final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
final JavaI420Buffer buffer = JavaI420Buffer.allocate(1, 1);
VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, captureTimeNs);
capturerObserver.onFrameCaptured(videoFrame);
videoFrame.release();

As you can see, the buffer is very small; it is only there to let the WebRTC stack run properly. Initialize the WebRTC stack with your custom video capturer in the createVideoCapturer method of WebRTCClient:

private @Nullable VideoCapturer createVideoCapturer(String source) {
    // return your Virtual Video Capturer here instead of the default one
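To see the capturer-trigger idea on its own, without the WebRTC dependency, here is a self-contained sketch; FrameSink is a hypothetical stand-in for org.webrtc.CapturerObserver, and the names are placeholders:

```java
// Hypothetical stand-in for org.webrtc.CapturerObserver.onFrameCaptured.
interface FrameSink {
    void onFrame(long captureTimeNs);
}

// Sketch of the Virtual Video Capturer's only real job: each time an
// H.264 packet arrives from the source, emit one tiny dummy frame so the
// WebRTC pipeline keeps running.
class VirtualVideoCapturer {
    private final FrameSink sink;

    VirtualVideoCapturer(FrameSink sink) {
        this.sink = sink;
    }

    // Call once per incoming H.264 packet; the real implementation would
    // build a 1x1 JavaI420Buffer-backed VideoFrame here.
    void onSourcePacket() {
        sink.onFrame(System.nanoTime());
    }
}
```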



2. You can use HardwareVideoEncoder as the base of your Virtual Hardware Encoder. Basically, you don't need to open or close any real encoder; just delete the related lines. In your implementation, create a synchronized linked list and push the H.264 packets into it. The WebRTC stack will call the encode method; in it, just pop from your linked list and deliver the packet back to the WebRTC stack:

public VideoCodecStatus encode(VideoFrame videoFrame, EncodeInfo encodeInfo) {
    // pop from your linked list and assign it to frameBuffer (a ByteBuffer)
    EncodedImage.Builder builder = EncodedImage.builder();
    EncodedImage encodedImage = builder
            .setBuffer(frameBuffer, () -> { /* release callback */ })
            .createEncodedImage();
    callback.onEncodedFrame(encodedImage, new CodecSpecificInfo());
    return VideoCodecStatus.OK;
}

For the details, you can take a look at HardwareVideoEncoder's deliverEncodedImage and encode methods.

3. You can use HardwareVideoEncoderFactory as the base of your VideoEncoderFactory. Just initialize your Virtual Hardware Encoder in your encoder factory and create a getter for it.

public VideoEncoder createEncoder(VideoCodecInfo input) {
    // don't create the virtual encoder here, because this method can be called multiple times


Use your new VideoEncoderFactory in PeerConnectionClient's createPeerConnectionFactoryInternal:

private void createPeerConnectionFactoryInternal() {
    encoderFactory = new YOUR_HARDWARE_ENCODER_FACTORY();



Whenever you have an H.264 packet, get the EncoderFactory from PeerConnectionClient, get the encoder from the EncoderFactory, and push the H.264 packet into its synchronized list.
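That wiring can be illustrated in plain Java (all names hypothetical, standing in for the real WebRTC factory and encoder classes): the factory lazily creates the virtual encoder once, always returns the same instance, and exposes it through a getter so the packet source can reach it later.

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Hypothetical virtual encoder: just a queue the packet source pushes into.
class VirtualEncoder {
    final Queue<byte[]> packets = new ConcurrentLinkedQueue<>();

    void push(byte[] h264Packet) {
        packets.add(h264Packet);
    }
}

// Hypothetical factory sketch: createEncoder may be called multiple times
// by the WebRTC stack, so it must keep handing out the same instance.
class VirtualEncoderFactory {
    private VirtualEncoder encoder; // created once, reused

    VirtualEncoder createEncoder() {
        if (encoder == null) {
            encoder = new VirtualEncoder();
        }
        return encoder;
    }

    VirtualEncoder getEncoder() {
        return encoder;
    }
}
```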

I hope it helps you. If you have any questions or something does not make sense, you can get consultancy again, or you can join the community hours that happen every Thursday at 3:00 PM GMT+3. Click "Schedule Meeting" on antmedia.io and choose the Community Hours.


A. Oguz
