Monday, May 27, 2013

Merge audio and video

The MergeAudioVideo program allows you to merge a sound file (.wav) and a video file (.mov). The result is a video file named with a timestamp prefix followed by mergedvideo.mov, stored in your current folder. Merging the sound file and the video file can be performed simply in Java by using the JMF API, so you need to download and install JMF before the program can be compiled and run without any error.
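Before JMF ever sees the input files, each file path is converted to a URL so a MediaLocator can be built from it (the program does this with toURI().toURL()). Here is a minimal, JMF-free sketch of just that conversion; the file name audio.wav is only an example:

```java
import java.io.File;
import java.net.URL;

public class LocatorUrlDemo {
    public static void main(String[] args) throws Exception {
        // The program wraps each input file in a URL before handing it to a MediaLocator
        File audioFile = new File("audio.wav");
        URL url = audioFile.toURI().toURL();
        // A file URL always uses the file: scheme and ends with the file name
        System.out.println(url.toString().startsWith("file:"));
        System.out.println(url.toString().endsWith("audio.wav"));
    }
}
```

The file does not need to exist for the URL to be built; JMF only complains later, when a DataSource is created from the locator.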

merge audio and video in Java



MergeAudioVideo source code:

import java.io.*;
import javax.media.*;
import javax.media.control.TrackControl;
import javax.media.datasink.*;
import javax.media.format.*;
import javax.media.protocol.*;
import java.net.*;

public class MergeAudioVideo{

    public static void main(String args[]){
        if(args.length>1){
            System.out.println("Please wait...");
            merging(args[0],args[1]);
            System.out.println("Merging finished");
        }
        else
            System.out.println("Invalid files input");
        System.exit(0);
    }

    //merge the sound and video files
    public static void merging(String audioFileName, String videoFileName){

        //Declare and initialize StateHelper objects: sha, shv, and shm
        //sha for the audio processor, shv for the video processor, and shm for the merge processor
        StateHelper sha=null;
        StateHelper shv=null;
        StateHelper shm=null;

        //Declare and initialize processor objects for audio, video, and merged data
        Processor audioProcessor=null;
        Processor videoProcessor=null;
        Processor mergeProcessor=null;

        //create MediaLocator objects for the audio, video, and output files
        MediaLocator audioLocator=null;
        MediaLocator videoLocator=null;
        MediaLocator outLocator=null;
        try{
            File audioFile=new File(audioFileName);
            audioLocator=new MediaLocator(audioFile.toURI().toURL());

            File videoFile=new File(videoFileName);
            videoLocator=new MediaLocator(videoFile.toURI().toURL());

            //Create MediaLocator for the merged output file
            File outFile=new File(System.currentTimeMillis()+"mergedvideo.mov");
            outLocator=new MediaLocator(outFile.toURI().toURL());
        }catch(MalformedURLException me){System.exit(-1);}

        //create data sources
        DataSource audioDataSource=null;
        DataSource videoDataSource=null;
        DataSource mergedDataSource=null;
        DataSource arrayDataSource[]=null;
        try{
            audioDataSource=Manager.createDataSource(audioLocator); //your audio file
            videoDataSource=Manager.createDataSource(videoLocator); //your video file
            mergedDataSource=null; //data source to combine video with audio
            arrayDataSource=new DataSource[2]; //data source array
        }catch(IOException ie){System.exit(-1);}
        catch(NoDataSourceException ie){System.exit(-1);}

        //format array for the input audio and video
        Format[] formats=new Format[2];
        formats[0]=new AudioFormat(AudioFormat.IMA4_MS); //create audio format object
        formats[1]=new VideoFormat(VideoFormat.JPEG); //create video format object

        //create media file content type object
        FileTypeDescriptor outftd=new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME);

        //create processor objects for video and audio
        try{
            videoProcessor=Manager.createProcessor(videoDataSource);
            shv=new StateHelper(videoProcessor);
            audioProcessor=Manager.createProcessor(audioDataSource);
            sha=new StateHelper(audioProcessor);
        }catch(IOException ie){System.exit(-1);}
        catch(NoProcessorException ne){System.exit(-1);}

        //Configure the processors
        if (!shv.configure(10000))
            System.exit(-1);
        if (!sha.configure(10000))
            System.exit(-1);

        //Realize the processors
        if (!shv.realize(10000))
            System.exit(-1);
        if (!sha.realize(10000))
            System.exit(-1);

        //get the output data sources from the processors so they can be merged
        arrayDataSource[0]=audioProcessor.getDataOutput();
        arrayDataSource[1]=videoProcessor.getDataOutput();

        //start the processors
        videoProcessor.start();
        audioProcessor.start();

        //create the merged data source, connect, and start it
        try{
            mergedDataSource=Manager.createMergingDataSource(arrayDataSource);
            mergedDataSource.connect();
            mergedDataSource.start();
        }catch(IOException ie){System.exit(-1);}
        catch(IncompatibleSourceException id){System.exit(-1);}

        //processor for the merged output
        try{
            mergeProcessor=Manager.createRealizedProcessor(new ProcessorModel(mergedDataSource,formats,outftd));
            shm=new StateHelper(mergeProcessor);
        }catch(IOException ie){System.exit(-1);}
        catch(NoProcessorException ie){System.exit(-1);}
        catch(CannotRealizeException ie){System.exit(-1);}

        //set the output file content type
        mergeProcessor.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.QUICKTIME));

        //query the supported formats
        TrackControl tcs[]=mergeProcessor.getTrackControls();
        Format f[]=tcs[0].getSupportedFormats();
        if (f==null || f.length<=0)
            System.exit(100);

        //set the track format
        tcs[0].setFormat(f[0]);

        //get the data source from mergeProcessor so a DataSink file writer can write it to a file
        DataSource source=mergeProcessor.getDataOutput();

        //create the DataSink file writer
        DataSink filewriter=null;
        try {
            filewriter=Manager.createDataSink(source, outLocator);
            filewriter.open();
        } catch (NoDataSinkException e) {
            System.exit(100);
        } catch (IOException e) {
            System.exit(100);
        } catch (SecurityException e) {
            System.exit(100);
        }

        //now start the filewriter and mergeProcessor
        try {
            mergeProcessor.start();
            filewriter.start();
        } catch (IOException e) {
            System.exit(-1);
        }

        //wait up to 2 seconds for the end of the media stream
        shm.waitToEndOfMedia(2000);
        shm.close();
        filewriter.close();
    }
}

//The StateHelper class helps you determine the states of the processors
class StateHelper implements ControllerListener {
    Processor p = null;
    boolean configured = false;
    boolean realized = false;
    boolean prefetched = false;
    boolean eom = false;
    boolean failed = false;
    boolean closed = false;

    public StateHelper(Processor pr) {
        p = pr;
        p.addControllerListener(this);
    }

    public boolean configure(int timeOutMillis) {
        long startTime = System.currentTimeMillis();
        synchronized (this) {
            p.configure();
            while (!configured && !failed) {
                try {
                    wait(timeOutMillis);
                } catch (InterruptedException ie) {}
                if (System.currentTimeMillis() - startTime > timeOutMillis)
                    break;
            }
        }
        return configured;
    }

    public boolean realize(int timeOutMillis) {
        long startTime = System.currentTimeMillis();
        synchronized (this) {
            p.realize();
            while (!realized && !failed) {
                try {
                    wait(timeOutMillis);
                } catch (InterruptedException ie) {}
                if (System.currentTimeMillis() - startTime > timeOutMillis)
                    break;
            }
        }
        return realized;
    }

    public boolean prefetch(int timeOutMillis) {
        long startTime = System.currentTimeMillis();
        synchronized (this) {
            p.prefetch();
            while (!prefetched && !failed) {
                try {
                    wait(timeOutMillis);
                } catch (InterruptedException ie) {}
                if (System.currentTimeMillis() - startTime > timeOutMillis)
                    break;
            }
        }
        return prefetched && !failed;
    }

    public boolean waitToEndOfMedia(int timeOutMillis) {
        long startTime = System.currentTimeMillis();
        eom = false;
        synchronized (this) {
            while (!eom && !failed) {
                try {
                    wait(timeOutMillis);
                } catch (InterruptedException ie) {}
                if (System.currentTimeMillis() - startTime > timeOutMillis)
                    break;
            }
        }
        return eom && !failed;
    }

    public void close() {
        synchronized (this) {
            p.close();
            while (!closed) {
                try {
                    wait(100);
                } catch (InterruptedException ie) {}
            }
        }
        p.removeControllerListener(this);
    }

    public synchronized void controllerUpdate(ControllerEvent ce) {
        if (ce instanceof RealizeCompleteEvent) {
            realized = true;
        } else if (ce instanceof ConfigureCompleteEvent) {
            configured = true;
        } else if (ce instanceof PrefetchCompleteEvent) {
            prefetched = true;
        } else if (ce instanceof EndOfMediaEvent) {
            eom = true;
        } else if (ce instanceof ControllerErrorEvent) {
            failed = true;
        } else if (ce instanceof ControllerClosedEvent) {
            closed = true;
        } else {
            return;
        }
        notifyAll();
    }
}

The process of merging an audio file and a video file is not too hard to understand. You need three MediaLocator objects: one pointing to the source audio file, one pointing to the source video file, and one for the merged output file. To read the data of the audio and video files, you can use the createDataSource(MediaLocator locator) method of the Manager class. This method returns a DataSource object that contains the data of the audio or video file, so you call it twice: once for the audio file and once for the video file. Once you have the DataSource objects, you can pass them to processors to start the merging process.

To process the files, you also define three Processor objects: one to process the audio data, one for the video data, and one for the merged output. Before the processors can fully perform their tasks, they must be configured and realized; the StateHelper class helps with these state transitions. To merge the two source files, the createMergingDataSource(DataSource[] data) method of the Manager class is used. This method returns a new DataSource that combines the elements of the DataSource array. Once you have the merged data source, you can create a DataSink object that acts as a file writer to write the merged data to the output file. You might like to read the Sound Recording post for related information about Processor, DataSource, and DataSink.
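The configure/realize/waitToEndOfMedia logic in StateHelper is the classic synchronized wait/notifyAll pattern with a timeout: the caller blocks until a listener callback flips a flag, a failure is reported, or the deadline passes. Below is a minimal, JMF-free sketch of that same pattern; the Worker class and its finish() method are hypothetical stand-ins for a Processor reaching its target state:

```java
public class WaitPatternDemo {
    static class Worker {
        private boolean done = false;
        private boolean failed = false;

        // Plays the role of a ControllerListener callback flipping a state flag
        public synchronized void finish() {
            done = true;
            notifyAll(); // wake any thread blocked in waitForDone
        }

        // Mirrors StateHelper.configure/realize: block until done, failed, or timeout
        public synchronized boolean waitForDone(long timeOutMillis) {
            long startTime = System.currentTimeMillis();
            while (!done && !failed) {
                try {
                    wait(timeOutMillis);
                } catch (InterruptedException ie) {}
                if (System.currentTimeMillis() - startTime > timeOutMillis)
                    break;
            }
            return done && !failed;
        }
    }

    public static void main(String[] args) throws Exception {
        final Worker w = new Worker();
        // Background thread stands in for the Processor doing its asynchronous work
        new Thread(new Runnable() {
            public void run() {
                try { Thread.sleep(200); } catch (InterruptedException ie) {}
                w.finish();
            }
        }).start();
        System.out.println(w.waitForDone(5000)); // prints true once finish() fires
    }
}
```

The while loop (rather than a single wait call) matters: wait can wake spuriously or on an unrelated notifyAll, so the flag must be rechecked each time, exactly as StateHelper does.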


19 comments:

  1. Gives 58 errors, ControllerClosedEvent not found, EndOfMediaEvent not found, and so on.

  2. I rechecked the code and compiled it without any error.
    Please make sure you download and install JMF.
    http://www.oracle.com/technetwork/java/javase/tech/index-jsp-140239.html

  3. Thank you sir, this program is helpful for us. But when I compile and run it, the output is:
    run: Java Result: -1
    BUILD SUCCESSFUL (total time: 1 second)
    and it also saves a 0 KB file. Please help me handle this problem.

    Replies
    1. Did you find any solution for this problem? Because I am facing the same problem.

  4. Does this support all formats of audio files?

  5. I am getting this error, please help!

    Please wait...
    Failed to build a graph for the given custom options.
    Failed to realize: com.sun.media.ProcessEngine@4c0bc4
    Cannot build a flow graph with the customized options:
    Unable to transcode format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed, 16000.0 frame rate, FrameSize=16 bits
    to: ima4/ms, 8000.0 Hz, 4-bit, Mono, 4055.0 frame rate, FrameSize=2048 bits
    outputting to: WAV
    Error: Unable to realize com.sun.media.ProcessEngine@4c0bc4

  6. Can I use this code in Android?

  7. Hi,
    I am getting the below exception:

    javax.media.CannotRealizeException: Unable to provide all requested tracks

    Can anyone help me?

    Replies
    1. Full exception:

      javax.media.CannotRealizeException: Unable to provide all requested tracks
      at javax.media.Manager.createRealizedProcessor(Manager.java:908)
      at com.easycapture.recorder.MergeAudioVideo.merging(MergeAudioVideo.java:132)
      at com.easycapture.recorder.MergeAudioVideo.main(MergeAudioVideo.java:15)

  8. I have an exception: javax.media.NoProcessorException: Cannot find a Processor for: com.sun.media.protocol.file.DataSource@30946e09
    Do you have any idea? :D Thanks