This article collects typical usage examples of the Java class org.jcodec.codecs.h264.H264Encoder. If you are wondering what the H264Encoder class is for, how to use it, or what real code that uses it looks like, the curated examples below should help.
The H264Encoder class belongs to the org.jcodec.codecs.h264 package. Fifteen code examples of the class are shown below, sorted by popularity by default.
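Before working through the examples, here is a minimal, self-contained sketch of encoding a single frame with H264Encoder. It is not taken from any of the projects below; the class name, frame size, and buffer size are illustrative, and it assumes the jcodec 0.1.x API used throughout this page (Picture.create and encodeFrame(Picture, ByteBuffer) returning a ByteBuffer).

import java.nio.ByteBuffer;
import org.jcodec.codecs.h264.H264Encoder;
import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;

public class MinimalH264EncodeSketch {
    public static void main(String[] args) {
        // Illustrative frame size; multiples of 16 map cleanly onto macroblocks
        int width = 640, height = 480;
        // A blank YUV420 picture; real code would fill its planes with pixel data first
        Picture frame = Picture.create(width, height, ColorSpace.YUV420);
        // Default encoder, as used by most of the examples below
        H264Encoder encoder = new H264Encoder();
        // Output buffer sized with the same width * height * 6 heuristic seen in the examples
        ByteBuffer buf = ByteBuffer.allocate(width * height * 6);
        // encodeFrame returns a slice of 'buf' holding the encoded H.264 frame (Annex B NAL units)
        ByteBuffer encoded = encoder.encodeFrame(frame, buf);
        System.out.println("Encoded frame: " + encoded.remaining() + " bytes");
    }
}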
Example 1: transcode
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public List<ByteBuffer> transcode() throws IOException {
    H264Decoder decoder = new H264Decoder();
    decoder.addSps(avcC.getSpsList());
    decoder.addPps(avcC.getPpsList());
    Picture buf = Picture.create(mbW << 4, mbH << 4, ColorSpace.YUV420);
    Frame dec = null;
    // Decode the leading frames so the frames after the clip point can be decoded correctly
    for (VirtualPacket virtualPacket : head) {
        dec = decoder.decodeFrame(H264Utils.splitMOVPacket(virtualPacket.getData(), avcC), buf.getData());
    }
    H264Encoder encoder = new H264Encoder(rc);
    ByteBuffer tmp = ByteBuffer.allocate(frameSize);
    List<ByteBuffer> result = new ArrayList<ByteBuffer>();
    // Decode and re-encode the remaining frames
    for (VirtualPacket pkt : tail) {
        dec = decoder.decodeFrame(H264Utils.splitMOVPacket(pkt.getData(), avcC), buf.getData());
        tmp.clear();
        ByteBuffer res = encoder.encodeFrame(dec, tmp);
        ByteBuffer out = ByteBuffer.allocate(frameSize);
        processFrame(res, out);
        result.add(out);
    }
    return result;
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 27, Source: AVCClipTrack.java
Example 2: SequenceEncoderMp4
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public SequenceEncoderMp4(File out) throws IOException {
    super(out);
    this.ch = NIOUtils.writableFileChannel(out);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, 5);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(1920 * 1080 * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Transform to convert between RGB and YUV
    transform = ColorUtil.getTransform(ColorSpace.RGB, encoder.getSupportedColorSpaces()[0]);
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}
Developer: hiliving, Project: P2Video-master, Lines: 27, Source: SequenceEncoderMp4.java
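The constructor above only wires up the encoder and the muxer; a SequenceEncoderMp4-style class also needs a per-frame encode step and a finish step. Below is a sketch of what those typically look like with jcodec 0.1.x, modeled on the library's own SequenceEncoder and reusing the fields initialized above. The toEncode/frameNo fields, the MP4Packet timing arguments, and the H264Utils helpers (wipePS, encodeMOVPacket, createMOVSampleEntry) are assumptions that should be checked against the jcodec version actually in use.

// Assumed additional imports: org.jcodec.codecs.h264.H264Utils, org.jcodec.common.model.Picture,
// org.jcodec.containers.mp4.MP4Packet, org.jcodec.common.NIOUtils
private Picture toEncode;
private int frameNo;

public void encodeNativeFrame(Picture pic) throws IOException {
    if (toEncode == null)
        toEncode = Picture.create(pic.getWidth(), pic.getHeight(), encoder.getSupportedColorSpaces()[0]);
    // Convert the RGB input into the encoder's native color space
    transform.transform(pic, toEncode);
    // Encode the picture into the '_out' buffer
    _out.clear();
    ByteBuffer result = encoder.encodeFrame(toEncode, _out);
    // Extract SPS/PPS for the sample entry and convert Annex B NAL units to the MP4 length-prefixed form
    spsList.clear();
    ppsList.clear();
    H264Utils.wipePS(result, spsList, ppsList);
    H264Utils.encodeMOVPacket(result);
    // Add the packet to the video track; the pts/timescale/duration values here are illustrative
    outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
    frameNo++;
}

public void finish() throws IOException {
    // Store the collected SPS/PPS in the sample entry and finalize the MP4 header
    outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList, 4));
    muxer.writeHeader();
    NIOUtils.closeQuietly(ch);
}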
Example 3: SequenceEncoderMp4
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public SequenceEncoderMp4(File out) throws IOException {
    super(out);
    this.ch = NIOUtils.writableFileChannel(out);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, timeScale);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(1920 * 1080 * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Transform to convert between RGB and YUV
    transform = ColorUtil.getTransform(ColorSpace.RGB, encoder.getSupportedColorSpaces()[0]);
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}
Developer: ynztlxdeai, Project: ImageToVideo, Lines: 27, Source: SequenceEncoderMp4.java
Example 4: SequenceImagesEncoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public SequenceImagesEncoder(File out, int screenWidth, int screenHeight) throws IOException {
    this.ch = NIOUtils.writableFileChannel(out);
    // Transform to convert between RGB and YUV
    transform = new RgbToYuv420(0, 0);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, timescale);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(screenWidth * screenHeight * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}
Developer: rafaelaaraujo, Project: Face-detect-framework, Lines: 25, Source: SequenceImagesEncoder.java
Example 5: open
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
@Override
public void open(String _path, int width, int _height, int _fps) throws IOException {
    path = _path;
    height = _height;
    fps = _fps;
    ch = new FileChannelWrapper(FileChannel.open(Paths.get(path), StandardOpenOption.CREATE, StandardOpenOption.WRITE, StandardOpenOption.TRUNCATE_EXISTING));
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, fps);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocateDirect(width * height * 6);
    // Create an instance of encoder with a project-specific rate control
    encoder = new H264Encoder(new JCodecUtils.JHVRateControl(20));
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<>();
    ppsList = new ArrayList<>();
    toEncode = Picture.create(width, height, ColorSpace.YUV420J);
}
Developer: Helioviewer-Project, Project: JHelioviewer-SWHV, Lines: 21, Source: JCodecExporter.java
Example 6: SequenceEncoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public SequenceEncoder(File out, int frameRate) throws IOException {
    this.ch = NIOUtils.writableFileChannel(out);
    // Transform to convert between RGB and YUV
    transform = new RgbToYuv420(0, 0);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer (the original frame rate was 25)
    outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, frameRate);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(1920 * 1080 * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}
Developer: deepakpk009, Project: JScreenRecorder, Lines: 25, Source: SequenceEncoder.java
Example 7: SequenceEncoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public SequenceEncoder(File out) throws IOException {
    this.ch = NIOUtils.writableFileChannel(out);
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, 25);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocate(1920 * 1080 * 6);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Transform to convert between RGB and YUV
    transform = ColorUtil.getTransform(ColorSpace.RGB, encoder.getSupportedColorSpaces()[0]);
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 25, Source: SequenceEncoder.java
Example 8: Transcode2AVCTrack
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public Transcode2AVCTrack(VirtualTrack src, Size frameDim) {
    checkFourCC(src);
    this.src = src;
    // Constant rate control keeps every encoded frame at a predictable size
    ConstantRateControl rc = new ConstantRateControl(TARGET_RATE);
    H264Encoder encoder = new H264Encoder(rc);
    scaleFactor = selectScaleFactor(frameDim);
    // Downscale and force an even height
    thumbWidth = frameDim.getWidth() >> scaleFactor;
    thumbHeight = (frameDim.getHeight() >> scaleFactor) & ~1;
    // Picture size in 16x16 macroblocks, rounded up
    mbW = (thumbWidth + 15) >> 4;
    mbH = (thumbHeight + 15) >> 4;
    se = H264Utils.createMOVSampleEntry(encoder.initSPS(new Size(thumbWidth, thumbHeight)), encoder.initPPS());
    PixelAspectExt pasp = Box.findFirst(src.getSampleEntry(), PixelAspectExt.class, "pasp");
    if (pasp != null)
        se.add(pasp);
    // Per-frame buffer budget, plus roughly 6% headroom
    frameSize = rc.calcFrameSize(mbW * mbH);
    frameSize += frameSize >> 4;
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 22, Source: Transcode2AVCTrack.java
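The sizing arithmetic above is easier to follow with concrete numbers. The snippet below repeats the same calculations for an assumed 1920x1080 source downscaled with a scale factor of 2 (both values are illustrative, not taken from the project):

// Worked example of the Transcode2AVCTrack sizing math (illustrative values)
int srcWidth = 1920, srcHeight = 1080, scaleFactor = 2;
int thumbWidth = srcWidth >> scaleFactor;           // 1920 / 4 = 480
int thumbHeight = (srcHeight >> scaleFactor) & ~1;  // 270, masked to an even number
int mbW = (thumbWidth + 15) >> 4;                   // 30 macroblock columns = ceil(480 / 16)
int mbH = (thumbHeight + 15) >> 4;                  // 17 macroblock rows = ceil(270 / 16)
// ConstantRateControl budgets a fixed number of bytes per macroblock;
// the final 'frameSize += frameSize >> 4' then adds about 6% of headroom on top of that budget.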
Example 9: ImageToH264MP4Encoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public ImageToH264MP4Encoder(SeekableByteChannel ch, AudioFormat af) throws IOException {
    this.ch = ch;
    this.af = af;
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, 25);
    // Create an instance of encoder
    encoder = new H264Encoder();
    // Transform to convert between RGB and YUV
    transform = ColorUtil.getTransform(ColorSpace.RGB, encoder.getSupportedColorSpaces()[0]);
    // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
    // Optionally add a PCM audio track
    if (af != null)
        audioTrack = muxer.addPCMAudioTrack(af);
}
Developer: guardianproject, Project: CameraV, Lines: 25, Source: ImageToH264MP4Encoder.java
Example 10: initalize
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
/**
 * Initialize the compressor.
 *
 * @param fileHandle - target MP4 file
 * @param width - video width
 * @param height - video height
 * @throws IOException
 */
public void initalize(File fileHandle, int width, int height) throws IOException {
    this.width = width;
    this.height = height;
    ch = NIOUtils.writableFileChannel(fileHandle);
    muxer = new MP4Muxer(ch, Brand.MP4);
    outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, frameRate);
    outBuffer = ByteBuffer.allocate(width * height * 6);
    transform = new RgbToYuv420(0, 0);
    encoder = new H264Encoder();
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
    frameNo = 0;
}
Developer: shadoq, Project: s3gdxcodec, Lines: 24, Source: PixmapEncoder.java
Example 11: Encoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public Encoder(File out, int width, int height) throws IOException {
    this.ch = NIOUtils.writableFileChannel(out);
    _out = ByteBuffer.allocate(width * height * 6);
    encoder = new H264Encoder();
    spsList = new ArrayList<ByteBuffer>();
    ppsList = new ArrayList<ByteBuffer>();
    this.ch = NIOUtils.writableFileChannel(out); // note: the original code opens the output channel a second time
    muxer = new MP4Muxer(ch, Brand.MP4);
    outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);
}
Developer: kamil-karkus, Project: EasySnap, Lines: 13, Source: Encoder.java
Example 12: Mpeg2AVCTrack
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public Mpeg2AVCTrack(VirtualTrack src) throws IOException {
    checkFourCC(src);
    this.src = src;
    ConstantRateControl rc = new ConstantRateControl(TARGET_RATE);
    H264Encoder encoder = new H264Encoder(rc);
    // Peek at the first packet to learn the source frame dimensions
    nextPacket = src.nextPacket();
    Size frameDim = MPEGDecoder.getSize(nextPacket.getData());
    scaleFactor = selectScaleFactor(frameDim);
    thumbWidth = frameDim.getWidth() >> scaleFactor;
    thumbHeight = (frameDim.getHeight() >> scaleFactor) & ~1;
    mbW = (thumbWidth + 15) >> 4;
    mbH = (thumbHeight + 15) >> 4;
    se = H264Utils.createMOVSampleEntry(encoder.initSPS(new Size(thumbWidth, thumbHeight)), encoder.initPPS());
    PixelAspectExt pasp = Box.findFirst(src.getSampleEntry(), PixelAspectExt.class, "pasp");
    if (pasp != null)
        se.add(pasp);
    frameSize = rc.calcFrameSize(mbW * mbH);
    frameSize += frameSize >> 4;
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 25, Source: Mpeg2AVCTrack.java
Example 13: MPEGToAVCTranscoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public MPEGToAVCTranscoder(int scaleFactor) {
    this.scaleFactor = scaleFactor;
    rc = new ConstantRateControl(Mpeg2AVCTrack.TARGET_RATE);
    this.decoder = getDecoder(scaleFactor);
    this.encoder = new H264Encoder(rc);
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 7, Source: MPEGToAVCTranscoder.java
Example 14: AVCClipTrack
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public AVCClipTrack(VirtualTrack src, int frameFrom, int frameTo) {
    super(src, frameFrom, frameTo);
    SampleEntry origSE = src.getSampleEntry();
    if (!"avc1".equals(origSE.getFourcc()))
        throw new RuntimeException("Not an AVC source track");
    rc = new ConstantRateControl(1024);
    H264Encoder encoder = new H264Encoder(rc);
    // Read the original SPS to recover the picture dimensions in macroblocks
    avcC = H264Utils.parseAVCC((VideoSampleEntry) origSE);
    SeqParameterSet sps = H264Utils.readSPS(NIOUtils.duplicate(avcC.getSpsList().get(0)));
    mbW = sps.pic_width_in_mbs_minus1 + 1;
    mbH = H264Utils.getPicHeightInMbs(sps);
    // Create a second SPS/PPS pair (id 1) for the re-encoded frames, copying the relevant fields
    encSPS = encoder.initSPS(H264Utils.getPicSize(sps));
    encSPS.seq_parameter_set_id = 1;
    encPPS = encoder.initPPS();
    encPPS.seq_parameter_set_id = 1;
    encPPS.pic_parameter_set_id = 1;
    encSPS.profile_idc = sps.profile_idc;
    encSPS.level_idc = sps.level_idc;
    encSPS.frame_mbs_only_flag = sps.frame_mbs_only_flag;
    encSPS.frame_crop_bottom_offset = sps.frame_crop_bottom_offset;
    encSPS.frame_crop_left_offset = sps.frame_crop_left_offset;
    encSPS.frame_crop_right_offset = sps.frame_crop_right_offset;
    encSPS.frame_crop_top_offset = sps.frame_crop_top_offset;
    encSPS.vuiParams = sps.vuiParams;
    avcC.getSpsList().add(H264Utils.writeSPS(encSPS, 128));
    avcC.getPpsList().add(H264Utils.writePPS(encPPS, 20));
    // Clone the sample entry and attach the updated avcC box
    se = (VideoSampleEntry) MP4Util.cloneBox(origSE, 2048, SampleDescriptionBox.FACTORY);
    se.removeChildren("avcC");
    se.add(avcC);
    frameSize = rc.calcFrameSize(mbW * mbH);
    frameSize += frameSize >> 4;
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 40, Source: AVCClipTrack.java
Example 15: Transcoder
import org.jcodec.codecs.h264.H264Encoder; // import the required package/class
public Transcoder() {
    rc = new ConstantRateControl(TARGET_RATE);
    this.decoder = getDecoder(scaleFactor);
    this.encoder = new H264Encoder(rc);
    pic0 = Picture.create(mbW << 4, (mbH + 1) << 4, ColorSpace.YUV444);
}
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines: 7, Source: Transcode2AVCTrack.java
Note: The org.jcodec.codecs.h264.H264Encoder examples in this article were collected from source-code and documentation hosting platforms such as GitHub/MSDocs. The snippets were selected from open-source projects contributed by their authors; copyright remains with the original authors, and distribution and use are subject to each project's license. Do not republish without permission.