How do I change the splash image in Unity?

Views: 788 | Replies: 3

Can a progress bar be shown while the splash image is displayed? In other words, can a progress bar be added when the game starts?
Posted at 15:33
As the title says. Please help me out; thanks in advance.
<span class="lou J_floor_copy" title="复制此楼地址"
data-hash="read_楼#
Posted at 16:43
Sure, but you have to build it yourself. Unity's own settings don't provide this.
<span class="lou J_floor_copy" title="复制此楼地址"
data-hash="read_楼#
Posted at 18:42
Turn the splash/loading screen into a scene of its own, then load the main scene asynchronously (see the sketch below).
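A minimal sketch of that approach, added here for illustration. It assumes a scene named "MainScene" registered in Build Settings ("MainScene" is a placeholder, not from this thread), and uses the Application.LoadLevelAsync API of that era; newer Unity versions would use SceneManager.LoadSceneAsync instead.

[Code]:
using UnityEngine;

// Attach to any object in the splash/loading scene.
public class LoadingScreen : MonoBehaviour {

    private AsyncOperation async;

    void Start () {
        // begin loading the main scene in the background
        async = Application.LoadLevelAsync("MainScene");
    }

    void OnGUI () {
        if (async == null || async.isDone)
            return;
        // async.progress runs from 0 to 1 while the scene loads
        float p = async.progress;
        GUI.Box(new Rect(10, Screen.height - 40, Screen.width - 20, 24),
                "Loading... " + (int)(p * 100) + "%");
    }
}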
<span class="lou J_floor_copy" title="复制此楼地址"
data-hash="read_楼#
Posted at 16:30
Re: "Turn the splash/loading screen into a scene of its own, then load the main scene asynchronously." Hi, what I mean is Unity deployed on a phone: right at launch there is a preset screen that stays up for several seconds. Is there any way to put a loading bar on that screen?
How do I remove the "Powered by Unity" logo? Does it take the Pro version to remove it? (unity3d bar, Baidu Tieba)
Followers: 18,874
Does Android really require the Pro version to remove it? If it does, I'll try decompiling.

Cracking it, and now decompiling too...

It's the Splash Image setting in PlayerSettings; just drag an image in.

I don't know why, but I feel this actually makes my work look a bit more high-end... someone always pipes up: "Oh, this was made with the Unity3D engine? Wow, Hearthstone and Xuan-Yuan Sword 6 use this engine too."
Unity3D Research Institute: iOS screen capture, microphone recording, and saving a screen-capture video to the sandbox (an unconventional approach) (Part 35)
Views: 168,723 | Comments: 0 | Original author: 雨松MOMO
Abstract: It has been two weeks since the blog was updated. Developing iOS and Android at the same time, the excitement continues. Yesterday a friend called to tell me that their U3D project had run into a tricky problem: they want to add screen-capture recording with audio to Unity3D and export it as .mp4. As far as I know, U ...
It has been two weeks since I last updated the blog. Developing for iOS and Android at the same time, the excitement continues. Yesterday a friend called to tell me that their U3D project had run into a tricky problem: they wanted to add screen-capture recording, video plus audio, inside Unity3D, exported and saved in .mp4 format. As far as I know Unity3D has no screen-recording feature, only still screenshots. I had not touched Unity3D for a while and my curiosity got the better of me, so I dug in, and the effort paid off: I worked out how to record a screen-capture video from Unity3D together with the iOS front end.

First, the principle behind the implementation:
1. Capture a screenshot of every frame, then assemble the N captured images into a silent .mp4 video file.
2. At the same time, record the sound from the phone's microphone and save it in .caf format.
3. Finally, merge the silent video and the audio into a brand-new video file and save it in the sandbox.

Their Unity3D project is rather special; think of it as an iOS app built on top of the Unity3D engine. Unity3D is only responsible for displaying a 3D model, while the UI is implemented entirely in the iOS front end's Objective-C code. That creates a problem: an OC screenshot captures only the UI layer, and a U3D screenshot captures only the 3D layer. To solve it, each capture merges those two images into a single new one. Note that a private Apple API can capture the UI layer and the U3D layer in one image, but it may fail App Store review, so it is safer to stick with the merging approach.

OK, on to the implementation. First create a brand-new Unity project and add a cube. To make the resulting video easy to verify, a short script keeps the cube rotating. Test.cs is attached directly to the cube; the code is simple enough that it needs no explanation.

[Code]:
using UnityEngine;
using System.Collections;

public class Test : MonoBehaviour {

    int count = 0;

    void Start () {
    }

    // Update is called once per frame
    void Update ()
    {
        this.transform.Rotate(new Vector3(0, 1, 0));
    }

    // The OC side calls into U3D here to take a screenshot
    void StartScreenshot(string str)
    {
        Application.CaptureScreenshot(count + "u3d.JPG");
        // keep the file index in step with _count on the OC side
        count++;
    }
}
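A side note the original post leaves implicit: on iOS, Application.CaptureScreenshot resolves a bare filename against the app sandbox's Documents folder (Application.persistentDataPath), which is why the Objective-C code below can read the u3d.JPG frames out of Documents. A tiny probe script, my illustration and not part of the project, to confirm where the frames land:

[Code]:
using UnityEngine;
using System.IO;

// Hypothetical helper: logs the folder the u3d captures are written to.
public class CapturePathProbe : MonoBehaviour {

    void Start () {
        // On iOS this is the sandbox Documents directory, the same
        // folder the native code scans for "<n>u3d.JPG" files.
        string docs = Application.persistentDataPath;
        Debug.Log("U3D captures expected in: " + docs);
        Debug.Log("Folder exists: " + Directory.Exists(docs));
    }
}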
Then we export this Unity3D project as an iOS project, and Unity generates the corresponding Xcode project. We write a brand-new ViewController that sits on top of the OpenGL view controller U3D generates, to host the native UI controls. Next, open AppController.mm and add our code at the end of the following method:

[Code]:
int OpenEAGL_UnityCallback(UIWindow** window, int* screenWidth, int* screenHeight,
                           int* openglesVersion)
{
    CGRect rect = [[UIScreen mainScreen] bounds];

    // Create a full-screen window
    _window = [[UIWindow alloc] initWithFrame:rect];
    EAGLView* view = [[EAGLView alloc] initWithFrame:rect];
    UnityViewController *controller = [[UnityViewController alloc] init];

    sGLViewController = controller;

#if defined(__IPHONE_3_0)
    if( _ios30orNewer )
        controller.wantsFullScreenLayout = TRUE;
#endif

    controller.view = view;

    CreateSplashView( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone ? (UIView*)_window : (UIView*)view );
    CreateActivityIndicator(_splashView);

    // add only now so controller have chance to reorient *all* added views
    [_window addSubview:view];
    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone)
        [_window bringSubviewToFront:_splashView];

    _autorotEnableHandling = true;
    [[NSNotificationCenter defaultCenter] postNotificationName: UIDeviceOrientationDidChangeNotification object: [UIDevice currentDevice]];

    // reposition activity indicator after we rotated views
    if (_activityIndicator)
        _activityIndicator.center = CGPointMake([_splashView bounds].size.width/2, [_splashView bounds].size.height/2);

    int openglesApi =
#if defined(__IPHONE_3_0) && USE_OPENGLES20_IF_AVAILABLE
        kEAGLRenderingAPIOpenGLES2;
#else
        kEAGLRenderingAPIOpenGLES1;
#endif

    for (; openglesApi >= kEAGLRenderingAPIOpenGLES1 && !_context; --openglesApi)
    {
        if (!UnityIsRenderingAPISupported(openglesApi))
            continue;

        _context = [[EAGLContext alloc] initWithAPI:openglesApi];
    }

    if (!_context)
        return false;

    if (![EAGLContext setCurrentContext:_context]) {
        _context = 0;
        return false;
    }

    const GLuint colorFormat = UnityUse32bitDisplayBuffer() ? GL_RGBA8_OES : GL_RGB565_OES;

    if (!CreateWindowSurface(view, colorFormat, GL_DEPTH_COMPONENT16_OES, UnityGetDesiredMSAASampleCount(MSAA_DEFAULT_SAMPLE_COUNT), NO, &_surface)) {
        return false;
    }

    glViewport(0, 0, _surface.w, _surface.h);
    [_window makeKeyAndVisible];
    [view release];

    *window = _window;
    *screenWidth = _surface.w;
    *screenHeight = _surface.h;
    *openglesVersion = _context.API;

    _glesContextCreated = true;

    //-------------------- MyViewController below is the controller we just wrote --------------------
    MyViewController * myView = [[MyViewController alloc] init];
    [sGLViewController.view addSubview:myView.view];
    //-------------------- MyViewController above is the controller we just wrote --------------------

    return true;
}

If your project is not a U3D project, remember to link AVFoundation.framework and MediaPlayer.framework yourself; a U3D project generates these two frameworks automatically.
[Code]:
//
//  MyViewController.h
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

@interface MyViewController : UIViewController<AVAudioRecorderDelegate, AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    // timer
    NSTimer *_timer;
    // capture frame counter
    int _count;
    // on-screen counter label
    UILabel *_labe;
    // audio recorder
    AVAudioRecorder *_recorder;
    // loading UI
    UITextView *_sharedLoadingTextView;
    UIActivityIndicatorView *_sharedActivityView;
}

@end

Below is the concrete implementation. MOMO learned the core code from articles by overseas developers and finally pieced everything together; it took several hours of research, and there were tears of joy. There is quite a lot of code, so please read it carefully.
[Code]:
//
//  MyViewController.m
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import "MyViewController.h"
// needed for the CALayer properties (cornerRadius etc.) used below
#import <QuartzCore/QuartzCore.h>

@interface MyViewController ()
@end

@implementation MyViewController
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}
- (void)viewDidLoad
{
    [super viewDidLoad];

    self.view.backgroundColor = [UIColor redColor];

    // take a test screenshot of the key window and save it to the photo album
    UIWindow *screenWindow = [[UIApplication sharedApplication] keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

#if !TARGET_IPHONE_SIMULATOR
    self.view.backgroundColor = [UIColor greenColor];
#else
    self.view.backgroundColor = [UIColor clearColor];
#endif

    UIButton * start = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [start setFrame:CGRectMake(0, 0, 200, 30)];
    [start setTitle:@"Start capture" forState:UIControlStateNormal];
    [start addTarget:self action:@selector(startPress) forControlEvents:UIControlEventTouchDown];

    UIButton * end = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [end setFrame:CGRectMake(0, 50, 200, 30)];
    [end setTitle:@"End capture (start making video)" forState:UIControlStateNormal];
    [end addTarget:self action:@selector(endPress) forControlEvents:UIControlEventTouchDown];

    [self.view addSubview:start];
    [self.view addSubview:end];

    _labe = [[[UILabel alloc] initWithFrame:CGRectMake(30, 200, 300, 30)] autorelease];
    _labe.text = [NSString stringWithFormat:@"%@%d", @"雨松MOMO timer: === ", _count];
    [self.view addSubview:_labe];

    // set up audio recording
    [self prepareToRecord];
}
-(void)addLoading:(NSString*) info
{
    // text view on top
    _sharedLoadingTextView = [[[UITextView alloc] initWithFrame:CGRectMake(0, 0, 130, 130)] autorelease];
    [_sharedLoadingTextView setBackgroundColor:[UIColor blackColor]];
    [_sharedLoadingTextView setText:info];
    [_sharedLoadingTextView setTextColor:[UIColor whiteColor]];
    [_sharedLoadingTextView setTextAlignment:UITextAlignmentCenter];
    [_sharedLoadingTextView setFont:[UIFont systemFontOfSize:15]];
    _sharedLoadingTextView.textAlignment = UITextAlignmentCenter;
    _sharedLoadingTextView.alpha = 0.8f;
    _sharedLoadingTextView.center = self.view.center;
    _sharedLoadingTextView.layer.cornerRadius = 10;
    _sharedLoadingTextView.layer.masksToBounds = YES;

    // create the loading spinner view
    _sharedActivityView = [[[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray] autorelease];
    // set the spinner style, large white here
    _sharedActivityView.activityIndicatorViewStyle = UIActivityIndicatorViewStyleWhiteLarge;
    // set the area it occupies
    _sharedActivityView.frame = CGRectMake(0, 0, 320, 480);
    _sharedActivityView.center = self.view.center;
    // start the animation
    [_sharedActivityView startAnimating];

    [self.view addSubview:_sharedLoadingTextView];
    [self.view addSubview:_sharedActivityView];
}

-(void)removeLoading
{
    [_sharedLoadingTextView removeFromSuperview];
    [_sharedActivityView removeFromSuperview];
}
-(void)startPress
{
    _count = 0;

    _timer = [NSTimer scheduledTimerWithTimeInterval: 0.1
                                              target: self
                                            selector: @selector(heartBeat:)
                                            userInfo: nil
                                             repeats: YES];
    // start recording audio
    [_recorder record];
}

-(void)endPress
{
    if(_timer != nil)
    {
        [_timer invalidate];
        _timer = nil;
    }
    [self addLoading:@"Building the video"];
    [NSThread detachNewThreadSelector:@selector(startThreadMainMethod) toTarget:self withObject:nil];
}
-(void)startThreadMainMethod
{
    // build the video here
    NSMutableArray *_array = [[[NSMutableArray alloc] init] autorelease];

    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];

    for(int i = 0; i < _count; i++)
    {
        // read the image files stored in the sandbox
        NSString *_pathSecond = [NSString stringWithFormat:@"%@/%d%@", Path, i, @".JPG"];
        NSString *_pathFirst  = [NSString stringWithFormat:@"%@/%d%@", Path, i, @"u3d.JPG"];

        // we only have paths, so load each file into an NSData object
        NSData *data0 = [NSData dataWithContentsOfFile:_pathFirst];
        NSData *data1 = [NSData dataWithContentsOfFile:_pathSecond];

        // then read the images out directly
        UIImage *img0 = [UIImage imageWithData:data0];
        UIImage *img1 = [UIImage imageWithData:data1];

        [_array addObject:[self MergerImage:img0 :img1]];
    }

    Path = [NSString stringWithFormat:@"%@/%@%@", Path, @"veido", @".MP4"];

    // stop recording audio
    [_recorder stop];

    [self writeImages:_array ToMovieAtPath:Path withSize:CGSizeMake(320, 480) inDuration:_count*0.1 byFPS:10];

    [self removeLoading];

    NSLog(@"recorder successfully");
    UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"Video recorded successfully"
                                                                delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [recorderSuccessful show];
    [recorderSuccessful release];
}
- (void) heartBeat: (NSTimer*) timer
{
    _labe.text = [NSString stringWithFormat:@"%@%d", @"雨松MOMO timer: === ", _count];

    // This is a private API; with bad luck the app will be rejected.
    // It is quite powerful: it grabs the iOS front end and everything
    // U3D renders in a single image.
    //extern CGImageRef UIGetScreenImage();
    //UIImage *image = [UIImage imageWithCGImage:UIGetScreenImage()];
    //UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

    // To be safe, capture with the method below instead.
    // (This method cannot capture the U3D image.)
    UIWindow *screenWindow = [[UIApplication sharedApplication] keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *data = nil;
    if (UIImagePNGRepresentation(image) == nil)
    {
        data = UIImageJPEGRepresentation(image, 1);
    }
    else
    {
        data = UIImagePNGRepresentation(image);
    }

    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    [fileManager createDirectoryAtPath:Path withIntermediateDirectories:YES attributes:nil error:nil];
    Path = [NSString stringWithFormat:@"%@/%d%@", Path, _count, @".JPG"];
    [fileManager createFileAtPath:Path contents:data attributes:nil];

    // tell U3D to take its screenshot
    UnitySendMessage("Cube", "StartScreenshot", "");

    // advance the frame index (kept in step with `count` on the C# side)
    _count++;
}
// merge the iOS foreground image and the U3D image into one
-(UIImage*) MergerImage:(UIImage*) firstImg:(UIImage*) secondImg
{
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));

    [firstImg drawInRect:CGRectMake(0, 0, firstImg.size.width, firstImg.size.height)];
    [secondImg drawInRect:CGRectMake(0, 0, secondImg.size.width, secondImg.size.height)];

    UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return resultImage;
}
- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
- (void) writeImages:(NSArray *)imagesArray ToMovieAtPath:(NSString *) path withSize:(CGSize) size
          inDuration:(float)duration byFPS:(int32_t)fps
{
    // merge the previously captured images into one video here
    // Wire the writer:
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                            fileType:AVFileTypeQuickTimeMovie
                                                               error:&error] autorelease];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    [videoWriter addInput:videoWriterInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    // Write some samples:
    CVPixelBufferRef buffer = NULL;

    int frameCount = 0;
    int imagesCount = [imagesArray count];
    float averageTime = duration / imagesCount;
    int averageFrame = (int)(averageTime * fps);

    for(UIImage * img in imagesArray)
    {
        buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:size];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attempt %d\n", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount, (int32_t)fps);
                float frameSeconds = CMTimeGetSeconds(frameTime);
                NSLog(@"frameCount:%d,kRecordingFPS:%d,frameSeconds:%f", frameCount, fps, frameSeconds);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

                if(buffer)
                    CVBufferRelease(buffer);
                [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount = frameCount + averageFrame;
    }

    // Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"finishWriting");

    // merge the silent video and the recorded audio into a new video
    [self CompileFilesToMakeMovie];
}
- (void) prepareToRecord
{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    [audioSession setActive:YES error:&err];
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);

    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    // Create a new dated file
    NSString * recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], @"sound"] retain];
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!_recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: [err localizedDescription]
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [alert show];
        [alert release];
        return;
    }

    // prepare to record
    [_recorder setDelegate:self];
    [_recorder prepareToRecord];
    _recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: @"Audio input hardware not available"
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [cantRecordAlert show];
        [cantRecordAlert release];
        return;
    }
}
// delegate: called when recording finishes successfully
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *) aRecorder successfully:(BOOL)flag
{
    NSLog(@"recorder successfully");
    UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"Audio recorded successfully"
                                                                delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [recorderSuccessful show];
    [recorderSuccessful release];
}

// delegate: called when an encoding error occurs during recording
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)arecorder error:(NSError *)error
{
    UIAlertView *recorderFailed = [[UIAlertView alloc] initWithTitle:@"" message:@"An error occurred"
                                                            delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [recorderFailed show];
    [recorderFailed release];
}
-(void)CompileFilesToMakeMovie
{
    // merge the silent image-based video and the recorded audio in the
    // sandbox into one new video
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString* audio_inputFileName = @"sound.caf";
    NSString* audio_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], audio_inputFileName];
    NSURL* audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    NSString* video_inputFileName = @"veido.mp4";
    NSString* video_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], video_inputFileName];
    NSURL* video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];

    NSString* outputFileName = @"outputVeido.mov";
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], outputFileName];
    NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         // export finished; the merged video now sits in the sandbox
     }];
}
@end

As the screenshots showed, the buttons belong to the native UI, while the cube behind them is rendered by U3D and rotates the whole time. Tapping the "Start capture" button makes the OC side and the U3D side each take a screenshot every tick, and audio recording starts at the same moment. Tapping the "End capture" button makes the program first merge each pair of OC and U3D frames into a new image, then generate the silent video, and finally combine the silent video with the freshly recorded audio into a brand-new video saved in the sandbox.

Looking at the sandbox in the simulator: "<number>.JPG" are the frames the OC side captured, "<number>u3d.JPG" are the frames captured inside U3D, sound.caf is the recorded audio, veido.mp4 is the silent video assembled from the frame sequence, and outputVeido.mov is the final video that combines the silent video with the audio.

Double-click outputVeido.mov and it plays right in QuickTime Player. Pretty powerful, eh? Hahaha, so U3D can do screen recording too~~

One last note: this code works just as well in an ordinary iOS app; the U3D case merely adds the image-merging step, and all the code lives in MyViewController, so please read it closely. Over the next couple of days MOMO will also find time to look into screen-capture recording on Android, so stay tuned. Since the U3D-generated project is quite large, I won't upload the generated Xcode code; here is a download link for the pure OC version instead. Happy studying, everyone.

Download link: