
IOS audio stops working after recording video #45

Open
ksyao2002 opened this issue Sep 9, 2020 · 33 comments
Labels: bug (Something isn't working)

@ksyao2002 commented Sep 9, 2020

Audio from an AudioSource in the iOS build works fine before recording a video; after recording a video, no audio can be heard at all.

This can be reproduced simply by playing audio in an empty build after using the camera's video recording function.

Platform specs

Please provide the following info if this is a Unity 3D repository.

  • Unity version: 2019.3.12f1
  • Platform: iOS
  • Device: iPhone SE but also other iPhones
  • How did you download the plugin: Asset Store
@ksyao2002 ksyao2002 added the bug Something isn't working label Sep 9, 2020
@ksyao2002 ksyao2002 changed the title IOS audio stops working IOS audio stops working after recording video Sep 9, 2020
@ksyao2002 (Author)

Attached is the barebones Unity app for your convenience:

https://github.com/ksyao2002/Barebones-Unity-Audio-and-Native-Camera

@yasirkula (Owner)

Thank you. This issue is in my list of issues to investigate.

@yasirkula (Owner)

How about this script:

public void playAudio()
{
	print("playing audio");
	explain1.PlayOneShot(explain1.clip);
}

private void RecordVideo()
{
	NativeCamera.Permission permission = NativeCamera.RecordVideo((path) =>
	{
		UnityEngine.Debug.Log("Video path: " + path + " is the path");
		if (path != null)
		{
			// Play the recorded video
			//Handheld.PlayFullScreenMovie("file://" + path);
		}
	});//, NativeCamera.Quality.Medium, 100, 100L, NativeCamera.PreferredCamera.Front);

	UnityEngine.Debug.Log("Permission result: " + permission);
}

I don't have access to a Mac, so I've asked a friend to test it like this, and he said that the audio issue was gone. I don't know if the culprit was Handheld.PlayFullScreenMovie, RecordVideo's parameters, or explain1.Play(), but can you also test with this script and see if it works for you as well? If it does, can you pinpoint the issue with the help of this script?

@ksyao2002 (Author)

Thanks for your response. We tested your code, but we found that it still had the same issue: after recording the video, the audio stopped working. I have pushed the changed code to the repo (the only changes are the ones you suggested above).

@yasirkula (Owner)

I'm not sure why we couldn't reproduce the issue.

@ksyao2002 (Author)

Huh, weird. Was the barebones Unity app, without commenting out anything, playing the audio correctly after recording videos?

@yasirkula (Owner)

I've created a new Unity project in 2020.1.5f1, copied your script and audio file into it, created two UI buttons and assigned the playAudio and RecordVideo functions to them. On the first try, it didn't work. After modifying the script as shown above, the issue was gone.

@yasirkula (Owner)

After updating your script and running the demo, do you see any error messages in the Xcode console the moment you attempt to play the audio?

@ksyao2002 (Author) commented Sep 18, 2020

I also tried updating my Unity project to 2020.1.5f1 and used the same script (with the code commented out) and the same scene file. The issue still remains. It looks like there may have been some error messages in Xcode; I have attached the images below:

[Two screenshots of the Xcode console output attached]

@yasirkula (Owner)

I see that you haven't changed the playAudio() function. Please also change it as follows:

public void playAudio()
{
	explain1.PlayOneShot(explain1.clip);
}

@ksyao2002 (Author)

Thanks. I changed the script to what you had, but the issue still remains. Maybe you could send me your Unity iOS build so we can make sure it is not just our device that has the issue. Also, we noticed that when you press the record button but don't record anything once the camera comes up on screen, the audio is still gone afterwards.

@ksyao2002 (Author)

Below are the logs, by the way:

2020-09-18 16:48:02.608733-0500 StrippedAppToTestAudioAndRecording[3197:320665] Built from '2019.3/staging' branch, Version '2019.3.12f1 (84b23722532d)', Build type 'Release', Scripting Backend 'il2cpp'
-> applicationDidFinishLaunching()
-> applicationDidBecomeActive()
GfxDevice: creating device client; threaded=1
Initializing Metal device caps: Apple A13 GPU
Initialize engine version: 2019.3.12f1 (84b23722532d)
2020-09-18 16:48:08.926490-0500 StrippedAppToTestAudioAndRecording[3197:320665] Unbalanced calls to begin/end appearance transitions for <SplashScreenController: 0x1079071a0>.
UnloadTime: 0.444958 ms
-> applicationWillResignActive()
-> applicationDidBecomeActive()
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
Permission result: Granted
Controller:RecordVideo()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
2020-09-18 16:48:38.820153-0500 StrippedAppToTestAudioAndRecording[3197:320665] [Camera] Failed to read exposureBiasesByMode dictionary: Error Domain=NSCocoaErrorDomain Code=4864 "*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL" UserInfo={NSDebugDescription=*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL}
Video path: /private/var/mobile/Containers/Data/Application/660C24D9-47F3-43E5-9F9A-05597215E4A0/tmp/62215852811__993C384F-B385-4F9B-BA9F-2CFEA2D39EC0.MOV is the path
CameraCallback:Invoke(String)
NativeCameraNamespace.NCCameraCallbackiOS:OnMediaReceived(String)

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
Permission result: Granted
Controller:RecordVideo()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
2020-09-18 16:49:23.622557-0500 StrippedAppToTestAudioAndRecording[3197:320665] [Camera] Failed to read exposureBiasesByMode dictionary: Error Domain=NSCocoaErrorDomain Code=4864 "*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL" UserInfo={NSDebugDescription=*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL}
Video path: /private/var/mobile/Containers/Data/Application/660C24D9-47F3-43E5-9F9A-05597215E4A0/tmp/62215856771__32618A43-7A36-4E1B-9264-CE2F2C5B8565.MOV is the path
CameraCallback:Invoke(String)
NativeCameraNamespace.NCCameraCallbackiOS:OnMediaReceived(String)

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
-> applicationWillResignActive()
-> applicationDidEnterBackground()
-> applicationWillTerminate()
Setting up 1 worker threads for Enlighten.
Thread -> id: 16f91f000 -> priority: 1

@yasirkula (Owner) commented Sep 19, 2020

We've found out that if you preview the recorded video after capturing it and then click the Use Video button, the sound works fine. That is probably why we couldn't reproduce the issue before.

During our trials, enabling "Prepare iOS for Recording" fixed the issue for us. But people on the forums complain that enabling that option lowers the volume in their games. We didn't notice it, though that may be because we were using an iPad. There are two interesting posts about this option that I want to put here:

Perhaps these solutions can resolve the volume issue of "Prepare iOS for Recording" (if it exists), but honestly, we were too tired to test them out. I'm not planning to test the above solutions in the next few days, so feel free to test them yourself if you notice volume issues with "Prepare iOS for Recording", and please let me know the results.

@yasirkula (Owner)

A few days have passed, so I'm back! Never mind the "Prepare iOS for Recording" option, since its cons far outweigh its pros.

I've added the following logs to the native code to see what happens to the audio configuration after UIImagePickerController is displayed:

// Before recording video
NSLog( @"=== BEFORE: %@", [[AVAudioSession sharedInstance] category] );
NSLog( @"=== BEFORE: %lu", (unsigned long) [[AVAudioSession sharedInstance] categoryOptions] );
NSLog( @"=== BEFORE: %@", [[AVAudioSession sharedInstance] mode] );

// After recording video
NSLog( @"=== AFTER: %@", [[AVAudioSession sharedInstance] category] );
NSLog( @"=== AFTER: %lu", (unsigned long) [[AVAudioSession sharedInstance] categoryOptions] );
NSLog( @"=== AFTER: %@", [[AVAudioSession sharedInstance] mode] );

Here's the output:

=== BEFORE: AVAudioSessionCategoryAmbient
=== BEFORE: 1
=== BEFORE: AVAudioSessionModeDefault

=== AFTER: AVAudioSessionCategoryPlayAndRecord
=== AFTER: 8
=== AFTER: AVAudioSessionModeVideoRecording

So, UIImagePickerController modifies all of these audio session parameters while recording the video and forgets to set them back. I've gone ahead and updated the native code to restore these values to their original state after recording the video, and it worked for us.

Please update NativeCamera.mm as follows and let me know if it works for you, as well:

#import <Foundation/Foundation.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <ImageIO/ImageIO.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

#ifdef UNITY_4_0 || UNITY_5_0
#import "iPhone_View.h"
#else
extern UIViewController* UnityGetGLViewController();
#endif

#define CHECK_IOS_VERSION( version )  ([[[UIDevice currentDevice] systemVersion] compare:version options:NSNumericSearch] != NSOrderedAscending)

@interface UNativeCamera:NSObject
+ (int)checkPermission;
+ (int)requestPermission;
+ (int)canOpenSettings;
+ (void)openSettings;
+ (int)hasCamera;
+ (void)openCamera:(BOOL)imageMode defaultCamera:(int)defaultCamera savePath:(NSString *)imageSavePath maxImageSize:(int)maxImageSize videoQuality:(int)videoQuality maxVideoDuration:(int)maxVideoDuration;
+ (int)isCameraBusy;
+ (char *)getImageProperties:(NSString *)path;
+ (char *)getVideoProperties:(NSString *)path;
+ (char *)getVideoThumbnail:(NSString *)path savePath:(NSString *)savePath maximumSize:(int)maximumSize captureTime:(double)captureTime;
+ (char *)loadImageAtPath:(NSString *)path tempFilePath:(NSString *)tempFilePath maximumSize:(int)maximumSize;
@end

@implementation UNativeCamera

static NSString *pickedMediaSavePath;
static UIImagePickerController *imagePicker;
static int cameraMaxImageSize = -1;
static int imagePickerState = 0; // 0 -> none, 1 -> showing, 2 -> finished
static BOOL recordingVideo = NO;
static AVAudioSessionCategory unityAudioSessionCategory = AVAudioSessionCategoryAmbient;
static NSUInteger unityAudioSessionCategoryOptions = 1;
static AVAudioSessionMode unityAudioSessionMode = AVAudioSessionModeDefault;

// Credit: https://stackoverflow.com/a/20464727/2373034
+ (int)checkPermission
{
	if( CHECK_IOS_VERSION( @"7.0" ) )
	{
		AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
		if( status == AVAuthorizationStatusAuthorized )
			return 1;
		else if( status == AVAuthorizationStatusNotDetermined )
			return 2;
		else
			return 0;
	}
	
	return 1;
}

// Credit: https://stackoverflow.com/a/20464727/2373034
+ (int)requestPermission
{
	if( CHECK_IOS_VERSION( @"7.0" ) )
	{
		AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
		if( status == AVAuthorizationStatusAuthorized )
			return 1;
		else if( status == AVAuthorizationStatusNotDetermined )
		{
			__block BOOL authorized = NO;
			
			dispatch_semaphore_t sema = dispatch_semaphore_create( 0 );
			[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted )
			{
				authorized = granted;
				dispatch_semaphore_signal( sema );
			}];
			dispatch_semaphore_wait( sema, DISPATCH_TIME_FOREVER );
			
			return authorized ? 1 : 0;
		}
		else
			return 0;
	}
	
	return 1;
}

// Credit: https://stackoverflow.com/a/25453667/2373034
+ (int)canOpenSettings
{
	return ( &UIApplicationOpenSettingsURLString != NULL ) ? 1 : 0;
}

// Credit: https://stackoverflow.com/a/25453667/2373034
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
+ (void)openSettings
{
	if( &UIApplicationOpenSettingsURLString != NULL )
	{
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 100000
		if( CHECK_IOS_VERSION( @"10.0" ) )
			[[UIApplication sharedApplication] openURL:[NSURL URLWithString:UIApplicationOpenSettingsURLString] options:@{} completionHandler:nil];
		else
#endif
			[[UIApplication sharedApplication] openURL:[NSURL URLWithString:UIApplicationOpenSettingsURLString]];
	}
}
#pragma clang diagnostic pop

+ (int)hasCamera
{
	return [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] ? 1 : 0;
}

// Credit: https://stackoverflow.com/a/10531752/2373034
+ (void)openCamera:(BOOL)imageMode defaultCamera:(int)defaultCamera savePath:(NSString *)imageSavePath maxImageSize:(int)maxImageSize videoQuality:(int)videoQuality maxVideoDuration:(int)maxVideoDuration
{
	if( ![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] )
	{
		NSLog( @"Device has no registered cameras!" );
		
		UnitySendMessage( "NCCameraCallbackiOS", "OnMediaReceived", "" );
		return;
	}
	
	if( ( imageMode && ![[UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera] containsObject:(NSString*)kUTTypeImage] ) ||
		( !imageMode && ![[UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera] containsObject:(NSString*)kUTTypeMovie] ) )
	{
		NSLog( @"Camera does not support this operation!" );
		
		UnitySendMessage( "NCCameraCallbackiOS", "OnMediaReceived", "" );
		return;
	}
	
	imagePicker = [[UIImagePickerController alloc] init];
	imagePicker.delegate = self;
	imagePicker.allowsEditing = NO;
	imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
	
	if( imageMode )
		imagePicker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeImage];
	else
	{
		imagePicker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
		
		if( maxVideoDuration > 0 )
			imagePicker.videoMaximumDuration = maxVideoDuration;
		
		if( videoQuality == 0 )
			imagePicker.videoQuality = UIImagePickerControllerQualityTypeLow;
		else if( videoQuality == 1 )
			imagePicker.videoQuality = UIImagePickerControllerQualityTypeMedium;
		else if( videoQuality == 2 )
			imagePicker.videoQuality = UIImagePickerControllerQualityTypeHigh;
	}
	
	if( defaultCamera == 0 && [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear] )
		imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
	else if( defaultCamera == 1 && [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront] )
		imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
	
	// Bugfix for https://github.com/yasirkula/UnityNativeCamera/issues/45
	if( !imageMode )
	{
		unityAudioSessionCategory = [[AVAudioSession sharedInstance] category];
		unityAudioSessionCategoryOptions = [[AVAudioSession sharedInstance] categoryOptions];
		unityAudioSessionMode = [[AVAudioSession sharedInstance] mode];
	}
	
	recordingVideo = !imageMode;
	pickedMediaSavePath = imageSavePath;
	cameraMaxImageSize = maxImageSize;
	
	imagePickerState = 1;
	[UnityGetGLViewController() presentViewController:imagePicker animated:YES completion:^{ imagePickerState = 0; }];
}

+ (int)isCameraBusy
{
	if( imagePickerState == 2 )
		return 1;
	
	if( imagePicker != nil )
	{
		if( imagePickerState == 1 || [imagePicker presentingViewController] == UnityGetGLViewController() )
			return 1;
		
		imagePicker = nil;
		[self restoreAudioSession];
		
		return 0;
	}
	
	return 0;
}

+ (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
	NSString *path = nil;
	if( [info[UIImagePickerControllerMediaType] isEqualToString:(NSString *)kUTTypeImage] )
	{
		NSLog( @"UIImagePickerController finished taking picture" );

		UIImage *image = info[UIImagePickerControllerEditedImage] ?: info[UIImagePickerControllerOriginalImage];
		if( image == nil )
			path = nil;
		else
		{
			NSString *extension = [pickedMediaSavePath pathExtension];
			BOOL saveAsJPEG = [extension caseInsensitiveCompare:@"jpg"] == NSOrderedSame || [extension caseInsensitiveCompare:@"jpeg"] == NSOrderedSame;
			
			// Try to save the image with metadata
			// CANCELED: a number of users reported that this method results in 90-degree rotated images, uncomment at your own risk
			// Credit: https://stackoverflow.com/a/15858955
			/*NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
			NSMutableDictionary *mutableMetadata = nil;
			CFDictionaryRef metadataRef;
			CFStringRef imageType;
			
			if( saveAsJPEG )
			{
				mutableMetadata = [metadata mutableCopy];
				[mutableMetadata setObject:@(1.0) forKey:(__bridge NSString *)kCGImageDestinationLossyCompressionQuality];
				
				metadataRef = (__bridge CFDictionaryRef) mutableMetadata;
				imageType = kUTTypeJPEG;
			}
			else
			{
				metadataRef = (__bridge CFDictionaryRef) metadata;
				imageType = kUTTypePNG;
			}
			
			CGImageDestinationRef imageDestination = CGImageDestinationCreateWithURL( (__bridge CFURLRef) [NSURL fileURLWithPath:pickedMediaSavePath], imageType , 1, NULL );
			if( imageDestination == NULL )
				NSLog( @"Failed to create image destination" );
			else
			{
				CGImageDestinationAddImage( imageDestination, image.CGImage, metadataRef );
				if( CGImageDestinationFinalize( imageDestination ) )
					path = pickedMediaSavePath;
				else
					NSLog( @"Failed to finalize the image" );
				
				CFRelease( imageDestination );
			}*/
			
			if( path == nil )
			{
				//NSLog( @"Attempting to save the image without metadata as fallback" );
				
				if( ( saveAsJPEG && [UIImageJPEGRepresentation( [self scaleImage:image maxSize:cameraMaxImageSize], 1.0 ) writeToFile:pickedMediaSavePath atomically:YES] ) ||
					( !saveAsJPEG && [UIImagePNGRepresentation( [self scaleImage:image maxSize:cameraMaxImageSize] ) writeToFile:pickedMediaSavePath atomically:YES] ) )
					path = pickedMediaSavePath;
				else
				{
					NSLog( @"Error saving image without metadata" );
					path = nil;
				}
			}
		}
	}
	else
	{
		NSLog( @"UIImagePickerController finished recording video" );

		NSURL *mediaUrl = info[UIImagePickerControllerMediaURL] ?: info[UIImagePickerControllerReferenceURL];
		if( mediaUrl == nil )
			path = nil;
		else
			path = [mediaUrl path];
	}

	imagePicker = nil;
	imagePickerState = 2;
	UnitySendMessage( "NCCameraCallbackiOS", "OnMediaReceived", [self getCString:path] );

	[picker dismissViewControllerAnimated:NO completion:nil];
	[self restoreAudioSession];
}

+ (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
	NSLog( @"UIImagePickerController cancelled" );

	imagePicker = nil;
	UnitySendMessage( "NCCameraCallbackiOS", "OnMediaReceived", "" );
	
	[picker dismissViewControllerAnimated:NO completion:nil];
	[self restoreAudioSession];
}

// Bugfix for https://github.com/yasirkula/UnityNativeCamera/issues/45
+ (void)restoreAudioSession
{
	if( recordingVideo )
	{
		BOOL audioModeSwitchResult = YES;
		NSError *error = nil;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 100000
		if( CHECK_IOS_VERSION( @"10.0" ) )
			audioModeSwitchResult = [[AVAudioSession sharedInstance] setCategory:unityAudioSessionCategory mode:unityAudioSessionMode options:unityAudioSessionCategoryOptions error:&error];
		else
#endif
			audioModeSwitchResult = [[AVAudioSession sharedInstance] setCategory:unityAudioSessionCategory withOptions:unityAudioSessionCategoryOptions error:&error] && [[AVAudioSession sharedInstance] setMode:unityAudioSessionMode error:&error];
		
		if( !audioModeSwitchResult )
		{
			if( error != nil )
				NSLog( @"Error setting audio session category back to %@ with mode %@ and options %lu: %@", unityAudioSessionCategory, unityAudioSessionMode, (unsigned long) unityAudioSessionCategoryOptions, error );
			else
				NSLog( @"Error setting audio session category back to %@ with mode %@ and options %lu", unityAudioSessionCategory, unityAudioSessionMode, (unsigned long) unityAudioSessionCategoryOptions );
		}
	}
}

// Credit: https://stackoverflow.com/a/4170099/2373034
+ (NSArray *)getImageMetadata:(NSString *)path
{
	int width = 0;
	int height = 0;
	int orientation = -1;

	CGImageSourceRef imageSource = CGImageSourceCreateWithURL( (__bridge CFURLRef) [NSURL fileURLWithPath:path], nil );
	if( imageSource != nil )
	{
		NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:(__bridge NSString *)kCGImageSourceShouldCache];
		CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex( imageSource, 0, (__bridge CFDictionaryRef) options );
		CFRelease( imageSource );

		CGFloat widthF = 0.0f, heightF = 0.0f;
		if( imageProperties != nil )
		{
			if( CFDictionaryContainsKey( imageProperties, kCGImagePropertyPixelWidth ) )
				CFNumberGetValue( (CFNumberRef) CFDictionaryGetValue( imageProperties, kCGImagePropertyPixelWidth ), kCFNumberCGFloatType, &widthF );
			
			if( CFDictionaryContainsKey( imageProperties, kCGImagePropertyPixelHeight ) )
				CFNumberGetValue( (CFNumberRef) CFDictionaryGetValue( imageProperties, kCGImagePropertyPixelHeight ), kCFNumberCGFloatType, &heightF );

			if( CFDictionaryContainsKey( imageProperties, kCGImagePropertyOrientation ) )
			{
				CFNumberGetValue( (CFNumberRef) CFDictionaryGetValue( imageProperties, kCGImagePropertyOrientation ), kCFNumberIntType, &orientation );
				
				if( orientation > 4 )
				{
					// Landscape image
					CGFloat temp = widthF;
					widthF = heightF;
					heightF = temp;
				}
			}

			CFRelease( imageProperties );
		}

		width = (int) roundf( widthF );
		height = (int) roundf( heightF );
	}

	return [[NSArray alloc] initWithObjects:[NSNumber numberWithInt:width], [NSNumber numberWithInt:height], [NSNumber numberWithInt:orientation], nil];
}

+ (char *)getImageProperties:(NSString *)path
{
	NSArray *metadata = [self getImageMetadata:path];
	
	int orientationUnity;
	int orientation = [metadata[2] intValue];
	
	// To understand the magic numbers, see ImageOrientation enum in NativeCamera.cs
	// and http://sylvana.net/jpegcrop/exif_orientation.html
	if( orientation == 1 )
		orientationUnity = 0;
	else if( orientation == 2 )
		orientationUnity = 4;
	else if( orientation == 3 )
		orientationUnity = 2;
	else if( orientation == 4 )
		orientationUnity = 6;
	else if( orientation == 5 )
		orientationUnity = 5;
	else if( orientation == 6 )
		orientationUnity = 1;
	else if( orientation == 7 )
		orientationUnity = 7;
	else if( orientation == 8 )
		orientationUnity = 3;
	else
		orientationUnity = -1;
	
	return [self getCString:[NSString stringWithFormat:@"%d>%d> >%d", [metadata[0] intValue], [metadata[1] intValue], orientationUnity]];
}

+ (char *)getVideoProperties:(NSString *)path
{
	CGSize size = CGSizeZero;
	float rotation = 0;
	long long duration = 0;
	
	AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
	if( asset != nil )
	{
		duration = (long long) round( CMTimeGetSeconds( [asset duration] ) * 1000 );
		CGAffineTransform transform = [asset preferredTransform];
		NSArray<AVAssetTrack *>* videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
		if( videoTracks != nil && [videoTracks count] > 0 )
		{
			size = [[videoTracks objectAtIndex:0] naturalSize];
			transform = [[videoTracks objectAtIndex:0] preferredTransform];
		}
		
		rotation = atan2( transform.b, transform.a ) * ( 180.0 / M_PI );
	}
	
	return [self getCString:[NSString stringWithFormat:@"%d>%d>%lld>%f", (int) roundf( size.width ), (int) roundf( size.height ), duration, rotation]];
}

+ (char *)getVideoThumbnail:(NSString *)path savePath:(NSString *)savePath maximumSize:(int)maximumSize captureTime:(double)captureTime
{
	AVAssetImageGenerator *thumbnailGenerator = [[AVAssetImageGenerator alloc] initWithAsset:[[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path] options:nil]];
	thumbnailGenerator.appliesPreferredTrackTransform = YES;
	thumbnailGenerator.maximumSize = CGSizeMake( (CGFloat) maximumSize, (CGFloat) maximumSize );
	thumbnailGenerator.requestedTimeToleranceBefore = kCMTimeZero;
	thumbnailGenerator.requestedTimeToleranceAfter = kCMTimeZero;
	
	if( captureTime < 0.0 )
		captureTime = 0.0;
	else
	{
		AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
		if( asset != nil )
		{
			double videoDuration = CMTimeGetSeconds( [asset duration] );
			if( videoDuration > 0.0 && captureTime >= videoDuration - 0.1 )
			{
				if( captureTime > videoDuration )
					captureTime = videoDuration;
				
				thumbnailGenerator.requestedTimeToleranceBefore = CMTimeMakeWithSeconds( 1.0, 600 );
			}
		}
	}
	
	NSError *error = nil;
	CGImageRef image = [thumbnailGenerator copyCGImageAtTime:CMTimeMakeWithSeconds( captureTime, 600 ) actualTime:nil error:&error];
	if( image == nil )
	{
		if( error != nil )
			NSLog( @"Error generating video thumbnail: %@", error );
		else
			NSLog( @"Error generating video thumbnail..." );
		
		return [self getCString:@""];
	}
	
	UIImage *thumbnail = [[UIImage alloc] initWithCGImage:image];
	CGImageRelease( image );
	
	if( ![UIImagePNGRepresentation( thumbnail ) writeToFile:savePath atomically:YES] )
	{
		NSLog( @"Error saving thumbnail image" );
		return [self getCString:@""];
	}
	
	return [self getCString:savePath];
}

+ (UIImage *)scaleImage:(UIImage *)image maxSize:(int)maxSize
{
	CGFloat width = image.size.width;
	CGFloat height = image.size.height;
	
	UIImageOrientation orientation = image.imageOrientation;
	if( width <= maxSize && height <= maxSize && orientation != UIImageOrientationDown &&
		orientation != UIImageOrientationLeft && orientation != UIImageOrientationRight &&
		orientation != UIImageOrientationLeftMirrored && orientation != UIImageOrientationRightMirrored &&
		orientation != UIImageOrientationUpMirrored && orientation != UIImageOrientationDownMirrored )
		return image;
	
	CGFloat scaleX = 1.0f;
	CGFloat scaleY = 1.0f;
	if( width > maxSize )
		scaleX = maxSize / width;
	if( height > maxSize )
		scaleY = maxSize / height;
	
	// Credit: https://github.com/mbcharbonneau/UIImage-Categories/blob/master/UIImage%2BAlpha.m
	CGImageAlphaInfo alpha = CGImageGetAlphaInfo( image.CGImage );
	BOOL hasAlpha = alpha == kCGImageAlphaFirst || alpha == kCGImageAlphaLast || alpha == kCGImageAlphaPremultipliedFirst || alpha == kCGImageAlphaPremultipliedLast;
	
	CGFloat scaleRatio = scaleX < scaleY ? scaleX : scaleY;
	CGRect imageRect = CGRectMake( 0, 0, width * scaleRatio, height * scaleRatio );
	UIGraphicsBeginImageContextWithOptions( imageRect.size, !hasAlpha, image.scale );
	[image drawInRect:imageRect];
	image = UIGraphicsGetImageFromCurrentImageContext();
	UIGraphicsEndImageContext();
	
	return image;
}

+ (char *)loadImageAtPath:(NSString *)path tempFilePath:(NSString *)tempFilePath maximumSize:(int)maximumSize
{
	// Check if the image can be loaded by Unity without requiring a conversion to PNG
	// Credit: https://stackoverflow.com/a/12048937/2373034
	NSString *extension = [path pathExtension];
	BOOL conversionNeeded = [extension caseInsensitiveCompare:@"jpg"] != NSOrderedSame && [extension caseInsensitiveCompare:@"jpeg"] != NSOrderedSame && [extension caseInsensitiveCompare:@"png"] != NSOrderedSame;

	if( !conversionNeeded )
	{
		// Check if the image needs to be processed at all
		NSArray *metadata = [self getImageMetadata:path];
		int orientationInt = [metadata[2] intValue];  // 1: correct orientation, [1,8]: valid orientation range
		if( orientationInt == 1 && [metadata[0] intValue] <= maximumSize && [metadata[1] intValue] <= maximumSize )
			return [self getCString:path];
	}
	
	UIImage *image = [UIImage imageWithContentsOfFile:path];
	if( image == nil )
		return [self getCString:path];
	
	UIImage *scaledImage = [self scaleImage:image maxSize:maximumSize];
	if( conversionNeeded || scaledImage != image )
	{
		if( ![UIImagePNGRepresentation( scaledImage ) writeToFile:tempFilePath atomically:YES] )
		{
			NSLog( @"Error creating scaled image" );
			return [self getCString:path];
		}
		
		return [self getCString:tempFilePath];
	}
	else
		return [self getCString:path];
}

// Credit: https://stackoverflow.com/a/37052118/2373034
+ (char *)getCString:(NSString *)source
{
	if( source == nil )
		source = @"";
	
	const char *sourceUTF8 = [source UTF8String];
	char *result = (char*) malloc( strlen( sourceUTF8 ) + 1 );
	strcpy( result, sourceUTF8 );
	
	return result;
}

@end

extern "C" int _NativeCamera_CheckPermission()
{
	return [UNativeCamera checkPermission];
}

extern "C" int _NativeCamera_RequestPermission()
{
	return [UNativeCamera requestPermission];
}

extern "C" int _NativeCamera_CanOpenSettings()
{
	return [UNativeCamera canOpenSettings];
}

extern "C" void _NativeCamera_OpenSettings()
{
	[UNativeCamera openSettings];
}

extern "C" int _NativeCamera_HasCamera()
{
	return [UNativeCamera hasCamera];
}

extern "C" void _NativeCamera_TakePicture( const char* imageSavePath, int maxSize, int preferredCamera )
{
	[UNativeCamera openCamera:YES defaultCamera:preferredCamera savePath:[NSString stringWithUTF8String:imageSavePath] maxImageSize:maxSize videoQuality:-1 maxVideoDuration:-1];
}

extern "C" void _NativeCamera_RecordVideo( int quality, int maxDuration, int preferredCamera )
{
	[UNativeCamera openCamera:NO defaultCamera:preferredCamera savePath:nil maxImageSize:4096 videoQuality:quality maxVideoDuration:maxDuration];
}

extern "C" int _NativeCamera_IsCameraBusy()
{
	return [UNativeCamera isCameraBusy];
}

extern "C" char* _NativeCamera_GetImageProperties( const char* path )
{
	return [UNativeCamera getImageProperties:[NSString stringWithUTF8String:path]];
}

extern "C" char* _NativeCamera_GetVideoProperties( const char* path )
{
	return [UNativeCamera getVideoProperties:[NSString stringWithUTF8String:path]];
}

extern "C" char* _NativeCamera_GetVideoThumbnail( const char* path, const char* thumbnailSavePath, int maxSize, double captureTimeInSeconds )
{
	return [UNativeCamera getVideoThumbnail:[NSString stringWithUTF8String:path] savePath:[NSString stringWithUTF8String:thumbnailSavePath] maximumSize:maxSize captureTime:captureTimeInSeconds];
}

extern "C" char* _NativeCamera_LoadImageAtPath( const char* path, const char* temporaryFilePath, int maxSize )
{
	return [UNativeCamera loadImageAtPath:[NSString stringWithUTF8String:path] tempFilePath:[NSString stringWithUTF8String:temporaryFilePath] maximumSize:maxSize];
}

@ksyao2002 (Author)

Hello, sorry for the long wait! I really appreciate all the help you've given :) We are currently in the process of testing it out. I will let you know how it goes when we finish.

@ksyao2002 (Author)

Hello, we have tested it, and unfortunately the same error is occurring. Another detail that might help: when we press the button to record a video but then exit the camera without recording anything, the audio is gone. It might be an issue with the initial communication between Unity and the plugin rather than communication within the plugin. Here are some next steps I think we could take:

  1. You could let me know where you put the logs that you mentioned in your most recent comment so we can test if we are getting the same logs:

     // Before recording video
     NSLog( @"=== BEFORE: %@", [[AVAudioSession sharedInstance] category] );
     NSLog( @"=== BEFORE: %lu", (unsigned long) [[AVAudioSession sharedInstance] categoryOptions] );
     NSLog( @"=== BEFORE: %@", [[AVAudioSession sharedInstance] mode] );

     // After recording video
     NSLog( @"=== AFTER: %@", [[AVAudioSession sharedInstance] category] );
     NSLog( @"=== AFTER: %lu", (unsigned long) [[AVAudioSession sharedInstance] categoryOptions] );
     NSLog( @"=== AFTER: %@", [[AVAudioSession sharedInstance] mode] );

  2. You could send me your iOS build so we can test whether we have the same issue, or whether we can reproduce the lack of issues. On the same note, we could send you our iOS build and you can test whether you can reproduce the issue. I have attached the build as a Google Drive link here: https://drive.google.com/drive/folders/1AQeds1GArEYd1oekgZloG269QMB53Abc?usp=sharing
     It may be an issue with our iPhone versions, Xcode versions, etc. I have listed our specs again:

     iPhone 11 with iOS 13.0
     Xcode version 11.7 (11E801a)

     Also tested on an iPad:
     iPad (7th generation) with iOS 13.1.1

  3. Here are the logs:
     2020-09-27 07:59:47.765908-0500 StrippedAppToTestAudioAndRecording[413:17421] Built from '2019.3/staging' branch, Version '2019.3.12f1 (84b23722532d)', Build type 'Release', Scripting Backend 'il2cpp'
    -> applicationDidFinishLaunching()
    -> applicationDidBecomeActive()
    GfxDevice: creating device client; threaded=1
    Initializing Metal device caps: Apple A13 GPU
    Initialize engine version: 2019.3.12f1 (84b23722532d)
    2020-09-27 07:59:48.375302-0500 StrippedAppToTestAudioAndRecording[413:17421] Unbalanced calls to begin/end appearance transitions for <SplashScreenController: 0x100d2c260>.
    UnloadTime: 0.138292 ms
    playing audio
    Controller:playAudio()
    UnityEngine.Events.UnityAction:Invoke()
    UnityEngine.Events.UnityEvent:Invoke()
    UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
    UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
    UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
    UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
Permission result: Granted
Controller:RecordVideo()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
2020-09-27 08:00:24.074475-0500 StrippedAppToTestAudioAndRecording[413:17421] [Common] _BSMachError: port bf03; (os/kern) invalid capability (0x14) "Unable to insert COPY_SEND"
2020-09-27 08:00:24.074977-0500 StrippedAppToTestAudioAndRecording[413:17421] [Common] _BSMachError: port bf03; (os/kern) invalid capability (0x14) "Unable to insert COPY_SEND"
-> applicationWillResignActive()
-> applicationDidBecomeActive()
2020-09-27 08:00:32.627385-0500 StrippedAppToTestAudioAndRecording[413:17421] libMobileGestalt MobileGestalt.c:3154: statfs(/mnt4): No such file or directory
2020-09-27 08:00:34.074178-0500 StrippedAppToTestAudioAndRecording[413:17624] [AssetImport] Could not generate preview image for /private/var/mobile/Containers/Data/Application/9BE5B216-DCB5-4815-B9D4-FBE95C987E43/tmp/62290443262__F7AEB389-1F74-4FD1-95E6-4AFAF3B40BB1.MOV. Error: Error Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media cannot be used., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x2811c7f30 {Error Domain=NSOSStatusErrorDomain Code=-12431 "(null)"}}
2020-09-27 08:00:34.116644-0500 StrippedAppToTestAudioAndRecording[413:17421] [AssetImport] Could not generate preview image for /private/var/mobile/Containers/Data/Application/9BE5B216-DCB5-4815-B9D4-FBE95C987E43/tmp/62290443262__F7AEB389-1F74-4FD1-95E6-4AFAF3B40BB1.MOV. Error: Error Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media cannot be used., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x2811b7d20 {Error Domain=NSOSStatusErrorDomain Code=-12431 "(null)"}}
2020-09-27 08:00:38.593448-0500 StrippedAppToTestAudioAndRecording[413:17421] UIImagePickerController finished recording video
Video path: /private/var/mobile/Containers/Data/Application/9BE5B216-DCB5-4815-B9D4-FBE95C987E43/tmp/62290443262__F7AEB389-1F74-4FD1-95E6-4AFAF3B40BB1.MOV is the path
CameraCallback:Invoke(String)
NativeCameraNamespace.NCCameraCallbackiOS:OnMediaReceived(String)

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
playing audio
Controller:playAudio()
UnityEngine.Events.UnityAction:Invoke()
UnityEngine.Events.UnityEvent:Invoke()
UnityEngine.EventSystems.EventFunction1:Invoke(T1, BaseEventData) UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction1)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchPress(PointerEventData, Boolean, Boolean)
UnityEngine.EventSystems.StandaloneInputModule:ProcessTouchEvents()
UnityEngine.EventSystems.StandaloneInputModule:Process()

(Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)

@yasirkula (Owner)

I've added the "// Before recording video" logs just before the UnityGetGLViewController() presentViewController line. The rest I've added to the beginning of the restoreAudioSession function. Please also add a third set of NSLog calls at the end of the restoreAudioSession function; I'd like to see whether my code had any effect.

If you could test the plugin with those NSLog calls added and share the logs again, I'd appreciate it (I'm only interested in the output of my NSLog calls).

@ksyao2002 (Author)

Thanks. Below are the logs:

=== BEFORE: AVAudioSessionCategoryAmbient
=== BEFORE: 1
=== BEFORE: AVAudioSessionModeDefault

=== AFTER: AVAudioSessionCategoryPlayAndRecord
=== AFTER: 8
=== AFTER: AVAudioSessionModeVideoRecording

=== AFTER FUNCTION END: AVAudioSessionCategoryAmbient
=== AFTER FUNCTION END: 1
=== AFTER FUNCTION END: AVAudioSessionModeDefault

@yasirkula (Owner)

The next time you call the RecordVideo function, do the "BEFORE" logs output the same values? If they do, I really can't tell what the issue is anymore.

@ksyao2002 (Author)

It looks like they do:

=== BEFORE: AVAudioSessionCategoryAmbient
=== BEFORE: 1
=== BEFORE: AVAudioSessionModeDefault

=== AFTER: AVAudioSessionCategoryPlayAndRecord
=== AFTER: 8
=== AFTER: AVAudioSessionModeVideoRecording

=== AFTER FUNCTION END: AVAudioSessionCategoryAmbient
=== AFTER FUNCTION END: 1
=== AFTER FUNCTION END: AVAudioSessionModeDefault

@ksyao2002 (Author)

Is it possible for you to send us your iOS build folder from Unity? I think it might be an issue with our phones. I will see if we can reproduce the issue using your iOS build.

@yasirkula (Owner)

It will take some time. I'll have to convince my friend to create a new project on his Mac again, test the plugin once more, and send me the Xcode project. I don't think it will make a difference, but still, I will let you know if I get the iOS build folder. In your tests, make sure that the device isn't in silent mode.

@ksyao2002 (Author)

Thanks. I'm sure you know this, but you can generate an iOS build folder from Unity on a Windows computer. Maybe you could generate it there and send it to me? Also, I'm not sure if it was you or your friend who created the Unity projects in the past, but if you have iOS builds that you sent to your friend before, those would also work. The way our team has been doing it is that I generate the iOS build on my Windows machine and send them the build folder via Google Drive, and they test it on their MacBooks.

Also, just to make sure that our operational definitions are consistent: when your friend tests to see that it works, what are the criteria? For us, we press play audio once and it plays correctly; then we press record me and the native camera opens; then we record a video (or don't record one and just exit the native camera); then we press play audio again, and the sound no longer plays.

@yasirkula (Owner)

I've always built the project on the Mac, so I'd rather stick with that 😄 We tested the plugin using the same steps.

When you use Unity's Microphone API to record a short clip, do you encounter the same issue after the recording is completed?

@ksyao2002 (Author)

I will test that out. Also, we will try to build it on a Mac; maybe the Windows build has some deep bug.

@ksyao2002 (Author) commented Oct 1, 2020

We found some promising/strange results. You can take a look at the barebones GitHub repo and look inside Controller.cs, but the basic thing we added was:

// To start recording audio using Unity's Microphone:
Microphone.Start(null, true, 20, maxFreq);

and

// When the audio recording has ended:
Microphone.End(null);   // Stop the audio recording
goAudioSource.Play();   // Play back the recorded audio

Then, here is what I did.

  1. I kept recording the audio with Unity's mic.
  2. Played the avatar audio instruction.
  3. Recorded the video.
  4. Played the avatar audio instruction again. (The audio worked.)

I found that when the audio is not working, either starting or stopping an audio recording makes the audio work again. And to answer your question directly: if all we do is record audio using the Unity Microphone, the audio from our other sources still works fine. Any ideas?
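For reference, here is how those pieces could fit together as one self-contained script. This is only a sketch: the class name, the fixed 44100 Hz sample rate standing in for maxFreq, and the assumption that the recorded clip is assigned to goAudioSource are illustrative rather than taken verbatim from Controller.cs.

using UnityEngine;

// Illustrative sketch of the Microphone-based workaround described above
public class MicKeepAliveSketch : MonoBehaviour
{
	public AudioSource goAudioSource; // assigned in the Inspector, as in Controller.cs

	void Start()
	{
		// Keep the default microphone recording into a looping 20-second clip
		goAudioSource.clip = Microphone.Start( null, true, 20, 44100 );
	}

	public void StopAndPlayback()
	{
		Microphone.End( null );   // stop the audio recording
		goAudioSource.Play();     // play back the recorded audio
	}
}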

@ksyao2002 (Author)

We may have found a solution. We added config = AudioSettings.GetConfiguration(); to save the current audio settings, then restored those settings after video recording, and the audio works.

@ksyao2002 (Author)

It's strange, because we see that the configuration before recording and after recording is the same, yet resetting it after recording seemed to fix the issue. Any ideas?

@yasirkula (Owner)

Your findings are promising 😺 I'm guessing that calling AudioSettings.Reset inside CameraCallback works fine, right? So we don't have to wait a few frames?

I'm wondering, though: which property of AudioSettings.GetConfiguration might be modified after recording a video?
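A small diagnostic sketch (not part of the plugin) that could help answer this: it logs the public fields of Unity's AudioConfiguration struct, so the values can be compared right before opening the camera and again inside the camera callback.

using UnityEngine;

public static class AudioConfigLogger
{
	// Logs every field of the current AudioConfiguration with a label,
	// e.g. "BEFORE" just before NativeCamera.RecordVideo and "AFTER" in the callback
	public static void Log( string label )
	{
		AudioConfiguration c = AudioSettings.GetConfiguration();
		Debug.Log( label + ": sampleRate=" + c.sampleRate +
			", dspBufferSize=" + c.dspBufferSize +
			", speakerMode=" + c.speakerMode +
			", numRealVoices=" + c.numRealVoices +
			", numVirtualVoices=" + c.numVirtualVoices );
	}
}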

@ksyao2002 (Author)

I'm not sure if it will, but those frames won't matter too much anyway (if I'm understanding your question correctly). That is the same question we have, haha, because we see that the audio configuration is the same before and after. Regardless, this is a viable solution for us for the time being, and hopefully it helps others too. Thanks for all your help! Feel free to close this issue if you want. Just to reiterate, the solution was to add

config = AudioSettings.GetConfiguration(); //in start

and

AudioSettings.Reset(config); //before calling any audiosource
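Putting those two fragments together, a minimal self-contained sketch of the workaround could look like the following (the class name is a placeholder; explain1 matches the script earlier in the thread; note that AudioSettings.Reset stops any audio that is currently playing, so it should be called before playback starts):

using UnityEngine;

public class AudioResetWorkaroundSketch : MonoBehaviour
{
	public AudioSource explain1;

	private AudioConfiguration config;

	void Start()
	{
		// Save the audio configuration once at startup
		config = AudioSettings.GetConfiguration();
	}

	public void playAudio()
	{
		// Re-apply the saved configuration before playing anything; this
		// reinitializes Unity's audio system, which is what restored playback
		// in the tests described above
		AudioSettings.Reset( config );
		explain1.PlayOneShot( explain1.clip );
	}
}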

@yasirkula (Owner)

I was wondering whether adding AudioSettings.Reset(config); inside CameraCallback (instead of before calling any AudioSource) works, so that I can integrate the solution into NativeCamera.cs :D I don't have access to a Mac for a while, so if you could test it, that would be awesome. But if you don't have the time, that's totally fine.
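For clarity, the placement being asked about would look roughly like this on the caller's side (a sketch only, not the actual NativeCamera.cs integration; config is assumed to be the configuration saved in Start, as in the sketch above):

private void RecordVideo()
{
	NativeCamera.RecordVideo( ( path ) =>
	{
		// Hypothetical placement: restore the saved audio configuration as soon
		// as the camera callback fires, before any AudioSource is used again
		AudioSettings.Reset( config );

		Debug.Log( "Video path: " + path );
	} );
}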

yasirkula added a commit that referenced this issue Oct 15, 2020
@yasirkula (Owner) commented Oct 15, 2020

I've pushed the changes I've made. As I said previously, they were sufficient in my case. I won't be pushing the AudioSettings.Reset(config); fix until I can reproduce the issue you are encountering and verify that putting it in CameraCallback resolves the issue without any noticeable side effects.

@ksyao2002 (Author)

Sorry for the late responses. I also don't have access to a Mac, haha. I will let you know when I am able to test it.
