I’m looking to attach an external 360 camera to an iOS device (an iPhone) to capture users’ hand gestures. I’m currently using Unity to build for iOS, and I’ve been able to render 360 video feeds on my iPhone without much issue using Unity’s ARKit and ARFoundation plugins.
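For reference, here’s roughly how I’m texturing the feed onto an inward-facing sphere in the editor, as a minimal sketch. It assumes the 360 camera shows up as a webcam-style device; the "THETA" device-name match is just a placeholder for whatever the camera enumerates as (on device, the texture would come from the capture pipeline instead):

```csharp
using UnityEngine;

// Minimal sketch: render an equirectangular 360 feed onto an inward-facing sphere.
// Assumes the camera appears as a standard webcam-style device in the editor;
// "THETA" is a placeholder match string for the actual device name.
public class Equirect360Feed : MonoBehaviour
{
    [SerializeField] private string deviceNameContains = "THETA"; // placeholder
    private WebCamTexture _feed;

    void Start()
    {
        foreach (var device in WebCamTexture.devices)
        {
            if (device.name.Contains(deviceNameContains))
            {
                _feed = new WebCamTexture(device.name);
                break;
            }
        }
        // Fall back to the first available device if no name matched.
        if (_feed == null && WebCamTexture.devices.Length > 0)
            _feed = new WebCamTexture(WebCamTexture.devices[0].name);

        // The sphere needs inverted normals (or a cull-off shader) so the
        // equirectangular texture is visible from inside.
        GetComponent<Renderer>().material.mainTexture = _feed;
        _feed.Play();
    }
}
```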
I’d like to know the best debugging setup when an external 360 camera is attached to the iPhone. I’m also looking into the MediaPipe Unity plugin so I can capture hand gestures from the 360 camera, and I’d like to visualize those gestures in the Unity editor. I’m guessing I need a live feed from the 360 camera inside the Unity editor, but when developing for iOS you have to build and compile in Xcode, so I can’t directly stream what the iPhone (with the 360 camera) is seeing.
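To make the visualization part concrete, this is the kind of editor-side debug view I have in mind. `UpdateLandmarks` is a hypothetical hook here, since I’d wire it to whatever callback the MediaPipe Unity plugin exposes for hand-landmark results in the version I end up using; the Gizmos drawing itself is standard Unity:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of an editor visualization for hand landmarks.
// UpdateLandmarks is a hypothetical entry point -- it would be wired to the
// MediaPipe Unity plugin's hand-landmark result callback in practice.
public class HandLandmarkGizmos : MonoBehaviour
{
    private readonly List<Vector3> _landmarks = new List<Vector3>();

    // Hypothetical hook: call with the 21 normalized (0..1) hand landmarks.
    public void UpdateLandmarks(IEnumerable<Vector3> normalizedLandmarks)
    {
        _landmarks.Clear();
        _landmarks.AddRange(normalizedLandmarks);
    }

    private void OnDrawGizmos()
    {
        Gizmos.color = Color.green;
        foreach (var p in _landmarks)
        {
            // Map normalized image coordinates onto a simple debug plane
            // one meter in front of this transform.
            var world = transform.TransformPoint(
                new Vector3(p.x - 0.5f, 0.5f - p.y, 1f));
            Gizmos.DrawSphere(world, 0.01f);
        }
    }
}
```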
Can I just connect the 360 camera to the MacBook I’m developing on, switch the build platform to macOS, and debug that way? Or is there a way to get a live feed from the phone into the editor before building in Xcode? I suspect switching platforms to macOS for debugging would be a problem for ARKit and ARFoundation, since ARKit only runs on iOS.
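If the answer is the MacBook route, the kind of source-switching I’m considering looks like the sketch below: read the camera as a plain webcam while in the editor, and only touch the ARKit/ARFoundation path in actual iOS builds. `GetIOSCameraTexture()` is a hypothetical stand-in for however the feed is obtained on device:

```csharp
using UnityEngine;

// Sketch: use a local webcam feed (e.g., the 360 camera plugged into the Mac)
// in the editor, and the device capture path only in iOS builds.
public class CameraFeedSource : MonoBehaviour
{
    public Texture FeedTexture { get; private set; }

#if UNITY_EDITOR
    private WebCamTexture _editorFeed;

    void Start()
    {
        // In the editor, treat the 360 camera as a plain webcam device.
        _editorFeed = new WebCamTexture();
        _editorFeed.Play();
        FeedTexture = _editorFeed;
    }
#else
    void Start()
    {
        // On device, the feed would come from the iOS/ARFoundation pipeline.
        FeedTexture = GetIOSCameraTexture(); // hypothetical placeholder
    }

    private Texture GetIOSCameraTexture()
    {
        // Placeholder for the device-side capture path.
        return null;
    }
#endif
}
```

That would at least let MediaPipe run against a live feed in the editor, but I’m unsure whether it’s the recommended workflow or whether there’s a way to stream the phone’s view into the editor directly.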