Is Multiple PSEye + multiple PS Controllers + UE4 possible??? #24
Short answer: at the scale you are talking about, almost certainly not. Long answer: what you are trying to achieve would probably be a lot more easily done with the Vive. The Stress Level Zero guys have a great demo of just how large a volume you can track with the Vive: https://www.youtube.com/watch?v=VD4UlShicgY
Yes, I would like something like a VR arena. I was thinking of multiple PS3 Eye cameras + PS Move controllers because that is the solution used by the zerolatency guys (http://www.cnet.com/news/zero-latency-vr-entertainment-revolution-begins-melbourne-australia/). About the Vive solution, do you think I could manage distances of around 20 meters? Could I add more than 2 base stations? I need a large space for tracking, and the HTC Vive installation guide says a maximum distance of 5 meters (https://support.steampowered.com/kb_article.php?ref=2001-UXCM-4439&l=spanish). Any other solutions? I've been searching on the internet and I haven't found anything, only VRcade, but it costs $275,000 USD for a 20x20 meter space! As for the other solutions I've found, what do you guys think about them?
Thanks!
I think the key takeaway from the zerolatency article is that, while that is a very impressive feat, it was a non-trivial exercise to build that tracking tech: "Building the technology has taken the team at Zero Latency more than three years", and that was with a team of six people working on it full time. They had to solve all of the problems I mentioned as well as a host of others I probably haven't thought of. Maybe they might be willing to license it? But I can't imagine that would be cheap. An open source PSMove project is not going to match that effort because we're all doing this in our free time. I'm certainly going to keep working on multi-camera tracking myself, because I think it's interesting, but don't expect comparable results any time soon.

As for the Vive, yeah, I think they aren't meant to be pushed past 5m like you said. There was an interesting thread on reddit where vk2zay (Alan Yates), architect of the Lighthouse system at Valve, goes into how that range isn't so much a hard limit as an arbitrary cutoff, past which it starts becoming harder to compensate for the noise without the user noticing. They are interested in supporting more than two lighthouses, but who knows when they will come out with that.

VrTracker is a neat modular system for positional tracking, but because it's an optical system it will suffer the same range and line-of-sight limitations as the PS3Eye camera (the CMUcam5 camera they use is 640x400@50fps), so you'll need a lot of cameras to cover the range you're interested in. However, at least they have the foundations of a large-scale tracking system, and it's cheap, so it wouldn't be much of an investment to experiment with a few trackers. The orientation issue could be solved with a separate IMU sensor that has a magnetometer (something like this, maybe: https://inmagicwetrust.wordpress.com/2015/11/04/diy-project-wearable-imu-tracking-sensor/).

I can't speak to OptiTrack, though I think @cboulay has used this system before? Perhaps he could speak to it.

IndoTraq seems like the best bet for the scale of tracking you want. Because its radio-timing-based triangulation is not affected by line of sight, you'll need fewer transmitters than you would in a camera-based system. It looks like their DevKit introductory price is $3,500, which isn't ridiculous.

That said, I think the first thing you should do is get a Vive anyway for developing your ideas before committing to a large, expensive system. It's well supported in both Unreal 4 and Unity 5, and the quality of tracking is probably the best you are going to get for that price point. If you can't make the game you want feel good on the Vive, it's not likely it will work any better on the IndoTraq.

All that said, I'm not an expert in any of this, so don't take my word as gospel. You should consider posting this question on reddit.com/r/oculus. I'm betting someone there can give better advice than I can. Good luck!
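For what it's worth, the geometric core of multi-camera positional tracking is just triangulation: every camera that sees the glowing bulb gives you a ray in world space, and the estimated position is the point closest to all of those rays. Below is a minimal least-squares sketch using Eigen; it assumes camera poses and per-camera ray directions already exist from your own calibration and blob-detection steps, which are the genuinely hard part.

```cpp
// Minimal least-squares triangulation of one tracked point from N cameras.
// Assumes each camera i has been calibrated so that a detected blob can be
// converted into a world-space ray (origin c_i, unit direction d_i).
#include <Eigen/Dense>
#include <vector>

Eigen::Vector3d TriangulateRays(const std::vector<Eigen::Vector3d>& origins,
                                const std::vector<Eigen::Vector3d>& directions)
{
    // Minimize sum_i || (I - d_i d_i^T) (p - c_i) ||^2 over p, which gives the
    // normal equations (sum_i M_i) p = sum_i M_i c_i with M_i = I - d_i d_i^T.
    Eigen::Matrix3d A = Eigen::Matrix3d::Zero();
    Eigen::Vector3d b = Eigen::Vector3d::Zero();
    for (size_t i = 0; i < origins.size(); ++i) {
        const Eigen::Vector3d d = directions[i].normalized();
        const Eigen::Matrix3d M = Eigen::Matrix3d::Identity() - d * d.transpose();
        A += M;
        b += M * origins[i];
    }
    // A is singular if fewer than two non-parallel rays are available.
    return A.ldlt().solve(b);
}
```

The harder problems (calibrating dozens of camera poses into one coordinate frame, synchronizing frames, and handling occlusion) are exactly the kind of work that took Zero Latency years.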
Thank you very much for your advice. I think I will definitively forget the PS3 Eye system for positional tracking... By the way, about the PS Move controller and the PSMove API:
Thanks!!
On a related note, how easy is it to use a pair of Move controllers at present? I've been flicking through the information that's available, but it looks like only a single motion controller is currently supported?
@HipsterSloth added support for this a while ago. I've yet to try it. If it's not in my fork then it'll be in his. (If it's not in my fork then let me know and I'll create the PR.)
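If it helps, here is roughly what driving a pair of controllers looks like against the underlying PS Move API directly. This is only a sketch, not the plugin code; exact call names can differ between psmoveapi versions.

```cpp
// Sketch: open two PS Move controllers through the psmoveapi C interface.
// Both controllers need to be paired over Bluetooth first (e.g. with the
// psmovepair utility), otherwise orientation data won't stream.
#include <cstdio>
#include "psmove.h"

int main()
{
    if (psmove_count_connected() < 2) {
        std::printf("Pair and connect two controllers first.\n");
        return 1;
    }

    PSMove* moves[2] = { psmove_connect_by_id(0), psmove_connect_by_id(1) };
    for (PSMove* m : moves) {
        psmove_enable_orientation(m, PSMove_True);   // fuse the IMU into a quaternion
    }

    for (int frame = 0; frame < 600; ++frame) {
        for (int i = 0; i < 2; ++i) {
            while (psmove_poll(moves[i])) {}         // drain pending input reports
            float w, x, y, z;
            psmove_get_orientation(moves[i], &w, &x, &y, &z);
            std::printf("controller %d quat: % .3f % .3f % .3f % .3f\n", i, w, x, y, z);
        }
    }

    for (PSMove* m : moves) psmove_disconnect(m);
    return 0;
}
```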
Thanks, I'll take a look shortly. I'm slowly downloading UE 4.10 at the moment, as I noticed the plugin doesn't currently compile under 4.11 :)
oh? We should change that, but... |
Nothing too sinister, from what I could tell - Epic have changed a few things on their end. I'm doing a quick fiddle over the next few days, so I figured it'd be quicker to just downgrade on my side than explore a plugin I'm not familiar with and clean it up. Sounds healthy to me. I can see Epic's side of things changing quite frequently over the coming months, especially once other motion controllers properly hit the market.
Hi Cramer230, I am working on the same project and I have chosen the HTC Vive. If you are interested, you can contact me: [email protected]
I'm new to this and my programming knowledge is limited. I want to know whether the following approach is possible in real time, with acceptable latency for VR purposes:
- Multiple PS3 Eye cameras (around 50 cams or more)
- Position and orientation tracking of 12 PS Move controllers (6 for head tracking and 6 for weapon tracking)
- Getting the position and orientation data for all 12 controllers into UE4 (a rough sketch of one way to bridge such data into the engine is below)
Thanks!
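As a rough illustration of the last bullet, one common pattern for feeding an external tracking process into UE4 is to stream poses over UDP and read them from a socket on the engine side. A minimal, hypothetical sender is sketched below; the packet layout, port, and address are invented for the example, and the UE4-side receiver is not shown.

```cpp
// Sketch: broadcast externally-tracked controller poses over UDP so a game
// engine (e.g. UE4) can read them from a socket. POSIX sockets only.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>

struct PosePacket {
    uint32_t controller_id;      // 0..11 for the 12 controllers
    float    position[3];        // meters, in the arena's world frame
    float    orientation[4];     // quaternion (w, x, y, z)
};

int main()
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port   = htons(5555);                    // arbitrary example port
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);  // machine running UE4

    // In a real tracker this loop would be fed by the camera/IMU fusion code;
    // here the poses are placeholders.
    for (uint32_t id = 0; id < 12; ++id) {
        PosePacket pkt{};
        pkt.controller_id  = id;
        pkt.orientation[0] = 1.0f;                    // identity quaternion placeholder
        sendto(sock, &pkt, sizeof(pkt), 0,
               reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    }

    close(sock);
    return 0;
}
```

Whether 50 cameras can actually deliver the poses in real time is a separate question; this only shows the transport step.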