Hardware
Engadget Takes On The PlayStation Move
As promised, more on the PlayStation Move. At The Engadget Show, which aired this past weekend, Dr. Richard Marks and Anton Mikhailov stopped by to chat and show off the Move.
From the show, you really get a sense of all the different applications the hardware can be used for, and why Sony chose to bring it to market now. One of the most amazing tech demos shown is hard to describe. Imagine a 3D environment composed of live video, photos, and a map that you can bend, move, resize, and place almost anywhere you want in this virtual space. Creating images, shapes, and screens is all done using gesture-based control converging on the same screen (think John Anderton). I honestly can’t see augmented reality ever getting old.
Now imagine being able to use the Move controller to manipulate the camera, looking up, down, left, and right, and to zoom in on any part of this virtual room/display. It sounds out there, but it’s not. And it all runs amazingly well on what are just tech demos. It’s easy to see a future where, instead of just taking a virtual tour of the U.S. Supreme Court, you actually walk through it, interact with it, and pick up items and read them. The multitouch demo was easily one of the most impressive things I have seen in a long time. To see it all in action, check out the replay of the show and fast forward to about the 43:34 mark:
Dr. Marks also addresses why this type of tech, which he started working on back in 2004, is only now arriving on the scene. One of the factors involved was CPU consumption: apparently the PS2 EyeToy consumed approximately 25% of the console’s processing power right off the top, making it difficult to present this as a viable option to developers, especially when the accessories market was already a tough one to compete in. He also explains that Sony investigated the same type of tech Microsoft is using for Natal, but chose a different route because they felt that hand- and body-based gestures alone would be too limited for the type of experience they wanted to offer.
During the conversation, the subject of lag came up (at around the 1:03:34 mark), and Dr. Marks commented on the 22 ms of latency seen on screen. He added that factors such as television and display delay are out of their control. Anton later added that what we are looking at now is just raw output, before any filters or tweaks have been applied. Given how well things ran, I can’t wait to see what happens when the development community really starts to understand exactly how to use the Move system and compensate where required.
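For the curious, here’s a minimal sketch of the kind of smoothing a developer might layer on top of raw tracking data once those filters and tweaks come into play. This is purely illustrative, written in plain C++ with a made-up Vec3 type and alpha setting; it isn’t from any Move SDK. It shows the classic trade-off: more smoothing means steadier on-screen motion, but a touch more perceived latency.

```cpp
// Illustrative only: exponential smoothing of raw 3D tracking samples.
// Vec3 and the alpha value are assumptions for this sketch, not Move SDK code.
#include <cstdio>

struct Vec3 {
    float x, y, z;
};

class ExponentialSmoother {
public:
    // alpha in (0, 1]: higher values follow the raw input more closely
    // (less smoothing, less added latency); lower values smooth more.
    explicit ExponentialSmoother(float alpha)
        : alpha_(alpha), initialized_(false), state_{0.0f, 0.0f, 0.0f} {}

    Vec3 Filter(const Vec3& raw) {
        if (!initialized_) {
            state_ = raw;          // seed the filter with the first sample
            initialized_ = true;
            return state_;
        }
        // Move the filtered state a fraction of the way toward the new sample.
        state_.x += alpha_ * (raw.x - state_.x);
        state_.y += alpha_ * (raw.y - state_.y);
        state_.z += alpha_ * (raw.z - state_.z);
        return state_;
    }

private:
    float alpha_;
    bool initialized_;
    Vec3 state_;
};

int main() {
    ExponentialSmoother smoother(0.5f);  // moderate smoothing
    const Vec3 samples[] = {{0.0f, 0.0f, 0.0f}, {1.0f, 0.2f, 0.0f}, {2.1f, 0.1f, 0.1f}};
    for (const Vec3& s : samples) {
        Vec3 filtered = smoother.Filter(s);
        std::printf("filtered: %.2f %.2f %.2f\n", filtered.x, filtered.y, filtered.z);
    }
    return 0;
}
```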
After the event, Engadget went one-on-one with Anton Mikhailov. After playing a full game of Move Party, they had a brief but informative interview. He revealed several key things, such as the fact that the multitouch demo was written by him and Kenny Hoff just the day before, and that they are able to generate this kind of content fairly quickly. The tech demos are included in the SDK, he went on to say, so developers can see what can be accomplished. He also reiterated the same point Dr. Marks alluded to earlier: the power of the PS3 is what makes this possible. Given the computational abilities of the Cell Broadband Engine, the PS3 handles the image processing, since it excels in that area, leaving plenty of processing power available to developers for generating their content. I definitely suggest watching the interview video below:
With E3 just around the corner, if Sony is releasing this much now, we can be sure they will have something exclusive to show in the coming months. It’s time to go look for my standing-in-line/camping gear again...