
AR+BCI = Telepresence

I’ve finally thought up a possible first application for the combination of augmented reality and brain computer interfaces:

Telepresence.

Basically, one could use a non-invasive BCI setup together with a virtual reality headset, log in to Second Life through a plugin (or viewer) that allows augmented reality access, and use the avatar's "mouselook" view to peer through the avatar's eyes into another region of the real world.

This, of course, would require a number of strategically placed volumetric cameras around the area being viewed, allowing the avatar to "see" the real world and its objects in as many dimensions as possible (a lot like Street View, in a way).

Technically, this should let the user freely and neurally navigate whole regions of the real world, graphically self-projecting into another part of it through a remotely controlled virtual avatar.

I can also visualize a number of applications for this:

  • Providing Croquet-like virtual 3D hyperlinks/hyperframes between real-world regions would allow quick avatar-based hypertravelling between them. In addition, tabbed navigation could make switching back and forth between real-world regions easy.
  • Incorporating KML markup coordinates could signal the positions of remote-controlled avatars in the real world, and SL/OpenSim hyperlinks could point to those real-world coordinates.
  • Extraterrestrial tourism is a possibility: users would never have to leave Earth to visit the Moon or Mars, since cameras and coordinate markers could be shipped on robotic missions, letting avatars navigate regions of both bodies.
  • Remote meetings or get-togethers would be simplified, with a mix of both real individuals and virtual avatars.
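To make the KML idea above a little more concrete, here is a minimal sketch of what marking an avatar's real-world position might look like. The function name, avatar name, and coordinates are hypothetical examples; KML's actual `<Placemark>`/`<Point>` elements and its longitude,latitude,altitude coordinate order are real.

```python
def avatar_placemark(name, lon, lat, alt=0.0):
    """Return a KML Placemark string marking an avatar's real-world position.

    KML lists coordinates in longitude,latitude,altitude order.
    """
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        "<Point>"
        f"<coordinates>{lon},{lat},{alt}</coordinates>"
        "</Point>"
        "</Placemark>"
    )

# Hypothetical avatar standing in San Francisco:
print(avatar_placemark("AvatarOne", -122.4194, 37.7749))
```

A viewer plugin could emit fragments like this for each remote avatar and feed them to any KML-aware map client.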

I’m sure there’s more to this, but I’ll save that for another post.

Blogged with the Flock Browser