Tag Archives: augmented reality

Sentience + ARToolKit = AR done right

Sentience is a software library, written in C#, that enables robotic stereo vision using stereo webcams (like the Minoru 3D, which is of British origin despite its Japanese name). Meanwhile, ARToolKit is one of the most widely used augmented reality software frameworks (and is also, like Sentience, FOSS).
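The stereo half of that pairing rests on a simple principle: a feature that appears at slightly different horizontal positions in the two cameras' images yields a disparity, and depth follows by triangulation. A minimal Python sketch of that principle (the function name and numbers are illustrative, not Sentience's actual API or the Minoru's real calibration):

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Classic stereo triangulation: depth = f * B / d.

    disparity_px: horizontal pixel offset of the same feature
                  between the left and right images.
    focal_px:     camera focal length, expressed in pixels.
    baseline_m:   distance between the two camera centres, in metres.
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px

# Ballpark numbers for a small stereo webcam (illustrative only):
depth = disparity_to_depth(disparity_px=20, focal_px=700, baseline_m=0.06)
print(round(depth, 2))  # prints 2.1 -> the feature is roughly 2.1 m away
```

Note how a wider baseline or longer focal length stretches the usable depth range; the Minoru's few centimetres of baseline suit near-field vision, which is exactly the range an AR marker framework like ARToolKit tends to operate in.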


AR+BCI = Telepresence

I’ve finally thought up a possible first application for the combination of augmented reality and brain-computer interfaces:

Telepresence.

Basically, one can use a non-invasive BCI setup and a virtual reality headset in combination, log in to Second Life using a plugin (or viewer) that allows for augmented reality access, and use the avatar "mouselook" view to peer through the avatar’s eyes into another region of the real world.

This, of course, would require a number of strategically placed volumetric cameras around the area of augmented view, to allow the avatar to "see" the real world and its objects in as many dimensions as possible (a lot like Street View, in a way).

Technically, this should let the user freely navigate whole regions of the real world by thought alone, and graphically project themselves to another region of the real world through a remotely-controlled virtual avatar.

I can also visualize a number of applications for this:

  • Providing Croquet-like virtual 3D hyperlinks/hyperframes between the real world regions would allow for quick avatar-based hypertravelling between regions. In addition, tabbed navigation could allow for easy switching back-and-forth between real world regions.
  • Incorporating KML markup coordinates could signal the positions of remote-controlled avatars in the real world, and SL/OpenSim hyperlinks could point to those real-world coordinates.
  • Extraterrestrial tourism: users would never have to leave Earth to visit the Moon or Mars, since cameras and coordinate markers could be shipped on robotic lunar and Martian missions, letting avatars navigate regions of the two bodies.
  • Remote meetings or get-togethers would be simplified, with a mix of both real individuals and virtual avatars.
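The KML idea in the second bullet might look like this in practice: a Placemark pinning a remote-controlled avatar to real-world coordinates. The avatar name and the coordinates below are made-up examples, not real SL/OpenSim fields:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <!-- Hypothetical avatar label; SL/OpenSim define no such KML field -->
    <name>Avatar: ExampleResident</name>
    <Point>
      <!-- KML order is longitude,latitude,altitude -->
      <coordinates>-122.0822,37.4222,0</coordinates>
    </Point>
  </Placemark>
</kml>
```

A viewer plugin could then turn each Placemark into a teleport target, so that "hyperlinks to these coordinates" reduce to ordinary KML consumption.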

I’m sure there’s more to this, but I’ll wait until another post.


Augmented reality and lucid dreaming

I don’t think the two have been discussed together yet. I’m only mentioning them because lucid dreaming is often described by its proponents as a form of mentally generated, private virtual reality, while augmented reality is the blending of virtual reality and “real reality.”

So let’s think: what if lucid dreaming could be harnessed to generate augmented reality, or what if future instrumentation could allow for it? Something like live, publicly viewable, interactive lucid dreaming.

Live-streaming video in virtual worlds

I think that the much-feared ubiquitous camera surveillance – particularly webcams – will have a role to play in the future of multitouch and augmented reality. It will allow for devices to “peek” into other, far-removed areas in real time via streaming video.

Also, I think this sort of “peeking” could be implemented today in such virtual world applications as Second Life and Kaneva. Avatars could open up “holes” that stream a “virtual camera”’s view of another sim into the avatar’s own view.

This could be useful for gauging the current setting and appeal of another sim before teleporting to it. In fact, it could also serve as a tool of teleportation between sims.

Of course, such “streaming,” taken to extremes, could put further strain on the Grid that runs the sims.