Microsoft researchers have created a new augmented reality concept by improving how virtual simulations react in the physical world.

The Kinect sensor drives a process called Kinect Fusion, which lets projected objects react to different surfaces. The researchers demonstrate it with the Beamatron — a device consisting of a Kinect sensor and a projector mounted on a steerable head on the ceiling, which lets it build detailed maps of physical spaces.

The technology can project objects anywhere in a room and move them realistically: they bump into tables, chairs and walls, and the projection compensates for uneven surfaces so the image is never distorted.

Virtual objects can now interact with physical spaces like never before: the projector senses what is going on in the room and responds to changes within seconds.

In a video, Andy Wilson, a Microsoft principal researcher, drives a 3D image of a toy car around a room with a remote control; it bumps into walls and drives over hills.

SEE ALSO: With Augmented Reality, Wallit App Assigns Virtual Walls to Physical Places
The Microsoft researchers are also working to apply this augmented reality technology to help individuals navigate their surroundings. Future versions will scan the environment and bring “notifications and other graphics to the attention of a user by automatically placing the graphics within the user’s view.” The technology could also find applications in architecture and gaming.

Check out the above video for more details.

What do you think about virtual and physical worlds merging as one? Tell us in the comments.

Image courtesy of Flickr, MichaelMarner

More About: Augmented Reality, kinect, microsoft

For more Dev & Design coverage:


Portuguese artist Nuno Serrão wants to make art viewing more stimulating by incorporating music through an iPhone app and QR codes.

The artist’s photography exhibit called Project Paperclip is currently housed at the Centro das Artes in Madeira Island, Portugal. People can walk in and do something usually discouraged at galleries — wear headphones and listen to music while taking in the images.

“It can carry you to a different interpretation of that moment in the frame,” Serrão, who has a background in programming, design and music, told Mashable. “All the pictures are inspired by science, curiosity and imagination.”

People can experience it by downloading the free Project Paperclip app. The app, developed especially for this exhibit, scans the QR codes easily and connects each image to its soundscape. A few images from Project Paperclip are also viewable online.

“The QR codes are used to unlock the soundscapes so that the viewer has access to the reactive soundscapes designed for that photo,” he said as he explained how the idea evolved.

The experience at the gallery or using the app outside the exhibit will be different for everyone. The soundtracks will change depending on when and where you open the application. Your voice, level of noise in the room, movement, and location will set off different sounds, according to the artist.
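To make the idea concrete, here is a minimal, purely illustrative Python sketch of how an app might pick soundscape layers from context, in the spirit of Project Paperclip's reaction to time of day, ambient noise and movement. Every function and layer name here is invented; Serrão's actual app logic is not public.

```python
# Illustrative sketch (not the Project Paperclip app): choose soundscape
# layers from simple sensor readings. All names are hypothetical.

def select_layers(hour, noise_db, moving):
    """Return a list of soundscape layer names for the given context."""
    layers = ["base_drone"]  # always-on ambient bed
    # Night-time and daytime viewings get different textures.
    layers.append("night_pad" if hour < 6 or hour >= 20 else "day_texture")
    if noise_db > 60:        # loud room: add a masking layer
        layers.append("noise_mask")
    if moving:               # a walking viewer triggers a rhythmic element
        layers.append("footstep_pulse")
    return layers

print(select_layers(hour=22, noise_db=45, moving=False))
# A quiet late-night viewing yields only the ambient night-time mix.
```

The point is simply that the same photo can sound different on every visit, because the mix is assembled from the viewer's context rather than played back as a fixed track.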

The gallery is billed as the first augmented reality art exhibit, and it revolves around a Cold War theme, chosen because the era is interesting from a cultural, scientific and political standpoint.

SEE ALSO: Rooftop QR Codes Aim to Infiltrate Google Maps
“There has been an incredible wave of great feedback, I’ve been following mostly on Twitter,” said Serrão, who hopes to bring the augmented reality art experience to international audiences.

The photos are surreal, especially with the pairing of soundtracks. The artist captured natural sound where photos were taken and incorporated those into original soundscapes co-created with musician Alexandre Gonçalves.

“I think I fell in love with the concept of joining art forms when I read a book [by] Arthur C. Clarke called The Songs of Distant Earth,” he said, mentioning the 1986 science fiction novel that was later sold with a CD based on the book.

The 16-photograph exhibit opened Feb. 11 and will be available until April 29. The app is currently only available for iPhone 3 and later.


More About: art, Augmented Reality, iphone, Mobile, QR Codes, Tech


The future of books may be here. Augmented reality book Between Page and Screen is an innovative art project that seeks to renew the reading experience by combining the physicality of a printed book with the technology of Adobe Flash to create a virtual love story.

To see the technology in action, simply hold the 44-page hardcover up to a laptop's webcam, and words will suddenly appear, spin and rattle on screen. Turn the pages to experience the wordless book of poems and see the future of interactive reading.

Poet Amaranth Borsuk and developer Brad Bouse, creators of Between Page and Screen, started exploring augmented reality after seeing a business card built with similar technology: held up to a camera, a simple geometric pattern on the card would bring up the card owner's face.

SEE ALSO: Augmented Reality Business Card Comes to Life [VIDEO]

Borsuk, whose background is in book art and writing, and Bouse, who was developing his own startup, were mesmerized by the technology. The married duo combined their separate loves of writing and technology to create this augmented reality art project, which explores the relationship between handmade books and digital spaces.

The book is full of wordplay between the characters P and S. Expect a lot of movement and the fun of a pop-up book designed for adults.

“It is actually pretty fun,” said Bouse, who described seeing people experience augmented reality with a book for the first time. “Amaranth has been invited to do presentations. When she opens the book and people see the letters pop up [on screen] for the first time there’s always an initial gasp.”

People shake the book, turn the page and appear to really enjoy the experience, said the authors.

The book’s animation, which helps propel the written love story along, was written in Flash. Between Page and Screen uses FLARToolKit to track the book's pages and project images from them, along with the Robotlegs framework, Papervision3D for 3D effects, BetweenAS3 for animation and JiglibFlash for physics.
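For readers curious what a marker tracker like FLARToolKit does under the hood, here is a toy, purely illustrative sketch (in Python rather than the book's ActionScript, and not FLARToolKit's actual API): binarize a camera frame, locate the dark square marker, and read its inner pattern to tell one marker from another. Real trackers go further and estimate the marker's 3D pose so graphics can be drawn in perspective.

```python
# Toy fiducial-marker detection on a tiny 0/1 "frame". Illustrative only.

def find_marker(grid):
    """Return the bounding box (top, left, bottom, right) of the dark region."""
    rows = [r for r, row in enumerate(grid) if 1 in row]
    cols = [c for row in grid for c, v in enumerate(row) if v == 1]
    if not rows:
        return None  # no marker in view
    return min(rows), min(cols), max(rows), max(cols)

def inner_bits(grid, box):
    """Read the pattern inside the marker border, used to identify it."""
    t, l, b, r = box
    return [grid[rr][cc] for rr in range(t + 1, b) for cc in range(l + 1, r)]

frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
print(find_marker(frame))  # (1, 1, 3, 4)
```

Each page of the book carries a distinct pattern, so once the tracker knows which marker it is looking at and where it sits in the frame, the Flash layer can render that page's animation on top of it.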

Any computer with a webcam can play the book, which will be published in April. However, the augmented reality book is ready for pre-order at

The authors created this book as an art project, but we’re wondering if you’d be interested in a broader augmented reality book selection. Let us know in the comments.

More About: Augmented Reality, books, innovation, Tech, webcam


The majority of the 9/11 remembrance exhibit in the New York Times’ lobby appropriately focuses on the tragedy of 10 years ago, but one digital gallery looks forward rather than back at the site of the World Trade Center.

Three iPads loaded with an augmented reality app sit on a table next to the atrium, in a section set apart from the moving collection of historic Times coverage that makes up the rest of the gallery. By pointing an iPad's camera at a photo of the World Trade Center construction site, visitors see an accurate digital model of the future World Trade Center site and Memorial Pools. Few people intuitively picked up the iPads to try out the app, but those who did seemed pleasantly surprised by the information.

“This is a wider side of the exhibit,” explains Brandon Melchior, the Times‘ creative director of marketing who designed the gallery. “It’s set aside from the heavier part.”

Graphics editor Graham Roberts worked with an architect to design the digital model, which is based on physical models and architectural plans for the office buildings, arts center, PATH/Subway hub and museum that are planned for the site. The memorial pools, which will be constructed to resemble the footprints of the twin towers, are integrated into the model.

It’s one of the first times the R&D team has worked with augmented reality. In the future, similar features could become part of its storytelling platform; the iPad app might one day, for instance, expand a photo in the paper or on the Times website into a 3D model that tells a deeper story.

“9/11 Remembered: A Gallery of Reflection” will be on display through Monday, Sept. 12 from 8 a.m. to 7 p.m.

More About: Augmented Reality, new york times, september 11

Boulder-based computer vision startup Occipital has raised $7 million in Series A funding, and aims to leverage the investment to develop a next-generation computer vision platform.

Occipital, a TechStars veteran, is most widely known for the hit barcode-scanning app RedLaser, which it sold to eBay last year. Now the startup's most notable app is 360 Panorama, which captures panoramic images on mobile devices.

But Occipital has bigger plans. It wants to be the computer vision foundation — just as RedLaser became the backbone of many barcode-scanning apps — powering apps that will help mobile users interact with the physical world around them.

“360 Panorama is just the tip of the iceberg,” says co-founder Jeff Powers. What does the whole iceberg actually look like?

“The iceberg is what sits underneath 360 Panorama — it’s the beginnings of a sophisticated computer vision platform that aims to fundamentally transform the way we interact with environments,” co-founder Vikas Reddy explains to Mashable. “Think computer vision plus augmented reality and the applications that become possible when your smartphone has a visual understanding of its surroundings.”

This is where third-party developers will come into play. Occipital will soon be launching a platform that gives enterprising developers a crack at creating new layers on top of the computer vision technology inside 360 Panorama.

“Currently, there are companies that have introduced specific mobile applications that use limited computer-vision techniques,” says Occipital investor and new board member Jason Mendelson. “No one has produced a platform that allows developers to create dynamic content that automatically leverages best-in-class computer vision technology.”

Occipital’s $7 million Series A round was led by Foundry Group. Jason Mendelson and Brad Feld of Foundry Group, Manu Kumar of K9 Ventures and Gary Bradski of Willow Garage will join the startup’s board.

Image courtesy of Flickr, jurvetson

More About: 360 panorama, Augmented Reality, funding, occipital, startup


The Spark of Genius Series highlights a unique feature of startups and is made possible by Microsoft BizSpark. If you would like to have your startup considered for inclusion, please see the details here.

Name: GoldRun

Quick Pitch: Using augmented reality app GoldRun, advertisers create scavenger hunts for virtual goods in physical locations.

Genius Idea: Buzz has been big around augmented reality, but few companies have figured out a way to turn it into an effective marketing tool. We’ve seen brands invoke everything from Iron Man masks to musical cheese snacks in efforts to incorporate augmented reality into their marketing plans. But none of these ideas exactly created the AdWords of augmented reality.

GoldRun, which launched in November with a campaign for H&M, comes closer to creating a marketing platform that will be useful across multiple industries. The app allows brands to create virtual scavenger hunts. When consumers download the free GoldRun app and sign up to follow a campaign or “run,” they can collect virtual goods from physical locations using their phone’s camera. During the H&M campaign, for instance, users could collect a different virtual item from the brand’s fall/winter collection by snapping a photo of it in front of each of its 10 Manhattan locations. Doing so resulted in an instant 10% discount on any H&M purchase.
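The core mechanic described above, collecting a virtual item only when you are physically at a campaign location, amounts to a geofence check. Here is a hedged Python sketch of what such a check presumably looks like; the coordinates, radius and function names are all invented, since GoldRun's actual implementation is not public.

```python
# Illustrative geofence check for a location-based scavenger hunt.
# All names, coordinates and the 50 m pickup radius are hypothetical.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def can_collect(user, store, radius_m=50):
    """True if the user is close enough to the store to grab the virtual item."""
    return haversine_m(*user, *store) <= radius_m

store = (40.7359, -73.9911)  # hypothetical Manhattan storefront
print(can_collect((40.7360, -73.9912), store))  # a few meters away -> True
```

Only when the check passes would the app overlay the virtual item in the camera view and let the user snap the photo that unlocks the reward.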

The platform’s agility is its greatest strength. AirWalk used it to build virtual pop-up stores in Washington Square Park and at Venice Beach, where app users could purchase a special-edition shoe from its website (VP of Business Development Shailesh Rao calls it “V-Commerce”). NBC's Today Show ran a scavenger hunt for virtual items in Rockefeller Plaza. Esquire Magazine is planning a campaign that will virtually place its February cover model, Brooklyn Decker, in more than 700 Barnes and Noble stores. Other planned campaigns range from the Sundance Film Festival to Gwen Stefani's perfume line.

GoldRun provides a more interactive and customizable approach to location-based advertising than check-in games like Foursquare and Gowalla. Campaigns, in addition to distributing special offers, include an option for users to create interesting photos (items in the H&M campaign, for instance, were positioned in a way in which they could be virtually “tried on”). Users share these photos through their Facebook profiles, which is more valuable for the brand than shared check-in information.

Given how eager brands have been to adopt location-based marketing through check-in apps, it's not a surprise that many are eager to run campaigns on the GoldRun app. Rao says that more than 40 companies from various industries have approached the currently self-funded startup about running a campaign. It will be interesting to see whether consumers respond with equivalent enthusiasm.

Series Supported by Microsoft BizSpark


The Spark of Genius Series highlights a unique feature of startups and is made possible by Microsoft BizSpark, a startup program that gives you three-year access to the latest Microsoft development tools and connects you to a nationwide network of investors and incubators. There are no upfront costs, so if your business is privately owned, less than three years old, and generates less than U.S. $1 million in annual revenue, you can sign up today.

More About: Augmented Reality, esquire, GoldRun, MARKETING, mobile app