So what is this Virtual Projection and when can I buy it?

January 24th, 2012

The last few days have been somewhat crazy, as we saw a lot of interest in Virtual Projection (VP), a project by Steve Feiner, Sebastian Boring and myself that will be published at this year's CHI in Austin. First one site, then Engadget and finally The Verge picked up the story and the corresponding video. I want to use this blog post to provide a little more information on the project.

In case you haven’t seen the video, take three minutes of your time and watch it:

When we started Virtual Projection, the initial idea was simply to replicate the workings of an optical projector artificially. Pointing at a wall and pressing a button to make an enlarged image appear is a very easy and powerful concept. Transferring information to a wall display is comparably difficult (fiddling with cables, struggling with software issues), so we wanted to make that just as quick and painless. But once we had reached that goal, it became clear that, since we're simulating the whole thing anyway, we could even improve on the metaphor. While Virtual Projection has the clear downside that it requires a suitable display and does not work on arbitrary surfaces, we can at least "fix" some of the downsides of its real-world model.

One of the first things was getting rid of distortion: When a projector is aimed at a wall at an angle, keystone distortion arises, warping the resulting image. Virtual projections can be freely adjusted when it comes to distortion and transformations of the resulting image: It's possible to (a) fully recreate a projector's image, (b) remove at least the distortion, (c) also ignore the orientation of the handheld (e.g., to keep a photo always upright), (d) ignore both orientation and scaling (e.g., to keep the photo at its original size), or (e) ignore all transformations and just use the technique to select a display that is then used fullscreen (e.g., to show a video or a presentation).
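These five modes can be thought of as a chain of 2D transforms from which one component is dropped per mode. Here's a hypothetical sketch of that idea (this is not our actual implementation; the function name and parameterization are made up for illustration):

```python
import numpy as np

def vp_transform(translation, rotation, scale, perspective, mode):
    """Compose a 3x3 homogeneous 2D transform for a virtual projection,
    keeping only the components that the chosen mode retains."""
    c, s = np.cos(rotation), np.sin(rotation)
    T = np.array([[1.0, 0.0, translation[0]],
                  [0.0, 1.0, translation[1]],
                  [0.0, 0.0, 1.0]])                    # placement on the display
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # handheld orientation
    S = np.diag([scale, scale, 1.0])                   # distance-dependent scaling
    P = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [perspective[0], perspective[1], 1.0]])  # keystone distortion
    if mode == "full":          return T @ R @ S @ P   # (a) recreate projector image
    if mode == "undistorted":   return T @ R @ S       # (b) drop the distortion
    if mode == "upright":       return T @ S           # (c) also ignore orientation
    if mode == "original_size": return T               # (d) also ignore scaling
    if mode == "fullscreen":    return np.eye(3)       # (e) display selection only
    raise ValueError(mode)
```

With identity parameters every mode collapses to the identity matrix; with a tilted, scaled handheld the modes progressively strip the pose components away.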

In addition to this control over transformation and distortion, virtual projections can also be "decoupled" from the handheld. While an optical projection is always tied to its light source, we can fix virtual projections to displays and even create several of them at the same time. The above image shows a typical VP workflow: We first (a) start an application (for this, we implemented something very similar to Apple's Springboard on the iPhone) and (b) interact with it. Once we (c) point at a display, a preview of the VP appears. By (d) long-pressing we can control the VP's placement on the display and (e) fix it by lifting the finger. Both the handheld view and the VP are now synchronized, so (f) interacting on the handheld changes the VP (in this example, the currently highlighted section of the photo; note that the VP shows the whole photo, while the handheld can only display a part of it). By pointing at the VP and tapping, we can (g) select a different part of the photo to be shown on the handheld. Once we're done, we can (h) remove the VP by long-pressing again and dragging it off the display. In case you're wondering where the transformations/distortions went: they are predefined for each application, so this photo-viewer example keeps everything except the distortion (type (b) from above).
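The gesture workflow above can be summarized as a small state machine. This is a hypothetical sketch only (class and method names are made up, and the actual tracking and rendering are omitted):

```python
class VirtualProjection:
    """Toy state machine for the VP placement workflow (a)-(h)."""

    def __init__(self):
        self.state = "handheld"              # (a)/(b): app runs on the handheld only

    def point_at_display(self):
        if self.state == "handheld":         # (c): preview of the VP appears
            self.state = "preview"

    def long_press(self):
        if self.state in ("preview", "fixed"):
            self.state = "placing"           # (d)/(h): grab the projection

    def lift_finger(self, on_display=True):
        if self.state == "placing":
            # (e): fix it on the display, or remove it by dragging it off
            self.state = "fixed" if on_display else "handheld"
```

Once the state is "fixed", handheld and VP stay synchronized as in steps (f) and (g).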

This also works with multiple VPs: By pointing at an inactive VP on the display and tapping, the respective view immediately becomes active and visible on the handheld (the previous VP stays on the display and continues running in the background on the handheld). Another option is to shake the handheld to bring up the menu (see (a)) and switch to a background app or start a new one. It's no coincidence that starting and working with apps in VP resembles regular application management on a smartphone: We wanted to show that the VP interaction technique can easily be integrated into existing smartphone operating systems. Of course, due to security restrictions we weren't able to integrate it completely (shaking the device, for example, was our replacement for pressing the Home button on the iPhone). However, our VP implementation works on a regular iPhone, not even a jailbroken one, and without any private libraries.

With this interaction and tracking framework we built several applications. While the video shows many more, here are three interesting ones. (a) is the example from above: a photo viewer that shows whole photos on the display and parts of them on the handheld. It's possible to quickly switch the visible part by pointing and tapping. The handheld's perspective frustum determines which part that is, but it's always visible in the preview on the display (greyish border). (b) shows an image filter that can be applied to photos (in this example, greyscale). Image filters work like regular virtual projections, except that they show nothing on their own. But placed on another VP that displays a photo, they filter it. It's also possible to combine several of them by stacking them on top of each other. (c) finally demonstrates multiple maps next to each other. Map viewers work similarly to photo viewers in that they show a larger section of the map than fits on the handheld. In this example, the handheld also works as a "magic lens": It shows a satellite image for the current section, while the stationary display shows the road map. As the handheld moves in front of the display, that image changes correspondingly in real time.
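The filter stacking in (b) boils down to rendering a stack of VPs bottom-to-top: photo VPs contribute pixels, filter VPs transform whatever lies beneath them. A hypothetical illustration (the data layout and names are invented; pixels are plain RGB tuples here):

```python
def render_stack(stack):
    """Render a stack of VPs bottom-to-top: photo VPs draw pixels,
    filter VPs transform whatever image lies beneath them."""
    image = None
    for vp in stack:
        if vp["kind"] == "photo":
            image = vp["pixels"]
        elif vp["kind"] == "filter" and image is not None:
            # a filter on its own (image is None) shows nothing
            image = [vp["fn"](px) for px in image]
    return image

def grayscale(rgb):
    # average the channels, as in the greyscale filter from the video
    g = sum(rgb) // 3
    return (g, g, g)
```

Stacking several filter VPs simply applies their functions in order, which matches the "combine by stacking" behavior described above.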

To sum up: In Virtual Projection we did interesting things with simulated projections while trying to keep them as close to their real-world model as possible. Our prototype works with regular, unmodified iPhones, the corresponding server runs on a regular Windows PC (for the video we used an i7 machine), and everything happens via Wi-Fi (so no cables needed). Imagine having a VP server running on every display you encounter in your daily life and being able to "borrow" the display space for a while (e.g., to look something up on a map). Give it a few more years (and a friendly industry consortium ;)) and this could become reality.

HowTo: Remote control iOS Spotify from your couch

August 29th, 2011

At the moment I like to do my working/coding from the couch, but I listen to music via my iPod Touch, which runs the Spotify app and is connected to the stereo. Unfortunately, the iPod is inconveniently out of reach, so if I want to change the song or playlist that's playing, I'm out of luck (or rather: I have to get up and walk all the way over to the stereo). Of course, there are dedicated (aka pricey) solutions by Sonos with remote controls that provide this functionality, but I'm happy with my iPod. So what to do?

Spotify’s constant synchronization to the rescue: It’s trivially possible to remote control a running Spotify app using a playlist.

  1. First, I launch Spotify on both the iPod Touch and my laptop.
  2. I then create a new playlist (in this example called my_remote_playlist), add one or more songs and start the playback on the iPod.
  3. The contents of the playlist are constantly synchronized, so if I now add songs using Spotify on the laptop, all changes are reflected in the iPod's version of the playlist. Once the iPod finishes playing a song, it takes the next one from the latest version of the playlist (instead of the version it started with).

This provides me with a play queue where I can drop new songs at will and have them played immediately. Of course, this is just a workaround: there's no skipping of songs or stopping playback, and I always have to keep the playlist filled with songs, otherwise it runs out and I have to get up to press play again.
But the nice thing is that it works across all devices running Spotify (I also tried the Android version and a desktop PC).
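The trick behind this workaround can be shown with a toy model: the player always reads the next track from the current state of the shared playlist, so edits from any device take effect before the next song starts. A hypothetical sketch (this is not Spotify's API, just an illustration of the sync behavior):

```python
class SyncedPlaylist:
    """Toy model of a playlist that is kept in sync across devices."""

    def __init__(self, tracks):
        self.tracks = list(tracks)

    def add(self, track):
        # e.g., a song dropped in from the laptop while the iPod is playing
        self.tracks.append(track)

def next_track(playlist, position):
    """The player reads from the *latest* playlist state, not a snapshot."""
    if position < len(playlist.tracks):
        return playlist.tracks[position]
    return None  # playlist ran out: time to get up and press play again
```

As long as new tracks arrive before the current position catches up, the music never stops.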
And if you want to know what song you just heard, check out your listening history – don't you just love these hacks?

Where Am I?

You are currently browsing the technology category at Augenmusik.
