Ever since I watched Georg Klein's PTAM videos of interactive 3D tracking on an early-model iPhone (and later saw his contributions to the first version of your mobile panorama capture app, just over three years ago now: http://news.cnet.com/8301-10805_3-20025794-75.html#! ), I've wondered when we might see interactive Photosynth capture in the same way that your current mobile apps offer interactive panorama capture.
For my peers, here are Georg's videos that I mentioned: http://www.robots.ox.ac.uk/~gk/youtube.html
The point is not the 3D graphics drawn over the scene, per se, but the fact that a coordinate system is being established and tracked in real time.
In a Photosynth mobile app, the 3D graphics drawn over the scene could simply be low-res representations of the full-resolution photos you've taken so far (for a spin, wall, walk, or pano, with or without parallax), trailing out behind the viewfinder video in 3D.
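To make the idea concrete, here is a minimal sketch of the geometry involved: once the tracker gives you a camera-to-world pose (rotation R, translation t) for each photo you've captured, placing a low-res thumbnail of that photo in the shared coordinate system is just a matter of transforming a small quad from the camera frame into the world frame. The function name, quad size, and placement depth below are my own illustrative assumptions, not anything from an actual Photosynth or PTAM API.

```python
import numpy as np

def thumbnail_quad_world(R, t, width=0.2, aspect=0.75, depth=1.0):
    """Return the 4 world-space corners of a thumbnail quad for a captured photo.

    R, t  -- camera-to-world pose at capture time (3x3 rotation, 3-vector)
    width -- quad width in world units (arbitrary choice for illustration)
    aspect-- quad height/width ratio, e.g. 0.75 for a 4:3 photo
    depth -- how far in front of the lens to hang the quad
    """
    w, h = width / 2.0, width * aspect / 2.0
    # Quad corners in the camera frame, centered on the optical axis,
    # `depth` units in front of the lens (camera looks down +Z here).
    corners_cam = np.array([
        [-w, -h, depth],
        [ w, -h, depth],
        [ w,  h, depth],
        [-w,  h, depth],
    ])
    # Camera-to-world transform: X_world = R @ X_cam + t, applied per corner.
    return corners_cam @ R.T + t
```

Rendering those quads, textured with the captured images, behind the live viewfinder would give exactly the "trail of photos in 3D" effect described above, with each new capture snapping into the tracked coordinate system.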