Earlier this week I came across an interesting research video submitted to the CVPR 2013 conference. https://vimeo.com/68212573
This is one of the first examples I can recall of someone using structure from motion (the technique Photosynth + other photogrammetry programs use to generate their 3D point clouds) in a panorama stitching application.
I've always wanted ICE to do this + said so on their forums, but haven't seen any signs of Microsoft pursuing this yet. The Photosynth mobile panorama app is another example of where I'd like to see this technique applied as I'm not pleased with the current low res (+ often poorly stitched) results.
Since it is virtually impossible to hold your phone's camera lens perfectly still and rotate yourself and the rest of the phone around the lens, it is far more practical to capture high res video + compute depth from it, which gives cleaner results both in 2D and for stereo viewing.
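At its core, computing depth from video comes down to triangulating matched features across frames once camera poses are recovered. Here's a minimal sketch of two-view linear (DLT) triangulation in Python with NumPy; the camera matrices and 3D point are made up for illustration, and this is not the actual pipeline from the paper:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the same point in each view.
    """
    # Each observation contributes two rows to a homogeneous linear system A X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two hypothetical cameras: one at the origin, one translated 1 unit along x.
K = np.eye(3)  # toy intrinsics for simplicity
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it by triangulation.
X_true = np.array([0.5, 0.2, 4.0])
p1 = P1 @ np.append(X_true, 1.0)
p2 = P2 @ np.append(X_true, 1.0)
x1, x2 = p1[:2] / p1[2], p2[:2] / p2[2]

X_est = triangulate(P1, P2, x1, x2)
print(np.round(X_est, 3))  # recovers X_true
```

A real pipeline would first estimate the camera poses themselves (feature matching + bundle adjustment) and run this over thousands of tracked points per frame, but the geometry is the same.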
More here: http://richardt.name/megastereo