Topic: Panorama questions about future capabilities

cgeorge (Over 1 year ago)
I have a number of questions regarding the capabilities, or possible future capabilities, of ICE.

Firstly, I was wondering if there are any plans to link panoramas? It would be great if you could select a point in a panorama and then zoom to that area, where another panorama made of that area loads in the same frame, kind of like Google Street View, I guess.

Another question I have is about the possibility of photo-stitching images in ICE so that instead of creating a 360 inside looking out, you can create a 360 degree view from around an object, outside looking in so to speak. I know this can be done in Photosynth, but that does not have the seamlessness of panoramas. It seems that this shouldn't be too difficult to do, just invert the curve of the stitched image.

I think these two things would make fantastic additions to ICE and would create an even greater interactive experience.
Nathanael (Over 1 year ago)
I would love to see linked panoramas as well. That would essentially allow people to create Streetside panoramas for their own street if Bing is taking too long to send a car to their neighborhood. Power to the people, right?

As to future capabilities of ICE, you'll probably have better luck getting answers over on the ICE forum:
Nathanael (Over 1 year ago)
I will comment on your suggestion for inverse panoramas, though. Traditionally this concept has been called "Object Movies", but mapping a texture map onto a cylinder doesn't really work when looking inward. 

Back when QuickTime VR was released, it had a panorama mode, which used a cube projection, just as Photosynth panoramas and Bing Streetside panoramas do today, and an object movie mode, where frames were taken at equal intervals all around an object via a turntable and tripod or a similar rig. The result was something that you could spin around smoothly and get a good illusion of dimension from, but the illusion breaks if you attempt to merge the frames. The end result is really nothing more than a video which you can play in multiple directions at multiple speeds, depending on how fast you move your mouse.
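That "video you can play in multiple directions" idea boils down to mapping mouse movement onto a ring of pre-captured frames. Here's a minimal sketch in Python; the function name, the 36-frames-per-orbit count, and the drag sensitivity are all illustrative assumptions, not QTVR's actual implementation:

```python
# An "object movie" viewer's core: the illusion of rotating an object
# is just indexing into a ring of frames captured around a turntable.
# frames_per_orbit and pixels_per_frame are illustrative values.

def frame_for_drag(start_frame: int, drag_px: float,
                   frames_per_orbit: int = 36,
                   pixels_per_frame: float = 10.0) -> int:
    """Map a horizontal mouse drag (in pixels) to a frame index,
    wrapping around so the object can spin indefinitely."""
    steps = round(drag_px / pixels_per_frame)
    return (start_frame + steps) % frames_per_orbit

# Dragging right advances frames; dragging left wraps around the orbit.
print(frame_for_drag(0, -10))   # -> 35
print(frame_for_drag(35, 20))   # -> 1
```

Because no frames are ever blended, playback stays crisp at any speed, which is exactly why the "stitch" Nathanael describes next is impossible in 2D.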
Nathanael (Over 1 year ago)
Because object movies are constructed from a constantly changing camera perspective, it is impossible to create any sort of satisfactory two-dimensional 'stitch' of those disparate views. The only sort of stitching that will satisfy must be at least three-dimensional, and this is what the point cloud in Photosynth is a beginning of. Computer vision has long been interested in generating a full model from nothing but photographs, and although other software exists that is dedicated to the modeling side of things and generates an actual textured 3D mesh rather than only a point cloud, Photosynth remains dedicated to the source images.

Here's a gallery of QTVR object movies:
They are constructed from a single orbit, but there is no particular reason that you could not expand the coverage to be fully spherical, allowing you to view the object from a far greater range of angles.
Nathanael (Over 1 year ago)
One service which specializes in creating a mesh from a series of input photos, rather than only a point cloud, is the Automatic Reconstruction Unit:

Automatic reconstructions from Photosynth will certainly improve over time, though. 

See Yasutaka Furukawa's work entitled: Towards Internet-scale Multi-view Stereo

For evidence that Photosynth is also looking at dense reconstruction (as they are part of Bing Maps), see this video:

In years to come, as surface reconstructions become less prone to error and the reflective and illuminative properties of surfaces can be calculated as well, the 3D model will become much closer to photo-realistic and will be able to be re-lit, etc. I am also interested to see what impact stereo and plenoptic lenses on cameras will have on computational photography, as cameras begin to record light fields rather than simple bitmaps.
Nathanael (Over 1 year ago)
Linking panoramas was asked about on the old Photosynth support forum not long ago and the question was answered by Photosynth Developer Lead, Bert Molinari here:
Nathanael (Over 1 year ago)
I meant to point out that QTVR object movies have something of a strength in that they download all the data before allowing you to view anything. This means that you can flip the object as fast as you please and it will always remain crisp. 

The tradeoff is that object movies have a very limited range of motion and are often very low resolution. Photosynth's use of Seadragon allows a panorama or individual photo to be as high resolution as you please, but because Photosynth only fetches the necessary pieces of the photo you're currently looking at, after you've zoomed or panned to it, it will never be as crisp and snappy as QTVR unless all of the thumbnails for an orbit are downloaded ahead of time and the photographer has taken pains to ensure good, even coverage in their orbit.
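The fetch-only-what's-visible behaviour described above can be sketched as a simple intersection test: given a viewport rectangle over one level of the image pyramid, work out which fixed-size tiles need to be requested. This is a hypothetical illustration of the general Deep Zoom idea, not Seadragon's actual code; the 256 px tile size and all names are assumptions:

```python
# Sketch of Deep Zoom-style tile selection: fetch only the tiles of a
# pyramid level that intersect the current viewport, clamped to the
# level's bounds. All coordinates are integer pixels at that level.

def visible_tiles(view_x, view_y, view_w, view_h, level_w, level_h, tile=256):
    """Return (col, row) pairs for every tile of one pyramid level
    that intersects the viewport rectangle."""
    c0 = max(0, view_x // tile)
    r0 = max(0, view_y // tile)
    c1 = min((level_w - 1) // tile, (view_x + view_w - 1) // tile)
    r1 = min((level_h - 1) // tile, (view_y + view_h - 1) // tile)
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

# A 300x300 viewport at (200, 200) over a 1024x1024 level touches
# only 4 of that level's 16 tiles:
print(visible_tiles(200, 200, 300, 300, 1024, 1024))
# -> [(0, 0), (1, 0), (0, 1), (1, 1)]
```

Pre-fetching every thumbnail in an orbit, as Nathanael suggests, is just the degenerate case where you request all tiles of the relevant levels up front instead of waiting for the viewport to demand them.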
cgeorge (Over 1 year ago)
Hi Nathanael,

Thanks for the detailed reply. From the link you sent about intentions to merge panoramas, it seems that there is no plan to do this. I hope I'm wrong, though. It doesn't seem that it would be a difficult thing to do, and it would be a brilliant addition. Since parts of panoramas can already have highlights attached that change the behaviour of the panorama, surely it wouldn't be difficult for a part of the panorama to be tagged so that, when clicked, it loads another panorama. Anyway, hopefully it's coming, as it would be a great addition.
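For what it's worth, the hotspot-linking idea described above is straightforward to sketch: tag a region of the panorama (here by yaw/pitch ranges in degrees) with the ID of the panorama to load when that region is clicked. Everything here (Hotspot, find_link, the sample IDs) is invented for illustration and is not part of any Photosynth API:

```python
# Hypothetical sketch of linked panoramas via clickable hotspots:
# each hotspot covers a yaw/pitch region and names a target panorama.

from dataclasses import dataclass

@dataclass
class Hotspot:
    yaw_min: float    # degrees, 0..360 around the horizon
    yaw_max: float
    pitch_min: float  # degrees, -90 (down) .. 90 (up)
    pitch_max: float
    target_pano: str  # ID of the panorama to load on click

def find_link(hotspots, yaw, pitch):
    """Return the target panorama ID for a click direction, or None."""
    for h in hotspots:
        if h.yaw_min <= yaw <= h.yaw_max and h.pitch_min <= pitch <= h.pitch_max:
            return h.target_pano
    return None

spots = [Hotspot(80, 100, -10, 10, "pano_down_the_street")]
print(find_link(spots, 90, 0))    # -> pano_down_the_street
print(find_link(spots, 200, 0))   # -> None
```

A real implementation would also need to handle hotspots that straddle the 0/360 degree seam, but the lookup itself really is this simple, which supports cgeorge's point that the feature would not be hard to build on top of existing highlight regions.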
NateLawrence (6 months ago)
To anyone who finds this old discussion: sadly, Photosynth never shipped the ability to link between multiple spherical panoramas. As for spin movies or object movies, though, back in December of 2013 they launched the Photosynth 2 Technical Preview, and of the four types of new synths (Spins, Walks, Walls, and Pans), definitely check out the 'Spin' type.

Here's a video and the manual to help you understand how to shoot for Photosynth 2:

And here's the Photosynth 2 homepage: and upload page:

I hope to see your creations in the recent synths newsfeeds: