Do you have an idea for an awesome feature we should add… or hate the way we’re currently doing something? Share your ideas and suggestions here.
While researching for a presentation I will be giving on Photosynth, I discovered several deprecated features from Photo Tourism and previous incarnations of Photosynth, namely:
- Automatic orbit and pan finder (accessible through thumbnail hyperlinks).
- Appearance-based selection (to deal with photos in a pan or orbit taken under widely different lighting, e.g. day vs. night).
- Adaptive color compensation (to deal with minor lighting differences in orbits and pans).
- Day and night browsing (filter images for day or night only).
- Frusta viewing.
- Object browsing: thumbnail hyperlinks to objects of interest (as identified by algorithm during feature extraction).
- Photo annotation, and automatic transfer of annotations to other photos containing the same object.
- Georeferencing: register photos to a DEM or other 3D terrain model (or even a 2D overhead image).
Frusta viewing has made something of a comeback in the overhead view, and is also present in Greg Pascale's iSynth app for iPhone.
What more do you want from the georeferencing feature request that is not already covered by the synth editor's Geo-Alignment? I have to assume that lining our point clouds up with satellite imagery is just paving the way to take advantage of the 'Skeletal Sets for Efficient Structure from Motion' and 'Building Rome in a Day' work that the University of Washington and Microsoft Research have been collaborating on. Presumably, whatever larger-scale bundle adjustment is performed on all the little synths in close proximity on the map could also take into consideration whatever satellite and aerial imagery is in the vicinity, and use an existing DEM as a strong prior to gain accuracy.
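To make the "DEM as a strong prior" idea concrete, here is a minimal sketch of how a bundle-adjustment cost might be augmented with a terrain penalty. Everything here is an assumption for illustration: `dem_height` is a toy stand-in for a real DEM lookup, and the weighting scheme is hypothetical, not anything from Photosynth or the cited papers.

```python
import numpy as np

def dem_height(x, y):
    """Toy stand-in for a DEM lookup: a gentle planar slope.
    (Hypothetical; a real system would interpolate elevation data.)"""
    return 0.1 * x + 0.05 * y

def cost(points, reproj_residuals, dem_weight=1.0):
    """Sum of squared reprojection residuals, plus a penalty pulling
    each 3D point's z toward the DEM elevation at its (x, y).
    A larger dem_weight means a stronger terrain prior."""
    reproj_term = np.sum(reproj_residuals ** 2)
    dem_term = sum((z - dem_height(x, y)) ** 2 for x, y, z in points)
    return reproj_term + dem_weight * dem_term

# Two toy points, one exactly on the toy terrain, plus small residuals.
points = [(0.0, 0.0, 0.0), (10.0, 0.0, 1.0)]
residuals = np.array([0.5, -0.5])
print(cost(points, residuals))  # 0.5: both points sit on the toy DEM
```

In a real pipeline this combined cost would be minimized jointly over cameras and points (e.g. with Levenberg-Marquardt); the sketch only shows how the prior enters the objective.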
You might also be interested in warping the photos to fit the point cloud when moving from one to the next (as seen in Photo Tourism and early Photosynth prototypes).
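One common way such transitions are implemented is to fit a planar proxy to the point cloud and warp each photo by the homography that plane induces between the two views. The sketch below only shows the mechanical step, applying a 3x3 homography to image points; the matrix here is the identity as a placeholder, since a real homography would come from the recovered cameras and plane, which this example does not model.

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of 2D points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    warped = (H @ pts_h.T).T
    return warped[:, :2] / warped[:, 2:3]              # back to Cartesian

# Placeholder: the identity homography leaves the image corners unchanged.
H = np.eye(3)
corners = np.array([[0.0, 0.0], [640.0, 0.0], [640.0, 480.0], [0.0, 480.0]])
print(apply_homography(H, corners))
```

Cross-fading between the warped source photo and the destination photo is what gives those prototypes their smooth "flying" transitions.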