Do you have an idea for an awesome feature we should add… or hate the way we’re currently doing something? Share your ideas and suggestions here.
we don't want your fluff iphone app image viewer nonsense. give us the actual 3D, that's the only thing interesting here and that you keep refusing us. it doesn't matter if it isn't perfect in the first version. and let us export it for a 3d editor.
if you keep being obtuse about it you risk someone like me implementing a similar SIFT-based 3D correspondence app with proper 3D reconstruction and putting you out of business. consider this your final warning.
if nothing else start with an option for saving the point cloud to a file as a show of good faith. but let a full-surface textured option follow quickly after.
Just so we're clear, Dan, I am NOT anyone official.
That said, I find your sudden appearance and subsequent ultimatum to be hilarious. "consider this your final warning." - Lol.
Maybe do your homework next time, big guy. The Photosynth team actually championed Binary Millennium's extraction of the individual point cloud binary files two years ago when Photosynth launched.
Christoph Hausner (our fellow Photosynth user) has created a simple to use point cloud exporter that (more recently) also converts the camera parameter data found in a synth's .json file to CSV. You can download his app here: http://synthexport.codeplex.com/
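For anyone scripting that conversion themselves instead of using Christoph's app, the flattening step itself is simple; here's a minimal Python sketch. Note that the JSON key names used below ("cameras", "position", "focal_length") are hypothetical placeholders — the actual layout of a synth's .json is exactly the part that had to be reverse engineered, so inspect your own file's keys first.

```python
import csv
import json

def cameras_to_csv(json_path, csv_path):
    # Flatten per-camera parameters from a synth's .json metadata into
    # one CSV row per camera. The key names below are HYPOTHETICAL
    # placeholders; check your own synth's .json for the real ones.
    with open(json_path) as f:
        data = json.load(f)
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["camera", "x", "y", "z", "focal_length"])
        for i, cam in enumerate(data["cameras"]):
            pos = cam["position"]
            writer.writerow([i, pos["x"], pos["y"], pos["z"],
                             cam["focal_length"]])
```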
Kean Walmsley of Autodesk also has a plugin for AutoCAD 2011 that imports Photosynth point clouds.
Nathanael, I'm aware there are hacks to get point clouds. champ.
As for denser point clouds obtained via multi-view stereo, oriented patches, or meshes: there is a small group of Photosynth users interested in converting Photosynth data to Bundler's format, so that CMVS and PMVS2 (from the University of Washington's research) can be run on Photosynth data. Feel free to check out what they've come up with here: http://synthexport.codeplex.com/Thread/View.aspx?ThreadId=204015
CMVS for Windows: http://francemapping.free.fr/Portfolio/Prog3D/CMVS.html
PMVS2 for Windows: http://francemapping.free.fr/Portfolio/Prog3D/PMVS2.html
As for creating your own SIFT-based 3D correspondence app: you may be more than capable, but why not simply make use of the GPL source code of Noah Snavely's Bundler (http://phototour.cs.washington.edu/bundler/) and modify it as you find necessary?
Now onto the viewer...
Ironically, the first Photosynth viewer was hardware accelerated and written in Direct3D - which would have made subsequent release of meshes created from point clouds and other scenarios more likely to be released right away.
The reaction from many people was that they didn't like that the Photosynth viewer required a standalone plugin. Again and again I watched people ask for the Photosynth viewer to be written in something like Flash or Silverlight (neither of which had much in the way of hardware accelerated 3D). The Photosynth team chose Silverlight as their development platform, but that means that they are now, to some degree, dependent on the Silverlight team to provide real 3D capabilities, apart from writing their own 3D graphics engine in C#. Silverlight 3 and 4 have hardware accelerated perspective 3D, which I assume works for the photo quads, but it doesn't have any hardware accelerated particle engine.
Having said that, the third party iSynth app consumes Photosynth data just fine and is written as a native iPhone app - using OpenGL (I think - not sure on which graphics library) to display the point cloud and photos.
With all the work that has been done by Binary Millennium, Christoph, etc. to understand Photosynth's point cloud and camera parameter format, and given how well documented the Deep Zoom Image format is, I see no reason why you couldn't create your own Photosynth viewer that adds extra graphical goodness. Why not write it against the WebGL spec so it will soon be able to run everywhere?
Here are some links to deep zoom image documentation:
Here are some links to open source deep zoom image viewers:
What would be truly cool to me is if you could get Bundler to output Photosynth's data formats.
To that end, here are some links to free to use deep zoom image generators:
Of course, the discussion that I previously linked to on Christoph's SynthExport page on CodePlex should point you in the right direction for understanding the differences between Photosynth's format and Bundler's format.
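For reference, the Bundler side of that conversion is plain text and documented with Bundler itself: a header comment, a camera/point count line, five lines per camera, then three lines per point (position, color, view list). A minimal Python sketch of pulling out just the point list might look like this:

```python
def read_bundle_points(path):
    # Parse the 3D point list out of a Bundler bundle.out file.
    # Layout: '# Bundle file v0.3' header, then '<num_cameras> <num_points>',
    # then 5 lines per camera (focal length + distortion, 3 rotation rows,
    # translation), then 3 lines per point (position, RGB color, view list).
    with open(path) as f:
        f.readline()                                # header comment
        num_cams, num_points = map(int, f.readline().split())
        for _ in range(num_cams * 5):               # skip the camera entries
            f.readline()
        points = []
        for _ in range(num_points):
            x, y, z = map(float, f.readline().split())
            r, g, b = map(int, f.readline().split())
            f.readline()                            # view list: which photos saw it
            points.append(((x, y, z), (r, g, b)))
    return points
```

Going the other direction (Bundler output into Photosynth's binary format) is the part that discussion thread is still working out.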
For some inspiration of other graphical techniques that you could add to your own viewer:
Back to the viewer topic for just a moment, you might also find the work at http://photocitygame.com to be of interest.
They chose to use Flash for their viewer, rather than Silverlight, and it currently only allows users to view the point cloud, rather than the photos as well, but you seem more interested in the model anyway. Here's a link: http://photocitygame.com/swisher.php?model_id=217 (you'll need to [Alt] + Drag to zoom in)
Other than that, I'd say, cross your fingers that Silverlight 5 has some real 3D support. PDC 2010 is coming up very soon and we should hear the first details of SL5 there. Apple's maps are rumored to be releasing this fall as well and I have no doubt that they'll be going the HTML5 Canvas + WebGL route, so the Silverlight guys definitely need to answer that.
Also important little details to keep in mind:
>> August 20, 2010 marks the two-year anniversary of Photosynth's release.
>> The Photosynth team has said repeatedly that they are focused on improving navigation in synths and have said that synths need to be as easy to navigate as Streetside (see: http://blogs.msdn.com/b/photosynth/archive/2010/03/18/buttery-smooth-gigapixel-panoramas.aspx ). That sounds rather telling to me.
>> Bing Maps has been very quiet all year. Last year that meant releasing a whole new Silverlight map control, which introduced the stitched-together Bird's Eye View mosaics, viewing individual Bird's Eye View shots like a Photosynth, Streetside panoramas, and Photosynths viewable on the map, and then early this spring Worldwide Telescope and a limited release of Flickr photos mapped onto Streetside imagery. I know that they must have had Navteq driving the Ultracam cars all over the roads, but we haven't seen any updates since the Olympics. To me, this spells out loud and clear: watch Bing Maps this fall. Last year it was November and December that Bing Maps and Photosynth really embraced, so I wouldn't be surprised to see something around the same timeframe this year.
Things I'm waiting to see:
:: I would love love love to see synth linking finally happen.
:: Seeing Streetside massively expand their coverage so that synthing Flickr (and web) photos to Streetside panos can really take off.
:: Seeing Photosynths on top of other map imagery, whether that be satellite, aerial, or street level.
:: Street Slide, as demoed at SIGGRAPH 2010. http://research.microsoft.com/en-us/um/people/kopf/street_slide/
:: Seeing web video of public spaces begin to be synthed onto Photosynths and Streetside imagery, similar to what Blaise and the guys at the lab (Drew Steedly, Georg Klein, + co.) demoed with live video at TED 2010. http://www.ted.com/talks/blaise_aguera.html
Like it or not, Silverlight is the cross platform way forward that Blaise chose as the architect of Bing Maps, forgoing the older Direct3D Bing Maps control for something that would work on Macs (and theoretically Linux - if the Moonlight team can keep up with SL) as well.
@Dan.Frederiksen, I see you replied in the middle of my posting there. If you haven't given Christoph's SynthExport a try, seriously do. It's nothing like the mess that earlier methods were.
You should then be able to export to a format that will work for the 3D modeling program of your choice to view and edit the point cloud as you wish.
Meshlab is a great free tool for viewing the point cloud. http://meshlab.sourceforge.net/
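If you ever end up scripting your own export instead of using SynthExport, ASCII PLY is probably the easiest format that MeshLab opens directly. A minimal sketch of writing a colored point cloud, assuming you already have the points as (position, color) tuples:

```python
def write_ply(path, points):
    # Write a colored point cloud as ASCII PLY, which MeshLab opens
    # directly. `points` is a list of ((x, y, z), (r, g, b)) tuples,
    # with colors as 0-255 integers.
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write("element vertex %d\n" % len(points))
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for (x, y, z), (r, g, b) in points:
            f.write("%f %f %f %d %d %d\n" % (x, y, z, r, g, b))
```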
I'm not sure how Blender handles opening point clouds, but again, it's free. http://www.blender.org/
Mark Willis has achieved great things using VRmesh Studio, which you may have already seen here: http://www.youtube.com/watch?v=9-sjaUjrTnw
For his workflow, see: http://photosynth.net/discussion.aspx?cat=6b63cb81-8b57-4d5d-a978-41d5509bf59a&dis=7b771e05-8a31-4109-8258-e97e4a9f41ae
For more of his videos, see: http://www.youtube.com/user/mdwillis01#p/u/16/
Mark is obviously very well informed, but ultimately just another Photosynth user like us. You can do what he did at your leisure.
If you are capable of coding up a replacement for Photosynth that delivers better results, then I am sure that you can follow a workflow built from publicly available parts that has already been figured out, demonstrated, and freely published.
Lastly, considering your interests in the model side of things, you should check out the Photosynth fan group on Ning: http://photosynth.ning.com/ created by Gary Mortimer. Several of the people who are interested in the modeling possibilities of Photosynth have joined over there, including Mark Willis, so come on over and visit if you have questions. Maybe you'll get more answers there.
thanks for all the info but I want the photosynth people to do it, asap. not join a fanclub for a bs pop up book website.
3d or bust
Well, if you watch the video of Blaise's keynote at Augmented Reality 2010 that I linked you to above, it is obvious that they are going to do it. It just isn't ready yet.
I've also communicated to you the current realities of a Silverlight viewer and pointed you to a resource to easily get a perfect copy of your point clouds (or others') as well as a collection of videos that talks at length about where all of this is going and approximate timeframes when that is probable to occur.
I think that everyone wants richer 3D reconstructions, but you are mistaken when you say that that is the only interesting thing. Photosynth will always be about the photos. If you just want a model, then go use ARC3D or Photomodeler while you wait for a future version of Photosynth and PhotoCity.
You aren't going to make it happen faster by throwing a tantrum and then insulting people who help you. Grow up.
Nathanael, what will make it happen faster then?
There's nothing wrong with your request... just the way you asked. Making sure that the Photosynth team and their parent group, Bing Maps, know that there is demand for the publicly available reconstructions to improve is a perfectly valid thing for you to do.
I'm sure that they are going as fast as they can to deliver textured meshes, etc. They have Google, Apple, Adobe, and the rest of the computer industry breathing down their necks, competing with them. It doesn't serve Photosynth's or Bing's or Microsoft-at-large's interests to let Google or Apple beat them to launch with full 3D. Notably, several of the research students who were doing work on dense reconstruction at the University of Washington have already been snatched up by Google. I'm sure that the Photosynth team is painfully aware of the need to release this sort of thing soon.
The downside to closed source programs is that there *isn't* anything that we can do to speed things up.
Robert Scoble visited Microsoft yesterday and posted his thoughts over on Cinch: http://bit.ly/9za36Q
He doesn't specifically mention Photosynth, but he did say that he visited Research and saw some things that he couldn't talk about yet but that he expected to see on future episodes of CSI (Photosynth has been featured on CSI before). Towards the end, he also talks a little about how difficult it is for new, innovative things to be released at Microsoft. That may not apply directly to Photosynth, as they're already established and expected to make progress, but it's food for thought, anyway.
Going back for a moment, one thing that I'll say is that even though it's possible for Photosynth to generate more dense point clouds than it currently does, the parts of your synths that have giant gaps between points are also going to have giant gaps in the meshes. I recommend learning to shoot so that even the current sparse reconstructions are dense enough to mesh.
a scene doesn't have to be filled with dots to become a surface. there are more or less easy things that can be done to span a surface. it's only a problem as a hack. that's why photosynth has to do it.
I guess all I meant was that even in academic research on creating dense reconstructions, in areas where no geometry has been recovered in the sparse reconstruction, there is also missing geometry in the dense reconstruction.
This can be seen in Yasutaka Furukawa's work: http://www.cs.washington.edu/homes/furukawa/
and is the very focus of Michael Goesele's latest released work: http://www.gris.informatik.tu-darmstadt.de/~mgoesele/projects/AmbientPointClouds.html
I'm not saying that a professional development team can't do better than a research lab, but whatever solution is arrived at, it will have to be an automatic one, not a hand-crafted filling of holes in the mesh.
It's also true that you usually see things done in research before they are brought into play in an actual product. In the Ambient Point Clouds work above, Drew Steedly and Rick Szeliski from the Photosynth and Photo Tourism teams are directly involved. Drew was the lead on creating today's synther, so if he is involved in a research project whose primary goal is to mask the missing geometry even after dense reconstruction has been completed, that ought to tip you off.
Changing gears, Josh Harle pointed me to a very nice piece of work by Henri Astre this morning that I think would be right up your alley if you can be patient with it:
thanks for the links but I still want them to do it. if I have to do it it will be to defeat their obtuseness and I have too many other things to correct in this crappy world.
I'm not saying they won't do it.
I think the Photosynth team and certainly the Bing Maps team intends to provide as close to full 3D reconstruction as possible. You want them to do it, I want them to do it, and many other people want them to do it... and they will. I mean, that's the whole point of the entire Computer Vision industry. I don't know that Photosynth will be eager to just let you snatch the models, but they've been forgiving enough with the point clouds.
I'm just not sure that it'll happen before the year is up (or even next year). That's all I ever meant to communicate. Well, that and there are ways to make this happen in the meantime if you really want it badly enough. I know you want a professionally crafted solution. I do too. I just meant that if history is anything to go by, this may be something of a long wait and I wanted to give you something to tide you over if you found you were starving while waiting for an official solution to ship.