Topic: Photosynth coordinate system

brakar (Over 1 year ago)

I am looking into point clouds derived from photosynths for further processing. However, I am having a hard time understanding the coordinate system that Photosynth uses. By looking at different synths, it appears to me that the x, y, and z axes are oriented arbitrarily relative to each other. Is this correct, or is there a logic that I am not seeing?

Any help will be most welcome.
Nathanael (Over 1 year ago)
Greetings, brakar, 

When Photosynth begins a reconstruction, it starts with a pair of images and iteratively adds other photos as it is able, without any real knowledge of which way is which. Somewhere in the process, each image grouped into a cluster is queried as to which way its top faces, and the consensus of the majority of the photos defines which way is 'up' for the point cloud, but I'm not certain whether this transforms the coordinate system or not.

I'm also wondering whether you're working with synths that are less than 100% synthy and then merging all of the different point clouds together, rather than keeping them as separate pieces, independently positionable.
Nathanael (Over 1 year ago)
The binary files are named according to which coordinate system in a synth they belong to. For example, the first bin file will be named points_0_0.bin, which signifies that it belongs to the first point cloud in the synth and is the first part of that point cloud. The point clouds are stored in groups of 5,000 points apiece, so the number of pieces in each point cloud is simply the total number of points in that coordinate system divided by 5,000 (rounded up). The second piece of the first point cloud would be named points_0_1.bin, and so forth.

In a 100% synthy synth, you should have only a single coordinate system, so all bin files should be merged. But where you see bin files which begin with points_1_0.bin or points_2_0.bin, etc., these will be the first piece of a second or third point cloud, created when not all the images were able to be fit together.
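That naming scheme can be sketched in a few lines. This is just a hypothetical helper to illustrate the convention, not code from Photosynth itself:

```python
import math

def bin_file_names(points_per_cloud, chunk_size=5000):
    """List the expected points_<cloud>_<part>.bin file names, given the
    total point count of each coordinate system (point cloud) in a synth."""
    names = []
    for cloud_index, total in enumerate(points_per_cloud):
        parts = math.ceil(total / chunk_size)  # 5,000 points per file
        for part_index in range(parts):
            names.append(f"points_{cloud_index}_{part_index}.bin")
    return names

# A synth with two point clouds: 12,000 points (3 files) and 4,200 points (1 file)
print(bin_file_names([12000, 4200]))
# → ['points_0_0.bin', 'points_0_1.bin', 'points_0_2.bin', 'points_1_0.bin']
```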

Apologies if I'm telling you things you already know.
Nathanael (Over 1 year ago)
Once you have the point cloud in a 3D modeling program, however, you can rotate the point cloud so that it is oriented correctly and save it.

Nathan Craig has written an article on his web log about how to transform Photosynth coordinates to real world coordinates.

You may also find this paper from Guenter Pomaska to be of use: Utilization of Photosynth Point Clouds for 3D Object Reconstruction

There's some discussion from Wonmin Lee and others here about how to get targets in the point cloud which have been geo-referenced at the time of the photo shoot:
Nathanael (Over 1 year ago)
Most recently Cesar Lopez has talked about the benefits that using Yasutaka Furukawa's PMVS2 to densify the point cloud brings to being able to identify the geo-referenced markers.

Hopefully that sheds a little light on things. Feel free to clarify if I'm not understanding what you're asking (and Photosynth Team members, if you're reading this, please feel free to jump in and do a better job of explaining than what my limited knowledge on the subject can provide).
Nathanael (Over 1 year ago)
Three utilities which I would suggest for collecting point clouds, if you're not already using them, are as follows: 




Nathanael (Over 1 year ago)
Whoops. I didn't get all of that last link in there. Here it is:

I should probably also point you to a follow up post from Nathan Craig about how real world coordinates can prove bad for graphics performance:
brakar (Over 1 year ago)
Nathanael, thank you very much for your reply. There were a few very interesting links that I had not come across. And no, I am not offended by being explained a few things I already knew. (I prefer simple explanations when there are things I don't understand. Once I have a general overview, I can usually dig into the rest myself. I don't understand why people get offended by having things explained in a simple way.)

However, I was not able to upload images here, so I uploaded a few photos at:  to illustrate what I am wondering about.

In short, in Photosynth, does X describe the right-hand side of the screen, Y the axis toward the person in front of the screen, and Z up... or???
brakar (Over 1 year ago)
After some more thinking, I guess my difficulties are caused by mirrored axes. Then the question to ask would be: is Photosynth using a right- or left-handed coordinate system?
douglas (Over 1 year ago)
Hi brakar,
  I believe the coord system output by the synther is:

Up => (0, -1, 0)
Side => (1, 0, 0)
Look => (0, 0, 1)

Does that help?
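For what it's worth, the handedness brakar asked about can be checked directly from these three vectors with a scalar triple product. A minimal sketch, assuming the axes are ordered as (Side, Up, Look); the ordering is my assumption, and a different ordering flips the sign:

```python
import numpy as np

# Basis vectors as posted by douglas for the synther's output.
up   = np.array([0.0, -1.0, 0.0])
side = np.array([1.0,  0.0, 0.0])
look = np.array([0.0,  0.0, 1.0])

# The sign of side · (up × look) gives the handedness of the ordered
# basis (side, up, look): +1 is right-handed, -1 is left-handed.
handedness = float(np.dot(side, np.cross(up, look)))
print(handedness)  # -1.0 → left-handed under this assumed ordering
```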
brakar (Over 1 year ago)
Thank you douglas, this is of great help. (I do have at least one point cloud that indicates that the "Side =>" and "Look =>" axes are reversed from what you write, but that may have to do with things other than Photosynth, e.g. the downloader or viewer I am using.)

I did, however, discover an error in my rotation algorithm. As soon as I have sorted that out, I will start experimenting on more point clouds. The important thing for me to know is whether there exists a "fixed" logic or not with regard to the orientation of the coordinate axes.

Thanks again,

brakar (Over 1 year ago)
I finally got my software to work. (Only tested on one data set so far.)

What initially fooled me was that the point cloud viewer I used (CloudCompare) actually reversed the east coordinates, so the point cloud was displayed mirrored.

With regard to the Photosynth coordinate system, what I found to work was:
North => (0, 1, 0)
East => (-1, 0, 0)
Elev => (0, 0, 1)
(Which is probably the same as douglas described, with the map/point cloud just rotated 180 degrees about the elev/Z-axis.)
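The 180-degree rotation about the Z axis mentioned in the parenthetical can be written out explicitly. A small sketch showing that douglas's Side axis (1, 0, 0) lands on the East axis (-1, 0, 0) under that rotation; whether this fully reconciles the two frames is brakar's guess, not something verified here:

```python
import numpy as np

# Rotation by 180 degrees about the Z (elev) axis: (x, y, z) -> (-x, -y, z).
Rz180 = np.array([[-1,  0, 0],
                  [ 0, -1, 0],
                  [ 0,  0, 1]])

side = np.array([1, 0, 0])   # douglas's "Side" axis
east = Rz180 @ side          # rotated into brakar's "East" axis
print(east)                  # [-1  0  0]
```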
douglas (Over 1 year ago)
Great to hear you got it working.  Would be great to see a link to what you are working on at some point in the future.

brakar (Over 1 year ago)
Here is a link to a description of the software I am working on:

PS: it would have been nice to be able to upload images at this forum too; after all, it's a forum about imagery ;-)
brakar (Over 1 year ago)
I just released PC-AffineTrans, which is a tool for transforming point clouds like the ones that can be extracted from Photosynth.
To perform the transformation, one needs to identify a set of points in the point cloud, which must be matched with points with known coordinates in another coordinate system, e.g. real-world UTM coordinates.

The tool can be downloaded here:
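The core idea behind such a tool, fitting an affine transform from matched point pairs, can be sketched with a least-squares solve. This is a generic illustration under my own assumptions, not the actual PC-AffineTrans implementation:

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Least-squares fit of a 3D affine transform (A, t) such that
    dst ≈ src @ A.T + t, from matched point pairs (needs >= 4 points
    not all in one plane). Generic sketch, not PC-AffineTrans itself."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])      # homogeneous [x y z 1]
    M, *_ = np.linalg.lstsq(X, dst, rcond=None)  # solve X @ M ≈ dst
    A = M[:3].T   # 3x3 linear part (rotation/scale/shear)
    t = M[3]      # translation
    return A, t

# Example: points shifted by a pure translation of (10, 20, 30).
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
dst = src + np.array([10, 20, 30])
A, t = fit_affine_3d(src, dst)
print(np.round(t))  # [10. 20. 30.]
```

In a real georeferencing workflow the `dst` points would be the surveyed UTM coordinates of the identified markers, and the fitted transform would then be applied to the whole point cloud.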
NateLawrence (Over 1 year ago)
Jorn, have you had any contact with Mark Willis? Perhaps I don't understand well what you two are up to, but it seems as though your tool would mesh well with his work with DEMs:
brakar (Over 1 year ago)
Nate, I am well aware of Mark Willis' work related to DEMs, and yes, my work is closely related to his. My work can more or less be seen as an effort to simplify some of the steps described in his tutorial, found here:

In this regard, PC-AffineTrans might be a (simpler and faster) alternative to the JAG3D software. I am, however, also working on simplifying some of the other steps described in his tutorial, and I plan to release some new tools soon. Then I will hopefully be able to explain my ideas in a more understandable and coherent way.
ajvanloon (Over 1 year ago)
I don't understand your discussion at all, but that's down to me; I am absolutely not a specialist. Nevertheless, this may be a point to take along in your discussion:
As a user I would like to know: can I see (or put) the compass points on a 360-degree panorama? Currently, when I geotag a synth after uploading, I am asked to align the image with the (Bing) map, but nothing changes on the panorama itself. Especially for a viewer, a compass bearing would be useful for extra orientation.