JoeJ wrote: Ah, ok - then we can assume the positions and debug visuals are correct and continue with orientation

One thing is - you are using a position over a pole because Newton and visual space seem to match up there.
How can you be sure about this?
The problem I see is, exactly over a pole the visual space orientation is undefined: we can only build an up vector there, the tangents are undefined, and the renderer will use random tangents we don't know.
You would need to keep a bit of distance to the pole axis to avoid that.
You are correct - the camera system does act a little wonky when the object it is looking at is exactly above the North Pole - but I think the Simulator has some internal code that guards against complete instability. But more to the point:
At this point, I am not dealing with visual (sim) space orientation. I am not using Newton's orientation to adjust the Sim orientation. I am using the Newton orientation (which is stable) to calculate the positions of the Debug Objects (3 spheres) that indicate the Newton body's current orientation. True, I am using a conversion utility to convert from Cartesian to spherical, but that's just relatively simple trig.

For instance, I position Newton bodies 390 km above the North Pole, at x,y,z (0,0,(390 + EARTH_RAD) km), and the conversion to spherical Lat,Lon,Alt returns (I am paraphrasing here) something like (90,0,390km). The debug objects around the Newton body are placed along its principal axes at 10m distance, so when converted, I get (for one of the spheres, just as an example) something like Lat,Lon,Alt (89.99341, 0.000234, 390000.012). As I rotate the Newton body, the spheres change their Lat,Lon,Alt position, NOT their orientation.
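To be concrete, the Cartesian-to-spherical conversion I am talking about boils down to something like the sketch below (assuming a perfectly spherical Earth; Vec3, LLA and the function name are placeholders, not my actual code):

#include <cmath>

struct Vec3 { double x, y, z; };                    // placeholder vector type
struct LLA  { double latDeg, lonDeg, altMeters; };  // latitude/longitude/altitude result

// Convert a Cartesian position (z-axis through the poles) to Lat,Lon,Alt,
// assuming a perfectly spherical Earth of radius earthRadius (meters).
LLA CartesianToSpherical(const Vec3& p, double earthRadius)
{
    const double radToDeg = 180.0 / 3.14159265358979323846;
    const double r   = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
    const double lat = std::asin(p.z / r);    // -90..+90 degrees after conversion
    const double lon = std::atan2(p.y, p.x);  // measured from +X; swap the arguments
                                              // if your prime meridian sits on +Y
    return { lat * radToDeg, lon * radToDeg, r - earthRadius };
}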
But let me step back. Here is the list of steps I am going through to address this problem:
1: Establish Newton Cartesian and Simulator Spherical coordinate system parity
2: Establish Newton rotation to Simulator Euler angle relationship
3: Convert attitudes from Newton (ECEF) frame of reference to Simulator (NED) frame of reference
I thought step #1 was established and that object placement (but NOT orientation) was at parity between Newton and my Simulator... however, if you think there might be a problem with my reasoning, please let me know if there is a test case I can use to verify.
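In the meantime, the kind of round-trip check I can run myself for step #1 looks something like this (a sketch that reuses the placeholder types from the conversion sketch above; the constants are arbitrary test values, not my actual setup):

#include <cstdio>

// Inverse of CartesianToSpherical above, again assuming a spherical Earth.
Vec3 SphericalToCartesian(const LLA& g, double earthRadius)
{
    const double degToRad = 3.14159265358979323846 / 180.0;
    const double lat = g.latDeg * degToRad;
    const double lon = g.lonDeg * degToRad;
    const double r   = earthRadius + g.altMeters;
    return { r * std::cos(lat) * std::cos(lon),
             r * std::cos(lat) * std::sin(lon),
             r * std::sin(lat) };
}

// Round-trip parity check: pick a point, convert both ways, compare.
void CheckParity()
{
    const double earthRadius = 6371000.0;                  // mean radius, meters
    const Vec3   p = { 1234567.0, -2345678.0, 5432100.0 }; // arbitrary test point
    const LLA    g = CartesianToSpherical(p, earthRadius);
    const Vec3   q = SphericalToCartesian(g, earthRadius);
    const double err = std::sqrt((p.x - q.x) * (p.x - q.x) +
                                 (p.y - q.y) * (p.y - q.y) +
                                 (p.z - q.z) * (p.z - q.z));
    std::printf("round-trip error: %.6f m\n", err);        // should be ~0
}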
JoeJ wrote: However, I assume finally you need to create a reference orientation from the ship position and measure Euler angles in relation to that.
The reference should likely be constructed this way:
HeightAxis = (shipPos - earthCenter).Normalized()
LatitudeAxis = HeightAxis.Cross(earthPoleAxis).Normalized()
LongitudeAxis = LatitudeAxis.Cross(HeightAxis)
Do you agree?
From those axes you can build the reference orientation matrix, calculate the rotation from ship to reference, and get the final Euler angles from that rotation.
For the first step I would use your debug visuals to show those 3 reference axes instead of the ship orientation.
As the ship orbits the Earth, you can check if the vectors stay properly aligned to the Earth's lat/long grid,
and the flipping cases when crossing a pole should also show up.
When this works, I'd adjust the signs of the lat/long axes so they point towards increasing angles, assuming that's the proper convention.
But more importantly, make sure the reference system has the same handedness as the ship orientation, otherwise the Euler angles will be garbage.
EDIT:
I'm not sure I'm right with my 'reference orientation relative to Earth' idea.
The problem is, terms like yaw, pitch & roll make sense in relation to an airplane,
but do they make sense in relation to the Earth's lat/long grid?
Are there any specs for your visual system explaining how the expected angles are defined?
Well, yes, I think I know what you're getting at, and this is the gist of the whole problem; it has to do with step #3. Newton deals with a pure Cartesian coordinate system. If you take the Earth, make the z-axis go through its poles, and have the positive Y axis connect the Earth's center to the intersection of the equator and the prime meridian, that is called an Earth-Centered, Earth-Fixed (ECEF) system. If you look at 2 airplanes, both with roll, pitch, yaw at 0,0,0, flying above the Earth's surface, one above the North Pole and the other above the South Pole, the airplane at the South Pole will look upside down (in ECEF terms, relative to the one at the North Pole).
However, to each pilot, the airplane is in a perfectly wings-level, nose-level attitude. That's because each has its own coordinate reference frame, which is oriented depending on where on Earth it is positioned. This coordinate reference frame is called NED (North-East-Down).
My Simulator takes attitude parameters ONLY in NED format. Newton, being a pure 3D Cartesian coordinate system, is in ECEF format. My task is to convert from ECEF to NED, and feed the NED values to my Simulator.
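In matrix form, that local reference frame is just the standard ECEF-to-NED rotation built from latitude and longitude; it is essentially the same thing as the three axes you constructed above. A sketch, assuming the standard ECEF convention (X toward the prime meridian, Z toward the North Pole) and a placeholder Mat3 type; if the axes are mapped differently, the rows just get permuted:

// (reuses <cmath> from the earlier conversion sketch)
// 3x3 rotation matrix, row-major: the rows are the North, East and Down
// basis vectors expressed in ECEF coordinates.
struct Mat3 { double m[3][3]; };

// Build the ECEF -> NED rotation for a given latitude/longitude (radians).
Mat3 EcefToNedRotation(double lat, double lon)
{
    const double sLat = std::sin(lat), cLat = std::cos(lat);
    const double sLon = std::sin(lon), cLon = std::cos(lon);
    return { {
        { -sLat * cLon, -sLat * sLon,  cLat },   // North
        { -sLon,         cLon,         0.0  },   // East
        { -cLat * cLon, -cLat * sLon, -sLat },   // Down (toward Earth center)
    } };
}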
I am at stage #2, so for now I am staying away from worrying about converting the orientations from Newton to Sim (display engine). I actually have code for this conversion: I found a scientific paper on it here (PDF download).
This is, in reality, a conversion commonly used in space dynamics applications. The paper explains the problem, with diagrams of the conversion process. The two coordinate systems are, again, referred to as ECEF (Earth-Centered, Earth-Fixed) and NED (North-East-Down). I have taken the method described in the paper and written a procedure to convert from ECEF to NED (which I will need) and also the other way around, from NED to ECEF. However, to test it, I need to make sure that my coordinate systems and orientations are set up as expected.
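For what it's worth, the skeleton of that procedure amounts to composing the two rotations and then extracting the angles, roughly like this (a sketch rather than the paper's exact formulation; it assumes the Simulator wants an aerospace yaw-pitch-roll, i.e. Z-Y-X, sequence, and reuses Mat3 and EcefToNedRotation from the sketch above):

struct Euler { double yawDeg, pitchDeg, rollDeg; };

// Multiply two row-major 3x3 matrices: out = a * b.
Mat3 Mul(const Mat3& a, const Mat3& b)
{
    Mat3 out{};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            for (int k = 0; k < 3; ++k)
                out.m[r][c] += a.m[r][k] * b.m[k][c];
    return out;
}

// Convert a body attitude given in ECEF into NED yaw/pitch/roll.
// bodyToEcef maps body-frame vectors into ECEF (columns = body axes in ECEF).
Euler EcefAttitudeToNed(const Mat3& bodyToEcef, double lat, double lon)
{
    const Mat3 bodyToNed = Mul(EcefToNedRotation(lat, lon), bodyToEcef);

    const double yaw   = std::atan2(bodyToNed.m[1][0], bodyToNed.m[0][0]);
    const double pitch = -std::asin(bodyToNed.m[2][0]);
    const double roll  = std::atan2(bodyToNed.m[2][1], bodyToNed.m[2][2]);

    const double radToDeg = 180.0 / 3.14159265358979323846;
    return { yaw * radToDeg, pitch * radToDeg, roll * radToDeg };
}

(The usual gimbal-lock caveat applies when pitch is near +/-90 degrees.)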
I also have a quite simple, bullet-proof verification approach to this problem. Assume a spacecraft with a fixed attitude (that is, non-rotating) is orbiting the Earth:
NED mode (the mode the Simulator is in): the horizon (and the Earth itself) will be fixed in relation to the spacecraft
ECEF mode: the stars will be fixed in relation to the spacecraft
When I see the stars not moving through the porthole of my spacecraft, whatever the spacecraft's orbit is, I'll know I've performed the ECEF->NED conversion correctly.
Misho