I want to calculate the angle between two triangles in 3D space. The two triangles will always share exactly two points, e.g.:
Triangle 1:
Point1 (x1, y1, z1),
Point2 (x2, y2, z2),
Point3 (x3, y3, z3).
Triangle 2:
Point1 (x1, y1, z1),
Point2 (x2, y2, z2),
Point4 (x4, y4, z4).
Is there a way to calculate the angle between them efficiently in CUDA?
For each plane, you need to construct its normal vector (perpendicular to all lines in that plane). The simple way to do that is to take the cross-product of two non-parallel edges of the triangle (e.g. (P3-P1) X (P2-P1) and (P4-P1) X (P2-P1)).
Normalize those.
The dot product of those two direction vectors gives you the cosine of the angle.
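A minimal sketch of those three steps as a CUDA device function, assuming the points arrive as `float3` values (the small vector helpers are written out here rather than relying on `helper_math.h` from the CUDA samples):

```cuda
__device__ float3 sub(float3 a, float3 b) {
    return make_float3(a.x - b.x, a.y - b.y, a.z - b.z);
}

__device__ float3 cross3(float3 a, float3 b) {
    return make_float3(a.y * b.z - a.z * b.y,
                       a.z * b.x - a.x * b.z,
                       a.x * b.y - a.y * b.x);
}

__device__ float dot3(float3 a, float3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Cosine of the angle between the planes of (P1,P2,P3) and (P1,P2,P4).
__device__ float planeCosAngle(float3 p1, float3 p2, float3 p3, float3 p4) {
    float3 e  = sub(p2, p1);             // shared edge P2 - P1
    float3 n1 = cross3(sub(p3, p1), e);  // normal of triangle 1
    float3 n2 = cross3(sub(p4, p1), e);  // normal of triangle 2
    float inv1 = rsqrtf(dot3(n1, n1));   // 1 / |n1|
    float inv2 = rsqrtf(dot3(n2, n2));   // 1 / |n2|
    return dot3(n1, n2) * inv1 * inv2;   // cos(theta); acosf() gives the angle
}
```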
The tricky bit is to watch out for degenerate triangles! If all 3 points defining either triangle are collinear (that triangle is just a line), then what you're asking for is undefined: the cross-product will be the zero vector, and normalizing it will divide by zero. You need to decide what you're going to do in that case.
Since you're trying to do this on a GPU, you'll ideally want to write this function without any branches if you're concerned about efficiency. That means instead of testing for degenerate triangles with an if clause, you should try to do it with a ternary A ? B : C.
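Reusing the helpers from the sketch above, here's one possible way to fold that guard in without an if: clamp the squared normal lengths with fmaxf so rsqrtf never divides by zero, and use a ternary to mark degenerate input. Both the epsilon and the NaN sentinel are assumptions you'd tune for your data.

```cuda
__device__ float planeCosAngleSafe(float3 p1, float3 p2, float3 p3, float3 p4) {
    const float eps = 1e-12f;            // assumed tolerance for "degenerate"
    float3 e  = sub(p2, p1);
    float3 n1 = cross3(sub(p3, p1), e);
    float3 n2 = cross3(sub(p4, p1), e);
    float len1sq = dot3(n1, n1);
    float len2sq = dot3(n2, n2);

    // fmaxf keeps rsqrtf away from zero, so no Inf/NaN from the division itself
    float c = dot3(n1, n2) * rsqrtf(fmaxf(len1sq, eps))
                           * rsqrtf(fmaxf(len2sq, eps));

    // Ternary select instead of an if: degenerate triangles yield NaN,
    // otherwise the cosine is clamped to [-1, 1] so acosf() stays valid.
    bool degenerate = (len1sq < eps) || (len2sq < eps);
    return degenerate ? nanf("") : fminf(fmaxf(c, -1.0f), 1.0f);
}
```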