I wrote a C# program that uses the mouse, and it works perfectly. I now want to add touch support. The problem is that I track the mouse extensively and use the Point and Point3D types throughout the program. With touch, I need to track where the "mouse" moves. I can get the coordinates of the touch, but that's where I get stuck. I am using the Helix 3D Toolkit, and most of its functions require a Point or Point3D input parameter. My question is: is there any way to convert a TouchPoint into a Point or Point3D? Or is there another "easy" way to implement touch support?
Here is a little code of what I'm trying to do:
private Point3D? GetPoints(TouchEventArgs e)
{
    var p = e.GetTouchPoint(ViewPort);
    var ray = Viewport3DHelper.Point2DtoRay3D(ViewPort.Viewport, p); // error here at p
    if (ray != null)
    {
        var pi = ray.PlaneIntersection(new Point3D(0, 0, 0), new Vector3D(0, 0, 1));
        if (pi.HasValue)
            return pi;
    }
    return null;
}
Figured it out. I just constructed a new Point() from p.Bounds.X and p.Bounds.Y.
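As an aside, WPF's TouchPoint also exposes a Position property, which is already a System.Windows.Point relative to the element passed to GetTouchPoint, so the detour through Bounds can be skipped. A minimal sketch of the method above using that property (assuming the same ViewPort field, a HelixViewport3D, as in the question):

```csharp
private Point3D? GetPoints(TouchEventArgs e)
{
    // TouchPoint.Position is a System.Windows.Point, so it can be passed
    // directly to Helix 3D Toolkit methods that expect a 2D point.
    Point p2d = e.GetTouchPoint(ViewPort).Position;

    // Unproject the 2D screen point into a 3D ray through the scene.
    var ray = Viewport3DHelper.Point2DtoRay3D(ViewPort.Viewport, p2d);
    if (ray != null)
    {
        // Intersect the ray with the z = 0 plane, as in the original code.
        var pi = ray.PlaneIntersection(new Point3D(0, 0, 0), new Vector3D(0, 0, 1));
        if (pi.HasValue)
            return pi;
    }
    return null;
}
```

Both approaches should give the same coordinates here, since Bounds.X/Bounds.Y is the top-left corner of the touch contact rectangle and Position is the touch's reported point; Position is just the more direct API for this purpose.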