I am trying to move the mouse programmatically between two coordinates, but I want the speed to stay consistent on both fast and slow machines. I saw this link here, but it doesn't guarantee an optimal, smooth, and visible cursor speed when simulating the move between two coordinates. Does anyone know a trick for determining good values for parameters like the delay and the number of steps on different machines? My first idea was to run a for-loop for a fixed number of iterations to gauge the machine's performance, then scale the parameters based on how long the loop took. Any ideas, or am I totally wrong about this? Thanks
You should make the motion a function of elapsed time rather than of loop iterations; that way the cursor covers the distance in the same wall-clock duration no matter how fast the machine is, and there is no need to benchmark it first. Starting with the answer at C# moving the mouse around realistically, and using the Stopwatch class to measure the elapsed time:
public void LinearSmoothMove(Point newPosition, TimeSpan duration)
{
    // Requires System.Diagnostics (Stopwatch), System.Drawing (Point, PointF) and System.Threading (Thread).
    // GetCursorPosition/SetCursorPosition are assumed helpers that wrap Cursor.Position (or the Win32 cursor APIs).
    Point start = GetCursorPosition();

    // Find the vector between start and newPosition
    double deltaX = newPosition.X - start.X;
    double deltaY = newPosition.Y - start.Y;

    // Start a timer so the motion is driven by elapsed time, not by iteration count
    Stopwatch stopwatch = new Stopwatch();
    stopwatch.Start();

    double timeFraction = 0.0;
    do
    {
        // Fraction of the requested duration that has elapsed, clamped to 1.0
        timeFraction = (double)stopwatch.Elapsed.Ticks / duration.Ticks;
        if (timeFraction > 1.0)
            timeFraction = 1.0;

        // Interpolate linearly between start and newPosition
        PointF curPoint = new PointF(
            (float)(start.X + timeFraction * deltaX),
            (float)(start.Y + timeFraction * deltaY));
        SetCursorPosition(Point.Round(curPoint));

        // The sleep interval only controls how often the cursor is redrawn;
        // the overall speed is fixed by the duration argument.
        Thread.Sleep(20);
    } while (timeFraction < 1.0);
}
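You would call it something like this (a minimal sketch; the target coordinates and the 500 ms duration are arbitrary values I picked for illustration):

// Move the cursor to (800, 600) over half a second of wall-clock time.
LinearSmoothMove(new Point(800, 600), TimeSpan.FromMilliseconds(500));

Because the position is computed from the stopwatch each iteration, a slower machine simply updates the cursor in coarser steps while still finishing in the same duration, so you never have to tune the delay or step count per machine.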