I am currently working on a paper that develops an algorithm for computing the area of intersection between a rectangle and a triangle. In my conclusion I want to compare it to the well-known sweep-line algorithm.
The problem is that this algorithm does not scale in any variable (it always takes exactly one rectangle and one triangle), so I can't use asymptotic big-O notation.
What should I measure to compare the two? My current thought is to count all multiplications and to optimize both algorithms to use as few of them as possible.
Other ideas would be to count all arithmetic operations and/or all comparisons.
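For concreteness, here is a minimal sketch of the kind of instrumentation I have in mind (Python is only my working language here, and `CountedFloat` with its shared tally is a hypothetical name, not an existing library):

```python
from collections import Counter

class CountedFloat:
    """Hypothetical wrapper: every arithmetic operation bumps a shared tally."""
    tally = Counter()  # keys: "mul", "div", "add", "sub", "cmp"

    def __init__(self, value):
        self.value = float(value)

    def __add__(self, other):
        CountedFloat.tally["add"] += 1
        return CountedFloat(self.value + other.value)

    def __sub__(self, other):
        CountedFloat.tally["sub"] += 1
        return CountedFloat(self.value - other.value)

    def __mul__(self, other):
        CountedFloat.tally["mul"] += 1
        return CountedFloat(self.value * other.value)

    def __truediv__(self, other):
        CountedFloat.tally["div"] += 1
        return CountedFloat(self.value / other.value)

    def __lt__(self, other):
        CountedFloat.tally["cmp"] += 1
        return self.value < other.value
```

Running both algorithms on the same `CountedFloat` inputs would then give directly comparable per-operation counts.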
I'd be grateful for any tips or links to papers on this matter, since whenever I research this topic I only find references to big-O.
You can count arithmetic operations, in particular multiplications and divisions, which are the most expensive to perform; additions are cheap.
Technically, the "size" of your input can still scale with the number of digits in the coordinates. If the coordinates are (1, 0), (0, 1), etc., the computation will of course be faster than if the coordinates are (0.97659532947896743856826735964827, 0.597865467885453586248625678), etc., and you expect an exact answer for the area. So you can formulate the complexity of your algorithm as a function of n, the total number of digits in the input.
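To make that explicit: if the operation counts are fixed constants but the coordinates have $n$ digits, a standard back-of-the-envelope estimate for the total bit cost is (this framing is mine; $c_{\times}$ and $c_{+}$ stand for the operation counts)

$$\text{bit cost} \;\approx\; c_{\times}\cdot M(n) \;+\; c_{+}\cdot \Theta(n), \qquad M(n)=\Theta(n^{2})\ \text{for schoolbook multiplication},$$

where $c_{\times}$ is the number of multiplications and divisions and $c_{+}$ the number of additions, subtractions and comparisons.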
But accounting for this is likely the same as just counting the number of multiplications and divisions.
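As an illustration, a primitive that both a sweep line and a direct clipping approach typically rely on is the orientation test (the sign of a 2×2 determinant); run through the counting wrapper sketched in the question, it tallies its own cost:

```python
def orientation(ax, ay, bx, by, cx, cy):
    # Sign of the cross product (b - a) x (c - a).
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

# Reuses the hypothetical CountedFloat sketch from the question.
CountedFloat.tally.clear()
pts = [CountedFloat(v) for v in (0.0, 0.0, 1.0, 0.0, 0.5, 1.0)]
orientation(*pts)
print(dict(CountedFloat.tally))  # {'sub': 5, 'mul': 2}
```

Reporting the tallies separately per operation type lets you weight multiplications and divisions more heavily than additions, in line with the remark above.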