In my current project, I am comparing different ML models based on their quality. Right now, I'd like to put that quality in the context of the time a model needs to train. I track quality using an F1 score and I also log the training time. I've been researching the best way to define some kind of time-quality ratio, but I'm unsure how to achieve that.
I've been thinking of creating a chart with the F1 scores on the y-axis and the time needed on the x-axis (or the other way around, I don't mind either, but this seemed to make the most sense), but I'm struggling to set that up in Google Sheets. My data currently looks something like this (all values are imagined and could vary):
First Dataset | Time (min) | Quality (F1 score) |
---|---|---|
Iteration 1 | 5 | 0 |
Iteration 2 | 8 | 0.1 |
Iteration 3 | 11 | 0.2 |
Iteration 4 | 21 | 0.5 |
Iteration 5 | 20 | 0.8 |
Iteration 6 | 21 | 1 |
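For what it's worth, the kind of time-quality ratio I have in mind could be sketched in plain Python; a simple F1-per-minute ratio is just one possible definition, not necessarily the right one:

```python
# Sample data from the table above (all values are imagined).
# Each entry maps an iteration to (training time in minutes, F1 score).
runs = {
    "Iteration 1": (5, 0.0),
    "Iteration 2": (8, 0.1),
    "Iteration 3": (11, 0.2),
    "Iteration 4": (21, 0.5),
    "Iteration 5": (20, 0.8),
    "Iteration 6": (21, 1.0),
}

# One possible time-quality ratio: F1 score gained per minute of training.
ratios = {name: f1 / minutes for name, (minutes, f1) in runs.items()}

# The iteration with the best quality-per-minute trade-off.
best = max(ratios, key=ratios.get)
```

With these sample numbers, Iteration 6 happens to score best (1.0 / 21 ≈ 0.048 F1 per minute), but the ranking obviously depends on how the ratio is defined.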
And I'd like a chart similar to this one (manually created in GeoGebra):
I'm aware I can manually pick my x-axis values, but I was wondering what the best way would be to achieve this, if it's possible at all.
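Outside of Sheets, the chart I'm after could be sketched in Python with matplotlib, just to make the intended axes concrete (time on x, F1 on y); this is an illustration of the target chart, not the Sheets solution I'm looking for:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Sample values from the table above (imagined data).
times = [5, 8, 11, 21, 20, 21]           # training time in minutes
f1_scores = [0, 0.1, 0.2, 0.5, 0.8, 1.0]  # F1 score per iteration

fig, ax = plt.subplots()
ax.scatter(times, f1_scores)
ax.set_xlabel("Time (min)")
ax.set_ylabel("Quality (F1 score)")
ax.set_title("F1 score vs. training time")
fig.savefig("time_vs_f1.png")
```

In Sheets terms this corresponds to a scatter chart with the time column as the x-axis series and the F1 column as the y-axis series.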