I know clock cycles vary between operating systems and settings. If I am writing code that wants to be relatively confident (at least 95% sure) of a sleep occurring what is the minimum time I could use for a sleep and be confident that any computer/os running the code will sleep?
Is there a way to guarantee a sleep of at least one 'clock cycle', regardless of how long that cycle is, in Java?
You should never try doing that. Ask yourself whether you really need to sleep for exactly one clock cycle. Tying your implementation to hardware timers is almost always a bad decision. Below are a few alternatives.
A number of libraries already implement the concept of frames per second (fps). Their implementations typically abstract away clock speeds and OS-specific limitations, so you can rely on them and stay platform agnostic.
This way you can tune your timing requirement by raising or lowering the fps.
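A minimal sketch of what such an fps abstraction boils down to, assuming a target of 60 fps (the `frameDurationNanos` helper and the loop body are illustrative, not from any particular library):

```java
// Sketch of a fixed-timestep loop. Game/graphics libraries (e.g. libGDX,
// LWJGL) provide a polished version of this abstraction for you.
public class FrameLoop {
    // Nanoseconds each frame should occupy at the given frame rate.
    static long frameDurationNanos(int fps) {
        return 1_000_000_000L / fps;
    }

    public static void main(String[] args) throws InterruptedException {
        long frameNanos = frameDurationNanos(60); // tweak fps to tune timing
        for (int frame = 0; frame < 3; frame++) {
            long start = System.nanoTime();
            // ... do one frame of work here ...
            long remaining = frameNanos - (System.nanoTime() - start);
            if (remaining > 0) {
                // Sleep only for the time left in this frame.
                Thread.sleep(remaining / 1_000_000L,
                             (int) (remaining % 1_000_000L));
            }
        }
    }
}
```

Raising the fps value shrinks the per-frame budget, which is the knob mentioned above.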
Why do you need to sleep for one cycle? That is an extremely small amount of time. If what you actually need is synchronization, use mutexes instead of timers.
Also, mutexes are usually implemented with hardware instructions, which guarantees their operations are atomic. If you really need to "sleep" for an infinitesimal time, you could lock a mutex and immediately unlock it. To be honest, any code you execute (unless it is a NOOP) is, by definition, similar to sleeping for a cycle. Even `tmp = 1 + 1` takes a couple of instructions.
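In Java terms, that lock-then-unlock idea could be sketched with `ReentrantLock` (note this burns a handful of instructions rather than sleeping for an exact cycle count):

```java
import java.util.concurrent.locks.ReentrantLock;

public class TinyDelay {
    public static void main(String[] args) {
        ReentrantLock lock = new ReentrantLock();
        // Acquiring and immediately releasing the lock executes a few
        // atomic hardware instructions, on the order of the "one cycle"
        // delay the question asks about.
        lock.lock();
        lock.unlock();

        int tmp = 1 + 1; // any trivial statement also takes a few cycles
        System.out.println(tmp);
    }
}
```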
I suggest the mutex. From your question, it is not clear why you need that sleep time.
If you need to wait for user interaction (or any external event, such as a request), lock the mutex and only unlock it when the event or user input becomes available.
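A sketch of that pattern in Java, using a `ReentrantLock` with a `Condition` (the `EventGate` class and its method names are made up for illustration):

```java
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

// Block a worker thread until an event arrives, instead of sleeping.
public class EventGate {
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition ready = lock.newCondition();
    private boolean eventArrived = false;

    void awaitEvent() throws InterruptedException {
        lock.lock();
        try {
            while (!eventArrived) { // guard against spurious wakeups
                ready.await();
            }
        } finally {
            lock.unlock();
        }
    }

    void signalEvent() {
        lock.lock();
        try {
            eventArrived = true;
            ready.signalAll();
        } finally {
            lock.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        EventGate gate = new EventGate();
        Thread worker = new Thread(() -> {
            try {
                gate.awaitEvent();
                System.out.println("event handled");
            } catch (InterruptedException ignored) { }
        });
        worker.start();
        Thread.sleep(50);  // simulate the event arriving later
        gate.signalEvent();
        worker.join();
    }
}
```

The worker wakes up exactly when the event is signalled, so no timer tuning is needed at all.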
If you really want to go down the timer road, I suggest implementing a routine that runs a long `for` loop, then deriving your timer from the time it took to complete that loop.
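A rough sketch of that calibration idea, assuming `System.nanoTime` as the reference clock (the numbers it produces vary wildly across JITs, CPUs, and load, which is exactly why this approach is unreliable):

```java
// Derive an approximate per-iteration cost from a long busy loop.
public class LoopCalibration {
    static long nanosPerIteration(long iterations) {
        long start = System.nanoTime();
        long sink = 0;
        for (long i = 0; i < iterations; i++) {
            sink += i; // cheap work the JIT is less likely to eliminate
        }
        long elapsed = System.nanoTime() - start;
        // Reference 'sink' so the whole loop cannot be optimized away.
        if (sink == Long.MIN_VALUE) {
            System.out.println(sink);
        }
        return elapsed / iterations;
    }

    public static void main(String[] args) {
        long perIter = nanosPerIteration(100_000_000L);
        System.out.println("approx nanos per iteration: " + perIter);
    }
}
```

Treat the result as a ballpark figure only; re-run the calibration on each machine and expect it to drift as the JIT warms up.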
As I said earlier, I don't find this approach reliable, but it is something. I also suggest protecting this code with a mix of Monte Carlo reliability techniques and unit tests. You can read more about this in this article.
As a final note, beware of optimizations the compiler/interpreter may make that can throw off your timer (a busy loop with no observable effect can be removed entirely).