What's the worst resolution I can reasonably expect from System.nanoTime?

windows


Question: Under what circumstances might nanoTime return a value whose resolution is worse than one microsecond? Which commonly used operating systems, hardware, JVMs, etc. might this affect? Please provide sources if you can.

Asking for an exhaustive list of all possible circumstances under which that constraint will be violated is a bit much; nobody knows in which environments your software will run. But to see that it can happen, look at this blog post by Aleksey Shipilëv, where he describes a case in which nanoTime becomes less accurate (in terms of its own latency) than a microsecond on Windows machines due to contention.
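If you want a rough feel for how expensive nanoTime calls get on your own machine when several threads hammer them, a sketch along the lines below can help. A rigorous measurement would use a benchmark harness such as JMH (which is what Shipilëv's post relies on), so treat the thread count, iteration count, and naive timing loop here as illustrative assumptions only, not a proper benchmark.

```java
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class NanoTimeContention {
    public static void main(String[] args) throws Exception {
        // Arbitrary choices for illustration: one thread per core, a few million calls each.
        int threads = Runtime.getRuntime().availableProcessors();
        int iterations = 5_000_000;

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        CountDownLatch start = new CountDownLatch(1);
        ConcurrentLinkedQueue<Double> avgCostNs = new ConcurrentLinkedQueue<>();

        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                start.await();                        // release all threads at roughly the same time
                long sink = 0;
                long begin = System.nanoTime();
                for (int i = 0; i < iterations; i++) {
                    sink += System.nanoTime();        // the call under test
                }
                long elapsed = System.nanoTime() - begin;
                avgCostNs.add((double) elapsed / iterations);
                return sink;                          // keep the loop from being optimized away
            });
        }

        start.countDown();
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        avgCostNs.forEach(ns -> System.out.printf("average nanoTime cost: %.1f ns/call%n", ns));
    }
}
```

If the reported per-call cost climbs well past a microsecond as you add threads, you are seeing the kind of contention effect the post describes.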

Another case would be software running in a VM that emulates hardware clocks in a very coarse manner.

The specification has been left intentionally vague precisely because of such platform- and hardware-specific behaviors.

You can "reasonably expect" microsecond resolution only once you have verified that the hardware and operating system you are targeting actually provide it, and that any VM layer passes the necessary clock features through.
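As a quick sanity check of the granularity you actually observe (not a substitute for consulting the OS and hypervisor documentation), you could record the smallest nonzero step between consecutive readings. The class name and loop count below are assumptions for illustration:

```java
public class NanoTimeGranularity {
    public static void main(String[] args) {
        long smallestStep = Long.MAX_VALUE;
        long previous = System.nanoTime();
        // Sample consecutive readings and keep the smallest nonzero difference.
        for (int i = 0; i < 10_000_000; i++) {
            long now = System.nanoTime();
            long step = now - previous;
            if (step > 0 && step < smallestStep) {
                smallestStep = step;
            }
            previous = now;
        }
        System.out.println("Smallest observed nanoTime step: " + smallestStep + " ns");
    }
}
```

A result in the tens of nanoseconds suggests a fine-grained clock source; steps of hundreds of microseconds or more point at a coarse timer, which is the sort of thing you might see behind a VM with emulated clocks.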