Consider a very simple timer:

import time

start = time.time()
end = time.time() - start
while end < 5:
    end = time.time() - start
print(end)
How precise is this timer? That is, compared to a real-time clock, how synchronized and real-time is it?
Now for the real question:
What is the smallest scale of time that can be measured precisely with Python?
Solution
This is entirely platform dependent. Use the timeit.default_timer() function, it'll return the most precise timer for your platform.
From the documentation:
Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()'s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise.
So, on Windows, you get microseconds, on Unix, you'll get whatever precision the platform can provide, which is usually (much) better than 1/100th of a second.
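As a sketch, the loop from the question can be made portable by swapping time.time() for timeit.default_timer() (which, on Python 3.3+, is simply an alias for time.perf_counter() on every platform). The target is shortened from 5 seconds to 0.01 s purely for illustration, and time.get_clock_info() is used to print the resolution the platform reports for the underlying clock:

```python
import time
import timeit

# timeit.default_timer picks the most precise timer for the platform;
# since Python 3.3 it is time.perf_counter everywhere.
TARGET = 0.01  # seconds; shortened from the question's 5 s for illustration

start = timeit.default_timer()
elapsed = timeit.default_timer() - start
while elapsed < TARGET:
    elapsed = timeit.default_timer() - start
print(elapsed)  # slightly more than TARGET

# The resolution the platform reports for this clock (in seconds):
print(time.get_clock_info("perf_counter").resolution)
```

Note that the reported resolution answers the "smallest measurable scale" question directly, but the practical precision also depends on loop overhead and OS scheduling.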