Python float precision: what is the finest time resolution that can be measured with Python?

2023-11-10


Consider a very simple timer;

import time

start = time.time()
end = time.time() - start

while end < 5:
    end = time.time() - start
    print(end)

How precise is this timer? That is, compared to a real-time clock, how synchronized and real-time is it?

Now for the real question:

What is the smallest scale of time that can be measured precisely with Python ?

Solution

This is entirely platform dependent. Use the timeit.default_timer() function; it returns the most precise timer available on your platform.

From the documentation:

Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()'s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise.

So, on Windows, you get microseconds; on Unix, you get whatever precision the platform can provide, which is usually (much) better than 1/100th of a second.
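As a rough illustration, here is a minimal sketch (assuming a standard CPython install) that rebuilds the five-second loop from the question on top of timeit.default_timer(), so it reads the most precise clock the platform offers:

import timeit

# timeit.default_timer picks the most precise clock for the platform,
# as described in the documentation quoted above.
timer = timeit.default_timer

start = timer()
elapsed = timer() - start

# Busy-wait for roughly five seconds, printing the elapsed time each pass.
while elapsed < 5:
    elapsed = timer() - start
    print(elapsed)

For what it's worth, on Python 3.3 and later timeit.default_timer is simply time.perf_counter, a monotonic clock whose resolution is typically far finer than 1/100th of a second.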
