C++ 我如何计算我的程序运行需要多少毫秒?
声明:本页面是StackOverFlow热门问题的中英对照翻译,遵循CC BY-SA 4.0协议,如果您需要使用它,必须同样遵循CC BY-SA许可,注明原文地址和作者信息,同时你必须将它归于原作者(不是我):StackOverFlow
原文地址: http://stackoverflow.com/questions/1516659/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me):
StackOverFlow
How do I count how many milliseconds it takes my program to run?
提问by SomeUser
This will show how many seconds:
这将显示多少秒:
#include <iostream>
#include <time.h>
using namespace std;
int main(void)
{
    time_t times, timed;          // time() returns time_t, not int
    times = time(NULL);
    //CODE HERE
    timed = time(NULL);
    cout << "time from start to end: " << timed - times << endl;
}
This will show how many ticks:
这将显示有多少滴答声:
#include <iostream>
#include <time.h>
using namespace std;
int main(void)
{
    clock_t times, timed;         // clock() returns clock_t, not int
    times = clock();
    //CODE HERE
    timed = clock();
    cout << "ticks from start to end: " << timed - times << endl;
}
How do I get milliseconds?
我如何获得毫秒?
回答by Satbir
Refer to question "Convert Difference between 2 times into Milliseconds" on Stack Overflow.
请参阅Stack Overflow 上的问题“将 2 次之间的差异转换为毫秒”。
Or use this:
或者使用这个:
static double diffclock(clock_t clock1, clock_t clock2)
{
    double diffticks = clock1 - clock2;
    // Multiply before dividing: CLOCKS_PER_SEC / 1000 is integer division
    // and truncates to zero on systems where CLOCKS_PER_SEC < 1000.
    double diffms = diffticks * 1000.0 / CLOCKS_PER_SEC;
    return diffms;
}
回答by jpmelos
If you use a Unix OS, like Linux or Mac OS X, you can go to the command line and use the line
如果您使用 Unix 操作系统,例如 Linux 或Mac OS X,则可以转到命令行并使用该行
time call-program
The time command times how long the execution of any command line takes, and reports that to you.
time 命令计算执行任何命令行所需的时间,并将其报告给您。
I don't know if there's something like that for Windows, nor how you can measure milliseconds inside a C/C++ program, though.
不过,我不知道 Windows 是否有类似的东西,也不知道如何在 C/C++ 程序中测量毫秒。
回答by ChrisW
There's a CLOCKS_PER_SEC macro to help you convert ticks to milliseconds.
有一个 CLOCKS_PER_SEC 宏可以帮助您将刻度转换为毫秒。
There are O/S-specific APIs to get high-resolution timers.
有特定于操作系统的 API 可以获取高分辨率计时器。
You can run your program more than once (e.g. 1000 times), measure the total with a low-resolution timer (e.g. some number of seconds), and then divide that total by the number of runs to get a (higher-resolution) average time.
您可以多次运行您的程序（例如 1000 次），使用低分辨率计时器（例如若干秒）测量总耗时，然后将该总数除以运行次数，得到（更高分辨率的）平均时间。
回答by JRL
In Windows, you can use GetTickCount, which is in milliseconds.
在 Windows 中,您可以使用GetTickCount,它以毫秒为单位。
回答by Cat Plus Plus
Under Win32, you can access the high-resolution timer using QueryPerformanceFrequency and QueryPerformanceCounter (IMHO that should be preferred, possibly with fallback to GetTickCount). You can find an example in the Community Content section on MSDN.
在 Win32 下，您可以使用 QueryPerformanceFrequency 和 QueryPerformanceCounter 访问高分辨率计时器（恕我直言，这应该是首选，可能回退到 GetTickCount）。您可以在 MSDN 的社区内容部分找到示例。
回答by Josua Robson
On Windows, clock() happens to return milliseconds, because CLOCKS_PER_SEC is defined as 1000 there; in general it returns ticks, which must be divided by CLOCKS_PER_SEC to get seconds. Compile the code below and it will print roughly 1000. If you are on Linux rather than Windows, replace #include <windows.h> with #include <unistd.h> and Sleep(1000) with usleep(1000000). (Note, though, that on POSIX systems clock() measures CPU time rather than wall-clock time, so a sleeping program accumulates almost none of it, and CLOCKS_PER_SEC is 1000000 there.)
在 Windows 上，clock() 恰好以毫秒为单位返回，因为那里的 CLOCKS_PER_SEC 被定义为 1000；一般来说它返回的是刻度，必须除以 CLOCKS_PER_SEC 才能得到秒。编译下面的代码，它会输出大约 1000。如果你在 Linux 而不是 Windows 上，请将 #include <windows.h> 替换为 #include <unistd.h>，并将 Sleep(1000) 替换为 usleep(1000000)。（不过请注意，POSIX 系统上的 clock() 测量的是 CPU 时间而非墙上时间，休眠的程序几乎不会累积 CPU 时间，而且那里的 CLOCKS_PER_SEC 是 1000000。）
#include <stdio.h>
#include <time.h>
#include <windows.h>

int main()
{
    clock_t x = clock();
    Sleep(1000);
    printf("Difference: %ld\n", (long)(clock() - x));  // cast: clock_t width varies
}