C++ beginner how to use GetSystemTimeAsFileTime


Solution 1

Getting the system time out of Windows with decent accuracy is something that I've had fun with, too... I discovered that JavaScript code running in Chrome seemed to produce more consistent timer results than I could get with C++ code, so I went looking in the Chrome source. An interesting place to start is the comments at the top of time_win.cc in the Chrome source. The links given there to a Mozilla bug and a Dr. Dobb's article are also very interesting.

Based on the Mozilla and Chrome sources, and the above links, the code I generated for my own use is here. As you can see, it's a lot of code!

The basic idea is that getting the absolute current time is quite expensive. Windows does provide a high-resolution timer that's cheap to access, but it only gives you a relative time, not an absolute one. What my code does is split the problem into two parts:

1) Get the system time accurately. This is done in CalibrateNow(). The basic technique is to call timeBeginPeriod(1) to get accurate times, then call GetSystemTimeAsFileTime() until the result changes, which means that the timeBeginPeriod() call has taken effect. This gives us an accurate system time, but it's quite an expensive operation (and the timeBeginPeriod() call can affect other processes), so we don't want to do it every time we need the time. The code also calls QueryPerformanceCounter() to get the current high-resolution timer value.

#include <windows.h>                  // GetSystemTimeAsFileTime(), QueryPerformanceCounter()
#pragma comment(lib, "winmm.lib")     // timeBeginPeriod()/timeEndPeriod()

bool NeedCalibration = true;
LONGLONG CalibrationFreq = 0;
LONGLONG CalibrationCountBase = 0;
ULONGLONG CalibrationTimeBase = 0;

void CalibrateNow(void)
{
  // If the timer frequency is not known, try to get it
  if (CalibrationFreq == 0)
  {
    LARGE_INTEGER freq;
    if (::QueryPerformanceFrequency(&freq) == 0)
      CalibrationFreq = -1;
    else
      CalibrationFreq = freq.QuadPart;
  }

  if (CalibrationFreq > 0)
  {
    // Get the current system time, accurate to ~1ms
    FILETIME ft1, ft2;
    ::timeBeginPeriod(1);
    ::GetSystemTimeAsFileTime(&ft1);
    do
    {
      // Loop until the value changes, so that the timeBeginPeriod() call has had an effect
      ::GetSystemTimeAsFileTime(&ft2);
    }
    while (FileTimeToValue(ft1) == FileTimeToValue(ft2));
    ::timeEndPeriod(1);

    // Get the current timer value
    LARGE_INTEGER counter;
    ::QueryPerformanceCounter(&counter);

    // Save calibration values
    CalibrationCountBase = counter.QuadPart;
    CalibrationTimeBase = FileTimeToValue(ft2);
    NeedCalibration = false;
  }
}

2) When we want the current time, get the high-resolution timer by calling QueryPerformanceCounter(), and use the change in that timer since the last CalibrateNow() call to work out an accurate "now". This is GetNow() below. It also periodically re-runs CalibrateNow() to ensure that the system time doesn't go backwards, or drift out.

FILETIME GetNow(void)
{
  for (int i = 0; i < 4; i++)
  {
    // Calibrate if needed, and give up if this fails
    if (NeedCalibration)
      CalibrateNow();
    if (NeedCalibration)
      break;

    // Get the current timer value and use it to compute now
    FILETIME ft;
    ::GetSystemTimeAsFileTime(&ft);
    LARGE_INTEGER counter;
    ::QueryPerformanceCounter(&counter);
    LONGLONG elapsed = ((counter.QuadPart - CalibrationCountBase) * 10000000) / CalibrationFreq;
    ULONGLONG now = CalibrationTimeBase + elapsed;

    // Don't let time go back
    static ULONGLONG lastNow = 0;
    now = max(now,lastNow);
    lastNow = now;

    // Check for clock skew
    if (LONGABS(FileTimeToValue(ft) - now) > 2 * GetTimeIncrement())
    {
      NeedCalibration = true;
      lastNow = 0;
    }

    if (!NeedCalibration)
      return ValueToFileTime(now);
  }

  // Calibration has failed to stabilize, so just use the system time
  FILETIME ft;
  ::GetSystemTimeAsFileTime(&ft);
  return ft;
}
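
For reference, both snippets rely on a few small helpers from the full source linked above. Minimal stand-in versions (just a sketch: it assumes FILETIME is treated as a 64-bit count of 100-nanosecond intervals and that GetTimeIncrement() wraps GetSystemTimeAdjustment(); the real implementations may differ) would look like this:

inline ULONGLONG FileTimeToValue(const FILETIME &ft)
{
  // Pack a FILETIME into a 64-bit count of 100-ns intervals since 1601-01-01
  ULARGE_INTEGER v;
  v.LowPart = ft.dwLowDateTime;
  v.HighPart = ft.dwHighDateTime;
  return v.QuadPart;
}

inline FILETIME ValueToFileTime(ULONGLONG value)
{
  // Reverse of FileTimeToValue()
  ULARGE_INTEGER v;
  v.QuadPart = value;
  FILETIME ft;
  ft.dwLowDateTime = v.LowPart;
  ft.dwHighDateTime = v.HighPart;
  return ft;
}

inline LONGLONG LONGABS(LONGLONG x)
{
  return x < 0 ? -x : x;
}

inline ULONGLONG GetTimeIncrement(void)
{
  // Size of one system clock tick in 100-ns units (typically 156,250, i.e. 15.625 ms)
  DWORD adjustment = 0, increment = 0;
  BOOL disabled = FALSE;
  ::GetSystemTimeAdjustment(&adjustment, &increment, &disabled);
  return increment;
}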

It's all a bit hairy, but it works better than I had hoped. It also seems to work well on versions of Windows as far back as I have tested (which was Windows XP).

Solution 2

I believe you are looking for the GetSystemTimePreciseAsFileTime() function, or even QueryPerformanceCounter(); in short, for something that is guaranteed to produce monotonic values.
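
For example, a minimal sketch (not drop-in code for the question's program; note that GetSystemTimePreciseAsFileTime() requires Windows 8 or later, so on Windows 7 you would fall back to GetSystemTimeAsFileTime() or the calibration approach from Solution 1):

#define _WIN32_WINNT 0x0602   // target Windows 8 so the precise API is declared
#include <windows.h>
#include <cstdio>

int main()
{
  // High-resolution wall-clock time as a FILETIME (Windows 8+ only)
  FILETIME ft;
  GetSystemTimePreciseAsFileTime(&ft);

  // Convert to calendar fields, then to seconds within the day
  // (the same format the question writes to its text file)
  SYSTEMTIME st;
  FileTimeToSystemTime(&ft, &st);
  DWORD sec = st.wHour * 3600 + st.wMinute * 60 + st.wSecond;
  printf("%lu.%03u\n", (unsigned long)sec, (unsigned)st.wMilliseconds);

  // QueryPerformanceCounter() is the monotonic option: ideal for measuring
  // intervals, but it is not an absolute time of day
  LARGE_INTEGER freq, ticks;
  QueryPerformanceFrequency(&freq);
  QueryPerformanceCounter(&ticks);
  printf("QPC: %.6f s since an arbitrary origin\n",
         (double)ticks.QuadPart / (double)freq.QuadPart);
  return 0;
}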


Comments

  • oodan123
    oodan123 almost 2 years

    I have a program that reads the current time from the system clock and saves it to a text file. I previously used the GetSystemTime function, which worked, but the times weren't completely consistent, e.g. one of the times is 32567.789 and the next is 32567.780, which is backwards in time.

    I am using this program to save the time up to 10 times a second. I read that the GetSystemTimeAsFileTime function is more accurate. My question is, how do I convert my current code to use the GetSystemTimeAsFileTime function? I tried to use the FileTimeToSystemTime function, but that had the same problems.

    SYSTEMTIME st;
    GetSystemTime(&st);

    WORD sec = (st.wHour*3600) + (st.wMinute*60) + st.wSecond; // convert to seconds in a day
    lStr.Format(_T("%d   %d.%d\n"), GetFrames(), sec, st.wMilliseconds);

    std::wfstream myfile;
    myfile.open("time.txt", std::ios::out | std::ios::in | std::ios::app);
    if (myfile.is_open())
    {
        myfile.write((LPCTSTR)lStr, lStr.GetLength());
        myfile.close();
    }
    else
    {
        lStr.Format(_T("open file failed: %d"), WSAGetLastError());
    }
    

    EDIT To add some more info: the code captures an image from a camera, which runs 10 times every second, and saves the time each image was taken into a text file. When I subtract the 1st entry of the text file from the 2nd, and so on (e.g. entry 2-1, 3-2, 4-3, etc.), I get this graph, where the x axis is the number of entries and the y axis is the subtracted values.

    [graph: differences between consecutive time entries]

    All of them should be around the 0.12 mark, which most of them are. However, you can see that a lot of them vary, and some even go negative. This isn't due to the camera, because the camera has its own internal clock and that has no variations. It has something to do with capturing the system time. What I want is the most accurate method of extracting the system time, with the highest resolution and as little noise as possible.

    Edit 2 I have taken on board your suggestions and run the program again. This is the result:

    [graph: differences between consecutive time entries after the suggested changes]

    As you can see, it is a lot better than before, but it is still not right. I find it strange that it seems to happen so incrementally. I also just plotted the times, and this is the result, where x is the entry and y is the time:

    [graph: recorded time for each entry]

    Does anyone have any idea what could be causing the time to go out every 30 frames or so?

  • oodan123
    oodan123 over 8 years
    Thanks, do you have an example? I seem to be having trouble replacing it with GetSystemTimeAsFileTime; I'm on Windows 7.
  • Admin
    Admin over 8 years
    @oodan123, see my answer; I've included an example. You could uncomment the code and comment out the GetSystemTimeAsFileTime() call instead.
  • oodan123
    oodan123 over 8 years
    I can't use that function because it is only available on Windows 8 or higher. I tried timeBeginPeriod(1), but that didn't do anything either.
  • oodan123
    oodan123 over 8 years
    Awesome answer! So if I use FileTimeToSystemTime on the FILETIME calculated by GetNow(), will I keep that accuracy, or will it go back to a relative time?
  • oodan123
    oodan123 over 8 years
    I had a go at implementing it in my program, but I couldn't seem to improve the timings.
  • DavidK
    DavidK over 8 years
    Hmmm, hard to know where the problem lies. If you posted a complete example with your attempt at implementing the above, it would be possible to have a play with it.
  • DavidK
    DavidK over 8 years
    The more I think about this, the more it seems to me that you need to be sure that your trigger is as regular as you think (that is, whatever gives you the X axis values, presumably something to do with your camera). I would try just recording the high-resolution timer (from QueryPerformanceCounter()) and graphing that, something like the sketch below. Although the timer is just a relative number, not an absolute time, it should give you an idea of whether the variations you see really come from problems determining the current time, or from a lack of regularity in whatever triggers the code each time.
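
    A minimal sketch of what I mean (LogFrameTick() is just a hypothetical hook you would call once per captured frame; it writes the raw interval between frames so it can be graphed):

    #include <windows.h>
    #include <cstdio>

    // Hypothetical per-frame hook: log the QueryPerformanceCounter() interval
    // between consecutive frames so the raw trigger timing can be graphed.
    void LogFrameTick(FILE *log)
    {
        static LARGE_INTEGER freq = {};
        static LONGLONG last = 0;

        if (freq.QuadPart == 0)
            ::QueryPerformanceFrequency(&freq);

        LARGE_INTEGER now;
        ::QueryPerformanceCounter(&now);

        if (last != 0)
        {
            double seconds = double(now.QuadPart - last) / double(freq.QuadPart);
            fprintf(log, "%.6f\n", seconds);  // should cluster near 0.1 s at 10 fps
        }
        last = now.QuadPart;
    }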