This issue concerns the recently stabilised x86 intrinsic function `_rdtsc`.

The function is defined to return a signed `i64`. This is in accordance with Intel's documentation. However, that is not how most people define it, including C/C++ compilers.
This is how GCC defines it:

```c
extern __inline unsigned long long  /* <- note the return type */
__attribute__((__gnu_inline__, __always_inline__, __artificial__))
__rdtsc (void)
{
  return __builtin_ia32_rdtsc ();
}
```
This Clang example shows that it, too, defines the return type as `unsigned long long`.

Now I'm confused. Since the timestamp counter is a monotonically increasing timer, it doesn't make sense for it to return negative values. In fact, I believe the counter is allowed to produce values with the most significant bit set, which would make the returned value negative unless it is transmuted. Is the mismatch intended?
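To illustrate the concern, here is a minimal sketch of the workaround a caller would need, assuming the stabilised signature returns `i64` as described above. The hypothetical helper `to_unsigned_ticks` is mine, not part of any API; it relies on the fact that an `as` cast between `i64` and `u64` is a bit-for-bit reinterpretation, so no counter bits are lost:

```rust
// Hypothetical helper (not from the standard library): reinterpret a
// "negative" i64 timestamp as the unsigned counter value it encodes.
// `as`-casting i64 -> u64 preserves all 64 bits.
fn to_unsigned_ticks(raw: i64) -> u64 {
    raw as u64
}

fn main() {
    // A counter value with the most significant bit set looks negative
    // when viewed as i64...
    let raw: i64 = -1;
    // ...but the cast recovers the full unsigned value.
    assert_eq!(to_unsigned_ticks(raw), u64::MAX);

    // An ordinary (MSB-clear) reading passes through unchanged.
    assert_eq!(to_unsigned_ticks(42), 42u64);

    println!("ok");
}
```

The cast itself is free at runtime, but every caller comparing or subtracting raw readings has to remember to do it, which is the ergonomic cost of the signed signature.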