04-09-2008, 10:04 PM
Now, as you may or may not know, Mirage frequently uses the GetTickCount API. GetTickCount is a very useful API call, but there are much better ways of doing internal timing. The problem with GetTickCount is that it only updates once per system timer tick, so on a typical system it has around 10-16 ms of granularity. That might sound like a very small number, but when you use it for FPS calculations and movement timing it can noticeably affect the number of frames displayed and how smooth they look, lowering both the FPS and the smoothness.
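If you want to see that granularity for yourself, here is a quick throwaway test you can paste into a blank project (the Sleep declare and the DemoTickGranularity name are just for this demo, not part of Mirage):
Code:
Private Declare Function GetTickCount Lib "kernel32" () As Long
Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

Private Sub DemoTickGranularity()
    Dim i As Long
    Dim LastTick As Long
    Dim NewTick As Long

    LastTick = GetTickCount
    For i = 1 To 20
        Sleep 1                    ' ask for roughly a 1 ms pause
        NewTick = GetTickCount
        ' On a default system the differences printed here tend to come
        ' out as 0 or ~15/16 rather than 1, because GetTickCount only
        ' moves forward once per system timer tick.
        Debug.Print NewTick - LastTick
        LastTick = NewTick
    Next i
End Sub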
Now, there are several other high-resolution timers out there, such as QueryPerformanceCounter and timeGetTime. In this tutorial I will show you how to use the timeGetTime API for all of your timing. timeGetTime is very useful because you can set its resolution, and we will be setting it to the finest it supports, which is 1 ms, so your timing will be as accurate as the function allows.
I do not know whether this breaks speed hacks or not; if someone wants to test that and report back in the tutorial, that would be very useful.

timeGetTime is also very convenient because it returns a Long, just like GetTickCount, so you do not have to change any of your variable types, and the value only rolls over about once every 49.7 days (2^32 milliseconds).
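Just as an illustration of that (the variable names here are made up, and it assumes the timeGetTime declare from the next section is already in a module), measuring how long something took works exactly the same way as it did with GetTickCount:
Code:
Dim StartTime As Long
Dim ElapsedMs As Long

StartTime = timeGetTime
' ... whatever you want to time goes here ...
ElapsedMs = timeGetTime - StartTime
Debug.Print "Took " & ElapsedMs & " ms"
' Note: this simple subtraction ignores the rare case where the
' timer rolls over in the middle of the measurement.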

So, to start out, you need to add these API declares:
Code:
Public Declare Function timeGetTime Lib "winmm.dll" () As Long
Public Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Public Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
timeGetTime is what we will use as our replacement for GetTickCount. timeBeginPeriod and timeEndPeriod set the resolution of the timer instead of leaving it at the default, which is around 15 ms on a typical system.
You have to call timeBeginPeriod once when your application starts to set the resolution, and call timeEndPeriod before your application exits to set it back to the default. With Mirage, we have the perfect places for this.

At the top of Sub Main (or Sub InitServer for the server side), add the following line:
Code:
timeBeginPeriod 1
Then, at the bottom of Sub GameDestroy (or Sub DestroyServer for the server side), add the following line:
Code:
timeEndPeriod 1
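Just so it's clear where those two lines end up, the subs should look roughly like this afterwards (the rest of their code is left out here, and your sub names may differ if you've changed them):
Code:
Public Sub Main()
    timeBeginPeriod 1    ' ask Windows for 1 ms timer resolution
    ' ... the rest of the normal startup code ...
End Sub

Public Sub GameDestroy()
    ' ... the rest of the normal shutdown code ...
    timeEndPeriod 1      ' give back the default resolution
End Sub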
And there you go! The resolution is all set. Now just go through your source and replace every call to GetTickCount with timeGetTime, and you are done. Simple, wasn't it?
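In case it helps, here is what that swap looks like in a typical timing check (TickCount and the 500 ms interval are only placeholders; your variable names and intervals will be whatever they already are):
Code:
Dim TickCount As Long   ' whatever timer variable you already have

' Before:
'   If GetTickCount > TickCount + 500 Then
'       TickCount = GetTickCount
'       ' ... do the timed work ...
'   End If

' After: only the function name changes.
If timeGetTime > TickCount + 500 Then
    TickCount = timeGetTime
    ' ... do the timed work ...
End If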

