There are 100 ticks in a time mark. Each tick is processed on a fixed interval determined by the game speed: there are (gameSpeed * 4) ticks per second, so the interval between ticks is 1000 / (gameSpeed * 4) milliseconds. At speed 10, that's 10 * 4 = 40 ticks per second, or 1000 / (10 * 4) = 1000 / 40 = 25 milliseconds between ticks.
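As a quick sketch of that arithmetic (tickIntervalMs is just an illustrative name, not anything from the game's actual code):

```ts
// Milliseconds between ticks for a given game speed.
function tickIntervalMs(gameSpeed: number): number {
  return 1000 / (gameSpeed * 4);
}

console.log(tickIntervalMs(10)); // 25 ms between ticks at speed 10
```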
Or maybe more to the point of your question, 1 time mark at the max speed takes 2.5s. (40 ticks/s * 2.5s = 100 ticks = 1 mark).
And to be nice and general about this: (1 / (gameSpeed * 4)) seconds/tick * (100 ticks / 1 mark) = 25 / gameSpeed seconds per mark.
Example 1: At speed 10, it takes 25/10 = 2.5 seconds per mark.
Example 2: At speed 6, it takes 25/6 ≈ 4.17 seconds (4.1666…, with the 6 repeating) for 1 mark.
Example 3: At speed 5, it takes 25/5 = 5 seconds for 1 mark.
Example 4: At speed 1, it takes 25/1 = 25 seconds for 1 mark.
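A minimal sketch of that seconds-per-mark formula, with the examples above as checks (secondsPerMark is an illustrative name, not the game's code):

```ts
// Seconds per time mark (100 ticks): 100 / (gameSpeed * 4) = 25 / gameSpeed.
function secondsPerMark(gameSpeed: number): number {
  return 25 / gameSpeed;
}

console.log(secondsPerMark(10)); // 2.5
console.log(secondsPerMark(6));  // 4.1666...
console.log(secondsPerMark(5));  // 5
console.log(secondsPerMark(1));  // 25
```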
This of course assumes no latency or jitter, which will inevitably occur, so the real speed may turn out to be slightly slower. This is particularly true if a task switch happens just before it's time to process the next tick. The difference should still be fairly negligible, though. The delay is measured from when the last tick's code started to run, so if a tick ends up running a bit late, the next one won't run sooner to make up for it; it will simply run the fixed number of milliseconds after the current tick started.
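For illustration, here's one way that kind of no-catch-up scheduling could look. This is just a sketch assuming a setTimeout-style timer; the names (runTick, intervalMs) and structure are not the game's actual implementation:

```ts
const gameSpeed = 10;                         // illustrative speed
const intervalMs = 1000 / (gameSpeed * 4);    // 25 ms at speed 10

function runTick(): void {
  // ... one tick's worth of game logic would go here ...
}

function tick(): void {
  // Schedule the next tick first, so the fixed delay is measured from
  // when this tick *started*. If this tick fired late, the next one is
  // simply late by the same amount; there is no attempt to catch up.
  setTimeout(tick, intervalMs);
  runTick();
}

tick();
```

Because the next tick is always scheduled a fixed delay after the current tick starts, any lateness just shifts the following ticks rather than compressing them, which is why the real-world time per mark can end up slightly longer than the ideal 25 / gameSpeed seconds.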