Assembly x86 (16-bit): More accurate time measurement

I'm programming in 16-bit TASM with DOSBox, and here's today's issue: using DOS INT 21h/2Ch I can get the system time down to hundredths of a second. That's good and all... until it's not.
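For reference, the call itself is trivial; a minimal sketch:

```asm
; DOS get-system-time: AH = 2Ch
mov ah, 2Ch
int 21h
; Returns CH = hour, CL = minutes, DH = seconds, DL = hundredths.
; DL only resolves to 1/100 s, and on many systems the underlying
; timer ticks at ~18.2 Hz, so the real granularity is even coarser.
```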

See, I'm looking for an at least semi-accurate time measurement in milliseconds, and I'm positive it's possible.

Why, you ask? Have a look at INT 15h/86h. Using this interrupt I can delay the program by a number of microseconds. If that kind of precision exists, I'm sure getting milliseconds would be a walk in the park.
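For example, delaying one second with INT 15h/86h (CX:DX = interval in microseconds) looks roughly like this:

```asm
; BIOS wait: AH = 86h, CX:DX = interval in microseconds
mov ah, 86h
mov cx, 0Fh     ; high word of 1,000,000 us (0F4240h)
mov dx, 4240h   ; low word
int 15h         ; CF is set on return if the wait wasn't honored
```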

Some ideas I had: using INT 70h, the RTC periodic interrupt, which fires every 1/1024 of a second, but I don't know how to hook an interrupt, nor do I want a timing system whose tick rate isn't divisible by 10.
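For completeness, hooking INT 70h from DOS is doable; here's a rough, hedged sketch (label names are my own), assuming the default 1024 Hz periodic rate:

```asm
; Sketch: counting RTC periodic ticks by hooking INT 70h.
; A real program should also save and restore the old vector,
; enable the periodic interrupt (bit 6 of CMOS register 0Bh),
; and unmask IRQ 8 on the slave PIC.

install:
    mov  ax, 2570h           ; DOS set-vector: AH = 25h, AL = 70h
    push cs
    pop  ds
    mov  dx, offset tick_isr ; DS:DX -> new handler
    int  21h
    ret

tick_isr:
    push ax
    inc  cs:ticks            ; one more 1/1024 s elapsed
    mov  al, 0Ch             ; read RTC status register C,
    out  70h, al             ; otherwise the RTC won't fire again
    in   al, 71h
    mov  al, 20h             ; EOI to slave PIC, then master
    out  0A0h, al
    out  20h, al
    pop  ax
    iret

ticks dw 0
```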

This question has gotten the better of me by now, and I've failed to find an existing solution online.

Cheers in advance.


  • A big thank you to Peter Cordes in the comments for answering; I'll post the answer here for anyone else planning on using an old-fashioned assembler from 30 years ago.

    Roughly speaking, the best clock you can reach from 16-bit DOS calls still isn't accurate enough. Luckily, in TASM you can "unlock" 386+ (32-bit) instructions by using the .386 directive (as mentioned here).

    Then, you can use the RDTSC instruction (Read Time-Stamp Counter), but there's one problem: it doesn't exist in TASM. That doesn't actually stop us, because every instruction you write in TASM (the mnemonics) is just a human-readable name for an opcode, the byte encoding that defines each instruction the CPU can run.

    The opcode for RDTSC was introduced with the Intel Pentium, so if you have a Pentium or anything newer... you're good.

    Now, how do we run the RDTSC instruction if TASM doesn't know it (but our CPU does)?

    In TASM, there's a directive called db (define byte), and with it we can emit an opcode's raw bytes directly.

    As seen here, what we'll need to do to run RDTSC is: db 0Fh, 31h.
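    Putting it together, here's a minimal sketch (label names are my own) of timing a stretch of code with RDTSC:

    ```asm
    .386                        ; unlock 386+ registers/instructions

    ; First reading: EDX:EAX <- 64-bit time-stamp counter
    db 0Fh, 31h                 ; RDTSC (opcode TASM doesn't know)
    mov dword ptr start_lo, eax
    mov dword ptr start_hi, edx

    ; ... code being timed ...

    db 0Fh, 31h                 ; RDTSC again
    sub eax, dword ptr start_lo
    sbb edx, dword ptr start_hi ; EDX:EAX = elapsed cycles

    start_lo dd 0
    start_hi dd 0
    ```

    Note that RDTSC counts CPU cycles, not milliseconds; to convert, you need to know (or measure) your clock frequency, for example by counting cycles across a known INT 15h/86h delay.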

    And that's it! You can now run this instruction easily, and your program will still stay a mess, but a timed mess at that!