Does a 64-bit runtime run faster than a 32-bit one? Was our childhood a lie?
Backstory:
A runtime I really like has been updated to 64-bit. As a programmer, the only thing I could think of was that it meant you could work with larger numbers and access more memory.
But growing up, the newest consoles went from 8-bit to 16-bit, then 32-bit, and you'll never guess what came next: 64-bit. So everyone knew 16-bit was better than 8-bit in every way, including speed.
So when my favorite runtime says it has been upgraded to 64-bit, does that mean it's faster than the 32-bit version? It has already been upgraded on Mac OS X and will be upgraded to 64-bit on Windows as well.
Also, it looks like Firefox just went 64-bit.
The processor word size (32 or 64 bits) is largely independent of speed. Generally, 64-bit processors are newer than those that only support 32 bits, so they are faster simply because they are newer designs. However, manipulating more data and larger addresses is inherently slower than manipulating smaller data and shorter addresses.
Let's say you have a library that does image processing (e.g., reading/writing JPEG files) and needs 64-bit scaled integers (at 32 bits you get serious rounding errors in JPEG). A 64-bit processor can add a 64-bit scaled integer in one instruction. A 32-bit processor would need three or more instructions to do the same thing, which is inherently slower.
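As a rough sketch of that difference (the JPEG library is hypothetical; this is just a plain 64-bit addition in C), the same source line becomes a single add when compiled for a 64-bit target, but an add/add-with-carry pair plus extra register shuffling on a 32-bit target:

```c
#include <stdint.h>

/* Adding two 64-bit scaled (fixed-point) values, as a JPEG decoder might
 * do per sample. Compiled for x86-64 this is essentially one ADD
 * instruction; compiled for 32-bit x86 the compiler must add the low
 * halves, then the high halves with carry (ADD + ADC), plus the moves
 * needed to get both halves in and out of registers. */
int64_t add_scaled(int64_t a, int64_t b)
{
    return a + b;
}
```

Comparing the assembly from `gcc -O2 -S` with and without `-m32` makes the instruction-count difference easy to see.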
64 bits means access to more memory (if you use a CAD program you know what that means), not more speed. But because 64-bit processors tend to be newer and faster, you generally do get more speed; just not because of the 64 bits.
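A minimal sketch of the address-space side of this: compile the same program once as 32-bit and once as 64-bit, and the pointer size (and so how much memory a single process can address) changes, while nothing about the code becomes intrinsically faster.

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* On a 32-bit build a pointer is 4 bytes, which caps the process at a
     * 4 GB address space; on a 64-bit build it is 8 bytes, so a CAD
     * program (or a big image editor) can map far larger data sets. */
    printf("pointer size: %zu bytes\n", sizeof(void *));
    printf("largest object size (SIZE_MAX): %ju\n", (uintmax_t)SIZE_MAX);
    return 0;
}
```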