I remember hearing that, for performance, a development machine should be 32-bit while servers should be 64-bit. I think it was Richard Campbell on .NET Rocks! who mentioned this.
Why would 32-bit be faster than 64-bit for a development box, and vice versa for servers?
One major reason is that a 32-bit OS can't address more than 4 GB of RAM (and in practice a bit less, since part of that address space is reserved for devices). Having 4-8 GB available can be crucial in a lot of development environments where virtual machines are involved, or for heavy lifting in general. This is why I always stick with 64-bit where possible, and all modern CPUs support it.
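To illustrate the arithmetic behind that 4 GB ceiling, here's a small sketch (not tied to any particular OS or to anything from the podcast) that prints the pointer width a program was compiled with and the address space that width implies:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* The pointer width sets the theoretical address space:
       32-bit pointers -> 2^32 bytes = 4 GiB,
       64-bit pointers -> 2^64 bytes (far beyond any installed RAM). */
    size_t ptr_bits = sizeof(void *) * 8;
    printf("Pointer width: %zu bits\n", ptr_bits);

    if (ptr_bits >= 64) {
        /* Shifting by 64 would be undefined, so just state the result. */
        printf("Theoretical address space: 2^64 bytes\n");
    } else {
        unsigned long long space = 1ULL << ptr_bits;
        printf("Theoretical address space: %llu bytes (%.1f GiB)\n",
               space, space / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

Compiled as a 32-bit binary it reports 4 GiB; as a 64-bit binary the theoretical limit dwarfs any RAM you could install, which is exactly why the extra memory only becomes usable on a 64-bit OS.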