I understand that JPEG is a lossy compression standard, and that the 'quality' factor controls the degree of compression and thus the amount of data loss.
But when the quality number is set to 100, is the resulting JPEG lossless?
Using a "typical" JPEG encoder at quality 100 does not give you lossless compression.¹
In JPEG compression, information is mostly lost during the DCT coefficient quantization step. That is, each 8-by-8 block of coefficients is divided element-wise by an 8-by-8 quantization table, making the values smaller and thus 'more compressible' after rounding.
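A toy sketch of that step (the DCT values below are made up for illustration; the table is the 4x4 corner of the standard luminance quantization table from Annex K of the JPEG spec):

```python
# A made-up 4x4 corner of an 8x8 DCT coefficient block.
dct = [
    [-415.4,  -30.2,  -61.2,   27.2],
    [   4.5,  -21.9,  -60.8,   10.3],
    [ -46.8,    7.4,   77.1,  -24.6],
    [ -48.5,   12.1,   34.1,  -14.8],
]

# Matching corner of the standard luminance quantization table (JPEG Annex K).
qtable = [
    [16, 11, 10, 16],
    [12, 12, 14, 19],
    [14, 13, 16, 24],
    [14, 17, 22, 29],
]

def quantize(block, table):
    # Divide each coefficient by its table entry, then round: this is
    # where JPEG throws information away.
    return [[round(c / q) for c, q in zip(brow, qrow)]
            for brow, qrow in zip(block, table)]

def dequantize(block, table):
    # The decoder multiplies back, but the rounded-off part is gone.
    return [[c * q for c, q in zip(brow, qrow)]
            for brow, qrow in zip(block, table)]

quant = quantize(dct, qtable)
restored = dequantize(quant, qtable)
print(restored[0][0])  # -> -416, not the original -415.4
```

Dividing by a large table entry collapses many nearby coefficient values onto the same integer, which is exactly why lower quality (bigger table entries) means more loss.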
When you set JPEG quality to 100, the quantization table becomes all 1s with the standard IJG tables, so the DCT coefficients pass through unchanged except for rounding.
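To make that concrete, here is a minimal sketch (made-up coefficient values) showing that with an all-1s table, "quantization" degenerates into plain rounding:

```python
# With a quantization table of all 1s (what quality 100 produces with the
# standard IJG tables), dividing does nothing and only the rounding remains.
ones = [[1] * 8 for _ in range(8)]

# Made-up non-integer coefficients, each with a fractional part of 0.37.
coeffs = [[(r * 8 + c) + 0.37 for c in range(8)] for r in range(8)]

quantized = [[round(v / q) for v, q in zip(vrow, qrow)]
             for vrow, qrow in zip(coeffs, ones)]
# Each value is merely rounded; the fractional 0.37 is still lost.
```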
Thus, there are mainly two factors leading to information loss even when quality is set to 100:

- **Chroma subsampling.** Many encoders downsample the Cb and Cr (chroma) channels by default, discarding color detail before the DCT even runs. Some encoders turn this off at high quality settings, but it is not guaranteed.
- **Rounding.** The RGB-to-YCbCr color conversion and the DCT both produce non-integer values, which must be rounded back to integers at several points in the pipeline.
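The rounding factor alone is easy to demonstrate: even a bare RGB ↔ YCbCr round trip (JFIF-style conversion formulas, no subsampling, no DCT) does not reproduce every pixel exactly once values are rounded to 8-bit integers:

```python
def rgb_to_ycbcr(r, g, b):
    # JFIF forward conversion, rounded to integers as an encoder would store them.
    y  = round( 0.299    * r + 0.587    * g + 0.114    * b)
    cb = round(-0.168736 * r - 0.331264 * g + 0.5      * b + 128)
    cr = round( 0.5      * r - 0.418688 * g - 0.081312 * b + 128)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    # JFIF inverse conversion, rounded and clamped to the 0-255 range.
    clamp = lambda v: max(0, min(255, round(v)))
    r = clamp(y + 1.402    * (cr - 128))
    g = clamp(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128))
    b = clamp(y + 1.772    * (cb - 128))
    return r, g, b

# Pure green does not survive the round trip:
print(ycbcr_to_rgb(*rgb_to_ycbcr(0, 255, 0)))  # -> (0, 255, 1), not (0, 255, 0)
```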
¹ Lossless JPEG encoding exists, but it's different in nature and seldom used.