I am confused: is H.264 a codec (software used to encode/decode) or a standard/format? And are there software codecs that use H.264 to encode/decode?
As with most things in computing, the terminology is loosely used.
Looking at the basic definitions: a codec (coder/decoder) is a piece of software or hardware that encodes or decodes a video, usually following some standard or agreed encoding format. A standard like H.264 defines that format; it does not prescribe any particular implementation.
In practice, people often speak of an H.264 codec or an H.265 codec. What this usually means is a codec that implements the H.264 or H.265 standard.
The codecs themselves can be either software or hardware based. For widely used standards like H.264, codecs are often implemented in hardware to increase performance and reduce battery usage on devices.
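One way to see this distinction concretely, assuming you have FFmpeg installed, is to list the encoders FFmpeg knows that target H.264. You will typically see several distinct codecs, software and hardware, all producing bitstreams conforming to the same standard:

```shell
# List every encoder in this FFmpeg build that targets the H.264 standard.
# Typical entries include libx264 (a software codec) and, depending on your
# build and hardware, h264_nvenc (NVIDIA), h264_qsv (Intel Quick Sync), or
# h264_videotoolbox (Apple) -- different codecs, one standard.
ffmpeg -hide_banner -encoders | grep -i 264
```

The exact list depends on how your FFmpeg was built and what hardware is present, but any H.264-compliant decoder can play the output of any of these encoders, which is precisely what "standard" buys you.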