Let's say we have an enum declared outside the main function in C code. We can access the members of the enum inside the function without issue, and also assign them to variables of int type. But there is also the concept of instantiating the enum type. Why is that required, when using enum members with int also does the task?
I declared this outside the main function:
enum months {Jan, Feb, Mar};
...and accessed its members using an int inside the main() function:
#include <stdio.h>

int main(void) {
    int m = Feb;     /* enum constant stored in a plain int */
    printf("%d", m);
    return 0;
}
(prints 1 on screen)
But it's common to create an instance of the enum type in the function where it's used:
#include <stdio.h>

int main(void) {
    enum months m = Feb;   /* instance of the enum type */
    printf("%d", m);
    return 0;
}
(prints 1 on screen)
Why is this needed when you can use int itself?
Instantiation is the creation of an actual variable (an instance) of a particular type. In your example, you choose to either create a local instance of the enumerated type, months, or an instance of an int, which knows nothing about the names of months but can still hold the underlying number that enum months variables use to represent them.
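Side by side (a minimal sketch restating the two snippets from the question), the only difference is the declared type; both variables hold the same underlying value:

enum months {Jan, Feb, Mar};

int main(void) {
    enum months a = Feb;   /* an instance of the enumerated type */
    int         b = Feb;   /* a plain int holding the same underlying value, 1 */
    return a == b ? 0 : 1; /* the two compare equal, so this returns 0 */
}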
While it's true that enums are compiled to ints at runtime, enums add a layer of abstraction to our source code that makes it easier for humans to understand. We use enum types when dealing with a limited set of valid values, each representing a different meaning. Using them is all about being nice to your future self, because C gives you free rein to do all sorts of things that might hurt later; Ben Klemens (see Resources) describes C as "Punk Rock"!
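For instance (a sketch with invented names, not taken from the question), compare a magic number with an enum at the point of use:

/* With a plain int, the meaning is buried in a magic number: */
int state = 2;                      /* what does 2 mean here? */

/* With an enum, the intent is visible where the value is used: */
enum conn_state {DISCONNECTED, CONNECTING, CONNECTED};
enum conn_state conn = CONNECTED;   /* same underlying value, 2 */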
Humans read source code more frequently than the compiler does, especially when you work in a team. Using enum types makes the code clearer, so that you and your colleagues can understand the intent of the code. The compiler doesn't know what your code is intended to do - it just does as it's told - but it really matters to you, the developer.
When you make use of the type system, you're making it easier for developers to understand and maintain the code. Even solo developers benefit from this, because you soon forget the details of your own implementation. You're doing the future you a favor by giving yourself a clue, just like when you choose meaningful names for your variables and functions. Also, enums tell the compiler which limited set of values is valid for variables of that type, giving it a chance to warn about mistyped or unhandled values at compile time, rather than failing at runtime.
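For example (a sketch; the exact diagnostic varies by compiler), GCC and Clang with -Wall can warn when a switch over an enum-typed value misses a case - a check they cannot offer for a plain int:

enum months {Jan, Feb, Mar};

const char *abbrev(enum months m) {
    switch (m) {          /* gcc/clang -Wall: warning: 'Mar' not handled */
    case Jan: return "JAN";
    case Feb: return "FEB";
    }
    return "???";         /* reached for Mar or any out-of-range value */
}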
With C, you still need to explicitly check for invalid values, e.g. in a switch statement with a default clause, especially if you do any arithmetic on the enum.
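A minimal sketch of that defensive check (month_name is an invented helper, not from the question):

#include <stdio.h>

enum months {Jan, Feb, Mar};

const char *month_name(enum months m) {
    switch (m) {
    case Jan: return "Jan";
    case Feb: return "Feb";
    case Mar: return "Mar";
    default:  return "invalid month";  /* guards out-of-range values */
    }
}

int main(void) {
    enum months m = Feb;
    printf("%s\n", month_name(m));      /* Feb */
    printf("%s\n", month_name(m + 5));  /* arithmetic escaped the range: invalid month */
    return 0;
}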
Ben Klemens is not entirely uncritical of how C handles enum types, especially for cluttering the compiler's global namespace, thereby requiring longer, clunkier identifiers to avoid name clashes in larger programs.
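To illustrate the clash he means (a sketch with invented names): enumeration constants share one ordinary identifier namespace, so two enums in the same scope cannot reuse a name, and larger programs fall back on prefixes:

enum color {Red, Green, Blue};
/* enum fruit {Apple, Orange, Red};    error: redeclaration of 'Red' */

/* The usual workaround: longer, prefixed identifiers. */
enum fruit {FRUIT_APPLE, FRUIT_ORANGE, FRUIT_RED};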
Resources:
S. McConnell, Code Complete: A Practical Handbook of Software Construction. Redmond, WA: Microsoft Press, 1993.
B. Klemens, 21st Century C, 2nd ed. Sebastopol, CA: O'Reilly Media, 2014.
Search terms: C "code quality" or C "best practice".
Also, definitely check out Coding Standards for pure C (not C++) on this site.