For the following code snippet I get 1 as the output. I want to know how that happens.
#include <stdio.h>

int main(void)
{
    int x = 10, y = 20, z = 5, i;
    i = x < y < z;
    printf("%d", i);
    return 0;
}
Because the `<` operator is left-associative, `i = x < y < z;` is parsed as `i = (x < y) < z;`. First `x < y` (that is, `10 < 20`) is true, which in C evaluates to the int value 1. The expression then becomes `i = 1 < z;`, i.e. `1 < 5`, which is also true, so `i` is 1.