I am reading "Introduction to Algorithms", third edition. There, under the section "Analyzing Algorithms", it is written:
We also assume a limit on the size of each word of data. For example, when working with inputs of size n, we typically assume that integers are represented by c lg n bits for some constant c >= 1. We require c >= 1 so that each word can hold the value of n, enabling us to index the individual input elements, and we restrict c to be a constant so that the word size does not grow arbitrarily.
What is the significance of the word "word" here? Is measuring data in "words" a standard convention?
They mean a machine word: essentially the size of a processor register, or the "most natural" size of a piece of data for that machine. On a 32-bit machine it's 32 bits; on a 64-bit machine it is (not surprisingly) 64 bits.
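To make that concrete, here is a minimal C sketch, assuming a typical 32- or 64-bit target where a pointer is word-sized (not something the C standard guarantees, but true on common machines). It prints the machine's word width and the number of bits needed to index an input of some example size n, illustrating the c lg n assumption from the book:

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* Width of a pointer-sized word on this machine, in bits.
           A rough proxy for the machine word size on common targets. */
        size_t word_bits = sizeof(void *) * CHAR_BIT;
        printf("word size: %zu bits\n", word_bits);

        /* The CLRS assumption: a word of c*lg(n) bits (c >= 1) can hold
           an index into an input of size n. Count the bits needed to
           store the value n and compare with the word width. */
        size_t n = 1000000;          /* example input size */
        unsigned bits_for_n = 0;
        for (size_t x = n; x > 0; x >>= 1)
            bits_for_n++;            /* floor(lg n) + 1 bits to store n */
        printf("indexing n = %zu elements needs %u bits (word has %zu)\n",
               n, bits_for_n, word_bits);
        return 0;
    }

On a 64-bit machine this reports a 64-bit word, comfortably more than the 20 or so bits needed to index a million elements, which is exactly why the book can assume each word holds an index into the input.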
Word size used to be considerably more variable as computer architectures evolved. If you look at the Wikipedia article on word size, you'll see links to descriptions of 12-bit, 18-bit, 21-bit, 24-bit, 31-bit, 36-bit, 48-bit, and 60-bit hardware. I remember reading about a 72-bit machine once, although I can't find a reference right now.