The term endian refers to a computer architecture’s “byte order,” or the way the
computer stores the bytes of a multiple-byte data element. Virtually all computer
architectures today are byte-addressable and must, therefore, have a standard for
storing information requiring more than a single byte. Some machines store a
two-byte integer, for example, with the least significant byte first (at the lower
address) followed by the most significant byte. Therefore, a byte at a lower
address has lower significance. These machines are called little endian machines.
Other machines store this same two-byte integer with its most significant byte
first, followed by its least significant byte. These are called big endian machines
because they store the most significant bytes at the lower addresses. Most UNIX
machines are big endian, whereas most PCs are little endian. Most newer RISC
architectures are also big endian.
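The difference is easy to observe in code. The following C sketch (the test
value 0x0102 and the variable names are chosen purely for illustration) stores
a two-byte integer and checks which of its bytes ends up at the lower address:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t value = 0x0102;                 /* illustrative value: MSB = 0x01, LSB = 0x02 */
    uint8_t *first_byte = (uint8_t *)&value; /* the byte stored at the lower address */

    if (*first_byte == 0x02)
        printf("little endian: the LSB 0x02 is at the lower address\n");
    else
        printf("big endian: the MSB 0x01 is at the lower address\n");

    return 0;
}

On a typical PC this prints the little endian message; on a traditional big
endian UNIX machine it prints the other.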
These two terms, little and big endian, come from Jonathan Swift’s Gulliver’s Travels.
You may remember the story in which the Lilliputians (the tiny people) were
divided into two camps: those who ate their eggs by opening the “big” end (big
endians) and those who ate their eggs by opening the “little” end (little endians).
CPU manufacturers are also divided into two factions. For example, Intel has
always done things the “little endian” way whereas Motorola has always done
things the “big endian” way. (It is also worth noting that some CPUs can handle
both little and big endian byte ordering.)
For example, consider an integer requiring 4 bytes:
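Suppose, for illustration, that this integer holds the hexadecimal value
12345678. A big endian machine stores the byte 12 at the lowest address,
followed by 34, 56, and 78; a little endian machine stores 78 at the lowest
address, followed by 56, 34, and 12. The short C sketch below (the value and
variable names are again only illustrative) prints the bytes of such an
integer in address order, making the machine’s byte order directly visible:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t value = 0x12345678;        /* an illustrative 4-byte integer */
    uint8_t *bytes = (uint8_t *)&value; /* view the integer byte by byte */

    /* Walk from the lowest address to the highest.
       A big endian machine prints:    12 34 56 78
       A little endian machine prints: 78 56 34 12 */
    for (int i = 0; i < 4; i++)
        printf("byte at offset %d: %02X\n", i, (unsigned)bytes[i]);

    return 0;
}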