Time has been called God's way of making sure that everything doesn't happen at once. In the same spirit, noise is Nature's way of making sure that we don't find out everything that happens. Noise, in short, is the protector of information.
This is not what I thought physics was about when I started out: I learned that the idea is to explain nature in terms of clearly understood mathematical laws. But perhaps comparisons are the best we can hope for.
In order to understand information, we must define it; but in order to define it, we must first understand it. Where to start?
Science has taught us that what we see and touch is not what is really there.
The switch from 'steam engines' to 'heat engines' signals the transition from engineering practice to theoretical science.
Paradox is the sharpest scalpel in the satchel of science. Nothing concentrates the mind as effectively, regardless of whether it pits two competing theories against each other, or theory against observation, or a compelling mathematical deduction against ordinary common sense.
Information gently but relentlessly drizzles down on us in an invisible, impalpable electric rain.
We don't know what energy is, any more than we know what information is, but as a now-robust scientific concept it can be described in precise mathematical terms, and as a commodity it can be measured, marketed, regulated and taxed.
Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, or even addressing the question of the meaning of the message.
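As a minimal sketch of the idea, assuming we identify a message with a sequence of symbols and its information content with Shannon's entropy of the symbol frequencies (the function name and sample messages below are illustrative, not from the text):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# The measure depends only on symbol statistics, not on what either
# message 'means' -- exactly the move Shannon made.
print(shannon_entropy("aaaaaaaa"))  # 0.0 -- one symbol, no surprise
print(shannon_entropy("abcdefgh"))  # 3.0 -- eight equally likely symbols
```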
The solution of the Monty Hall problem hinges on the concept of information, and more specifically, on the relationship between added information and probability.
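To make that relationship concrete, here is a small simulation (the trial count and names are my own choices): the host's act of opening a losing door adds information, which is why switching wins about 2/3 of the time while sticking wins about 1/3.

```python
import random

def monty_hall(trials: int = 100_000) -> tuple[float, float]:
    """Estimate win probabilities for the 'stick' and 'switch' strategies."""
    stick_wins = switch_wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # car behind one of three doors
        choice = random.randrange(3)  # contestant's initial pick
        # Host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != choice and d != prize)
        # Switching means taking the one remaining closed door.
        switched = next(d for d in range(3) if d != choice and d != opened)
        stick_wins += (choice == prize)
        switch_wins += (switched == prize)
    return stick_wins / trials, switch_wins / trials

stick, switch = monty_hall()
print(f"stick:  {stick:.3f}")   # ~ 1/3
print(f"switch: {switch:.3f}")  # ~ 2/3
```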
Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information.
The smell of subjectivity clings to the mechanical definition of complexity as stubbornly as it sticks to the definition of information.
In fact, an information theory that leaves out the issue of noise turns out to have no content.
To put it one way, a collection of Shakespeare's plays is richer than a phone book that uses the same number of letters; to put it another, the essence of information lies in the relationships among bits, not their sheer number.
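One way to see this concretely, as a sketch rather than anything from the text: a lossless compressor exploits relationships among symbols, so two strings of identical length can differ enormously in how much structure they contain (zlib and the sample strings are my own illustrative choices).

```python
import random
import string
import zlib

random.seed(0)
n = 10_000
structured = ("to be or not to be " * n)[:n].encode()  # highly patterned
scrambled = "".join(
    random.choices(string.ascii_lowercase + " ", k=n)
).encode()                                             # no pattern at all

# Same number of bytes in, very different numbers of bytes out: the
# compressor finds relationships in one string and almost none in the other.
print(len(zlib.compress(structured)))  # small: repetition is squeezed out
print(len(zlib.compress(scrambled)))   # large: little structure to exploit
```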
Nowhere is the difference between either/or and both/and more clearly apparent than in the context of information.