Project IV 2023-24


Entropy and Information

Simon Ross (Michaelmas), Madalena Lemos (Epiphany)



Description

The notion of entropy was first introduced in thermodynamics, as a measure of our ignorance of the state of a system. A closely related notion was introduced into classical information theory by Shannon in 1948. The Shannon entropy measures the amount of "information", or surprise, in a message, and determines how efficiently the message can be compressed. If a message's content is entirely predictable, it can be compressed very efficiently. By contrast, an entirely random string of bits is completely unpredictable and cannot be compressed further. Thus, uncertain messages carry more information: this notion is central to both data compression and communication over noisy channels.
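As a concrete illustration (a sketch, not part of the project itself), the Shannon entropy of a message's empirical symbol distribution can be computed in a few lines; the function name and example strings below are our own choices:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    symbol distribution of `message`."""
    counts = Counter(message)
    n = len(message)
    # H = -sum_i p_i log2 p_i, with p_i the relative frequency of symbol i
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A fully predictable message carries no information:
print(shannon_entropy("aaaaaaaa"))  # 0.0
# Symbols drawn uniformly from {a, b} carry 1 bit per symbol:
print(shannon_entropy("abababab"))  # 1.0
```

This makes the compression statement quantitative: Shannon's source coding theorem says a long message can be compressed to roughly H bits per symbol, so the all-`a` string compresses to almost nothing while a uniformly random bit string cannot be shortened at all.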

Communication using quantum systems introduces new possibilities, such as quantum teleportation. For quantum states, the von Neumann entropy similarly measures the uncertainty in our knowledge of the state. This uncertainty can be thought of as arising from the interaction of the quantum system with its environment, producing entanglement. There are again relations to compressing quantum information and transmitting it over noisy channels.
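The quantum analogue can be sketched just as briefly. The von Neumann entropy of a density matrix rho is S(rho) = -Tr(rho log2 rho); the helper below (our own illustrative code, using NumPy) evaluates it via the eigenvalues of rho:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    eigvals = np.linalg.eigvalsh(rho)   # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]  # 0 log 0 = 0 by convention
    return float(-np.sum(eigvals * np.log2(eigvals)))

# A pure qubit state |0><0| has zero entropy: the state is fully known.
pure = np.array([[1, 0], [0, 0]], dtype=complex)

# The maximally mixed qubit state I/2 has entropy 1 bit; it arises, for
# example, as the reduced state of one half of a maximally entangled pair,
# illustrating how entanglement with an environment produces uncertainty.
mixed = np.eye(2, dtype=complex) / 2
```

On these two states the function returns 0 and 1 bit respectively, mirroring the classical extremes of a fully predictable and a fully random source.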

The aim of the project is to understand classical and quantum information theory, with a focus on these two notions of entropy.

Prerequisites

References