The Science of Information: From Language to Black Holes

Explore the exciting concepts, history, and applications of information theory in 24 challenging and eye-opening half-hour lectures taught by a world-renowned physicist.
The Science of Information: From Language to Black Holes is rated 4.7 out of 5 by 61 reviewers.
Rated 5 out of 5 by from Another Plane of Existence Schumacher deepens your grasp of physics, molecular biology (see Lectures 13-15 = L13-15), and even geology/evolution (see L14's kaolin discussion). I would read the Guidebook and then view each lecture video; I have 9 pages of notes in addition to the course Guidebook. The course exercises are excellent. Strongly consider the Great Course "Physics and Our Universe" by Wolfson before taking this course.

LECTURES: Entropy is usually thought of as energy not available for work in a closed thermodynamic system, as a measure of disorder, or as a loss of information. In Lecture 3 (L3), entropy is counterintuitively defined as "the fundamental measure of information," and the connection between computing's information entropy and thermodynamic entropy runs deep. His next concepts, "entropy H of a source = minimum number of bits required to represent a message from the source" and "entropy should be a measure of novelty," make one begin to suspect that this course is about the relevance (and manipulation) of the unseen. Then comes the first clue of WHY we are to broaden our views of entropy: while it costs a mere 0.5 nanocent to store a bit of information, transferring information is costly, so it must be packed into the smallest possible number of bits. L4: The course's hero, Claude Shannon, put it this way: "The entropy of source X is the average surprise of the messages in X." Like everything in this course, one must challenge old ways of thinking. L6 starts a series of practical applications of entropy to lightening bit-transfer loads, discussing the perceptual coding behind MP3, JPEG, and MPEG-4. It ends: "The more we can predict, the less we have to encode." Thus wise loss = less transmission load, creating a useful "loss of information" link to the original definition of entropy. L7 goes into the technical challenges of noise (present unless at absolute zero); why noise affects amplitude more than frequency (making, for example, FM radio superior to AM); and its applications to the Voyager deep-space network. L12 exemplifies the usefulness of noise via Claude Shannon's WWII work on the Roosevelt/Churchill SIGSALY communications, where messages were compressed and then a random noise record (new for each message) was overlaid. This noise record created an unbreakable code (unless the record courier turned traitor). L13 discusses Schrödinger's aperiodic crystal and notes that genetic information is digital rather than analog (shown with X-rays on fruit flies, where genetic changes were all-or-none). He states that HIV's reverse transcription of RNA into host DNA bypasses the cell's safety checks, making HIV impossible to eliminate. It also contains the best Great Courses explanation of the Graham Cairns-Smith hypothesis, whereby self-reproducing kaolin (clay) in water (H2O) solutions can store and copy information, with boundaries between info blocks. L15 on nerve dendrites initially seems to limit the info storage capacity of synaptic terminals - until the end of the lecture, where he states that 10 to the 10th neurons each connect to 10 to the 3rd others. Even this misses that neuronal "waves" of information are somehow moderated by each neuron's multiplicity of connections. The latest estimate of neuronal information capacity is 1.5 petabytes! One might ask whether the time needed to "evolve" the cellular chemical "luck" behind the human brain's layered, sophisticated complexity remotely matches humanity's tiny archeological timeline. See also TGC's "What Darwin Didn't Know" by Solomon.

Memorable lines: L16: Thermodynamics is where the laws of nature and information meet. L17: "No process can have as its only result the erasure of information." L20: Turing's proof that the halting problem is an uncomputable function. Despite the stock market's avidity for AI, its potential for unstoppable runaway remains. L21: Feynman's summary: "Nobody understands quantum mechanics." L22: The beautifully deep "monogamy" of entanglement: "the deep meaning of the quantum no-cloning theorem" is that "because of his relationship to Alice, you can't clone Bob." L23: John Wheeler on science vs. reality includes his critical "It from bit" observation: laws "are math theory but missing the principle by which 'it' can become real." L24: The Guidebook's entire opening paragraph is marvelous.
Date published: 2024-04-22
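
The "average surprise" definition this reviewer quotes has a compact form: the Shannon entropy H(X) = -Σ p·log2(p), the expected value of the surprise -log2(p). A minimal Python sketch (the function name and the example distributions are illustrative, not from the course):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average surprise -log2(p) over a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per flip; a heavily biased coin far less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.47 -- little surprise, few bits needed
```
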
Rated 5 out of 5 by from Comprehensive and well presented This course has helped me improve my grasp of information theory, and I am grateful. This is a notoriously elusive subject; many writers and speakers quickly fall into a kind of gibberish intelligible only to themselves. This instructor generally manages to avoid that trap, though I will need more than one trip through this material before I can feel that I understand it well. Since any topic heavily concerned with entropy is difficult to explain, the instructor is to be congratulated for his achievement. That said, I wish the instructor had taken a slightly different approach to exploring the information in DNA. For example, he observes that human DNA has 6 billion base pairs. However, as far as information is concerned, human DNA comprises 23 pairs of chromosomes: 23 from dad and 23 from mom. Each set of 23 chromosomes contains about 3 billion base pairs, with enough information to construct a human being (using the X chromosome). Omitting some details (the Y chromosome, etc.), the other 3 billion base pairs are present just in case of a problem. How to express that mathematically? There is plenty of scope here for a full lecture, which might replace his somewhat unfocused lecture on the RNA world. Finally, a minor quibble: uracil does not have a methyl group.
Date published: 2023-10-27
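
The reviewer's question ("How to express that mathematically?") has a standard first-order answer, not taken from the course: each base pair selects one of four letters (A, T, G, C), i.e., log2(4) = 2 bits, so one 3-billion-base-pair set carries roughly

\[
3\times10^{9}\ \text{bp} \times 2\ \tfrac{\text{bits}}{\text{bp}} = 6\times10^{9}\ \text{bits} \approx 750\ \text{MB},
\]

an upper bound that ignores the heavy redundancy and correlations in real genomes.
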
Rated 5 out of 5 by from Excellent The content is captivating. The presentation is amazing. A must-watch.
Date published: 2023-04-06
Rated 5 out of 5 by from A most remarkable course This is a clear, concise, and complete course in information science, taught by an excellent teacher. Of course, to reap all of its benefits, it's best to have studied some elementary math. I would recommend Prof. Morris Kline's "Mathematics for the Nonmathematician" to all severe cases of Mathallergy.
Date published: 2021-08-11
Rated 5 out of 5 by from Don't be intimidated by the math! Don't be intimidated by the math! I listened to this course on audio during my daily walks, and I think that helped me absorb what I could without getting distressed that the math was often way beyond my understanding. (Because I was not looking at incomprehensible equations.) Reliably, the professor would also state the point he was trying to make in non-math terms, and that part I usually understood. Even without grasping the math, I think I took in most of his main points. Among the points in this course that I found interesting and surprisingly relevant to some of my interests:
  • language as an information system
  • generalizations about the frequency of words in a language
  • challenges and solutions for data compression
  • error-correcting strategies
  • parallels between errors induced by manuscript copying in ancient times and DNA transmission
  • types of codes, including simple locks and keys
  • breakable vs. unbreakable codes and how the Enigma machine code was broken during WWII
  • the challenge of communicating with future humans and extra-terrestrials
The professor modulates his voice enough that I could listen to him without feeling tired, even when I didn't quite follow what he was saying. From time to time he introduced down-home explanations that really did make things clearer - for example, comparing a certain challenge of cryptography to the old TV show "The Newlywed Game."
Date published: 2020-09-14
Rated 5 out of 5 by from Claude Shannon's Work! Currently about 1/3 of the way through, but this is outstanding. The professor based the course on the work of Claude Shannon, and it's the best way to learn the Science of Information!
Date published: 2020-05-11
Rated 5 out of 5 by from Great information I'm just getting started with this one. We are watching another course right now. The 'preview' looks fantastic. It's a bit of a 'challenge,' since I learned some of this from a different point of view, so to speak, when I was in school many years ago. I can't wait to get back into it.
Date published: 2020-02-09
Rated 5 out of 5 by from This man can teach Very impressed by the elegance and clarity of these lectures, and his personable delivery. Obviously he took great care and attention to distill the material and to work on phrasings and transitions that enable the viewer to pull things together. I had not appreciated the maths involved and don't fully understand them, but to learn that they are there and that they drive the science of information is itself a great lesson.
Date published: 2019-12-29

Overview

Never before have we been able to acquire, record, communicate, and use information in so many different forms. This revolution goes far beyond limitless content: information also underlies our understanding of ourselves, the natural world, and the universe. Discover how the concepts of information reveal breathtaking insights into the workings of nature, even as they lay the foundation of astounding new technologies.

About

Benjamin Schumacher

Gravity is about both phenomena near at hand at the human scale, everyday and intuitive, and phenomena far off at an astronomical scale.

INSTITUTION

Kenyon College

Dr. Benjamin Schumacher is Professor of Physics at Kenyon College, where he has taught for 20 years. He received his Ph.D. in Theoretical Physics from The University of Texas at Austin in 1990. Professor Schumacher is the author of numerous scientific papers and two books, including Physics in Spacetime: An Introduction to Special Relativity. As one of the founders of quantum information theory, he introduced the term qubit, invented quantum data compression (also known as Schumacher compression), and established several fundamental results about the information capacity of quantum systems. For his contributions, he won the 2002 Quantum Communication Award, the premier international prize in the field, and was named a Fellow of the American Physical Society. Besides working on quantum information theory, he has done physics research on black holes, thermodynamics, and statistical mechanics. Professor Schumacher has spent sabbaticals working at Los Alamos National Laboratory and as a Moore Distinguished Scholar at the Institute for Quantum Information at California Institute of Technology. He has also done research at the Isaac Newton Institute of Cambridge University, the Santa Fe Institute, the Perimeter Institute, the University of New Mexico, the University of Montreal, the University of Innsbruck, and the University of Queensland.

By This Professor

Black Holes, Tides, and Curved Spacetime: Understanding Gravity
Quantum Mechanics
The Science of Information: From Language to Black Holes
Impossible: Physics Beyond the Edge
The Science of Information: From Language to Black Holes

Trailer

01: The Transformability of Information

What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit, the basic unit of information.

33 min
02: Computation and Logic Gates

Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations.

31 min
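
As a taste of the kind of circuit this lecture builds, here is a minimal Python sketch (my own illustration, not the lecture's circuit) of a half adder: two Boolean gates that together add two bits.

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: XOR gives the sum bit, AND gives the carry bit."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```

Chaining full adders built from two of these (plus an OR gate) yields circuits that add numbers of any width, which is the path from Boolean algebra to arithmetic hardware.
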
03: Measuring Information

How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.

31 min
04: Entropy and the Average Surprise

Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.

31 min
05: Data Compression and Prefix-Free Codes

Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.

31 min
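
A standard concrete instance of such a prefix-free code (not necessarily the construction used in the lecture) is Huffman coding, whose average codeword length approaches the source entropy promised by Shannon's first theorem. A minimal Python sketch with an illustrative frequency table:

```python
import heapq

def huffman_code(freqs):
    """Build a prefix-free binary code from symbol frequencies."""
    # Each heap entry: (weight, tiebreaker, {symbol: codeword-so-far})
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # merge the two lightest subtrees,
        w2, _, c2 = heapq.heappop(heap)   # prefixing 0/1 to their codewords
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

freqs = {"e": 0.5, "t": 0.25, "a": 0.125, "o": 0.125}
code = huffman_code(freqs)
print(code)  # {'e': '0', 't': '10', 'a': '110', 'o': '111'}
avg = sum(freqs[s] * len(c) for s, c in code.items())
print(f"average length: {avg} bits/symbol")  # 1.75 -- equal to the entropy here
```

For this dyadic distribution the average length exactly matches the source entropy; in general Huffman gets within one bit of it.
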
06: Encoding Images and Sounds

Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.

30 min
07: Noise and Channel Capacity

One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.

31 min
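
For the simplest noise model, the binary symmetric channel that flips each bit with probability p, the capacity guaranteed by Shannon's theorem has a standard closed form (a textbook result, not quoted from the lecture):

\[
C = 1 - H(p), \qquad H(p) = -p\log_2 p - (1-p)\log_2(1-p),
\]

so a channel that corrupts 10% of bits (p = 0.1) still supports reliable communication at about C ≈ 0.53 bits per transmitted bit.
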
08: Error-Correcting Codes

Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft.

32 min
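
The simplest error-correcting code of all is repetition with majority vote; it is far less efficient than the codes the lecture covers, but a short Python sketch (my own illustration) shows the principle:

```python
def encode(bits, n=3):
    """Repeat every bit n times (n odd)."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Majority vote over each block of n copies."""
    blocks = [received[i:i + n] for i in range(0, len(received), n)]
    return [1 if sum(block) > n // 2 else 0 for block in blocks]

sent = encode([1, 0, 1])        # [1,1,1, 0,0,0, 1,1,1]
sent[1] = 0                     # noise flips one copy
print(decode(sent))             # [1, 0, 1] -- the single error is corrected
```

Real codes such as Hamming or Reed-Solomon achieve the same protection with far less redundancy, which is what makes Shannon's theorem practical.
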
09: Signals and Bandwidth

Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth: concepts that apply to many types of communication.

31 min
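
The capacity concept behind the Voyager example is the Shannon-Hartley theorem for a channel of bandwidth B and signal-to-noise ratio S/N (the lecture's actual figures are not reproduced here):

\[
C = B \log_2\!\left(1 + \frac{S}{N}\right).
\]

For illustration, B = 1 MHz at S/N = 1 gives C = 10^6 · log2(2) = 1 Mbit/s, and because the dependence on S/N is only logarithmic, even a drastically weakened signal still supports a usable data rate.
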
10: Cryptography and Key Entropy

The science of information is also the science of secrets. Investigate the history of cryptography starting with the simple cipher used by Julius Caesar. See how entropy is a useful measure of the security of an encryption key, and follow the deciphering strategies that cracked early codes.

29 min
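
A minimal Python sketch of the Caesar cipher (an illustration, not course code). Its weakness is exactly a key-entropy statement: there are only 25 usable shifts, so the key carries log2(25) ≈ 4.6 bits and is trivially brute-forced.

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by a fixed amount, wrapping around the alphabet."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)
    return "".join(out)

secret = caesar("ATTACK AT DAWN", 3)
print(secret)               # DWWDFN DW GDZQ
print(caesar(secret, -3))   # ATTACK AT DAWN
```
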
11: Cryptanalysis and Unraveling the Enigma

Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.

31 min
12: Unbreakable Codes and Public Keys

The one-time pad may be in principle unbreakable, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key cryptography, which makes modern transactions secure, for now.

31 min
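
The mechanics of the one-time pad fit in a few lines of Python (a sketch under the usual assumptions: a truly random key as long as the message, used only once). XOR-ing with the same key decrypts; reusing a key is precisely the mistake Venona exploited.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with the matching key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"MEET AT NOON"
key = secrets.token_bytes(len(msg))  # fresh random pad, as long as the message
ct = xor(msg, key)
print(xor(ct, key))  # b'MEET AT NOON' -- the same operation decrypts
```
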
13: What Genetic Information Can Do

Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA, the last universal common ancestor, which lived 3.5 to 4 billion years ago.

31 min
14: Life's Origins and DNA Computing

DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.

31 min
15: Neural Codes in the Brain

Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand versus the intricacies of neuron activity on the other. Then estimate the total information capacity of the brain.

31 min
16: Entropy and Microstate Information

Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the influential second law of thermodynamics, and conduct a famous thought experiment called Maxwell's demon.

30 min
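
The bridge between thermodynamic and information entropy that this lecture points to can be written down (standard physics, not quoted from the lecture): Boltzmann's formula

\[
S = k_B \ln W,
\]

where W counts the microstates consistent with the macroscopic description; every bit of missing microstate information contributes k_B ln 2 ≈ 9.6 × 10⁻²⁴ J/K of thermodynamic entropy.
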
17: Erasure Cost and Reversible Computing

Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be a practical barrier to computer processing speed. Learn how computer scientists deal with the demon.

31 min
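
The erasure cost alluded to here is Landauer's principle: erasing one bit must dissipate at least

\[
E_{\min} = k_B T \ln 2 \approx 2.9\times10^{-21}\ \text{J at } T = 300\ \text{K},
\]

tiny per bit, but a real thermodynamic floor once a processor erases trillions of bits every second.
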
18: Horse Races and Stock Markets

One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading.

32 min
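
Kelly's log-optimal rule has a famously simple closed form for a single repeated bet. A Python sketch (the numbers are illustrative, not from the lecture):

```python
def kelly_fraction(p: float, b: float) -> float:
    """Log-optimal bet fraction: p = win probability, b = net odds paid on a win."""
    return (b * p - (1 - p)) / b

# A 60% chance of winning an even-money bet (b = 1):
print(kelly_fraction(0.6, 1.0))  # 0.2 -> wager 20% of the bankroll each round
```

Betting more than this fraction grows the bankroll more slowly in the long run and raises the risk of ruin, which is the sense in which side information (a better p) is literally worth money.
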
19: Turing Machines and Algorithmic Information

Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.

31 min
20: Uncomputable Functions and Incompleteness

Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the Berry Paradox and including Turing's surprising proof that no single computer program can determine whether other programs will ever halt.

31 min
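
Turing's impossibility argument is short enough to sketch in Python. Suppose a correct halts(prog, inp) existed (hypothetical, which is why it is never actually called here); the self-referential program below would then have to both halt and not halt, a contradiction:

```python
def halts(prog, inp) -> bool:
    """Hypothetical oracle: True iff prog(inp) eventually halts. Cannot exist."""
    raise NotImplementedError("Turing proved no such total, correct function exists")

def trouble(prog):
    # Do the opposite of whatever the oracle predicts about prog run on itself.
    if halts(prog, prog):
        while True:      # predicted to halt -> loop forever
            pass
    return               # predicted to loop -> halt immediately

# trouble(trouble) halts iff halts(trouble, trouble) is False -- contradiction either way.
```
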
21: Qubits and Quantum Information

Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.

30 min
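
That "bizarre behavior" has a compact mathematical core: a qubit is a pair of complex amplitudes, and measurement probabilities are their squared magnitudes. A tiny pure-Python sketch (illustrative, not course code) of putting |0⟩ through a Hadamard gate:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a qubit state [amp0, amp1]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

state = [1.0, 0.0]            # the definite state |0>
state = hadamard(state)       # equal superposition of |0> and |1>
probs = [abs(amp) ** 2 for amp in state]
print(probs)                  # ~[0.5, 0.5] -- a 50/50 measurement, unlike any classical bit
```
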
22: Quantum Cryptography via Entanglement

Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The Newlywed Game that illustrates the monogamy of entanglement. This is the principle underlying quantum cryptography.

32 min
23: It from Bit: Physics from Information

Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!

31 min
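
The information content of a black hole has a precise expression: the Bekenstein-Hawking entropy, proportional to the horizon area A rather than the volume (a standard formula; whether the lecture displays it explicitly is not confirmed here):

\[
S_{BH} = \frac{k_B c^3 A}{4 G \hbar},
\]

one root of the holographic idea that the information in a region scales with its surface area.
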
24: The Meaning of Information

Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography: designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and ask, "What does the theory of information leave out?"

33 min
