Saturday, November 1, 2008

Deleting and entropy

Quantum computation is a hot topic, no doubt. But sometimes I feel that, in order to understand it well, we should go deeper into the physics of classical computation... So, let's go.

According to Landauer, the only step in a computation which requires an expenditure of energy is deletion. Why is that? His argument is simple: deletion is the only irreversible operation in a computer. Therefore it implies an increase in entropy ΔS, and so we have to pay an energy T ΔS.
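For a single bit, the entropy bookkeeping is short (a sketch using standard Boltzmann counting): erasing a bit merges two equally likely states into one, so the entropy handed over to the environment must be at least

```latex
\Delta S = k \ln 2
\quad\Longrightarrow\quad
E_{\min} = T\,\Delta S = kT \ln 2
```

At room temperature this comes to about 3 × 10⁻²¹ J per bit, as we'll see below.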

So, among the many things this assertion makes me think of, here is one that can be fun. How do we store a memory bit, physically? However you decide to do it, it should be a system with two states, 0 and 1. Let's consider a 1D system characterized by a potential V(x), which in principle is a double well with two equal minima, associated with the values 0 and 1 and separated by a potential barrier.

At temperature T, the equilibrium probability of finding the system at position x, p(x), will be proportional to the Boltzmann factor exp[-V(x)/kT]. This function p(x) has two equal maxima, at the 0 and 1 positions.
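To make this concrete, here is a small numerical sketch. The quartic V(x) = (x² − 1)² is just an assumed toy potential, with minima at x = ±1 standing for the bit values 0 and 1:

```python
import numpy as np

def V(x):
    # Toy double-well potential: minima at x = -1 ("0") and x = +1 ("1"),
    # separated by a barrier of height V(0) = 1 (arbitrary units)
    return (x**2 - 1)**2

kT = 0.1                            # thermal energy, same arbitrary units as V
x = np.linspace(-2.0, 2.0, 2001)
w = np.exp(-V(x) / kT)              # Boltzmann factor
p = w / (w.sum() * (x[1] - x[0]))   # normalized equilibrium distribution p(x)

# The two equal maxima of p(x) sit at the potential minima:
x_left = x[x < 0][np.argmax(p[x < 0])]
x_right = x[x > 0][np.argmax(p[x > 0])]
print(x_left, x_right)              # the "0" and "1" positions, x = -1 and x = +1
```

Raising the barrier (or lowering kT) sharpens the two peaks but never breaks the symmetry: at equilibrium, both bit values are equally likely.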

So, let's assume that the bit is 1. This means that the probability distribution is highly peaked around the 1 position. Since the potential energy is the same for both values, 0 and 1, can this be stable? It can persist for a long time if the barrier is large, but the distribution will eventually relax back to the symmetric equilibrium p(x), and the stored information will be lost.
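We can watch this loss of stability happen in a tiny Metropolis simulation (again with the assumed toy potential; the barrier height and kT are in the same arbitrary units):

```python
import numpy as np

rng = np.random.default_rng(0)

def V(x, barrier):
    # Toy double well; `barrier` is the hump height at x = 0 (arbitrary units)
    return barrier * (x**2 - 1)**2

def fraction_in_one_well(kT, barrier, steps=100_000):
    """Metropolis random walk starting in the '1' well (x = +1).
    Returns the fraction of time spent on the x > 0 side."""
    x, right = 1.0, 0
    for _ in range(steps):
        x_new = x + rng.normal(0.0, 0.3)
        # Accept with probability min(1, exp(-dV/kT)); the min avoids overflow
        dV = V(x_new, barrier) - V(x, barrier)
        if rng.random() < np.exp(min(0.0, -dV / kT)):
            x = x_new
        right += x > 0
    return right / steps

# Low barrier: the walker hops between wells and the bit is forgotten
print(fraction_in_one_well(kT=0.5, barrier=0.5))
# High barrier (many times kT): the walker stays near x = +1, the bit survives
print(fraction_in_one_well(kT=0.5, barrier=10.0))
```

With a barrier comparable to kT, the time-averaged fraction on the right side drifts toward 1/2: thermal noise erases the memory of the initial state. With a barrier many times kT, the walker essentially never crosses during the run.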

Let us assume that the barrier is large enough for our purposes. Now two questions arise: (a) how can we take the probability distribution from one minimum to the other? According to Landauer, we should be able to do this without any expenditure of energy. And (b) how can we delete the information? For this, we must spend a minimum energy of kT log(2) per bit.
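Plugging in numbers (standard value of Boltzmann's constant; room temperature assumed):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # room temperature, K

# Landauer bound: minimum energy dissipated to erase one bit
E_min = k_B * T * math.log(2)
print(f"{E_min:.3e} J per bit")   # about 2.9e-21 J
```

For comparison, real logic gates today dissipate many orders of magnitude more than this per switching event, so the Landauer bound is far from being the practical limitation yet.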

Starting out

Science is communicated in many different ways: journals, books, conferences, etc. And science is taught in a few ways, also: books, reviews, classes, etc. All these ways tend to be formal and give the impression of finished work. But work in science is always work in progress. Real science is more akin to the scribbled napkins left after a discussion in a cafeteria than to the polished pages of a book.

So, why not use the biggest cafeteria in the world? That is, the web. We'll have discussions here about both advanced and basic topics, help solve problems, spread news and gossip...

And, why entropy? Two reasons. One is that the napkins will be rather messy. The other is that we like the concept of entropy. It has had a central role in many sciences, and this role is going to grow in the future...

We have decided on a few categories for the posts:
• Travelers' tales: discussing news, books, websites...
• Hot topics: the big debates, both about and around science.
• Amusements: curiosities, surprising connections... but never too serious.
• Newbies: soft introduction to a topic.
• Small talk: just that, from discussion about the blog to jokes to... whatever.
• Challenges: Can you solve this?
• I wonder: Why does this happen?
• Help wanted: Asking for collaboration.
• Baby ideas: perhaps they'll be able to grow up here.
And we'll try to state the scientific level required to understand a post: Everybody, High School, College, Research.