Did I mention before that the package selection tool (originally from Fedora) that comes with the Aspire One sucks big time? Yesterday, I came across the zebra barcode reader; I could use it together with the built-in camera to keep track of the books I lend to various people. Zebra was of course not packaged, and compiling it required the C++ bindings for ImageMagick, which I could not install due to dependency problems. I spent more or less the whole day installing this, compiling that, and upgrading the other. In the end I had zebra running (only to learn that ordinary barcodes are either too small to be resolved by the webcam, or bringing them closer to the cam is too close for the camera to focus. Bummer. Enlarging the barcode on the copier made the computer recognize it, but that somehow defeats the purpose), but I had left the Aspire One in a state unable to boot.
Luckily, it only failed to boot into X; I could still get a prompt, mount an external disk, and back up my home dir. And now I am installing Ubuntu on it. I will continuously update this post as I go.
Step one was to get the installation disk onto a bootable USB stick. UNetbootin is a wonderful tool for this; it only took me some time to find it. On the first go I ended up with a "Missing operating system" error when trying to boot from the stick. Fortunately, Bug #277903 in usb-creator (Ubuntu): "Missing Operating System [message at boot]" has the solution: remove the partition using fdisk(!!!) and create a new one using gparted. That worked. Now I am installing from that stick.
Stay with me and keep your fingers crossed!
Update: I had expected to spend a long time fighting with various bits and pieces, with time to blog the solutions as I found them. But there were no solutions to find. Everything I tried so far just worked. Without any hassle. So there is nothing to say, except that I should have switched to Ubuntu much earlier. OK, two things do not work so far: the mic (for Skype, for example) and the Bluetooth USB stick. But I have not really tried.
Wednesday, January 21, 2009
Sunday, January 18, 2009
A feature I would love
I have a problem, and it seems so far no one has come up with a good solution for it: many web 2.0 services come in the form of a stream of items, most of which I would like to see exactly once. Examples are RSS feeds, podcasts (where "seeing" could just as well mean syncing with the iPod), mail (in some respect), Twitter, Usenet, etc.
If I access them with a single program on a single device, this program keeps track of what I have seen so far and (at least in default mode) only presents me the new stuff. Excellent. Only that I use more than one device to access these services: I have my desktop, a laptop at home, a netbook on the road or in meetings and seminars, and sometimes I even use other people's devices.
What I want is a way for all these different devices to share the information about which items I have already seen. C'mon, this can't be that hard to implement. And I am sure I am not the only person with more than one computer.
The problem is most pronounced with RSS feeds. As mentioned some time ago, I used Liferea to read blogs. This is where I first noticed that I would like to share the "read it" flag between different computers. I thought the program might have a file like .newsrc in the old days, recording for each feed which posts have been seen before. Then it shouldn't be too hard to put this under subversion control or write a small perl script to merge those files. Except that information is not kept in a simple format. Instead, Liferea keeps the downloaded content in some XML file, and that file carries the flags. In order to merge the state of read items, one would have to download the content as Liferea does and then sync the flags. Why do they have to mix the content and the meta information, why why why? I even looked at Liferea's source, but I couldn't see the possibility of an easy patch.
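To illustrate what I had hoped for: if the read state were stored in a simple, content-free format, say one file per machine listing, per feed URL, the IDs of items already seen (a purely hypothetical layout, emphatically not how Liferea actually stores things), merging would be a trivial set union. A minimal sketch in Python:

    # Hypothetical merge of per-machine "seen items" files.
    # Assumed format (NOT Liferea's): one line per item, "<feed-url> <TAB> <item-id>"
    import sys

    def load_seen(path):
        """Read one machine's file into a dict: feed URL -> set of seen item IDs."""
        seen = {}
        with open(path) as f:
            for line in f:
                line = line.rstrip("\n")
                if "\t" not in line:
                    continue  # skip blank or malformed lines
                feed, item = line.split("\t", 1)
                seen.setdefault(feed, set()).add(item)
        return seen

    def merge(paths):
        """Union the seen-item sets coming from several machines."""
        merged = {}
        for path in paths:
            for feed, items in load_seen(path).items():
                merged.setdefault(feed, set()).update(items)
        return merged

    if __name__ == "__main__":
        # Usage: python merge_seen.py desktop.seen laptop.seen > merged.seen
        for feed, items in sorted(merge(sys.argv[1:]).items()):
            for item in sorted(items):
                print(feed + "\t" + item)

Syncing would then just mean running this over the files from all machines and distributing the result, e.g. via subversion. The XML-with-embedded-content format is exactly what makes this kind of dumb merge impossible.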
I "solved" this by giving up on liferea and moving to Google Reader instead. Since there, feeds are not read locally but on a single server, there is no problem with sharing state information. Only that I am not very comfortable with letting google know which feeds I like. And maybe at some point the GUI gets on my nerves or whatever. I don't think this is an ideal solution.
For more or less the same reason I use a similar approach to mail: mail for all my different addresses ends up in the inbox of my desktop computer (except for mailing lists etc., which are sorted into appropriate alternative inboxes, but on the same computer). In case I want to read mail on a different computer, I ssh to the desktop. Except that it is not directly connected to the net: I first have to ssh to LMU's firewall. But that computer still cannot see my desktop, so from there I ssh to some PC in the Arnold Sommerfeld Center, from where I can eventually ssh to my computer (although only with a numeric IP, since my computer is not important enough to get a DNS entry, and so far I was too lazy to hook it up to dyndns; but over recent months DHCP was kind enough to always give me the same IP). This chain of computers is supported by appropriate entries in .ssh/config. But I am digressing.
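For the curious, the .ssh/config entries I mean look roughly like this (hostnames and the user are placeholders, and it assumes nc is installed on the intermediate machines so that ProxyCommand can pipe the connection through them):

    # Hop 1: the firewall, reachable directly
    Host firewall
        HostName firewall.example.lmu.de
        User myuser

    # Hop 2: a PC at the ASC, reached through the firewall
    Host asc-pc
        HostName asc-pc.example.lmu.de
        User myuser
        ProxyCommand ssh firewall nc %h %p

    # Hop 3: my desktop, reached through the ASC PC (numeric IP, no DNS entry)
    Host desktop
        HostName 10.0.0.42
        User myuser
        ProxyCommand ssh asc-pc nc %h %p

With entries like these, a plain "ssh desktop" walks the whole chain.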
What I wanted to say about mail is not so much that I don't want to use IMAP and one central mail server. It is also about saved mail in local folders. Of those I have too many, and too much volume, to put them all on IMAP, at least on publicly accessible computers. And on my desktop (where I can install whatever I want) it would be of no use, since that machine is always two hops away from the rest of the internet, at least for incoming connections. OK, I could set up ssh tunnels, but those usually do not work reliably over longer times (we are talking at least weeks; I want reliable access to my mail even if I travel for a longer time).
But again, the main problem seems to be sharing the 'read it' information.
One more incarnation of the same problem: iTunes of course does not exist for Linux, so I manage my iPod with Amarok. I would like to use my different computers to upload recent podcasts to the iPod, but I have not found a way to do this consistently.
Friday, January 02, 2009
Thermodynamics of gravitational systems
In this last (first?) post of the year I would like to express some confusion I have with respect to applying thermodynamic reasoning to cosmology or, in general, to situations governed by gravity. The main puzzle I would like to understand is the entropy balance of the universe: according to the second law of thermodynamics, entropy is never decreasing (I hope this is the correct sign, I can never remember it. Let's see: S is minus the trace of rho log rho. If rho is proportional to the projector onto an N-dimensional subspace, we have S = -N (1/N) log(1/N) = log(N), so entropy increases as the probability spreads over a larger subspace. Good). So if it is increasing, it should have been minimal at the big bang, which seems to be in conflict with the universe being a hot soup of all kinds of fluctuations right after the big bang.
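Spelled out once, so I do not have to redo the sign check next time (this is just the von Neumann entropy of the maximally mixed state on an N-dimensional subspace, with projector P_N):

    S = -\mathrm{Tr}\,\rho\log\rho ,\qquad
    \rho = \frac{1}{N}\,P_N
    \;\Longrightarrow\;
    S = -N\cdot\frac{1}{N}\log\frac{1}{N} = \log N ,

which indeed grows monotonically with the dimension of the occupied subspace.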
With the popular-science interpretation of entropy as a measure of disorder or negative information, the early universe must have been highly ordered and should have contained maximal information, a notion which is highly counterintuitive. So this needs some clearing up.
The simplest resolution would be that it is compatible with observation to assume that the universe has infinite volume. If it also has a finite entropy density, the total entropy is infinite, and any discussion of increasing or decreasing entropy is meaningless: it will be infinite at any time, and it does not make sense to talk about more or less infinite entropy.
We could, however, still try to make sense of it in a local, densitised version: we could make the usual cosmological assumption that the universe is pretty much homogeneous and talk solely of entropy densities (after all, we only observe a Hubble-sized ball of it and should thus only make appropriately local statements). But since the universe is expanding, should we use co-moving or constant volumes when computing the densities for a densitised second law? Still, I don't think this is the real problem.
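To make the ambiguity explicit (nothing beyond standard FRW bookkeeping): if S is the entropy contained in a fixed comoving volume V_c, the physical entropy density is diluted by the expansion, so a "second law for densities" depends on which volume you hold fixed:

    s_{\mathrm{phys}}(t) \;=\; \frac{S(t)}{a(t)^{3}\,V_{c}} ,\qquad
    \frac{dS}{dt}\ge 0 \;\;\not\Longrightarrow\;\; \frac{ds_{\mathrm{phys}}}{dt}\ge 0 ,

since even at constant total S the physical density falls off as a(t)^{-3}.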
I am much more worried about another point: I am not convinced it makes sense to apply thermodynamic reasoning to situations that involve gravity! Obviously, the universe as we see it is not in thermal equilibrium; all the interesting stuff we see consists of local fluctuations. So standard textbook equilibrium thermodynamics does not apply. Remember, for example, that temperature is a property of an equilibrium; the fact that it is well defined is sometimes called the zeroth law, and out-of-equilibrium situations do not have a temperature! Only if locally things are not too different from an equilibrium state can one assign something like a local temperature. But things are even worse: the usual systems that we are used to describing thermodynamically (steam engines, containers of gas, etc.) have the property that the equilibrium is an attractor of the dynamics: all kinds of small, local perturbations diffuse away exponentially fast. This is in line with our intuitive understanding of the second law: the homogeneous state is the one with the highest entropy, and thus the diffusion is governed by the second law.
This is not the case anymore as soon as gravity is the dominating force. What is different here is that gravity is always attractive: if you have a nearly homogeneous matter distribution with small local fluctuations, over-dense regions gravitate more strongly and thus become even denser, while under-dense regions gravitate less and become even emptier. Thus the contrast increases over time (a feature which is of course essential to structure formation of galaxies, stars, etc.). But this means the homogeneous equilibrium is unstable, which is at least in conflict with the naive understanding of the second law above.
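A back-of-envelope version of this instability (the textbook Newtonian, pressureless, non-expanding estimate, so take it as an illustration rather than a cosmological calculation): a small density contrast delta on top of a uniform background of density rho_0 obeys

    \ddot{\delta} \;=\; 4\pi G\rho_{0}\,\delta
    \;\Longrightarrow\;
    \delta(t)\propto e^{t/\tau} ,\qquad \tau = \frac{1}{\sqrt{4\pi G\rho_{0}}} ,

so instead of diffusing away like a perturbation in a gas, the contrast grows exponentially; in an expanding matter-dominated background the growth is softened to a power law, but it is still growth rather than decay.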
Some deeper inspection reveals that when you axiomatise thermodynamics, you usually make some assumption on convexity (or concavity, depending on whether you use intensive or extensive variables of state) of your favorite thermodynamic potential (free energy etc.). IIRC this is something you impose: your system has to fulfill this property in order to be described by thermodynamics. And it seems that gravity does not have this property (the stability), and it is quite possible (if I am not mistaken, sitting here in a train without any books or internet access) that thermodynamic arguments do not apply to gravity.
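The standard classical illustration of this missing stability (textbook material, added here only as a reminder): a bound, self-gravitating system in virial equilibrium has 2K + U = 0, so with the kinetic energy playing the role of temperature (for an ideal monatomic gas of N particles),

    E = K + U = -K ,\qquad K = \tfrac{3}{2}Nk_{B}T
    \;\Longrightarrow\;
    C = \frac{dE}{dT} = -\tfrac{3}{2}Nk_{B} < 0 ,

a negative heat capacity: the system gets hotter as it loses energy, which is precisely the sort of behaviour the convexity assumptions are there to exclude.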
Note well that I am talking classically (actually even only about the weak-field situation in which the fluctuations are well described by Newtonian gravity); I have not even mentioned black holes and their negative heat capacity due to Hawking radiation, which should make you even more uneasy about thermodynamic stability.
There is, however, a related problem my classically relativistic friends told me about: when discussing cosmology, it is usually a good first approximation that the universe is homogeneous, which it supposedly is at large scales. At small scales, however, this is obviously not the case, with voids, galaxies, stars, stones, etc. But for the evolution at large scales you average over all those local fluctuations and replace everything by the cosmological fluid.
The problem with the non-linear theory of gravity is, however, that it is far from obvious that this averaging commutes with time evolution: that, starting from good initial conditions, it does not matter whether you first average and then compute the time evolution of the averaged matter density, or first compute the time evolution and then do the spatial averaging. The first is of course what we always compute, while the second is what really happens. An incarnation of this problem was an argument discussed a few years ago that what looks like a cosmological constant in our local patch of the universe is just a density fluctuation with a super-horizon wavelength. At first you would reject such a suggestion, since something that happens in regions that are causally disconnected from us should not influence our local observations. However, due to the non-linear nature of gravity this argument is too fast and needed a more thorough inspection. My impression is that eventually it was decided that this idea does not work; I would be happy to be informed by somebody who follows these things more closely.
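Schematically, the worry is just that a spatial average and a non-linear evolution law need not commute:

    \big\langle\, \mathcal{E}[g_{\mu\nu}] \,\big\rangle \;\neq\; \mathcal{E}\big[\langle g_{\mu\nu}\rangle\big]
    \quad\text{for non-linear } \mathcal{E} ,

where the angle brackets denote spatial averaging and E stands for the (non-linear) Einstein evolution equations; in a linearised theory the two orderings would agree.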
To wrap up, I feel I would need to understand much more basic things about thermodynamics applied to gravity before I could make sensible statements about the entropy of the universe, Boltzmann brains, and the like.