Twice a year, around the last Sunday in March and the last Sunday in October, everybody (newspaper journalists in particular) takes a few minutes off to rant about daylight saving time. So, for the first time, I want to join this tradition in writing.
Until I had kids, I could not have cared less about the question of changing the time twice a year. But at least for our kids (and secondarily for myself), I realize the biorhythm is quite strong and it takes more than a week to adapt to the one hour of jet lag (in particular in spring, when it means getting out of bed "one hour earlier"). I still don't really care about cows that have to deliver their milk at different times, since there is no intrinsic reason that the clock on the wall has to show a particular time when that happens; if it were really a problem, the farmers could milk at a fixed UTC time.
So, obviously, it is a nuisance. What, then, are the benefits that justify it? Well, obviously, in summer the sun sets at a later hour and we get more sun while being outside. That sounds reasonable. But why restrict it to the summer?
Which brings me to my point: if you ask me, we should get rid of changing the UTC offset twice a year and permanently adopt daylight saving time.
But now I hear people cry that this is "unnatural", that we have to keep the traditional time at least in winter, when it does not matter anyway because it is too cold to be outside (which, as we know, only holds for people with defective clothing). So how natural is CET, the time zone we set our clocks to in winter? Let's take people living in Munich as an example.
First of all: it is not solar time! CET is the mean solar time for a longitude of 15 degrees east, which (at Munich's latitude) is close to Neumarkt an der Ybbs, somewhere in Austria not too far from Vienna. Munich, at about 11.6 degrees east, is about 14 minutes behind. So this time is artificial as well, and with Berlin being closer to 15 degrees, it is probably Prussian.
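For a back-of-the-envelope check: mean solar time shifts by 4 minutes per degree of longitude (24 hours spread over 360 degrees). A minimal sketch; the city longitudes are approximate values I looked up, not numbers from this post:

```python
# Mean solar time runs 4 minutes per degree of longitude (24 h / 360 deg).
def offset_from_cet_minutes(longitude_deg_east):
    """Minutes by which local mean solar time lags behind CET
    (CET being mean solar time at 15 degrees east)."""
    return (15.0 - longitude_deg_east) * 4.0

# approximate longitudes (my lookup, not from the post)
for city, lon in [("Munich", 11.6), ("Berlin", 13.4), ("Vienna", 16.4)]:
    print(f"{city}: {offset_from_cet_minutes(lon):+.0f} min")
```

With these longitudes, Munich's sun lags CET by roughly a quarter of an hour, Berlin's by only a few minutes, and Vienna's runs slightly ahead.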
Also, a common time zone for Germany was only established in the 1870s, when the advent of railways and telegraphs made synchronization between different local times advantageous. So this "natural" time is not that old either.
It is so new that Christ Church college in Oxford still refuses to fully bow to it: their clock tower shows Greenwich time, and the cathedral services start according to solar time (about five minutes later), because they don't care about modern shenanigans. ("How many Oxford deans does it take to change a light bulb?" --- "Change??!??") Similarly, in Bristol there is a famous clock with two minute hands.
Plus, even if you live in Neumarkt an der Ybbs, your sun dial does not always show the correct noon! Thanks to the tilt of the earth's axis and the fact that the earth's orbit is elliptical, true noon drifts through the year by a number of minutes (the so-called equation of time).
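How large that sundial error gets can be estimated with the standard sinusoidal approximation to the equation of time; the coefficients below are the commonly quoted ones, not numbers from this post:

```python
import math

def equation_of_time_minutes(day_of_year):
    """Approximate difference between sundial time and clock (mean) time,
    in minutes; positive means the sundial runs ahead of the clock.
    Common sinusoidal approximation, accurate to about a minute."""
    b = 2 * math.pi * (day_of_year - 81) / 364
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

values = [equation_of_time_minutes(d) for d in range(1, 366)]
print(f"sundial runs up to {max(values):.0f} min ahead "
      f"and up to {abs(min(values)):.0f} min behind the clock")
```

So even a perfectly placed sundial disagrees with mean time by up to a quarter of an hour over the year.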
So "winter time" is in no way more natural than the other time zone, and we should be free to choose a time zone according to what is convenient. At least for me, noon is not the center of my waking hours (the split is more like 5.5 : 12). So aligning those hours more with the sun seems to be a pretty good idea.
PS: The title was a typo, but looking at it I prefer it the way it is...
Thursday, October 27, 2016
Monday, October 24, 2016
Mandatory liability for software is a horrible idea
Over the last few days, a number of prominent web sites including Twitter, Snapchat and Github were effectively unreachable for extended periods of time. As became clear, the problem was that DynDNS, a provider of DNS services for these sites, was under a number of very heavy DDoS (distributed denial of service) attacks that were mainly coming from compromised internet of things devices, in particular web cams.
Even though I do not see a lot of benefit in being able to change the color of my bedroom light via the internet, I love the idea of having lots of cheap devices (I continue to have a lot of fun with C.H.I.P.s, full scale Linux computers with a number of ports for just 5 USD, also for Subsurface, in particular for the opportunities they open for the mobile version). But there are of course concerns about how one can economically maintain a stable update cycle for such devices, in particular once they are built into black-box consumer products.
Now that some dust has settled comes of course the question "Who is to blame?", and whether we should do anything about this. Of course, the manufacturer of the web cam made this possible through far from perfect firmware. You could also blame DynDNS for not being able to withstand the storms that from time to time sweep the internet (a pretty rough place, after all), or services like Twitter for having a single point of failure in DynDNS (though that might be hard to prevent given the nature of the DNS system).
More than once I have now heard a call for new laws that would introduce liability for the manufacturer of the web cam, as they did not provide firmware updates in time to prevent these devices from being owned and then DDoSing around on the internet.
This, I am convinced, would be a terrible idea: it would make many IT businesses totally uneconomic. Let's stick with the case at hand. What is the order of magnitude of the damages that occurred to big companies like Twitter? They probably lost about a weekend's worth of ad revenue. Twitter recently made about $\$6\times 10^8$ per quarter, which averages to 6.5 million per day. Should the web cam manufacturer (or OEM, or distributor) now owe Twitter 13 million dollars? I am sure that would cause immediate bankruptcy. Even just the risk that this could happen would prevent anybody from producing web cams or similar things in the future, since nobody can produce non-trivial software that is free of bugs. You should strive to weed out all known bugs and provide updates, of course, but should you be made responsible if you couldn't? Responsible in a financial sense?
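The arithmetic behind these numbers, as a quick sanity check (the 91-day quarter is my assumption):

```python
# Back-of-the-envelope: quarterly ad revenue spread over the days of a quarter.
quarterly_revenue = 6e8            # roughly $6 * 10^8 per quarter
per_day = quarterly_revenue / 91   # ~91 days in a quarter (assumed)
weekend = 2 * per_day              # a two-day outage
print(f"per day: ${per_day/1e6:.1f}M, weekend outage: ${weekend/1e6:.1f}M")
```

This reproduces the roughly 6.5 million per day and 13 million per weekend quoted above.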
What was the damage caused by the Heartbleed bug? I am sure it was much more expensive. Who should pay for that? OpenSSL? Everybody who links against OpenSSL? The person who committed the buggy patch? The person who missed it in code review?
Even if you don't invoke these astronomic sums and instead set fixed fines (say, an unfixed vulnerability that gives root access to an attacker from the net costs \$10,000), that would immediately stop all open source development. If you give away your software for free, do you really want to pay fines if not everything is perfect? I surely wouldn't.
For that reason, the GPL contains the following clauses (other open source licenses have similar ones):
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
(capitalization in the original). Of course, there is the "required by applicable law" part, but I cannot see people giving you software for free if you can later make them pay fines.
And of course, it is also almost impossible to carve out exceptions in such a law. For example, a "non-commercial" exception does not help: even though nobody charges for open source software, a lot of it is actually provided with some sort of commercial interest.
Yes, I can understand the tendency to make creators of defective products who don't give a damn about an update path responsible for the stuff they ship. And I have the greatest sympathy for consumer protection laws. But here the collateral damage would be huge: we might well lose the whole open source universe, and every small software company except the few big ones that can afford the herds of lawyers needed to defend against these fines.
Note that I only argue against mandatory liability. It should of course always be possible for a provider of software or hardware to give some sort of "fit for purpose" guarantee to its customers, or a servicing contract in which they promise to fix bugs (maybe so that the customer can in turn fulfill her own liabilities to her customers). But in most cases, the provider will charge for that. And the price might be higher than what a light bulb with an IP address currently costs.
The internet is a rough place. If you expose your service to it, you had better make sure you can handle every combination of 0s and 1s that comes in from there, or live with the consequences. Don't blame the source of the bits (no matter how brain dead the people at the other end might be).
Friday, October 07, 2016
My two cents on this year's physics Nobel prize
This year's Nobel prize is given for quite abstract concepts, so the popular science outlets have struggled to give good explanations of what it is awarded for. I cannot add much to that, but over at Math Overflow, mathematicians asked for a mathematical explanation. So here is my go at an outline for people familiar with topology but not so much with physics:
Let me try to give a brief explanation: all this is in the context of Fermi liquid theory, the idea that you can describe the low energy physics of these kinds of systems by pretending they are governed by free fermions in an external potential. So all you need to do is solve the single particle problem for the external potential and then fill up the energy levels from the bottom until you reach the total particle number (or rather the density). It is tempting (and conventional) to call these particles electrons, and I will do so here, but of course actual electrons are not free but interacting. This "Fermi liquid" picture is just an effective description at long wavelengths (the IR end of the renormalization group flow), where it turns out that at those scales the interactions play no role (they are "irrelevant operators" in the language of the renormalization group).
The upshot is that we are dealing with free "electrons", and the previous paragraph was only essential if you want to connect to the physical world (but this is MATH Overflow anyway).
Since the external potential comes from a lattice (a crystal), it is invariant under lattice translations. So Bloch theory tells you that, as far as solving the Schrödinger equation is concerned, you can restrict your attention to wave functions living on the unit cell of the lattice. But you need to allow for quasi-periodic boundary conditions, i.e. when you go once around the unit cell you are allowed to pick up a phase. In fact, there is one phase for each generator of the first homotopy group of the unit cell. Each choice of these phases corresponds to one choice of boundary conditions for the wave function, and you can compute the eigenvalues of the Hamiltonian for these given boundary conditions (the unit cell is compact, so we expect discrete eigenvalues, bounded from below).
But these eigenvalues depend on the boundary conditions, and you can think of them as functions of the phases. Each phase takes values in U(1), so the space of possible phases is a torus, and you can think of the eigenvalues as functions on that torus. Strictly speaking, when going once around an irreducible cycle of the torus, not all eigenvalues have to come back to themselves; you can end up with a permutation, in which case you have not really a function but a section of a bundle. But let's not worry too much about this, since generically this "level crossing" does not happen in two dimensions and occurs only at discrete points in 3D (this is Witten's argument with the 2x2 Hamiltonian above).
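This setup is easy to play with numerically. As an illustration (a toy example of my own, not from the original discussion), here is a one-dimensional nearest-neighbour hopping model on a ring where going once around picks up a phase, so the spectrum becomes a continuous, periodic function of that phase:

```python
import numpy as np

def tight_binding_levels(theta, n_sites=6, t=1.0):
    """Eigenvalues of a 1D nearest-neighbour hopping Hamiltonian on a ring
    of n_sites sites, where going once around the ring picks up the phase
    exp(i*theta) -- a quasi-periodic (twisted) boundary condition."""
    h = np.zeros((n_sites, n_sites), dtype=complex)
    for j in range(n_sites - 1):
        h[j, j + 1] = h[j + 1, j] = -t
    h[n_sites - 1, 0] = -t * np.exp(1j * theta)   # the boundary hop carries the phase
    h[0, n_sites - 1] = -t * np.exp(-1j * theta)  # Hermitian conjugate entry
    return np.linalg.eigvalsh(h)                  # real, sorted ascending

# The spectrum is a periodic function of theta, i.e. a function on U(1):
levels_0 = tight_binding_levels(0.0)
assert np.allclose(levels_0, tight_binding_levels(2 * np.pi))
```

Sweeping theta from 0 to 2*pi traces out exactly the kind of "eigenvalues as functions on the torus" described above (here the torus is one-dimensional, a circle).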
The torus of possible phases is called the "Brillouin zone" by physicists and its elements "reciprocal lattice vectors" (you can think of the Brillouin zone as obtained by modding out by the lattice dual to the one we started with).
Now, if your electron density is N electrons per unit cell of the lattice, Fermi liquid theory asks you to think of the lowest N energy levels as occupied. This gives the "Fermi level", or more precisely the graph of the N-th eigenvalue over the Brillouin zone. This graph (viewed as a hypersurface) can have non-trivial topology, and the idea is that under small perturbations of the system (like changing the doping of the physical probe, or the pressure, or an external magnetic field, or whatever) everything behaves continuously, so the homotopy class cannot change and is thus robust (or "topological", as the physicists say).
If we want to inquire about the quantum Hall effect, this picture is also useful: the Hall conductivity can be computed to leading order by linear response theory. This allows us to employ the Kubo formula to compute it as a certain two-point function, or retarded Green's function. The relevant operators turn out to be related to the N-th level wave function and how it changes as we move around the Brillouin zone: if we denote by u the coordinates of the Brillouin zone and by $\psi_u(x)$ the N-th eigenfunction for the boundary conditions implied by u, we can define a 1-form
$$ A = \sum_i \langle \psi_u|\partial_{u_i}|\psi_u\rangle\, du^i = \langle\psi_u|d_u|\psi_u\rangle.$$
This 1-form is actually the connection of a U(1) bundle, and the expression the Kubo formula asks us to compute turns out to be the first Chern number of that bundle (over the Brillouin zone).
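To see how such a Chern number is computed in practice, here is a sketch using the standard lattice discretization of the Berry curvature (the Fukui-Hatsugai method) on a toy two-band model; the Qi-Wu-Zhang model below is my choice for illustration, not the specific system of the prize:

```python
import numpy as np

def qwz_hamiltonian(kx, ky, m):
    """Two-band Qi-Wu-Zhang lattice model:
    H(k) = sin(kx) sx + sin(ky) sy + (m + cos(kx) + cos(ky)) sz."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return np.sin(kx) * sx + np.sin(ky) * sy + (m + np.cos(kx) + np.cos(ky)) * sz

def chern_number(m, n=24):
    """First Chern number of the lower band, from U(1) link variables
    on an n x n grid over the Brillouin zone torus (Fukui-Hatsugai)."""
    ks = np.linspace(0, 2 * np.pi, n, endpoint=False)
    u = np.empty((n, n, 2), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            _, vecs = np.linalg.eigh(qwz_hamiltonian(kx, ky, m))
            u[i, j] = vecs[:, 0]          # eigh sorts ascending: lower band
    link = lambda a, b: np.vdot(a, b) / abs(np.vdot(a, b))  # U(1) link variable
    total = 0.0
    for i in range(n):
        for j in range(n):
            ii, jj = (i + 1) % n, (j + 1) % n
            # gauge-invariant Berry phase around one plaquette of the grid
            w = (link(u[i, j], u[ii, j]) * link(u[ii, j], u[ii, jj])
                 * link(u[ii, jj], u[i, jj]) * link(u[i, jj], u[i, j]))
            total += np.angle(w)
    return int(round(float(total / (2 * np.pi))))
```

For this model the lower band carries Chern number of magnitude 1 in the topological phase (e.g. m = -1) and 0 in the trivial phase (e.g. m = -3), and small perturbations of m within a phase leave the integer unchanged, exactly the robustness described above.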
Again, that number, being an integer, cannot change under small perturbations of the physical system, and this explains the quantization observed in the QHE.
In modern applications, an important role is played by the (N-dimensional and thus finite dimensional) projector onto the subspace of Hilbert space spanned by the eigenfunctions corresponding to the N lowest eigenvalues, again fibered over the Brillouin zone. One can then use K-theory (and in fact KO-theory) associated with this projector to classify the possible classes of Fermi surfaces (these are the "topological phases of matter"; eventually, when a perturbation becomes too strong, even the discrete invariants can jump, which physically corresponds to a phase transition).