# Stability of eigenfunctions

### From Polymath1Wiki

In [CZ1994] some bounds for the Neumann heat kernels *P*_{t}(*x*,*y*) on a general domain are given; in particular, one can bound this kernel by a multiple of the Euclidean heat kernel for t small enough. Using this bound, [BP2008] showed uniform stability of the second eigenfunction (in the uniform topology) and eigenvalue with respect to uniform perturbations of the domain.


## Formal theory

Suppose one has a one-parameter family *L*(*t*) of self-adjoint operators (on some Hilbert space, e.g. *L*^{2}(Ω), though for this formal computation the domain will not be important), and one-parameter families λ(*t*), *u*(*t*) of eigenvalues and eigenfunctions solving the eigenfunction equation

*L*(*t*)*u*(*t*) = λ(*t*)*u*(*t*). (1)

We normalise *u*(*t*) to have norm 1:

- ⟨*u*(*t*),*u*(*t*)⟩ = 1. (2)

Formally, if we differentiate the norm equation (2) at time zero, we get

- ⟨∂_{t}*u*(0),*u*(0)⟩ + ⟨*u*(0),∂_{t}*u*(0)⟩ = 0 (3)

while if we differentiate (1) at time zero we obtain

- ∂_{t}*L*(0)*u*(0) + *L*(0)∂_{t}*u*(0) = ∂_{t}λ(0)*u*(0) + λ(0)∂_{t}*u*(0). (4)

Taking the inner product of (4) with u(0) and using (2), (3) we conclude the *Hadamard first variation formula* for eigenvalues:

- ∂_{t}λ(0) = ⟨∂_{t}*L*(0)*u*(0),*u*(0)⟩. (5)

If we instead take the orthogonal projection onto the orthogonal complement of *u*(0), we obtain the *Hadamard first variation formula* for eigenfunctions:

- (*L*(0) − λ(0))π∂_{t}*u*(0) = −π(∂_{t}*L*(0)*u*(0)), (6)

where π denotes the orthogonal projection onto the orthogonal complement of *u*(0).

Formally, if λ(0) is a simple eigenvalue, then *L*(0) − λ(0) is invertible on the orthogonal complement of *u*(0), and so (6) and (3) allow one to solve for ∂_{t}*u*(0).
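In finite dimensions, the eigenvalue variation formula (5) can be checked numerically against a difference quotient; a minimal sketch (the matrices below are arbitrary illustrative choices, not from the text):

```python
import numpy as np

# Finite-dimensional sanity check of the Hadamard first variation formula
# (5): the t-derivative of an eigenvalue of L(t) = L0 + t*Ldot at t = 0
# equals <Ldot u(0), u(0)>.  L0 and Ldot are arbitrary illustrative
# symmetric matrices.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)); L0 = (A + A.T) / 2
B = rng.standard_normal((n, n)); Ldot = (B + B.T) / 2

def lowest(t):
    # lowest eigenvalue/eigenvector of the family at time t
    w, v = np.linalg.eigh(L0 + t * Ldot)
    return w[0], v[:, 0]

lam0, u0 = lowest(0.0)
predicted = u0 @ Ldot @ u0                           # Hadamard formula (5)
h = 1e-6
numeric = (lowest(h)[0] - lowest(-h)[0]) / (2 * h)   # centred difference quotient
print(predicted, numeric)
```

The two printed values should agree up to the finite-difference error.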

## L^2 and H^1 theory

Suppose that one has a domain Ω with second Neumann eigenvalue λ_{2}, and third Neumann eigenvalue (*not* counting multiplicity) λ_{3} > λ_{2}. Thus, one has

- ∫_{Ω} | ∇*u* |^{2} ≥ λ_{2} ∫_{Ω} | *u* |^{2} (7)

for all mean zero u, with equality when u lies in the second eigenspace *V*_{2}, and one can improve this to

- ∫_{Ω} | ∇*u* |^{2} ≥ λ_{3} ∫_{Ω} | *u* |^{2} (8)

when u is orthogonal to *V*_{2}.

Now consider a perturbation *B*Ω of Ω, where B is an invertible linear transformation. Then a second eigenfunction of *B*Ω, after change of variables, becomes a function u on Ω that minimizes the modified Rayleigh quotient

- ∫_{Ω} *M*∇*u*·∇*u* / ∫_{Ω} | *u* |^{2}

where *M*: = (*B*^{ − 1})(*B*^{ − 1})^{T}. We may normalize this eigenfunction as *u* = *u*_{2} + *v*, where u_2 is a unit eigenfunction in V_2 and v is orthogonal to V_2. Then the modified Rayleigh quotient of u is less than or equal to that of u_2, and hence

Expanding out u as u_2+v and rearranging, we end up at

Note that ∇*v* is orthogonal to ∇*u*_{2} by integration by parts, and so we may replace the corresponding term on the RHS by its orthogonal projection. Letting σ_{1}(*M*) = σ_{2}(*B*)^{ − 2} be the least singular value of M, we now apply Cauchy-Schwarz and conclude that

- .

By (8) we may bound ∫_{Ω} | *v* |^{2} ≤ λ_{3}^{ − 1} ∫_{Ω} | ∇*v* |^{2}, and so we conclude that

- . (9)

This gives an H^1 bound on the error v between the perturbed eigenfunction u and the original eigenfunction u_2. To understand this bound, note that we may upper bound

and also so that

and so one has

provided that the denominator is positive. In terms of the condition number κ: = σ_{2}(*B*) / σ_{1}(*B*) of B, this becomes

- (15)

which is a non-trivial bound when κ < (λ_{3} / λ_{2})^{1 / 2}, and is of the order of when κ is close to 1. Using (8), we conclude in particular that

- (16)

For these calculations performed on a reference triangle, see http://www.math.sfu.ca/~nigam/polymath-figures/Perturbation.pdf
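The identity σ_{1}(*M*) = σ_{2}(*B*)^{ − 2} used in this section (least singular value of *M* = (*B*^{ − 1})(*B*^{ − 1})^{T} versus largest singular value of B) can be verified numerically; a minimal sketch with an arbitrary example matrix:

```python
import numpy as np

# Check that the least singular value of M = (B^{-1})(B^{-1})^T equals
# sigma_2(B)^{-2}, where sigma_2(B) is the largest singular value of B.
# B is an arbitrary invertible 2x2 example matrix.
B = np.array([[1.3, 0.4], [0.1, 0.8]])
Binv = np.linalg.inv(B)
M = Binv @ Binv.T
sv_B = np.linalg.svd(B, compute_uv=False)   # singular values, descending
sv_M = np.linalg.svd(M, compute_uv=False)
least_sv_M = sv_M[-1]
print(least_sv_M, sv_B[0] ** -2)
```

The two printed values coincide because the singular values of M are exactly the inverse squares of those of B.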

## An alternate approach to the L^2 and H^1 theory

Let us keep the reference triangle Ω fixed, and view the matrix M=M(t) as being smoothly time dependent, so that the eigenvalues λ_{k} = λ_{k}(*t*) and L^2-normalised eigenfunctions *u*_{k} = *u*_{k}(*t*) are also time dependent. Assume for the sake of argument that eigenvalues stay simple and all functions depend smoothly on t. We have the eigenfunction equation

- −∇·(*M*∇*u*_{k}) = λ_{k}*u*_{k}

and the Neumann boundary condition

- *n*·*M*∇*u*_{k} = 0 on ∂Ω

and the L^2 normalisation

- ∫_{Ω} *u*_{k}^{2} = 1.

Differentiating, we conclude that

- −∇·(∂_{t}*M*∇*u*_{k}) − ∇·(*M*∇∂_{t}*u*_{k}) = ∂_{t}λ_{k}*u*_{k} + λ_{k}∂_{t}*u*_{k} (10)

and

- *n*·∂_{t}*M*∇*u*_{k} + *n*·*M*∇∂_{t}*u*_{k} = 0 on ∂Ω (11)

and

- ∫_{Ω} *u*_{k}∂_{t}*u*_{k} = 0. (12)

Taking the inner product of (10) with u_k and using (12) yields

- .

Integrating by parts using the Neumann condition and (11) yields

- .

By the eigenfunction equation and (12), the first inner product on the RHS vanishes. By Stokes theorem one has

and thus we have the variation formula

- ∂_{t}λ_{k} = ∫_{Ω} ∂_{t}*M*∇*u*_{k}·∇*u*_{k}. (14)

Next, if we take the inner product of (10) with u_l for some l distinct from k, one has

- .

Integrating by parts as before, we have

- .

By the eigenfunction equation, the first inner product on the RHS is . By Stokes theorem we have

and thus

and thus by eigenfunction expansion (and (12))

- ∂_{t}*u*_{k} = Σ_{l≠k} (λ_{k} − λ_{l})^{ − 1} (∫_{Ω} ∂_{t}*M*∇*u*_{k}·∇*u*_{l}) *u*_{l}

where the convergence is in an unconditional L^2 sense. (Note that the *u*_{l} form an orthonormal system, and so from Bessel's inequality we know that the coefficients in this expansion are square-summable, which is enough decay to justify the L^2 convergence of the RHS.) In fact we may differentiate and conclude that

where the convergence is again in the unconditional L^2 sense (i.e. the previous convergence was in the unconditional H^1 sense). From the Bessel inequality we see in particular that

- ;

in particular, we have

and thus

which can be viewed as infinitesimal variants of the estimates in the previous section.
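The eigenfunction expansion of the derivative has a finite-dimensional analogue that can be tested directly: for a family *L*(*t*) = *L*_{0} + *t*L̇ of symmetric matrices with simple spectrum, the derivative of the k-th unit eigenvector is Σ_{l≠k} ⟨L̇*u*_{k},*u*_{l}⟩(λ_{k} − λ_{l})^{ − 1}*u*_{l}. A hedged sketch (the matrices are illustrative, not from the text):

```python
import numpy as np

# Finite-dimensional analogue of the eigenfunction-derivative expansion:
# for L(t) = L0 + t*Ldot symmetric with simple spectrum, the derivative of
# the k-th unit eigenvector u_k is
#   sum over l != k of <Ldot u_k, u_l> / (lambda_k - lambda_l) * u_l.
rng = np.random.default_rng(1)
n, k = 6, 2
A = rng.standard_normal((n, n)); L0 = (A + A.T) / 2
C = rng.standard_normal((n, n)); Ldot = (C + C.T) / 2

def eigvec(t, k, ref=None):
    w, v = np.linalg.eigh(L0 + t * Ldot)
    u = v[:, k]
    # fix the sign ambiguity of the eigenvector against a reference vector
    if ref is not None and u @ ref < 0:
        u = -u
    return w, u

w0, uk = eigvec(0.0, k)
_, V = np.linalg.eigh(L0)
# predicted derivative from the first-order perturbation expansion
pred = sum((V[:, l] @ Ldot @ uk) / (w0[k] - w0[l]) * V[:, l]
           for l in range(n) if l != k)
# centred finite difference along the eigenvector path
h = 1e-6
numeric = (eigvec(h, k, uk)[1] - eigvec(-h, k, uk)[1]) / (2 * h)
print(np.max(np.abs(pred - numeric)))
```

The printed discrepancy should be negligible compared to the size of the derivative.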

## A Sobolev inequality

**Lemma** Suppose that *u* obeys the inhomogeneous eigenfunction equation − Δ*u* = λ*u* + *F*. Suppose also that η is a smooth function that equals 1 at 0 and vanishes on the boundary of a ball centred at 0. Then

**Proof** We introduce the averaged functions

and observe that each of these averages obeys the inhomogeneous Bessel equation

with initial condition .

We can solve this equation by separation of variables, writing

- .

Using the standard Wronskian identity

one soon arrives at the equations

and

- .

Meanwhile, by integration by parts

and thus

From Cauchy-Schwarz one has

and

and similarly

The claim then follows from further applications of the Cauchy-Schwarz inequality. □

In principle, the above inequality, when combined with the previous L^2 and H^1 estimates, gives L^infty control on the eigenfunction perturbation in the interior of the triangle. One should also be able to get control near the interior of an edge by reflection. One needs to modify the argument a bit, though, to handle what is going on at vertices.

Here is an alternate approach that avoids the use of Bessel functions of the first and second kinds. Suppose one has a solution to the inhomogeneous eigenfunction equation − Δ*u* = λ*u* + *F* on a ball B(0,R). Let η be a smooth (or at least C^2) cutoff on this ball which equals 1 at the origin. Then *u*η is compactly supported with Laplacian

- Δ(*u*η) = −(λ*u* + *F*)η + 2∇*u*·∇η + *u*Δη

and hence by the fundamental solution to the Laplacian

- *u*(0) = (1/2π) ∫ log | *x* | Δ(*u*η)(*x*) *dx*

and hence by Cauchy-Schwarz

Because Δ(η*u*) has mean zero, one can also replace log | *x* | here by log | *x* | − *C* for any constant C, for instance one can use log( | *x* | / *R*).
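To spell out this last remark: since Δ(η*u*) is compactly supported, it integrates to zero, so the logarithmic kernel may be shifted by any constant; in LaTeX form:

```latex
% \int \Delta(\eta u)\,dx = 0 for compactly supported \eta u, hence
\frac{1}{2\pi}\int \bigl(\log|x| - C\bigr)\,\Delta(\eta u)(x)\,dx
  = \frac{1}{2\pi}\int \log|x|\,\Delta(\eta u)(x)\,dx
  - \frac{C}{2\pi}\underbrace{\int \Delta(\eta u)\,dx}_{=0},
% so the choice C = \log R replaces \log|x| by \log(|x|/R).
```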

## Sobolev inequalities in logarithmic coordinates

The computations appear to become cleaner if we work in logarithmic coordinates

- (*s*,θ): = (log *r*,θ).

Thus, for instance, a sector in polar coordinates becomes a half-infinite strip in logarithmic coordinates. A triangle Ω which contains this sector as one of its corners then becomes a slightly enlarged version logΩ of this half-infinite strip, with a concave boundary connecting the two half-infinite edges.

Consider an eigenfunction − Δ*u* = λ*u* in the original domain Ω. In polar coordinates, the eigenfunction equation is

- −(*u*_{rr} + *u*_{r}/*r* + *u*_{θθ}/*r*^{2}) = λ*u*,

the Rayleigh quotient is

- ∫ (*u*_{r}^{2} + *u*_{θ}^{2}/*r*^{2}) *r* *dr* *d*θ / ∫ *u*^{2} *r* *dr* *d*θ,

and the Neumann boundary conditions on the two radial edges are

- *u*_{θ}(*r*,0) = *u*_{θ}(*r*,α) = 0.

If we convert to logarithmic coordinates by the change of variables

*v*(*s*,θ): =*u*(*e*^{s},θ)

then one can compute that the eigenfunction equation now becomes

- −(*v*_{ss} + *v*_{θθ}) = λ*e*^{2s}*v*, (A.1)

the Rayleigh quotient becomes

- ∫ (*v*_{s}^{2} + *v*_{θ}^{2}) *ds* *d*θ / ∫ *e*^{2s}*v*^{2} *ds* *d*θ,

and the Neumann boundary conditions on the two half-infinite edges are

- *v*_{θ}(*s*,0) = *v*_{θ}(*s*,α) = 0. (A.2)
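The computation behind the logarithmic-coordinate equation is just the polar Laplacian under the substitution r = e^{s}; a sketch:

```latex
% With v(s,\theta) = u(e^s,\theta) and r = e^s, one has
% \partial_r = e^{-s}\partial_s, hence
\Delta u = u_{rr} + \frac{1}{r} u_r + \frac{1}{r^2} u_{\theta\theta}
         = e^{-2s}\bigl( v_{ss} + v_{\theta\theta} \bigr),
% so -\Delta u = \lambda u becomes
-(v_{ss} + v_{\theta\theta}) = \lambda e^{2s} v .
```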

We can reflect logΩ infinitely often across the half-infinite edges, extending v to a half-infinite domain that contains the half-space . Thanks to the Neumann condition (A.2), v will remain smooth and obey the eigenfunction equation (A.1) in the entirety of this half-space.

Let's suppose that we have normalised our original eigenfunction u to have an L^2 norm of 1; then after transforming into logarithmic coordinates we have

- ∫_{logΩ} *e*^{2s} | *v* |^{2} *ds* *d*θ = 1 (A.3)

and hence by the Rayleigh quotient we have H^1 control:

- ∫_{logΩ} (| *v*_{s} |^{2} + | *v*_{θ} |^{2}) *ds* *d*θ = λ. (A.4)

Now we can start moving towards L^infty control of v. We first look at the mean

- *v̄*(*s*): = (1/α) ∫_{0}^{α} *v*(*s*,θ) *d*θ.

From the eigenfunction equation and the Neumann boundary condition, the mean obeys the equation

- ∂_{ss}*v̄*(*s*) = −λ*e*^{2s}*v̄*(*s*). (A.5)

On the other hand, the L^2 normalisation of v and Cauchy-Schwarz gives

- ∫ *e*^{2s} | *v̄*(*s*) |^{2} *ds* ≤ 1/α (A.6)

and the H^1 bounds similarly gives

- ∫ | *v̄*′(*s*) |^{2} *ds* ≤ λ/α. (A.7)

Among other things, (A.7) tells us that *v̄*′(*s*) must go to zero along some subsequence as *s* → −∞. (Indeed, a Bessel expansion of the original eigenfunction shows that it goes to zero uniformly, but we won't need this fact here.) Integrating (A.5) and using (A.6) and Cauchy-Schwarz then gives us the bound

- (A.8)

and hence on a second integration

- . (A.9)

From the triangle inequality, we conclude that

- (A.10)

and hence by (A.6)

- (A.11)

(which already bounds the original eigenfunction at the vertex) and hence by (A.9)

- . (A.12)

This gives L^infty bounds on the mean *v̄*. Now we move on to bounding v itself. We have the Sobolev inequality

- sup_{θ} | *v*(*s*,θ) − *v̄*(*s*) | ≤ α^{1/2} *E*(*s*)^{1/2} (A.13)

so it will suffice to get pointwise bounds on the angular energy

- *E*(*s*): = ∫_{0}^{α} | ∂_{θ}*v*(*s*,θ) |^{2} *d*θ (A.14)

since we then have

- | *v*(*s*,θ) | ≤ | *v̄*(*s*) | + α^{1/2} *E*(*s*)^{1/2} (A.15)

from the triangle inequality.
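The passage from the mean to pointwise values here is just the fundamental theorem of calculus plus Cauchy-Schwarz in the angular variable; a sketch, assuming the angular energy is E(s) = ∫_0^α |v_θ|² dθ:

```latex
% For any \theta, \theta' \in [0,\alpha]:
|v(s,\theta) - v(s,\theta')|
  \le \int_0^\alpha |v_\theta(s,\phi)|\, d\phi
  \le \alpha^{1/2} \Bigl( \int_0^\alpha |v_\theta(s,\phi)|^2\, d\phi \Bigr)^{1/2}
  = \alpha^{1/2}\, E(s)^{1/2};
% averaging in \theta' then gives
|v(s,\theta) - \bar v(s)| \le \alpha^{1/2}\, E(s)^{1/2}.
```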

Note that (A.4) gives an integrated bound

- ∫ *E*(*s*) *ds* ≤ λ. (A.16)

On the other hand, differentiating (A.1) in θ gives

and hence by integration by parts

for any test function η. Bounding

we conclude that

- .

In particular, if η is a function just of s and is supported in , we conclude that

- . (A.17)

Combining this with (A.16) for various choices of cutoff η gives integral control on the derivative of *E*(*s*), and this combined with (A.16) gives pointwise control on E(s) and hence on v(s). This gives L^infty control on the eigenfunction u near the vertex. (The computations are somewhat ugly, though, particularly if one wants to optimise in η.)

## H^2 theory

Formally, if on a triangle Ω, a function v obeys the inhomogeneous Laplace equation

- −Δ*v* = *F* (B.1)

with homogeneous Neumann boundary data ∂*v*/∂*n* = 0 on ∂Ω, then after two integrations by parts we have the H^2 bound

- ∫_{Ω} | ∇^{2}*v* |^{2} ≤ ∫_{Ω} | Δ*v* |^{2} = ∫_{Ω} | *F* |^{2}

(Bochner-Weitzenbock identity.) The integrations by parts can be justified if v is smooth on Ω, with first derivative uniformly bounded and second derivative blowing up by at most *O*(*r*^{ − 1 + ε}) at each vertex for some ε > 0, by the usual trick of inserting a cutoff function to smoothly cut out the integral at the vertices, performing the integration by parts, and then sending the cutoff parameter to zero. I think this sort of regularity for acute-angled triangles can be achieved in practice without difficulty.
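A sketch of the two integrations by parts in flat 2D coordinates, ignoring the boundary terms whose treatment is discussed above:

```latex
% Expand the square of the Laplacian:
\int_\Omega |\Delta v|^2
  = \int_\Omega v_{xx}^2 + 2\, v_{xx} v_{yy} + v_{yy}^2 ;
% moving one derivative from each factor in the cross term,
\int_\Omega v_{xx} v_{yy} = \int_\Omega v_{xy}^2 + \text{boundary terms},
% so, when the boundary terms vanish,
\int_\Omega |\Delta v|^2 = \int_\Omega v_{xx}^2 + 2 v_{xy}^2 + v_{yy}^2
  = \int_\Omega |\nabla^2 v|^2 .
```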

Now suppose that one is solving the inhomogeneous Laplace equation (B.1) but with inhomogeneous Neumann data

- ∂*v*/∂*n* = *g* on ∂Ω. (B.2)

Then the integration by parts becomes much less favorable, as there are now a number of nasty boundary terms to take care of. But suppose that one can find a reasonably smooth function w (of the same regularity needed for v, i.e. first derivative uniformly bounded and second derivative blowing up slower than 1/r) which solves (B.2) but not (B.1), i.e. which attains the prescribed Neumann data on the boundary. Then the difference v − w solves (B.1), with homogeneous Neumann data and forcing term *F* + Δ*w*, and thus

From the pointwise estimate and the triangle inequality we conclude that

- ∫_{Ω} | ∇^{2}*v* |^{2} ≲ ∫_{Ω} | *F* |^{2} + ∫_{Ω} | ∇^{2}*w* |^{2} (B.3)

We can apply this to the first variation of the second eigenfunction at M=I. From the L^2 and H^1 stability theory one has

with boundary condition

- *n*·∇∂_{t}*u*_{2} = −*n*·∂_{t}*M*∇*u*_{2} on ∂Ω (B.4)

and thus we have the H^2 bound

- (B.5)

where w is any function with boundary value (B.4). From the L^2 stability theory one has

and so one has bounds on everything on the RHS of (B.5) except for the last term.

To control this term, we divide the triangle Ω = *ABC* into three overlapping regions Ω_{A},Ω_{B},Ω_{C}, each one being a region around one of the three vertices A,B,C that avoids the other two vertices, as well as the edge joining those vertices. In each of these regions, we find a function w that solves the boundary condition (B.4) in that region, which obeys good H^2 bounds; combining these functions together by a partition of unity will then give a global w in *H*^{2}(Ω) that obeys (B.4) on the entire boundary.

Let's look at the region Ω_{A} near the vertex A, which we take to be the origin. In polar coordinates (*r*,θ), we know that ∂*u*_{2}/∂*n* = 0 on the edges AB, AC, and so the boundary condition (B.4) becomes

on AB and

on AC for some explicit scalar constants *c*_{AB},*c*_{AC} depending only on geometry of the triangle ABC and on . On Ω_{A}, we can build a function w_A that exhibits this data by the formula

*w*_{A}(*x*): =*u*(Φ(*x*))

where Φ is a smooth function that is the identity on the edges AB, AC inside Ω_{A}, and whose Hessian maps the outward normal n to a vector with radial component *c*_{AB} on AB and *c*_{AC} on AC. It is not difficult to see that one can choose such a function on Ω_{A} so that and (one can choose a function of the form for a suitable function f). The function w_A defined as above will solve the Neumann conditions, and from the chain rule one has

and so

A similar estimate (with some additional lower order terms) obtains when one glues together the three partial solutions w_A, w_B, w_C to create a solution w to the boundary problem with

To finish the H^2 estimate, one therefore needs bounds on the weighted integral appearing above near A (and similarly for the other two vertices of the triangle). I think such bounds can be obtained as follows. Let's work in Ω_{A} with some suitable smooth cutoff and ignore terms coming from the derivatives of the cutoff (which don't involve the singularity 1/|x| and so are easy to estimate). If we subtract off the angular mean of u, then u now has mean zero in angular directions. From integration by parts we have

But if A has angle α, then the Poincare inequality in the angular direction gives (for u with mean zero in θ)

- ∫_{0}^{α} | ∂_{θ}*u*(*r*,θ) |^{2} *d*θ ≥ (π/α)^{2} ∫_{0}^{α} | *u*(*r*,θ) |^{2} *d*θ

and so as α < π, we can bound the weighted integral as required.
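The angular Poincaré inequality used here can be seen from the Neumann eigenbasis on [0,α]; a sketch for mean-zero f:

```latex
% Expand a mean-zero f on [0,\alpha] in the Neumann basis:
f(\theta) = \sum_{k \ge 1} a_k \cos(k\pi\theta/\alpha)
\quad\Longrightarrow\quad
\int_0^\alpha |f'|^2 = \sum_{k\ge 1} \Bigl(\frac{k\pi}{\alpha}\Bigr)^2 a_k^2 \cdot \frac{\alpha}{2}
 \ \ge\ \Bigl(\frac{\pi}{\alpha}\Bigr)^2 \int_0^\alpha |f|^2 .
```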

## An alternate approach to the H^2 theory

Yet another approach to the H^2 theory, which seems to give better results, comes from using Schwarz-Christoffel transformations, with the upper half-plane H being the reference domain.

More precisely, given three angles α + β + γ = π for a triangle, we can find a Schwarz-Christoffel transformation Φ_{α,β} from the half-plane *H* = {*z* : Im(*z*) > 0} to a triangle Ω_{α,β} with angles α,β,γ by solving the differential equation

- Φ′_{α,β}(*z*) = *z*^{α/π − 1}(1 − *z*)^{β/π − 1}

with initial condition Φ_{α,β}(0) = 0, keeping Φ_{α,β} conformal. In particular, Φ_{α,β}(*z*) behaves like *A* + *O*(*z*^{α/π}) near the origin, *B* + *O*((*z* − 1)^{β/π}) near 1, and *C* + *O*(*z*^{ − γ/π}) at infinity, for some complex numbers A,B,C with A=0. The image Ω_{α,β} is a triangle ABC with angles α,β,γ.
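With this normalisation the vertices can be written in terms of Beta functions: B = Beta(α/π, β/π) and C = B + e^{i(π − β)} Beta(β/π, γ/π). These Beta-function expressions are our own reconstruction, not from the text, but one can check numerically that the angle at A then comes out to α:

```python
import numpy as np
from scipy.special import beta

# Vertices of the Schwarz-Christoffel image triangle for
# Phi'(z) = z^{alpha/pi - 1} (1 - z)^{beta/pi - 1}, Phi(0) = 0.
# The Beta-function formulas below are a reconstruction for illustration.
alpha, bet = np.pi / 2, np.pi / 3
gam = np.pi - alpha - bet
A = 0.0
B = beta(alpha / np.pi, bet / np.pi)                                  # image of z = 1
C = B + np.exp(1j * (np.pi - bet)) * beta(bet / np.pi, gam / np.pi)   # image of z = infinity
angle_at_A = np.angle(C - A)   # edge AB lies on the positive real axis
print(angle_at_A, alpha)
```

The printed angle at A agrees with the prescribed α.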

Write *w* = Φ_{α,β}(*z*), then

The Lebesgue measure dm_w on the triangle is related to the Lebesgue measure dm_z on the half-plane by the change of variables formula

- *dm*_{w} = | Φ′_{α,β}(*z*) |^{2} *dm*_{z}.

We write the expression | Φ′_{α,β}(*z*) |^{2} as *e*^{2ω}, where

- ω = (α/π − 1) log | *z* | + (β/π − 1) log | 1 − *z* |.

The Rayleigh quotient, pulled back to the reference half-plane H, is then

- ∫_{H} | ∇*u* |^{2} *dm*_{z} / ∫_{H} | *u* |^{2} *e*^{2ω} *dm*_{z}

and so the second eigenfunction u_2 in the reference half-plane will obey the equation

- −Δ*u*_{2} = λ_{2}*e*^{2ω}*u*_{2} (C.1)

with Neumann boundary condition

- ∂*u*_{2}/∂*n* = 0 (C.2)

on the boundary of the half-plane and the mean zero condition

- ∫_{H} *u*_{2}*e*^{2ω} *dm*_{z} = 0. (C.3)

We can normalise u_2 in L^2 norm,

- ∫_{H} | *u*_{2} |^{2} *e*^{2ω} *dm*_{z} = 1 (C.4)

which by the Rayleigh quotient immediately gives an H^1 bound

- ∫_{H} | ∇*u*_{2} |^{2} *dm*_{z} = λ_{2}. (C.5)

Now suppose that we vary α,β smoothly with respect to a time parameter; this means that ω will also vary by the formula

- ∂_{t}ω = (∂_{t}α/π) log | *z* | + (∂_{t}β/π) log | 1 − *z* |. (C.6)

The eigenvalue λ_{2} can also vary, at a rate ∂_{t}λ_{2} which can be controlled by the previous perturbation theory (e.g. equation (14)). As for the variation ∂_{t}*u*_{2} of the eigenfunction, we can compute this by differentiating (C.1) to obtain

- −Δ∂_{t}*u*_{2} = ∂_{t}λ_{2}*e*^{2ω}*u*_{2} + 2λ_{2}∂_{t}ω*e*^{2ω}*u*_{2} + λ_{2}*e*^{2ω}∂_{t}*u*_{2}. (C.7)

Also, on differentiating (C.2) we see that ∂_{t}*u*_{2} conveniently satisfies a homogeneous Neumann condition:

- ∂(∂_{t}*u*_{2})/∂*n* = 0 (C.8)

Using the orthonormal basis *u*_{k}*e*^{ω} of *L*^{2}(*H*,*dm*_{z}) as in the previous notes on L^2 and H^1 stability, we can obtain an explicit formula for ∂_{t}*u*_{2}. Indeed, integrating (C.7) against another eigenfunction u_l and integrating by parts using (C.1), (C.8) gives

and thus

and hence by the eigenfunction expansion

which among other things gives the L^2 identity

and hence by the Bessel inequality

- (C.9)

where X is the quantity

- *X*: = ∫_{H} | ∂_{t}ω |^{2} | *u*_{2} |^{2} *e*^{2ω} *dm*_{z}. (C.10)

The quantity *X* cannot quite be bounded directly by (C.4) because of the logarithmic divergence in the weight ∂_{t}ω, but it should be controllable by also employing (C.5) and (C.3), perhaps after transferring back to the triangle ABC.

The quantity X also controls ∂_{t}λ_{2}. Indeed, on integrating (C.7) against u_2 and using (C.1), (C.2), (C.4) we obtain that

and hence by Cauchy-Schwarz and (C.4)

- (C.11)

If we now multiply (C.7) by *e*^{ − ω} and then take L^2 norms using the triangle inequality and the previous estimates (C.9), (C.10), (C.11) we conclude that

which simplifies to

We can transfer this back to the triangle Ω_{α,β} = *A**B**C* by the change of variables

giving

which after two integrations by parts (Bochner-Weitzenbock inequality) gives an H^2 bound on v:

By Sobolev embedding, (noting also that we have L^2 and H^1 control on v) we can thus obtain L^infty control on v (or equivalently, u) in terms of the quantity X. So the main task is to understand how to bound X, which in the ABC coordinates is basically the L^2 norm of v weighted by a reasonably explicit logarithmic weight.

This is done more explicitly at this set of notes.

The Bessel function *J*_{ν}(*x*) for ν > −1 has the Weierstrass factorisation

- *J*_{ν}(*x*) = ((*x*/2)^{ν}/Γ(ν + 1)) Π_{i=1}^{∞} (1 − *x*^{2}/*j*_{ν,i}^{2})

where *j*_{ν,i} are the positive zeroes of *J*_{ν}; see e.g. this link.
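As a numerical sanity check of the factorisation in the special case ν = 0 (where the prefactor is 1), one can truncate the product at the first 200 zeros:

```python
import numpy as np
from scipy.special import j0, jn_zeros

# Check the Weierstrass factorisation of J_0 numerically:
# J_0(x) = prod_i (1 - x^2 / j_{0,i}^2), truncated to 200 factors.
x = 1.0
zeros = jn_zeros(0, 200)          # first 200 positive zeros of J_0
product = np.prod(1 - x**2 / zeros**2)
print(product, j0(x))             # truncated product vs. J_0(x)
```

The truncated product agrees with J_0(x) up to a small truncation error coming from the omitted tail factors.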

In particular, we have

It is known that *j*_{ν,i} are non-decreasing in ν (page 508 of G.N. Watson, *A Treatise on the Theory of Bessel Functions*, 2nd ed., Cambridge University Press, 1944; one can also use here the Sturm comparison theorem). Thus if and , then is non-increasing in ν, and thus

- . (1)

Now, consider a solution to the eigenfunction equation − Δ*u* = λ*u* in the sector {(*r*,θ) : 0 ≤ *r* ≤ *R*, 0 ≤ θ ≤ α} for some acute 0 < α < π / 2 and some radius *R* > 0, obeying the Neumann conditions ∂_{θ}*u* = 0 for θ = 0,α. For technical reasons we will also need the upper bound

- (2)

which seems to hold in practice.

By separation of variables, we can write u in polar coordinates as

- *u*(*r*,θ) = Σ_{k=0}^{∞} *c*_{k} *J*_{kπ/α}(√λ*r*) cos(kπθ/α)

for some coefficients *c*_{k}. From (2) and the triangle inequality we thus have the bound

for any . To bound the coefficients, we assume an L^2 normalisation

The left-hand side may be expanded as

and thus

where A is the quantity

- (3)

Thus by the Cauchy-Schwarz inequality, we have

- (4)

where B(r_0) is the quantity

We may upper bound B(r_0) as follows. We may reduce the range of integration from [0,R] to [*r*_{1},*R*] for some intermediate radius *r*_{1}. By (1) and (2), the integrand may then be bounded termwise. Thus

One may then sum the geometric series and conclude that

- (5)

for any , where

One can use (5) to exclude extrema for small values of r_0. For instance, (5) shows that | *u*(*r*_{0},θ) | cannot exceed | *u*(0) | unless

The left hand side behaves like *r*_{0}^{2}, and the right-hand side like *r*_{0}^{π/α}, so for acute α this inequality becomes impossible for small enough *r*_{0} (and "small enough" can be made numerically explicit if one is willing to compute with Bessel functions). Things become bad when α is close to a right angle, but it seems in those cases that the extremum is obtained elsewhere.