Mathematics Department and Graduate School Colloquium
Archive
2001-2002
- Tuesday May 21, 2002
Joan Verdera (Universitat Autònoma de Barcelona)
Analytic capacity and Calderón-Zygmund Theory
- Abstract:
More than one hundred years ago Painlevé raised the problem of
describing in geometric (or metric) terms the removable sets for
bounded analytic functions. The problem is still unsolved, but
important progress has been made recently using real-variable
methods in geometric contexts. In particular, the Calderón-Zygmund
theory of the Cauchy kernel has played a fundamental role in recent
developments. The purpose of the talk is to present the problem,
discuss the early contributions of Denjoy, Garabedian, and Calderón,
and give a glimpse of some of the main ideas behind the more recent
contributions.
- Tuesday May 14, 2002
Brendan Hassett (Rice University)
Density of rational points on K3 surfaces
- Abstract:
Let X be an algebraic variety defined by equations over a number
field. Perhaps the most fundamental question of arithmetic geometry
is the Density Problem: are rational points dense in X, or are they
confined to a proper subvariety? Faltings' proof of the Mordell
conjecture gives nondensity for curves of genus at least two, and
Lang and Bombieri have conjectured that similar results hold
whenever the canonical class is positive.
We consider the case of K3 surfaces, where the
canonical class is zero and the conjectural
picture is much murkier. Nevertheless, there are density
theorems by Bogomolov, Harris, and Tschinkel for
special classes of K3 surfaces, and weak
density results for general K3 surfaces due to
Tschinkel and the speaker.
- Tuesday April 30, 2002
DIFFERENT LOCATION: EE1 031
Maciej Zworski (UC Berkeley)
Trace formulae, zeta functions and the classical/quantum correspondence
- Abstract:
Since the work of Selberg in the 1950s, trace formulae have become
one of the most elegant ways of describing the classical/quantum
correspondence: one side of the formula is given in terms of classical
closed orbits and the other side in terms of spectral or scattering
information. In symmetric situations, the trace formulae are exact,
but in more general semi-classical (Gutzwiller) or geometric
(Duistermaat-Guillemin) situations they are only asymptotic.
The developments of computing power and of microlocal techniques
have led to new progress in the study of trace formulae both in
physics and mathematics. Sophisticated tools from dynamics,
such as the dynamical zeta functions and Birkhoff normal forms,
have played a significant role in this. In my talk I would like
to describe these concepts and the way in which they help in our
understanding of the classical/quantum correspondence.
- Friday April 26, 2002
Sara Billey (MIT)
A root system description of pattern avoidance with applications to
Schubert varieties
- Abstract:
A permutation "avoids a pattern" if the corresponding matrix contains
no submatrix specified by the pattern. The notion of pattern
avoidance dates back to the stack-sortable permutations due to the
combined work of Knuth and Tarjan. Pattern avoidance has been used to
classify several notions for permutations and signed permutations,
particularly pertaining to Bruhat order. For example, Lakshmibai and
Sandhya have shown that smooth Schubert varieties are characterized by
permutations avoiding 4231 and 3412. In this talk, we propose a
new generalization of pattern avoidance which can be applied to all
root systems and their Weyl groups. The main theorem shows that for
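As a concrete illustration (ours, not from the talk), pattern containment can be checked by brute force: a permutation contains a pattern exactly when it has a subsequence in the same relative order as the pattern, and the Lakshmibai-Sandhya smoothness criterion quoted above then becomes a two-pattern test. The function names are our own.

```python
from itertools import combinations

def avoids(perm, pattern):
    """Return True if perm (one-line notation, e.g. (4,2,3,1)) contains no
    subsequence whose entries are in the same relative order as pattern."""
    k = len(pattern)
    # argsort of the pattern encodes its relative order
    order = sorted(range(k), key=lambda i: pattern[i])
    for idx in combinations(range(len(perm)), k):
        sub = [perm[i] for i in idx]
        if sorted(range(k), key=lambda i: sub[i]) == order:
            return False
    return True

def is_smooth(perm):
    """Lakshmibai-Sandhya criterion: the Schubert variety of w is smooth
    iff w avoids both 4231 and 3412."""
    return avoids(perm, (4, 2, 3, 1)) and avoids(perm, (3, 4, 1, 2))
```

For example, `is_smooth((2, 1, 4, 3))` is True, while `is_smooth((4, 2, 3, 1))` is False, since that permutation is itself an occurrence of 4231.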
- Tuesday April 23, 2002
Louis J. Billera (Cornell University)
Geometry of the Space of Phylogenetic Trees
- Abstract:
We consider a continuous space that models the set of all
phylogenetic trees having a fixed set of leaves. This
space has a natural metric of nonpositive curvature (i.e.,
it is CAT(0) in the sense of Gromov), giving a way of
measuring distance between phylogenetic trees and
providing some procedures for averaging or otherwise
doing statistical analyses on sets of trees on a common set
of species.
This geometric model of tree space provides a setting in
which questions that have been posed by biologists and
statisticians over the last decade can be approached in a
systematic fashion. For example, it provides a justification
for disregarding portions of a collection of trees that
agree, thus simplifying the space in which comparisons are
to be made.
This is joint work with Susan Holmes and Karen Vogtmann:
http://www.math.cornell.edu/~vogtmann/Trees/lap.pdf
- Tuesday April 16, 2002
Weian Zheng (UC Irvine)
Rate of Convergence in Homogenization of Parabolic PDEs
- Abstract:
We consider the solutions to
${\partial \over \partial t}u^{(n)}=a^{(n)}(x)\Delta u^{(n)}$
where $\{a^{(n)}(x)\}_{n=1,2,...}$ are random fields
satisfying a ``well-mixing'' condition
(which is different from the usual ``strong mixing'' condition).
We estimate in this paper the rate of
convergence of $u^{(n)}$ to the solution of a Laplace equation.
Since our equation is of a simple form, we obtain a quite strong result
which covers the previous homogenization results obtained
for this equation.
- Tuesday March 12, 2002
Special Location: Communications 120
Terry Rockafellar (UW)
Convex Analysis and Duality in Variational Problems
- Abstract:
The calculus of variations is the oldest branch of mathematics
centered on problems that we now recognize as part of the larger
subject of optimization. Convexity has long had a basic role in
it, especially in defining the Hamiltonian function by means of the
Legendre transformation so as to obtain the canonical first-order
differential equations that equivalently express the classical
Euler-Lagrange condition for optimality.
The Legendre transformation, as a means of dualizing convex functions,
was hugely extended by Fenchel around 1950 in work inspired by the
surge in interest in optimization that arose with the advent of
computers. It was aimed at generalizing patterns such as linear
programming duality, however, and its potential in the calculus of
variations initially went unrecognized.
This talk will describe how the Legendre-Fenchel transformation and
related developments in convex analysis lead to a much broader version
of the calculus of variations. As in linear programming duality,
problems appear in pairs that are inextricably tied together in
optimality. The Euler-Lagrange equations and Hamiltonian equations
achieve a generalized formulation in terms of so-called subgradients.
- Tuesday February 19, 2002
Special Location:
PAB A110
Mina Aganagic (Harvard University)
Geometric Physics
- Abstract:
In string theory, there is a deep interplay between physics and
geometry. This is not without precedent. For example, the
relationship between classical gravity and Riemannian geometry is
central to General Relativity. String theory, however, unifies
gravity, quantum mechanics, and gauge theory, so the interaction is
particularly rich and profound. My aim in this talk is to illustrate
this.
Mina Aganagic is a candidate for an assistant professor position
in physics.
- Tuesday February 12, 2002
Kai Behrend
(University of British Columbia)
An invitation to stacks.
- Abstract:
The idea of stacks goes back to Grothendieck in the 1960s. For a long time
there was little interest in the topic, but in recent years stacks have
become more and more indispensable as a tool in many areas of mathematics.
One gets stacks if one tries to endow certain types of categories with
geometric structure.
This will be an elementary introduction to some of the ideas underlying the
theory of stacks. We will try to give the audience some idea of what a stack
is and why stacks are very basic mathematical objects, even more basic than
spaces or groups.
- Friday, 4:00pm February 8, 2002,
Thomson Hall 234
Michael Thaddeus (Columbia University)
Mirror symmetry and Langlands duality.
- Abstract:
Strominger, Yau, and Zaslow have proposed that a Calabi-Yau orbifold and
its mirror should fiber over the same real orbifold, with special
Lagrangian fibers which are tori dual to each other. We present some
compelling evidence for this conjecture: a set of examples of hyperkähler
orbifolds where the mirror can be explicitly constructed, and the equality
of Hodge numbers predicted by mirror symmetry can be completely verified.
The examples arise as moduli spaces of flat connections on a 4-torus with
compact structure group; the mirror is the corresponding space with the
Langlands dual structure group.
- Tuesday January 29, 2002
Special Location: Thomson Hall 135
Oded Schramm (Microsoft Research)
Conformally invariant scaling limits: Brownian motion, percolation, and
loop-erased random walk.
- Abstract:
Many interesting random models in two dimensions have been
conjectured to converge to conformally invariant processes under
scaling. Some of these conjectures are now established. The talk
will survey significant recent advances in this area by several
authors. We will define and describe a random path SLE(k),
depending on one real parameter k>0. It has been proven that
SLE(6) describes the interface between critical percolation
clusters and has the same outer boundary as that of
two-dimensional Brownian motion. SLE(2) is the scaling limit of
the loop-erased random walk, and SLE(8) is the scaling limit of
the uniform spanning tree Peano path. There are various further
conjectures regarding processes converging to SLE with other
parameters k. In particular, the intractable self-avoiding walk
is conjectured to converge to SLE(8/3).
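As an illustration (ours, not part of the abstract), the loop-erased random walk mentioned above is easy to simulate: run an ordinary random walk on Z^2 and erase each loop, in chronological order, as soon as it closes.

```python
import random

def loop_erased_walk(steps, seed=0):
    """Simple random walk on Z^2 for `steps` steps, with each loop erased
    in chronological order as soon as the walk closes it."""
    random.seed(seed)
    path = [(0, 0)]
    index = {(0, 0): 0}   # position -> its index in the current path
    for _ in range(steps):
        x, y = path[-1]
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        p = (x + dx, y + dy)
        if p in index:
            # the walk revisited p: erase the loop it just closed
            cut = index[p] + 1
            for q in path[cut:]:
                del index[q]
            path = path[:cut]
        else:
            index[p] = len(path)
            path.append(p)
    return path
```

The returned path is always self-avoiding, with consecutive points adjacent on the lattice; its scaling limit is the SLE(2) curve of the talk.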
- Tuesday January 22, 2002
Francisco Santos (UC Davis)
A non-connected toric Hilbert scheme
- Abstract:
The toric Hilbert scheme associated with a non-negative integer
$n\times d$ matrix $A$ is the parameter space of all $A$-homogeneous
ideals $I \subset K[x_1,\dots,x_n]$ with linear codimension one in all the
possible degrees $b\in A{\mathbb N}^n \subseteq {\mathbb N}^d$. If
$A=[1,1,\dots,1]$ this agrees with the usual Hilbert scheme of homogeneous
ideals whose quotient algebra has Hilbert series equal to 1 in all
degrees.
Results of Maclagan and Thomas relate connectivity of the toric Hilbert
scheme to connectivity of the graph of triangulations of the vector
configuration given by the columns of $A$. For example, a disconnected
graph of triangulations containing unimodular triangulations in at least
two different components implies that the toric Hilbert scheme is
disconnected.
Using this property, we present the first known example of a
non-connected toric Hilbert scheme. The example has $n=48$ and $d=6$ and is
actually the product of the vertex set of a 24-cell and a segment,
suitably homogenized.
- Tuesday January 15, 2002
András Stipsicz
(Princeton University)
Topological properties of Stein domains
- Abstract:
After introducing the geography problem for various classes
of 4-manifolds (e.g. complex, symplectic, irreducible), we discuss
a symplectic "surgery" operation. This leads us to reconsider
the geography problem for 4-manifolds carrying some extra
structure (both in the interior and on the boundary). Special
techniques (including Legendrian surgery and Seiberg-Witten
theory) for studying these "Stein domains" will be described,
and we conclude with a list of results concerning topological
properties of these 4-manifolds.
- Monday December 3, 2001
Location: Smith 211
John Dunagan (MIT)
Perturbations to Linear Programs
- Abstract:
Consider an arbitrary linear program Ax < b with m constraints in d
dimensions. Let each element of the constraint matrix be subject to a
small random Gaussian perturbation of variance sigma^2. We show that a
simple classical algorithm for solving linear programs, the perceptron
algorithm (1954), succeeds in finding a feasible point (if one exists) in
O~(m^2 d^3 /(sigma^2 delta)) iterations with probability at
least 1-delta. This is joint work with Avrim Blum to appear in SODA '02.
We proceed to analyze Renegar's condition number for linear programs
under the same perturbation model. Condition numbers measure the
ill-posedness of a problem; they are ubiquitous in numerical analysis
and scientific computing. Numerous interior point methods have recently
been analyzed using Renegar's condition number. We show that the
condition number is O~(m^2 d^2 /(sigma^2 delta)) with
probability at least 1-delta. This is joint work with Dan Spielman (MIT)
and Shang-Hua Teng (UIUC -> BU) submitted to the SIAM Conference
on Optimization.
Both of these results can be viewed in the smoothed complexity model
introduced by Spielman and Teng. Smoothed analysis interpolates between
worst-case and average-case analysis by measuring the running time on an
arbitrary instance under a random perturbation. A large perturbation yields
traditional average-case analysis; as the perturbation becomes vanishingly
small, we obtain traditional worst-case analysis. In this framework, our
results are that the perceptron algorithm and condition number have
polynomial smoothed complexity with high probability.
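For readers unfamiliar with the perceptron algorithm, here is a minimal sketch of its classical form, applied to the homogeneous feasibility problem (find w with a·w > 0 for every constraint row a, to which Ax < b reduces by adding a coordinate). This illustrates only the base algorithm, not the smoothed analysis of the talk.

```python
def perceptron(rows, max_iters=100000):
    """Perceptron for homogeneous LP feasibility: find w with <a, w> > 0
    for every row a.  Classic update: add a violated row to w."""
    d = len(rows[0])
    w = [0.0] * d
    for _ in range(max_iters):
        violated = [a for a in rows
                    if sum(ai * wi for ai, wi in zip(a, w)) <= 0]
        if not violated:
            return w
        # move w toward the first violated constraint
        w = [wi + ai for wi, ai in zip(w, violated[0])]
    return None   # iteration budget exhausted
```

For well-separated instances the classical analysis bounds the iteration count by 1/margin^2; the talk's result is that a small Gaussian perturbation of the rows makes this margin large with high probability.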
- November 27, 2001
Location: Communications 120
Sven Leyffer (University of Dundee)
How the Grinch solved MPECs: Mathematical Programs with Equilibrium
Constraints
- Abstract:
Equilibrium constraints in the form of complementarity
conditions, and more generally variational inequalities,
often appear as constraints in optimization problems.
Applications of equilibrium constraints are widespread
and fast growing. They cover very diverse areas such as
the design of structures involving friction, elastohydrodynamic
lubrication, taxation models, the modeling
of competition in deregulated electricity markets and
transportation network design.
Over recent years, it has become evident that equilibrium
constraints cannot be solved satisfactorily with standard
techniques for Nonlinear Programming (NLP). Both numerical
and theoretical evidence has been advanced in support of
this view.
This talk is aimed at a general mathematical audience and
starts by introducing and reviewing equilibrium constraints.
We then give some applications which emphasize the usefulness
and elegance of equilibrium constraints as a modeling tool.
Next, the assertion that standard techniques for NLP cannot
be applied to equilibrium constraints is re-examined and some
startling numerical evidence is presented using our own NLP
solver.
The talk concludes by examining the local convergence properties
of certain NLP methods applied to MPECs. It is shown that a simple
constraint relaxation strategy allows a proof of second-order
convergence to be given under reasonable assumptions. A number of
illustrative examples are presented which show that some of the
assumptions are difficult to relax.
- November 20, 2001
Location: Communications 120
Lisa Korf (University of Washington)
Martingale Pricing Measures in Incomplete Markets via Stochastic
Programming Duality in the Dual of L^Infinity.
- Abstract:
The goal of this lecture is to set forth a new framework for analyzing
pricing theory for incomplete markets and contingent claims, using
conjugate duality and optimization theory. Various statements in the
Mathematical Finance literature of the Fundamental Theorem of Asset
Pricing give conditions under which an essentially arbitrage-free market
is equivalent to the existence of an equivalent martingale measure, and a
formula for the fair price of a contingent claim as an expectation with
respect to such a measure. In the setting of incomplete markets, the fair
price is not attainable as such a particular expectation, but rather as a
supremum over an infinite set of equivalent martingale measures. In this
lecture, the problem is considered as a stochastic program and pricing
results derived for quite general discrete time processes. It will be
shown that in its most general form, the martingale pricing measure is
attainable if it is permitted to be finitely additive. This setup also
gives rise to a natural way of analyzing models with risk preferences,
spreads and margin constraints, and other problem variants that cannot be
handled in the classical setting. We'll consider a discrete time,
multi-stage, infinite probability space setting and derive the basic
results of arbitrage pricing in this framework.
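A toy example (ours, not the speaker's) of the price interval that arises in an incomplete market: one period, a riskless asset with zero rate, and a stock taking three possible values. With three states and only two assets the martingale measure is not unique, so a claim's arbitrage-free price is only pinned down to an interval of expectations.

```python
def claim_price_bounds(n=10000):
    """One-period market: riskless rate 0, stock S0 = 1, S1 in {0.5, 1, 2}.
    All martingale measures q (q >= 0, sum q = 1, E_q[S1] = 1) form the
    one-parameter family q = (q1, 1 - 1.5*q1, 0.5*q1), q1 in [0, 2/3].
    We scan this family for the price range of a call with strike 1."""
    s1 = [0.5, 1.0, 2.0]
    payoff = [max(s - 1.0, 0.0) for s in s1]
    lo, hi = float('inf'), float('-inf')
    for i in range(n + 1):
        q1 = (2.0 / 3.0) * i / n
        q = [q1, 1.0 - 1.5 * q1, 0.5 * q1]       # all components >= 0 here
        # sanity check: q really is a martingale measure for the stock
        assert abs(sum(qi * si for qi, si in zip(q, s1)) - 1.0) < 1e-9
        price = sum(qi * pi for qi, pi in zip(q, payoff))
        lo, hi = min(lo, price), max(hi, price)
    return lo, hi
```

Here the fair-price interval is [0, 1/3]: the supremum over martingale measures in the abstract corresponds to the upper endpoint.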
- October 30, 2001
Henry Cohn (Microsoft Research)
Sphere Packing and Harmonic Analysis
- Abstract:
The sphere packing problem asks for the optimal packing density in R^n, i.e.,
what fraction of R^n can be covered by non-overlapping unit balls. This is not
only a natural geometric question, but it is also relevant to error-correcting
codes. In 1972, Delsarte discovered a powerful technique, called linear
programming bounds, for proving upper bounds on packing densities. These bounds
apply to many different sorts of packing problems, and typically yield the best
bounds known. Applying them effectively amounts to a problem in harmonic
analysis. The talk will explain all this and survey work in this area,
particularly recent work of the speaker with Noam Elkies on sphere packing.
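As a small numerical aside (ours, not from the talk): the density achieved by a lattice packing is the volume of a ball of radius half the shortest nonzero lattice vector, divided by the lattice's covolume. The square lattice in R^2 gives π/4 ≈ 0.785, the hexagonal lattice π/(2√3) ≈ 0.907.

```python
import math
from itertools import product

def _det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]
    n, d = len(m), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        if abs(m[p][i]) < 1e-12:
            return 0.0
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def lattice_packing_density(basis):
    """Density of the lattice packing with the given basis rows: one ball
    of radius half the shortest nonzero vector per lattice point.  The
    shortest vector is found by brute force over small integer
    combinations -- fine for the tiny examples here, not a general solver."""
    n = len(basis)
    shortest = min(
        math.sqrt(sum(c * c for c in (sum(k[i] * basis[i][j] for i in range(n))
                                      for j in range(n))))
        for k in product(range(-2, 3), repeat=n) if any(k))
    r = shortest / 2
    ball = math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n
    return ball / abs(_det(basis))
```

The linear programming bounds of the talk give upper bounds against which such constructions can be compared.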
- October 23, 2001
Rodrigo Banuelos (Purdue University)
Generalized Isoperimetric Inequalities
- Abstract:
The rearrangement inequalities for multiple integrals of H.S. Brascamp, E.H.
Lieb, and J.M. Luttinger provide a powerful and elegant method for proving
many of the classical geometric and physical isoperimetric inequalities for
regions in Euclidean space.
These include, amongst others, the classical isoperimetric inequality, the
Rayleigh-Faber-Krahn inequality for the lowest eigenvalue of regions of fixed
volume, isoperimetric inequalities for the trace of the Dirichlet heat kernel,
and the Polya-Szego isoperimetric inequality for electrostatic capacity.
After discussing some of these classical results, we will present new versions
of multiple integral inequalities from which other "generalized" isoperimetric
inequalities for heat kernels of Schrodinger operators follow. Besides being of
independent interest, such isoperimetric inequalities for heat kernels imply
sharp inequalities for the lowest eigenvalue and the spectral gap of the
Dirichlet Laplacian in certain convex regions of fixed diameter and fixed
inradius (radius of the largest ball in the region). In particular, these
results improve the spectral gap bounds of I.M. Singer, B. Wang, S.T. Yau,
and S.S.T. Yau and
prove some special cases of a conjecture of M. van den Berg (Problem #44 in
Yau's 1990 "open problems in geometry") on the size of the spectral gap.
This talk is particularly designed for a general audience. We will discuss
results, show some pictures, and keep technicalities to a minimum.