At TEDxAthens 2010, in Athens, Greece, Dimitris Achlioptas delivers an insightful talk on Algorithmic Barriers from Phase Transitions.
Transport is a fundamental underpinning of almost every physical system. We are interested in traffic flow on highways, congestion of packets on the Internet, flow of nutrients through the body, flocking behavior of birds in flight, formation of river networks, etc.
These all rely on the transportation and flow of physical substances. In almost all scenarios we see a range of self-organized behaviors in the flow, such as the formation of vortices or spiral waves (i.e., we see the formation of patterns and order with no external control or guidance). Understanding what gives rise to these patterns requires insights ranging from mathematics and physics to social science and economics.
Of greater interest is understanding what causes jamming, in other words, what causes the flow to break down and stop. Very often we find that jamming happens abruptly. For instance, at a low density of cars on a highway, the flow is smooth and even. Increase the density a tiny bit, and the whole system can come to a complete standstill.
Simple models that capture this abrupt phase transition from flow to jamming have been found to describe a wide range of physical scenarios, and thus have a “universal” nature. Extending these models to realistic scenarios of highway traffic, or flow on a network is an outstanding challenge, and an active area of research. Another outstanding challenge is extending these models to describe flow in biological systems.
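One of the simplest such models (a standard textbook example, not necessarily the specific models discussed in this session) is the Rule-184 cellular automaton: cars on a ring advance one cell per step whenever the cell ahead is empty. In steady state its flow is min(density, 1 - density), so it jams abruptly once density exceeds one half. A minimal sketch:

```python
import random

def rule184_step(cells):
    """One synchronous update of the Rule-184 traffic CA on a ring:
    a car (1) advances into the next cell if that cell is empty."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        if cells[i] == 1 and cells[(i + 1) % n] == 0:
            nxt[(i + 1) % n] = 1   # car moves forward
        elif cells[i] == 1:
            nxt[i] = 1             # car is blocked and stays put
    return nxt

def average_flow(density, n=200, steps=400, seed=0):
    """Average number of car moves per site per step, measured after
    discarding the first half of the run as a transient."""
    rng = random.Random(seed)
    cells = [1 if rng.random() < density else 0 for _ in range(n)]
    moved = 0
    for t in range(steps):
        moving = sum(1 for i in range(n)
                     if cells[i] == 1 and cells[(i + 1) % n] == 0)
        if t >= steps // 2:
            moved += moving
        cells = rule184_step(cells)
    return moved / (steps // 2) / n
```

Sweeping `average_flow` over densities traces out the characteristic triangular flow-density diagram: flow rises linearly up to density 1/2 (free flow) and falls linearly beyond it (jammed flow).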
In this session we bring together experts from the US and Japan who study flow and jamming and are addressing these outstanding challenges. K. Nishinari, from the University of Tokyo, studies self-driven flows, jams, and flocking behavior. Jen Schwarz, from Syracuse University, studies dynamical phase transitions, statistical approaches to computational complexity, jamming, biophysics, and Boolean networks. She recently introduced new models which challenge our conventional understanding of the “universal” nature of the jamming phase transition.
Numerous problems in applied mathematics and statistics require integrating a function f over a high-dimensional domain.
For example, estimating the partition function of a graphical model for a fixed set of parameters requires integrating (summing) its unnormalized probability function f over all possible configurations (value assignments).
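For a small model this sum can be computed exhaustively, which makes the task concrete. The sketch below uses a hypothetical three-variable pairwise model (not one from the talk) whose unnormalized probability rewards agreeing neighbors:

```python
import itertools
import math

def partition_function(f, num_vars, domain=(0, 1)):
    """Sum the unnormalized probability f over all value assignments."""
    return sum(f(x) for x in itertools.product(domain, repeat=num_vars))

# Hypothetical example: three binary variables on a triangle, with
# f(x) = exp(number of edges whose endpoints agree).
edges = [(0, 1), (1, 2), (0, 2)]
f = lambda x: math.exp(sum(1 for i, j in edges if x[i] == x[j]))

Z = partition_function(f, 3)   # equals 2*e^3 + 6*e by direct enumeration
```

The catch, of course, is that the sum has exponentially many terms (2^n for n binary variables), which is precisely why stochastic methods are needed for realistic models.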
The most common methods for performing such integration stochastically involve Markov chain Monte Carlo (MCMC).
We present an alternative to MCMC based on ideas from the theory of modern error-correcting codes.
Specifically, we will see that stochastically integrating a non-negative function f over its entire domain can often be achieved at the cost of merely maximizing f over a few random subsets of its domain.
The key lies in specifying subsets with appropriate statistical properties in a manner that does not dramatically affect the capacity for maximization.
Using real-life CNF formulas as benchmarks, our approach of encoding the subsets as cosets of LDPC error-correcting codes allows us to achieve dramatic speedup and levels of accuracy that were previously completely unattainable.
The best-known model of random graphs was introduced 50 years ago, by Erdős and Rényi (ER): take n vertices and connect each pair of them, independently, with probability p(n).
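The definition translates directly into code; a minimal sketch of sampling from G(n, p):

```python
import random

def erdos_renyi(n, p, seed=None):
    """Sample an Erdos-Renyi G(n, p) graph: each of the n*(n-1)/2
    vertex pairs becomes an edge independently with probability p."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

g = erdos_renyi(100, 0.05, seed=42)   # expected number of edges: ~247
```

Note that the expected number of edges is p * n * (n - 1) / 2, so choosing p(n) of order c/n yields constant average degree, which is the regime where the model's most striking phase transitions occur.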
By now, and thousands of research papers later, we know that ER random graphs have a number of amazing properties, many of which we understand very well.
A bit more slowly we have also come to understand that ER random graphs can be poor models of “network realities”: it is impossible to place the vertices of a (typical) ER random graph in a low-dimensional space, so that for each pair of vertices, their geometric distance is a good approximation of their graphical (shortest-path) distance. In other words, ER random graphs are inherently high-dimensional objects, in contrast to many real networks.
In this tutorial talk we will survey some of the fundamental results for ER random graphs and use them as springboards to introduce (and compare with) alternative random graph models.
We consider the task of summing (integrating) a non-negative function over a discrete domain, e.g., to compute the partition function of a graphical model.
It is known that in a probabilistic approximate sense, summation can be reduced to maximization over random subsets of the domain defined by systems of linear equations modulo 2 (parity constraints).
Unfortunately, random parity constraints with many variables are computationally intractable, while random parity constraints with few variables have poor statistical performance.
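The statistical half of this tradeoff can be seen even in a toy setting (a hypothetical demonstration, with single-variable parities standing in for "few-variable" constraints, and the intractability of long parities hidden by brute-force enumeration). Both estimators below are unbiased, but on a clustered set the sparse one has far higher variance:

```python
import random
from statistics import mean, variance

def random_constraint(n, k, rng):
    """One parity constraint sum(x[i] for i in idx) = b (mod 2).
    For integer k, k variables are chosen at random; k=None includes
    each variable independently with probability 1/2 (a long parity)."""
    if k is None:
        idx = [i for i in range(n) if rng.randrange(2)]
    else:
        idx = rng.sample(range(n), k)
    return idx, rng.randrange(2)

def survives(x, constraints):
    return all(sum(x[i] for i in idx) % 2 == b for idx, b in constraints)

def estimates(S, n, m, k, trials, seed=0):
    """One unbiased |S|-estimate per trial: 2**m times the number of
    points of S satisfying m random width-k parity constraints."""
    rng = random.Random(seed)
    out = []
    for _ in range(trials):
        cs = [random_constraint(n, k, rng) for _ in range(m)]
        out.append(2 ** m * sum(1 for x in S if survives(x, cs)))
    return out

# A "clustered" set: the 16 points of {0,1}^8 whose first 4 coordinates are 0.
S = [tuple([0, 0, 0, 0] + [(v >> i) & 1 for i in range(4)]) for v in range(16)]
dense = estimates(S, 8, 2, None, 3000)   # long parities
sparse = estimates(S, 8, 2, 1, 3000)     # single-variable parities
```

With long parities, survival events are pairwise independent, so the estimator concentrates; single-variable parities correlate strongly with the cluster structure of S, inflating the variance even though the mean stays correct.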