
This is a tentative program and subject to change.

**Barriers in Computational Complexity workshop**


Talks are in Room 101, the Friend Center auditorium; lunch is in the Friend Center convocation room (Room 113). Breakout rooms Friend 004 and 006 are available.


**Tue, Aug 25: Boolean complexity**


Organizers: Scott Aaronson, Alexander Razborov

08:00 Breakfast

09:30 – 10:20 : Alexander Razborov, *Somewhat Impartial Survey of Results in Circuit Complexity* alexander_razborov_slides hi-res video low-res video

10:30 – 11:20 : Scott Aaronson, *Has There Been Progress on the P vs. NP Question?* scott_aaronson_slides hi-res video low-res video

Related links: NP and reality, P vs NP independent?, Fortnow survey, Sipser survey, Sipser video, Quantum adiabatic 1, Quantum adiabatic 2, solution space of k-sat

11:20 – 11:40 Break

11:40 – 12:30 : Benjamin Rossman, *The monotone complexity of k-clique on random graphs* ben_rossman_slides

12:30 – 14:30 Lunch

14:30 – 15:20 : Ketan Mulmuley, *On P vs. NP, Geometric Complexity Theory, Explicit Proofs and the Complexity Barrier* ketan_mulmuley_slides hi-res video low-res video

All papers are available from Mulmuley’s homepage; see also videos of his IAS lectures.

15:20 – 17:00 Break

17:00 – 18:00 : Panel discussion, moderated by Scott Aaronson hi-res video low-res video

**Wed, Aug 26: Arithmetic computation**


Organizers: Ran Raz, Amir Shpilka.

08:00 Breakfast

09:30 – 11:20 : Amir Shpilka: *A (biased) survey of arithmetic circuits* amir_shpilka_slides hi-res video low-res video

11:20 – 11:40 Break

11:40 – 12:30 : Amir Yehudayoff: *On the structure of arithmetic circuits, with applications* amir_yehoudayoff_slides hi-res video low-res video

12:30 – 14:30: Lunch

14:30 – 15:20 : Chris Umans: *Group-theoretic Algorithms for Matrix Multiplication* chris_umans_slides hi-res video low-res video


15:40 – 16:30 : Ran Raz: *How to fool people to work on circuit lower bounds* ran_raz_slides hi-res video low-res video

16:30 – 17:00 Break

17:00 – 18:00 : Rump Session + Open Problems hi-res video low-res video

18:30 – 21:00: Workshop Banquet, Palmer House


**Thu, Aug 27: Proof complexity**


Organizers: Toni Pitassi, Pavel Pudlak

08:00 Breakfast

09:00 – 09:45 Steve Cook, *Branching Programs: Avoiding Barriers *steve_cook_slides hi-res video low-res video

09:50 – 10:35 Pavel Pudlak, *Proof complexity of search problems* pavel_pudlak_slides hi-res video low-res video

10:35 – 10:50 Break

10:50 – 11:35 Toniann Pitassi, *Hardness Amplification in Proof Complexity* tonnian_pitassi_slides hi-res video low-res video

11:40 – 12:25 Jan Krajicek, *Hard Tautologies *jan_krajicek_slides hi-res video low-res video

12:30 – 14:00 Lunch

14:00 – 14:45 Russell Impagliazzo, *An Axiomatic Approach to Algebrization* (overheads talk) hi-res video low-res video

14:45 – 15:00 Break

15:00 – 16:15 Short talks (25 minutes each)

Rahul Santhanam, *Effective Polynomial Simulations *rahul_santhanam_slides hi-res video low-res video

Jacob Nordstrom, *Understanding Space in Proof Complexity: Separations and Tradeoffs via Substitutions* jacob_nordstrom_slides hi-res video low-res video

Neil Thapen, *TBA*

16:15 – 16:30 Break

16:30 – 17:45 Short talks (25 minutes each)

Iddo Tzameret, *Proofs of polynomial identities* iddo_tzameret_slides hi-res video low-res video

Emil Jerabek, *On monotone sequent calculus* emil_jerabek_slides hi-res video low-res video


17:45 – 20:30 Dinner

20:30 – 21:30 Panel Discussion, “Will proof complexity help to solve P vs. NP?”

Sam Buss (moderator)

Steve Cook

Paul Beame

Alasdair Urquhart

**Fri, Aug 28: Computational Learning Theory**


Organizers: Avrim Blum, Rocco Servedio

08:00 Breakfast

09:20 – 09:30 Opening remarks

09:30 – 10:20: Tong Zhang, *Computation and Optimization in Machine Learning* tong_zhang_slides hi-res video low-res video

10:30 – 11:20 Nina Balcan, *Approximate Clustering without the Approximation* nina_balcan_slides hi-res video low-res video

Related links: approx clustering, low error clustering, agnostic clustering, sparsest cut clustering, clustering framework

11:20 – 11:40 Break

11:40 – 12:30 Adam Kalai, *Learning and Smoothed Analysis *adam_kalai_slides hi-res video low-res video

12:30 – 14:30 Lunch break

14:30 – 15:20 John Langford, *Getting around Barriers in Learning Theory *john_langford_slides hi-res video low-res video

15:30 – 16:20 Adam Klivans, *Learning, Polynomials, and Circuit Lower Bounds* adam_klivans_slides hi-res video low-res video

16:20 – 16:40 Break

16:40 – 18:00 Discussion hi-res video low-res video

**Sat, Aug 29: Pseudorandomness & Derandomization**


Organizers: Valentine Kabanets, Luca Trevisan

08:00 Breakfast

09:15 – 10:00 Mark Braverman: *Polylog-wise Independence Fools AC0* (blackboard talk) hi-res video low-res video

10:15 – 11:00 Valentine Kabanets: *Derandomization vs. Circuit Lower Bounds* valentine_kabanets_slides hi-res video low-res video

11:00 – 11:15 Break

11:15 – 12:00 Omer Reingold: *Randomness vs. Memory: Prospects and Barriers* omer_reingold_slides hi-res video low-res video

12:00 – 14:00 Lunch

14:00 – 14:45 Luca Trevisan: *Pseudorandomness in Additive Combinatorics* luca_trevisan_slides hi-res video low-res video


14:45 – 15:15 Break

End of program


**Some abstracts**

**Ketan Mulmuley**

**Title: On P vs. NP, Geometric Complexity Theory, Explicit Proofs and the Complexity Barrier**


**Abstract:**

**This talk will give an informal overview of the complexity barrier and its role in geometric complexity theory (GCT), an approach to the P vs. NP problem and related problems. The relevant paper (with the same title as the talk) is available on the web at the speaker’s personal home page.**

**—————————**

**Tong Zhang**

**Title: Computation and Optimization in Machine Learning**

**Abstract: Many problems in statistical machine learning involve non-convex objective functions that are difficult to optimize. A frequently employed approach is to solve a convex approximation (or relaxation) of the original problem, which raises the question of the quality of such relaxation methods: is the solution of the approximation close to the solution of the original problem? A different approach is to use traditional numerical optimization procedures that directly find a local minimum of the non-convex problem. Again, an important question is the quality of the local minimum found.**

**I will discuss some recent results for both approaches, as well as some future directions. In the first part, I will discuss convex relaxation methods, with emphasis on the effectiveness of convex relaxation. The second part discusses local optimization methods; again, the emphasis is on the effectiveness of such procedures.**
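As a toy illustration of the two approaches the abstract contrasts (this example is mine, not from the talk): in one-dimensional sparse regression with an orthonormal design, the per-coordinate problem min_w 0.5*(w - z)^2 + penalty(w) has a closed-form minimizer for both the nonconvex L0 penalty and its convex L1 relaxation, so the effect of relaxing can be seen directly.

```python
# Toy sketch: nonconvex L0 penalty vs. its convex L1 relaxation, in the
# one-coordinate case where both problems have closed-form solutions.

def l0_minimizer(z, lam):
    """Hard thresholding: minimizes 0.5*(w - z)**2 + lam*[w != 0].
    Keeping w = z costs lam; setting w = 0 costs 0.5*z**2."""
    return z if 0.5 * z * z > lam else 0.0

def l1_minimizer(z, lam):
    """Soft thresholding: minimizes 0.5*(w - z)**2 + lam*abs(w)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```

Soft thresholding shrinks every surviving coefficient by lam, while hard thresholding keeps survivors untouched; the quality question the abstract raises is how far such a relaxation can move the solution away from that of the original problem.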


**Adam Kalai**

**Title: Learning and Smoothed Analysis**

**Abstract:**

**We give a new model of learning motivated by smoothed analysis (Spielman and Teng, 2001). In this model, we analyze two new algorithms, for PAC-learning DNFs and agnostically learning decision trees, from random examples drawn from constant-bounded product distributions. These two problems had previously been solved using membership queries (Jackson, 1995; Gopalan et al., 2005). Our analysis demonstrates that the “heavy” Fourier coefficients of a DNF alone suffice to recover the DNF. We also show that a structural property of the Fourier spectrum of any boolean function holds over “typical” product distributions.**


**Joint work with Alex Samorodnitsky and Shang-Hua Teng**
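The Fourier coefficients the abstract refers to can be computed by brute force for small functions; here is a minimal sketch under the uniform distribution (the simplest product distribution; the talk concerns general constant-bounded product distributions, and the example function is mine, not from the talk):

```python
# Brute-force Fourier expansion of f: {-1,1}^n -> {-1,1} under the
# uniform distribution: fhat(S) = E_x[ f(x) * prod_{i in S} x_i ].
from itertools import product, combinations

def fourier_coefficients(f, n):
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            total = 0
            for x in points:
                parity = 1
                for i in S:
                    parity *= x[i]
                total += f(x) * parity
            coeffs[S] = total / len(points)
    return coeffs

# Example: AND of two bits, encoding True as +1 (a one-term "DNF").
and2 = lambda x: 1 if x[0] == 1 and x[1] == 1 else -1
coeffs = fourier_coefficients(and2, 2)
```

For this function every nonempty coefficient has magnitude 1/2, so all of its Fourier weight sits on a few “heavy” coefficients, the structure that the DNF-recovery result exploits.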

**————————–**

**Adam Klivans**


**Title: Learning, Polynomials, and Circuit Lower Bounds.**


**Abstract:**


**In 1989, Linial, Mansour, and Nisan pioneered the “Fourier method” in computational learning theory by reducing the problem of uniform-distribution learning to the problem of approximating concept classes by real polynomials. Additionally, they gave applications to proving circuit lower bounds for constant-depth circuits. In this two-part talk, I will continue these themes and describe 1) how real approximation theory is still a fundamental problem in learning theory (in particular, by examining the complexity of empirical risk minimization over R^n) and 2) why, in a precise sense, learning algorithms actually imply circuit lower bounds.**

**————————–**


**Maria-Florina Balcan**


**Title: Approximate Clustering without the Approximation**


**Abstract:**


**There has been substantial work on approximation algorithms for clustering data under distance-based objective functions such as the k-median, k-means, and min-sum objectives. This work is fueled in part by the hope that approximating these objectives well will indeed yield more accurate solutions. That is, for problems such as clustering proteins by function, or clustering images by subject, there is some unknown correct “target” clustering, and the implicit assumption is that clusterings that are approximately optimal in terms of these distance-based measures are also approximately correct in terms of error with respect to the target. In this work we show that if we make this implicit assumption explicit (that is, if we assume that any c-approximation to the given clustering objective Phi is epsilon-close to the target), then we can produce clusterings that are O(epsilon)-close to the target, even for values c for which obtaining a c-approximation is NP-hard. In particular, for the k-median, k-means, and min-sum objectives, we show that we can achieve this guarantee for any constant c > 1.**

**Our results show how, by explicitly considering the alignment between the objective function used and the true underlying clustering goals, one can bypass computational barriers and perform as if these objectives were computationally substantially easier.**


**————————–**


**John Langford**

** **

**Title: Getting around Barriers in Learning Theory**


**Abstract:**


**This talk is about a collection of examples where a barrier in learning theory has fallen, how it fell, and some discussion about how we can get around remaining barriers. I plan to discuss and compare problems in sample complexity, computational complexity, and complex learning problems.**


**————————–**
