
Workshop: Impagliazzo’s Worlds — June 3-5, 2009

Tentative program for the workshop on

"Complexity and Cryptography: Status of Impagliazzo's Worlds"

June 3-5, Princeton

Talks are in Room 006 of the Friend Center; lunch and the rump session are in the Convocation Room of the Friend Center; the banquet is in Palmer House.

Parking: You can park in lots 3 and 21 using this permit (see the campus map). There are also metered parking spots on Olden St and Prospect Ave.

Below are streaming versions of the videos; you can also download them via this link.

Wednesday, June 3rd.


10:30           Breakfast, coffee, etc.
11:15-11:30 Opening words
11:30-12:30  Russell Impagliazzo: Five Worlds of Problems hi-res video
12:30-2:30    Lunch
2:30-3:30      Chris Peikert: Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem hi-res video slides
3:30-4          Coffee break
4-5               Craig Gentry: A Homomorphic Public Key Cryptosystem hi-res video

6-9pm          Banquet at Palmer House

Thursday, June 4th.

9                Breakfast, coffee, etc.
9:30-10:30  Iftach Haitner: Inaccessible Entropy hi-res video slides
10:30-11     Coffee break
11-12          Luca Trevisan: On Algorithmica vs Heuristica
12-1:30       Lunch
1:30-2:30    Dimitris Achlioptas: Algorithmic Phase Transitions in Constraint Satisfaction Problems hi-res video slides
2:30-3:30    Uri Feige: Dense subgraphs of random graphs hi-res video
3:30-4         Coffee Break
4-5             Benny Applebaum: Cryptography from Average-Case Hardness hi-res video  slides

Dinner on your own.

7-9pm        Rump/open problems session schedule

slides:
Dodis, Goyal, Vadhan, Ostrovsky, Naor, Mahmoody-Ghidary, Chung, Prabhakaran, Trevisan, Haitner, Wee, Rothblum, Gordon, Viola, Xiao, Peikert, Barak

Friday, June 5th.

9                Breakfast, coffee, etc.
9:30-10:30  Scott Aaronson:  Impagliazzo's Worlds in Arithmetic Complexity hi-res video slides
10:30-11     Coffee break
11-12         Valentine Kabanets: Direct Product Testing hi-res video slides
12-1:30      Lunch
1:30-2:30   Moni Naor: How Fair Can a Coin Toss Be? hi-res video
2:30-3:30   Yuval Ishai:  Efficiency vs. assumptions in secure computation hi-res video slides
3:30-4       Coffee break
4-5            Amit Sahai: A New Protocol Compiler hi-res video  slides (keynote)

Abstracts

Russell Impagliazzo: Five Worlds of Problems
``A personal view of average-case complexity'' introduced five possible worlds as
metaphors for the outcomes of fundamental questions in complexity concerning
whether and to what extent intractable problems exist.
In this talk, I will try to put forward some concrete open problems addressing these issues.
I will emphasize those problems I believe are within reach,
in that there is no fundamental reason they should be difficult, and whose solutions
would cast light on which of the five worlds is the case and what these worlds look like.

Chris Peikert: Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem
We construct public-key cryptosystems that are secure assuming the
worst-case hardness of approximating the minimum distance on
$n$-dimensional lattices to within small poly(n) factors.  Prior
cryptosystems with worst-case connections were based either on the
shortest vector problem for a *special class* of lattices (Ajtai and
Dwork, STOC 1997; Regev, J. ACM 2004), or on the conjectured hardness
of lattice problems for *quantum* algorithms (Regev, STOC 2005).

Our main technical innovation is a reduction from variants of the
shortest vector problem to corresponding versions of the ``learning
with errors'' problem; previously, only a quantum reduction of this
kind was known.  As an additional contribution, we construct a natural
*chosen-ciphertext-secure* cryptosystem having a much simpler
description and tighter underlying worst-case approximation factor
than prior schemes.
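
For readers less familiar with the ``learning with errors'' (LWE) problem, here is a hedged sketch of its standard search formulation; the exact parameterization used in the talk may differ. Given independent samples (a_i, b_i) in Z_q^n x Z_q satisfying

\[ b_i = \langle a_i, s \rangle + e_i \pmod{q}, \]

where each a_i is uniform in Z_q^n, s in Z_q^n is a fixed secret, and each error e_i is drawn from a narrow distribution chi (for example, a discretized Gaussian), the task is to recover s. The reduction described above shows that solving this problem on average is at least as hard as approximating worst-case lattice problems.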


 Craig Gentry: A Homomorphic Public Key Cryptosystem
We propose a fully homomorphic encryption scheme – i.e., a scheme that
allows one to evaluate circuits over encrypted data without being able
to decrypt. Our solution comes in three steps. First, we provide a
general result – that, to construct an encryption scheme that permits
evaluation of arbitrary circuits, it suffices to construct an encryption
scheme that can evaluate (slightly augmented versions of) its own
decryption circuit; we call a scheme that can evaluate its (augmented)
decryption circuit bootstrappable.
Next, we describe a public key encryption scheme using ideal lattices
that is almost bootstrappable. Lattice-based cryptosystems typically
have decryption algorithms with low circuit complexity, often dominated
by an inner product computation that is in NC1. Also, ideal lattices
provide both additive and multiplicative homomorphisms (modulo a
public-key ideal in a polynomial ring that is represented as a lattice),
as needed to evaluate general circuits.
Unfortunately, our initial scheme is not quite bootstrappable – i.e.,
the depth that the scheme can correctly evaluate can be logarithmic in
the lattice dimension, just like the depth of the decryption circuit,
but the latter is greater than the former. In the final step, we show
how to modify the scheme to reduce the depth of the decryption circuit,
and thereby obtain a bootstrappable encryption scheme, without reducing
the depth that the scheme can evaluate. Abstractly, we accomplish this
by enabling the encrypter to start the decryption process, leaving less
work for the decrypter, much like the server leaves less work for
the decrypter in a server-aided cryptosystem.
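
As a hedged aside: the scheme in the talk uses ideal lattices, but the basic idea of evaluating circuits on encrypted bits can be illustrated with a much simpler (and, with these toy parameters, completely insecure) symmetric-key somewhat-homomorphic scheme over the integers, in the spirit of later integer-based constructions rather than the ideal-lattice scheme itself. Adding ciphertexts XORs the underlying bits and multiplying them ANDs the bits, until the accumulated noise outgrows the key; bootstrapping is precisely the extra step, omitted here, that refreshes such noisy ciphertexts. All names and parameters below are illustrative.

import random

# Toy somewhat-homomorphic encryption of single bits over the integers.
# Secret key: a large odd integer p.  A ciphertext of a bit m is
#     c = m + 2*r + p*q
# with small noise r and a large random multiplier q.
# Decryption computes (c mod p) mod 2, which is correct while the
# noise term m + 2*r (and its growth under XOR/AND) stays below p.

def keygen(bits=64):
    # Force the top bit so p is genuinely large, and the low bit so p is odd.
    return random.getrandbits(bits) | (1 << (bits - 1)) | 1

def encrypt(p, m, noise_bits=8, mult_bits=96):
    r = random.getrandbits(noise_bits)   # small noise
    q = random.getrandbits(mult_bits)    # large random multiple of the key
    return m + 2 * r + p * q

def decrypt(p, c):
    return (c % p) % 2

def hom_xor(c1, c2):
    # Adding ciphertexts adds plaintexts mod 2 (XOR); the noise terms add.
    return c1 + c2

def hom_and(c1, c2):
    # Multiplying ciphertexts multiplies plaintexts (AND); the noise terms multiply.
    return c1 * c2

if __name__ == "__main__":
    p = keygen()
    for m1 in (0, 1):
        for m2 in (0, 1):
            c1, c2 = encrypt(p, m1), encrypt(p, m2)
            assert decrypt(p, hom_xor(c1, c2)) == (m1 ^ m2)
            assert decrypt(p, hom_and(c1, c2)) == (m1 & m2)
    print("XOR and AND evaluated correctly on encrypted bits")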


Iftach Haitner: Inaccessible Entropy
We put forth a new computational notion of entropy, which measures the
(in)feasibility of sampling high entropy strings that are consistent
with a given protocol.  Specifically, we say that the i'th round of a
protocol (A, B) has _accessible entropy_ at most k, if no
polynomial-time strategy A^* can generate messages for A such that
its message in the i'th round has entropy greater than k
when conditioned both on prior messages of the protocol and on prior
coin tosses of A^*.  We say that the protocol has _inaccessible entropy_
if the total accessible entropy (summed over the rounds) is noticeably
smaller than the real entropy of A's messages, conditioned only on
prior messages (but not the coin tosses of A).  As applications of
this notion, we

* Give a much simpler and more efficient construction of statistically
hiding commitment schemes from arbitrary one-way functions.

* Prove that constant-round statistically hiding commitments are
necessary for constructing constant-round zero-knowledge proof systems
for NP that remain secure under parallel composition (assuming the
existence of one-way functions).
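
In symbols (a hedged formalization; the notation is mine, not necessarily that of the talk): writing M_i for the message A^* sends in round i, T_{<i} for the transcript of the earlier rounds, and R for A^*'s coins, round i has accessible entropy at most k if every polynomial-time A^* satisfies

\[ H\left( M_i \,\middle|\, T_{<i}, R \right) \le k , \]

and the protocol has inaccessible entropy if the sum of these bounds over the rounds is noticeably smaller than the real entropy \sum_i H( M_i \mid T_{<i} ) of the honest A's messages.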

Luca Trevisan: On Algorithmica vs Heuristica


Dimitris Achlioptas: Algorithmic Phase Transitions in Constraint Satisfaction Problems
For many Constraint Satisfaction Problems, by now we have a good
understanding of the largest constraint density for which solutions
exist. At the same time, though, all known polynomial-time algorithms
for these problems stop finding solutions at much smaller densities. We
study this phenomenon by examining how the different sets of solutions
evolve as constraints are added. We prove in a precise mathematical
sense that, for each problem studied, the barrier faced by algorithms
corresponds to a phase transition in that problem’s solution-space
geometry. Roughly speaking, at some problem-specific critical density,
the set of solutions shatters and goes from being a single giant ball
to exponentially many, well-separated, tiny pieces. All known
polynomial-time algorithms work in the ball regime, but stop as soon as
the shattering occurs. Besides giving a geometric view of the solution
space of random CSPs, our results provide novel constructions of one-way functions.
Joint work with Amin Coja-Oghlan.
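
As a hedged, purely illustrative aside (not code from the talk; all names and parameters are mine), the snippet below samples random 3-SAT formulas at clause density alpha = m/n and brute-forces satisfiability for tiny n. Even at this scale one can watch formulas go from mostly satisfiable to mostly unsatisfiable as alpha passes the empirical 3-SAT threshold near 4.27, which is the constraint-density picture the abstract refers to; at such small n the transition is of course blurred by finite-size effects.

import itertools
import random

def random_3sat(n, m, rng):
    # Sample m clauses, each on 3 distinct variables with random signs.
    clauses = []
    for _ in range(m):
        variables = rng.sample(range(n), 3)
        clauses.append([(v, rng.choice([True, False])) for v in variables])
    return clauses

def satisfiable(n, clauses):
    # Brute-force satisfiability check; only feasible for small n.
    for assignment in itertools.product([False, True], repeat=n):
        if all(any(assignment[v] == sign for v, sign in clause) for clause in clauses):
            return True
    return False

if __name__ == "__main__":
    rng = random.Random(0)
    n, trials = 12, 30
    for alpha in (2.0, 3.0, 4.0, 4.27, 5.0, 6.0):
        m = int(alpha * n)
        sat = sum(satisfiable(n, random_3sat(n, m, rng)) for _ in range(trials))
        print(f"alpha = {alpha:4.2f}: {sat}/{trials} random formulas satisfiable")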

Uri Feige: Dense subgraphs of random graphs
See the pdf file.


Benny Applebaum: Cryptography from Average-Case Hardness 

We construct a public key cryptosystem based on the assumption that
it is hard to solve n^{1.41} random 3-sparse equations modulo 2 in n
variables, where an n^{-0.2} fraction of the equations are noisy.
This assumption seems different from previous assumptions used to
construct public key systems. In particular, to our knowledge, it is the
first noisy-equations-type public key system with noise magnitude larger
than 1/sqrt{n}. We also show that our assumption is not refuted
by certain natural algorithms such as Lasserre SDPs, low-degree polynomials, and AC0 circuits,
and, using Feige's 3XOR Principle, that the refutation variant of
this assumption with the same parameters is implied by the hardness
of refuting random 3CNFs with O(n^{1.41}) clauses.
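
To make the underlying problem concrete, here is a hedged Python sketch (names are mine; the parameters follow the numbers in the abstract) that samples a planted instance: roughly n^{1.41} random 3-sparse equations modulo 2 in n variables, with about an n^{-0.2} fraction of the right-hand sides flipped.

import random

def sample_noisy_3lin_instance(n, rng):
    # Sample ~n^1.41 random 3-sparse equations mod 2 in n variables,
    # a ~n^-0.2 fraction of which are noisy, with a planted secret.
    m = int(round(n ** 1.41))          # number of equations
    noise_rate = n ** (-0.2)           # fraction of flipped right-hand sides
    secret = [rng.randrange(2) for _ in range(n)]
    equations = []
    for _ in range(m):
        support = rng.sample(range(n), 3)           # 3-sparse left-hand side
        rhs = sum(secret[i] for i in support) % 2   # correct value mod 2
        if rng.random() < noise_rate:
            rhs ^= 1                                # this equation is noisy
        equations.append((support, rhs))
    return secret, equations

if __name__ == "__main__":
    rng = random.Random(1)
    n = 200
    secret, eqs = sample_noisy_3lin_instance(n, rng)
    violated = sum((sum(secret[i] for i in s) % 2) != b for s, b in eqs)
    print(f"n={n}, m={len(eqs)} equations, fraction violated by the planted "
          f"secret: {violated / len(eqs):.3f} (noise rate {n ** (-0.2):.3f})")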

We also construct a public key cryptosystem based on the hardness of
the above problem with fewer equations - as low as c*n for a large
constant c - at the expense of making an additional assumption about
a planted dense subgraph problem in random 3-uniform hypergraphs.
Joint work with Boaz Barak and Avi Wigderson.

Scott Aaronson:  Impagliazzo's Worlds in Arithmetic Complexity
In this talk, I'll examine Impagliazzo’s worlds through the "funhouse
mirror" of arithmetic computation over finite fields.  The motivation
is not only to gain insight into the ordinary Boolean worlds, but also
to create a Natural Proofs theory for arithmetic circuits, which would
explain our failure to prove superpolynomial lower bounds for the
Permanent.  I'll prove two main results.  Firstly, in a reasonable
model of arithmetic computation over "small" finite fields F (where
"small" means |F| <= 2^poly(n)), one-way functions, pseudorandom
generators, and pseudorandom functions exist if and only if they exist
in the ordinary Boolean world.  The proof uses a shifted quadratic
character problem due to Boneh and Lipton as a "bridge" between the
Boolean and arithmetic worlds.  Secondly, over "large" finite fields
F (where "large" means |F| > 2^poly(n)), there's a meaningful sense in
which "P!=NP implies the existence of one-way functions": that is,
Heuristica, Pessiland, and Minicrypt all collapse to a single world.
I'll end by speculating about the possibility of a Natural Proofs
theory for low-degree arithmetic circuits, and proposing a concrete
candidate for a low-degree arithmetic pseudorandom function.

Joint work with Andrew Drucker.

Valentine Kabanets: Direct Product Testing

Moni Naor: How Fair Can a Coin Toss Be?
Coin-flipping protocols allow mutually distrustful parties to
generate a common unbiased random bit, guaranteeing that even if
one of the parties is malicious, it cannot significantly bias the
output of the honest party.
A classical result by Cleve from STOC 1986 showed that without
simultaneous broadcast, for any two-party coin-flipping protocol
with r rounds, there exists an efficient adversary that can bias
the output of the honest party by Omega(1/r). However, the best
previously known protocol (based on Blum's protocol) only guarantees
a bias of O(1/sqrt(r)), and Cleve and Impagliazzo have shown that this is
optimal in the fail-stop model (where the adversary is limited to
early stopping but has unbounded computational power).
We establish the optimal tradeoff between the round complexity and
the maximal bias of two-party coin-flipping protocols. Under
standard assumptions, we show that Cleve's lower bound is tight:
we construct an r-round protocol with bias O(1/r). We make use of
recent progress by Gordon, Hazay, Katz and Lindell regarding fair
protocols (STOC 2008).
Unlike Blum's coin-flipping protocol, which requires only the
existence of one-way functions, our protocol (like the protocol
of Gordon et al.) requires "cryptomania"-style assumptions. One of
the main interesting questions is whether this is essential.

Joint work with Tal Moran and Gil Segev.
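
For reference, here is a hedged sketch of the basic Blum-style coin toss mentioned above, with a hash-based commitment standing in for a generic commitment scheme (SHA-256 here is only an illustration, not an assumption made in the paper): Alice commits to a random bit, Bob sends his bit in the clear, Alice opens, and the output is the XOR of the two bits.

import hashlib
import os

def commit(bit):
    # Commit to a bit with a random nonce; SHA-256 stands in for a commitment scheme.
    nonce = os.urandom(32)
    digest = hashlib.sha256(nonce + bytes([bit])).hexdigest()
    return digest, (nonce, bit)

def verify(digest, nonce, bit):
    return hashlib.sha256(nonce + bytes([bit])).hexdigest() == digest

def blum_coin_toss():
    # Alice commits to a random bit a.
    a = os.urandom(1)[0] & 1
    com, (nonce, a_committed) = commit(a)
    # Bob, seeing only the commitment, replies with a random bit b.
    b = os.urandom(1)[0] & 1
    # Alice opens; if the opening verifies, the shared coin is a XOR b.
    assert verify(com, nonce, a_committed)
    return a_committed ^ b

if __name__ == "__main__":
    flips = [blum_coin_toss() for _ in range(10000)]
    print("empirical deviation from 1/2 with honest parties:",
          abs(sum(flips) / len(flips) - 0.5))

A malicious Alice learns a XOR b before deciding whether to open, so she can abort whenever she dislikes the outcome; composing such tosses over r rounds is what yields the O(1/sqrt(r)) protocols mentioned above, whereas the construction in the talk achieves the optimal O(1/r) bias.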


Yuval Ishai:  Efficiency vs. assumptions in secure computation

The talk will survey the state of the art in the complexity of secure computation, highlight
connections with the topics of the workshop, and discuss the role of
nonstandard assumptions in some of the current frontiers.



Amit Sahai: A New Protocol Compiler

One of the most fundamental goals in cryptography is to design
protocols that remain secure when adversarial participants can engage
in arbitrary malicious behavior.  In 1986, Goldreich, Micali, and
Wigderson presented a powerful paradigm for designing such protocols:
their approach reduced the task of designing secure protocols to
designing protocols that only guarantee security against
"honest-but-curious" participants. By making use of zero-knowledge
proofs, the GMW paradigm enforces honest behavior without compromising
secrecy.  Over the past two decades, this approach has been the
dominant paradigm for cryptographic protocol design.
In this talk, we present a new general paradigm / protocol compiler
for secure protocol design. Our approach also reduces the task of
designing secure protocols to designing protocols that only guarantee
security against honest-but-curious participants.  However, our
approach avoids the use of zero-knowledge proofs, and instead makes
use of multi-party protocols in a much simpler setting - where the
majority of participants are completely honest (such multi-party
protocols can exist without requiring any computational assumptions).
Our paradigm yields protocols that rely on Oblivious Transfer (OT) as
a building block.  This offers a number of advantages in generality
and efficiency.
In contrast to the GMW paradigm, by avoiding the use of zero-knowledge
proofs, our paradigm is able to treat all of its building blocks as
"black boxes". This allows us to improve over previous results in the
area of secure computation. In particular, we obtain:
* Conceptually simpler and more efficient ways for basing
unconditionally secure cryptography on OT.
* More efficient protocols for generating a large number of OTs using
a small number of OTs.
* Secure and efficient protocols that make only black-box use of
cryptographic primitives or underlying algebraic structures, in
settings where no such protocols were known before.

This talk is based primarily on joint works with Yuval Ishai
(Technion and UCLA) and Manoj Prabhakaran (UIUC).
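
As a hedged illustration of OT as a building block in the honest-but-curious setting (this is the classical GMW-style gate evaluation, not the new compiler described above; all names are mine), the sketch below computes XOR shares of the AND of two XOR-shared bits using an idealized 1-out-of-4 OT in place of a real OT protocol.

import random

def ideal_ot_1_of_4(sender_messages, choice):
    # Idealized 1-out-of-4 OT: the receiver learns only the chosen message,
    # and the sender learns nothing.  A real protocol would replace this call
    # with a cryptographic OT.
    return sender_messages[choice]

def gmw_and_gate(a0, b0, a1, b1, rng):
    # Party 0 holds shares (a0, b0) and party 1 holds (a1, b1) of bits
    # a = a0 ^ a1 and b = b0 ^ b1.  Returns XOR shares (s0, s1) of a AND b.
    s0 = rng.randrange(2)  # party 0's random output share
    # Party 0 prepares one message for each possible pair of party 1's shares.
    messages = [s0 ^ ((a0 ^ x) & (b0 ^ y)) for x in (0, 1) for y in (0, 1)]
    # Party 1 obtains the message indexed by its own shares (a1, b1) via OT.
    s1 = ideal_ot_1_of_4(messages, 2 * a1 + b1)
    return s0, s1

if __name__ == "__main__":
    rng = random.Random(0)
    for a in (0, 1):
        for b in (0, 1):
            a0, b0 = rng.randrange(2), rng.randrange(2)
            a1, b1 = a ^ a0, b ^ b0          # XOR-share the inputs
            s0, s1 = gmw_and_gate(a0, b0, a1, b1, rng)
            assert (s0 ^ s1) == (a & b)
    print("GMW-style AND gate over XOR shares verified for all inputs")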