2-Source Extractors Under Computational Assumptions and Cryptography with Defective Randomness
 

Summary:
Yael Tauman Kalai
Xin Li
Anup Rao
May 8, 2009
1 Context
Randomness is a useful resource for solving many problems in computer science. The goal of the broad
area of derandomization is to weaken the assumptions placed on the randomness used. One way to do this
is to design randomness extractors: algorithms that take randomness coming from defective sources and
extract truly random bits from it. A 2-source extractor is an algorithm that extracts random bits from
two independent sources, each giving bits with some entropy. The best known extractor algorithms for
this situation require that at least one of the sources has entropy rate at least 0.499, even though
the probabilistic method shows that a random bit can be extracted from two independent sources, each of
which gives n bits with only log n + O(1) bits of entropy.
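For reference, the standard formalization (background that this summary leaves implicit) measures the
quality of a source by its min-entropy; in LaTeX notation:

    % Min-entropy: a source X has k bits of min-entropy if no single
    % outcome is too likely.
    H_\infty(X) = \min_{x} \log_2 \frac{1}{\Pr[X = x]}
    % A (k_1, k_2, \varepsilon) 2-source extractor is a function
    %   \mathrm{Ext} : \{0,1\}^n \times \{0,1\}^n \to \{0,1\}^m
    % such that for all independent sources X, Y with
    % H_\infty(X) \ge k_1 and H_\infty(Y) \ge k_2, the output
    % \mathrm{Ext}(X, Y) is \varepsilon-close in statistical distance
    % to the uniform distribution U_m. The "entropy rate" referred to
    % above is k/n.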
Past work in derandomization has left us with the dramatic result that any randomized algorithm can be
simulated by an algorithm that has access only to a single source of randomness with some entropy.
However, even though randomness is used in an essential way in cryptography and distributed computing,
there was no analogous result for these domains. Moreover, there is a strong negative result showing
that many cryptographic tasks are impossible given only a single weak random source. Nevertheless, it
was later shown how to use a variant of the DDH assumption
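As concrete background for the entropy-rate barrier mentioned above, the following is a minimal sketch
(in Python, illustrative only and not the construction of this paper) of the classic inner-product
2-source extractor of Chor and Goldreich, which outputs one almost-uniform bit whenever the
min-entropies of the two sources sum to sufficiently more than n:

    # Inner-product (Hadamard) 2-source extractor: output the inner
    # product over GF(2) of the two n-bit source samples. If X and Y are
    # independent and H_inf(X) + H_inf(Y) exceeds n by enough, the
    # output bit is statistically close to uniform (Chor-Goldreich).
    # All names here are illustrative.

    def inner_product_extractor(x: bytes, y: bytes) -> int:
        """Extract one bit from two independent weak source samples."""
        assert len(x) == len(y), "samples must have equal length"
        acc = 0
        for xb, yb in zip(x, y):
            # Bitwise AND multiplies corresponding bits over GF(2);
            # XOR-accumulating the bytes preserves the parity of the sum.
            acc ^= xb & yb
        # Parity of the accumulated byte equals the inner product mod 2.
        return bin(acc).count("1") & 1

Note that each source alone may have entropy rate only slightly above 1/2 for the sum to exceed n,
which is exactly the rate-1/2 barrier that the best known unconditional constructions sit near.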

  

Source: Anderson, Richard - Department of Computer Science and Engineering, University of Washington at Seattle

 

Collections: Computer Technologies and Information Sciences