# Boneh, Dan 1969-

Overview
Works: 15 works in 42 publications in 2 languages and 397 library holdings
Genres: Conference papers and proceedings
Roles: Author, Editor, Thesis advisor
Most widely held works by Dan Boneh
Advances in cryptology - CRYPTO 2003 : 23rd Annual International Cryptology Conference, Santa Barbara, California, USA, August 17-21, 2003 : proceedings by Dan Boneh (Book)

28 editions published in 2003 in English and German and held by 379 WorldCat member libraries worldwide

Annotation: This book constitutes the refereed proceedings of the 23rd Annual International Cryptology Conference, CRYPTO 2003, held in Santa Barbara, California, in August 2003. The 34 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 166 submissions. The papers are organized in topical sections on public key cryptanalysis, alternate adversary models, protocols, symmetric key cryptanalysis, universal composability, zero knowledge, algebraic geometry, public key constructions, new problems, symmetric key constructions, and new models
Breaking DES using a molecular computer by Dan Boneh (Book)

1 edition published in 1995 in English and held by 2 WorldCat member libraries worldwide

Abstract: "Recently Adleman [1] has shown that a small traveling salesman problem can be solved by molecular operations. In this paper we show how the same principles can be applied to breaking the Data Encryption Standard (DES). Our method is based on an encoding technique presented in Lipton [8]. We describe in detail a library of operations which are useful when working with a molecular computer. We estimate that given one arbitrary (plain-text, cipher-text) pair, one can recover the DES key in about 4 months of work. Furthermore, if one is given cipher-text, but the plain text is only known to be one of several candidates then it is still possible to recover the key in about 4 months of work. Finally, under chosen cipher-text attack it is possible to recover the DES key in one day using some preprocessing."
On the implementation of pairing-based cryptosystems by Ben Lynn (Book)

1 edition published in 2007 in English and held by 2 WorldCat member libraries worldwide

Making DNA computers error resistant by Dan Boneh (Book)

1 edition published in 1995 in English and held by 2 WorldCat member libraries worldwide

Security for real-world networked applications by Nagendra Gupta Modadugu (Book)

1 edition published in 2007 in English and held by 2 WorldCat member libraries worldwide

Hardware support for tamper-resistant and copy-resistant software

1 edition published in 2001 in English and held by 1 WorldCat member library worldwide

"Although there have been many attempts to develop code transformations that yield tamper-resistant software, no reliable software-only methods are known. Motivated by numerous potential applications, we investigate a prototype hardware mechanism that supports software tamper-resistance with an atomic decrypt-and-execute operation. Our hardware architecture uses a novel combination of standard architectural units. As usual, security has its costs. In this design, the most difficult security tradeoffs involve testability and performance."--Abstract
Formal proofs of cryptographic security of network protocols by Arnab Roy

1 edition published in 2009 in English and held by 1 WorldCat member library worldwide

Present-day internet users and networked enterprises rely on key management and related protocols that use cryptographic primitives. In spite of the staggering financial value of, say, the credit card numbers transmitted over SSL/TLS in a day, we do not have correctness proofs that respect cryptographic notions of security for many of these relatively simple distributed programs. In light of this challenge, there have been many efforts to develop and use methods for proving security properties of network protocols. Computational Protocol Composition Logic (CPCL), developed by our group at Stanford, is a symbolic logic whose semantics is defined with respect to the complexity-theoretic model of cryptography. The axiomatic proofs in CPCL do not involve probability and complexity and are amenable to automation. Furthermore, the soundness theorem guarantees that they provide mathematical guarantees comparable to traditional hand-proofs done by cryptographers. Protocol authentication properties are generally trace-based, meaning that authentication holds for the protocol if authentication holds for individual traces (runs of the protocol and adversary). Computational secrecy conditions, on the other hand, often are not trace-based: the ability to computationally distinguish a system that transmits a secret from one that does not is measured by overall success on the *set* of all traces of each system. Non-trace-based properties present a challenge for inductive or compositional methods: induction is a natural way of reasoning about traces of a system, but it does not appear directly applicable to non-trace properties. We therefore investigate the semantic connection between trace properties that could be established by induction and non-trace-based security requirements.
In this dissertation, we present foundations for inductive analysis of computational security properties by proving connections between selected trace properties of protocol executions and non-trace complexity-theoretic properties standard in the literature. Specifically, we prove that a certain trace property implies computational secrecy and authentication properties, assuming the encryption scheme provides chosen-ciphertext security and ciphertext integrity. We formalize the aforesaid inductive properties in a set of new axioms and inference rules that are added to CPCL and prove soundness of the system over a standard cryptographic model with a probabilistic polynomial-time adversary. We illustrate the system by giving a modular, formal proof of computational authentication and secrecy properties of Kerberos V5. We also present axioms and inference rules for reasoning about Diffie-Hellman-based key exchange protocols and use these rules to prove authentication and secrecy properties of two important protocol standards: the Diffie-Hellman variant of Kerberos, and IKEv2, the revised standard key management protocol for IPsec. The proof system extended with the new axioms and rules is sound for an accepted semantics used in cryptographic studies. In the process of applying our system, we uncover a deficiency in Diffie-Hellman Kerberos that is easily repaired
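The Diffie-Hellman exchanges these proofs reason about reduce to a simple pattern. A minimal sketch in Python with deliberately tiny toy parameters (real deployments such as IKEv2 use 2048-bit MODP groups from RFC 3526, or elliptic curves):

```python
import secrets

# Toy Diffie-Hellman exchange. The parameters below are illustrative only;
# they are far too small to be secure.
p, g = 23, 5

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)
B = pow(g, b, p)

# Both sides derive the same shared secret g^(ab) mod p.
shared_a = pow(B, a, p)
shared_b = pow(A, b, p)
assert shared_a == shared_b
```

The authentication properties the dissertation proves concern exactly this pattern wrapped in a larger protocol: without authenticating A and B, the textbook exchange above is open to man-in-the-middle attacks.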
Collusion-secure fingerprinting for digital data by Dan Boneh (Book)

1 edition published in 1994 in English and held by 1 WorldCat member library worldwide

Abstract: "This paper discusses methods for assigning codewords for the purpose of fingerprinting digital data (e.g., software, documents, and images). Fingerprinting consists of uniquely marking and registering each copy of the data. This marking allows a distributor to detect any unauthorized copy and trace it back to the user. This threat of detection will deter users from releasing unauthorized copies. A problem arises when users collude: For digital data, two different fingerprinted objects can be compared and the differences between them detected. Hence, a set of users can collude to detect the location of the fingerprint. They can then alter the fingerprint to mask their identities. We present a general fingerprinting solution which is secure in the context of collusion. In addition, we discuss methods for distributing fingerprinted data."
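The collusion attack the abstract describes takes only a few lines: two colluders diff their fingerprinted copies, and every differing position is a mark location they can now alter. A minimal sketch, where the embedding function is a made-up stand-in, not the paper's actual codes:

```python
def embed(data: bytes, codeword: list[int], positions: list[int]) -> bytes:
    # Hypothetical embedding: write one codeword bit into the low bit
    # of each mark position. A stand-in for a real fingerprinting scheme.
    out = bytearray(data)
    for pos, bit in zip(positions, codeword):
        out[pos] = (out[pos] & ~1) | bit
    return bytes(out)

def collude(copy_a: bytes, copy_b: bytes) -> list[int]:
    # Two colluders compare their copies; every differing byte exposes
    # a mark position, which they can then overwrite to evade tracing.
    return [i for i, (x, y) in enumerate(zip(copy_a, copy_b)) if x != y]

original = bytes(range(16))
positions = [2, 5, 11, 14]
alice = embed(original, [1, 0, 1, 1], positions)
bob = embed(original, [0, 0, 1, 0], positions)
exposed = collude(alice, bob)  # → [2, 14]
```

Note that positions where the two codewords agree stay hidden (here, 5 and 11): collusion-secure codes are designed so that these undetectable positions still identify at least one colluder.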
Security from location by Di Qiu (Book)

1 edition published in 2012 in English and held by 1 WorldCat member library worldwide

Studies in computational number theory with applications to cryptography by Dan Boneh

1 edition published in 1996 in English and held by 1 WorldCat member library worldwide

Paradigms for virtualization based host security by Tal Simeon Garfinkel

1 edition published in 2010 in English and held by 1 WorldCat member library worldwide

Virtualization has been one of the most potent forces reshaping the landscape of systems software in the last 10 years and has become ubiquitous in the realm of enterprise compute infrastructure and in the emerging field of cloud computing. This presents a variety of new opportunities when designing host-based security architectures. We present several paradigms for enhancing host security that leverage the new capabilities afforded by virtualization. First, we present a virtualization-based approach to trusted computing. This allows multiple virtual hosts with different assurance levels to run concurrently on the same platform, using a novel "open box" and "closed box" model that lets the virtualized platform present the best properties of traditional open and closed platforms on a single physical platform. Next, we present virtual machine introspection, an approach to enhancing the attack resistance of intrusion detection and prevention systems by moving them "out of the box", i.e. out of the virtual host they are monitoring and into a separate protection domain, where they can inspect the host they are monitoring from a more protected vantage point. Finally, we present overshadow data protection, an approach for providing a last line of defense for application data even if the guest OS running an application has been compromised. We accomplish this by presenting two views of virtual memory: an encrypted view to the operating system and a plaintext view to the application owning that memory. This approach more generally illustrates the mechanisms necessary to introduce new orthogonal protection mechanisms into a guest operating system from the virtualization layer while maintaining backwards compatibility with existing operating systems and applications
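The "two views of virtual memory" idea can be modeled in a few lines. This is a conceptual sketch only: the XOR mask stands in for the hypervisor's real encryption, and `ShadowedPage` is an invented name, not the system's API:

```python
import os

class ShadowedPage:
    """Toy model of one memory page with two views: the owning
    application reads plaintext, while any other viewer (e.g. the
    guest OS) sees only ciphertext."""

    def __init__(self, owner: str):
        self.owner = owner
        self.key = os.urandom(64)  # stand-in for a per-page VMM key
        self.data = b""

    def _mask(self, buf: bytes) -> bytes:
        # XOR keystream as a stand-in cipher; a real VMM would use
        # authenticated encryption and integrity-protect the page.
        return bytes(b ^ k for b, k in zip(buf, self.key))

    def write(self, viewer: str, buf: bytes) -> None:
        if viewer != self.owner:
            raise PermissionError("only the owner writes plaintext")
        self.data = buf

    def read(self, viewer: str) -> bytes:
        return self.data if viewer == self.owner else self._mask(self.data)

page = ShadowedPage(owner="app")
page.write("app", b"secret")
```

The design point this illustrates is that the view switch happens below the OS, in the virtualization layer, so neither the guest OS nor the application needs to change.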
Advanced applications of multilinear maps in cryptography by Kevin Lewi

1 edition published in 2016 in English and held by 1 WorldCat member library worldwide

We study two new cryptographic primitives inspired by recent advances in multilinear maps: private constrained pseudorandom functions (PRFs) and order-revealing encryption (ORE). We show how these primitives have direct applications in searchable symmetric encryption, watermarking, deniable encryption, private information retrieval, and more. To construct private constrained PRFs, we first demonstrate that our strongest notions of privacy and functionality can be achieved using indistinguishability obfuscation. Then, for our main constructions, we build private constrained PRFs for bit-fixing constraints and for puncturing constraints from concrete algebraic assumptions over multilinear maps. We also construct the first implementable ORE scheme that provides what is known as "best-possible" semantic security. In our scheme, there is a public algorithm that, given two ciphertexts as input, reveals the order of the corresponding plaintexts and nothing else. Our constructions are inspired by obfuscation techniques, but do not use obfuscation. Finally, we also show how to build efficiently implementable ORE from PRFs, achieving a simulation-based security notion with respect to a leakage function that precisely quantifies what is leaked by the scheme
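The PRF-based flavor of ORE can be made concrete. The sketch below follows the small-domain construction of Chenette et al. that this line of work builds on (not the dissertation's exact scheme), with HMAC-SHA256 standing in for the PRF: each ciphertext component encodes one plaintext bit offset by a PRF of the preceding bit-prefix, and a public, keyless comparison recovers the order while leaking, by design, the index of the first differing bit:

```python
import hashlib
import hmac

def prf(key: bytes, msg: bytes) -> int:
    # HMAC-SHA256 as a stand-in PRF, reduced to a value mod 3.
    return hmac.new(key, msg, hashlib.sha256).digest()[0] % 3

def encrypt(key: bytes, m: int, nbits: int = 8) -> list[int]:
    # One component per bit: PRF of the bit-prefix, shifted by the bit.
    bits = [(m >> (nbits - 1 - i)) & 1 for i in range(nbits)]
    return [(prf(key, bytes(bits[:i])) + bits[i]) % 3 for i in range(nbits)]

def compare(ct_a: list[int], ct_b: list[int]) -> int:
    # Public algorithm: needs no key. Returns -1, 0, or 1.
    # At the first differing bit, the PRF values cancel, so the
    # offset mod 3 reveals which plaintext bit was larger.
    for u, v in zip(ct_a, ct_b):
        if u != v:
            return 1 if (u - v) % 3 == 1 else -1
    return 0
```

For example, `compare(encrypt(k, 5), encrypt(k, 9))` returns -1 for any key `k`, without `compare` ever seeing the key.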
Spatial encryption by Michael Alexander Hamburg

1 edition published in 2011 in English and held by 1 WorldCat member library worldwide

Since Boneh and Franklin and Cocks first constructed identity-based encryption in 2001, many variants of that technology have appeared. We present a unified model for those variants. Furthermore, we show two highly flexible designs which can be used to build new systems under this model. We prove the security of these systems, and discuss applications to other areas of cryptography and security
Lattice-based non-interactive argument systems by David J Wu

1 edition published in 2018 in English and held by 1 WorldCat member library worldwide

Non-interactive argument systems are an important building block in many cryptographic protocols. In this work, we begin by studying non-interactive zero-knowledge (NIZK) arguments for general NP languages. In a NIZK argument system, a prover can convince a verifier that a statement is true without revealing anything more about the statement. Today, NIZK arguments can be instantiated from random oracles, or, in the common reference string (CRS) model, from trapdoor permutations, pairings, or indistinguishability obfuscation. Notably absent from this list are constructions from lattice assumptions, and realizing NIZKs (for general NP languages) from lattices has been a long-standing open problem. In this work, we make progress on this problem by giving the first construction of a multi-theorem NIZK argument from standard lattice assumptions in a relaxed model called the preprocessing model, where we additionally assume the existence of a trusted setup algorithm that generates a proving key (used to construct proofs) and a verification key (used to verify proofs). Moreover, by basing hardness on lattice assumptions, our construction gives the first candidate that plausibly resists quantum attacks. We then turn our attention to constructing succinct non-interactive arguments (SNARGs) for general NP languages. SNARGs enable verifying computations with substantially lower complexity than that required for classic NP verification. Prior to this work, all SNARG constructions relied on random oracles, pairings, or indistinguishability obfuscation. This work gives the first lattice-based SNARG candidates. In fact, we show that one of our new candidates satisfies an appealing property called "quasi-optimality," which means that the SNARG simultaneously minimizes both the prover complexity and the proof size (up to polylogarithmic factors). This is the first quasi-optimal SNARG from any concrete cryptographic assumption. Again, because of our reliance on lattice-based techniques, all of our new candidates resist quantum attacks (in contrast to existing pairing-based constructions)
Algorithms and lower bounds for parameter-free online learning by Ashok Cutkosky

1 edition published in 2018 in English and held by 1 WorldCat member library worldwide

Training a machine learning model today involves minimizing a loss function on datasets that are often gigantic, and so almost all practically relevant training algorithms operate in an online manner by reading in small chunks of the data at a time and making updates to the model on-the-fly. As a result, online learning, a popular way to analyze optimization algorithms operating on data streams, is at the heart of modern machine learning pipelines. In order to converge to the optimal model as quickly as possible, online learning algorithms all require some user-specified parameters that reflect the shape of the loss or statistics of the input data. Examples of such parameters include the size of the gradients of the losses, the distance from some initial model to the optimal model, and the amount of variance in the data, among others. Since the true values for these parameters are often unknown, the practical implementation of online learning algorithms usually involves simply guessing (called "tuning"), which is both inefficient and inelegant. This motivates the search for parameter-free algorithms that can adapt to these unknown values. Prior algorithms have achieved adaptivity to many different unknown parameters individually - for example, one may adapt to unknown gradient sizes given a known distance to the optimal model, or adapt to the unknown distance given a known bound on gradient size. However, no algorithm could adapt to both parameters simultaneously. This work introduces new lower bounds, algorithms, and analysis techniques for adapting to many parameters at once. We begin by proving a lower bound showing that adapting to both the size of the gradients and the distance to the optimal model simultaneously is fundamentally much harder than adapting to either individually, and proceed to develop the first algorithm to meet this lower bound, obtaining optimal adaptivity to both parameters at once.
We then expand upon this result to design algorithms that adapt to more unknown parameters, including the variance of the data, different methods for measuring distances, and upper or lower bounds on the second derivative of the loss. We obtain these results by developing new techniques that convert non-parameter-free optimization algorithms into parameter-free algorithms. In addition to providing new and more adaptive algorithms, the relative simplicity of non-parameter-free algorithms allows these techniques to significantly reduce the complexity of many prior analyses
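The parameter dependence described here is visible even in plain online gradient descent: the textbook step size D / (G * sqrt(T)) needs the very quantities (distance D to the optimum, gradient bound G) that parameter-free algorithms learn on the fly. A minimal one-dimensional illustration with a made-up loss stream:

```python
import math

# Online gradient descent on the stream f_t(x) = (x - 3)^2; optimum x* = 3.
# The textbook step size needs D (distance from x0 to x*) and G (a bound
# on gradient size) -- exactly the parameters that are unknown in practice
# and must otherwise be guessed ("tuned").
T = 100
D, G = 3.0, 6.0                 # assumed known here; that is the catch
eta = D / (G * math.sqrt(T))

x, total_loss = 0.0, 0.0
for _ in range(T):
    total_loss += (x - 3.0) ** 2
    grad = 2.0 * (x - 3.0)
    x -= eta * grad             # one small update per example, on-the-fly

# Regret against the fixed comparator x* = 3, which suffers zero loss.
regret = total_loss
```

With the correct D and G, the regret stays within the classic D * G * sqrt(T) bound; guessing either value badly inflates it, which is the inefficiency the dissertation's parameter-free algorithms remove.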


Alternative Names
Bwneh Dan 1969-....

Dan Boneh

Dan Boneh cryptograaf uit Israël

Dan Boneh informatico israeliano

Dan Boneh Israeli cryptographer

Dan Boneh israelisch-US-amerikanischer Informatiker und Kryptologe

Бонех, Дэн

Ден Бонех

בונה דן 1969-....

دان بونه

ダン・ボウネイ

Languages
English (41)

German (1)