WorldCat Identities

Koller, Daphne

Works: 88 works in 122 publications in 1 language and 824 library holdings
Genres: Educational films, Internet videos, Conference papers and proceedings
Roles: Author, Thesis advisor
Classifications: QA279.5, 519.5420285
Most widely held works by Daphne Koller
Probabilistic graphical models : principles and techniques by Daphne Koller( Book )

15 editions published between 2009 and 2012 in English and held by 464 WorldCat member libraries worldwide

A graduate-level textbook that presents a unified framework for probabilistic graphical models, covering the representation of Bayesian networks and Markov networks, algorithms for exact and approximate inference, and methods for learning the parameters and structure of models from data
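
For orientation, the object at the heart of the book (standard background, not a quotation from the text): a Bayesian network over variables X1, ..., Xn factorizes the joint distribution along its directed graph,

```latex
P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr),
```

where Pa(X_i) denotes the parents of X_i in the graph; Markov networks play the analogous role with undirected potentials.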
TEDTalks : Daphne Koller - What We're Learning from Online Education( Visual )

1 edition published in 2012 in English and held by 154 WorldCat member libraries worldwide

Educator Daphne Koller is enticing top universities to put their most intriguing courses online for free - not just as a service, but as a way to research how people learn. In this TEDTalk, Koller explains how Coursera, a social entrepreneurship company cofounded with Andrew Ng, tracks each keystroke, quiz, peer-to-peer discussion, and self-graded assignment to build an unprecedented pool of data on how knowledge is processed
Uncertainty in artificial intelligence : proceedings of the seventeenth conference (2001), August 2-5, 2001, University of Washington, Seattle, Washington by Jack Breese( Book )

5 editions published in 2001 in English and held by 37 WorldCat member libraries worldwide

Proceedings of the annual Conference on Uncertainty in Artificial Intelligence, available for 1991-present. Since 1985, the Conference on Uncertainty in Artificial Intelligence (UAI) has been the primary international forum for exchanging results on the use of principled uncertain-reasoning methods in intelligent systems. The UAI proceedings have become a basic reference for researchers and practitioners who want to know about both theoretical advances and the latest applied developments in the field

From knowledge to belief by Daphne Koller( Book )

7 editions published between 1993 and 1994 in English and held by 11 WorldCat member libraries worldwide

We use techniques from finite model theory to analyze the computational aspects of random worlds. The problem of computing degrees of belief is undecidable in general. However, for unary knowledge bases, a tight connection to the principle of maximum entropy often allows us to compute degrees of belief
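
A minimal sketch of the random-worlds computation in the unary case (the knowledge base, the predicate P, and the constant c below are illustrative assumptions, not taken from the monograph):

```python
from math import comb

# Random-worlds sketch: the KB says exactly 80% of an N-element domain satisfies
# the unary predicate P; the degree of belief in P(c) is the fraction of
# KB-satisfying worlds in which the designated individual c satisfies P.
def degree_of_belief(N, frac=0.8):
    k = round(frac * N)                 # each KB-world has exactly k P-individuals
    worlds = comb(N, k)                 # ways to choose which individuals satisfy P
    worlds_with_c = comb(N - 1, k - 1)  # worlds in which c is among the k
    return worlds_with_c / worlds

for N in (10, 100, 1000):
    print(N, degree_of_belief(N))       # 0.8 for every N: the maximum-entropy answer
```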
Adaptive probabilistic networks by Stuart J Russell( Book )

2 editions published in 1994 in English and held by 7 WorldCat member libraries worldwide

Representation dependence in probabilistic inference by Joseph Y Halpern( Book )

2 editions published in 1995 in English and held by 6 WorldCat member libraries worldwide

Abstract: "Non-deductive reasoning systems are often representation dependent: representing the same situation in two different ways may cause such a system to return two different answers. This is generally viewed as a significant problem. For example, the principle of maximum entropy has been subjected to much criticism due to its representation dependence. There has, however, been almost no work investigating representation dependence. In this paper, we formalize this notion and show that it is not a problem specific to maximum entropy. In fact, we show that any probabilistic inference system that sanctions certain important patterns of reasoning, such as a minimal default assumption of independence, must suffer from representation dependence. We then show that invariance under a restricted class of representation changes can form a reasonable compromise between representation dependence and other desiderata."
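
The classic illustration of this phenomenon (standard folklore, not an example from the paper) is that maximum entropy with no constraints returns the uniform distribution over whatever outcome space the chosen vocabulary carves out:

```python
# With no constraints, maximum entropy is uniform over the outcomes, so the
# probability assigned to "red" depends on how the situation is represented.
def max_entropy(outcomes):
    return {o: 1 / len(outcomes) for o in outcomes}

print(max_entropy(["red", "not-red"]))        # P(red) = 1/2
print(max_entropy(["red", "blue", "green"]))  # P(red) = 1/3
```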
Asymptotic conditional probabilities : the unary case by A. J Grove( Book )

4 editions published in 1993 in English and held by 6 WorldCat member libraries worldwide

Abstract: "Motivated by problems that arise in computing degrees of belief, we consider the problem of computing asymptotic conditional probabilities for first-order sentences. Given first-order sentences φ and θ, we consider the structures with domain {1, ..., N} that satisfy θ, and compute the fraction of them in which φ is true. We then consider what happens to this fraction as N gets large. This extends the work on 0-1 laws that considers the limiting probability of first-order sentences, by considering asymptotic conditional probabilities. As shown in [Lio69, GHK93], in the general case, asymptotic conditional probabilities do not always exist, and most questions relating to this issue are highly undecidable. These results, however, all depend on the assumption that θ can use a nonunary predicate symbol. Liogon'kiĭ [Lio69] shows that if we condition on formulas θ involving unary predicate symbols only (but no equality or constant symbols), then the asymptotic conditional probability does exist and can be effectively computed. This is the case even if we place no corresponding restrictions on φ. We extend this result here to the case where θ involves equality and constants. We show that the complexity of computing the limit depends on various factors, such as the depth of quantifier nesting, or whether the vocabulary is finite or infinite. We completely characterize the complexity of the problem in the different cases, and show related results for the associated approximation problem."
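
For a tiny unary vocabulary the limiting fraction can be checked by brute force (the particular θ and φ below are illustrative choices, not from the paper):

```python
from itertools import product

# Asymptotic conditional probability by enumeration, one unary predicate P.
# theta: "some element satisfies P"; phi: "element 1 satisfies P".
def conditional_fraction(N):
    satisfying = total = 0
    for world in product([False, True], repeat=N):  # all P-structures on {1, ..., N}
        if any(world):                              # world satisfies theta
            total += 1
            satisfying += world[0]                  # ... and phi, if 1 is in P
    return satisfying / total

for N in range(2, 12):
    print(N, conditional_fraction(N))  # 2^(N-1) / (2^N - 1), which tends to 1/2
```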
Random worlds and maximum entropy by A. J Grove( Book )

2 editions published in 1994 in English and held by 3 WorldCat member libraries worldwide

Abstract: "Given a knowledge base KB containing first-order and statistical facts, we consider a principled method, called the random-worlds method, for computing a degree of belief that some formula φ holds given KB. If the domain has size N, then we can consider all possible worlds, or first-order models, with domain {1, ..., N} that satisfy KB, and compute the fraction of them in which φ is true. We define the degree of belief to be the asymptotic value of this fraction as N grows large. We show that when the vocabulary underlying φ and KB uses constants and unary predicates only, we can naturally associate an entropy with each world. As N grows larger, there are many more worlds with higher entropy. Therefore, we can use a maximum entropy computation to compute the degree of belief. This result is in a similar spirit to previous work in physics (e.g., [Jay78]) and artificial intelligence (e.g., [PV89, Sha89]), but is far more general. Of equal interest to the actual results themselves are the numerous subtle issues we must address when formulating it. For languages with binary predicate symbols, the random-worlds method continues to make sense, but there no longer seems to be any useful connection to maximum entropy. It is difficult to see how maximum entropy can be applied at all. In fact, results from [GHK93a] show that even generalizations of maximum entropy are unlikely to be useful. These observations suggest unexpected limitations to the applicability of maximum entropy methods."
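
The counting fact behind the entropy connection can be checked numerically (a sketch of the standard asymptotics, not the paper's formalism): the number of unary worlds in which a fraction p of an N-element domain satisfies P is C(N, pN), which grows as e^(N·H(p)), so high-entropy proportions dominate as N grows:

```python
from math import comb, log

def H(p):
    # binary entropy in nats
    return -(p * log(p) + (1 - p) * log(1 - p))

N = 1000
for p in (0.1, 0.3, 0.5):
    k = int(p * N)
    # per-element log-count of worlds with |P| = k, next to the entropy H(p)
    print(p, log(comb(N, k)) / N, H(p))
```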
On the complexity of two-person zero-sum games in extensive form by Daphne Koller( Book )

2 editions published in 1990 in English and held by 3 WorldCat member libraries worldwide

Alignment of cryo-electron tomography images using Markov Random Fields by Fernando Amat Gil( )

1 edition published in 2010 in English and held by 2 WorldCat member libraries worldwide

Cryo-electron tomography (CET) is the only imaging technology capable of visualizing the 3D organization of intact bacterial whole cells at nanometer resolution in situ. However, quantitative image analysis of CET datasets is extremely challenging due to very low signal-to-noise ratio (well below 0 dB), missing data and heterogeneity of biological structures. In this thesis, we present a probabilistic framework to align CET images in order to improve resolution and create structural models of different biological structures. The alignment problem of 2D and 3D CET images is cast as a Markov Random Field (MRF), where each node in the graph represents a landmark in the image. We connect pairs of nodes based on local spatial correlations and we find the "best" correspondence between the two graphs. In this correspondence problem, the "best" solution maximizes the probability score in the MRF. This probability is the product of singleton potentials that measure image similarity between nodes and pairwise potentials that measure deformations between edges. Well-known approximate inference algorithms such as Loopy Belief Propagation (LBP) are used to obtain the "best" solution. We present results in two specific applications: automatic alignment of tilt series using fiducial markers, and subtomogram alignment. In the first case we present RAPTOR, which is being used in several labs to enable truly high-throughput tomography. In the second case our approach is able to reach the contrast transfer function limit in low-SNR samples from whole cells, as well as revealing atomic-resolution details invisible to the naked eye through nanogold labeling
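
A minimal sketch of the scoring rule just described (node names, labels, and potential tables below are placeholders, not the thesis's actual model):

```python
# MRF correspondence score: the product of singleton (image-similarity)
# potentials at each node and pairwise (deformation) potentials on each edge.
def correspondence_score(assignment, singleton, pairwise, edges):
    score = 1.0
    for node, label in assignment.items():
        score *= singleton[node][label]
    for u, v in edges:
        score *= pairwise[u, v][assignment[u], assignment[v]]
    return score

# Toy usage: two landmarks "a" and "b", two candidate matches each.
singleton = {"a": {0: 0.9, 1: 0.1}, "b": {0: 0.4, 1: 0.6}}
pairwise = {("a", "b"): {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.5, (1, 1): 0.5}}
print(correspondence_score({"a": 0, "b": 1}, singleton, pairwise, [("a", "b")]))
```

Maximizing this score exactly is intractable once the graph has cycles, which is why approximate inference such as Loopy Belief Propagation is used.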
Efficient computation of equilibria for extensive two-person games by Daphne Koller( Book )

1 edition published in 1994 in English and held by 2 WorldCat member libraries worldwide

Abstract: "The Nash equilibria of a two-person, non-zero-sum game are the solutions of a certain linear complementarity problem (LCP). In order to use this for solving a game in extensive form, it is first necessary to convert the game to a strategic description such as the normal form. The classical normal form, however, is often exponentially large in the size of the game tree. Hence, finding equilibria of extensive games typically implies exponential blowup in terms of both time and space. In this paper we suggest an alternative approach, based on the sequence form of the game. For a game with perfect recall, the sequence form is a linear sized strategic description, which results in an LCP of linear size. For this LCP, we show that an equilibrium can be found efficiently by Lemke's algorithm, a generalization of the Lemke-Howson method."
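
For reference, this is the standard statement of a linear complementarity problem (the matrix M and vector q here are assembled from the game's payoffs and the sequence-form constraints; with perfect recall both stay linear in the size of the game tree):

```latex
\text{find } z \ge 0 \text{ such that } w = q + Mz \ge 0 \quad \text{and} \quad z^{\top} w = 0.
```

Lemke's algorithm solves such problems by pivoting along an almost-complementary path, which is what makes equilibrium computation on the linear-sized sequence form tractable.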
The digital patient : machine learning techniques for analyzing electronic health record data by Suchi Saria( )

1 edition published in 2011 in English and held by 2 WorldCat member libraries worldwide

The current unprecedented rate of digitization of longitudinal health data -- continuous device monitoring data, laboratory measurements, medication orders, treatment reports, reports of physician assessments -- allows visibility into patient health at increasing levels of detail. A clearer lens into this data could help improve decision making, both for individual physicians on the front lines of care and for policy makers setting national direction. However, this type of data is high-dimensional (an infant with no prior clinical history can have more than 1000 different measurements in the ICU), highly unstructured (the measurements occur irregularly, and different numbers and types of measurements are taken for different patients) and heterogeneous (from ultrasound assessments to lab tests to continuous monitor data). Furthermore, the data is often sparse and systematically missing, and the underlying system is non-stationary. Extracting the full value of the existing data requires novel approaches. In this thesis, we develop novel methods to show how longitudinal health data contained in Electronic Health Records (EHRs) can be harnessed for making novel clinical discoveries. This requires access to patient outcome data -- which patient has which complications. We present a method for automated extraction of patient outcomes from EHR data; our method shows how natural language cues from physicians' notes can be combined with clinical events that occur during a patient's hospital stay to extract significantly higher-quality annotations than previous state-of-the-art systems. We also develop novel methods for exploratory analysis and structure discovery in bedside monitor data. This data forms the bulk of the data collected on any patient, yet it is not utilized in any substantive way after collection. We present methods to discover recurring shape and dynamic signatures in this data. While we primarily focus on clinical time series, our methods also generalize to other continuous-valued time series data. Our analysis of the bedside monitor data led us to a novel use of this data for risk prediction in infants. Using features automatically extracted from physiologic signals collected in the first 3 hours of life, we develop Physiscore, a tool that predicts which infants are at risk for major complications downstream. Physiscore is both fully automated and significantly more accurate than the current standard of care. It can be used for resource optimization within a NICU, managing infant transport to a higher level of care, and parental counseling. Overall, this thesis illustrates how the use of machine learning for analyzing these large-scale digital patient data repositories can yield new clinical discoveries and potentially useful tools for improving patient care
From statistical knowledge bases to degrees of belief by Thomas J. Watson IBM Research Center( Book )

1 edition published in 1994 in English and held by 1 WorldCat member library worldwide

Making Rational Decisions Using Adaptive Utility Elicitation( )

1 edition published in 2000 in Undetermined and held by 1 WorldCat member library worldwide

Representations and solutions for game-theoretic problems by Daphne Koller( )

1 edition published in 1997 in English and held by 1 WorldCat member library worldwide

Restricted Bayes Optimal Classifiers( )

1 edition published in 2000 in Undetermined and held by 1 WorldCat member library worldwide

Probabilistic models for region-based scene understanding by Stephen Gould( )

1 edition published in 2010 in English and held by 1 WorldCat member library worldwide

One of the long-term goals of computer vision is to be able to understand the world through visual images. This daunting task involves reasoning simultaneously about objects, regions and 3D geometry. Traditionally, computer vision research has tackled these tasks in isolation: independent detectors for finding objects, image segmentation algorithms for defining regions, and specialized monocular depth perception methods for reconstructing geometry. Unfortunately, this isolated reasoning can lead to inconsistent interpretations of the scene. In this thesis we develop a unified probabilistic model that avoids these inconsistencies. We introduce a region-based representation of the scene in which pixels are grouped together to form consistent regions. Each region is then annotated with a semantic and geometric class label. Next, we extend our representation to include the concept of objects, which can be comprised of multiple regions. We then show how our region-based representation can be used to interpret the 3D structure of the scene. Importantly, we model the scene using a coherent probabilistic model over random variables defined by our region-based representation. This enforces consistency between tasks and allows contextual dependencies to be modeled across tasks, e.g., that sky should be above the horizon, and ground below it. Finally, we present an efficient algorithm for performing inference in our model, and demonstrate state-of-the-art results on a number of standard tasks
Sensors & Symbols: An Integrated Framework( Book )

2 editions published in 1999 in English and held by 1 WorldCat member library worldwide

The goal of this effort was to provide a unified probabilistic framework that integrates symbolic and sensory reasoning. Such a framework would allow sensor data to be analyzed in terms of high-level symbolic models. It would also allow the results of high-level analysis to guide the low-level sensor interpretation task and to help in resolving ambiguities in the sensor data. Our approach was based on the framework of probabilistic graphical models, which allows us to build systems that learn and reason with complex models, encompassing both low-level continuous sensor data and high-level symbolic concepts. Over the five years of the project, we explored two main thrusts: (1) inference and learning in hybrid and temporal Bayesian networks, and (2) mapping and modeling of 3D physical environments. Our progress on each of these two directions is detailed in the attached report
Knowledge Representation for an Uncertain World( Book )

2 editions published in 1997 in English and held by 1 WorldCat member library worldwide

Any application where an intelligent agent interacts with the real world must deal with the problem of uncertainty. Bayesian belief networks have become dominant in addressing this issue. This is a framework based on principled probabilistic semantics, which achieves effective knowledge representation and inference capabilities by utilizing the locality structure in the domain: typically, only very few aspects of the situation directly affect each other. Despite their success, belief networks are inadequate as a knowledge representation language for large, complex domains: their attribute-based nature does not allow us to express general rules that hold in many different circumstances. This prevents knowledge from being shared among applications; the initial knowledge acquisition cost has to be paid for each new domain. It also inhibits the construction of large, complex networks. We deal with this issue by presenting a rich knowledge-representation language from which belief networks can be constructed to suit specific circumstances, algorithms for learning the network parameters from data, and fast approximate inference algorithms designed to deal with the large networks that result. We show how these techniques can be applied in domains involving continuous variables, in situations where the world changes over time, and in the context of planning under uncertainty
Statistical analyses on high-throughput sequencing of B cell and T cell receptors by Yi Liu( )

1 edition published in 2014 in English and held by 1 WorldCat member library worldwide

The immune system is typically described as having two major branches: the adaptive branch and the innate branch. In the adaptive immune system, T cells and B cells learn to discern foreign pathogens from self. B cell repertoires mediate the host's ability to mount appropriately pathogen-specific humoral responses against immunological challenges. VDJ somatic recombination of the immunoglobulin chains, affinity maturation, and B cell clonal selection all contribute to the generation of a healthy B cell repertoire. Despite starting from only a limited number of genomic segments, these processes lead to diverse and optimized repertoires of clones, where each clone has its own distinct binding specificity for foreign antigens. High-depth sequencing data on the repertoire of B cell receptors has enabled us to examine the diverse receptors in great detail. My thesis showcases three aspects of the B cell repertoire in terms of how they relate to ageing and/or pathology: clonal convergence, clonal expansion, and phylogenetic hypermutation. In a separate but related work, we apply a novel combination of sequencing methods and statistical techniques to examine T cell receptors. We raised the lower bounds on T cell receptor beta-chain repertoire richness in healthy humans, and examined how T cell receptor diversity relates to age
Audience Level

Audience level: 0.58 (from 0.23 for TEDTalks : ... to 0.95 for Sensors & ...)

Alternative Names
Dafne Kollere

Daphne Koller amerykańska informatyczka

Daphne Koller informatica statunitense

Daphne Koller informatica uit Israël

Daphne Koller informaticienne américaine

Daphne Koller profesora estadounidense de informática

Daphne Koller US-amerikanische Informatikerin

Koller, Daphne

Дафна Коллер

Дафни Колер

Դաֆնա Կոլլեր

דפנה קולר

دافنی کالر

دافنی کالەر


Languages
English (52)
