Evaluation of Information Retrieval Systems: Approaches, Issues, and Methods.

Author: Stephen P Harter; Carol A Hert
Edition/Format: Article : English
Publication: Annual Review of Information Science and Technology (ARIST) v32 p3-94 1997
Database: ERIC (the ERIC database is an initiative of the U.S. Department of Education)
Other databases: ArticleFirst; British Library Serials

Details

Document type: Article
All authors / Contributors: Stephen P Harter; Carol A Hert
ISSN: 0066-4200
Language note: English
Unique identifier: 424834530

Abstract:

Discusses the traditional Cranfield model for information retrieval (IR) evaluation, problems with the model, and extensions and alternatives to Cranfield; describes emerging themes in IR evaluation, including multiple evaluation dimensions and methods, inclusion of users and other stakeholders in the evaluation process, and various approaches to evaluation; and outlines a research agenda that addresses unresolved issues. Contains 396 references. (AEF)


Linked Data


<http://www.worldcat.org/oclc/424834530>
    library:oclcnum "424834530"
    owl:sameAs <info:oclcnum/424834530>
    rdf:type schema:Article
    schema:about
    schema:about
    schema:about
    schema:about
    schema:contributor
    schema:creator
    schema:datePublished "1997"
    schema:description "Discusses the traditional Cranfield model for information retrieval (IR) evaluation, problems with the model, and extensions and alternatives to Cranfield; describes emerging themes in IR evaluation, including multiple evaluation dimensions and methods, inclusion of users and other stakeholders in the evaluation process, and various approaches to evaluation; and outlines a research agenda that addresses unresolved issues. Contains 396 references. (AEF)"
    schema:exampleOfWork <http://worldcat.org/entity/work/id/76299178>
    schema:isPartOf
        <http://worldcat.org/issn/0066-4200>
            rdf:type schema:Periodical
            schema:name "Annual Review of Information Science and Technology (ARIST)"
    schema:isPartOf
    schema:name "Evaluation of Information Retrieval Systems: Approaches, Issues, and Methods."
    schema:pageStart "3"
    schema:url
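
The block above is schema.org metadata published for this record as linked data. As a minimal sketch of how such a record could be read programmatically, the following Python snippet uses rdflib to load the graph from the OCLC permalink and print two of the fields shown above (schema:name and schema:datePublished). It assumes the permalink still serves a machine-readable RDF serialization via content negotiation, which may not hold; the property names are taken from the record itself.

from rdflib import Graph, URIRef, Namespace

SCHEMA = Namespace("http://schema.org/")
RECORD = URIRef("http://www.worldcat.org/oclc/424834530")

g = Graph()
# Assumption: the server returns an RDF serialization (e.g. RDF/XML or
# Turtle); rdflib guesses the parser from the HTTP response.
g.parse("http://www.worldcat.org/oclc/424834530")

# Print the article title and publication year from the parsed graph.
for title in g.objects(RECORD, SCHEMA.name):
    print("Title:", title)
for date in g.objects(RECORD, SCHEMA.datePublished):
    print("Published:", date)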
