Evaluation of Information Retrieval Systems: Approaches, Issues, and Methods.

Author: Stephen P Harter; Carol A Hert
Edition/Format: Article : English
Publication: Annual Review of Information Science and Technology (ARIST), v32, p3-94, 1997
Database: ERIC (the ERIC database is an initiative of the United States Department of Education)
Other databases: ArticleFirst; British Library Serials



Details

Document type: Article
All authors / contributors: Stephen P Harter; Carol A Hert
ISSN: 0066-4200
Language note: English
Unique identifier: 424834530
Summary:

Discusses the traditional Cranfield model for information retrieval (IR) evaluation, problems with the model, and extensions and alternatives to Cranfield; describes emerging themes in IR evaluation, including multiple evaluation dimensions and methods, inclusion of users and other stakeholders in the evaluation process, and various approaches to evaluation; and outlines a research agenda that addresses unresolved issues. Contains 396 references. (AEF)




Linked Data


<http://www.worldcat.org/oclc/424834530>
    library:oclcnum "424834530"
    owl:sameAs <info:oclcnum/424834530>
    rdf:type schema:Article
    schema:about
    schema:about
    schema:about
    schema:about
    schema:contributor
    schema:creator
    schema:datePublished "1997"
    schema:description "Discusses the traditional Cranfield model for information retrieval (IR) evaluation, problems with the model, and extensions and alternatives to Cranfield; describes emerging themes in IR evaluation, including multiple evaluation dimensions and methods, inclusion of users and other stakeholders in the evaluation process, and various approaches to evaluation; and outlines a research agenda that addresses unresolved issues. Contains 396 references. (AEF)"
    schema:exampleOfWork <http://worldcat.org/entity/work/id/76299178>
    schema:isPartOf <http://worldcat.org/issn/0066-4200>
        rdf:type schema:Periodical
        schema:name "Annual Review of Information Science and Technology (ARIST)"
    schema:name "Evaluation of Information Retrieval Systems: Approaches, Issues, and Methods."
    schema:pageStart "3"
    schema:url

Content-negotiable representations
