The influence of basic tokenization on biomedical document retrieval
conference paper
Tokenization is a fundamental preprocessing step in Information Retrieval (IR) systems, in which text is turned into index terms. This paper quantifies and compares the influence of several simple tokenization techniques on document retrieval effectiveness in two domains: biomedicine and news. As expected, biomedical retrieval is more sensitive to small changes in the tokenization method. The tokenization strategy can make the difference between a mediocre and a well-performing IR system, especially in the biomedical domain. Copyright 2007 ACM.
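To illustrate why biomedical text is sensitive to the choice of tokenizer (this sketch is not from the paper; the two strategies and the example query are hypothetical illustrations), compare splitting on whitespace with splitting on every non-alphanumeric character. Terms such as "IL-2" are either kept intact or broken into fragments, which changes which index terms a query can match:

```python
import re

def whitespace_tokenize(text):
    # Split on whitespace only; hyphenated terms like "IL-2" stay intact.
    return text.lower().split()

def alnum_tokenize(text):
    # Split on any run of non-alphanumeric characters;
    # this breaks "IL-2" into the fragments "il" and "2".
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

query = "IL-2 receptor alpha-chain"
print(whitespace_tokenize(query))  # ['il-2', 'receptor', 'alpha-chain']
print(alnum_tokenize(query))       # ['il', '2', 'receptor', 'alpha', 'chain']
```

Under the first strategy the index term is "il-2"; under the second it is two generic fragments, so a document indexed one way may be unreachable by a query tokenized the other way.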
TNO Identifier
240315
Source title
30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07, 23-27 July 2007, Amsterdam, The Netherlands
Pages
803-804
Files
To receive the publication files, please send an e-mail request to TNO Repository.