Mining correlations between medically dependent features and image retrieval models for query classification
Journal of the American Society for Information Science and Technology
Published online on February 27, 2017
Abstract
The abundance of medical resources has encouraged the development of systems that allow for efficient searches of information in large medical image data sets. State‐of‐the‐art image retrieval models fall into three categories: content‐based (visual) models, textual models, and combined models. Content‐based models use visual features to answer image queries, textual image retrieval models use word matching to answer textual queries, and combined image retrieval models use both textual and visual features to answer queries. Nevertheless, most previous work in this field has used the same image retrieval model regardless of the query type. In this article, we define a list of generic and specific medical query features and exploit them in an association rule mining technique to discover correlations between query features and image retrieval models. Based on these rules, we propose an associative classifier (NaiveClass) to find the most suitable retrieval model for a new textual query. We also propose a second associative classifier (SmartClass) to select the most appropriate default class for the query. Experiments are performed on Medical ImageCLEF queries from 2008 to 2012 to evaluate the impact of the proposed query features on classification performance. The results show that combining our proposed specific and generic query features is effective for query classification.
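To make the idea concrete, the following is a minimal sketch of associative classification as described in the abstract: mine rules of the form {query features} → retrieval model with support/confidence thresholds, then classify a new query by its highest‐confidence matching rule, falling back to a default class when no rule fires. The feature names, labels, training examples, and thresholds here are illustrative assumptions, not the paper's actual feature list or the NaiveClass/SmartClass algorithms.

```python
from collections import Counter
from itertools import combinations

# Hypothetical training data: (set of query features, best retrieval model).
# Feature names and labels are invented for illustration only.
TRAIN = [
    ({"has_modality", "has_anatomy"}, "visual"),
    ({"has_modality", "has_anatomy"}, "visual"),
    ({"has_disease", "long_query"}, "textual"),
    ({"has_disease", "long_query"}, "textual"),
    ({"has_modality", "has_disease"}, "combined"),
    ({"has_modality", "long_query"}, "combined"),
]

def mine_rules(data, min_support=2, min_confidence=0.6):
    """Mine rules {feature subset} -> model meeting support/confidence thresholds."""
    itemset_counts = Counter()  # how often each feature subset occurs
    rule_counts = Counter()     # how often a subset co-occurs with a label
    for features, label in data:
        for r in (1, 2):  # antecedents of size 1 and 2 for this sketch
            for subset in combinations(sorted(features), r):
                itemset_counts[subset] += 1
                rule_counts[(subset, label)] += 1
    rules = {}
    for (subset, label), n in rule_counts.items():
        conf = n / itemset_counts[subset]
        if n >= min_support and conf >= min_confidence:
            # keep only the most confident label per antecedent
            if subset not in rules or conf > rules[subset][1]:
                rules[subset] = (label, conf)
    return rules

def classify(features, rules, default="textual"):
    """Apply the highest-confidence rule whose antecedent the query satisfies;
    fall back to a default class when no rule matches."""
    best = None
    for subset, (label, conf) in rules.items():
        if set(subset) <= features and (best is None or conf > best[1]):
            best = (label, conf)
    return best[0] if best else default

rules = mine_rules(TRAIN)
print(classify({"has_modality", "has_anatomy"}, rules))  # → visual
print(classify({"unseen_feature"}, rules))               # → textual (default)
```

The default-class fallback in `classify` is where a second classifier in the spirit of SmartClass would plug in: instead of a fixed default, it would predict the most appropriate default class for queries matched by no rule.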