Title: Prioritizing positive feature values: a new hierarchical feature selection method
Date: 2020-07-20

Abstract: In this work, we address the problem of feature selection for the classification task in hierarchical and sparse feature spaces, which characterise many real-world applications nowadays. A binary feature space is deemed hierarchical when its binary features are related via generalization-specialization relationships, and is considered sparse when instances in general contain far fewer "positive" than "negative" feature values. In any given instance, a feature value is deemed positive (negative) when the property associated with the feature has (has not) been observed for that instance. Although there are many methods for the traditional feature selection problem in the literature, the proper treatment of hierarchical feature structures remains a challenge. Hence, we introduce a novel hierarchical feature selection method that follows the lazy learning paradigm, selecting a feature subset tailored to each instance in the test set. Our strategy prioritizes the selection of features with positive values, since they tend to be more informative: the presence of a relatively rare property usually carries more relevant information than its absence. Experiments on different application domains show that the proposed method outperforms previous hierarchical feature selection methods, as well as traditional methods, in terms of predictive accuracy, while in general selecting smaller feature subsets.
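The per-instance selection idea described in the abstract can be illustrated with a minimal sketch. All helper names, the hierarchy encoding, and the pruning rule below are illustrative assumptions, not the paper's actual criterion: for each test instance we keep its positive features and drop any ancestor that is implied by a more specific positive feature, since in a generalization-specialization hierarchy that ancestor's positive value is redundant.

```python
# Minimal sketch of lazy, per-instance hierarchical feature selection that
# prioritizes positive feature values. The hierarchy representation and the
# redundancy rule are assumptions for illustration only.

def ancestors(feature, parents):
    """All generalizations of `feature` in the is-a hierarchy (a DAG
    given as a feature -> list-of-parents mapping)."""
    seen = set()
    stack = list(parents.get(feature, ()))
    while stack:
        f = stack.pop()
        if f not in seen:
            seen.add(f)
            stack.extend(parents.get(f, ()))
    return seen

def select_features(instance, parents):
    """Keep positive features of one test instance, dropping ancestors
    that are implied by a more specific positive feature."""
    positive = {f for f, v in instance.items() if v == 1}
    implied = set()
    for f in positive:
        implied |= ancestors(f, parents) & positive
    return positive - implied

# Hypothetical hierarchy: "animal" generalizes "mammal", which generalizes "dog".
parents = {"dog": ["mammal"], "mammal": ["animal"]}
instance = {"dog": 1, "mammal": 1, "animal": 1, "has_wings": 0}
print(sorted(select_features(instance, parents)))  # -> ['dog']
```

Because the selection is lazy, a different test instance (say, one where only "mammal" is positive) would yield a different subset, which matches the abstract's point that the subset is tailored per instance.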