Abstract
In this paper I discuss how to conceptually engineer ‘entropy’ and ‘information’ as they are used in information theory and statistical mechanics. First, I evaluate the extent to which the pervasive, entangled use of entropy and information notions can be defective in these domains, for instance by being meaningless or by generating confusion. I then assess the main ameliorative strategies for improving this defective conceptual practice. The first strategy is to replace the terms ‘entropy’ and ‘information’ with non-loaded terms, as first argued by Bar-Hillel in the 1950s. A second strategy is to prescribe how these terms should be used in order to be meaningful, as pioneered by Carnap in Two Essays on Entropy (University of California Press, 1977). However, the actual implementation of these two ameliorative strategies has historically been unsuccessful, owing to the low credentials that philosophers, as conceptual prescribers, have among scientists. Finally, to overcome these obstacles, I propose a third strategy: leveraging evidence from the contribution of philosophy as a complementary science, the so-called ‘Philosophy in Science’ (à la Pradeu et al. 2024), to integrate conceptual prescriptions and analyses of entropy and information into the scientific practices in which these notions are used.