Abstract
A connection between two fundamental concepts, information and symmetry breaking (SB), is established. A concept called transform information (TI) is introduced. The known information measures (the Hartley, von Neumann-Shannon-Wiener, and Fisher informations, and the Rényi entropies) can be derived as, or mathematically expressed by, particular forms of TI for certain transforms of a physical system when the system is described by a probability measure. Since TI vanishes when the system is invariant under the respective transform, a nonzero TI can be regarded as a quantitative measure of SB in the system under study. The classical information measures derived from TI can therefore also be interpreted as SB measures, which provides a basis for assigning a meaning to information. The concept of TI is further extended to cases in which systems are described without the use of the probability concept.