When we say that bits are “building blocks” of information, and figurae are “building blocks” of meaning, we imply that figurae, unlike bits, have qualitative properties and that the set of figurae can be divided into subsets, or that we can distinguish between basic types of figurae. This applies to meanings as such, but it was first noted for linguistic signs. Hjelmslev considered the identification of these types to be a necessary condition for understanding both the expression and the content of languages.

“Such an exhaustive description presupposes the possibility of explaining and describing an unlimited number of signs, in respect of their content as well, with the aid of a limited number of figurae. And the reduction requirement must be the same here as for the expression plane: the lower we can make the number of content-figurae, the better we can satisfy the empirical principle in its requirement of the simplest possible description” (Hjelmslev 1969, p. 67).

As we saw above, meanings are not reduced to signs and symbols. Meanings manifest themselves in the mental, social and physical existence of a person, but meanings are not born in this existence. Abstractions are not a product of the human intellect, neither in its affirmative form of understanding nor in its negative form of reason. Rather, it is understanding and reason that are the result of the evolution of social and material abstractions in action. Meanings only reproduce fundamental definitions, states, relationships, changes, directions in nature and society: “If we’re able to learn language from a few years’ worth of examples, it’s partly because of the similarity between its structure and the structure of the world” (Domingos 2015, p. 37). Hence the universality of meanings, the ability of people to understand each other, to translate each other’s languages—and this after tens of thousands of years of isolated life. During the Age of Discovery, Europeans found a common language with the Indians or Australians. All people act, talk and think in one language—the language of meaning:

“In Leibniz’s view, if we want to understand anything, we should always proceed like this: we should reduce everything that is complex to what is simple, that is, present complex ideas as configurations of very simple ones which are absolutely necessary for the expression of thoughts” (Wierzbicka 2011, p. 380). “…’Inside’ all languages we can find a small shared lexicon and a small shared grammar. Together, this panhuman lexicon and the panhuman grammar linked with it represent a minilanguage, apparently shared by the whole of humankind. … On the one hand, this mini-language is an intersection of all the languages of the world. On the other hand, it is, as we see it, the innate language of human thoughts, corresponding to what Leibniz called ‘lingua naturae’” (ibid., p. 383).

Mathematics as a domain of meaning is also a reflection of the fundamental definitions of the world. The similarities between the world and mathematics make it possible to solve scientific problems. This similarity did not arise overnight. Mathematics is a result of the evolution of meaning from the order of the universe up to the reflection of this order in the minds of people. On the scale of millions and billions of years, the difference between Turing and Wittgenstein disappears: “Turing thought of mathematics as something that was essentially discovered, something like a science of the abstract. Wittgenstein insisted that mathematics was essentially something invented, following out a set of rules we have chosen—more like an art than a science” (Grim 2017, p. 151). In fact, both mathematics and logic in general are the result of cultural evolution that occurs through selection and choice. It could be that a logical contradiction between meanings expresses a historical and practical discrepancy of meanings in relation to the environment and the subject, and that the resolution of such a contradiction reflects the overcoming of this discrepancy.

The simplicity of early meanings did not only concern making. Thinking and communicating were just as simple, relying on crude motions of body and mind. Primitive making has left us its direct results: stones, bones, etc. Unfortunately, the direct products of communicating or thinking no longer exist, so we can only judge them indirectly. In the process of social and then cultural learning, as the norm of first learned and then rational reaction expanded and cultural selection turned into traditional choice, the complexity of meanings and of the culture-society as a whole increased, as did the number of figurae and meanings.

The gradual complication of meanings becomes clear, for example, when we consider the evolution of stone tools: from the simplest Paleolithic choppers to the polished and drilled Neolithic axes, which are characterized by a much higher level of workmanship. Cultural evolution consists in the division of meanings, that is, in the emergence of ever new types of actions and their results. By dividing their activity and knowledge, people specialized in those types of actions in which they had a competitive advantage due to the characteristics of the environment or their active power. Hunting and gathering divided into farming, herding, crafts, trade. Not only the complexity of making grew, but also that of communicating and thinking. Languages became more complex. Learned actions became a more important part of self-reproduction relative to instinctive behaviors, and rational actions grew in importance relative to merely learned ones.

2. Complexity of meaning

Minimal subject and minimal action

That the complexity of meanings increases as they evolve may be intuitively obvious, but it was only in the middle of the 20th century that the concepts of the quantity of information and information complexity were rigorously substantiated in the works of Claude Shannon and Andrey Kolmogorov.

Shannon introduced the concept of information entropy. According to him, entropy H is a measure of the uncertainty, unpredictability, surprise or randomness of a message, event or phenomenon. In terms of culture, such a message or event is a counterfact. Without information losses, Shannon entropy is equal to the amount of information per message symbol. The amount of information is determined by the degree of surprise inherent in a particular message:

“According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent—the vast ensemble of other possible communications that could have been sent, but weren’t. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information. Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent” (Deacon 2013, p. 379).

Thus, the average amount of information H a culture-society contains can be measured by the number of (counter)facts it generates and the probability of their occurrence. The Shannon entropy H is an indicator of the complexity of a culture-society as a whole. If we look at the history of human cultures-societies, we see that their complexity has consistently grown: from a meager set of primitive meanings (tribal community, elementary language, simple stone tools, causal mini-models, animism and fetishism) to a complex arsenal of meanings characteristic of agrarian societies (fields and livestock, agricultural and craft tools, city-states and empires, writing and literature, ancient and Arabic science, world religions).
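The measure described above can be made concrete with a small sketch. The function below is a minimal illustration (not from the source text) of Shannon entropy computed from the empirical symbol frequencies of a message, H = −Σ p·log₂(p), giving the average information in bits per symbol; the function name and the sample messages are ours, chosen only for illustration.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = sum(p * log2(1/p))
    over the empirical symbol probabilities p of the message."""
    counts = Counter(message)
    n = len(message)
    # log2(n/c) == log2(1/p); this form avoids a negative zero for p == 1
    return sum((c / n) * log2(n / c) for c in counts.values())

# No alternatives = no uncertainty = no information:
print(shannon_entropy("aaaa"))  # 0.0 bits per symbol
# Two equiprobable symbols carry one bit each:
print(shannon_entropy("abab"))  # 1.0 bit per symbol
```

As Deacon's gloss emphasizes, the value depends only on the background of alternatives: a message drawn from a richer, less predictable repertoire of signs yields a higher H, which is why the measure can serve as an index of a repertoire's complexity.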
