Yu. D. Apresyan and the problem of synonymy in linguistics. Yu. D. Apresyan on lexical synonyms

Apresyan, Yuri Derenikovich (b. February 2, 1930, Moscow) is a Russian linguist, academician of the Russian Academy of Sciences (1992). He works in the fields of semantics, syntax, lexicography, structural and mathematical linguistics, machine translation, formal models of language, and theoretical lexicography. He is one of the developers of the "Meaning ↔ Text" theory and the head of the Moscow semantic school, and the compiler of a number of dictionaries of a new type for Russian and English.

In 1953 he graduated from the English faculty of the 1st Moscow State Pedagogical Institute of Foreign Languages, taught English at the same institute for several years, and at the same time carried out research that resulted in his candidate's (PhD) thesis "Phraseological synonyms in modern English" (1958). From 1960 to 1972 he worked in the Sector of Structural Linguistics of the Institute of Russian Language of the USSR Academy of Sciences. From the early 1970s he collaborated intensively with the creators of the "Meaning ↔ Text" theory (MTT), I. A. Melchuk and A. K. Zholkovsky, and took part in compiling one of the main components of the theory, the "Explanatory-Combinatorial Dictionary", a dictionary of a new type (published only in 1984).

Because of their statements in defense of dissidents, the Dictionary's authors were removed from their positions and denied work; only in 1972 were Apresyan and his colleagues taken on by the Moscow research institute Informelektro. There the team developed systems for French-Russian and English-Russian automatic translation of scientific and technical texts. The official linguistic establishment of the time, however, did not recognize the scientist's merits. Moreover, all sorts of obstacles were put in the way of his scientific activity: Apresyan was summoned to the KGB, denied permission to travel abroad at the invitation of foreign scientific institutions, and prohibited from lecturing to students; his articles were not accepted for publication in Soviet scientific journals, and references to his works were removed from the works of other Soviet scholars. In 1973 the manuscript of the book "Lexical Semantics", which he had submitted to the Nauka publishing house, was deliberately lost. When the author eventually managed to restore and publish it, the book marked the beginning of a new era in theoretical semantics and lexicography. In 1984 he defended the monograph "Lexical Semantics" in Minsk as his doctoral dissertation. In the mid-1980s, with perestroika, the period of disgrace came to an end. In 1991, together with other linguists, he began work on a synonym dictionary of the Russian language of a fundamentally new type.

Apresyan's main scientific works have been translated into foreign languages: "Ideas and Methods of Modern Structural Linguistics" (1966), "An Experimental Study of the Semantics of the Russian Verb" (1967), "Lexical Semantics. Synonymous Means of Language" (1974), and "Types of Information for the Surface-Semantic Component of the 'Meaning ↔ Text' Model" (Vienna, 1980). The scientist made a huge contribution to the creation of general and specialized dictionaries: the "English-Russian Synonym Dictionary" (co-authored, 1979), the "Explanatory-Combinatorial Dictionary of the Russian Language" (Vienna, 1984), which went through several editions, the "New Large English-Russian Dictionary" (in 3 volumes, 1993-1994), and the two-volume dictionary "Russian Verb - Hungarian Verb. Control and Compatibility" (Budapest, 1982, co-authored). He is the author of numerous theoretical works on the linguistic support of automatic text processing systems (co-authored with I. A. Melchuk and A. K. Zholkovsky). Since 1990 he has been working at the Institute of Russian Language of the Russian Academy of Sciences, where he heads the sector of theoretical semantics. In June 1992 he was elected academician of the Russian Academy of Sciences (bypassing the stage of corresponding member).

Yuri Derenikovich Apresyan (born February 2, 1930) is a Russian linguist, academician of the Russian Academy of Sciences (1992), foreign member of the National Academy of Sciences of Armenia, professor (1991), and Doctor of Philology. He works in the fields of lexical semantics, syntax, Russian and English lexicography, the history of linguistics, machine translation, and other areas. He is one of the developers of the "Meaning ↔ Text" theory, the head of the Moscow Semantic School, and the compiler of a number of dictionaries of a new type for Russian (as well as English).

Biography

Yuri Apresyan was born in Moscow on February 2, 1930, into the family of Derenik Zakharovich Apresyan, an official of the Soviet state security agencies and a polyglot. He is Armenian. He graduated from the Moscow State Pedagogical Institute (1953) with a degree in English and completed postgraduate studies at the same institute (1956); in 1958 he defended his candidate's thesis "Phraseological synonyms in modern English".

He taught at the Moscow State Pedagogical Institute of Foreign Languages (1954-1960) and worked at the Institute of Russian Language of the USSR Academy of Sciences (1960-1972), from which he was dismissed "as having failed re-certification" for political reasons (statements in defense of Sinyavsky and Daniel, K. I. Babitsky, and other dissidents). He then worked at the industrial institute Informelektro (1972-1985), where he headed the automatic translation group.

The defense of his doctoral dissertation (based on the 1974 book "Lexical Semantics") could take place, again for political reasons, only in 1984 in Minsk; the authorities also tried to prevent publication of the book itself. From 1985 the scientist worked at the Institute for Information Transmission Problems (IITP), where he headed the Laboratory of Computational Linguistics (1989-1994).

Since 1990, Yu. D. Apresyan's main place of work has again been the Institute of Russian Language of the Russian Academy of Sciences, where he has headed the sector of theoretical semantics since 1994. In the 1990s he lectured in Russia (at MSU), Australia, the USA, and Germany. In 1992 he was elected a full member of the RAS.

For many years he has led the well-known seminar “Theoretical Semantics,” which has become one of the main centers of linguistic life in Moscow.

Contribution to science

Early works

Yu. D. Apresyan's scientific interests are diverse, but lexical semantics has always remained the central area of his research. In the late 1950s, when he was entering linguistics, semantics was considered an "exotic" field on the periphery of language study. Apresyan's study of semantics therefore began with the earlier linguistic tradition and with work on the history of linguistics. He wrote several detailed surveys of the history of semantics, and his first book, "Ideas and Methods of Modern Structural Linguistics" (1966), belongs to the same series. Combining detail and factual accuracy with an accessible and engaging presentation, the book became a long-term bestseller and was translated into all the major European languages (it was published twice in German and in Spanish).

There followed a period of intensive immersion in the problems of syntactic semantics, connected with the then-famous transformational grammar. Apresyan's short-lived collaboration with S. K. Shaumyan, who was beginning to develop his model of "applicative grammar" (one version of this model is described in detail in the 1966 book), dates to the same period. The result of this period was Apresyan's second book, "An Experimental Study of the Semantics of the Russian Verb" (1967), which attempted a detailed classification of Russian verbs based on the types of variability of their government patterns. The author himself later moved away from the ideas of "syntactically oriented" semantics developed in this book and turned to a comprehensive description of the lexical meanings of words based on data of various types. Nevertheless, many of the observations and regularities reflected in this study not only retain their significance 30 years later but have even acquired new relevance with the development of such fields as construction grammar.

The "Meaning ↔ Text" theory

The end of the 1960s and the beginning of the 1970s was a period of intensive collaboration between Apresyan and the creators of the "Meaning ↔ Text" theory (MTT), I. A. Melchuk and A. K. Zholkovsky. Apresyan took an active part in compiling one of the main components of the theory, the "Explanatory-Combinatorial Dictionary", intended as a dictionary of a new type that would reflect, above all, the non-trivial combinability of lexemes. In this dictionary the semantics of words is described in the form of detailed formalized definitions using a restricted set of units; semantically more complex elements are defined through simpler ones, down to elements that cannot be decomposed any further - the so-called "semantic primitives". This program of comprehensive semantic description of the vocabulary had a number of points in common with the conception of the Polish semantic school of A. Boguslawski and A. Wierzbicka, with whose representatives there was an intensive exchange of ideas at that time. Both schools held, in particular, that the meanings of linguistic units correlate not with the surrounding reality directly, but with the native speaker's ideas about that reality (sometimes called concepts). Concepts are culture-specific; the system of concepts of each language forms the so-called "naive picture of the world", which may differ in many details from the "scientific" picture of the world, which is universal. The task of the semantic analysis of vocabulary is to uncover the naive picture of the world and describe its main categories.

If for Melchuk the compilation of the Explanatory-Combinatorial Dictionary was of interest primarily as a means of improving his model of language, for Apresyan this work became the initial stage in coming to grips with a number of important and still poorly studied problems of lexical semantics, such as ways of representing lexical meaning, the description of antonymy and synonymy in natural language, the reflection of the structure of a polysemous word's meanings in its definitions, and so on. The result of research into this range of questions was Apresyan's monograph "Lexical Semantics" (1974; 2nd ed. 1995), one of the most significant linguistic works of the 1970s, which (despite the official semi-ban and the silence surrounding it) served for many years as a kind of "textbook of semantics" for beginners and at the same time as a research program for the future for a number of linguistic groups; the book was also translated into Polish and English. Although the monograph was to a very large degree based on the ideology of the "Meaning ↔ Text" theory (perhaps more so than any other of Apresyan's works), its significance goes far beyond illustrating the semantic conception of one particular theory. First, it provided a detailed survey of world research in semantics over the previous several decades. Second, in addition to illustrating a number of theoretical propositions of MTT, the book analyzed a large array of Russian vocabulary with great subtlety and in great detail, offering sample definitions for various semantic groups of words, which themselves became a model for lexicographers. Third, the central part of the book, devoted to the problems of detecting and describing differences between synonyms, contained a detailed research program that was not part of the immediate intentions of the creators of MTT. Carrying out this program became the main task of Apresyan's subsequent research.

Machine translation

However, external obstacles unexpectedly arose in the way of implementing this research program in theoretical semantics and practical lexicography. After Apresyan's dismissal from the Institute of Russian Language he was unable to continue academic research; instead, he had to create English-Russian machine translation systems. His turn to this problem was largely forced, but, by Apresyan's own account, it was not without benefit: it made it possible to see and describe in practice many of the difficulties connected with interlingual equivalence. Apresyan's work on machine translation at the Informelektro Institute led to the creation of the experimental translation system ETAP (work on which continues at the IITP to this day). In the early 1990s Apresyan returned to theoretical lexicography at the Institute of Russian Language.

It should be noted that, despite numerous difficulties and the practical impossibility of publishing at home, Apresyan continued to develop MTT in the 1980s: it was during this period that he published a contrastive Russian-Hungarian dictionary of verb government (co-authored with E. Pall) and a small book, published in Vienna, on the development of the semantic component of a multi-level language model; ideologically, both books continue the line of the 1970s (although they are more applied in orientation than "Lexical Semantics"). The 1980 book was the first detailed discussion of the previously advanced idea of an "integral description of language", in which vocabulary and grammar form a close unity and are "tuned" to each other, so that grammatical rules are formulated with regard to the restrictions imposed by the dictionary properties of units or classes of units, while the dictionary description indicates the types of grammatical rules in which a given word can participate. This idea follows naturally from the spirit of the "Meaning ↔ Text" theory, in which the role of the lexical-semantic component and of dictionary information was very large from the very beginning.

The work on English-Russian machine translation also gave impetus to a number of applied projects in English lexicography, to which Apresyan, a specialist in English by training, likewise made a great contribution. Noteworthy here is, first of all, the "English-Russian Synonym Dictionary", compiled by a team of authors under Apresyan's direction in 1979 and reprinted several times since. In this small dictionary Apresyan first tested the format of a new type of synonym dictionary, putting into practice many of the principles proclaimed in "Lexical Semantics"; these principles would later be developed in greater detail during the work on monolingual synonym dictionaries of Russian. Several bilingual translation dictionaries were also prepared under Apresyan's general editorship and with his participation; the culmination of this activity was the publication in 1993 of the New Large English-Russian Dictionary. In this dictionary the entries were developed with regard to the achievements of modern theoretical and practical lexicography, and it is currently considered the best and most complete bilingual English-Russian dictionary in Russia.

Synonym dictionary

Since the 1990s Apresyan has switched almost entirely to lexicographic work. The main result of this work is a synonym dictionary of the Russian language, the "New Explanatory Dictionary of Synonyms". It is a natural continuation both of Apresyan's theoretical research in lexical semantics and of earlier practical experiments, in particular the English-Russian synonym dictionary mentioned above.

The Russian synonym dictionary, however, was conceived as a dictionary of a new type, the like of which lexicographic practice had not yet seen. For this dictionary a detailed scheme for describing synonym series was developed, in which every member of a series is characterized with respect to its semantics, syntax, combinability, and other properties (the description of a synonym series in the dictionary occupies several pages of small print, rather than the few lines usual in traditional "school-type" synonym dictionaries). The dictionary gathers and summarizes the maximum amount of information about the linguistic behavior of Russian synonyms. The second edition (2004) took account of critical reviews and introduced many corrections.

Alongside practical lexicography, Apresyan continues to develop semantic theory. One of his most interesting recent ideas is the hypothesis that there exist elements of the semantic metalanguage which Apresyan has proposed to call "quarks" (borrowing the term from modern physics). Quarks are regular meanings that are important for describing the naive picture of the world of a given language but are never "verbalized" in it: the language has no units for expressing them, and they exist only as part of the semantics of many of its words. An example of a quark is the meaning "stative", present in the definitions of a large number of Russian verbs and important for describing the Russian aspectual system, but lacking any formal means of expression in Russian. It is thought that the idea of semantic quarks can be used in lexical typology, a new area of semantics that has been developing rapidly at the beginning of the 21st century.

Another of Apresyan's contributions to lexicological theory is his conception of "systemic lexicography", in which the notions of "lexicographic type" and "lexicographic portrait" that he proposed play the key role; these notions reflect, respectively, the results of classifying words according to shared lexicographically relevant properties ("type") and the results of identifying the individual characteristics of a word ("portrait").

Main publications

Books

  • Ideas and methods of modern structural linguistics (a brief outline). M.: Education, 1966.
  • Experimental study of the semantics of the Russian verb. M.: Nauka, 1967.
  • Lexical semantics (synonymous means of language). M.: Nauka, 1974.
    • Second updated edition: Yu. D. Apresyan. Selected works. T. I. M.: Languages of Russian Culture, 1995.
  • Types of information for the surface semantic component of the “Meaning ↔ Text” model. Wien: Wiener Slawistischer Almanach, 1980.
    • Second edition: Yu. D. Apresyan. Selected works. T. II. M.: Languages of Russian Culture, 1995, pp. 8-101.
  • (Yu. D. Apresyan, I. M. Boguslavsky, L. L. Iomdin, etc.) Linguistic support of the ETAP-2 system. M.: Nauka, 1989.
  • (Yu. D. Apresyan, I. M. Boguslavsky, L. L. Iomdin, etc.) Linguistic processor for complex information systems. M.: Nauka, 1992.
  • Integral description of language and system lexicography // Selected works. T. II. M.: Languages ​​of Russian Culture, 1995.
  • Systematic Lexicography. / By Ju. Apresjan. Oxford: Oxford University Press, 2000.
  • Yu. D. Apresyan (ed.). Linguistic picture of the world and systemic lexicography. M.: Languages of Slavic Cultures, 2006.
  • Research in semantics and lexicography. T. I: Paradigmatics. M.: Languages of Slavic Cultures, 2009.
  • Yu. D. Apresyan (ed.). Theoretical problems of Russian syntax: Interaction of grammar and vocabulary. M.: Languages of Slavic Cultures, 2010.

Dictionaries

  • Yu. D. Apresyan and others. English-Russian synonymous dictionary. M.: Russian language, 1979 (and subsequent editions).
  • Yu. D. Apresyan, E. Pall. Russian verb - Hungarian verb. Control and compatibility. T. 1-2. Budapest, 1982.
  • I. A. Melchuk, A. K. Zholkovsky, Yu. D. Apresyan and others. Explanatory and combinatorial dictionary of the modern Russian language: Experiments in the semantic-syntactic description of Russian vocabulary. Wien, 1984.
  • Yu. D. Apresyan, E. M. Mednikova and others. New large English-Russian dictionary. M.: Russian language, 1993, vol. I-II; M.: Russian language, 1994, vol. III.
  • New explanatory dictionary of synonyms of the Russian language. / Under the direction of Yu. D. Apresyan. M.: Languages of Russian Culture. Issue 1, 1997; Issue 2, 2000; Issue 3, 2003.
    • Second edition, corrected and expanded (in one volume): M.: Languages of Russian Culture, 2004.
  • Yu. D. Apresyan (ed.). Prospectus of the active dictionary of the Russian language. M.: Languages of Slavic Cultures, 2010.
Lexical semantics

By Yu. D. Apresyan

(Yu.D. Apresyan. Selected works. Volume 1. Lexical semantics. Synonymous means of language. M., 1995, pp. 3-69.)

Preface

The current era in the development of linguistics is undoubtedly the era of semantics, whose central position among the linguistic disciplines follows directly from the fact that human language in its principal function is a means of communication, a means of encoding and decoding certain information. Consistent development of this thesis inevitably leads to a conception of linguistics as a science that includes, along with other disciplines, a fully developed semantics, consisting of descriptions not only of grammatical but also of lexical meanings. Thus the dictionary turns out to be a necessary part of a complete theoretical description of a language <…>, and not merely a "monument to the vocabulary" or a practical reference guide for its speakers. By analogy with theoretical and practical (school) grammar, it is appropriate to speak of two corresponding types of dictionaries. On the other hand, a complete semantic description of the meaningful units of a language, provided in particular by a dictionary of the theoretical type, turns out to be the natural basis for a rigorous definition of any linguistic concepts that rest on the idea of semantic identities and differences between the corresponding linguistic objects.

This book can be considered as an attempt to construct a fragment of a system of semantic concepts that could serve as a theoretical basis for a new type of dictionary.<…>

Chapter one

Basic ideas of modern semantics

Origins of semantics

Modern lexical semantics has its roots in a number of linguistic and related disciplines, of which the most important are the following:

1) Lexicography, whose practical needs constantly confronted theoretical semantics with the need to create an apparatus for a comprehensive and non-redundant interpretation of lexical meanings, characterization of the lexical and syntactic compatibility of words, description of their semantic connections with other words, etc.

Lexicography requires, first of all, an answer to the question of what words mean. Yet the theoretical semantics of the preceding era dealt almost exclusively with the question of how words mean. It is to this question that the doctrine of the ways in which meanings develop is devoted - narrowing and broadening, differentiation and attraction, metaphor and metonymy, and so on - along with subtler observations on the direction of transfers: from spatial meanings to temporal ones, but not vice versa; from nomina anatomica to the names of physical objects, but not vice versa; from names of properties perceived by touch, smell, and taste to names of properties perceived by sight or hearing, but not vice versa; and a number of others.

For this reason semantics and lexicography developed independently of each other for a long time. As L. V. Shcherba testified, "the linguistics of the 19th century, carried away by the discoveries of Bopp, Grimm, Rask and others, was as a rule not at all interested in questions of the theory of lexicography" (Shcherba 1940: 78). This state of affairs largely persisted into the first half of our century and gave U. Weinreich grounds to write of "the fatal abyss between theoretical and descriptive semantics, an abyss that dooms the former to sterility and the latter to atomism" (Weinreich 1963: 115). On the whole, however, 20th-century linguistics is characterized by the convergent development of semantics and lexicography, reflected in the work of such remarkable linguists as L. V. Shcherba, C. Bally, E. Sapir, K. Erdmann, J. Firth, and V. V. Vinogradov. Modern semantics has in one way or another adopted the following principles formulated by these scholars: a) the entity called the (lexical) meaning of a word is not the scientific but the "naive" (in L. V. Shcherba's terms, "philistine") concept of the corresponding thing, sometimes burdened with semantic and emotional associations that do not correspond to any essential features of the object or fact denoted by the word <…>; b) this entity must be revealed in the definition of the word, carried out in a special "intellectual identifier language"..., which is built mainly on the basis of ordinary language but may also contain words... that have no direct semantic correspondences in natural language; c) words in a language do not combine with each other entirely freely, i.e. not solely on the basis of information about their meanings; the processes of constructing phrases and sentences are subject to special combinability restrictions - lexical and constructional...; d) even in relatively free phrases the meaning of the whole phrase is not always built up from the meanings of its constituent words by a simple law of summation; there are more interesting rules of meaning interaction, which yield not a "sum of meanings" but some more complex product <…>

2) The linguistic semantics of the 1940s and 1950s <…>, from which the idea of the componential structure of lexical meanings was borrowed (itself transferred into linguistic semantics from phonology and grammar, where analysis in terms of differential features - phonological and grammatical-semantic - had been practiced for decades); cf. stallion = 'horse + male', mare = 'horse + female', male dog = 'dog + male', bitch = 'dog + female', man = 'human + male + adult', woman = 'human + female + adult', boy = 'human + male + non-adult', girl = 'human + female + non-adult', and so on.
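The componential representation lends itself to a simple formal encoding. The sketch below treats each meaning as an unordered set of components; the encoding and the helper names are purely illustrative, not a claim about any particular formalism:

```python
# Componential analysis: each lexical meaning as an unordered set of semantic components.
MEANINGS = {
    "stallion": {"horse", "male"},
    "mare":     {"horse", "female"},
    "man":      {"human", "male", "adult"},
    "woman":    {"human", "female", "adult"},
    "boy":      {"human", "male", "non-adult"},
    "girl":     {"human", "female", "non-adult"},
}

def differential_features(word1, word2):
    """Components that distinguish the two meanings (their symmetric difference)."""
    return MEANINGS[word1] ^ MEANINGS[word2]

def common_features(word1, word2):
    """Components shared by the two meanings."""
    return MEANINGS[word1] & MEANINGS[word2]

print(differential_features("man", "woman"))   # {'male', 'female'}
print(common_features("boy", "girl"))          # {'human', 'non-adult'}
```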

Initially such analysis was applied to relatively simple and closed systems such as kinship terms, names of animals, military and other nomenclatures, and the idea was even expressed <…> that an exhaustive decomposition of meanings into differential features is possible only within systems of this kind. In M. Mathiot's detailed book (Mathiot 1968), however, the principles of analysis in terms of differential features were extended to much broader layers of the vocabulary.

The traditional theory of differential semantic features was substantially supplemented in the 1960s by the notion of integral features, which allow the meaning of a word to include semantic components with respect to which it is not opposed to any other meanings within a given thematic range of words. For the words son and daughter the feature of degree of kinship is differential, since it is precisely this feature that underlies the oppositions son - nephew and daughter - niece, whereas for the word children the same feature turns out to be integral, since Russian has no generic name for nephews and nieces opposed to children. In this connection it was observed that equipollent rather than privative oppositions predominate in the vocabulary (cf. Russian bor 'large dense coniferous forest' versus roshcha 'small, usually deciduous, forest', in the absence of one-word expressions for the meanings 'small dense coniferous forest', 'large deciduous forest', etc.; see Shmelev).

Along with the essential semantic features of a meaning (differential and integral), it was found necessary in a number of cases to consider inessential features, called "associative" (Shmelev 1969: 26) or "potential" (Gak 1972: 382); cf. p. 67. For the word lightning, for example, such a feature is speed; for grandfather and grandmother, old age; for uncle and aunt, the fact that they are usually older than ego; and so on. Taking associative features into account is important because in many cases they serve as the basis for various metaphorical transfers; cf. Russian molniya 'lightning' in the sense of an express telegram, or uncle and auntie used as forms of address, etc.

At the same time valuable ideas were put forward about the relations between features within a definition. Although the decomposition of a lexical meaning into differential semantic features in principle dispenses with syntax (stallion = 'horse + male' = 'male + horse', i.e. the lexical meaning is represented as an unordered set of junctively related components), many authors were not satisfied with this representation. Thus W. Goodenough and F. Lounsbury postulated a possessive relation between the names of features, at least when recording meaning at the denotative level (nephew = 'son of a brother or sister').

A different idea of the hierarchical organization of meaning was later discussed in the works of Pottier 1965, Heller and Makris 1967, Tolstoy 1968, and Gak 1971.

Studying color terms, G. Heller and J. Makris established the following hierarchy of semantic components ("parameters") in the vocabulary and, apparently, in the definitions of the corresponding words: the main component is tone (wavelength, cf. red, yellow, blue, etc.); the dependent components are intensity (degree of unmixedness with white, cf. dark, deep, light) and brightness (amount of reflected light, cf. bright, dim). The basis for this conclusion is the fact that tone occurs without the other two components, while the latter do not occur without tone, cf. red - purple, pink, scarlet, crimson.

Tolstoy (1968: 345, 361 ff.) distinguishes two types among the semantic features, or semes, that make up a given meaning: supporting semes (concrete and unmarked) and accompanying semes (abstract and marked, serving as the basis of oppositions); cf. bereznichek = 'birch + forest + young + small size' (in the original, the supporting seme is set off typographically).

According to V. G. Gak (who follows B. Pottier in this respect), on the contrary, the core of a lexeme's meaning is apparently the seme of generic meaning (the "archiseme"), while the additional elements are the "differential semes of specific meaning" (Gak 1971).

Thus the theory and practice of "componential analysis of meaning" is characterized by recognition of the hierarchical organization of meaning on the basis of its differential features, which has something in common with the idea that every lexical meaning has a certain syntactic structure <…>

3) The philosophical and logical tradition of interpreting the meanings of words, going back to antiquity (Aristotle), richly represented in the 17th and 18th centuries (Locke, Leibniz, Spinoza), and reviving in our time <…> In works typical of this tradition, the concept expressed by a word is analyzed as part of a whole utterance and in connection with the situation it describes, and an attempt is made to reduce a large number of complex concepts to a small number of simple ones and to distinguish any two concepts effectively. Interpreting such complex words as hope, fear, confidence, and despair, for example, Spinoza introduces the concept of the future and two simple binary features: "good" vs. "bad", and "contingent" things (which may or may not come to pass) vs. "necessary" things (which must come to pass). This allows him to construct profound, though not entirely correct, interpretations, an idea of which can be gained from the following examples: "If we know about a future thing that it is good and that it can happen, then as a result the soul takes on the form that we call hope... On the other hand, if we believe that the thing that can happen is bad, then the form of the soul arises that we call fear. If we believe that a thing is good and will come of necessity, then peace arises in the soul, which we call confidence... When we believe that a thing is bad and will come of necessity, then despair arises in the soul" (Benedict Spinoza, Selected Works, Moscow, 1957, vol. 1, pp. 128-129).

4) The propositional calculus of mathematical logic, which gave the metalanguage of semantics the foundations of a recursive syntax with rules of formation and transformation. The essential features of this syntax are: a) the distinction between the names of relations, or predicates, and the names of objects, with respect to which the predicates play the role of syntactically dominant elements; this distinction is used by linguistic semantics to define and record lexical meanings, e.g. A shows B to X = shows(A, B, X) = 'A causes (X sees B)' = 'causes(A, sees(X, B))'; b) the idea of predicates of higher and lower orders, whereby a predicate of lower order can occupy the place of an object variable in a higher-order predicate; cf. in our example the second place of the two-place predicate 'cause', occupied by the two-place predicate 'see', whose order is one less than the order of 'cause'; c) transformations with connectives and quantifiers, by means of which some well-formed formulas are converted into other, equivalent and likewise well-formed formulas (Reichenbach 1947; see also Russell 1940, Tarski 1948, 1956, Quine 1953, 1960, Church 1960). Semantics was influenced in the same direction by N. Chomsky's generative grammar (Chomsky 1957), with its idea of semantically invariant transformations, developed in modern semantics into a very substantial theory of synonymous paraphrase, and by modal logic, from which definitions of the elementary modalities and operations on them were borrowed (Modal Logic 1967; among works by linguists see Adamec 1968, Wierzbicka 1969): necessarily P = 'not possibly not P', possibly P = 'not necessarily not P', etc. In so-called deontic modal logic (the logical theory of norms and normative statements), the concept of necessity corresponds to that of obligation, the concept of possibility to that of permission, and the concept of impossibility to that of prohibition, so that obligatory P = 'not possibly not P', permitted P = 'not necessarily not P', prohibited P = 'necessarily not P'. All these definitions and equivalences have been adopted by modern linguistic semantics and are used in the analysis of the corresponding words.
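In modern notation the modal and deontic equivalences just listed can be written compactly (a standard rendering of these dualities, not the author's own formulas):

```latex
\begin{aligned}
\text{necessarily } P &\equiv \neg\,\text{possibly}\,\neg P,
&\qquad \text{possibly } P &\equiv \neg\,\text{necessarily}\,\neg P,\\
\text{obligatory } P &\equiv \neg\,\text{permitted}\,\neg P,
&\qquad \text{permitted } P &\equiv \neg\,\text{obligatory}\,\neg P,\\
\text{prohibited } P &\equiv \text{obligatory}\,\neg P .
\end{aligned}
```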

The internal logic of the development of linguistic semantics and the impulses it received from neighboring sciences worked in the same direction, and by the end of the 1960s the ideological discord of the preceding era had largely become history. Perhaps one of the most remarkable indicators of the maturity of modern semantics, even if it subjectively distresses many, is the fact that identical results are being obtained by linguists working entirely independently of one another. The tendency toward integration in modern semantics is undeniable and shows clearly in the development of its most diverse branches, although many of them still retain their distinctive features.

Modern semantics as part of the general theory of language

Many modern linguistic schools are characterized by an understanding of semantics as a special component of a complete description of language, which in turn is conceived as a formal device that models the linguistic behavior of people. In order to get an idea of ​​the model of language as a whole and its semantic component in particular, it is necessary to understand what skills make up the phenomenon called “linguistic behavior”, “language proficiency”, etc.

People who know this or that natural language can perform the following operations using it:

1) Construct a text in the language that expresses a desired meaning (the ability to speak), and extract meaning from a perceived text (the ability to understand). Failure to choose words and constructions that express the required meaning leads to a semantic error, for example: The criminals stole several government and their own cars. This sentence is either incorrect (it should have said private rather than own) or correct but absurd (the criminals robbed themselves by stealing their own cars). The error is explained by the fact that the author of this statement confused two words that are close but not identical in meaning: private X = 'X owned by a private individual' and own X = 'X belonging to the person who uses X'.

2) Combine words with one another idiomatically, that is, in accordance with the norms of syntactic, semantic, and lexical combinability that have taken shape in the language and are sometimes hard to motivate. In Russian one cannot say *squander or waste with money (with the noun in the instrumental case; one must use the accusative: squander or waste money), or *come into the blues (one must say fall into the blues), although there is no semantic error here: the instrumental form of money can in principle have the required objectival meaning (cf. to litter or throw money about), and the verb come can have the required meaning 'to begin to be in the state designated by the dependent noun' (cf. to fly into a rage, literally 'come into rage').

3) Establish various semantic relations between utterances, in particular: a) relations of synonymy, cf. There is no task in the world more difficult than compiling a dictionary = Compiling a dictionary is the most difficult task in the world; b) relations of logical entailment, cf. The boy was cured => The boy recovered => The boy is healthy. In speaking, this ability shows itself in the capacity to paraphrase a constructed text in many different ways while leaving its content unchanged, or changing it in strictly defined ways; in understanding, it shows itself in the capacity to see the complete or partial semantic identity of outwardly different texts.

4) Establish various semantic properties of sentences, in particular: a) distinguish semantically correct sentences from semantically incorrect ones, b) distinguish semantically coherent texts from semantically incoherent ones.

We emphasize that what is meant here are skills based on command of purely linguistic (dictionary and grammatical) information rather than encyclopedic information. For any native speaker of Russian the text He swam 100 meters freestyle in 45 seconds means: 'Swimming crawl-style, he covered a distance of one hundred meters and spent 45 seconds on it.' For someone who knows not only Russian but also the table of world swimming records (an item of encyclopedic, not linguistic, information), the same sentence may prove far richer in content. It may be perceived as a sensational report of a phenomenal world record, as a reminder of the limitless physical capabilities of man, and so on.

Knowing only the grammar of the language and the dictionary meanings of the words is enough to construct the paraphrases He swam the hundred-meter distance (100 meters) freestyle in 45 seconds, He covered a hundred meters freestyle in 45 seconds, He spent 45 seconds swimming the 100-meter distance crawl-style, He swam the 100-meter crawl in three-quarters of a minute, and many others. The sports expert will also have quite different possibilities of paraphrase: He swam the shortest Olympic distance freestyle in 45 seconds; In the 100-meter freestyle he improved on the previous world record by 10 seconds; and so on.

A person who possesses only linguistic information will not be able to say whether the following texts are semantically coherent: He swam 100 meters freestyle in 45 seconds, thereby setting a phenomenal world record and He swam 100 meters freestyle in 45 seconds, thereby barely meeting the third-category standard. For a person who also possesses the corresponding encyclopedic information, the first sentence will be semantically coherent, though implausible, while the second will be incoherent or false.

So what is at issue is modeling knowledge of the language only, not knowledge of reality. Within these limits native speakers perform all the operations listed above intuitively, without being aware of the grounds on which they choose one solution or another. Consider, for example, the sentence A good confectioner does not fry brushwood on a gas stove (the Russian word khvorost 'brushwood' also denotes a pastry deep-fried in oil). Its meaning is immediately obvious to anyone who speaks Russian, yet one may doubt that the average native speaker could give a theoretically satisfactory account of the law he intuitively uses to understand it. But a model cannot appeal to an intuition it does not possess, and if we want it to perform the operations with texts that are accessible to humans, we must build the necessary information into it explicitly. That information consists above all of knowledge of phonetic, morphological, and syntactic units and rules, and knowledge of the dictionary, but it is not, of course, limited to this. There are also certain semantic rules for interpreting texts; below we explicate one of them, assuming that the syntactic structure of the sentence and the meanings of its constituent words are already known <…>.

Leaving aside the polysemy of the words good, not, and on, let us write out the meanings of all the other words:

confectioner: 1. 'one who makes sweets'; 2. 'seller of sweets'; 3. 'owner of a confectionery'.

fry: 1. 'to make food by heating it on/in oil'; 2. 'to give off heat'.

brushwood (khvorost): 1. 'dry fallen branches'; 2. 'a pastry made by boiling in oil'.

gas (adjective): 1. 'consisting of gas'; 2. 'producing gas'; 3. 'working on the energy of burned gas'.

stove: 1. 'a flat piece of solid material'; 2. 'a heating device for making food'.
If the model does not know the law by which the meaning of a sentence is built up from the meanings of its words, nothing will prevent it from understanding the statement in, for example, the following sense: 'A good seller of sweets does not give off heat onto dry fallen branches on a gas-producing flat piece of solid material.' This understanding results from the combination of meanings confectioner 2, fry 2, brushwood 1, gas 2, stove 1; the total number of conceivable combinations of meanings, and hence of possible readings of the sentence given this information, is 3 × 2 × 2 × 3 × 2 = 72. Of these only one is optimal in informativeness and naturalness. To formulate the law by which a native speaker unerringly chooses it, let us look more closely at the meanings that provide the optimal understanding of the sentence. These are the meanings confectioner 1, fry 1, brushwood 2, gas 3, and stove 2; what is characteristic of them is the presence of a number of shared semantic elements, namely the element 'make' ('one who makes', 'to make food', 'made', 'for making food'), the element 'heating' ('by heating on/in oil', 'made by boiling', 'energy of burned gas', 'heating device'), and the element 'food' ('sweets', 'to make food', 'by boiling in oil'). The choice of these meanings ensures the maximum repetition of semantic elements within the sentence; it is easy to see that under any other reading of the sentence the repetition of semantic elements will be lower.

This is the basic semantic law governing the listener's correct understanding of texts: that reading of a given sentence is chosen on which the repetition of semantic elements reaches its maximum. This law is a rigorous formulation of the old principle that the intended meaning of an ambiguous word is "clear from the context"; it is sometimes called the rule of semantic agreement (Gak 1972).
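The rule lends itself to a straightforward procedure: enumerate all combinations of senses and score each by the repetition of semantic elements. The sketch below does this for the confectioner example; the sense inventories paraphrase the meanings listed above, and the scoring function is only one possible way of making "maximum repetition" precise.

```python
from itertools import product
from collections import Counter

# Simplified sense inventories: each sense is a set of semantic elements
# (paraphrasing the meanings listed above; the element names are illustrative).
SENSES = {
    "confectioner": [{"make", "sweets"}, {"sell", "sweets"}, {"own", "confectionery"}],
    "fry":          [{"make", "food", "heating", "oil"}, {"give-off", "heating"}],
    "brushwood":    [{"dry", "branches"}, {"pastry", "food", "make", "boiling", "oil", "heating"}],
    "gas":          [{"consist-of", "gas"}, {"produce", "gas"}, {"energy", "burned", "gas", "heating"}],
    "stove":        [{"flat", "solid-material"}, {"device", "heating", "make", "food"}],
}

def repetition_score(senses_chosen):
    """Sum the occurrences of every semantic element that appears in at least
    two of the chosen senses (more sharing means a higher score)."""
    counts = Counter(e for sense in senses_chosen for e in sense)
    return sum(c for c in counts.values() if c > 1)

def best_reading(senses):
    """Choose the combination of senses that maximizes repetition of semantic elements."""
    words = list(senses)
    best, best_score = None, -1
    for combo in product(*(senses[w] for w in words)):  # 3*2*2*3*2 = 72 combinations
        score = repetition_score(combo)
        if score > best_score:
            best, best_score = dict(zip(words, combo)), score
    return best, best_score

reading, score = best_reading(SENSES)
for word, sense in reading.items():
    print(word, sorted(sense))   # picks confectioner 1, fry 1, brushwood 2, gas 3, stove 2
```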

We can now explicate formally, at least in a first and roughest approximation, the concept of the semantic coherence of a text: a text is semantically coherent if the lexical meanings of its syntactically related words contain repeated semantic components; if this condition is not met for some pair of syntactically related words, the text is not semantically coherent.
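Under the same set-of-components encoding used above, this first approximation reduces to a check over pairs of syntactically related words (again only an illustrative sketch):

```python
def is_semantically_coherent(related_pairs):
    """related_pairs: (sense_a, sense_b) for every pair of syntactically related words,
    each sense being a set of semantic components. In the rough approximation above,
    the text is coherent iff every such pair shares at least one component."""
    return all(sense_a & sense_b for sense_a, sense_b in related_pairs)

fry_1 = {"make", "food", "heating", "oil"}            # 'make food by heating on/in oil'
brushwood_2 = {"pastry", "food", "make", "boiling", "oil", "heating"}
brushwood_1 = {"dry", "branches"}                      # 'dry fallen branches'

print(is_semantically_coherent([(fry_1, brushwood_2)]))   # True: shared 'food', 'oil', ...
print(is_semantically_coherent([(fry_1, brushwood_1)]))   # False: no shared components
```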

This example alone shows that an attempt to model a person's understanding of semantically coherent texts, or the ability to distinguish semantically coherent texts from incoherent ones, leads to the serious question of the language in which the meanings of words are described. It is obvious, for instance, that since only parts of complex meanings, and not whole meanings, can be repeated in a text, each complex meaning must be represented as a combination of simpler meanings, and each of these simpler meanings must always be named in the same way in the formal language: if the same simple meaning were named differently depending on whether it enters into the complex meaning 'A' or 'B', the fact of its repetition in the phrase AB could not be established directly.

The above allows us to conclude that the desired language differs significantly from natural language, if only in that its words are semantically much simpler than words in natural language and do not have synonyms. In the future we will deal with this issue in more detail; here it is enough to emphasize that we would inevitably come to exactly the same conclusions if we considered the requirements arising from the formal formulation of the problem of modeling any other ability from among those that together constitute “language proficiency.” In particular, without a special language for recording meanings, it is impossible to formally model the ability of a native speaker to construct texts with a given content.

It is not surprising, therefore, that the question of a language for recording the meanings of words and, more broadly, of whole utterances has become the focus of attention of many modern schools and branches of semantics, to which a far more important role is now assigned: semantics does not merely "study the meanings of words" but is responsible for developing a language for recording semantic information and (in part) the rules for passing from sentences of that language to sentences of natural language. In this connection at least two levels of representation of utterances are distinguished: the semantic level (for some authors, deep-syntactic) and the surface-syntactic level (cf. Zholkovsky and Melchuk 1965, 1967, Lamb 1966, Wierzbicka 1967b, Lyons 1967, Lakoff 1968, McCawley 1968b, Fillmore 1969, Breckle 1969, Bellert 1969, Boguslavsky 1970, Shaumyan 1971, Barkhudarov 1973). In more recent work (see especially Melchuk 1974a, 1974b) the number of levels rises to five or six: semantic, deep-syntactic, surface-syntactic, deep-morphological, surface-morphological, and phonological. This conception of levels and the corresponding terminology did not, however, take shape immediately. At the end of the 1960s many researchers did not yet distinguish semantic from syntactic information. Without wishing to modernize the works selected for review, we have in most cases retained the terminology used in them. The reader should bear in mind, however, that the terms "deep level" and "deep structure" in many of them (especially in the works of G. Lakoff and J. Lyons) are used to designate not what is now usually called the deep-syntactic level and deep-syntactic structure, but the semantic level and the semantic representation of the utterance. <…>

<…> More noticeable was the influence on modern semantics of the ideas of N. Chomsky's transformational grammar, which in its very first versions was conceived as a device generating all the grammatically correct sentences of a given language and not a single incorrect one (Chomsky 1956, 1957). By assumption, such a grammar models the side of language proficiency manifested in the ability to distinguish correct from incorrect in a language. Later (Chomsky 1965) the concept of correctness began to be considered on two levels rather than one: competence, or knowledge of the language, and performance, or use of the language, i.e. actual speech practice. What is sanctioned by linguistic competence is not necessarily encountered in speech practice, and vice versa.

Initially, work in transformational grammar was carried out without regard to the obvious fact that the grammatical correctness of sentences depends substantially on their lexical content. By the mid-1960s transformational theorists had rid themselves of illusions on this score (see, for example, Klima 1965, Chomsky 1965), but correct conclusions from the new understanding of the relationship between grammar and vocabulary were not drawn immediately.

For at least three years <…> attempts were made to find a compromise between the original version of N. Chomsky's generative grammar and some form of participation of dictionary information in it. The compromise proposed by J. Katz, J. Fodor, and P. Postal and accepted by N. Chomsky is as follows.

The generating device first builds the deep syntactic structure of the future sentence, which is then fed to the input of an interpreting semantic device. This device 1) determines the number of possible interpretations of the sentence, 2) records the meaning of each generated sentence by means of semantic components, 3) detects semantic anomalies (noting, for example, the meaninglessness of The geranium got married or the contradictoriness of Bachelors are married, etc.), 4) determines which semantically non-anomalous sentences are analytically true, i.e. true by virtue of the meanings assigned to the words (cf. Bachelors are unmarried), and which are synthetically true, i.e. true by virtue of their correspondence to the facts (cf. The sun is the source of life on earth), 5) establishes relations of equivalence between sentences, i.e. relations of paraphrase, and solves a number of other problems.

The construction of the deep syntactic structure of a sentence is ensured by the ordinary rules of phrase-structure (immediate-constituent) grammar. The semantic interpretation of the sentence, on the other hand, is carried out with the help of a special dictionary and so-called semantic projection rules.

In the dictionary each word, in each of its meanings, receives a syntactic characterization (for example, noun, animate, countable, concrete); elementary semantic features are ascribed to it (for example, bachelor = 'unmarried', 'male'); finally, it is supplied with an indication of the semantic features it requires of the words combined with it (for example, honest carries the note that the dominant noun must have the feature of animacy).

The projection rules take as input the meanings of units that are immediate constituents of some construction (for example, the meanings of the words honest and bachelor in the A + N construction) and combine them into a new complex meaning. By checking whether the requirements on feature combinability are satisfied for a given pair of words, which meanings of these words can in principle be combined, and so on, these rules generate information about the number of possible interpretations of a sentence, their anomalousness or non-anomalousness, etc.
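A minimal sketch of how such dictionary entries and a projection rule for the A + N construction might be arranged is given below. The feature sets are illustrative, the geranium entry is added only to show how a violated selection restriction yields zero readings (an anomaly), and none of this is Katz and Fodor's actual formalism.

```python
from dataclasses import dataclass, field

@dataclass
class Sense:
    markers: set                                   # elementary semantic features
    requires: set = field(default_factory=set)     # features demanded of the combined word

# Illustrative dictionary fragment (features follow the text where possible).
LEXICON = {
    "honest":   [Sense(markers={"honest"}, requires={"animate"})],
    "bachelor": [Sense(markers={"human", "animate", "male", "unmarried"})],
    "geranium": [Sense(markers={"plant", "concrete"})],
}

def project(adjective, noun):
    """Projection rule for A + N: amalgamate each adjective sense with each noun sense
    whose markers satisfy the adjective's selection restriction."""
    readings = []
    for a in LEXICON[adjective]:
        for n in LEXICON[noun]:
            if a.requires <= n.markers:            # selection restriction satisfied?
                readings.append(a.markers | n.markers)
    return readings                                # number of readings = number of amalgamations

print(project("honest", "bachelor"))   # one reading: the combined feature set
print(project("honest", "geranium"))   # []  -> semantically anomalous combination
```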

Without going into the details of this system <…>, let us emphasize its main property: the generation of a sentence begins with the generation of its deep syntactic structure, which is subsequently subjected to semantic interpretation. <…> This is a tribute to the first, now rejected version of transformational grammar and evidence of the half-heartedness of its restructuring. The unnaturalness of this order of generative operations becomes obvious when they are considered from the standpoint of the task of paraphrase. A ready-made deep syntactic structure severely limits the freedom to choose means of expressing a given meaning: since the syntactic component of transformational grammar generates strings of class symbols such as N, V, A, Adv, it proves impossible to establish directly the synonymy of sentences built on different parts of speech, for example Hans loves work (NVN) and Hans works willingly (NVAdv) <…>, She salts the soup - She puts salt into the soup, He was expected yesterday - He was supposed to arrive yesterday, She pretended to be deaf - She feigned deafness - Her deafness was feigned (imaginary). To formalize such paraphrase relations, a semantic notation free of syntactic restrictions is needed, one that would allow superficially quite different sentences to be represented as realizations of a single semantic representation. In other words, from the standpoint of the task of paraphrase the reverse order of operations looks more natural: from the meaning at the input to the syntactic structures at the output, as was envisaged in Zholkovsky et al. 1961. It is not surprising that in the model under consideration paraphrase is reduced to a few semantically invariant grammatical transformations and the substitution of lexical synonyms: such transformations do not affect, or hardly affect, the syntactic structure of the sentence, whereas less trivial transformations require its restructuring.

As a result of a critical reworking of the Katz-Fodor-Postal model, the idea of semantically interpreting a ready-made syntactic structure gave way to the idea of synthesizing a sentence with a given meaning. In this connection, questions came to the fore concerning the "semantic deep structure of a sentence" (from the present-day point of view, the semantic representation of an utterance), the recoding of deep structure into surface structure, dictionaries oriented toward this task, and the semantic analysis of the word in such a dictionary.

The study of deep structure followed two paths. Some linguists have been content with the principled statement that for some sentences with very different surface structures, for a number of reasons, it is necessary to postulate the same deep structure; however, no language was proposed for recording deep structure. Other linguists focused on developing a language for recording deep structures and the forms of their recording.

<…> The characteristic features of the first approach showed themselves fully and clearly in the work of G. Lakoff (Lakoff 1968) devoted to the analysis of sentences with instrumental adverbial phrases of the type 1) Seymour sliced the salami with a knife. In earlier transformational studies these were assigned a syntactic structure entirely different from that of sentences like 2) Seymour used a knife to slice the salami. The first sentence was classed as simple, with an adverbial of instrument, and the second as complex, representing the transform of two simple sentences: Seymour used a knife + Seymour sliced the salami.

G. Lakoff drew attention to the fact that these sentences are paraphrases of each other. If we assume that they are entirely different in structure, we shall have to set up two different rules of semantic interpretation assigning them the same meaning. A number of facts indicate, however, that the differences between these sentences concern only surface syntactic structure; their deep structure is identical, and hence a single rule of semantic interpretation suffices in their transformational generation. At the same time, as G. Lakoff believed, this would explain all the prohibitions limiting the possible lexical and syntactic transformations of such sentences.

First, both types of sentence express the meaning of purpose. For sentences with the infinitive phrase this thesis needs no proof; as for sentences with the instrumental with, they can be ambiguous, cf. I cut my finger with a knife, which can describe either a purposeful, intentional action or an unintentional, accidental one (cutting oneself without meaning to). Sentences with non-purposive with differ from sentences with unambiguously purposive with in that they do not occur 1) in the progressive form (I was cutting my finger with a knife is unambiguously purposive), 2) with modal verbs such as can 'be able', try 'attempt', and the like (I tried to cut my finger with a knife is unambiguously purposive), or 3) in the imperative (Cut your finger with a knife is unambiguously purposive).

Another property common to both types of sentence is the presence of a verb with the meaning of action. Sentences that contain no such verb admit neither the instrumental with nor the verb use. Thus sentences with the stative verb know - *I knew the answer with a slide rule, *I used a slide rule to know the answer - are incorrect, whereas sentences with the active verb learn, opposed to know, are perfectly acceptable: I learned the answer with a slide rule, I used a slide rule to learn the answer.

The third property common to sentences with the instrumental with and with the verb use is the obligatory presence of an animate agent; sentences whose subject denotes an inanimate agent can, for obvious reasons, contain neither the instrumental with nor the verb use; cf. the incorrectness of *The explosion killed Harry with a stone and *The explosion used a stone to kill Harry.

The import of G. Lakoff's observations is, in our opinion, that he equates the lexical meanings of the preposition with and the verb use: both words serve to express the idea of instrumentality (and could be regarded as purely syntactic suppletive derivatives of each other if prepositions and verbs were linked by productive word-formation patterns). To reach this conclusion no argument is needed, in essence, beyond the fact that the sentences Seymour sliced the salami with a knife and Seymour used a knife to slice the salami are situationally equivalent: since, apart from with and use, all the other lexical units of these sentences are the same, and since it is difficult to assume the operation of any rules for combining meanings other than simple 'summation', we are forced to conclude that the lexical meanings of with and use coincide as well. It follows immediately that the sentences under consideration correspond to one and the same sentence of the semantic language. All the other properties noted by Lakoff (by no means redundant, since they are needed to explain the facts of compatibility and incompatibility of various elements within surface sentences) are not a proof but a manifestation of this semantic identity.<…>

Another approach to the study of deep structure is presented in the works of Charles Fillmore<…>. This author proposes a language for recording deep structures and certain rules for translating them into surface structures, and he links this with very interesting experiments in the interpretation of meanings. His system therefore deserves a more detailed analysis.

C. Fillmore accepts the hypothesis of the componential structure of meaning and the idea of the successive decomposition of the lexical meaning of every word into ever simpler components, down to ultimate ones<…>. These are not only abstract concepts such as 'identity', 'time', 'space', 'body', 'movement', 'life', 'fear', but also "undefined terms which refer directly to aspects or objects of the cultural and physical universe in which people live" (Fillmore 1969: 111). Another essential element of the language in which lexical meanings are described is predicate-argument syntax. The content words of the language (verbs, many adjectives and nouns, some conjunctions - for instance, causative ones - etc.) are described in the dictionary by means of predicate-argument structures, which remove the differences between parts of speech (there are no parts of speech in deep structure).

While sharing the widely held views on the argument structure of predicates (buy is a four-place predicate, rob a three-place one, touch a two-place one, ascend a one-place one), C. Fillmore departs from common practice in that he considers it necessary to indicate not only the number of arguments of a given predicate but also their semantic content, or role. The role structure of a predicate is established on the basis of the inventory of meanings usually considered in the theory of case and is therefore sometimes called its case structure. C. Fillmore posits the following deep cases, or roles, of arguments: 1) Agent - the animate instigator of the event described by the corresponding verb, e.g. He says<…>; 2) Counter-agent - the force against which the action is directed, e.g. to resist someone; 3) Object - the thing which moves or changes, whose position or existence is the focus of attention, e.g. to smash a window, to blame someone for being late, The stone fell; 4) Place (judging by the examples) - the physical body directly affected by the actor, e.g. to hurt someone's nose; Place differs from Object in that it admits a paraphrase of the type to hit someone's nose ~ to hit someone on the nose, whereas with an Object such a paraphrase is inadmissible, cf. to break someone's nose, but not *to break someone on the nose; 5) Addressee (Goal) - judging by the examples, the person to whose benefit or detriment the action is performed, e.g. to blame someone, to teach someone, to sell something to someone, to buy something from someone; 6) Patient - the thing which undergoes the effect of the action, e.g. He blames Peter, to rob someone, to steal something from someone; 7) Result - the thing which comes into existence as a result of the action (Fillmore gives no clear examples of this role); 8) Instrument - the stimulus or immediate physical cause of the event, e.g. to hit someone with a whip, to rise to one's feet; 9) Source - the place from which something is directed, e.g. He teaches me mathematics, He sells a book.
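
For readers who find such inventories easier to grasp in schematic form, here is a minimal sketch of a role inventory and of case frames for a few verbs. The frames are simplified illustrations of our own based on the examples above, not Fillmore's actual dictionary entries.

```python
# An illustrative sketch of Fillmore-style case (role) structures.

ROLES = {"Agent", "Counteragent", "Object", "Place", "Addressee",
         "Patient", "Result", "Instrument", "Source"}

CASE_FRAMES = {
    # verb: roles assumed to be semantically relevant (simplified)
    "hit":   ["Agent", "Place", "Instrument"],
    "break": ["Agent", "Object", "Instrument"],
    "teach": ["Agent", "Source", "Addressee", "Object"],
}

def check_frame(verb, frames=CASE_FRAMES):
    """Verify that every role named in a verb's frame belongs to the inventory."""
    unknown = [r for r in frames[verb] if r not in ROLES]
    if unknown:
        raise ValueError(f"{verb}: unknown roles {unknown}")
    return frames[verb]

for v in CASE_FRAMES:
    print(v, check_frame(v))
```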

There is no one-to-one correspondence between roles, the elements of deep structure, and arguments, the elements of surface structure. Thus a) one argument may perform several roles (in He teaches me mathematics the subject designates both the Agent and the Source); b) an argument may be obligatory while the role it performs is optional (cf. John fell, with an obligatory Object - John's body - and an optional Agent - John himself, if he fell intentionally; if John fell unintentionally, the meaning of Agent is not expressed in this sentence); c) a role may be obligatory while the argument is optional; the verb blame has four semantically necessary roles - the Source and the Patient (the one who blames), expressed syncretically by a single argument, the Object (the offence) and the Addressee (the person responsible for the offence). Only the last role is regarded by C. Fillmore as obligatory at the surface level, i.e. realized in every sentence with the verb blame; all the other roles may remain unexpressed at the surface level, cf. He was blamed; d) a role may be expressed implicitly, without any surface exponents: climb (the stairs) and kiss contain an implicit indication of the Instrument (the legs and the lips respectively).<…>

One more question must be considered before we pass on to the ways of recording deep structure and the rules for translating it into surface structure. It concerns the second innovation of Charles Fillmore, namely his proposal of a subtler concept of lexical meaning than the traditional one. The traditional concept of meaning<…> proceeds from the idea that the content side of linguistic units is multi-layered. Besides meaning in the proper sense of the word (sens intellectuel, Begriffsinhalt, denotation), it includes a secondary meaning, or shade of meaning (nuance, Nebensinn, contextual meaning in the sense of J. Firth), as well as stylistic and emotional-expressive elements of meaning (register, valeur émotive or affective, Gefühlswert, Stimmungsgehalt, feeling, tone)<…>; the evaluation of these aspects of meaning by means of a special system of stylistic labels has long been part of the ABC of lexicographic work. Charles Fillmore goes further than his predecessors in that he splits the hitherto unified concept of meaning proper into two entities: meaning and presupposition. Let us clarify the latter concept.

Presuppositions are understood as the conditions which must be satisfied for a sentence to function as a question, an order, a statement, etc. The presuppositions of the request Please open the door consist of two assumptions made by the speaker about the knowledge possessed by the addressee: 1) the addressee knows which door the speaker has in mind, 2) the addressee knows that this door is closed. In saying Harris accused Mary of writing the editorial, the speaker presupposes that Harris regarded Mary's act negatively and asserts that Harris claimed Mary to be the person who wrote the editorial. In saying Harris criticized Mary for writing the editorial, the speaker presupposes that Harris considered Mary the author of the editorial and asserts that Harris took a negative view of the writing of the article. The use of the verb chase presupposes that the victim of the pursuit is moving at high speed, and the use of the verb escape presupposes that for some time before the escape the subject was forcibly held in some place. Tall and short, in contrast to high and low, presuppose that the object to which these properties are ascribed stands in a vertical plane and is in contact with the ground. Blame presupposes that the one who blames is a person, and accuse, that both the accuser and the accused are persons.

The difference between presupposition and meaning in the proper sense of the word shows itself, for example, in the fact that they react differently to negation: only the meaning, not the presupposition, falls within the scope of negation. By virtue of this principle it turns out, for instance, that in the interpretation of the word bachelor - 'an adult man, who has never been married' - only the semantic components standing to the right of the comma form the meaning proper: it is they that are negated in the sentence Peter is not a bachelor. The two remaining components, 'adult man', form the presupposition of bachelor, since the sentence Peter is not a bachelor can under no circumstances be understood as denying that Peter is an adult man. In other words, under negation only the meaning of the utterance changes, not its presupposition.<…>
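
The negation test can be pictured with a toy illustration of our own (it is not Fillmore's notation): negation touches only the asserted components and leaves the presuppositions intact.

```python
# A toy illustration of the negation test on the dictionary entry for 'bachelor'.

BACHELOR = {
    "presupposition": ["human", "adult", "male"],
    "assertion": ["has never been married"],
}

def interpret(entry, negated=False):
    asserted = entry["assertion"]
    if negated:
        asserted = ["not (" + a + ")" for a in asserted]
    # presuppositions survive negation untouched
    return entry["presupposition"] + asserted

print(interpret(BACHELOR))                # Peter is a bachelor
print(interpret(BACHELOR, negated=True))  # Peter is not a bachelor
```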

This brief survey of examples of presuppositions shows that they comprise three fundamentally different classes of semantic elements: 1) elements of encyclopedic knowledge, i.e. knowledge of the 'current situation', which can under no circumstances be included either in the interpretation of the lexical meanings of words or in the description of their compatibility (cf. the very first example - the presuppositions of a request); 2) elements which can be included directly in the interpretation, but not in the description of compatibility (cf. the analysis of the verbs accuse, criticize, chase, escape and the adjectives tall and short); these also include the elements which form the modal frame of the interpretation (cf. the doctrine of the modal frame of the utterance developed by the Polish linguists A. Bogusławski and A. Wierzbicka<…>); 3) finally, elements which should rather be included in the description of the compatibility of a word than in the interpretation of its meaning (cf. the presupposition of the animacy of the Agent for the verbs accuse and blame).<…>

Of great interest is the semantic language (lingua mentalis) for recording the meaning of utterances developed by A. Wierzbicka<…> on the basis of the ideas of her teacher A. Bogusławski<…>. Whereas the works discussed above are characterized by the desire to build the semantic language as an extension of the logical language of the predicate calculus, A. Wierzbicka builds her lingua mentalis as a narrowing of natural language. It is the lexically and syntactically simplest part of natural language: the minimal vocabulary and the minimal set of syntactic constructions recognized as sufficient for describing the meanings of all the other lexical and grammatical means of the language.

The vocabulary of the lingua mentalis consists of a few dozen undefinable semantic elements such as 'want', 'not want' (mutually independent and equally complex modalities), 'think (consider)', 'do' and a few others. Obviously, the real variety of meanings can be reduced to so limited a set only on condition of a very deep analysis of the semantic units usually taken to be elementary. Noteworthy in this connection are the semantic descriptions proposed by A. Wierzbicka for such seemingly undecomposable concepts as 'possibility', 'possession', 'truth', 'assertion', 'negation' and a number of others. According to Wierzbicka, I can means 'I will do it if I want'; I have a thing means 'I have the right (= society wants me to be able) to do whatever I want with the thing'<…>; truth = 'a judgement we must accept'; must P = 'cannot not P'<…>; S is P (assertion) = 'I want you to believe that S is P'; S is not P (negation) = 'I do not want you to think that S is P' (the concept of negation is thus linked with the concept of will); know = 'be able to tell the truth'; understand P = 'know what P means' (cf. A is stupid = 'it is difficult to cause A to understand'); A is interested in X = 'A wants to know about X', etc.
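
Such reductions can be collected, purely for illustration, into a lookup table; the glosses below repeat those quoted above, while the table format and function name are assumed conveniences of ours, not Wierzbicka's own notation.

```python
# A sketch of Wierzbicka-style reductions as a lookup table.

LINGUA_MENTALIS = {
    "I can":                 "I will do it if I want",
    "I have a thing":        "I have the right to do whatever I want with the thing",
    "truth":                 "a judgement we must accept",
    "must P":                "cannot not P",
    "S is P (assertion)":    "I want you to believe that S is P",
    "S is not P (negation)": "I do not want you to think that S is P",
    "know":                  "be able to tell the truth",
    "understand P":          "know what P means",
    "A is interested in X":  "A wants to know about X",
}

def reduce_once(expr):
    """Replace an expression by its gloss if the table defines one."""
    return LINGUA_MENTALIS.get(expr, expr)

print(reduce_once("I can"))
print(reduce_once("know"))
```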

However, the main difference between the lingua mentalis and other semantic languages of this type lies not in the vocabulary but in the syntax, namely in the syntactic structure of its sentences. Usually the two chief elements of the semantic representation of a simple sentence are taken to be an n-place predicate and the object variables denoting its arguments (cf. the concept of role structure in Charles Fillmore). A. Wierzbicka proceeds from the assumption that in 'deep structure' all predicates are one-place names of properties, and that the sole argument of every predicate is the subject S to which the given property P is ascribed: S is P. This formula, however, does not exhaust the structure of the utterance; as is clear at least from the examples above, A. Wierzbicka supplements it with a third element - the modal frame M (in some of her works a further element appears: the designation of the time at which the given property characterizes the given object). As a result, the general structure of a sentence of the lingua mentalis takes the following form: M that S is P. In the transition from the lingua mentalis to natural language this structure is transformed according to definite rules, including, in particular, rules for the deletion of modalities (Wierzbicka 1967b: 36).

The idea itself that every sentence of natural language expresses a modality (and a time), and that the sentence of the semantic language which interprets it must therefore possess special means of recording the corresponding meanings, is of course not new. Nor is it new that the structure of the interpreting sentence provides a special place for symbols which make explicit the implicit modalities of the interpreted sentence of natural language: A. Sechehaye and Ch. Bally already held that sentences like It's raining actually mean something like 'I believe that it is raining' (cf., for example, Bally's doctrine of modus and dictum; Bally 1955: 43 ff.). What is new is a) the set of modalities ('I want', 'I believe', 'I understand', 'I think', etc.); b) the understanding of the modal frame as a complex structure with separate slots for the modalities of the subject of the message and of its addressee (cf. 'I believe that you understand that...'); c) the idea that the modal frame is implicitly present in every sentence of natural language and must therefore be explicitly represented in the sentence of the semantic language which interprets it; d) the use of this apparatus for the description of lexical meanings. The apparatus of modalities developed by A. Wierzbicka makes it possible to carry out the semantic analysis of large layers of vocabulary - above all particles, parenthetic words, conjunctions and adverbs such as completely, all, even, still, fortunately, finally, but, in essence, rather, only, already, whole, although, etc. - on a much deeper basis than before (cf. one of the first interesting experiments in this direction, Mushanov 1964). It will, one hopes, have a fruitful influence on lexicographic practice, which invariably betrays its weakness when confronted with such words. Let us note in particular that the use of modalities such as 'opinion', 'expectation', 'assumption', etc. allows one to describe very subtle semantic distinctions which explanatory dictionaries usually fail to notice; cf. He brought only 10 books = 'Know that he brought 10 books; I believe that you understand that this is not enough'; He only brought 10 books = 'Know that he brought 10 books; do not think that it is more'; Even John came = 'Others came; John came; I expected that John would not come'.
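
The schema "M that S is P" and the modal frames of words like only and even can be pictured with a small sketch; the class and the example frames below are our own formalization of the interpretations quoted above, not Wierzbicka's notation.

```python
# A sketch of the sentence schema "M that S is P" with an explicit modal frame M.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MentalSentence:
    subject: str                                           # S
    predicate: str                                         # P
    modal_frame: List[str] = field(default_factory=list)   # M

    def gloss(self):
        frame = "; ".join(self.modal_frame)
        return f"[{frame}] that {self.subject} is {self.predicate}"

only_10_books = MentalSentence(
    subject="he",
    predicate="one who brought 10 books",
    modal_frame=["know this", "I believe that you understand that this is not enough"],
)

even_john_came = MentalSentence(
    subject="John",
    predicate="one who came",
    modal_frame=["others came", "I expected that John would not come"],
)

print(only_10_books.gloss())
print(even_john_came.gloss())
```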

So far, in the general formula of the lingua mentalis sentence - M that S is P - we have analysed the structure of the first element. Let us now turn to the structure of the second element, the subject. Here A. Wierzbicka's most important innovation is that a fundamental difference is drawn between names of persons and names of non-persons: nouns like person, man, woman, Peter, Mary are held to have not one meaning, as has commonly been thought, but several different ones. When we say John was lying on the floor or John weighs a lot, we mean John's body; if John is ascribed not physical but 'mental' predicates (John doesn't believe this story, John is kind), we mean not John's body but John himself, his ego, his personality; finally, in cases like John was moving, according to A. Wierzbicka, both of these meanings are realized: John was moving means, roughly speaking, 'John's body was moving because John (= John's personality) wanted it to move'.

A. Wierzbicka sees one manifestation of the distinction she has established in the fact that sentences like Ivan kissed Elena's hand can be converted into sentences like Ivan kissed Elena on the hand, whereas sentences like Ivan kissed the lid of the box <the hand of the corpse> cannot (impossible: *Ivan kissed the box on the lid <the corpse on the hand>). This is explained as follows. In sentences like Ivan kissed Elena the position of the object is not filled: the real object of the action is not Elena (the name of a rational being endowed with will) but her hand (a physical object), which makes the expansion Ivan kissed Elena on the hand possible. Not so in sentences like Ivan kissed the box: here the position of the object is already filled by the word box, which denotes a physical object, and expansion of the sentence by the object on the lid is therefore impossible<…>.

These examples, it would seem, can do as little to substantiate the conception advanced by A. Wierzbicka as the examples given below can do to refute it: the distance between the semantic representation of a sentence and the prohibitions operating in its surface structure is too great. Nevertheless it is useful to consider the conflicting examples, if only in order to present a more objective picture of the facts.

A. Wierzbicka assigns a special status to nouns meaning 'person'. Yet the syntactic behaviour she describes is characteristic of a much wider class of nouns, whose distinctive feature is the semantically rather empty, highly grammaticalized component 'animacy'. The noun mertvets 'dead man' (unlike trup 'corpse') is treated in Russian and in certain other languages as animate, and a construction like to kiss the dead man on the forehead is therefore quite possible with it, although a dead man, no less than a corpse, can hardly be described as a rational being endowed with will and capable of purposeful reactions. Animals, insects and the like certainly do not possess these attributes either, yet the corresponding names behave syntactically in exactly the same way as nouns meaning 'person', cf. to wound the bear in the ear, to grab the fish by the tail, to stroke the beetle on the back. Let us note, finally, that constructions of the type under consideration, although not typical of nouns denoting inanimate objects (boat, chair, etc.), are not completely alien to them either; cf. to hook the boat by the stern with a boat-hook, take the chair by the back, and I'll take it by the legs.

Let us emphasize that what we question is not the usefulness of distinguishing three types of noun meanings, but only the possibility of justifying the distinction in this way.

Let us finally consider the last element of a sentence of the semantic language - its predicate. We shall briefly list those theses and conclusions of A. Wierzbicka which draw attention to completely new aspects of the old problem of the predicate, even where they do not seem unconditionally right. 1) The predicate is a property of an object, but not of an event<…>; 2) typical predicates are adjectives, verbs of state (sleep), of feeling-states (be afraid), of position in space (be situated) and a few others<…>; 3) all other verbs - that is, all transitive verbs (including verbs of perception and causative verbs proper), verbs of motion and of physical action - are not predicates. They include the meaning of cause (causation); but causes connect events, not objects, and are therefore not predicates but conjunctions. Here are some examples of analysis: John saw a fox = 'John had an image of a fox because John's eyes came into (indirect) contact with the fox'; John broke the window with a hammer = 'The window broke because the hammer came into contact with the window, because the hammer was moving, because John's body was moving, because John wanted his body to move, because John wanted the window to break'; 4) this analysis gives grounds for the conclusion that the semantic structure contains no elements with objectival, locative, instrumental and similar meanings, but only subjects and the predicates ascribed to them. "...From the semantic point of view the concept of object is meaningless (or at least redundant): the 'object of an action' is simply the subject of some situation caused by some other situation" (Wierzbicka 1967a: 34). Similarly, an element with instrumental meaning "always conceals within itself the subject of a separate sentence with an expressed or unexpressed predicate <…> and an exponent of the causal relation (because)" (Wierzbicka 1967a: 15-16); elements with the meaning of place likewise perform the function of a subject in deep structure, and so on.
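
The causal-chain analysis quoted in point 3 can be rendered schematically as follows; the list format is our own convenience and simply strings together the situations named in the quoted decomposition.

```python
# An illustrative rendering of the causal-chain analysis of
# "John broke the window with a hammer": 'because' links whole
# situations rather than objects.

CHAIN = [
    "the window broke",
    "the hammer came into contact with the window",
    "the hammer was moving",
    "John's body was moving",
    "John wanted his body to move",
    "John wanted the window to break",
]

def gloss(chain):
    # each situation is presented as caused by the next one in the list
    return " because ".join(chain)

print(gloss(CHAIN))
```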

The system outlined above is, in essence, derived entirely from A. Wierzbicka's first postulate: predicates are ascribed only to objects, never to other predicates. This postulate can, it seems, be contested; but whatever our attitude to it, it must be acknowledged that it has allowed A. Wierzbicka to pursue the goal of many theorists of semantics - the reduction of many-place predicates to one-place ones - on a broader and deeper basis than anyone else has so far managed to build.

We emphasize that A. Wierzbicka's conclusions, if they are accepted as correct, in no way prevent cause, manner of action and similar meanings from being treated as predicates at levels less deep than that of the lingua mentalis, nor object, place and other such elements from being treated as actants at those same levels.

In concluding this survey, let us emphasize once again that, despite certain differences between the representatives of the various trends and schools of modern semantics, there is a certain minimum of ideas common to them all. This minimum includes the view that semantics is a component of a complete linguistic description, conceived as a model capable of 1) constructing correct sentences of natural language from given meanings or extracting meanings from given sentences, 2) paraphrasing these sentences, and 3) evaluating them for semantic coherence, as well as of performing a number of other tasks. The principal means of solving all these problems is recognized to be a special semantic language for recording the content of utterances, together with the dictionaries and rules by means of which a correspondence is established between the mutually translatable sentences of natural language and of the semantic language.<…>

Chapter 2. Semantic language as a means of interpreting lexical meanings.

Linguistic sign and the concept of lexical meaning.

The Saussurean concept of the linguistic sign as a two-sided unit characterized by a signifier and a signified<…> is opposed by the sign theory of Charles Morris<…>, which originally developed within semiotics and has recently, in substantially revised and enlarged form, been carried over into linguistics (Melchuk 1968). Within this theory a linguistic sign is characterized not only by a name (signifier) and semantics (signified), but also by two further parameters - syntactics and pragmatics<…>.

We shall regard the concept of the name as sufficiently obvious and therefore leave it without explanation. By semantics is meant, in most cases, information about the class of things named by the sign which share common properties, or about the class of extra-linguistic situations which are invariant with respect to certain properties of their participants and of the relations connecting them. By the syntactics of a sign is meant information about the rules for joining the given sign to other signs in a text. By the pragmatics of a sign is meant information recording the attitude of the speaker, or of the addressee of the message, to the situation being spoken about. Let us consider the semantics, syntactics and pragmatics of the sign in more detail, but only to the extent necessary for explicating the concept of lexical meaning.
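
The four-parameter view of the sign can be summarized, purely as a mnemonic sketch of ours, in a small record whose field names simply mirror the terms of the preceding paragraph; the example values are illustrative assumptions drawn loosely from the discussion of with above.

```python
# A minimal sketch of the four-parameter view of the linguistic sign.

from dataclasses import dataclass

@dataclass
class LinguisticSign:
    name: str        # the signifier
    semantics: str   # the naive concept of the class of things or situations
    syntactics: str  # rules for joining the sign to other signs in a text
    pragmatics: str  # the speaker's or addressee's attitude to the situation

example = LinguisticSign(
    name="with (preposition)",
    semantics="'instrumentality' (cf. the discussion of Lakoff above)",
    syntactics="combines with noun phrases to form instrumental adverbials",
    pragmatics="no special attitude recorded in this sketch",
)
print(example)
```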

The semantics of a linguistic sign reflects the naive concept of a thing, property, action, process, event, etc. The simplest example of the discrepancy between naive and scientific notions was given in his time by L. V. Shcherba, who believed that special terms have different meanings in the general literary language and in the special languages. "A straight line is defined in geometry as 'the shortest distance between two points'. But in the literary language this is obviously not so. I think that in everyday life we call straight a line that deviates neither to the right nor to the left (and also neither up nor down)" (Shcherba 1940: 68). Separating "philistine concepts" from scientific ones, L. V. Shcherba also says there that there is no need to "impose on the common language concepts which are not at all characteristic of it and which - this is the main and decisive point - are not factors in the process of verbal communication". Later, R. Hallig and W. Wartburg, in developing a system and classification of concepts for an ideological dictionary, set themselves the goal of reflecting in it "that idea of the world which is characteristic of the average intelligent native speaker and is based on the pre-scientific general concepts placed at his disposal by the language" (Hallig and Wartburg 1952: xiv). They called this view of the world "naive realism". The same ideas underlie the lexicographic experiments of a number of Moscow linguists discussed in the first chapter.<…>

The naive picture of the world, which has taken shape over centuries and includes a naive geometry, a naive physics, a naive psychology and so on, reflects the material and spiritual experience of the people speaking a given language and can therefore be specific to that language in two respects.

Firstly, the naive picture of some fragment of the world may differ strikingly from the purely logical, scientific picture of the same fragment, which is common to people speaking the most diverse languages. Naive psychology, for example, as the meanings of hundreds of Russian words and expressions testify, singles out the heart or the soul as the organ in which various emotions are localized. One may doubt whether this corresponds to scientific psychological notions.

To interpret correctly the meaning of the verb obmeret' 'to be struck numb', which is related to zameret' 'to freeze, become motionless' in roughly the same way as frenzy is related to excitement, ecstasy to delight, and panic to fear, we must mentally draw a more complex picture of the human psyche, one that includes the idea of two fundamentally different kinds of devices: a) the devices with whose help we feel (the soul, or heart), master the world intellectually (the mind) and behave physically (the body); b) the devices which monitor and control our behaviour (the will). The verb zameret' means, according to the IAS, 'to become completely motionless'; the verb obmeret' denotes a process related to this freezing, with the added specification, however, that physical behaviour passes out of the control of the monitoring device; cf. Suddenly a telegram: one bomb had shattered the carriage, another the tsar. Naturally everyone is struck numb; dead silence (Yu. Davydov).

To describe the meanings of semantically more complex lexical units denoting the internal states of a person (one's hair stands on end with fear, goose-flesh creeps down one's back with horror, a lump rises in one's throat with excitement, etc.), what is required, as L. N. Iordanskaya (1972) has shown, is an addition to the model of the psyche in the form of a list of the physical systems of the human being regarded as manifesting particular classes of feelings, and a list of the types of their reactions (one's eyes climb up one's forehead in surprise - 'extraordinary functioning'; one's breathing stops - 'cessation of functioning', etc.).

The task of the lexicographer, unless he wants to abandon the ground of his own science and turn into an encyclopedist, is to uncover this naive picture of the world in the lexical meanings of words and to reflect it in a system of interpretations. The first attempts in this direction have shown how difficult the task is. It might seem that the use of the Russian words height, high and low is fully regulated by the following dictionary interpretations: height = 'the extent of an object from bottom to top', high = 'great in height', low = 'small in height'. An analysis of the naive geometry associated with them shows, however, that the language has a more complex system of rules for the use of these words, reflecting various features of their meaning - a system of which native speakers of Russian have an excellent command and which they use intuitively in their speech practice. Below we present some observations concerning only the word height (cf. Bierwisch 1967).

In the language of Euclidean geometry this word means 'the perpendicular dropped from a vertex of a geometric figure to the base or to its extension'. This concept differs from the naive concept of height in at least the following respects. 1) A geometric object has as many Euclidean heights as it has vertices; a physical object has only one naive height. 2) A Euclidean height remains a height even if it lies in a horizontal plane; the naive height is vertical, or tends towards the vertical (cf. the Euclidean and the ordinary height of a modern architectural structure which has the shape of a rhombus and rests on the ground on one of its vertices). 3) In Euclidean geometry any polygon or polyhedron has a height; in naive geometry the construal of one of an object's dimensions as its height depends on the object's internal structure, its shape, its place of attachment to another object, the proximity of other bodies, and so on. The dimension which in a hollow object (e.g. a box or casket) is conceived of as its height will, in an object of exactly the same external shape but with a solid internal structure, more likely be understood as its thickness (cf. a book, a metal casting). A window of a certain shape may be called narrow and tall, while a picture with exactly the same external frame (cf., for example, the traditional format of Japanese painting) is thought of as narrow and long. Objects of compact shape (boxes, rucksacks, tables) can be ascribed height regardless of whether or not they rest their bottom on another object, whereas objects of elongated shape (pipes, posts, portable ladders) are usually ascribed height when they have a point (line, edge) of attachment or support at the bottom: a wooden ladder can be high, but a rope ladder is always long, even if it touches the ground. A free-standing factory chimney is more likely to be high than long, while the metal rod of a lightning conductor running up its wall is more likely to be long than high, because it does not stand autonomously but is adjacent to another, larger body. 4) For a Euclidean height it does not matter how much smaller it is than the other linear dimensions of the body: even if it is an order of magnitude smaller than the base of the figure, it remains a height. The naive height, at least for some objects, cannot be an order of magnitude smaller than the other linear dimensions of the object: if the vertical dimension of a solid round object is an order of magnitude smaller than its diameter, and if the object itself is not too large, we should speak of its thickness and not of its height (cf., for example, a coin).

Secondly, the naive pictures of the world extracted by analysis from the meanings of words of different languages may differ from one another in details, whereas the scientific picture of the world does not depend on the language in which it is described. From the "Russian" point of view a sofa has length and width, while from the "English" point of view, according to Charles Fillmore, it has length and depth. In German the width of a house can be measured in windows (zehn Fenster breit 'ten windows wide' - M. Bierwisch's example), whereas in Russian this method of measurement is at the very least unusual, though understandable. It was long assumed that, despite the differences in the division of the colour spectrum across languages, the system of differential features on whose basis colours are distinguished is the same in different languages and consists of hue, saturation and brightness (see Heller and Macris 1967). For the European languages this is indeed so. There are, however, languages which not only divide the spectrum differently from the European ones but also use entirely different features. In Hanunoo (Philippines) there are four colour terms: they are distinguished by the features 'light' vs. 'dark' (white and all light chromatic colours vs. black, violet, blue, etc.) and 'wet' vs. 'dry' (light green, yellow, coffee-coloured vs. chestnut, orange, red). It turns out, then, that the features of hue, saturation and brightness are not universal: "...the contrasts in terms of which the substance of colour is defined in different languages may depend mainly on the association of lexical units with culturally important aspects of the objects of the surrounding reality. In the case of the Hanunoo words it would seem that one of the dimensions of the system is suggested by the typical appearance of fresh, young ('wet', 'juicy') plants" (Lyons 1968: 431). Facts of this kind do not so much refute the hypothesis of the universality of elementary meanings<…> as testify to the usefulness of the principle<…> according to which abstract and concrete vocabulary should be described in different ways. In particular, the best description of the meanings both of the European colour terms and of the Hanunoo colour terms would be pictures rather than interpretations in terms of differential features: after all, even to a speaker of Russian, pink is hardly presented as a colour red in hue, of a high degree of brightness and a low degree of saturation.

The propositions concerning the naive and the scientific picture of the world (and, naturally, naive and scientific physics, psychology, geometry, astronomy) are of fundamental importance. The point is that the programme of describing the meanings of words with the help of a finite and not overly large set of simple concepts, proclaimed by Leibniz, has recently been criticized as utterly utopian<…>, on the grounds that it would be tantamount to a description of the entire encyclopedic body of human knowledge. As applied to Leibniz this criticism may be just, but the distinction between a naive and a scientific picture of the world, with the subsequent lexicographic description of the former alone, renders the criticism pointless.

Up to now, in speaking of the semantics of the sign, we have not dissected it in any way. Yet in the logical literature, beginning with G. Frege's classic work on meaning and sense, the semantics of a sign is usually considered on two levels - the denotative (referential) and the significative<…>. The denotation of a sign is the class of facts it designates, while its significatum is the set of features common to all the facts of this class. Denotative identity of signs is thus compatible with a significative difference between them. The classic example of such a discrepancy is the expressions the centre of gravity of a triangle and the point of intersection of its medians: these names in fact pick out the same object of reality, but they allow us to think of it in different ways.<…>

In the aspect that interests us, the question of the syntactics of a word reduces to one of the central questions of modern semantics: the difference between the lexical meaning of a word and its compatibility.<…>

<…>We encounter a more difficult problem when some piece of information X which we must assign either to the semantics of a sign or to its syntactics is itself semantic in nature. In other words, it is harder to draw the line between the lexical meaning of a word and its semantic compatibility. This question admits three different solutions.

1. Some semantic information can be interpreted only as a feature of the semantics of the word. Let us consider in this connection the verbs split and chop. In dictionaries they are interpreted as follows: split = 'to crush, cut, divide into pieces', chop = 'by striking with something sharp, to divide into parts, cut off, crush'. The participial phrase in the second interpretation - 'striking with something sharp' - describes a very important feature of chop which split lacks: one always chops with an instrument, whereas one can split things without recourse to any special tools. Indeed, by throwing a piece of ice onto the floor one can split it, but certainly not chop it; on the other hand, if we wield an axe, a piece of ice can be both chopped and split, although the situation itself is perhaps somewhat unusual. This, however, does not exhaust the differences between split and chop. Only hard and non-viscous objects can be split (cf. split firewood, sugar, nuts, ice), while chop is subject to no restrictions in this respect (cf. chop firewood, a knotty trunk, meat, ropes, rubber, cabbage). Since split contains an indication of the hardness and non-viscosity of the object, splitting presupposes the instantaneous separation, the falling apart, of the resulting pieces, which is not characteristic of chopping (this, in particular, is why in splitting an object with a fibrous structure the blow is usually directed along the grain, whereas in chopping this is not necessary). Thus both chop and split have certain semantic features which must be taken into account for these words to be used correctly, and we must decide in what form they are best described. Let us first assume that the indications 'instrument' and 'hardness and non-viscosity of the object' are not parts of the meanings of chop and split respectively, but their compatibility characteristics: chop combines with the name of an instrument, and split with the name of a hard and non-viscous object. The falsity of this assumption is evident from the fact that one can chop with a thing whose name cannot be assigned the semantic feature 'instrument' in the dictionary, cf. to chop a frozen block of snow with a board or a rifle butt. The point is not that one chops with a thing which is by its very nature an instrument, but that in the given situation a certain thing is assigned the functions of an instrument. Thus 'instrument' is not a semantic feature of the word with which the verb chop combines, but a property of a real participant in the particular situation; consequently it is not a feature of the verb's semantic compatibility but a necessary element of its meaning - an element which can be reflected nowhere but in the interpretation of the word. The question is resolved in a similar way for the verb split.

2. A certain semantic feature of a word can be described only as a feature of its compatibility. In one of its meanings the noun handful can be interpreted, as a first approximation, as 'a very small number'. However, it describes not just any objects, nor even any living beings, but mainly people (see BAS): a handful of defenders, of people, of brave men, but not *a handful of cats, *a handful of cupboards. Let us assume that this property is a feature not of the word's compatibility but of its meaning: handful = 'a very small number of people'. Since the word handful in the meaning under consideration strongly governs a noun, and since that noun can only be the name of persons, the interpretation of the corresponding phrases will always contain a semantic repetition: a handful of brave men = 'a very small number of people + brave people', which is semantically equivalent to 'a very small number of brave people'. In other words, one occurrence of the semantic component 'people' always turns out to be superfluous and is eliminated from the interpretation of any phrase. But this means that the meaning we have postulated is never realized in full, and it inevitably follows that its interpretation contains a redundant semantic component.

3. A certain semantic feature of a word can be treated either as a feature of its meaning or as a feature of its semantic compatibility - a situation of non-uniqueness of semantic descriptions which has become an object of theoretical analysis only in recent years. Nouns like will, quality, temperament and the like are characterized by two main classes of usage: (1) with adjectives and verbs having the meaning of degree or of an increase (decrease) in degree, e.g. strong or weak will, high or low quality, a stormy or sluggish temperament, the quality rises or falls, etc.; (2) without such adjectives and verbs, e.g. the cultivation of will, a mark of quality, What a temperament! In the second case they explicitly denote a high degree of the property: the cultivation of will, for example, is 'the cultivation of a great will', i.e. will = 'a great ability to achieve the fulfilment of one's desires or intentions'. Let us now assume that the component 'great' enters into the meaning of these words in the first case as well (the semantic solution). Then we must postulate the following rules for the combination of meanings: if a noun of the will type is combined with a word whose meaning includes the component 'great' or 'more' (cf. strong will, high quality, the quality rises), the resulting phrase contains a repetition of the component 'great' or 'more', which must be deleted once: strong will = 'a great great ability...' = 'a great ability...'. If such a noun is combined with a word whose meaning includes the component 'small' or 'less' (weak will, low quality), a semantically contradictory phrase results (weak will = 'a small great ability...'), and the component 'great' must be deleted from the overall interpretation of the phrase. Now consider the combinability solution: let us assume that in (1) the noun denotes not a high degree of the property but simply the scale of a certain property. Then it will be necessary to indicate that this meaning of the noun is realized only in combination with words whose meanings include the components 'great', 'small', 'more', 'less'. Thus the semantic solution does not require the meaning to be split but presupposes a special rule for the combination of meanings, while the combinability solution requires the meaning to be split but needs no special rule. Both solutions give a complete and consistent picture of the facts, and if we wished to give preference to one of them, additional considerations would have to be brought in.<…>
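
The two solutions can be contrasted in a toy formalization, entirely our own: the semantic solution keeps 'great' inside the noun's meaning and deletes the repeated (or contradicted) degree component, while the combinability solution keeps the noun degree-neutral and lets the adjective supply the degree. Both give the same resulting interpretations.

```python
# A toy comparison of the semantic solution and the combinability solution
# for degree nouns of the 'will' type.

DEGREE = {"strong": "great", "high": "great", "weak": "small", "low": "small"}

def semantic_solution(adj, core="ability to achieve one's intentions"):
    # semantic solution: will = 'great ability...'; the adjective adds its degree
    components = [DEGREE[adj], "great", core]
    # rule of combination: delete one repeated 'great', or delete 'great'
    # when it contradicts 'small'
    components.remove("great")
    return " ".join(components)

def combinability_solution(adj, core="ability to achieve one's intentions"):
    # combinability solution: the noun names only the scale;
    # the adjective supplies the degree
    return f"{DEGREE[adj]} {core}"

for adj in ("strong", "weak"):
    print(adj, "will:", semantic_solution(adj), "|", combinability_solution(adj))
```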

<…>Let us finally consider the pragmatics of the sign. It covers a wide range of phenomena, from the expressive elements of meaning, which at various times and by various authors have been called Gefühlswert, feeling, tone, valeur émotive, semantic associations, associative features, connotations, etc., to those modal components of meaning (relating not to the situation described but to the situation of communication) which A. Wierzbicka has described as the modal frame of the utterance and C. Fillmore as presuppositions. All these features share the property of characterizing the attitude of the speaker or of the addressee of the message to the reality described by the sign. The different pragmatic elements must, however, evidently be recorded in different zones of the description of the sign.

Let us begin with semantic associations, or connotations - those elements of pragmatics which reflect the cultural ideas and traditions associated with a word, the practice of using the corresponding thing that prevails in the given society, and many other extra-linguistic factors. They are highly capricious and differ greatly between words of different languages, or even of one language, that have the same or similar meanings. With the word ishak 'donkey', for example, is associated the idea of readiness to work uncomplainingly (cf. he works like a donkey; a good donkey; I'm not a donkey, to pull for everybody), while with the word osyol 'ass' - its exact synonym in the basic meaning - is associated the idea of stubbornness and stupidity (stubborn, or stupid, as an ass; What an ass you are!; What an ass!). The noun sobaka 'dog' has the connotations of a hard life (a dog's life, to live in dog-like conditions), of devotion (to look with a dog's eyes) and of badness (Oh you dog!, a dog of a job); the noun pyos 'dog, cur' - of servile devotion (the watchdog of tsarism) and of badness (son of a cur); the noun suka 'bitch' - of badness (bitch's children); finally, the noun kobel' 'male dog' - of lust (When will you come to your senses, you damned kobel'?).

Such features, although they do not enter directly into the semantics of a word, are of prime interest to it, because in many cases it is on their basis that the word is regularly metaphorized, enters into similes and participates in word formation and other linguistic processes. As a result, a feature which is associative and pragmatic in one lexical meaning acts as an essential, semantic one in another. This is the case, for example, with the verbs cut and saw. For all the external similarity of the actions they denote (down to the reciprocating movement of a sharp instrument over an object with the aim of dividing the whole object, or its surface, into parts), entirely different connotations are associated with them: sharpness and pain for the verb cut, monotony and tedium for the verb saw. This is attested by their figurative meanings: the light cuts one's eyes, a cutting pain in one's side, the cacophony cuts one's ears, as opposed to She is forever sawing away at (nagging) him. It is interesting that in the very rich list of kinds of pain - cutting, stabbing, shooting, breaking, pulling, burning, smarting, aching, etc. - there is no sawing pain. Likewise, footman and servant are close synonyms in their direct meanings, but owing to the difference in connotations they diverge sharply in their figurative ones; cf. to surround oneself with lackeys and sycophants, but a servant of the people.

Connotations should be recorded in a special pragmatic, or connotative, zone of the corresponding dictionary entry and should serve as a support for the interpretation of those figurative meanings of the word which have no semantic features in common with its basic meanings.

As for those pragmatic elements of the sign which have been called the modal frame and which reflect the speaker's or the hearer's assessment of the situation described, they should, as A. Wierzbicka envisaged, be included directly in the interpretation of the word: Even A acted = 'Others acted; A acted; the speaker did not expect A to act'; a whole X (in sentences like He ate two whole watermelons, He is three whole years old, He brought as many as 10 books) = 'X, and the speaker considers that this is a lot'; only X (in sentences like He ate only two watermelons, He is only a captain, He brought only 10 books) = 'X, and the speaker considers that this is not much'. As we can see, a necessary element of the lexical meaning of all these words is the speaker's assessment of the probability of the situation; it is precisely this that forms the modal frame of the meaning in this case.

The meanings of other words implicitly contain a reference not to the speaker or the hearer but to a perceiver, an observer - another person who is likewise an outsider with respect to the direct participants in the situation described. Let us compare, for example, the phrases to come out of smth. and to come out from behind smth. in their basic spatial meanings. The use of the first is entirely independent of the observer's position relative to the moving object. One can say The boy came out of the room both when one is oneself in the room and when one is outside it (for example, in the corridor). Not so with the second phrase. The boy came out from behind the screen can be said only when the perceiver is not himself behind the screen and observes not the disappearance but the appearance of the boy. Therefore the interpretation of the phrase to come out from behind smth. and others like it must include, in some form, an indication of the position of the observer (perceiver) relative to the moving object and the obstacle. Such indications, too, are reasonably placed in the modal frame.

The introduction of a modal frame into the interpretation of course complicates it, but the loss of simplicity in this case reflects the real complexity, the multi-layered structure, of the object itself.

The difference between the semantics of a sign and that part of its pragmatics which, although included in the interpretation in the form of a modal frame, is an object of a fundamentally different nature manifests itself objectively. Let us note in particular that one and the same semantic difference gives rise to entirely different semantic relations between signs depending on whether it enters into the semantics of the signs or into their pragmatics (the modal frame): the opposition 'more' - 'less' gives rise to antonymy if it is included in the semantics of the signs; if it is included only in their pragmatics (see the interpretations of the words whole and only above), no antonymic relation arises.<…>

We can now make the concept of lexical meaning explicit: by the lexical meaning of a word we understand the semantics of the sign (the naive concept) together with that part of its pragmatics which is included in the modal frame of the interpretation. The lexical meaning of a word is revealed in its interpretation, which is a translation of the word into a special semantic language.<…>

Bibliography

Adamets 1968: P. Adamets. On the issue of modifications (modal transformations) with the meaning of necessity and possibility. – Československá rusistika. 1968. No. 2.

Apresyan 1968: Yu. D. Apresyan. On an experimental explanatory dictionary of the Russian language. – Questions of linguistics. 1968. No. 5.

Apresyan 1969: Yu.D.Apresyan. About the language for describing the meanings of words. – Izv. Academy of Sciences of the USSR. Ser. lit. and language 1969. No. 5.

Bally 1955: Ch. Bally. General linguistics and questions of the French language. M., 1955.

Barkhudarov 1973: L.S.Barkhudarov. On the issue of surface and deep structure of sentences. – Questions of linguistics. 1973. No. 3.

Bellert 1969: I.Bellert. Arguments and predicates in the logical-semantic structure of utterances. - Studies in syntax and semantics. Dordrecht; Holland, 1969.

Bierwisch 1967: M.Bierwisch. Some semantic universals of German adjectivals. - Foundations of language. International journal of language and philosophy. 1967. Vol. 3. No. 1.

Bogusławski 1970: A. Bogusławski. On semantic primitives and meaningfulness. – Signs, language and culture. The Hague: Mouton, 1970.

Brekle 1969: H.E.Brekle. Generative semantics vs. deep syntax. – Studies in syntax and semantics. Dordrecht; Holland, 1969.

Wierzbicka 1967a: A. Wierzbicka. Mind and body – from the semantic point of view. MIT. March 1967 (mimeograph).

Wierzbicka 1967b: A. Wierzbicka. Negation – a study in the deep grammar. MIT. March 1967 (mimeograph).

Wierzbicka 1969: A. Wierzbicka. Dociekania semantyczne. Wrocław; Warsaw; Krakow, 1969.

Weinreich 1963: U.Weinreich. On the semantic structure of language. – Universals of language. Cambridge (Mass.), 1963.

Gak 1966: V.G.Gak. Conversations about the French word (From the comparative lexicology of the French and Russian languages). M., 1966.

Gak 1971: V.G.Gak. The semantic structure of a word as a component of the semantic structure of a statement. – Semantic structure of the word. Psycholinguistic research. M., 1971.

Gak 1972: V.G.Gak. On the problem of semantic syntagmatics. – Problems of structural linguistics 1971. M., 1972.

Zholkovsky and Melchuk 1967: A.K.Zholkovsky, I.A.Melchuk. About semantic synthesis. – Problems of cybernetics. 1967. Issue 19.

Zholkovsky and Melchuk 1969: A.K.Zholkovsky, I.A.Melchuk. Towards the construction of a working language model “Meaning ⇔ Text”. – Machine translation and applied linguistics. 1969. Vol. 11.

Zolotova 1973: G.A. Zolotova. Essay on the functional syntax of the Russian language. M., 1973.

Iordanskaya 1972: L. N. Iordanskaya. Lexicographic description of Russian expressions denoting physical symptoms of feelings. – Machine translation and applied linguistics. 1972. Issue 14.

Klima 1965: E.S.Klima. Current developments in generative grammar. – Cybernetica. 1965. No. 2.

Quine 1953: W.Quine. From a logical point of view. Cambridge (Mass.), 1953.

Quine 1960: W.Quine. Word and object. N.Y., London, 1960.

Lyons 1967: J. Lyons. A note on possessive, existential and locative sentences. – Foundations of language. International journal of language and philosophy. 1967. Vol. 3. No. 4.

Lyons 1968: J. Lyons. An introduction to theoretical linguistics. Cambridge (England), 1968.

Lakoff 1968: G. Lakoff. Instrumental adverbs and the concept of deep structure. – Foundations of language. International journal of language and philosophy. 1968. Vol. 4. No. 1.

Lamb 1966: S.Lamb. Stratificational grammar. N.Y., 1966.

McCawley 1968b: J.D.McCawley. The role of semantics in a grammar. – Universals in linguistic theory. N.Y., 1968.

Mathiot 1968: M. Mathiot. An approach to the cognitive study of language. – International journal of American linguistics. 1968. Vol. 34. No. 1.

Melchuk 1968: I.A.Melchuk. The structure of linguistic signs and possible formal and semantic relationships between them. – Izv. Academy of Sciences of the USSR. Ser. lit. and language 1968. No. 5.

Melchuk 1974a: I.A.Melchuk. Experience in the theory of linguistic models “Meaning ⇔ Text”. M., 1974.

Melchuk 1974b: I.A.Melchuk. About one linguistic model of the “Meaning ⇔ Text” type. – Izv. Academy of Sciences of the USSR. Ser. lit. and language 1974. No. 5.

Mushanov 1964: Yu.A.Mushanov. Dependence of the choice of words on prior knowledge about the subject (based on conjunctions and particles). – Machine translation and applied linguistics. 1964. Vol. 8.

Pottier 1965: B.Pottier. La définition sémantique dans les dictionnaires. – Travaux de linguistique et de littérature... 1965. Vol. 3. No. 1.

Russell 1940: B. Russell. An inquiry into meaning and truth. N.Y., 1940.

Reichenbach 1947: H.Reichenbach. Elements of symbolic logic. N. Y., 1947.

Tarski 1948: A. Tarski. Introduction to logic and methodology of deductive sciences. M., 1948.

Tarski 1956: A.Tarski. Logic, semantics, metamathematics. Papers from 1923 to 1938. Oxford, 1956.

Tolstoy 1968: N.I. Tolstoy. Some problems of comparative Slavic semasiology. – Slavic linguistics. VI International Congress of Slavists: Reports of the Soviet delegation. M., 1968.

Fillmore 1969: Ch. J. Fillmore. Types of lexical information. – Studies in syntax and semantics. Dordrecht; Holland, 1969.

Hallig and Wartburg 1952: R. Hallig und W. Wartburg. Begriffssystem als Grundlage für die Lexikographie. Berlin, 1952.

Heller and Macris 1967: L. G. Heller, J. Macris. Parametric linguistics. The Hague; Paris, 1967.

Chomsky 1956: N.Chomsky. Three models for the description of language. – IRE Transactions on information theory. 1956. IT – 2. No. 3.

Chomsky 1957: N. Chomsky. Syntactic structures. – New in linguistics. M., 1962. Issue. 2.

Chomsky 1965: N.Chomsky. Aspects of the theory of syntax. Cambridge (Mass.), 1965.

Church 1960: A. Church. Introduction to mathematical logic. M., 1960.

Shaumyan 1971: S. K. Shaumyan. Philosophical questions of theoretical linguistics. M., 1971.

Shmelev 1966: D.N. Shmelev. On the analysis of the semantic structure of a word. – Zeichen und System der Sprache. Berlin, 1966. Band 3.

Shmelev 1969: D.N. Shmelev. Problems of semantic analysis of vocabulary (based on the Russian language): Author's abstract. diss. ... Dr. Philol. Sci. M., 1969.

Shcherba 1940: L. V. Shcherba. An essay in the general theory of lexicography. – Selected works on linguistics and phonetics. M., 1958. Vol. 1.


1 Bor most likely means here not 'a large dense coniferous forest' but 'a pine forest consisting of large trees'.

If a car (shirt, pen) which a certain person A uses for its intended purpose (drives, wears, writes with) belongs to him, we may speak of it as that person's own car (shirt, pen).

The meanings are formulated here very roughly.

In using this term one should bear in mind that there is an essential difference between grammatical and semantic agreement: a word A that agrees grammatically with B borrows certain meanings from the latter in the given text, whereas words A and B that agree with each other semantically do not acquire common semantic elements in the text but already possess them in the dictionary. It is indisputable, however, that the concept of agreement (the repetition of certain elements of linguistic information) can be generalized in such a way that grammatical and semantic agreement appear as special cases of it.

It should be added that the incompatibility of the non-purposive meaning with the forms of the imperative, with verbs meaning attempt, and so on is explained precisely by the fact that the latter include the meaning of purpose: an imperative, or inducement, for example, is a message about the speaker's wish that the addressee perform a certain action and an attempt to cause the addressee to perform it.

Role names are written with a capital letter so that the reader will not associate them with familiar semantic-syntactic concepts: C. Fillmore's usage corresponds neither to the etymology of the words nor to linguistic terminological tradition. Where the English and Russian predicates correspond exactly, the English examples are sometimes replaced by Russian ones.

For similar observations, see the works of Gak 1966: 256 et seq., Fillmore 1969 (see above), Zolotova 1973.

For the conclusions that follow it is important that all verbs with the meaning of position in space have a deep meaning of contact, which connects not an object and a place but two objects.

Considering the material presented in points 1 and 2, we may notice that in both cases certain combinability restrictions are imposed on the use of the word, cf. the incorrect *to chop ice against a stone, *to split flexible rubber, *a handful of cupboards. In the first case, however, the restrictions are semantically motivated and follow directly from the meaning of the word, while in the second they are not. Let us note, further, that restrictions of both kinds can be violated for stylistic purposes, cf. The rain walks along Tsvetnoy Boulevard, wanders about the circus... suddenly goes blind and loses its confidence (Yu. Olesha), The water muttered under the driftwood (K. Paustovsky) - violations of semantically motivated compatibility; The bus... rushed along at breakneck speed (A. Eisner), ...an open dun-coloured car drove up (M. Bulgakov) - violations of semantically unmotivated compatibility. A developed semantic theory should provide for the possibility of such violations and be able to predict the corresponding stylistic effects. By way of hypothesis we would suggest that the stylistic violation of a semantically motivated compatibility rule leads to metaphor or metonymy, while the violation of a purely combinatorial rule leads to various kinds of humorous effects.

An additional consideration here might be the following: the semantic solution squares badly with the fact that in Russian (and not only in Russian) there is not a single class of words for which the rule of deleting a repeated high-degree meaning would be indisputably valid. What is characteristic is rather the opposite: if each of two syntactically connected words has a high-degree meaning, the latter is, so to speak, doubled, cf. a very deep lake (not simply deep, but very deep). Thus, if alongside the general rule of doubling a repeated high-degree meaning we introduced into the system a corresponding rule of deletion, both rules would lose their generality, and the scope of each would have to be determined by numerous particular conditions.



