The study of meaning in natural language. Numerous contributors from both linguistics and philosophy have shaped the field.
‘Meaning’ is an elusive concept, which modern linguists tackle by analysing it into other fundamental ideas such as IMPLICATURE, MEANING-NN, sense and reference, and VALUE.
Also see: theories of meaning, LEXICAL SEMANTICS, STRUCTURAL SEMANTICS, truth-conditional semantics
J Lyons, Semantics (Cambridge, 1977)
In linguistics, semantics is the subfield that studies meaning. Semantics can address meaning at the levels of words, phrases, sentences, or larger units of discourse. One of the crucial questions which unites different approaches to linguistic semantics is that of the relationship between form and meaning.
Theories in linguistic semantics
Formal semantics seeks to identify domain-specific mental operations which speakers perform when they compute a sentence’s meaning on the basis of its syntactic structure. Theories of formal semantics are typically built on top of theories of syntax, such as generative syntax or combinatory categorial grammar, and provide a model theory based on mathematical tools such as typed lambda calculi. The field’s central ideas are rooted in early twentieth-century philosophical logic as well as later ideas about linguistic syntax. It emerged as a subfield in its own right in the 1970s after the pioneering work of Richard Montague and Barbara Partee, and it continues to be an active area of research.
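The compositional idea can be made concrete in a minimal sketch (the two-word lexicon, the model, and the example sentences below are invented for illustration): word meanings are modelled as individuals or functions, and the sentence’s meaning is computed by applying the one to the other, mirroring the syntax.

```python
# Minimal sketch of compositional, model-theoretic interpretation.
# Names denote individuals (type e); intransitive verbs denote functions
# from individuals to truth values (type <e,t>), here ordinary lambdas.

SLEEPERS = {"john"}  # toy model: the set of sleeping individuals

lexicon = {
    "John":   "john",                   # an individual
    "Mary":   "mary",                   # an individual
    "sleeps": lambda x: x in SLEEPERS,  # a function from individuals to bool
}

def interpret(subject, verb):
    """Sentence meaning = verb denotation applied to subject denotation."""
    return lexicon[verb](lexicon[subject])

print(interpret("John", "sleeps"))  # True
print(interpret("Mary", "sleeps"))  # False
```

The point of the sketch is that the truth value falls out of function application alone, with no sentence-specific rule: the same mechanism scales to larger fragments once the lexicon assigns each word a typed denotation.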
Conceptual semantics is an effort to explain properties of argument structure. Its underlying assumption is that the syntactic properties of phrases reflect the meanings of the words that head them. With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with differences in the syntactic structures a word can appear in. The theory proceeds by examining the internal structure of words; the small parts that make up this internal structure are termed semantic primitives.
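As a rough sketch of how decomposition into primitives can predict argument structure (the toy lexicon is invented, though the decomposition of "kill" as CAUSE(x, BECOME(NOT(ALIVE(y)))) is the textbook example in this literature), one can represent a verb’s meaning as a nested structure of primitives and read its arity off the variable slots:

```python
# Sketch: word meaning as a tree of semantic primitives.
# Counting the variable slots (x, y, ...) in the decomposition
# recovers how many syntactic arguments the verb takes.

TOY_LEXICON = {
    "kill": ("CAUSE", "x", ("BECOME", ("NOT", ("ALIVE", "y")))),
    "die":  ("BECOME", ("NOT", ("ALIVE", "y"))),
}

def arguments(tree):
    """Collect the variable slots appearing in a decomposition tree."""
    if isinstance(tree, str):
        return [tree] if tree in ("x", "y", "z") else []
    return [v for part in tree for v in arguments(part)]

print(arguments(TOY_LEXICON["kill"]))  # ['x', 'y'] -> two arguments (transitive)
print(arguments(TOY_LEXICON["die"]))   # ['y']      -> one argument (intransitive)
```

The shared BECOME(NOT(ALIVE(y))) core also encodes the intuition that "kill" means roughly "cause to die", which is exactly the kind of subtle meaning relation the theory ties to syntactic behaviour.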
Cognitive semantics approaches meaning from the perspective of cognitive linguistics. In this framework, language is explained via general human cognitive abilities rather than a domain-specific language module. The techniques native to cognitive semantics are typically used in lexical studies such as those put forth by Leonard Talmy, George Lakoff, Dirk Geeraerts, and Bruce Wayne Hawkins. Some cognitive semantic frameworks, such as that developed by Talmy, take syntactic structures into account as well. Modern researchers have linked semantic processing to Wernicke’s area of the brain, and it can be measured using the event-related potential (ERP), a rapid electrical response recorded with small disc electrodes placed on a person’s scalp.
Lexical semantics is a linguistic theory that investigates word meaning. On this view, the meaning of a word is fully reflected by its context: the meaning of a word is constituted by its contextual relations. A distinction is therefore made between degrees of participation and modes of participation. To draw this distinction, any part of a sentence that bears a meaning and combines with the meanings of other constituents is labelled a semantic constituent; semantic constituents that cannot be broken down into more elementary constituents are labelled minimal semantic constituents.
Various fields and disciplines have long contributed to cross-cultural semantics. Are words like love, truth, and hate universals? Is even the word sense – so central to semantics – a universal, or a concept entrenched in a long-standing but culture-specific tradition? These are the kinds of crucial questions discussed in cross-cultural semantics. Translation theory, ethnolinguistics, linguistic anthropology and cultural linguistics specialize in the field of comparing, contrasting, and translating words, terms and meanings from one language to another (see Herder, W. von Humboldt, Boas, Sapir, and Whorf). But philosophy, sociology, and anthropology have long-established traditions in contrasting the different nuances of the terms and concepts we use. And online encyclopaedias such as the Stanford Encyclopedia of Philosophy (https://plato.stanford.edu) and, increasingly, Wikipedia itself have greatly facilitated the possibilities of comparing the background and usages of key cultural terms. In recent years the question of whether key terms are translatable or untranslatable has increasingly come to the fore of global discussions, especially since the publication of Barbara Cassin’s Dictionary of Untranslatables: A Philosophical Lexicon in 2014.
Computational semantics is focused on the processing of linguistic meaning. To this end, concrete algorithms and architectures are described. Within this framework, the algorithms and architectures are also analysed in terms of decidability, time/space complexity, the data structures they require, and communication protocols.
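A minimal sketch of the kind of algorithm this involves (the tiny model and the predicate names are invented): checking a quantified statement such as "every student sleeps" against a finite model is a concrete procedure whose cost can be analysed, here linear in the size of the domain.

```python
# Sketch: evaluating simple quantified statements against a finite model.
# For a domain of size n, each check below runs in O(n) time.

model = {
    "student": {"ann", "bob"},
    "sleeps":  {"ann", "bob", "cat"},
}

def every(restrictor, scope):
    """'Every R is S' holds iff the R-set is a subset of the S-set."""
    return model[restrictor] <= model[scope]

def some(restrictor, scope):
    """'Some R is S' holds iff the R-set and S-set intersect."""
    return bool(model[restrictor] & model[scope])

print(every("student", "sleeps"))  # True: every student is among the sleepers
print(some("sleeps", "student"))   # True: some sleeper is a student
```

Decidability is immediate here because the model is finite; the interesting complexity questions arise once models grow or quantifiers nest.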
Many of the formal approaches to semantics in mathematical logic and computer science originated in early twentieth century philosophy of language and philosophical logic. Initially, the most influential semantic theory stemmed from Gottlob Frege and Bertrand Russell. Frege and Russell are seen as the originators of a tradition in analytic philosophy to explain meaning compositionally via syntax and mathematical functionality. Ludwig Wittgenstein, a former student of Russell, is also seen as one of the seminal figures in the analytic tradition. All three of these early philosophers of language were concerned with how sentences expressed information in the form of propositions and with the truth values or truth conditions a given sentence has in virtue of the proposition it expresses.
In present-day philosophy, the term “semantics” is often used to refer to linguistic formal semantics, which bridges both linguistics and philosophy. There is also an active tradition of metasemantics, which studies the foundations of natural language semantics.
In computer science, the term semantics refers to the meaning of language constructs, as opposed to their form (syntax). According to Euzenat, semantics “provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared.”
The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly.
For instance, the following statements use different syntaxes, but cause the same instructions to be executed, namely, perform an arithmetical addition of ‘y’ to ‘x’ and store the result in a variable called ‘x’:
Statement                  Languages
---------                  ---------
x += y                     C, C++, C#, Java, JavaScript, Python, Ruby, etc.
$x += $y                   Perl, PHP
x := x + y                 Ada, ALGOL, ALGOL 68, BCPL, Dylan, Eiffel, J, Modula-2,
                           Oberon, OCaml, Object Pascal (Delphi), Pascal, SETL,
                           Simula, Smalltalk, Standard ML, VHDL, etc.
MOV EAX,[y]
ADD [x],EAX                Assembly languages: Intel 8086
ldr r2, [y]
ldr r3, [x]
add r3, r3, r2
str r3, [x]                Assembly languages: ARM
LET X = X + Y              BASIC: early dialects
x = x + y                  BASIC: most dialects; Fortran, MATLAB, Lua
Set x = x + y              Caché ObjectScript
ADD Y TO X.                ABAP
ADD Y TO X GIVING X        COBOL
set /a x=%x%+%y%           Batch
(incf x y)                 Common Lisp
/x y x add def             PostScript
y @ x +!                   Forth
Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic:
- Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
- Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
- Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
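The operational view in particular lends itself to a direct sketch (the AST encoding below is invented for illustration, not taken from any particular formalism): the meaning of the statement `x := x + y` from the table above is the state transition it induces, which a small big-step interpreter can compute.

```python
# Sketch of operational semantics: a construct's meaning is the state
# transition it induces. A state is a mapping from variable names to values.

def evaluate(expr, state):
    """Evaluate an arithmetic expression in a given state."""
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):        # a variable name
        return state[expr]
    op, left, right = expr           # ("+", e1, e2)
    if op == "+":
        return evaluate(left, state) + evaluate(right, state)
    raise ValueError(f"unknown operator: {op}")

def execute(stmt, state):
    """Big-step execution: return the state after running one statement."""
    if stmt[0] == "assign":          # ("assign", name, expr)
        _, name, expr = stmt
        new_state = dict(state)
        new_state[name] = evaluate(expr, state)
        return new_state
    raise ValueError(f"unknown statement: {stmt[0]}")

# The table's running example, x := x + y, as a state transition:
before = {"x": 2, "y": 3}
after = execute(("assign", "x", ("+", "x", "y")), before)
print(after)  # {'x': 5, 'y': 3}
```

A denotational account would instead assign the statement the mathematical function from states to states that `execute` happens to compute, and an axiomatic account would record only assertions such as: if x = a and y = b beforehand, then x = a + b afterwards.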
The Semantic Web refers to the extension of the World Wide Web by embedding added semantic metadata, using semantic data modeling techniques such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). On the Semantic Web, terms such as semantic network and semantic data model are used to describe particular types of data model characterized by the use of directed graphs, in which the vertices denote concepts or entities in the world and their properties, and the arcs denote relationships between them. These can formally be described as description logic concepts and roles, which correspond to OWL classes and properties.
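For instance, a directed graph of this kind might be written in RDF’s Turtle syntax (a minimal sketch; the names and the example.org namespace below are hypothetical):

```turtle
@prefix ex: <http://example.org/> .

# Vertices (ex:Alice, ex:Berlin) denote entities; each predicate is an arc.
ex:Alice  a           ex:Person ;   # membership in a class (an OWL class)
          ex:livesIn  ex:Berlin ;   # a relationship between two entities
          ex:name     "Alice" .     # a literal-valued property
```

Each line encodes one subject–predicate–object triple, which is exactly an edge in the directed graph the paragraph describes.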