Computational Semantics

Explore how computational algorithms assign formal meaning to words, phrases, and sentences beyond surface-level syntax.

Semantics is the subfield of linguistics concerned with meaning. While syntax asks "Is this sentence grammatically well-formed?", semantics asks the harder question: "What does this sentence actually mean?"

Syntactically Valid, Semantically Odd

"Colorless green ideas sleep furiously."

The grammar is flawless, yet the sentence has no coherent real-world meaning: it violates our semantic model of reality (green things cannot be colorless; ideas do not sleep). Chomsky coined this example in 1957 precisely to show that syntactic well-formedness and meaning can come apart.

Semantically Rich

"The quick fox jumped over the lazy dog."

Valid grammar AND a clearly grounded, compositional meaning. We can visualize it, react to it, and reason about its truth.
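The contrast between the two sentences can be made concrete with a toy selectional-restriction check: each verb lists the semantic features its subject must carry, and a mismatch flags the sentence as semantically odd. The feature inventory and the `check_subject` helper below are invented for illustration, not taken from any real library.

```python
# Toy selectional restrictions: each verb constrains the semantic
# features its subject must carry. All entries are illustrative.
VERB_RESTRICTIONS = {
    "sleep": {"animate"},   # only animate things can sleep
    "jump":  {"animate"},
}

NOUN_FEATURES = {
    "fox":  {"animate", "concrete"},
    "dog":  {"animate", "concrete"},
    "idea": {"abstract"},   # ideas are not animate
}

def check_subject(noun, verb):
    """Return True if the noun satisfies the verb's subject restrictions."""
    required = VERB_RESTRICTIONS.get(verb, set())
    return required <= NOUN_FEATURES.get(noun, set())

print(check_subject("fox", "jump"))    # True: foxes are animate
print(check_subject("idea", "sleep"))  # False: "ideas sleep" is odd
```

Real systems learn such constraints from corpora rather than hand-coding them, but the principle is the same: meaning imposes constraints that grammar alone does not.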

Core Areas of Computational Semantics

1. Lexical Semantics

Studies the meaning of individual words and how they relate to each other.

  • Synonymy: "big" ≈ "large"
  • Antonymy: "hot" ↔ "cold"
  • Hyponymy (Is-A): "poodle" IS-A "dog"
  • Polysemy: "bank" (river bank vs. savings bank)
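The four relations above can be sketched with a tiny hand-built lexicon; every entry here is invented for illustration (production systems typically draw these relations from a resource like WordNet instead).

```python
# A hand-built toy lexicon illustrating the four lexical relations.
SYNONYMS = {"big": {"large"}, "large": {"big"}}
ANTONYMS = {"hot": "cold", "cold": "hot"}
IS_A = {"poodle": "dog", "dog": "animal"}          # hyponym -> hypernym
SENSES = {"bank": ["river bank", "savings bank"]}  # polysemy: several senses

def is_a(word, category):
    """Follow the IS-A chain: a poodle is a dog, and therefore an animal."""
    while word in IS_A:
        word = IS_A[word]
        if word == category:
            return True
    return False

print("large" in SYNONYMS["big"])  # True: synonymy
print(ANTONYMS["hot"])             # cold: antonymy
print(is_a("poodle", "animal"))    # True: hyponymy is transitive
print(len(SENSES["bank"]))         # 2: two distinct senses of "bank"
```

Note that hyponymy is transitive: `is_a` walks the chain upward, so "poodle" inherits everything that holds of "dog" and "animal".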

2. Compositional Semantics

The meaning of a phrase is built from the meanings of its parts.

"Big dog" = Meaning(big) + Meaning(dog)

The Principle of Compositionality (Frege's Principle) states that the meaning of a whole sentence is determined by the meanings of its constituents and the rules used to combine them.
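Compositionality can be sketched in a model-theoretic style: a noun denotes a set of entities, an adjective denotes a function from sets to sets, and the phrase meaning is the adjective's function applied to the noun's set. The tiny model below (entities, kinds, and sizes) is invented purely for illustration.

```python
# A tiny model: nouns denote sets of entities, adjectives denote
# functions from sets to sets. All entities and sizes are invented.
ENTITIES = {
    "rex":  {"kind": "dog", "size": 9},
    "fifi": {"kind": "dog", "size": 2},
    "tom":  {"kind": "cat", "size": 8},
}

def noun(kind):
    """Denotation of a noun: the set of entities of that kind."""
    return {name for name, e in ENTITIES.items() if e["kind"] == kind}

def big(denotation):
    """Denotation of 'big': keeps only the large members of a set."""
    return {name for name in denotation if ENTITIES[name]["size"] > 5}

# Meaning("big dog") = Meaning(big)(Meaning(dog))
print(big(noun("dog")))  # {'rex'}: the big dogs in this model
```

Treating the adjective as a function, rather than simply "adding" meanings, is what lets the same word combine sensibly with different nouns: `big(noun("cat"))` picks out the big cats instead.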

Semantic Role Labeling (SRL)

SRL is the task of assigning semantic roles to words in a sentence — it answers Who did What to Whom, Where, When, and How?

SRL with AllenNLP (Example)
from allennlp.predictors.predictor import Predictor

# Load a pretrained BERT-based SRL model (downloads ~1 GB on first run)
predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/structured-prediction-srl-bert.2020.12.15.tar.gz"
)

result = predictor.predict(
    sentence="John carefully gave the book to Mary."
)

# result["verbs"] holds one entry per predicate, each with BIO-style
# role tags (e.g. ARG0 = giver, ARG1 = thing given, ARG2 = recipient)
for verb in result["verbs"]:
    print(verb["description"])