Computational Linguistic Notes

NLP (LINGUISTICS)

MODULE 1:

1. Natural Language Processing (NLP)

Overview
NLP: Field combining linguistics, artificial intelligence, computer science, and
cognitive science to enable computers to process, understand, and generate
human language.

Two main areas: Processing written text and spoken language.

Point-to-Point Notes
Objectives:

Enable natural interaction between humans and computers (via text/speech).

Extract, understand, and generate language data for applications.

Key Steps in NLP Pipeline:

1. Text/Spoken Data Acquisition: Obtain raw text or speech input.

2. Preprocessing: Tokenization, cleaning, removing stopwords, stemming/lemmatization.

3. Morphological Analysis: Identify the structure of words and their smallest units (morphemes).

4. Syntactic Analysis (Parsing): Analyze sentence structure using grammatical rules.

5. Semantic Analysis: Assign meaning to syntactically valid sentences.

6. Discourse Integration: Relate sentence meaning to wider context.

7. Pragmatic Analysis: Interpret intended meaning considering real-world
knowledge.
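The pipeline steps above can be sketched as a chain of functions. This is a toy illustration only: every function body below is an invented stand-in (not a real NLP component), and the later stages (discourse, pragmatics) are omitted for brevity.

```python
# Minimal sketch of the NLP pipeline as a chain of functions.
# All stage implementations are toy placeholders for illustration.

def preprocess(text):
    # Tokenize on whitespace and lowercase everything.
    return text.lower().split()

def morphological_analysis(tokens):
    # Toy rule: strip a plural "-s" from longer words.
    return [t[:-1] if t.endswith("s") and len(t) > 3 else t for t in tokens]

def syntactic_analysis(tokens):
    # Pretend parse: wrap the tokens in a flat sentence node.
    return ("S", tokens)

def semantic_analysis(parse):
    # Pretend semantics: treat the second token as the predicate.
    _, tokens = parse
    return {"predicate": tokens[1] if len(tokens) > 1 else None, "tokens": tokens}

def nlp_pipeline(text):
    tokens = preprocess(text)
    tokens = morphological_analysis(tokens)
    parse = syntactic_analysis(tokens)
    return semantic_analysis(parse)

print(nlp_pipeline("Dogs bark loudly"))
```

Each stage consumes the previous stage's output, which is the key design point of the pipeline: stages can be developed and swapped independently.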

Complexities in NLP:

Ambiguity (lexical, syntactic, semantic)

Context-dependence (meaning changes with usage)

One-to-many, many-to-one mappings

Handling of noisy input, errors, and missing data.

NLP Flowchart
Text/Speech Input
↓
Preprocessing (Tokenization, Cleaning, Normalization)
↓
Morphological Analysis
↓
Syntactic Parsing
↓
Semantic Analysis
↓
Discourse Integration
↓
Pragmatic Interpretation
↓
Application Output (Translation, Information Extraction, Dialogue, etc.)

2. Computational Linguistics (CL)

Overview
Computational Linguistics: Interdisciplinary domain focused on computational
aspects of human language faculties—both in theory and application.

Two Components:

Theoretical CL: Formal models and simulation of language acquisition, comprehension, and production.

Applied CL: Building software for language processing (translation, speech recognition, etc.).

Point-to-Point Notes
Goals:

Develop computational models to process and analyze human language.

Understand linguistic phenomena via formal structures and algorithms.

Bridge human communication and machine processing gaps.

Interdisciplinary Nature:

Merges linguistics, mathematics, logic, AI, cognitive psychology, and engineering.

Applications in language technology, translation, speech recognition, and the cognitive sciences.

Areas of Focus:

Grammar formalisms (generative grammars, context-free grammars)

Parsing algorithms (top-down, bottom-up, chart parsers)

Word sense disambiguation

Lexical semantics and word relations (synonymy, antonymy, hyponymy, etc.)

Corpus linguistics: Use of large text datasets for model training and
evaluation.
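The parsing algorithms listed above (chart parsers in particular) can be illustrated with a minimal CKY recognizer over a toy context-free grammar in Chomsky normal form. The grammar and lexicon below are invented for illustration, not taken from any standard corpus.

```python
# Minimal CKY chart-parser sketch for a toy CFG in Chomsky normal form.

from collections import defaultdict

# Binary rules: (B, C) -> parents A such that A -> B C
BINARY = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
# Lexical rules: word -> preterminal symbols
LEXICON = {
    "the": {"Det"},
    "cat": {"N"},
    "mat": {"N"},
    "saw": {"V"},
}

def cky_parse(words):
    n = len(words)
    chart = defaultdict(set)  # chart[(i, j)] = symbols spanning words[i:j]
    for i, w in enumerate(words):
        chart[(i, i + 1)] = set(LEXICON.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # try every split point
                for b in chart[(i, k)]:
                    for c in chart[(k, j)]:
                        chart[(i, j)] |= BINARY.get((b, c), set())
    return "S" in chart[(0, n)]  # grammatical iff S spans the whole input

print(cky_parse("the cat saw the mat".split()))  # True
print(cky_parse("cat the saw".split()))          # False
```

This is the bottom-up style mentioned in the list: small spans are recognized first, then combined into larger constituents.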

CL Flowchart
Linguistic Data Collection
↓
Formal Representation (Grammar, Description)
↓
Algorithm Design (Parsing, Generation)
↓
Model Implementation (Theoretical Models)
↓
Evaluation & Refinement (against corpora/cognitive studies)
↓
Application in NLP Tools/Systems

3. Applications of NLP and CL

Major Applications
Machine Translation: Automated translation between different languages
(e.g., Google Translate, SYSTRAN).

Speech Recognition & Synthesis: Converting speech to text and vice versa.

Information Extraction: Identifying and categorizing entities and relationships in text (e.g., named entity recognition).

Text Summarization: Creating concise summaries from longer texts.

Text Categorization/Indexing: Sorting and classifying documents (e.g., spam detection).

Information Retrieval: Search engines that index and retrieve documents.

Spell & Grammar Check: Automated linguistic correction tools.

Dialogue Systems & Chatbots: Conversational agents (e.g., Siri, Alexa, ELIZA).

Report Generation: Automated description of database changes.

Applications Flowchart
Raw Text/Speech
↓
NLP Pipeline (as above)
↓
Task-Specific Processing Module:
  Machine Translation
  Information Extraction
  Summarization
  Dialogue Management
↓
End-User Application (translator, bot, summarizer, etc.)

4. Comparison Table
| Aspect | NLP | Computational Linguistics (CL) |
| --- | --- | --- |
| Main Focus | Practical language processing by computers | Modeling language computationally and theoretically |
| Emphasis | Application/software development | Formal and cognitive linguistic theories |
| Typical Tasks | Translation, chatbots, spell-check, search engines | Parsing algorithms, grammar models |
| Root Disciplines | AI, Computer Science | Linguistics, Logic, Computer Science |
| Typical Users | Engineers, product developers | Linguists, researchers, AI scientists |
| Research Orientation | More applied | Both applied and theoretical |

Simple Comparison Table: NLP vs. Computational Linguistics (CL)

| Aspect | NLP (Natural Language Processing) | Computational Linguistics (CL) |
| --- | --- | --- |
| What it does | Makes computers understand/use language | Studies how language works using computers |
| Main goal | Build real tools (like apps, chatbots) | Understand language rules and patterns |
| Examples of work | Translators, voice assistants, spell-checkers | Analyzing grammar, creating language models |
| Who works on it | Engineers, tech designers | Linguists, researchers, language experts |
| Where it comes from | Computer science, artificial intelligence | Linguistics, computer science, logic |
| Type of work | Mostly making things that people can use | Both understanding language and making tools |
| Focus | Practical solutions | Exploring language and how it works |

5. Central Ideas and Points


NLP and CL are deeply intertwined; both are required for successful human-
computer language interaction.

Modern advances use a fusion of symbolic (rule-based) and statistical (machine learning) approaches.

Data-driven methods (corpus linguistics, neural models) are increasingly prominent for robust, scalable systems.

Human-centric applications aim to reduce the complexity of human–machine communication.

6. Historical Timeline (Highlights)


1950s: Machine Translation projects (Russian-English); birth of the field.

1960s: Formal grammars, ELIZA (first chatbot), procedural semantics.

1970s/80s: Symbolic NLP (SHRDLU, TAUM Meteo, SYSTRAN).

1990s: Emergence of statistical NLP (corpora, probabilistic models).

2000s–Present: Machine learning dominance; large corpora, deep learning,
integrated symbolic-statistical techniques.

Module 2: Phonetics – Detailed Notes with Examples

1. Automatic Speech Recognition (ASR) Model


Definition:

Automatic Speech Recognition (ASR) allows a computer system to convert spoken language into text using algorithms that mimic aspects of human listening.

How ASR Works:

1. Audio Input: The user’s voice is recorded as an audio waveform.

2. Preprocessing: The system reduces noise and normalizes the signal.

3. Feature Extraction: The audio signal is converted into acoustic features, which are matched to phonemes (basic sound units). English has about 44 distinct phonemes.

4. Statistical Analysis: Probabilistic models match phoneme sequences to
words using context.

5. Output: Recognized speech is shown as text or triggers a response/action.
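Step 4 (statistical matching) can be sketched with a toy pronunciation dictionary. Real ASR systems score phoneme sequences with probabilistic models (HMMs or neural networks); here, string similarity via edit operations stands in for acoustic likelihood, and the dictionary entries are invented for illustration.

```python
# Toy sketch of phoneme-sequence-to-word matching.
# SequenceMatcher similarity stands in for a real acoustic model score.

from difflib import SequenceMatcher

# Hypothetical pronunciation dictionary (IPA-like strings).
PRONUNCIATIONS = {
    "cat": "kæt",
    "cap": "kæp",
    "cut": "kʌt",
    "mat": "mæt",
}

def best_word(observed_phonemes):
    # Pick the dictionary word whose pronunciation is most similar
    # to the observed phoneme string.
    return max(
        PRONUNCIATIONS,
        key=lambda w: SequenceMatcher(None, PRONUNCIATIONS[w], observed_phonemes).ratio(),
    )

print(best_word("kæt"))  # cat
print(best_word("mæt"))  # mat
```

A real recognizer would also weight candidates by language-model context, which is what lets "context" disambiguate similar-sounding words.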

Types of ASR:

Directed Dialogue: The user can only reply with specific, recognized
commands.

Example: “Say ‘balance’ for account balance.”

Natural Language (NLP) Systems: Open-ended interaction; handles full sentences and context.

Example: “What is the weather forecast for today?”

ASR Learning Mechanisms:

Human Tuning: Engineers update software vocabulary and behavior by analyzing usage patterns.

Active Learning: The ASR adapts dynamically by learning from new user input
and correcting mistakes.

Popular ASR Tools:

Kaldi: Open-source C++ toolkit; widely used in research, supports deep neural networks and various training algorithms.

CMU Sphinx: Developed at Carnegie Mellon; includes tools for creating, training, and deploying ASR models.

2. Transcription
Definition:

Transcription converts spoken words into written language, often using a phonetic
alphabet for accuracy.
Types:

Orthographic Transcription: Writes out words as they are spelled.

Example: “The cat sat on the mat.”

Phonetic Transcription: Uses specialized symbols (like the IPA) to detail precisely how words are pronounced.

Example: “The cat sat on the mat” → /ðə kæt sæt ɑn ðə mæt/

IPA Example – English Sentence:

| Sentence | IPA Transcription |
| --- | --- |
| I wake up. | /aɪ weɪk ʌp/ |
| I like to play. | /aɪ laɪk tu pleɪ/ |
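A word-level phonetic transcriber can be sketched as a dictionary lookup. The IPA strings below are copied from the examples above; the angle-bracket fallback for out-of-vocabulary words is my own convention, and real transcription also needs rules for connected speech.

```python
# Minimal sketch: phonetic transcription by dictionary lookup.

IPA = {"i": "aɪ", "wake": "weɪk", "up": "ʌp", "like": "laɪk", "to": "tu", "play": "pleɪ"}

def transcribe(sentence):
    # Look up each word (case-insensitive, trailing period stripped);
    # unknown words are kept in angle brackets.
    return " ".join(IPA.get(w.lower().strip("."), f"<{w}>") for w in sentence.split())

print(transcribe("I wake up."))       # aɪ weɪk ʌp
print(transcribe("I like to play.")) # aɪ laɪk tu pleɪ
```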

3. Core Concepts in Speech Sounds

What is Sound?
Sound: Vibrations traveling as waves through a medium, perceived by the
brain.

Pitch: The frequency (Hz) of a sound; high pitch = high frequency.

Amplitude: Sound wave height; corresponds to loudness.

Tone: The steady pitch of a sound; in tone languages, changing tone can alter
meaning.

Example: Mandarin “ma” can mean mother, horse, scold, or hemp depending on tone.

Intonation: Variation in pitch across phrases or sentences, indicating emotion or grammatical structure.

Example: “You’re coming?” (rising intonation = question)

Amplitude (Loudness)
^
| /\
| / \ /\
| /\ / \ /\ / \ /\
| / \/ \/ \/ \ / \
|
/_____________________________> Time
<- ->
Wavelength (determines pitch)

Example Table: Pitch and Tone

| Word | Tone | Meaning |
| --- | --- | --- |
| mā (妈) | High | Mother [Mandarin] |
| má (麻) | Rising | Hemp [Mandarin] |

Consonant and Vowel Structure

🔹 1. Consonants
✅ Definition:
Consonants are speech sounds produced by obstructing airflow in the vocal
tract.

🔍 Key Features:
Produced with restriction or closure.

Identified by their place of articulation.

🧭 Places of Articulation:

| Place | Description | Example Sounds |
| --- | --- | --- |
| Bilabial | Both lips | /p/, /b/, /m/ |
| Labiodental | Lip and teeth | /f/, /v/ |
| Interdental | Between teeth | /θ/, /ð/ |
| Alveolar | Tongue to ridge behind teeth | /t/, /d/, /s/ |
| Palatal | Tongue to hard palate | /ʃ/, /ʒ/, /j/ |
| Velar | Back of tongue to soft palate | /k/, /g/, /ŋ/ |
| Glottal | Vocal cords only | /h/, /ʔ/ |

🔹 2. Vowels
✅ Definition:
Vowels are produced without significant constriction of airflow in the vocal tract.

🔍 Key Features:
Shaped by tongue position and lip rounding.

All vowels are voiced (vocal cords vibrate).

🗺️ Classified by:
| Feature | Types |
| --- | --- |
| Tongue Height | High / Mid / Low |
| Tongue Position | Front / Central / Back |
| Lip Shape | Rounded / Unrounded |

Phone and Allophone


🔹 1. Phone
A phone is any distinct speech sound, regardless of whether it changes the
meaning of a word.

It is the actual sound produced and heard.

Phones are the physical realization of sounds.

🧾 Example:
[t] in tap and [tʰ] in top are two different phones.

🔹 2. Allophone
An allophone is a variant of a phoneme that occurs in specific contexts.

It does not change the meaning of a word.

Allophones are in complementary distribution, i.e., they never occur in the same environment.

🧠 Key Point:
Allophones belong to the same phoneme but sound slightly different depending
on context.

Complementary and Contrastive Distribution

🔹 1. Complementary Distribution
✅ Definition:
Two sounds are in complementary distribution when they never occur in the
same phonetic environment.

📌 Key Features:
They are predictable variants of the same phoneme (i.e., allophones).

Substituting one for the other does not change the word’s meaning.

Based on context (before/after certain sounds, in specific positions).

🧾 Example:
[pʰ] (aspirated) in pin vs. [p] (unaspirated) in spin

→ Both are allophones of /p/, never used in the same context.

🔸 2. Contrastive Distribution
✅ Definition:
Two sounds are in contrastive distribution if they occur in the same environment
but change the meaning of the word.

📌 Key Features:
They are separate phonemes.

Substitution leads to a minimal pair (words that differ by only one sound).

🧾 Example:
/p/ vs. /b/ in:

pat vs. bat → Different words, different meanings → Contrastive

🧠 Quick Comparison Table


| Feature | Complementary Distribution | Contrastive Distribution |
| --- | --- | --- |
| Occurrence | Different environments | Same environment |
| Sound Type | Allophones (same phoneme) | Separate phonemes |
| Meaning Change | ❌ No | ✅ Yes |
| Example | [tʰ] in top, [t] in stop | /p/ in pat, /b/ in bat |
| Minimal Pair Exists? | ❌ No | ✅ Yes |


🧱 1. Syllable Structure
✅ Definition:
A syllable is a basic unit of speech sound that typically contains a vowel sound,
with or without surrounding consonants.

🔹 Structure Components:
Syllable = Onset + Nucleus + Coda

| Component | Description | Example (Word: "cat") |
| --- | --- | --- |
| Onset | Initial consonant(s) (optional) | [k] |
| Nucleus | Core vowel sound (mandatory) | [æ] |
| Coda | Final consonant(s) (optional) | [t] |

Onset and coda are optional, but nucleus is required.

Some syllables have no onset or no coda (e.g., “a”, “me”).
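The onset/nucleus/coda split can be sketched in a few lines. This toy version works on spelling and treats the letters a/e/i/o/u as the vowel nucleus, which is a simplifying assumption; a real syllabifier operates on phonemes.

```python
# Sketch: split a simple one-syllable word into onset, nucleus, coda.
# Toy rule: the nucleus is the span from the first to the last vowel letter.

VOWELS = set("aeiou")

def split_syllable(syl):
    first = next((i for i, ch in enumerate(syl) if ch in VOWELS), None)
    if first is None:
        raise ValueError("a syllable needs a vowel nucleus")
    last = max(i for i, ch in enumerate(syl) if ch in VOWELS)
    # Onset = consonants before the nucleus; coda = consonants after it.
    return syl[:first], syl[first:last + 1], syl[last + 1:]

print(split_syllable("cat"))  # ('c', 'a', 't')
print(split_syllable("me"))   # ('m', 'e', '') -- no coda
print(split_syllable("a"))    # ('', 'a', '') -- no onset, no coda
```

The empty strings for "me" and "a" mirror the point above: onset and coda are optional, the nucleus is not.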

🔊 2. Stress Pattern
✅ Definition:
Stress is the emphasis placed on a syllable within a word or phrase.

🔹 Types:
Word Stress: One syllable is more prominent in a word.

Example: TAble , aBOUT

Sentence/Phrase Stress: Certain words in a sentence are stressed for meaning.

Example: I didn’t say he stole the money (stress on "stole").

🔹 Features of a Stressed Syllable:


Louder

Longer duration

Higher pitch

📌 Stress Can Change Meaning:


Example:

'record (noun) vs re'cord (verb)

🧩 3. Minimal Pair
✅ Definition:
A minimal pair is a pair of words that differ by only one sound in the same
position but have different meanings.

🔹 Purpose:
Used to identify phonemes in a language.

🧾 Examples:
| Word 1 | Word 2 | Difference |
| --- | --- | --- |
| pin | bin | /p/ vs /b/ |
| cat | cut | /æ/ vs /ʌ/ |
| ship | sheep | /ɪ/ vs /iː/ |
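The minimal-pair test (same length, exactly one differing position) is easy to express in code. Note this sketch compares letter strings for simplicity; proper analysis compares phoneme sequences, so that ship/sheep (/ɪ/ vs /iː/) counts as one sound apart despite the spelling difference.

```python
# Sketch: do two words (as sequences of sounds) form a minimal pair?
# A minimal pair differs in exactly one position of equal-length sequences.

def is_minimal_pair(a, b):
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

print(is_minimal_pair(list("pin"), list("bin")))  # True  (/p/ vs /b/)
print(is_minimal_pair(list("cat"), list("dog")))  # False (three differences)
```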

1. What is Morphology?
1. Definition: Study of the internal structure of words and the rules for word
formation.

2. Key Focus: How morphemes (the smallest meaning‑bearing units) combine to form words and convey meaning.

2. Morphemes
Free Morphemes

Can stand alone as words (e.g., book, run, happy).

Bound Morphemes

Cannot stand alone; must attach to a stem (e.g., prefixes un-, pre-; suffixes -ing, -ness).

3. Types of Morphology
Morphological processes fall into two major categories:

A. Derivational Morphology
Purpose: Creates new words or changes a word’s part of speech/meaning.

Mechanism: Attaches derivational affixes (prefixes/suffixes) to stems.

Examples:

1. Prefixes

un- + happy → unhappy (changes adjective meaning to “not happy”)

re- + tell → retell (verb meaning “tell again”)

2. Suffixes

-able on read → readable (verb → adjective: “able to be read”)

-ness on kind → kindness (adjective → noun: “state of being kind”)

Note: Often changes word class (e.g., run (verb) → runner (noun) via -er).

B. Inflectional Morphology
Purpose: Modifies a word to express grammatical features (tense, number,
degree) without changing its core meaning or part of speech.

Mechanism: Attaches inflectional suffixes, mostly at the end of words.

English Inflectional Suffixes:

1. Verbs

-s (3rd‑person singular present): he walk‑s

-ed (past tense): walk‑ed

-ing (present participle/gerund): eat‑ing

-en (past participle for some verbs): eat‑en

2. Nouns

-s (plural): cat‑s

-’s (possessive): Laura‑’s

3. Adjectives

-er (comparative): quick‑er

-est (superlative): quick‑est

Key Point: Does not change word class—just marks grammatical distinctions.

4. Why Morphology Matters


Linguistics: Reveals how meaning is structured and how words relate.

NLP Applications:

Stemming/Lemmatization use morphological rules to normalize text.

Improves tasks like search, translation, and text analysis by grouping related word forms.
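A stemmer that targets the inflectional suffixes listed above can be sketched as a suffix stripper. This is a crude illustration (real systems use Porter-style algorithms or dictionary-based lemmatization); the minimum-stem-length rule is an assumption to avoid mangling short words.

```python
# Toy stemmer: strip common English inflectional suffixes so related
# word forms group together. Longest suffixes are tried first.

SUFFIXES = ["ing", "est", "ed", "er", "s"]

def strip_inflection(word):
    for suf in SUFFIXES:
        # Assumed guard: keep at least a 3-letter stem.
        if word.endswith(suf) and len(word) - len(suf) >= 3:
            return word[:-len(suf)]
    return word

print([strip_inflection(w) for w in ["cats", "quicker", "walked", "eating"]])
# ['cat', 'quick', 'walk', 'eat']
```

This is exactly why stemming helps search: "cats", "cat", and "cat's" all reduce to one indexable form.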

5. Quick Reference Table


| Process | Affixes | Example | Changes Word Class? |
| --- | --- | --- | --- |
| Derivational | Prefix: un-, re-; Suffix: -able, -ness | un + happy → unhappy; kind + -ness → kindness | Often yes |
| Inflectional | Suffix: -s, -ed, -ing, -er, -est | cat + -s → cats; quick + -er → quicker | No |

🧠 B.F. Skinner – Behaviourist Theory of Language Acquisition
📌 Main Idea:
Language is learned through imitation, reinforcement, and conditioning,
similar to how animals learn behaviors.

🧪 Background:
Based on behaviourist psychology and experiments with animals (e.g., rats
and pigeons).

Introduced in his book "Verbal Behavior" (1957).

✅ Key Features:
1. Imitation:

Children imitate the language of parents/caregivers.

E.g., A child hears “mama” and tries to repeat it.

2. Positive Reinforcement:

Correct utterances are rewarded by praise, attention, or fulfilling needs.

E.g., Child says "milk" → gets milk → behavior is reinforced.

3. Negative Reinforcement:

Incorrect or meaningless speech is ignored or not rewarded.

Encourages child to improve speech attempts.

🧠 Example:
Child: “Ball” → Parent gives the ball → Child learns to associate word with the
object.

Over time, correct words are reinforced, and incorrect ones fade.

❌ Criticisms of Skinner’s Theory:


1. Overgeneralization / Virtuous Errors:

Children say “goed” or “drinked” instead of “went” or “drank”.

Shows rule-learning, not pure imitation.

2. Lack of explicit correction:

Parents rarely correct grammar. They focus more on truth or politeness.

Yet children still develop correct grammar.

3. Children can’t always repeat adult speech:

Example (McNeill, 1966):

Child: “Nobody don’t like me”

Mother: “Say ‘Nobody likes me’”

Child: “Nobody don’t likes me”

Shows child isn’t just copying but applying rules incorrectly.

4. Critical Period Evidence:

Case of Genie: Deprived of language till age 13.

She learned words but never fully mastered grammar.

Shows imitation alone is not sufficient.

🧬 Noam Chomsky – Innateness Theory of Language Acquisition

📌 Main Idea:
Children are born with an innate ability to acquire language.

Language learning is biologically programmed in the brain.

🧠 Key Concept: Language Acquisition Device (LAD)


A mental mechanism that helps children understand language structure.

Allows children to extract grammar rules from the language they hear.

🌐 Universal Grammar:
All human languages share basic rules (e.g., presence of nouns, verbs, word
order).

LAD helps child identify how their native language fits those universal rules.

🧠 Examples:
Child hears “played”, “worked” → applies rule → says “goed” (instead of
“went”).

These are “virtuous errors” → show active rule-building, not imitation.

✅ Supporting Evidence:
1. Virtuous Errors:

Children create grammatical forms they’ve never heard (e.g. “runned”).

2. Creole Languages:

In Suriname, children turned broken Dutch pidgin into a fully developed


creole.

Suggests children’s brains impose structure on limited input.

3. Sign Language Acquisition:

Deaf children develop grammatical sign languages, even without spoken input.

Nicaraguan Sign Language: Children created a new grammar system from informal signs.

4. Brain Studies:

Broca’s Area and Wernicke’s Area: dedicated language regions.

Stroke patients show specific language deficits, supporting a biological basis.

❌ Criticisms of Chomsky’s Theory:


1. Too Theoretical:

Based on theory, not real child observation.

2. Ignores Social Interaction:

Doesn’t consider how children learn through conversation, feedback, and


context.

Case of Jim (hearing child of deaf parents):

Watched TV and heard speech, but did not develop language until a
speech therapist interacted with him.

3. Doesn’t explain motivation:

LAD explains “how” language is learned but not why children want to
speak or communicate.

Natural Language Processing (NLP):
Detailed Notes
Overview
Natural Language Processing (NLP) is an interdisciplinary field combining linguistics, artificial intelligence (AI), computer science, and cognitive science. Its goal is to enable computers to process, understand, interpret, and generate human language, both written and spoken.
Main Application Areas:

Processing written text: Emails, books, websites, social media, etc.

Processing spoken language: Speech recognition, voice assistants, call center analytics.

Objectives
Facilitate natural, intuitive interaction between humans and machines using
text or speech.

Extract language information from data for analysis, automation, or business applications.

Understand linguistic structures and meanings for tasks like translation, summarization, search, and question answering.

Generate human-like language output for applications such as chatbots, dialogue systems, and content creation.

Key Steps in the NLP Pipeline

1. Data Acquisition
Text Data: Collect raw text from sources like documents, web pages, emails,
and chat logs.

Speech Data: Capture spoken language as audio recordings for processing.

2. Preprocessing
Tokenization: Divide text into sentences and words (called tokens).

Cleaning: Remove irrelevant characters, whitespace, or symbols.

Lowercasing: Standardize case for uniformity.

Stopword Removal: Eliminate common, low-information words ("the", "and").

Stemming/Lemmatization: Reduce words to their base or root form (e.g., "running" → "run").

Example:
Original: "NLP techniques are advancing rapidly!"

Tokenized: ["NLP", "techniques", "are", "advancing", "rapidly"]


After stopword removal: ["NLP", "techniques", "advancing",
"rapidly"]
After stemming: ["NLP", "techniqu", "advanc", "rapid"]
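The first two steps of this example can be reproduced with a few lines of standard-library Python. The punctuation handling and the tiny stopword set below are my own simplifications; the stemmed forms shown above ("techniqu", "advanc") come from a Porter-style stemmer, which a real system would pull from a library, so stemming is omitted here.

```python
# Sketch: tokenization and stopword removal for the example sentence.

import string

STOPWORDS = {"are", "the", "and", "is", "a"}  # tiny illustrative list

def tokenize(text):
    # Strip punctuation, then split on whitespace.
    cleaned = text.translate(str.maketrans("", "", string.punctuation))
    return cleaned.split()

tokens = tokenize("NLP techniques are advancing rapidly!")
print(tokens)    # ['NLP', 'techniques', 'are', 'advancing', 'rapidly']

content = [t for t in tokens if t.lower() not in STOPWORDS]
print(content)   # ['NLP', 'techniques', 'advancing', 'rapidly']
```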

3. Morphological Analysis
Identifies structure within words, such as prefixes, suffixes, and roots.

Recognizes morphemes (smallest meaning-carrying units).

Helps understand variations (walk, walked, walking) as forms of "walk".

4. Syntactic Analysis (Parsing)


Analyzes sentence structure using grammatical rules.

Identifies parts of speech (nouns, verbs, adjectives).

Determines sentence parse structure (subject, object, modifiers).

Produces parse trees or dependency diagrams for sentences.

Example Parse:

"The cat sat on the mat."

Subject: cat

Verb: sat

Prepositional phrase: on the mat
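The parse above can be written down as a nested tree of (label, children) pairs. The NP/VP/PP labels are assumed from standard phrase-structure conventions, not prescribed by the text.

```python
# Sketch: the example parse as a nested (label, children) tree.

parse = ("S",
         [("NP", [("Det", "The"), ("N", "cat")]),
          ("VP", [("V", "sat"),
                  ("PP", [("P", "on"),
                          ("NP", [("Det", "the"), ("N", "mat")])])])])

def leaves(node):
    # Collect the words at the leaves, left to right.
    label, children = node
    if isinstance(children, str):
        return [children]
    return [w for child in children for w in leaves(child)]

print(" ".join(leaves(parse)))  # The cat sat on the mat
```

Walking the tree recovers the original sentence, which is a quick sanity check that a parse covers its input.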

5. Semantic Analysis
Assigns meaning to syntactically valid sentences.

Resolves word senses (disambiguation).

Understands semantic relationships (who did what to whom).

May use knowledge bases or ontologies for deeper understanding.

Example:
"I saw the man with the telescope."

Semantic analysis clarifies: Did "I" use the telescope, or did "the man" have it?

6. Discourse Integration
Links meaning across sentences and paragraphs.

Tracks pronoun references (anaphora resolution: "she", "it").

Maintains context for coherent understanding.

Example:

Text: "Mary called John. He did not answer."


Discourse integration links "He" to "John".
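A very naive version of this anaphora resolution can be sketched as "link the pronoun to the most recent preceding name". This heuristic is my own simplification and fails often in practice (real resolution needs gender agreement, syntax, and world knowledge), but it handles the example above.

```python
# Sketch of a naive anaphora heuristic: replace "he"/"she" with the
# most recent preceding capitalized name seen in the text.

import re

def resolve_pronouns(text):
    last_name = None
    resolved = []
    for tok in text.split():
        word = tok.strip(".,")
        # Track capitalized words as candidate names (excluding pronouns).
        if re.fullmatch(r"[A-Z][a-z]+", word) and word.lower() not in {"he", "she", "the"}:
            last_name = word
        if word.lower() in {"he", "she"} and last_name:
            resolved.append(tok.replace(word, last_name))
        else:
            resolved.append(tok)
    return " ".join(resolved)

print(resolve_pronouns("Mary called John. He did not answer."))
# Mary called John. John did not answer.
```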

7. Pragmatic Analysis

Interprets intended meaning beyond literal text.

Considers real-world knowledge, situational context, emotion, politeness, and indirect speech acts.

Answers: Is "Can you open the window?" a request or just a capability question?

Complexities and Challenges in NLP

Ambiguity
Lexical Ambiguity: Single word, multiple meanings (e.g., "bank": riverbank vs. financial institution).

Syntactic Ambiguity: Sentence can be parsed in more than one way.

Semantic Ambiguity: Unclear or multiple possible meanings in context.

Pragmatic Ambiguity: The intention behind an utterance is unclear and can change based on the situation.
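Lexical disambiguation can be illustrated with a crude Lesk-style sketch: score each sense of "bank" by how many of its signature words appear in the surrounding context. The sense inventories below are invented for illustration.

```python
# Toy word-sense disambiguation for "bank": pick the sense whose
# signature words overlap most with the context (a Lesk-style sketch).

SENSES = {
    "financial institution": {"money", "deposit", "loan", "account"},
    "riverbank": {"river", "water", "fishing", "shore"},
}

def disambiguate(context_words):
    words = {w.lower() for w in context_words}
    return max(SENSES, key=lambda sense: len(SENSES[sense] & words))

print(disambiguate("I opened an account at the bank".split()))
# financial institution
print(disambiguate("we fished by the river bank".split()))
# riverbank
```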

Context-Dependence
Context can drastically change meaning: Words or sentences require
surrounding information for clarity.

Example: "She gave her dog food" (Did she give her dog some food, or did
she give food that belonged to her dog?).

One-to-Many / Many-to-One Mapping


A single word or phrase may correspond to multiple possible meanings (one-to-many).

Many different expressions may map to a single meaning (many-to-one).

Noisy Input and Missing Data


Handling typos, speech errors, incomplete sentences, or background noise
in audio.

Must be robust to errors and gaps in input.

Language Diversity
Enormous variation in grammar, word order, morphology, and idioms across languages complicates universal NLP solutions.

Applications of NLP
Machine translation (e.g., Google Translate)

Speech recognition (e.g., voice assistants like Siri, Alexa)

Text summarization

Chatbots and dialogue systems

Sentiment analysis

Information extraction and document classification

Summary Table: Main Steps in the NLP Pipeline


| Step | Description | Example Output |
| --- | --- | --- |
| Data Acquisition | Collect text or speech samples | Text document, audio file |
| Preprocessing | Tokenize, clean, remove stopwords, normalize words | ["NLP", "techniques"] |
| Morphological Analysis | Identify and analyze word roots and forms | "running" → root "run" |
| Syntactic Analysis | Determine grammar, parse structure | Parse tree, POS tags |
| Semantic Analysis | Assign meaning to phrases and sentences | Sense of "bank" in context |
| Discourse Integration | Track references and flow across multiple sentences | Resolving "he", "she" |
| Pragmatic Analysis | Interpret intended meaning using real-world and societal context | "Can you open the window?" |
