Introduction
The Language Instinct is a fascinating book by Steven Pinker, first published in 1994. He wrote it to satisfy the curiosity of anyone interested in language, drawing examples from pop culture, children, adults, and even ostentatious academic writers. The book argues for a universalist view of language grounded in cognitive science, and it relies heavily on Chomsky's theory of generative grammar, reminding us that a finite set of sounds and words can produce an infinite number of sentences. All of the chapters are interesting, but the chapter on mentalese stands out above all: "we have all had the experience of uttering or writing a sentence, then stopping and realizing that it wasn't exactly what we meant to say" (p. 57).
An Instinct to Acquire an Art
Language is in every one of us. By making noises with our mouths, we reliably cause precise new combinations of ideas to arise in each other's minds. This comes so naturally that we quickly forget how remarkable the feat is. The ability to read and write makes our communication more impressive still by bridging gaps of time, space, and acquaintanceship, but writing is an optional accessory; the pivot of verbal communication is the spoken language we acquire as children.
Some thirty-five years ago, a new field called "cognitive science" was born, merging tools from philosophy, computer science, linguistics, psychology, and neurobiology to explain the workings of human intelligence. Its recent insights into linguistic ability have revolutionary implications for our understanding of language, its role in human affairs, and our view of humanity. Conventional wisdom holds that language is humanity's key cultural invention, that the ability to use symbols separates us from other animals, that the different languages we speak lead their speakers to construe reality in different ways, that children learn to talk from role models and caregivers, and that the complexity of grammar is nurtured in schools, so that dropping educational standards and a degraded popular culture have caused a frightening decline in the average person's ability to construct a grammatical sentence.
These common opinions are wrong for one overarching reason: language is not a cultural artifact that we learn. It is a distinct piece of the biological makeup of our brains. Cognitive scientists have described language as a psychological faculty, a mental organ, a neural system, and a computational module, but the most appealing term is "instinct." Thinking of language as an instinct inverts the received wisdom that it is a cultural invention. The idea of language as a kind of instinct was first articulated in 1871 by Darwin, who noted that language has not been deliberately invented but has slowly developed through many steps, and who concluded that the language ability is "an instinctive tendency to acquire an art." Animals have instincts that seem mysterious to us; ours would appear just as mysterious to them.
Noam Chomsky supports the argument that language is like an instinct. He points to two fundamental facts about language. First, virtually every sentence a person utters or understands is a brand-new combination of words, so language cannot be a mere repertoire of responses. Second, children develop complex grammars rapidly and without formal instruction, and construct sentences they have never encountered; therefore children must possess an innate plan common to the grammars of all languages that tells them how to distill syntactic patterns from their parents' speech. At first this may seem absurd, but closer examination dispels the doubt. Chomsky's arguments about the nature of the language faculty are based on technical analyses of word and sentence structure, often couched in abstruse formalisms. The story in this book is therefore highly eclectic, and the best place to begin is to ask why anyone should believe that human language is part of human biology, an instinct, at all.
Chatterboxes
When Michael Leahy, an Australian prospector, first met the native highlanders of New Guinea, neither party could understand the other's speech, yet the highlanders were clearly jabbering in a full language, merely one unfamiliar to the visitors. The scene evokes first contacts throughout human history: whenever people have encountered one another, all of them have had language. No region ever served as a "cradle" of language from which it spread to previously languageless groups. The language spoken by Leahy's hosts turned out to be no mere jabber but a medium that could express abstract concepts, invisible entities, and complex trains of reasoning. The universality of complex language is a discovery that fills linguists with awe, and it is the first reason to suspect that language is not just another cultural invention but the product of a special human instinct. Cultural inventions vary widely across and within societies, but language does not vary in this way: there are Stone Age societies, but there is no such thing as a Stone Age language.
The myth that non-standard dialects of English are grammatically deficient is widespread, as in attitudes toward the speech of American black children or of different social classes and settings. In truth such speech is governed by grammatical rules as systematic as those of the standard dialect, which again suggests that language is innate. Complex language is universal because it is reinvented generation after generation. Whenever two groups with different languages meet and have no opportunity to learn each other's language, they develop a pidgin to communicate. The limitation of a pidgin is that much of the speaker's intention must be filled in by the listener. Sign language tells a similar story: deaf people were long isolated and had no shared sign language, but when the Sandinista government opened schools for the deaf in Nicaragua, a full, compact sign language developed.
Chomsky reasoned that if the logic of language is wired into children, then the first time they are confronted with a sentence containing two auxiliaries they should be able to turn it into a question with the proper wording. For language to be fully established as an instinct, it should also have an identifiable seat in the brain, though no grammar gene has yet been found. Specific Language Impairment (SLI) runs in families, which hints at a genetic basis even though the responsible DNA has not been identified. Conversely, some children with low general intelligence show intact grammar: laboratory tests confirm the impression of competence, since they understand complex sentences and fix up ungrammatical ones at normal levels. One may therefore be an intellectual yet a poor language user, or the reverse.
Mentalese
In George Orwell's vision, by 2050 the ultimate technology for thought control would be in place: the language Newspeak. The purpose of Newspeak was not only to provide a medium of expression for the world-view and mental habits proper to the devotees of Ingsoc [English Socialism], but to make all other modes of thought impossible. This was to be done partly by the invention of new words, but chiefly by eliminating undesirable words and by stripping such words as remained of unorthodox meanings, and so far as possible of all secondary meanings whatever. People commonly assume that words determine thoughts, as in Orwell's account of government euphemism, in complaints about sexist language, and in general semantics. The Sapir-Whorf hypothesis of linguistic determinism endorses this assumption, but it cannot be true, because thought cannot be the same thing as language.
Consider people thought not to have a language: they may be cut off from the verbal world, yet they display abstract forms of thinking, such as entertaining one another. Even languageless beings reason about time, space, and other abstract domains. Babies, monkeys, and human adults who report not thinking in words are all examples.
The languages that people speak are unsuited to serve as our internal medium of computation, because of ambiguity, lack of logical explicitness, co-reference, and those aspects of language that can only be interpreted in the context of a conversation or text, what linguists call "deixis." The conclusion is that people do not think in the language they speak but in a language of thought that probably resembles all these languages; mentalese must be richer in some ways and simpler in others. To know a language, then, is to know how to translate mentalese into strings of words and vice versa.
Newspeak could not succeed, for three reasons. First, mental life goes on independently of particular languages, so concepts of freedom and equality would remain thinkable even if they were nameless. Second, there are many more concepts than words, and since listeners must always charitably fill in what the speaker leaves unsaid, existing words would quickly gain new senses, perhaps even regain their original ones. Lastly, children are not content to imitate whatever input adults provide but create a complex grammar that goes beyond it; they would creolize Newspeak into a natural language, possibly in a single generation.
How Language Works
The essence of the language instinct is that it conveys news through sentences. The first principle is the arbitrariness of the sign, the wholly conventional pairing of sound with meaning. The second is that language makes infinite use of finite media: we use a code to translate between orders of words and combinations of thoughts. That code, or set of rules, is called a generative grammar. The principle underlying grammar is unusual in the natural world. A grammar is an example of a "discrete combinatorial system": a finite number of discrete elements (in this case, words) are sampled, combined, and permuted to create larger structures (in this case, sentences) with properties that are quite distinct from those of their elements.
The way language works, then, is that each person's brain contains a dictionary of words and the concepts they stand for, and a set of rules that combine the words to convey relationships among concepts. Grammar as a discrete combinatorial system has two important consequences. The first is the sheer vastness of language. The second is that grammar is a code autonomous from cognition: it specifies how words combine to express meaning, and that specification is independent of the particular meanings we typically convey or expect others to convey to us.
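The idea of a discrete combinatorial system can be made concrete with a toy generative grammar. The sketch below is illustrative only; the rewrite rules and the tiny lexicon are my own invention, not Pinker's. It shows how a handful of rules and a dozen words yield dozens of distinct sentences, none of which needs to be individually memorized.

```python
import itertools

# Toy phrase-structure rules: each symbol rewrites as a sequence of symbols.
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
# Toy lexicon: terminal categories and the words that fill them.
lexicon = {
    "Det": ["the", "a"],
    "N":   ["dog", "cat", "linguist"],
    "V":   ["sees", "chases"],
}

def expand(symbol):
    """Yield every word sequence the grammar can derive from `symbol`."""
    if symbol in lexicon:
        for word in lexicon[symbol]:
            yield [word]
        return
    for rule in grammar[symbol]:
        # Combine every derivation of each part of the rule.
        for parts in itertools.product(*(list(expand(s)) for s in rule)):
            yield [w for part in parts for w in part]

sentences = [" ".join(words) for words in expand("S")]
# 2 determiners x 3 nouns = 6 NPs; 2 verbs x 6 NPs = 12 VPs; 6 x 12 = 72 sentences
print(len(sentences))  # 72
```

Adding a single recursive rule, for instance allowing a sentence inside a verb phrase, would make the set of derivable sentences infinite, which is exactly the sense in which grammar makes infinite use of finite media.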
The modern study of grammar was triggered by Chomsky, who showed that word-chain devices, in which each word merely selects the next, are fundamentally the wrong model of how human language works: they are the wrong kind of discrete combinatorial system. The difference between word-chain devices and human brains can be summed up in a line from Joyce Kilmer's poem: "Only God can make a tree." A sentence is not a chain but a tree, with words grouped into phrases like twigs joined into branches. Moreover, modern linguistics has discovered a common anatomy in the phrases of all the world's languages: the same super-rules suffice not only for all phrases in English but for all languages.
Grammar, a form of mental software, must have evolved hand in hand with the workings of syntax. Moreover, there is no way to write a halfway intelligent program without defining variables and data structures that do not directly correspond to anything in the input or output. Grammar, likewise, is a protocol that has to interconnect the ear, the mouth, and the mind, three very different kinds of machine. The deepest insight here is that sometimes complexity in the mind is not caused by learning; rather, learning is caused by complexity in the mind.
Words, Words, Words
Unlike the mental grammar, the mental dictionary has had no cachet. It seems like nothing more than a dull list of words, recorded in the head by inattentive rote memorization; Johnson's own dictionary defines a lexicographer as "a harmless drudge that busies himself in tracing the original, and detailing the signification of words." The stereotype is unfair, because the world of words is as amazing as the world of syntax, or more so. Not only are people as infinitely creative with words as they are with phrases and sentences, but memorizing individual words demands its own special intelligence. Words are not simply retrieved from a mental archive; people must have mental rules for generating new words from old ones, the rules of morphology. The output of one morphological rule can be the input to another, or to itself.
Like syntax, morphology is a cleverly designed system, and many of the seeming oddities of words are predictable products of its internal logic. Words have a delicate anatomy consisting of pieces, called morphemes, which fit together in certain ways. The plural rule, for example, interfaces neatly with the mental dictionary: it can inflect any item that lists "noun stem" in its entry, without caring what the word means, and it forms plurals without caring what the word sounds like, so we can pluralize unusual-sounding words. The rule is also perfectly happy to apply to brand-new nouns, like faxes, dweebs, wugs, and zots.
Irregular plurals cannot be generated by a rule because they are quirky and thus can only be stored in the mental dictionary as roots or stems. Accordingly, they can be fed into the compounding rule that joins an existing stem to another existing stem to yield a new stem. But regular plurals are not stems stored in the mental dictionary; they are complex words that are assembled on the fly by inflectional rules whenever they are needed. They are put together too late in the root-to-stem-to-word assembly process to be available to the compounding rule, whose inputs can only come out of the dictionary.
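The division of labor between stored irregular forms and the blindly applying regular rule can be sketched in a few lines of code. The word lists and spelling heuristics below are illustrative assumptions of mine, not a serious model of English morphology.

```python
# Illustrative stand-in for the mental dictionary's stored irregular stems.
IRREGULAR_PLURALS = {"mouse": "mice", "man": "men", "foot": "feet"}

def pluralize(noun: str) -> str:
    """Stored irregulars pre-empt the rule; otherwise the rule fires blindly."""
    if noun in IRREGULAR_PLURALS:                    # lexical lookup wins
        return IRREGULAR_PLURALS[noun]
    if noun.endswith(("s", "x", "z", "ch", "sh")):   # crude spelling heuristic
        return noun + "es"
    return noun + "s"                                # default regular rule

print(pluralize("wug"))    # wugs  -- never heard before, still inflected
print(pluralize("fax"))    # faxes
print(pluralize("mouse"))  # mice
```

The point mirrors Pinker's: the rule never inspects meaning or familiarity, it simply applies whenever no stored form pre-empts it, which is why a novel word like wug is inflected effortlessly.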
An extraordinary feature of the mental dictionary is its capacity. Does it depend only on the words one has read or heard? In fact, people recognize more words than they could ever have encountered in a given stretch of time, because from a word like sail one can derive sailcloth, sailor, sailboat, and sailplane even without ever having heard them. In addition, wordless thinkers split the continuous flow of experience into things, kinds of things, and actions. Experimental studies of baby cognition have shown that infants have the concept of an object before they learn any words for objects, just as we would expect. Once children know some syntax, they can use it to sort out different kinds of meaning. In the sense of a morphological product, then, a name is an intricate structure, elegantly assembled by layers of rules and lawful even at its quirkiest.
The Sounds of Silence
The brain can hear speech content in sounds that bear only a remote resemblance to speech, and it can flip between hearing something as a bleep and hearing it as a word, because phonetic perception is like a sixth sense. When we listen to speech, the actual sounds go in one ear and out the other; what we perceive is language. In this sense all speech is an illusion. We hear speech as a string of separate words, but unlike the tree falling in the forest with no one to hear it, a word boundary with no one to hear it has no sound: we simply hallucinate word boundaries whenever we reach the edge of a stretch of sound that matches some entry in our mental dictionary. Speech perception, too, is part of the language instinct.
What we hear as different vowels are different combinations of amplification and filtering of the sound coming up from the larynx. These combinations are produced by moving five speech organs around in the mouth to change the shapes and lengths of the resonant cavities the sound passes through. The tongue is the most important of the speech organs, since it is three organs in one: the hump or body, the tip, and the root. The link between the postures of the tongue and the vowels it sculpts gives rise to a quaint curiosity of English and many other languages called phonetic symbolism. The lips contribute acoustic effects of their own. Phonological rules help listeners by making speech patterns predictable; they add redundancy to a language.
While language is an instinct, written language is not: it was invented, and most communities have lacked it or have had to borrow or inherit it. Writing systems do not aim to represent the actual sounds of talking, which we do not hear, but the abstract units of language underlying them, which we do.
Talking Heads
The fear that programmed creations might outsmart people was long dismissed as fiction, until the emergence of artificial intelligence (AI) nearly turned fiction into fact. Yet the mental abilities of a four-year-old that we take for granted, recognizing a face, lifting a pencil, walking across a room, answering a question, in fact solve some of the hardest engineering problems ever conceived. To interact with computers, we still have to learn their languages; they are not smart enough to learn ours.
From a scientific perspective, people should marvel at their own sentence understanding: they solve a viciously complex task, and fast. Listeners keep up with talkers; they do not wait for the end of a batch of speech and interpret it after a proportional delay. Beyond building machines we can converse with, an account of human sentence comprehension has practical applications, for instance in law.
The parser carries two computational burdens: memory and decision making. By the first law of artificial intelligence, that the hard problems are easy and the easy problems are hard, it turns out that the memory part is easy for computers and hard for people, while the decision-making part is easy for people and hard for computers. Computer parsers are too meticulous for their own good: they find ambiguities that are quite legitimate as far as English grammar is concerned, but that would never occur to a sane person.
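This over-thoroughness is easy to reproduce. The sketch below uses a CYK-style chart to count parse trees under a tiny hand-written grammar (my own illustrative rules, not from the book): the famously ambiguous "I saw the man with the telescope" gets two legitimate parses, one where the telescope modifies the seeing and one where it modifies the man, and a mechanical parser dutifully finds both.

```python
from collections import defaultdict

# Tiny grammar in Chomsky normal form (illustrative, hand-written).
unary = {   # word -> categories it can belong to
    "I": ["NP"], "saw": ["V"], "the": ["Det"],
    "man": ["N"], "telescope": ["N"], "with": ["P"],
}
binary = [  # (parent, left child, right child)
    ("S", "NP", "VP"),
    ("VP", "V", "NP"), ("VP", "VP", "PP"),   # PP attaches to the verb phrase...
    ("NP", "Det", "N"), ("NP", "NP", "PP"),  # ...or to the noun phrase
    ("PP", "P", "NP"),
]

def count_parses(words):
    """CYK-style dynamic program: chart[i, j, sym] counts the distinct
    trees deriving words[i:j] from symbol sym."""
    n = len(words)
    chart = defaultdict(int)
    for i, w in enumerate(words):
        for sym in unary.get(w, []):
            chart[i, i + 1, sym] += 1
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # every possible split point
                for parent, left, right in binary:
                    chart[i, j, parent] += chart[i, k, left] * chart[k, j, right]
    return chart[0, n, "S"]

print(count_parses("I saw the man with the telescope".split()))  # 2
```

A human listener commits to one analysis almost instantly; the chart parser, like the computer parsers described here, exhaustively enumerates every grammatical possibility.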