So, language... It's a big subject and discussing it is no small task. After all, there are a lot of things we can consider “linguistics” and still have them be topical. From the folks who like cataloging different sounds and variations in sound to the folks who simply like to bitch about the way people from a certain region pronounce a certain word, it seems like everyone cares about language to some degree. While this post will not be a comprehensive explanation and discussion of the field of linguistics (which, frankly, would be more than anyone would ever want to read), I will try to cover some of the biggest areas of linguistic research and why they're important. I will also try to provide enough information that everyone, from the casually interested to the academics, has some meaty questions to chew on. I am also going to limit the OP to English, although I strongly encourage others to discuss other languages, and I may update this post with more information about other languages if this thread heads in that direction.
Phonology and Phonetics
Let's get this party started with a brief discussion of phonetics. For those of us who were forced to look up words in the dictionary by teachers who wouldn't just tell us what the hell a certain word meant, we probably ran into the International Phonetic Alphabet at some point. The IPA is essentially a catalog of all the sounds used in human speech. Here is a list of most of the sounds used in English and their IPA nomenclature. Keep in mind that you may not pronounce some of these words using the sounds that are listed here, and that's okay! (More on this later.)
Anyway, we have a lot of badass sounds here.
"No, children, you’re not seeing things. This, my little friends, is a schwa."
Principal Seymour Skinner
The schwa, for instance, is kicking it in the second row.
At any rate, phonetics provides us a standard by which we can discuss pronunciations and variations that we encounter without having to resort to silly rhyming games. A single speech sound is called a phone, and the smallest unit of sound that distinguishes meaning in a language is a phoneme; both come from the same root that gives us “phonetics.”
Back in the day, Wikipedia was horrible for anything related to linguistics, but it seems to be getting better. Here is their page on phonemes in case this is the kind of thing that gets you going:
http://en.wikipedia.org/wiki/Phoneme
That link also contains a discussion of aspirated and unaspirated sounds, which you can try out for yourself by putting your hand in front of your mouth when you talk. If you feel a puff of air on a consonant (compare the “p” in “pin” with the one in “spin”), you're looking at an aspirated sound.
Morphemes and Morphology
Let's talk about somewhat bigger units of study now and move on to morphology. Morphemes are the smallest units of language that carry individual meaning, and morphology lets us break words apart into those pieces. For instance, the word “unacceptable” has three morphemes, at least as far as I see it: un/accept/able. “Un” is a prefix which, in this case, means “not.” “Accept” is a morpheme we are probably all familiar with, as is “able.” Morphology allows us to make sense of words as well as create new words based on accepted rules for affixing. Affixes, for what it's worth, include prefixes, suffixes, and infixes.
Infixes are pretty cool, because, in English, we only tend to use them in one way, like in the word “fanfuckingtastic.” Yeah, “fucking” is probably the most used infix that we have. Of course, we also see infixes in a lot of slang like “hizzouse” and “saxamaphone.” Infixes are damn cool, and it seems like we're only going to see more of them as English evolves.
From here, we could move on to sentence and paragraph level syntax, and I may include information about that stuff at a later date, but I'd rather move on to what I imagine will be the meat of this thread.
The most interesting areas of linguistic research, from my perspective anyway, come from sociolinguistics, psycholinguistics, and cognitive linguistics.
Sociolinguistics
Sociolinguistics is the study of how language functions within various societies and cultures and how culture influences language use. Have you ever wondered why some people in Appalachia speak differently from some people in Boston? So have sociolinguists, and they've spent a lot of time studying it. In short, different people come from different backgrounds, including race, class, and socioeconomic status, to name a few. As a result, different people have different norms, and those norms in turn affect the manner in which people speak. There's a big myth in the United States that some people don't have an accent. This is simply not true. We can see an example of linguistic variation in the US in this video, and we can also see some of the attitudes people have about certain manners of speech:
http://www.youtube.com/watch?v=_vF9g37FCmk
The gentleman at 1:05 expresses a racist viewpoint, so you may want to skip that section or avoid watching it at work. At any rate, there is a lot to untangle here, and I imagine we will all have a lot to say on this topic.
For a slightly more fun example of linguistic variation, let's look at this scene from Clueless:
http://www.youtube.com/watch?v=sFR9TNsByLk
Cognitive Linguistics
While we're here, let's talk about cognitive linguistics. Wikipedia provides us with a pretty elegant definition of cognitive linguistics: “Cognitive linguistics is characterized by adherence to three central positions. First, it denies that there is an autonomous linguistic faculty in the mind; second, it understands grammar in terms of conceptualization; and third, it claims that knowledge of language arises out of language use.”
Cognitive linguistics is pretty closely linked to the Sapir-Whorf Hypothesis, which argues that there is a reciprocal relationship between language use and the way a speaker sees the world. While research suggests that language use and worldview do not completely determine each other, there is strong evidence of some relationship between the two:
http://en.wikipedia.org/wiki/Linguistic_relativity
While a lot of the information in that link is pretty dry, it does include a discussion of computer languages, which is way over my head as I cannot code, but I know a lot of people here can. I will do my best to provide links and articles if this is the type of thing people want to pursue.
Psycholinguistics
Finally, the last big one I want to discuss is psycholinguistics, which is the study of language acquisition and how the hell it is that we came up with languages in the first place. Psycholinguistics is not my area of specialty, and I know enough about it to explain some of the main ideas but not enough to really argue or care about them. Basically, that Noam Chomsky guy, yeah, the guy who also wrote all those political books and was interviewed by the guy from Rage Against the Machine, is actually a highly regarded professor of linguistics. Chomsky argued in the 60's that language is an innate characteristic of human existence, and all hell has broken loose since. Here is a paragraph from Wikipedia on the main arguments on both sides: “The field of psycholinguistics since then has been defined by reactions to Chomsky, pro and con. The pro view still holds that the human ability to use syntax is qualitatively different from any sort of animal communication. This ability may have resulted from a favorable mutation or from an adaptation of skills evolved for other purposes. In support of the latter view is the theory that language serves group needs; better linguistic expression may have produced more cohesion, cooperation, and potential for survival. The con view still holds that language—including syntax—is an outgrowth of hundreds of thousands of years of increasing intelligence and tens of thousands of years of human interaction. From that view, syntax in language gradually increased group cohesion and potential for survival. Language—syntax and all—is a cultural artifact. This view challenges the "innate" view as scientifically unfalsifiable; that is to say, it can't be tested; the fact that a particular, conceivable syntactic structure does not exist in any of the world's finite repertoire of languages is an interesting observation, but it is not proof of a genetic constraint on possible forms, nor does it prove that such forms couldn't exist or couldn't be learned.”
I wanted to throw all of this out there because I feel like this is important stuff to ponder, but I have also noticed an interesting trend, particularly on nerdy message boards, towards prescriptive grammar. Prescriptive grammar is likely what a lot of us think of when we think of the grammar we were introduced to in school. It includes such rules as not using the word “ain't,” getting all pissed off when someone writes “I could care less,” and raging against Star Trek's use of “to boldly go” (after all, there's a split infinitive there). Some of the most famous prescriptivists include Strunk and White, who wrote a guide to style that would probably be best forgotten.
Prescriptivists routinely fight it out with descriptivists. Descriptive grammar is the study of how language is actually used rather than how it supposedly should be used; a descriptive grammarian is much more interested in how someone does speak than in how they should. Descriptivists often have their work cut out for them, because almost all of us were taught from a basis of prescription rather than description.
With that incredibly long-winded intro to the thread, let's chat about language, because it's a lot of fun. And, I think it should be noted, I showed a great deal of restraint in not once making a “cunning linguist” joke.
Posts
Not that this is terribly important, but I think there's a difference between "I could care less" and either of "ain't" or "to boldly go". The latter two merely defy prescriptivist notions of correctness, but the literal meaning of the former directly contradicts the intended meaning and is thus arguably more wrong.
I was going to make a similar example with flammable and inflammable being apparently contradictory yet meaning the same thing, but according to Wikipedia the "in" of inflammable refers to "into" rather than "not." TMYK!
...as he said with bigger words
There is definitely precedent for using both, and, according to a brief Google search, they both appear with about equal frequency. "I could care less" is a pretty established expression.
Personally I was always more into semantics, pragmatics and sociolinguistics, which makes sense since my major was English Literature (can't interpret Pinter without a good grasp of Linguistics). My main interest was in how meaning is constructed, not the bric-a-brac of linguistic microstructures. The latter was fun for the couple of sessions we did, but I would quickly have grown bored with its fiddliness.
"Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
Yuuuup. That's pretty much exactly why I skipped syntax. I figured that I couldn't skip phonology and morphology, but I'm primarily interested in sociolinguistics. You can only diagram so many sentences so many ways before you decide to take a nap.
You make a good point. The ultimate purpose of language is to facilitate communication, and in turn the point of communication is to convey a particular idea to another person. Proper spelling and grammar help to convey that idea consistently, but if the person you're speaking to understands what you're saying, then it doesn't really matter whether your spelling or grammar matched a given convention, because you've succeeded in your ultimate objective. Along those lines, whenever somebody says "I could care less" they are clearly trying to convey the idea that they don't care at all about the subject in question (unless they stressed the word "could", which would imply the literal interpretation of the phrase, but that's very uncommon). Thus, in practice "I could care less" is effective in communicating the intended idea.
That said, there is something to be said for attempting to maintain grammatical and spelling conventions. If there are enough idiomatic exceptions to a rule, it becomes difficult to determine whether that rule applies in a given situation, which undermines the ultimate purpose of conveying a particular idea. Because of that, while I wouldn't label myself a prescriptivist, there is merit to their attempts to prescribe grammar and spelling.
Thread fails.
The issue with "I could care less," is that it's a colloquial malaprop based on simple near-homonym which has basically persisted through ignorance.
The actual phrase is, "I couldn't care less," but in American English, particularly Southern and African-American sub-dialects, aspirations are frequently dropped, which is a commonality of all languages and dialects, and usually crops up in communities where proper lingual employment isn't strictly enforced. Ergo, "couldn't" becomes "couldn'd," which softens even further lower down the scale to "koodn." Then, in the pairing of "couldn't" and "care," you have two glottal fricatives separated by a near-dental aspiration, which is a lot of exercise for one's tongue, so lax diligence in pronunciation omits the lingual constructs that cause the most cognitive trouble. Thus, the contractive end of "couldn't" gets removed entirely. The persistence of the incorrect usage, however, has more to do with colloquial homonymal scripting, as successive generations hear the same incorrect phrasing, and confuse ubiquity with correctness.
I would imagine that the phrase is misspoken far less often in most British dialects, where societal factors encourage far more adherence to pronunciation, or better yet for the fate of the phrase, an aversion to contraction altogether. Concordantly, in British dialects (and near-British dialects, like Australian, New Zealand, and South African), aspirative glottal-stop phones like "T" sounds don't get rounded to stopless "D" sounds nearly as frequently. If the phrase becomes, "I could not care less," rather than, "I couldn't care less," the room for lazy tongues to inadvertently misstate the intent is significantly reduced.
At a certain point, however, ubiquity is all that matters (to me, at least). I think that it would be silly to suggest an entire culture that pronounces a phrase a certain way is wrong... I am also not sure that the phrase is actually a regionalism. It may have its origins in the South, but I have been all over the states, and I wouldn't say the phrase would be out of place anywhere I have been.
I'm very interested in discovering whether the phrase extends to other English speaking countries, though, and I think that's an interesting question for us to pursue. Would any of the non US folks care to weigh in?
I don't know if I would agree entirely.
Ubiquity undoubtedly leads to persistence, but it doesn't somehow legitimize a phraseology that's objectively incorrect. I don't think the linguistic community should collectively agree that a malaprop is correct simply because a lot of people who are ignorant of what they're saying, or of why they're saying it incorrectly, persist in being ignorant.
I mean, you can argue that long-term evolutionary morphology has a lot to do with lax lingual adherence; this is basically how early Anglo-Saxon evolved out of old Nordic/Danish dialects. But I'd like to think that the allowances made for word-specific evolution, especially in an era of extreme dearth of education and written instruction like post-Roman Britannia, wouldn't be extended to entire incorrect phrases that can basically be explained by being too lazy to think about the words coming out of your mouth.
It seems that the phrase "I could care less" is actually more common than the alternative, at least according to these findings.
This post gives a very interesting hypothesis for the origin of the "could" formation, though I'm not sure how well the analogy with French negation works (any French speakers?).
There should be a Language Log link in the OP, they cover just about everything linguistics-related that could conceivably stir up discussion.
EDIT: Ah, beat'd.
Being married to a posh London Brit with Scottish parents, I see this played out all the time. Americans will constantly ask my wife if she's Australian or South African, when those accents aren't remotely similar to the educated London accent. Whereas my wife can hear a British accent in a crowd and tell right away, "Oh, you're from north Glasgow, aren't you?" or "Which part of Nottingham did you grow up in?"
In defense of Americans, a large part of this is due to the far broader geographical dispersal of dialect groups compared to most other English-speaking regions. For example, an area like Texas has only two dominant (and closely related) English dialects, Gulf South and Southwestern, yet covers a geographical area larger than the entirety of the UK, which Wikipedia lists as having around 20 distinct dialects.
Here are some good dialect maps for comparison:
The US:
The UK:
Texas has an area of 268,820 sq. miles; the whole of the UK & Ireland has 130,436 sq. miles.
"Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
It's an interesting argument to be sure, because I'm not sure how much precedent there is for evolutionary appropriation of an illogical phrasing that isn't due to cross-lingual translational/contextual incongruities or contextual externalities (like sarcasm, irony, whathaveyou).
If my understanding is correct, the origination of the malaprop is quite simply apathy and/or ignorance of one's own primary language in the instance of some pronunciation anomalies, which I don't recall seeing a lot of precedent for in linguistic history. I could be wrong, I admit, but I don't think I'm too far off-base here.
Like I said earlier, I'd like to see a cross-referencing study among dialects and accents that place more emphasis on near-dental plosive (as well as glottal-ending) phonemes than most American accents outside the upper-class Northeast. My thinking is that the instances of incorrect phrasing would be much lower, or at least there would be an increase in instances of correct phrasing.
To sum up, I wonder how descriptivists feel about dialect-specific phonetic obfuscation literally changing the value of a binary construct, but not changing the functionality of its usage. It's almost like changing "yes" to mean "no," albeit in a very limited context.
But I will never throw my support behind stuff like "I could care less" until it becomes so prevalent that it's the only version of the idiom available. I'm also not going to stop cringing at "irregardless," at the use of "nauseous" in place of "nauseated," or at other things that are less idiomatic and more ignorant people blatantly misusing the language.
A split infinitive or a dangling participle really doesn't affect meaning much. But using a word that doesn't exist (even though you think it does) or using a word that actually has the opposite meaning of what you think it does just makes it harder to communicate with people, and while the evolution of language is a good thing, you still need basic standards.
I mean, what about spelling? If we can use "I could care less" or "irregardless" because the respective rules they break don't matter, then what about discarding spelling, too? Can we use "rediculous"? It seems hard to rail against the one and not the other, after all. One might call such a distinction rediculous.
One thing, though: using words that don't exist has vastly enriched the English language. Shakespeare apparently coined over 1,500 words that didn't exist before him. Language should be a playground; it should allow for creativity, and to some extent this means saying, "Fuck the rules." Obviously there is a difference between Shakespeare doing so and some phlegmatic teenager repeating what everyone around him is saying, though.
"Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
Nice post.
Knowing the rules is important, precisely for the reasons you stated: to adequately defend their breaking. Otherwise, it's just the persistence of ignorance.
On a side tangent, I actually think "y'all" should really be considered proper grammar, or at the very least linguists should work to create something that replaces it. Most languages allow for a second-person plural (like the various plural forms derived from 'Vos' in Latin, found throughout Romance language conjugations).
It's not bad English; it's the creation of a necessary pronoun class.
Well, there's a difference between things like evolutionary changes in pronunciation and simple persistence of incorrect usage. Like, in the instance we're talking about, if "could" can replace "couldn't" without any context of irony or juxtapositional usage, that's just ridiculous, and it actually destroys word values.
English, perhaps better than any other language, is fantastically adaptive. Don't you ever think it's funny when other languages can't come up with their own equivalents for English words? Like, the Spanish word for "sandwich" is... sandwich.
Creation of new words is actually a vital part of maintaining the versatility of the English language. The problems arise when mispronounced words that don't justify their existence persist, again out of ignorance. For example, "irregardless" is no different in usage or context from "regardless," and "refutiate" doesn't add to the lexicon simply by mashing together two words with similar meanings ("refute" and "repudiate").
I barely ever hear it
I mainly hear "I couldn't give a shit"
or "I don't give a fuck"
"I could care less" sounds very, very American to me
Same, I've never heard "I could care less". It's either "I couldn't care less" or "I don't give a shit".
edit: UK here btw
"Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
I didn't address the viability of its persistence. I would assume that, barring social pressure to correct the phraseology, that dumb thing is going to keep on keepin' on.
We already have that. It's spelled "ye."
From what I've read, "could care" started as sarcasm.
A thousand times this. I find my writing improved a lot as I paid more attention to mechanics, but since that stage of my life, it's gotten even better as I've come to understand when to relax the rules. The most common example in my writing would be ending a sentence with a preposition: it's technically incorrect, but sometimes you have to break that rule for the sake of clarity, and sometimes even to avoid coming across as pretentious. The important thing is not to skip the step where you learn the rules.
(And since we're in a thread where it is remotely acceptable to point this out, punctuation always follows parentheses (so that parenthetical statements "belong" to a sentence, like this, and lack capitals and ending punctuation). The only accepted exception I know of would be bracketing a separate paragraph, to emphasize stylistically that it's an aside.)
When people use "I could care less," I treat it as a literal statement and usually piss them off. I'm rarely openly anal about language, but in this case I feel compelled to point it out.
I think there are plenty of people who would feel that there is nothing wrong with sweat pants, but that when you go to work at, say, a law firm, you shouldn't wear sweatpants.
I would imagine that using the wrong style in the wrong format would be analogous.
I say it all the time.
This is a pretty good read on Strunk and White and why a lot of linguists object to its dominance as a grammar guide: http://chronicle.com/article/50-Years-of-Stupid-Grammar/25497
Ehh, it's hyperbolic and nitpicky, and it makes the author come across as humourless. He validly points out flaws and then makes sweeping statements about how Strunk & White are undermining themselves. It's kind of weak, all around. If it weren't so fervent, I'd probably be more inclined to appreciate its valid arguments.
What should be taken away from the article is that the book isn't perfect, and it's better taught than just read. It's still one of the most valuable books out there.
That's very sensible.
I teach EFL and ESL and am a hardcore descriptivist. So I teach people that the slang they picked up at work or in the pub is fine, but if they want certain kinds of jobs (e.g. academic, office) they have to be able to talk/write in a certain way, and recognise when a particular register is appropriate. Which is exactly what we native speakers do all the time. You can't teach people 'proper English' any more than you can teach people 'proper walking'. Becoming linguistically adept is what's needed.
I agree that slowing the rate of language change is useful to facilitate communication. However, please remember that communication is not the only function of language. It is used to gain power, to delineate social groups, to denote intelligence and education, and much, much more. People decrying language change tend to focus on the 'communication' aspect of language while ignoring the aspects that they themselves are using in their own speech acts about it.
Things such as dictionaries seem to be keeping the rate of language change at an acceptable level. Very very few people actually misunderstand 'I could care less', even if they pretend to because of their revulsion for the phrase. I don't think the language is actually under any threat of some kind of communication collapse.
So the real problem with it is that people are unashamedly showing their lack of education, not that it miscommunicates.
Whether you think that's a big deal is up to you, but saying 'it's not a real word' is showing that you've divided words into 'real words' and 'unreal words' depending on their etymology. And you might want to think why you care so much about whether a word comes from a source that you have mastered and why that might anger you.
Perhaps this is really a debate about anti-intellectualism? But then how many professors of linguistics, perhaps the ultimate source of authority and intellect in the realm of language, have a problem with slang and semantic drift? Very few.
Hating terms like these seems like a very middle-brow thing to do.
But then middle-brow is my personal bête noire, and perhaps others have no problem with that.
Oh, and an amusing anecdote: My father was a pilot, and as English is the international language of aviation he ended up talking to a large number of (mostly European) non-native English speakers when communicating with ground personnel. He once told me that the Germans, Greeks, Turks, and just about everyone else spoke crystal-clear English--except the goddamn British, who were almost unintelligible on the radio.
/pedantry
For what it's worth, you would hate the way I pronounce dissect. I grew up in the South, and I don't know anyone who pronounces it the way you (and the OED, for what it's worth) say it's pronounced.
The Southern dialects are infamous for wrongly emphasizing words with Latin prefixes, e.g., words that begin with de-, re-, ex-, di-, or in-.
It's not wrong; it's just different. Do you think AAVE is wrong? Do you think Appalachian English is wrong?
Descriptively? No.
Academically and intellectually? Yes.
Persistence and success of variance due to ignorance has a way of making me bristle. Like ElJeffe pointed out, you can't really be credited for breaking the rules if you don't bother to learn what they are first.
In general, I think expressivism about normative discourse is clearly and obviously false; however, I do not think that the reason it's false is its difficulty in giving a compositional semantics. So that is what I am trying to argue in my paper.