Concepts : A Thing or An Act?

(by melody) Oct 08 2010

There is a temptation to see a concept as a static ‘thing’ somehow stored in mind. One might imagine, for instance, that our concept of the word ‘hammer’ consists of visual memory for a hammer or series of hammers (or a ‘prototype’ or ‘schema’ of a hammer, whatever that should be).  That we can imagine how this might be so is not a good reason to adopt the idea.  Indeed, it would be far better if we discarded this notion altogether.  If we take language to be a skill, like tennis or painting, we see quickly how this idea breaks apart at the seams.  For one does not have a static ‘concept’ of how one paints a landscape, or a fixed ‘concept’ of how one approaches a serve; rather one learns, over time, the toss of the ball, the arc of the back, the gentle shifting of weight, the flex of the wrist; and in all of this, there is the demand of sustained practice and coordination, the reproof and rebuke of time, and as ever, a great number of processes – physiological and mental – that contribute to the execution of the act (which is, almost certainly, imprecise).  If a concept is a skill too, then it is a learned process; an active engagement; one of a suite of ways of representing the world.

On our confusion with words (and our idea that a word 'stands' for a thing), Wittgenstein said :

"This [confusion] is connected with the conception of naming as... an occult process.  Naming appears as a queer connection of a word with an object. --[But] you really [only] get such a queer connexion when the philosopher tries to bring out the relation between name and thing by staring at an object in front of him and repeating a name or even the word "this" innumerable times.  For philosophical problems arise when language goes on holiday.  ...We can [of course] say the word "this" to an object, or as it were, address the object as "this"--[but this] is a queer use of the word, which doubtless only occurs in doing philosophy."

10 responses so far

Hilarity did not ensue

(by melody) Oct 07 2010

There's this disturbing (and simultaneously hilarious) article in yesterday's NY Times about rampant fraud and plagiarism among China's academic ranks.

My favorite line, by far :

He cited the case of Chen Jin, a computer scientist who was once celebrated for having invented a sophisticated microprocessor but who, it turned out, had taken a chip made by Motorola, scratched out its name, and claimed it as his own.

I can just imagine the poor man patiently scratching out "Motorola" and writing "Chen Jin" over it in crayon.  Brilliant, really.  What's truly amazing, of course, is that anyone believed this -- as my friend Joe pointed out, taking credit for a chip is kind of like taking credit for a 767.  "Oh zees?  I built it in ze evenings, weeth some scrap metal an' a soldering iron."

More unnerving :

After Mr. Chen was showered with government largess and accolades, the exposure in 2006 was an embarrassment for the scientific establishment that backed him.  But even though Mr. Chen lost his university post, he was never prosecuted. “When people see the accused still driving their flashy cars, it sends the wrong message,” Mr. Zeng said.

The problems in China are more than a little blatant.  But what have the recent American scandals told us about US institutions?  --Are these anomalies, to be swept under the rug?  Or does the integrity of scientific research in the US deserve a closer look?  (Thanks to @CaldenWloka for the scoop.)

One response so far

What is ADHD? Paradigm Shifts in Psychopathology

(by Jason G. Goldman) Oct 05 2010

Wow. Lots of psycho linguists around lately, huh? How about a change of pace? Think you guys can handle something not about Lord Chomsky?

Over the last one hundred years, paradigm shifts in the study of psychopathology have altered our conceptualization of attention deficit/hyperactivity disorder (ADHD), as a construct and as a diagnostic category. With few exceptions, it has generally been accepted that there is a brain-based neurological cause for the set of behaviors associated with ADHD. However, as technology has progressed and our understanding of the brain and central nervous system has improved, the nature of the neurological etiology for ADHD has changed dramatically. The diagnostic category itself has also undergone many changes as the field of psychopathology has changed.

In the 1920s, a disorder referred to as minimal brain dysfunction described the symptoms now associated with ADHD. Researchers thought that encephalitis caused some subtle neurological deficit that could not be medically detected. Encephalitis is an acute inflammation of the brain that can be caused by a bacterial infection, or arise as a complication of another disease such as rabies, syphilis, or Lyme disease. Indeed, during an encephalitis outbreak in the United States in 1917-1918, children presented in hospitals with a set of symptoms that would now be described within the construct of ADHD.

In the 1950s and 1960s, new descriptions of ADHD emerged from the split between the neo-Kraepelinian biological psychiatrists and the Freudian psychodynamic theorists. The term hyperkinetic impulse disorder, used in the medical literature, referred to the impulsive behaviors associated with ADHD. At the same time, the Freudian psychodynamic researchers (who seem to have won the battle in the DSM-II) described a hyperkinetic reaction of childhood, in which unresolved childhood conflicts manifested in disruptive behavior. The term "hyperkinetic," which appears in both diagnoses, describes the set of behaviors that would later be known as hyperactive – despite the fact that medical and psychological professionals were aware of many children who presented without hyperactivity. In both cases, it was the presenting behavior that was the focus – an emphasis that went without saying, given the behavioral paradigm then guiding the field.

When the cognitive paradigm became dominant, inattention became the focus of ADHD, and the disorder was renamed attention deficit disorder (ADD). Two subtypes would later appear in the literature, corresponding to ADD with or without hyperactivity. The diagnostic nomenclature reflected the notion that the primary problem was an attentional (and thus cognitive) one, not primarily a behavioral one. The attentional problems had to do with the ability to shift attention from one stimulus to another (something that Jonah Lehrer has called an attention-allocation disorder, since it isn't really a deficit of attention). The hyperactivity symptoms were also reformulated in cognitive terms: connected with an executive processing deficit termed “freedom from distractibility.”

In DSM-IV, published in 1994, the subtypes were made standard. The diagnostic criteria themselves changed little, but the name of the disorder did, reflecting shifts in the literature's understanding of its etiology. The term ADD did not hold up, and the disorder became known as ADHD, with three subtypes: a predominantly hyperactive/impulsive subtype, a predominantly inattentive subtype, and a combined subtype in which patients show both hyperactive and attention-related symptoms. With improved neuroimaging technology, these subtypes have come to be linked to structural and functional abnormalities in the frontal lobe, and in its connections with the basal ganglia and cerebellum.

The set of symptoms associated with ADHD seems not to have changed much in the last one hundred years. However, paradigm shifts within the field of psychopathology have changed the way in which researchers understand the underlying causal factors, as well as which of the symptoms are thought to be primary.

5 responses so far

The reality of a universal language faculty?

(by melody) Oct 05 2010

An argument is often made that similarities between languages (so-called "linguistic universals") provide strong evidence for the existence of an innate, universal grammar (UG) that is shared by all humans, regardless of language spoken.  If language were not underpinned by such a grammar, it is argued, there would be endless (and extreme) variation, of the kind that has never been documented.  Therefore -- the reasoning goes -- there simply must be design biases that shape how children learn language from the input they receive.

There are several potentially convincing arguments made in favor of innateness in language, but this, I think, is not one of them.

Why?  Let me explain by way of evolutionary biology:

Both bats and pterodactyls have wings, and both humans and squid have eyes, but neither pair shares a common ancestor that had these traits.  This is because wings and eyes are classic examples of 'convergent' evolution -- traits that arose in separate species as 'optimal' (convergent) solutions to the surrounding environment.  Convergent evolution has always struck me as a subversive evolutionary trick, because it demonstrates how external constraints can produce markedly similar adaptations from utterly different genetic stock.  Not only that, but it upends our commonsense intuitions about how to classify the world around us, by revealing just how powerfully surface similarities can mask differences in origin.

When we get to language, then, it need not be surprising that many human languages have evolved similar means of efficiently communicating information.  From an evolutionary perspective, this would simply suggest that various languages have, over time, 'converged' on many of the same solutions.  This is made even more plausible by the fact that every competent human speaker, regardless of language spoken, shares roughly the same physical and cognitive machinery, which dictates a shared set of drives, instincts, and sensory faculties, and a certain range of temperaments, response-patterns, learning facilities and so on.  In large part, we also share fairly similar environments -- indeed, the languages that linguists have found hardest to document are typically those of societies at the farthest remove from our own (take the Pirahã as a case in point).

Continue Reading »

11 responses so far

The questions we should still be asking about gender

(by melody) Oct 04 2010

"...for the present enshrines the past – and in the past all history has been made by men." --Simone de Beauvoir, The Second Sex

Last week, I wrote a short piece on my (stunning) failure to be socialized according to our culture's gender norms. As I pointed out, I spent much of my adolescence wearing my father's hand-me-downs and drinking cheap whiskey with the loud boys (the kind, you know, who wore cordovan wingtips and eyeliner to first period). We were a delightful lot of misfits. Anyhow, an old friend, reading this, sent me a link to a new collection by Dries Van Noten, as photographed by The Sartorialist, with the note "ahead of the times, eh?" But of course.

The Sartorialist writes : "The take away from this show? Steal your Dad's clothes, all your dad's clothes. His shirts, his jeans, his sportcoats are all fair game now."

Why, I wonder, did I ever learn to wear lipstick? And like it?

To be fair, menswear has long been a 'classic look' on women of my build (broad shoulders, long neck) and height (as tall as a man). In search of my style predecessors, I scoured the Internets for some of my women-loves. I've posted a small gallery at the end of the post.

But before getting to that, I thought I would raise a number of questions on gender that still demand discussion :

On language
How does language shape our thoughts on gender? How do we use language - as a form of behavior and as an expression and extension of culture - to implicitly enforce gender norms?  And then -- secondarily -- if we think that language is shaping (or implicitly constraining) our thoughts and beliefs about gender, is it worthwhile to assess and try to change those values? Should we self-consciously try to change the way we speak?

On biology
In society, what role does biology play in propelling men (and not women) to the top? Are the traits of highly successful men (e.g., hyperfocus, ambition, hypomania) truly absent in women? In what way is the expression of these traits mediated by cultural norms and practice?

On sex
How does female sexuality play into all of this? What part does modern culture (pornography, fashion, etc) play in shaping our expectations of women? Must powerful, iconic women necessarily be de-sexualized, gay, or explicitly counter-culture?  What happens when women turn the tables and objectify men?  [Links are to the Martin Amis classic on pornography "Pussies are bullshit," the Dove ads scandal, and the Karen Owen sex-thesis (er, f*ck list), respectively.]

And finally, a provocative question from a conversation I was having this evening about Hemingway (who was often accused of misogyny) :

What does it mean to be a misogynist in an age (or society) where women are socialized to be powerless, subservient and inferior?  What does it mean to be a feminist?

In the days to come, I'll write a little on the research I've been doing on the differences in how we use gendered words (like "he" and "she" and "man" and "woman").  The differences are striking, and sometimes more than a little startling.  Here's a simple one you might not expect : when it comes to labeling people by their sexual orientation, we're far more interested in a man's preference than a woman's.  In fact, we label men by their orientation (gay, straight, bi) about ten times more often than we do women.  But that ratio nearly reverses when it comes to marital status.  We talk incessantly about whether women are "married," "single," or "divorced," but when it comes to the guys, we couldn't care less.  What does it all mean?  --I'll get to that bit shortly.
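
For the curious, the flavor of the analysis is easy to convey in code.  Here's a minimal sketch in Python -- a stand-in, not my actual pipeline -- that counts how often each label lands next to a male- versus female-referring noun in a plain-text corpus.  The file name, the label lists, and the adjacency-only matching are all illustrative simplifications :

    # A minimal, hypothetical sketch of the ratio analysis described above.
    # It counts "label noun" bigrams (e.g. "single woman") in a plain-text
    # corpus and reports the male/female ratio for each label.
    import re
    from collections import Counter

    LABELS = ["gay", "straight", "bi", "married", "single", "divorced"]
    NOUNS = {"male": ["man", "men", "guy", "guys"],
             "female": ["woman", "women", "girl", "girls"]}

    with open("corpus.txt", encoding="utf-8") as handle:  # stand-in corpus file
        text = handle.read().lower()

    counts = Counter()
    for label in LABELS:
        for sex, nouns in NOUNS.items():
            for noun in nouns:
                # count adjacent "label noun" pairs, e.g. "divorced man"
                counts[label, sex] += len(re.findall(rf"\b{label}\s+{noun}\b", text))

    for label in LABELS:
        male, female = counts[label, "male"], counts[label, "female"]
        ratio = male / female if female else float("inf")
        print(f"{label:9s} male={male:4d}  female={female:4d}  m/f={ratio:5.2f}")

The real analysis is fussier -- pronouns, parsing, much larger corpora -- but the logic is the same.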

11 responses so far

A Chomsky Reader

(by melody) Oct 01 2010

"Questions of fact cannot be resolved on the basis of ideological commitment." --Noam Chomsky

Not familiar? Not a problem. Here, in much abridged form, are some of the main ideas from "Language and Problems of Knowledge: The Managua Lectures," one of the most accessible of Chomsky's texts on language, and a clear, cogent and articulate elaboration of his views.

Why post this? I frequently receive comments on my posts accusing me of caricaturing Chomsky's position.  "Chomsky didn't say that!" the standard line goes (and sometimes, more vehemently: "Chomsky wouldn't say that").  There's really no good response to this except to say, "yes he did, it says so right here, on page..." which strikes me as annoyingly pedantic.  While I would encourage all my readers who have more than a passing interest in the debate to read the original texts, this compilation of quotes should serve as a helpful introduction (or refresher) on Chomsky's ideas.  Of course, The Managua Lectures are a concise elaboration of Chomsky's theoretical stance, rather than arguments for or against it -- but this is actually quite helpful, since it allows me to present short, well-formulated excerpts, without doing a hack-job on an extended piece of reasoning.

Of the excerpts that follow, there are some that I think are reasonable, some absurd, and some simply amusing. In the upcoming months, I hope to provide a similar treatment for some of Chomsky's other books and for Wittgenstein and Skinner.

On innate concepts:

"The speed and precision of vocabulary acquisition leaves no real alternative to the conclusion that the child somehow has the concepts available before experience with language and is basically learning labels for concepts that are already part of his or her conceptual apparatus. This is why dictionary definitions can be sufficient for their purpose, though they are so imprecise. The rough approximation suffices because the basic principles of the word meaning (whatever they are) are known to the dictionary user, as they are to the language learner, independent of any instruction or experience."

"Now that can only mean one thing. Namely, human nature gives us the concept "climb" for free. That is, the concept "climb" is just part of the way in which we are able to interpret experience available to us before we even have the experience. That is probably true for most concepts that have words for them in language. This is the way we learn language. We simply learn the label that goes with the preexisting concept. So in other words, it is as if the child, prior to any experience, has a long list of concepts like "climb," and then the child is looking at the world to figure out which sound goes with which concept."

Continue Reading »

8 responses so far

Guess what turns me on?

(by melody) Sep 30 2010

...according to suspect sources on the Interwebs, I'm a logomaniac who enjoys gratification, control and long walks on the work.  Gross!  (At least "models" and "cookie" made the list?)  If *I* had devised my own list of favorite hot-topics (and not some half-baked word-counting algorithm that knows none of the particulars of my rich internal life), it would have had a lot of this, and this, and also that.

Clearly what this blog needs is more talk of sex, drugs, and cupcakes.

(Thanks to @gameswithwords for the generator tip.)

Comments are off for this post

Intelligent Nihilism

(by melody) Sep 30 2010

I wanted to register a quick reply to some of the comments on last week's post "The question is : are you dumber than a rat?"  In the comments there, and in posts on other blogs, our research program has been accused of intelligent nihilism.  By one such characterization, our position is that "we don't know how the brain could give rise to a particular type of behavior, so humans must not be capable of it."  Though I think the label is quite witty -- and would love to have badges made for the lab! -- I think it misrepresents our stance rather badly; our argument is that many of the properties that linguists have attributed to language are either empty theoretical constructs (hypotheses that are not supported by the empirical evidence) or are conceptually confused (and have been shown to be so by Wittgenstein, Quine, and many others).  We are not denying that language -- and linguistic behavior -- are complex; rather, we are rejecting a particular stance towards language that we think is theoretically and empirically vacuous.  This does not lead us to nihilism, but rather to a different conception of language and of how language is learned.

In any case, the comments on last week's post proved fertile ground for discussion, so I've posted them (in pared-down fashion) along with a brief response.  The full comment thread can be found at the original post.

Continue Reading »

20 responses so far

On becoming Birkin and letting go of Gainsbourg

(by melody) Sep 27 2010

The attitude of defiance of many American women proves that they are haunted by a sense of their femininity. In truth, to go for a walk with one’s eyes open is enough to demonstrate that humanity is divided into two classes of individuals whose clothes, faces, bodies, smiles, gaits, interests, and occupations are manifestly different. Perhaps these differences are superficial, perhaps they are destined to disappear. What is certain is that they do most obviously exist.  --Simone de Beauvoir, The Second Sex

As a scientist, I take seriously the idea that our expectations about the world shape not only our understanding and perception of it, but our engagement with it. Here, I muse on the implications this has both for how we perceive successful women in society, and how our expectations may shape the course and outcome of their achievement.

Continue Reading »

22 responses so far

On Language : in response to Ben Zimmer

(by melody) Sep 25 2010

Last week, Ben Zimmer published an article in the NY Times on chunking in language.  This morning, I received an email from one Prof Plum, who was writing to a parent (a good friend of his) to explain what was right -- and what was wrong -- with the article.  I was CC'd, as Plum thought I might enjoy the explanation.  I did!  With permission, I have included it below for any interested readers.  But first, here's Zimmer :

In recent decades, the study of language acquisition and instruction has increasingly focused on “chunking”: how children learn language not so much on a word-by-word basis but in larger “lexical chunks” or meaningful strings of words that are committed to memory. Chunks may consist of fixed idioms or conventional speech routines, but they can also simply be combinations of words that appear together frequently, in patterns that are known as “collocations.” In the 1960s, the linguist Michael Halliday pointed out that we tend to talk of “strong tea” instead of “powerful tea,” even though the phrases make equal sense. Rain, on the other hand, is much more likely to be described as “heavy” than “strong.”

A native speaker picks up thousands of chunks like “heavy rain” or “make yourself at home” in childhood, and psycholinguistic research suggests that these phrases are stored and processed in the brain as individual units. As the University of Nottingham linguist Norbert Schmitt has explained, it is much less taxing cognitively to have a set of ready-made lexical chunks at our disposal than to have to work through all the possibilities of word selection and sequencing every time we open our mouths.
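
Before handing the floor to Plum : if you want to see what a collocation looks like in the wild, the hunt is easy to run yourself.  Here's a toy sketch in Python -- my illustration, not Zimmer's or Schmitt's method -- using NLTK's stock collocation tools on the Brown corpus.  The corpus and the frequency cutoff are arbitrary choices :

    # A toy collocation hunt : find word pairs that co-occur far more often
    # than chance.  Requires NLTK and the Brown corpus, fetched once via
    # nltk.download('brown') -- both illustrative choices, nothing more.
    from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder
    from nltk.corpus import brown

    words = [w.lower() for w in brown.words() if w.isalpha()]
    finder = BigramCollocationFinder.from_words(words)
    finder.apply_freq_filter(5)   # ignore pairs seen fewer than 5 times

    # Pointwise mutual information scores a pair highly when the words stick
    # together -- as "strong" does with "tea", but not with "rain".
    for pair in finder.nbest(BigramAssocMeasures.pmi, 15):
        print(" ".join(pair))

High-scoring pairs are exactly the 'chunks' Zimmer describes -- words that keep each other's company far more often than chance would predict.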

Plum's response covers why word-chunks are better thought of as a skill than an inventory, how we unconsciously change our speech when we talk on cellphones, and how we can learn verbal humor.

Continue Reading »

8 responses so far

