Science

Daisy

“…any mans death diminishes me, because I am involved in Mankinde…”

John Donne

 

The blank page beckons like the first cut with the scalpel—

tremulous, uncertain, unknown.

How does one eulogize the unknown?

 

We walked in silence that first day—

death replaced—diluted with chemicals

sterilized and fixed, but still present.

 

I caught myself resting my hand on the table.

Your table.

Ashamed of my casualness, I withdrew it.

There is comfort in the dead—comfort in you, Daisy,

as I hold your hand, with fingernails vivid pink.

I did not know you, not even your name, but I knew

your sacrifice for science—in vitro resurrection.

Your life after death.

 

I mine for rubies of coagulated blood—

deep, muddy red. They cover my gloved hand.

I take you with me when I leave—on my clothes—

your sanguine life force, condensed

Given up. No more. Reborn.

 

I see you everywhere.

At the grocer’s

and coffee shops

in my mother.

It is your organs I envision,

My first patient—my only patient.

 

I hold your heart up—pick your life apart,

the background you never knew—

 

the source of your love, the object of mine

my love of science heightened by your gift.

 

I am in love with a complete stranger, not complete,

I know part of you. The carnal half.

 

I wonder: what did you love?

 


Explaining Empathy

This article originally appeared in The Curator on May 21, 2010.

How do I know that I know what I know – about you? This is clearly a question about epistemology, about knowledge. But it’s a special kind of knowledge, about others.

The ability to understand what another human being is thinking or feeling is most commonly known as empathy. The word empathy comes from the German Einfühlung, which literally translates as “feeling into.” For thousands of years, empathy has attracted the attention of great thinkers in many fields of study, but only recently has empathy experienced a serious comeback, signaled by the advent of social neuroscience. This field, a melding of social psychology and cognitive neuroscience, is startlingly young, and the researchers in it are correspondingly young, and maybe even hip (as David Brooks has pointed out). Empathy has found center stage in a large body of social neuroscience research. So far there is no definitive consensus on how we empathize with others, but there are two prominent theories on the table that try to explain the phenomenon of empathy.

The first one, called Simulation Theory, proposes that empathy is possible because when we see another person experiencing an emotion, we “simulate” or represent that same emotion in ourselves so we can know firsthand what it feels like. In fact, there is some preliminary evidence of so-called “mirror neurons” in humans that fire during both the observation and experience of actions and emotions. And there are even parts of the brain in the medial prefrontal cortex (responsible for higher-level kinds of thought) that show overlap of activation for both self-focused and other-focused thoughts and judgments. On an intuitive level, Simulation Theory makes sense, because it seems glaringly obvious that in order to understand what another person is feeling, I can simply pretend as if I were feeling the same thing. Despite its intuitive appeal, Simulation Theory has to be tested to see what evidence exists for it in the brain.

The other proposed theory that attempts to explain empathy, which some researchers think completely opposes Simulation Theory, is known as Theory of Mind—the ability to understand what another person is thinking and feeling based on rules for how one should think and feel. Research exploring Theory of Mind became very popular in clinical work on autism, the basic finding being that autistic individuals cannot effectively represent or explain the mental states of others. More recently, tasks that tap Theory of Mind processes have been implemented in brain scanning studies. The results from these studies suggest that specific brain areas may underlie and support a Theory of Mind.

Sadly, some researchers have pledged their allegiance exclusively to one of these theories, creating an academic duel with the naïve assumption that one of these theories is right and the other blatantly wrong. At the risk of sounding cliché, I can’t help but ask the question: can’t we just get along?

What’s most likely, maybe, is that empathy is a multi-faceted process, with some aspects of it being more automatic and emotional (immediately getting upset when we see a loved one who’s upset) and other aspects of it that are more reflective and conceptual (understanding why someone might be upset based on what we know about the person, his/her personality, etc.). Whether the more automatic or the more reflective aspect “kicks in” will necessarily depend on the social context in which we find ourselves. This is a daunting, open question, and we’ll have to wait for social neuroscience as a field to grow a bit more and address it.

For now, what we can say from empathy research is that we have begun to understand how the brain gives rise to the wonderful capacity we have to “feel into” another human being. With the newfound tools of social neuroscience in hand, psychologists and neuroscientists are now on the cusp of more discoveries about the vibrant life of the empathic brain.

 


Princeton Explores the Art of Science

Check out this article via Metafilter about the Art of Science Competition at Princeton University

Princeton’s 5th Annual Art of Science Exhibition

“The Art of Science exhibition explores the interplay between science and art.  These practices both involve the pursuit of those moments of discovery when what you perceive suddenly becomes more than the sum of its parts.  Each piece in this exhibition is, in its own way, a record of such a moment.”


“This is the 5th Art of Science competition hosted by Princeton University. The 2011 competition drew 168 submissions from 20 departments. The exhibit includes work by undergraduates, faculty, research staff, graduate students, and alumni.”


View the Art of Science Submissions Here

Tropical Fish by Yunlai Zha. Image courtesy of http://www.princeton.edu/artofscience

RoboRoach Academy

I may have just met the kid who grows up and cures Alzheimer’s: the person who will one day say that his or her journey in biomedicine began thanks to two guys on a mission to democratize neuroscience.

Also, I saw a remote-controlled cockroach. A live cockroach saddled with a circuit backpack, steered via wireless controller. When I heard of it, I was standing in a park watching my son play soccer. A friend of mine came over and mentioned it, cyborg cockroaches in Clarkston, Michigan.

It was as if a spacecraft landed at midfield and the ghost of Jules Verne beckoned. I had to go.

***

I arrived at Clarkston Science, Math, and Technology Academy at about 9 a.m. Soldering irons surrounded eleventh-grade biology students. They spent the morning building biomedical equipment, SpikerBoxes, from kits developed by Backyard Brains.

Greg Gage and Tim Marzullo, both PhDs, founded the company.


“This actual project started as a joke,” Marzullo says of the SpikerBox. We pause to listen to the teacher instruct the students to check twice and solder once. Marzullo explains that PhD candidates who work in solid-state electronics labs may spend up to six years developing sophisticated, patentable equipment using next-generation chips and sensors. He and Gage wondered: if you just want to read neural activity, could someone do for less than a hundred dollars what costs a million?

They presented non-working prototypes at a conference two and a half years ago, and were flooded with responses.

“We had more attention in three hours in presenting these non-working prototypes than I did in six years of experiments in grad school,” says Marzullo. They responded and created a working model from off-the-shelf components.

The SpikerBoxes allow students to hear and see neural activity, called a spike.

Marzullo explains, “The EKG, the lub-dub, is a kind of a cultural phenomenon … [Like the heart], the neurons also fire, also use electricity to communicate as well. But it’s much faster—one millisecond long—and it’s much smaller in amplitude; it’s a much weaker signal. So that spike is kind of like that electrical pulse that travels down a neuron, and the rate of those pulses is one way that the brain encodes information. So when you’re seeing a spike, it’s like the first time you hear a heart beat.”

***

The students finish the SpikerBoxes; the cockroach experience begins. A few Blaberus discoidalis cockroaches will have a limb surgically removed. The legs, the scientists explain, have neurons firing in them, even after they are amputated, and will remain alive for up to two days.

Volunteers take on the roles of anesthesiologist and surgeon. Marzullo guides them through the procedure. The cockroaches are removed from their habitat and submerged in ice water. Using small, curved scissors, the leg is quickly and carefully cut and pinned to a SpikerBox. Students huddle around it, waiting to hear the spikes. It sounds like static. Gage and Marzullo then connect the box to an iPad, and students can see a visual representation of the sounds.

They discuss possible responses of the leg to stimuli, and reveal what will be one of the students’ favorite experiments: How will a cockroach leg respond to the sound vibrations of hip-hop, specifically the song “Love the Way You Lie”?

Visual evidence suggests that the legs preferred the beats of the Eminem verse to the melodic sections featuring Rihanna. The cockroach leg appears to dance.

Later, Marzullo says, “the first time that dancing leg thing worked, I nearly fell off my seat… [it’s] just science fiction far out.”

It’s more than that. Gage and Marzullo encourage the students to have a healthy skepticism. Is this real, they want to know, or are we tricking you? I find myself playing along. They could fake the spikes; how would I know the sound or wave pattern of a neuron? I could argue away that evidence as trickery. I’m having a harder time arguing with a newly severed, rhythmic limb.

***

I am inspired to make up words. Entrepreneurologists. Revulsionary. Creeptastic.

***

A group of prospective students, eighth-graders, come through the classroom on a tour. Mike Olsen, my friend, the teacher, tells them about the day’s activity. One of the students asks, “Is that ethical?”

He sees this kind of hands-on work as intellectual nutrition for his students, and reminds me that the cockroaches aren’t actually dying, and it’s true. Both the amputees and the implanted cockroaches continue their lives: eating, reproducing. Despite this, Marzullo tells me some of his colleagues feel that a three-dimensional computer model would suffice, that this is a step backward. In their eyes, the experiments become less ethical when the students participating are less supervised, less controlled, and less mature.

Gage and Marzullo see the participation differently. They see themselves at 16, longing to have this sort of opportunity. Beyond this, they wonder if an early understanding might lead to more rapid advancement in their field, eventually leading to breakthroughs that improve the quality of life for people dealing with brain function anomalies. They’ve received funding from the Kauffman Foundation, the Michigan New Economy Initiative, and the National Institutes of Health’s Small Business Innovation Research grant program. They’ll be reporting on how student retention of neuroscience concepts is impacted by these experiments over the next two years.

The cockroaches, then, aren’t the only subjects.

***

I survey the room. Gage walks around as spikes screech from each table. Marzullo holds his breath as he brings together electrode and antennae for a different experiment, the much-anticipated RoboRoach. When this step is complete, he says, “This is so wonderful, hearing sounds like this in a high school classroom.”

Another screech rises up from the lab tables, and Marzullo laughs as he returns to the prep. About a minute and a half later, one box sounds like high-pitched, club-style scratching. Marzullo looks up and explains to the cockroach deejay that this is how a theremin works as well. I find myself singing “Good Vibrations.”

The RoboRoach prep complete, students take turns pressing the buttons on a control panel about the size of the roach itself, laughing about what the cockroach might say if it could speak. They observe the cockroach at first responding to, then eventually ignoring the microstimulation.

The cockroach isn’t really a cyborg; he’s being tricked into moving in one direction or another. Eventually, the RoboRoach is no longer steerable. The microstimulation provides no reinforcement, so the impulse is adapted to, ignored. I imagine that this could be altered with a reward, a treat. For the rest of the day, I try not to be distracted by the vision of someone breeding a cockroach army.

***

I text my husband that this is the best day ever. It’s almost like living poetry in the classroom, watching students so engaged, watching scientists and teachers work with such enthusiasm and passion.

“The average person on the street, not even the average person, the above-average person doesn’t know how the brain works,” says Gage, “doesn’t even know the basic principles of the brain, that energy from the outside world, be it sound, light, heat, gets transformed into a neural code through these things, through these neurons, and then your brain processes this information and then causes your body to move, all through electricity.”

Marzullo says, “When you’re seeing a spike, it’s like the first time you hear a heart beat. You’re seeing that basic element of information-processing in your brain. And so we’ll see some this afternoon, and when you look at it, it’s like you’re looking at reality.”

It’s the stuff of fiction, but it’s real. It’s science and meta-science. It’s challenging; it’s full of potential; it feels like art.


Angry Babies and Automatic Minds

 

BAM! ANGRY BABY!

There it is.

“It” is the quick “Whoa, angry baby!” reaction you experienced when you saw the picture. Quick cognitive processes carried out by your brain immediately perceived the baby’s facial expression as the primary input. The output – the “whoa” response – is seemingly not subject to reflective thought. It just happens.

We can say our reaction is automatic; it doesn’t “wait” for us to reflect on it or change it. For many years, psychologists debated about what comes first: affective (good/bad) reactions, or deliberate and reflective thinking? Of course, these two processes are not mutually exclusive, since a lot of our cognition is probably an interaction between them. But many times, one type of process might have greater influence in driving a response to a stimulus, as we’ve seen here with this agitated little one.

So what makes up the stuff of thought? “Hot” affective reactions, or “cold” cognitive operations? For much of the last century, information processing models of cognition assumed that affective (evaluative) judgments happened at a post-cognitive stage. For example, if I wanted to determine whether or not I like a certain sweater, I would first make informed inferences based on multiple criteria of the sweater (such as softness of fabric, size, style, and so on). After individually calculating “weights” for each criterion, I would then sum the weights and, finally, make an evaluative decision about the sweater.
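The post-cognitive model described above—score each criterion, weight it, sum, then decide—is essentially a weighted-sum computation, and can be sketched in a few lines. The criteria, scores, weights, and decision threshold below are all illustrative inventions, not values from the article:

```python
# Hypothetical weighted-criteria evaluation of the sweater example.
criteria = {"softness": 8, "fit": 6, "style": 7}        # scores out of 10 (illustrative)
weights = {"softness": 0.5, "fit": 0.3, "style": 0.2}   # importance weights, summing to 1

# Sum each criterion's score multiplied by its weight...
score = sum(criteria[name] * weights[name] for name in criteria)

# ...and only then make the evaluative decision (threshold is illustrative).
like_it = score >= 6.5
print(score, like_it)
```

On this model, the affective judgment ("I like it") arrives only after the deliberate calculation—which is exactly the ordering Zajonc's paper challenged.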

The game-changer came in 1980, when the late Robert Zajonc published a seminal paper that swept the theory-scape of psychology. According to Google Scholar, this paper – “Feeling and Thinking: Preferences Need No Inferences” – has been cited nearly 3,600 times since its publication. Zajonc’s main premise was that affect tends to overwhelmingly color our perceptions and judgments, leaving little room for more reasoned thought. Affect dominates so much, Zajonc argued, that affective processes temporally precede and therefore trump deliberate thinking.

Social psychology has taken this assumption and run with it – probably prematurely. For example, in 1999 John Bargh published “The Unbearable Automaticity of Being” in American Psychologist. This paper highlights some interesting findings from Bargh’s research, suggesting that a lot of our behaviors are activated and carried out by unconscious processes. These processes, sometimes called heuristics, are usually rapid and affective in quality, and while they might be more functionally efficient, they give rise to errors and biases in our reasoning and judgments (see Tversky & Kahneman, 1974).

But Bargh, at the end of what seems like a permissive concession to the intractability of our automatic minds, comes to the strange conclusion that automatic processes “are in our service and best interests…They are, if anything, ‘mental butlers’ who know our tendencies and preferences so well that they anticipate and take care of them for us, without having to be asked.”

That’s all well and good – if our preferences and tendencies are healthy and adaptive, contributing to our (and others’) well-being and flourishing. But what about in the case of psychopathology? What should I tell the depressed patient about her recurring negative thoughts that paralyze her and prevent her from connecting with those whom she loves? Should I just tell her that her “mental butler” is taking care of everything, and that she shouldn’t go to therapy to try to change her thought processes?

While I respect Bargh’s opinion, I do not like its implications. Although we might, without proper training or knowledge, succumb to automatic thought processes that guide our judgments and behavior, we should not believe that we have no power over our thoughts. There is hope in the budding field of social and affective neuroscience, as many lines of research are revealing the possibility of adaptive cognitive change (such as mindfulness meditation and cognitive reappraisal).

William James once said, “Compared with what we ought to be, we are only half awake.” I would like to think that as we keep charting courses in psychology and neuroscience, we are waking up to the possibilities that our mysteriously beautiful minds can offer us.

Why studies of popular science are often wrong

From Newsweek: Popular Science: Beware False Claims.

It is a sorry fact of science that many, many of the results reported even in peer-reviewed, published studies are wrong – by some accounts, most are wrong. By dumb luck (also known as statistical errors), something that seems to be associated with something else isn’t; something that seems to cause something else doesn’t; or something that seems to be the result of something else isn’t. Alternatively, a study can fail to find evidence for something that, it later turns out, is indeed true. Both kinds of mistakes – false positives and false negatives – are well known to scientists, who take lots of precautions (not always successfully) to prevent them.
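The “dumb luck” behind false positives is easy to demonstrate with a small simulation (this sketch is mine, not from the Newsweek piece): generate many studies in which the true effect is exactly zero, run a standard significance test on each, and count how often a “significant” difference appears anyway. At the conventional 5% threshold, roughly one study in twenty reports an effect that isn’t there.

```python
import random
import statistics

def false_positive_demo(n_studies=1000, n_per_group=50, z_crit=1.96, seed=42):
    """Simulate studies comparing two groups drawn from the SAME distribution
    (true effect = zero) and count how many still look 'significant'."""
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(n_studies):
        a = [rng.gauss(0, 1) for _ in range(n_per_group)]
        b = [rng.gauss(0, 1) for _ in range(n_per_group)]
        # z-like statistic for the difference in sample means
        se = ((statistics.variance(a) + statistics.variance(b)) / n_per_group) ** 0.5
        z = (statistics.mean(a) - statistics.mean(b)) / se
        if abs(z) > z_crit:  # the study would be reported as a positive finding
            false_positives += 1
    return false_positives / n_studies

rate = false_positive_demo()
print(f"false-positive rate: {rate:.3f}")  # expect a rate near the 5% threshold
```

And this is the benign case: with many research teams chasing a hot topic, each running many tests, the published record naturally skews toward these lucky flukes.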

This would be bad enough if the areas where these mistakes cropped up were obscure, unimportant backwaters. But they’re not. According to a new study (yes, I understand the irony), the more popular a science topic is, the more likely its findings are to be riddled with errors.