Talk is cheap. Actions speak louder than words. These are clichés. But they are not necessarily true, as an experiment by Maros Servatka and colleagues shows.
They got subjects to play a simple game. In this, player A has $10 and can choose to give some, none or all to player B. However much he gives, the experimenter triples it. Player B then chooses to return some, none or all of the money to A.
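The arithmetic of the game can be sketched in a few lines (the function name and defaults are illustrative, not from the paper):

```python
def trust_game_payoffs(sent, returned, endowment=10, multiplier=3):
    """Payoffs in the trust game described above.

    A starts with `endowment` and sends `sent` to B; the experimenter
    multiplies it, so B receives sent * multiplier. B then returns
    `returned` of that to A.
    """
    assert 0 <= sent <= endowment
    tripled = sent * multiplier
    assert 0 <= returned <= tripled
    payoff_a = endowment - sent + returned
    payoff_b = tripled - returned
    return payoff_a, payoff_b

# If A sends everything and B splits the tripled $30 evenly,
# both end up with $15 -- better than the ($10, $0) of no trust:
print(trust_game_payoffs(10, 15))  # (15, 15)
```

The point of the design is visible in the numbers: trust creates a surplus ($30 to share rather than $10), but only if A believes B will give some of it back.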
This game hinges upon trust. The more A trusts B to return cash, the more he’ll give. So, how can A be made to trust B? The experimenters tried three things before the game was played. One was simply to do nothing. A second was to get B to write a message to A. The third was to have B give money to A.
And here’s the thing. The message worked much better at winning A’s trust and eliciting his contribution. On average, A gave $8.92 when B sent a message, compared to $7.31 when B sent a gift and just $5.55 when there was nothing.
Words, then, are better than actions. This, says Dr Servatka, is because “a key to building a trusting relationship is in conveying the idea that both players are entering a mutually beneficial transaction that will result in both of them being better off.” And it seems that a message does more to build such a relationship and to convey that idea than does a gift. It seems that people are keen to trust those who send them a message.
This is consistent with the finding (pdf) from dictator games that “communication dramatically influences altruistic behavior, and appears to largely work by heightening empathy.”
All this helps explain why firms spend so much on advertising: communication elicits giving.
But it raises a thought. Lots of communication is asymmetric: it runs more from the rich to the poor, from rulers to ruled, and from bosses to workers. This suggests that the rich and powerful are more likely to elicit trust and gifts from the poor and weak than vice versa.
In this sense, the very existence of the media serves to entrench inequality, even aside from their explicit ideological content.
How do you scale up such experiments?
Is there proof they demonstrate real world social behaviour?
Posted by: Keith | April 27, 2011 at 07:48 PM
I'm not sure about this. In Superfreakonomics they cited an experiment where people tried things like this in the real world as opposed to a lab, and the findings were basically the exact opposite.
Posted by: Cahal | April 28, 2011 at 01:11 AM
Were there results published for how much money was given *back*? The first part of the game has strategic interdependence; in the second stage it's always dominant to take all the money.
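The commenter's point can be made concrete with a quick backward-induction sketch (names and structure here are illustrative): if B's dominant move in the final stage is to return nothing, then A's best response is to send nothing, which is the standard game-theoretic prediction that the experimental subjects beat.

```python
def best_response_b(received):
    """In the second stage, returning money only lowers B's payoff,
    so keeping everything (returning 0) is dominant."""
    return 0

def best_response_a(endowment=10, multiplier=3):
    """Anticipating that B returns nothing, A's payoff from sending
    `sent` is endowment - sent, which is maximised at sent = 0."""
    return max(
        range(endowment + 1),
        key=lambda sent: endowment - sent
                         + best_response_b(sent * multiplier),
    )

print(best_response_a())  # 0
```

So in the subgame-perfect equilibrium nothing is sent and nothing is returned; any positive giving in the data is already a departure from that baseline.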
Paul Zak has run a series of experiments just like this, in which he identified a hormone (oxytocin) that correlates strongly with the more generous outcomes on both sides. Later, when he administered the hormone to subjects, they behaved more generously.
What Zak dubbed the "moral molecule" seems to cause the feeling of trust -- but it only lasts in the human body for a few minutes. It seems to me that this biological basis for generosity could just as easily be identified as "irrationality", since it leads players in these games to non-Nash -- but strictly superior -- outcomes, which are preferable from a social standpoint.
I chatted with Dan Ariely about this and he said he'd been unable to get FDA approval for the same technique in his own irrationality experiments, but that it definitely made sense from the evolutionary standpoint -- it's really useful to have us be randomly irrational in situations where rational behavior makes us collectively worse off.
Posted by: Ben Daniels | April 29, 2011 at 10:37 AM