Our competitive nature is not natural

The heart of economics is an understanding of human choice. Our theories are almost all constructed on a utilitarian model of preferences and choice, so understanding how we can improve that model is crucial to progress. Plenty of work has been done by behavioural empiricists on the heuristics that guide our behaviour and the anomalies in our choices. Now a paper by Leibbrandt, Gneezy, and List demonstrates that preferences also change over time in a predictable way. Read more

Nobel 2013: Fama, Shiller, and Hansen

Yah, the Nobel prize.  All guys who deserved it … I just wouldn’t have expected them to get it together.  To be honest, the reasoning makes sense though – they have all added significantly to the empirical analysis of asset prices, albeit in quite different ways 🙂

Still, don’t read me.  Read Cochrane (here, here, here, here, here).  And Marginal Revolution (here, here, here, here, here, here).

Also I enjoyed this.  And this post on why the Chicago school gets so many Nobel laureates is a good counter-measure to all the arbitrary bile that can be thrown around on the interwebs 🙂 .  I also enjoyed this post from Noah Smith.

I have a bias towards Shiller in all of this because of my interests.  He is a big proponent of trying to view economic phenomena through a lens of history dependence (with the regulatory difficulties that entails) and has talked about how exciting neuroeconomics is – I completely agree.  However, I have little to say about empirical finance in and of itself, as it is not my field.  While I think some of the stuff is pretty cool (and I remember really liking GMM a few years back), I have nothing to add.  Hence why you should be going back and clicking those links to Cochrane and Marginal Revolution 😉

Quote of the day: Pinker on changes in social sciences

Via Noah Smith.  We have the following quote from the Pinker vs Wieseltier debate on science and the humanities (if you have a chance I would suggest reading the debates themselves as well).

The era in which an essayist can get away with ex cathedra pronouncements on factual questions in social science is coming to an end.

Very good, and Pinker’s co-operative version of science with the humanities seems appropriate to me (where we are merely asking how to deal with certain propositions using the best tools available).  I think Pinker won this debate, though I am unsure why Wieseltier felt it necessary to take such an extreme position.  I suspect he initially believed Pinker was trying to force through a view based on the superiority of scientific authority (one that Pinker rules out in his initial article!), when Pinker was really just suggesting the use of the scientific method (namely introducing a degree of the positivist view of theory creation) given the improvements in data availability and usability we have had.

As XKCD says:

But even within Pinker’s reasonable claims there is one area where I would be a touch careful. Read more

Quote(s) of the day: Keuzenkamp on controls and data mining

As I indicated here, I am reading Probability, Econometrics, and Truth.  It is a nice outline of things, and I’m enjoying it at present (I was 20% of the way through when I wrote this post over five weeks ago).

Two quotes I’d like to note down here: Read more

Lies, damn lies, and statistics

Last week David Grimmond wrote (here and here):

However, despite the great informational power of statistics, bear in mind that sample based statistics are still always measured with error.

How often do we hear news items that note something like: “according to the latest political poll, the support for the Haveigotadealforyou Party has increased from 9% to 9.5%” etc, but then just before closing the item they state that the survey has a 2% margin of error.

If you are awake to this point you suddenly realise that you have just been totally misled.

With an error margin of 2 percentage points, you cannot make any inference about anything within a 2 percentage point margin.

After discussing this point he states:

One can react to this article in (at least) two ways: one could become a bit more relaxed about the significance of changes reported in statistics or one could seek improvements to the accuracy of statistical collection.

In what areas do you think the first option is appropriate, and in what ways would it be worthwhile to increase spending to improve accuracy?
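To see concretely why a 0.5 point move can be pure noise, here is a minimal sketch of the standard margin-of-error calculation for a sample proportion (the sample size of 1,000 is an assumption for illustration; real polls report their own):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p
    from n respondents, using the normal approximation."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 1,000 respondents, 9% support
moe = margin_of_error(0.09, 1000)
print(f"margin of error: {moe:.3f}")  # roughly 0.018, i.e. about 1.8 percentage points

# The reported move from 9% to 9.5% is 0.5 points -- well inside the noise
print((0.095 - 0.09) < moe)  # True
```

Note that the "2% margin of error" usually quoted in news items is the worst case, computed at p = 0.5; for a party polling near 9% the interval is a little tighter, but still far wider than the reported change.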


Visualisation is the trendy way to represent data these days, but sight is not our only sense. Via Dave Giles I see that some researchers are exploring sonification as a way to represent information. They claim that

…with complex data series one can often hear patterns or persistent pitches that would be difficult to show visually. Musical pitches are periodic components of sound and repetition over time can be readily discerned by the listener.

Giles has previously covered the topic and it’s well worth having a look at the series he examines in that post. I know I’ve spent far too much time staring blankly at a volatile time series plot before attempting various transformations just so that my eyes can make sense of it. If there is a better way that uses other senses to quickly discern patterns in the data then I’m all for it!
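As a toy illustration of the idea (not the researchers' actual method, and the temperature figures are made up), here is a sketch that maps a numeric series onto MIDI note numbers, so that a rising series becomes a rising melody:

```python
def series_to_semitones(values, low=48, high=72):
    """Linearly map a numeric series onto MIDI note numbers
    (48 = C3 to 72 = C5 by default) -- a crude 'sonification'."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid dividing by zero for a flat series
    return [round(low + (v - lo) / span * (high - low)) for v in values]

temps = [14.1, 14.3, 14.0, 14.6, 15.0, 15.2]  # made-up warming-style series
notes = series_to_semitones(temps)
print(notes)  # [50, 54, 48, 60, 68, 72]
```

Feeding the resulting note numbers to any MIDI synthesiser would let you hear the trend; periodic components in the data come out as repeating pitch patterns, which is the effect the quote above describes.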

For a more populist example, here’s a cellist playing a time series of temperature readings: A song of our warming planet