In The Papers: Defending Homo Oeconomicus

I’ve got my beefs with the idea of homo oeconomicus:  namely, it’s a caricature of humanity.  This criticism is as old as the idea of homo oeconomicus itself, and it is something that economists recognize, even as they use the model.  Behavioral economics has, in recent years, become popular as a reaction against some of the assumptions of homo oeconomicus, such as pure rationality (in the sense that homo oeconomicus has unlimited computing capacity and full knowledge of all consequences of its actions).  Two camps have formed on this:  the first camp includes behavioral economists and argues that in order to further our understanding of the real world, we must model man in a more complex and serious manner; the second camp follows Milton Friedman’s view that as long as our models’ predictions comport with reality, it doesn’t really matter how unrealistic the simplifying assumptions are.  And that’s exactly what homo oeconomicus is:  a simplifying assumption.

Anyhow, I have a few papers which stick up for homo oeconomicus, either implicitly or explicitly.  The first paper is by Faruk Gul and Wolfgang Pesendorfer, entitled The Case for Mindless Economics (ungated version).

Abstract:

Neuroeconomics proposes radical changes in the methods of economics. This essay discusses the proposed changes in methodology, together with the neuroeconomic critique of standard economics. We do not assess the contributions or promise of neuroeconomic research. Rather, we offer a response to the neuroeconomic critique of standard economics.

In this paper, Gul and Pesendorfer defend standard economics against the critiques of neuroeconomists and psychologists, who argue that the model of homo oeconomicus does not take into consideration what happens in the brain.  Neuroeconomics tries to take psychological and neurological insights and apply them to economics.  Gul and Pesendorfer take a fairly pragmatic stance here, arguing that how people arrive at their decisions is not part of economics as such, and any evidence from neuroeconomists that people can choose poorly when pursuing what they really want is secondary to the study of economics as a science of choices.  Standard economics does not depend on a person’s brain working in any particular way, as economics abstracts away from that.  In short, argue the authors, neuroeconomics may very well be a fruitful field of research and may turn up interesting things, but its practitioners are looking at a different piece of the social science world, and results in that field will not affect standard economics one way or the other.

Moving along, next we have David Levine’s Max Weber Lecture, Is Behavioral Economics Doomed? (ungated version).

There is no abstract for this one, as it was a talk.  Levine’s talk focused on how standard economics gets a lot right.  For example, even though people do not live according to the model of homo oeconomicus, Levine argues that standard economics predicts voting behavior pretty well (though Bryan Caplan would disagree).

Levine even defends standard economics against the ultimatum game critique.  Basically, the ultimatum game goes like this:  you have $10 to split between yourself and a second person.  You choose the split, and the second person chooses whether or not to accept it.  If he accepts, you both take the money according to your split; if he refuses, neither of you gets a penny.  In either event, the game ends after this choice.  According to standard economic theory, you should offer the other person 1 cent, and the other person should accept, because 1 cent is still better than nothing.  This is one of the games behavioral economists use to critique the standard economic model, because in practice offers are far more generous than that and lowball offers tend to get rejected.  Levine points out, though, that if you think about the game a bit longer, there is a good interpretation of the results within the standard economic framework:  the power of spite and the threat of retaliation, especially in multi-shot games.  In addition, Levine argues, people search out the equilibrium through repeated play, so we tend to see multi-shot games ending closer to what standard economics would suggest:  a high take-home amount for the first person, but not quite 100%.

Finally, Elif Incekara Hafalir and George Loewenstein accidentally support homo oeconomicus in The Impact of Credit Cards on Spending:  A Field Experiment (ungated version).

Abstract:

In a field experiment, we measure the impact of payment with credit card as compared with cash on insurance company employees’ spending on lunch in a cafeteria. We exogenously changed some diners’ payment medium from cash to a credit card by giving them an incentive to pay with a credit card. Surprisingly, we find that credit cards do not increase spending. However, the use of credit cards has a differential impact on spending for revolvers and convenience users: Revolvers spend less when induced to spend with a credit card, whereas convenience users display the opposite pattern.

Basically, the authors ran an experiment designed to show that people act irrationally when you have them use credit cards as opposed to cash:  namely, that they will spend more money when paying with credit than with cash.  In the results, however, they found no statistically significant difference in spending between their control group and the group of individuals they enticed into using credit cards instead of cash (by offering higher-valued Amazon gift cards as a reward for paying for lunch with a credit card).

And, to end this, I shall link to a blog post on a David Levine article.  It seems that people who want to dump homo oeconomicus are going to have to do a bit more than point out that the simplifying assumptions are unrealistic; they’ll also have to show that more complex assumptions explain the world better, that their results directly affect utility-based choice behavior, and that their models can explain everything standard economics explains plus the anomalies, so that the standard results form a subset of the new ones.  Behavioral economics as it stands does not meet these criteria, at least according to the papers above.

Like A Hot, Heavy Notepad

I purchased a used Gateway M275 from an auction and picked it up on Sunday.  Here’s my quick review, having had a day to play with it.

When I bought it, the original owners apparently wiped the hard drive to remove any sensitive business information (it was a business auction) and re-installed Windows.  Unfortunately, they re-installed the normal version of Windows XP.  Fortunately, I have access to the Tablet PC edition, so I installed that.  When the device is in laptop mode, the monitor is a little rickety, as it likes to swivel a bit.  Otherwise, however, it’s a pretty good Gateway.  It doesn’t have a nub between the G and H keys like my other Gateway does, and its touchpad (only two buttons below the pad, with a scroll bar built into its right side) isn’t quite as good as the one on my M460, which I really like.

Anyhow, after re-installing with the Tablet PC edition of Windows XP, I figured I would try it out for its real use:  tablet mode.  When you swivel the monitor around and lay it back against the keyboard, the laptop becomes a heavy notepad.  Writing is fairly easy to do, and after getting Firefox to display scrollbars on the left-hand side, I decided to try to do the same for all applications, given my sinister nature.  Unfortunately, it appears that Windows does not have an option to change the default position of its scroll bars.  That ruined that idea…

One thing I really like is that a “rotate 90 degrees” button is built right into the monitor bezel.  So far, I have used the tablet vertically, so I have to hit that button a couple of times to get the screen into the correct orientation.  It would have been nice to have a gyroscope figure out which way the tablet is turned, but if you lay it flat, I guess that could cause some problems.  Anyhow, I don’t like laying this flat anyway, because the LCD starts getting dark if you aren’t staring at it almost head-on.

All in all, it’s pretty nice considering how much I paid and that it’s a few years old.  The screen isn’t quite wide enough to make Firefox browsing in vertical mode a joy:  many webpages want relatively wide screens, so at 768×1024 (the portrait flip of 1024×768, the highest resolution the graphics card supports), you don’t get much content in the middle column.  The 512 MB of RAM is also a little low, though it wasn’t by the standards of the time, and there’s a free slot for additional RAM, with a maximum capacity of 2 GB.

Oh, and one final thing:  it does get pretty hot, so if you’re going through a 4-hour design session, you will definitely feel the laptop by the end.

Smoot And Hawley Are Off The Hook

Replace their names with Waxman and Markey.  People don’t like their cap-and-trade bill, it doesn’t do anything good, nobody really knows what’s in it (sounds like the “stimulus” bill there, doesn’t it?), and it results in major tax hikes and increased gas prices (diesel will go up too, Pat, so wipe that smug look off your face…).  This is really little more than a grab bag for Democrats’ constituent groups and doesn’t do much of anything to “solve” the problem they’re supposedly “fixing.”  Congratulations, Democrats:  you’ve just given Republicans a winning issue for 2010, even if the Senate bottles this up.

In The Papers: Name Discrimination

I recently found out that I have access to NBER working papers, due to my .gov domain.  As a result, I can print off the latest papers that folks are working on.  Because I’m reading a lot more academic papers now than previously (kind of sad when you think about it), I figured I could start an ongoing series, discussing some of what I’ve been reading.

The first paper is Philip Oreopoulos’s Why Do Skilled Immigrants Struggle in the Labor Market? A Field Experiment with Six Thousand Resumes.  Here is the abstract:

Thousands of resumes were sent in response to online job postings across multiple occupations in Toronto to investigate why Canadian immigrants, allowed in based on skill, struggle in the labor market. Resumes were constructed to plausibly represent recent immigrants under the point system from the three largest countries of origin (China, India, and Pakistan) and Britain, as well as non-immigrants with and without ethnic-sounding names. In addition to names, I randomized where applicants received their undergraduate degree, whether their job experience was gained in Toronto or Mumbai (or another foreign city), and whether they listed being fluent in multiple languages (including French). The study produced four main findings: 1) Interview request rates for English-named applicants with Canadian education and experience were more than three times higher compared to resumes with Chinese, Indian, or Pakistani names with foreign education and experience (5 percent versus 16 percent), but were no different compared to foreign applicants from Britain. 2) Employers valued experience acquired in Canada much more than if acquired in a foreign country. Changing foreign resumes to include only experience from Canada raised callback rates to 11 percent. 3) Among resumes listing 4 to 6 years of Canadian experience, whether an applicant’s degree was from Canada or not, or whether the applicant obtained additional Canadian education or not had no impact on the chances for an interview request. 4) Canadian applicants that differed only by name had substantially different callback rates: Those with English-sounding names received interview requests 40 percent more often than applicants with Chinese, Indian, or Pakistani names (16 percent versus 11 percent). Overall, the results suggest considerable employer discrimination against applicants with ethnic names or with experience from foreign firms.

A couple of things struck me as extremely interesting here.  First, applicants with ethnic last names still ended up with lower callback rates even when their first names were Canadianized (so, for example, Amy Wang rather than Tao Wang).

Second, Chinese-sounding names carried a statistically significant disadvantage even with Canadian education and experience, or with foreign education and Canadian experience; meanwhile, in those cases, it is marginally better to be Indian or Pakistani (and it kind of sucks altogether if you don’t have the major Canadian experience; resumes with foreign education and experience had a 5% callback rate).  Given the large ethnic Chinese population in Canada, this is very surprising.  British resumes, however, had basically the same callback rates as Canadian ones.

Third, fitting in with the stereotype, programmers were least likely to discriminate against ethnic names.  In three of the four cases, the difference was within one standard deviation of the British/Canadian mean.  The only situation in which it mattered was foreign education with mixed experience.  I can’t figure that one out…  On the other side, finance jobs tended to discriminate most harshly against ethnic names:  all four resume types were statistically significant at the 1% level.  Even with 6 years of Canadian experience and a degree from a Canadian university, a person with a foreign name would only get called back 14% of the time, as opposed to 21% for Canadians with English names.  Individuals with foreign names, education, and experience received a callback roughly 5% of the time.  I think I can understand this, at least to some extent:  financial rules differ across countries, so it is important to have a good understanding of Canadian financial regulations and accounting practices.  But in that case, you would think that somebody who studied in Canada and had 6 years of Canadian financial experience would have just as much knowledge as somebody born in Canada with the same education and experience.

One thing I do wish this paper had covered is American resumes, to see whether they play the same role as British ones (i.e., no discrimination).  I would guess that they do, though.  I would also guess that trends are similar in the US, once you control for degree (in Canada, thanks to its skill-based immigration program, almost all immigrants have bachelor’s degrees, so this study had all of the fictitious candidates holding bachelor’s or master’s degrees).

Database & Programming Notes

Here are a few database notes I’ve squirreled away for your enjoyment.

- I’m seriously thinking about going to PASS this year.  It will be in Seattle, so I could visit my brother and his wife again.  We’ll see how things look, though, when it gets closer to that time.

- Speaking of conferences, PyOhio will take place next month, July 25th & 26th.  It will be hosted at Knowlton Hall on OSU’s campus, which is about 2 1/2 miles away from me.  This Saturday, I’m thinking about walking over there to change up my normal walking routine, and I’m very interested in attending the conference itself, too.  Fortunately, it’s free of charge, and I might be able to talk a co-worker or two into attending, as I’ve been advocating that we use Python in the office, especially with IronPython getting more stable.

- This is something which seems counter-intuitive:  nested user-defined functions might not give you much of a performance hit at all.  I was under the impression that, at least when you aren’t using Microsoft’s built-in functions, a non-table (i.e., scalar) user-defined function forced SQL Server to process row-by-row, leading to the well-named RBAR (“row by agonizing row”).  I’m going to have to re-evaluate that thought, I guess.  (I’ve sketched what this kind of nesting looks like after this list.)

- Here are a couple of interesting posts on rollback and rollforward tables.  The concept is intriguing, particularly from an auditing perspective.  I haven’t had a chance to test these out yet, so I’m not sure exactly how difficult they would be to fit into our current framework at work, but the idea seems sound.

- Sometimes, SQL Server articles aren’t all that good.  In this particular article, the author uses the “select * from table where (@Variable is NULL or Variable = @Variable)” syntax.  We were doing this at work for a while before finding out from Adam Machanic’s book that this is probably the absolute worst way of solving the problem.  Basically, you’re trying to handle two separate queries with one statement, but SQL Server can only cache one plan for it.  If that plan covers the “@Variable is NULL” case, it won’t be optimized for selecting a small set of records; but if the plan is optimized for a small record selection and you pass in NULL for @Variable, the end result can be worse than a table scan.  Instead, if you have search parameters which may or may not be null, you should build a dynamic SQL statement and execute that (there’s a sketch of this after the list).  Erland Sommarskog has more on all of the dynamic search condition possibilities in T-SQL, so read that.  His recommendation is to use this syntax only if you know you are dealing with a rather small table:  the benefit is that it is very easy for a developer to read, and if the additional time isn’t that big of a factor (half a second, say, over a superior method), stick with the easy stuff.  He is perhaps a little kinder to it than I am, but probably because this syntax burned me previously.

- I’m a big fan of indexed views, but this advice is good:  don’t treat indexed views as permanent.  Even if one is helpful now, it may be harmful under a different table size or structure.  (A minimal example of creating and dropping one closes out this list.)
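
To make the UDF note above concrete, here is a minimal sketch of the kind of nesting I mean.  All of the object names are hypothetical, and this isn’t the linked post’s actual test; it’s just the general shape of one scalar UDF calling another.

    -- One scalar UDF (hypothetical names throughout)...
    CREATE FUNCTION dbo.TrimString (@Input varchar(100))
    RETURNS varchar(100)
    AS
    BEGIN
        RETURN LTRIM(RTRIM(@Input));
    END;
    GO

    -- ...nested inside a second scalar UDF.
    CREATE FUNCTION dbo.CleanString (@Input varchar(100))
    RETURNS varchar(100)
    AS
    BEGIN
        RETURN UPPER(dbo.TrimString(@Input));
    END;
    GO

    -- The nested pair runs once per row; the question is how much that
    -- costs versus writing UPPER(LTRIM(RTRIM(SomeColumn))) inline.
    SELECT dbo.CleanString(SomeColumn)
    FROM dbo.SomeTable;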
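
And here is the dynamic SQL sketch promised in the search-conditions note.  The procedure, table, and columns are all hypothetical, but the pattern follows the general approach Sommarskog describes:  append only the predicates you actually need, then execute the string through sp_executesql so the values stay parameterized.

    -- Hypothetical search procedure with two optional parameters.
    CREATE PROCEDURE dbo.SearchOrders
        @CustomerID int = NULL,
        @OrderDate datetime = NULL
    AS
    BEGIN
        DECLARE @sql nvarchar(max);
        SET @sql = N'SELECT OrderID, CustomerID, OrderDate FROM dbo.Orders WHERE 1 = 1';

        -- Append a predicate only when its parameter was supplied, so SQL Server
        -- compiles and caches a plan suited to each combination of filters.
        IF @CustomerID IS NOT NULL
            SET @sql = @sql + N' AND CustomerID = @CustomerID';
        IF @OrderDate IS NOT NULL
            SET @sql = @sql + N' AND OrderDate = @OrderDate';

        -- sp_executesql keeps the values as parameters, which avoids injection
        -- problems and lets each variant of the query reuse its own plan.
        EXEC sp_executesql @sql,
            N'@CustomerID int, @OrderDate datetime',
            @CustomerID = @CustomerID,
            @OrderDate = @OrderDate;
    END;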
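
Finally, since indexed views came up:  here is a minimal sketch, again with hypothetical names, of what creating and later dropping one looks like.  The unique clustered index is what materializes the view, and dropping that index is how you retire the view when the table’s size or usage changes.

    -- SCHEMABINDING and two-part table names are required for indexed views.
    CREATE VIEW dbo.SalesByProduct
    WITH SCHEMABINDING
    AS
    SELECT ProductID,
           SUM(Amount) AS TotalAmount,   -- assumes Amount is declared NOT NULL
           COUNT_BIG(*) AS NumberOfRows  -- COUNT_BIG(*) is required with GROUP BY
    FROM dbo.Sales
    GROUP BY ProductID;
    GO

    -- The unique clustered index materializes the view's results on disk.
    CREATE UNIQUE CLUSTERED INDEX IX_SalesByProduct
        ON dbo.SalesByProduct (ProductID);
    GO

    -- Re-evaluate periodically; if the view stops paying its way, drop the index.
    DROP INDEX IX_SalesByProduct ON dbo.SalesByProduct;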

Re-Living The Fatal Conceit

John Stossel has a new blog, and in one of his first posts, he brings up F.A. Hayek’s idea of the fatal conceit:  the belief that planners can successfully design and direct an economy that actually runs on knowledge dispersed across millions of individuals.  Unfortunately, politicians never seem to learn that lesson.

He also points out that Joe Biden, Supergenius, says that nobody could have predicted that the “stimulus” wouldn’t do what proponents claimed.  Stossel takes him to task, and Patrick, in the comments, cracked me up with “Four out of every seven days, Joe is surprised by the sun coming up.”