Browsing by Author "Feld, Jan"
Now showing 1 - 3 of 3
Item: Endophilia or Exophobia: Beyond Discrimination (2014-05)
Feld, Jan; Salamanca, Nicolás; Hamermesh, Daniel S.; Dept. of Economics, University of Gothenburg
The discrimination literature treats outcomes as relative. But does a differential arise because agents discriminate against others (exophobia) or because they favor their own kind (endophilia)? Using a field experiment that randomly assigned graders to students' exams that did or did not contain names, we find favoritism but no discrimination by nationality on average, and some evidence of favoritism toward the opposite gender. We identify distributions of individuals' preferences for favoritism and discrimination. We show that a changing correlation between them generates perverse changes in market differentials and that their relative importance informs the choice of a base group in adjusting wage differentials.

Item: Estimating the Relationship between Skill and Overconfidence (2015-09)
Feld, Jan; Sauermann, Jan; De Grip, Andries; Dept. of Economics, University of Gothenburg
The Dunning–Kruger effect states that the low skilled are overconfident while the high skilled assess their skill more accurately. In apparent support of this effect, many studies have shown that low performers overestimate their performance while high performers are more accurate. This empirical pattern, however, might be a statistical artifact caused by measurement error. We are the first to estimate the Dunning–Kruger effect consistently, using an instrumental variable approach. In the context of exam grade predictions by economics students, we use students' grade point average as an instrument for their skill. Our results support the existence of the Dunning–Kruger effect.

Item: Understanding Peer Effects: On the Nature, Estimation and Channels of Peer Effects (2014-06)
Feld, Jan; Zölitz, Ulf; Dept. of Economics, University of Gothenburg
This paper provides evidence on ability peer effects in university education. Identification comes from the random assignment of students to sections. We find that students on average benefit from higher-ability peers; low-ability students, however, are harmed by high-ability peers. We introduce a placebo analysis that provides a simple test for quantifying the estimation bias driven by the mechanisms described in Angrist (2013). In our setting, this bias is small and does not drive our results. An analysis of students' course evaluations suggests that peer effects are driven by improved student interaction rather than by adjustments in teachers' behavior or students' effort.
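The abstract of "Estimating the Relationship between Skill and Overconfidence" describes using grade point average as an instrument for skill. A minimal sketch of that two-stage least squares idea is given below; the file name, column names, and the use of the linearmodels package are illustrative assumptions, not the authors' actual code or data.

```python
# Illustrative sketch (not the authors' code): GPA instruments for the
# noisy skill measure, removing measurement-error bias in the
# skill-overconfidence relationship.
import pandas as pd
from linearmodels.iv import IV2SLS

# Hypothetical data: one row per student with the grade the student
# predicted, the realised exam grade, and prior grade point average.
df = pd.read_csv("exam_predictions.csv")  # assumed file and columns
df["overestimation"] = df["predicted_grade"] - df["exam_grade"]

# OLS of overestimation on exam_grade is biased because noise in
# exam_grade enters both sides with opposite signs; instrumenting
# exam_grade with GPA keeps only the skill-related variation.
iv = IV2SLS.from_formula(
    "overestimation ~ 1 + [exam_grade ~ gpa]", data=df
).fit(cov_type="robust")
print(iv.summary)
```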
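Similarly, the abstract of "Understanding Peer Effects" relies on random assignment of students to sections. One common way to operationalize such a design is a leave-one-out peer-ability regression, sketched below under assumed data and column names; this is not the authors' estimation code.

```python
# Illustrative sketch (not the authors' code): leave-one-out peer
# ability regression under random assignment to sections.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student-course with the section id,
# the student's own prior GPA, and the course grade.
df = pd.read_csv("sections.csv")  # assumed file and columns

# Leave-one-out peer mean: (section GPA sum - own GPA) / (n - 1).
g = df.groupby("section_id")["own_gpa"]
df["peer_gpa"] = (g.transform("sum") - df["own_gpa"]) / (g.transform("count") - 1)

# With random assignment, peer_gpa is exogenous; clustering standard
# errors by section allows for common shocks within a section.
res = smf.ols("grade ~ peer_gpa + own_gpa", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["section_id"]}
)
print(res.summary())
```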