Sayles Research Guidelines

2017–2018 Application Guidelines


Winter, Spring, Summer, and Fall Terms


Award amounts may vary. As determined by Dartmouth’s Grants Coordinating Committee (GCC), award maximums are set at $2,000 per student for domestic research projects and $2,500 for projects that take place abroad. Awards are considered taxable income by the IRS.


The Ethics Institute offers Student Research Opportunities to support Dartmouth undergraduate students. Students develop and present their proposed project to the Ethics Institute as outlined below. Projects may take place either off-campus or in Hanover. Research opportunities do not carry academic credit.


  • Applicant must be at least a sophomore and must be enrolled for at least one term following the research;
  • Applicant must submit a complete application to the Ethics Institute by the deadline;
  • A 3–5 page Final Report must be submitted evaluating the completed research in terms of its value and its impact on the student. Copies of any formal work resulting from the research period must be submitted with the Final Report or when completed;
  • A brief Financial Report must be submitted showing how funds were utilized;
  • The Final Report and Financial Report are due the third week of the term following the research.

Grants are awarded on the basis of:

  • the merits of the project and the strength of the proposal;
  • a solid academic background and overall grade point average;
  • faculty recommendation and/or sponsor support for the project.

Application deadline

The sixth week of the term preceding the term of research.

  • for Fall Term ’17: August 2, 2017
  • for Winter Term ’18: October 21, 2017
  • for Spring Term ’18: February 12, 2018
  • for Summer Term ’18: May 7, 2018

General Application Instructions

Sayles Grant Award Recipient, 2013, Christopher Givens

Excerpt from his experience: For the past eight weeks, I worked as a research assistant for the Moral Cognition Lab at Harvard University, directed by Professor Joshua Greene. I worked primarily under Joseph Paxton, a graduate student in the laboratory who studies moral judgment and moral decision-making. Together, Joe and I set out to create an experiment that would investigate the respective roles of automatic and conscious processes in the formation of moral judgments.

Past research has supported a dual-system framework in moral psychology, in which deontological judgments typically result from automatic emotional responses, while utilitarian judgments primarily stem from cognitively controlled processes. However, Joe and I wanted to further investigate this distinction, asking: under what conditions will people make utilitarian or deontological judgments? We designed a survey with two sections presented in random order. The first was a cognitive reflection task (CRT), which included three short math questions whose correct answers were counterintuitive (e.g., "A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?"). The second section presented three different moral dilemmas that asked the participant to assess the moral permissibility (on a scale from 1 to 7) of an action that was intuitively undesirable but, from a utilitarian perspective, brought about better consequences than inaction:
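The bat-and-ball item is counterintuitive because the reflexive answer ($0.10) would make the total $1.20, not $1.10. A quick sketch of the arithmetic behind the correct answer, working in cents to avoid floating-point rounding:

```python
# Bat-and-ball CRT item: bat + ball = $1.10 and bat = ball + $1.00.
# From the two constraints: 2 * ball + 100 = 110, so ball = 5 cents.
total = 110   # bat + ball, in cents
diff = 100    # bat - ball, in cents

ball = (total - diff) // 2   # 5 cents, not the intuitive 10 cents
bat = ball + diff            # 105 cents

print(ball, bat)  # -> 5 105
```

Reaching $0.05 requires overriding the intuitive response, which is exactly the kind of reflective processing the CRT is meant to measure.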

Enemy soldiers have taken over Jane's village. They have orders to kill all remaining civilians. Jane and some of her townspeople have sought refuge in the cellar of a large house. Outside they hear the voices of soldiers who have come to search the house for valuables. Jane's baby begins to cry loudly. She covers his mouth to block the sound. If she removes her hand from his mouth, his crying will summon the attention of the soldiers who will kill her, her child, and the others hiding out in the cellar. To save herself and the others she must smother her child to death. Should Jane smother her child in order to save herself and the other townspeople?

Our hypothesis was that participants who received the CRT questions first, and answered them correctly, would be more likely to make utilitarian judgments when rating moral permissibility. After running the experiment with 300 participants, the data confirmed our hypothesis: those primed with the CRT questions, and who answered them correctly, were significantly more likely to override intuitive aversion in order to make more utilitarian judgments when presented with moral dilemmas.

The results of this experiment will inform the projected topic of my senior culminating project, which will concern the normative implications of research findings in moral psychology. That is, how should the emerging science of morality change, if at all, the way we think about normativity? After my research this summer, I can confidently conclude that a reliance on intuitive moral judgments can lead to disastrous outcomes. It is in the best interest of our society to encourage reasoning and reflection not only in the moral domain, but in decision-making generally. Only through reason and reflection can decision and judgment be utilized to bring about the best overall outcomes, rather than be subject to the unforeseen consequences of intuitive judgment and decision. Additionally, from a practical perspective, the results of our study indicate a means by which we can encourage reason and reflection in judgment and decision-making. If we can make the utilitarian principle more accessible to the general consciousness, then people will be more likely to critically evaluate their intuitive reactions to moral concerns.

Last Updated: 9/22/17