Sociology

Erin Macke

2023–24 Dissertation Fellowship

One of the primary studies in Erin's dissertation focuses on credit allocation for team performance. Past researchers have studied status by having individuals on collectively oriented teams work on interdependent tasks and measuring influence through deference, or the number of times group members yielded decision-making power. High-status individuals typically defer less often and are deferred to more often. While measuring deference is crucial for understanding how the task itself is completed, status processes extend beyond the interaction period. Specifically, we examine how individual credit is allocated for the team's effort. Because teamwork is common across industries, and because individuals reap its rewards in the form of promotions, recognition, or monetary bonuses, it is crucial to understand how gender-status beliefs manifest beyond the task dynamics alone. To investigate this, we design an experiment in which we can measure both deference during a task and credit-taking (or credit-giving) behavior upon receiving positive performance feedback after the task.
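
To give a concrete sense of the deference measure, the sketch below tallies how often each group member yields and how often they are deferred to across decision rounds. The log format, field names, and values are hypothetical, invented for illustration rather than drawn from the study's actual coding scheme.

```python
# Hypothetical illustration: tallying deference across decision rounds.
# The data structure and field names are invented for this sketch; the
# experiment's actual coding scheme is not specified in the description above.
from collections import Counter

# Each round records which participant yielded (changed their answer to match
# their partner's) and which participant's initial choice prevailed.
rounds = [
    {"yielded": "A", "prevailed": "B"},
    {"yielded": "A", "prevailed": "B"},
    {"yielded": "B", "prevailed": "A"},
    {"yielded": "A", "prevailed": "B"},
]

times_deferred = Counter(r["yielded"] for r in rounds)       # how often each person gave in
times_deferred_to = Counter(r["prevailed"] for r in rounds)  # how often each person's choice won

for person in sorted(set(times_deferred) | set(times_deferred_to)):
    rate = times_deferred[person] / len(rounds)
    print(f"{person}: deferred {times_deferred[person]}x, "
          f"deferred to {times_deferred_to[person]}x, deference rate {rate:.2f}")
```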

Ideal Remote Workers: Exploring Gendered Implications of the Rise of Remote Work
2020–21 Survey Lab Project

In collaboration with Claire Daviss and Emma Williams-Baron.

We use six survey experiments to investigate three main theoretical questions about gender inequality in the workplace. First, how does an accountability intervention operate across contexts (first in a context where gender and parental status are salient, and then in a context that is both gendered and racialized)? Second, how does a candidate’s history of remote work affect evaluators’ perceptions of their hireability for positions with traditional work arrangements, and how does this effect differ by gender and parental status? Third, how do evaluators’ preferences for certain candidates (women versus men, parents versus non-parents) differ when hiring for remote positions versus in-person positions? Across these three studies, each of which includes two survey experiments, we also explore a methodological question: can digital trace data be leveraged to detect bias and understand theoretical mechanisms in survey experiments? Digital trace data are records of participants’ behaviors (such as mouse clicks or time spent reading sections of a resume) rather than their expressed attitudes or beliefs, and can therefore yield new insights into theoretical questions when paired with traditional survey measures.
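
As a rough illustration of how trace data might be paired with survey measures, the sketch below merges hypothetical dwell-time records with hypothetical hireability ratings and compares them across experimental conditions. All column names, conditions, and values are invented for the example and do not reflect the studies' actual instruments or data.

```python
# Hypothetical sketch of pairing digital trace data with survey responses.
# Column names, conditions, and values are invented for illustration only.
import pandas as pd

# Survey measures: each evaluator's hireability rating and assigned condition.
survey = pd.DataFrame({
    "participant_id": [1, 2, 3, 4],
    "condition": ["remote_history", "remote_history",
                  "in_person_history", "in_person_history"],
    "hireability_rating": [4, 5, 6, 5],
})

# Trace measures: seconds each evaluator spent on each resume section.
traces = pd.DataFrame({
    "participant_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "resume_section": ["work_history", "education"] * 4,
    "dwell_seconds": [12.4, 5.1, 15.0, 4.8, 7.2, 6.0, 8.1, 5.5],
})

# Pair the two data sources, then compare attention to the work-history
# section across conditions alongside the expressed ratings.
merged = traces.merge(survey, on="participant_id")
summary = (
    merged[merged["resume_section"] == "work_history"]
    .groupby("condition")[["dwell_seconds", "hireability_rating"]]
    .mean()
)
print(summary)
```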