Emma Williams-Baron
2024–25 Dissertation Fellowship
My dissertation investigates gender and race segregation in the labor market from three angles. First, I focus on the role of employer discrimination, exploring whether and how an accountability intervention mitigates hiring bias against women job-seekers based on their race and parental status. Next, from the angle of job-seeker self-segregation, I test how hiring organizations' race and gender demographics affect job-seekers' perceptions of those organizations and their subsequent job application rates. In this project, I also establish whether remote and hybrid options increase job-seekers' interest in organizations whose demographics do not match their own. Finally, I evaluate the implications of multiple job-holding for the extent of, and trends in, occupational segregation by gender and race over the past several decades. Across these three projects, my dissertation contributes novel theoretical and empirical insights to the literatures on gender and race inequality, workplace and labor market dynamics, diversity, bias, and segregation. I not only establish the extent of inequality across multiple domains but also evaluate potential tools for increasing equity.
Ideal Remote Workers: Exploring Gendered Implications of the Rise of Remote Work
2020–21 Survey Lab Project
In collaboration with Claire Daviss and Erin Macke.
We use six survey experiments to investigate three main theoretical questions about gender inequality in the workplace. First, how does an accountability intervention operate across contexts, beginning with one in which gender and parental status are salient and then one that is both gendered and racialized? Second, how does a candidate’s history of remote work affect evaluators’ perceptions of the candidate’s hireability for positions with traditional work arrangements, and how does this effect differ by gender and parental status? And third, how do evaluators’ preferences for certain candidates (women versus men, parents versus non-parents) differ when hiring for remote positions versus in-person positions? Across these three studies, each of which includes two survey experiments, we also explore a methodological question: can digital trace data be leveraged to detect bias and understand theoretical mechanisms in survey experiments? Digital trace data are records of participants’ behaviors, such as mouse clicks or time spent reading sections of a resume, rather than their expressed attitudes or beliefs, and can therefore yield new insights into theoretical questions when paired with traditional survey measures.