Associate Professor of Psychology, Murray State University
Co-founder and Director of Research, scite.ai
My work centers on technology and society and has two foci. First, I examine the intersection of technology and relationships, and the role social media use plays in influencing psychosocial outcomes in dyadic and extra-dyadic processes. Specifically, I am interested in social support processes as they occur in different venues (online, through social network services, and in more traditional contexts), and in how individual-level and situational factors influence support-seeking behavior.
Second, I examine moral expression over social media, and motivations for online activism. I am particularly interested in the application of personality theory to the study of moral messaging as expressed in an online context.
I am an advocate of the open science movement and of methodological reform in the psychological sciences, and I have been involved in multiple replication efforts. I am also interested in improving intellectual diversity in social and personality psychology, and I apply a wide variety of theoretical and empirical approaches in my research and teaching.
Full CV available here.
2009 - 2014
Kent State University, Kent, OH
2008 - 2009
East Tennessee State University, Johnson City, TN
2006 - 2008
East Tennessee State University, Johnson City, TN
MA, Experimental Psychology
2004 - 2005
North Georgia College and State University, Dahlonega, GA
2001 - 2003
Gainesville Community College, Gainesville, GA
Rife, S.C., Rosati, D., & Nicholson, J.M. (2021). scite: The Next Generation of Citations. Forthcoming in Learned Publishing.
Nicholson, J. M., Uppala, A., Sieber, M., Grabitz, P., Mordaunt, M., & Rife, S.C. (2020). Measuring the quality of scientific references in Wikipedia: an analysis of more than 115M citations to over 800,000 scientific articles. Forthcoming in The FEBS Journal.
Wikipedia is a widely used online reference work that cites hundreds of thousands of scientific articles across its entries. The quality of these citations has not previously been measured, and such measurements bear on the reliability and quality of the scientific portions of this reference work. Using a novel technique, a massive database of qualitatively described citations, and machine learning algorithms, we analyzed 1,923,575 Wikipedia articles that cited a total of 824,298 scientific articles in our database. We found that most scientific articles cited by Wikipedia articles are uncited or untested by subsequent studies, and that the remainder show wide variability in contradicting or supporting evidence. Additionally, we analyzed 51,804,643 scientific articles from journals indexed in the Web of Science and found that, similarly, most were uncited or untested by subsequent studies, while the remainder show wide variability in contradicting or supporting evidence.
Baranski, E., Baskin, E., Coary, S., Ebersole, C. R., Krueger, L. E., Lazarević, L. B., ... & Rife, S. C. (2020). Many Labs 5: Registered Replication of Shnabel and Nadler (2008), Study 4. Forthcoming in Advances in Methods and Practices in Psychological Science.
Shnabel and Nadler (2008) assessed a needs-based model of reconciliation suggesting that in conflicts, victims and perpetrators have different psychological needs that when satisfied increase the chances of reconciliation. For instance, Shnabel and Nadler found that after a conflict, perpetrators indicated that they had a need for social acceptance and were more likely to reconcile after their sense of social acceptance was restored, whereas victims indicated that they had a need for power and were more likely to reconcile after their sense of power was restored. Gilbert (2016), as a part of the Reproducibility Project: Psychology (RP:P), attempted to replicate these findings using different study materials but did not find support for the original effect. In an attempt to reconcile these discrepant findings, we conducted two new sets of replications—one using the RP:P protocol and another using modified materials meant to be more relatable to undergraduate participants. Teams from eight universities contributed to data collection (N = 2,738). We did find moderation by protocol; the focal interaction from the revised protocol, but not from the RP:P protocol, replicated the interaction in the original study. We discuss differences in, and possible explanations for, the patterns of results across protocols.
Rife, S.C., Cate, K.L., Kosinski, M., & Stillwell, D. (2016). Participant recruitment and data collection through Facebook: the role of personality factors. International Journal of Social Research Methodology, 19(1), 69-83.
As participant recruitment and data collection over the Internet have become more common, numerous observers have expressed concern regarding the validity of research conducted in this fashion. One growing method of conducting research over the Internet involves recruiting participants and administering questionnaires over Facebook, the world's largest social networking service. If Facebook is to be considered a viable platform for social research, it is necessary to demonstrate that Facebook users are sufficiently heterogeneous and that research conducted through Facebook is likely to produce results that can be generalized to a larger population. The present study examines these questions by comparing demographic and personality data collected over Facebook with data collected through a standalone website, and with data collected from college undergraduates at two universities. Results indicate that statistically significant differences exist between the Facebook data and the comparison datasets, but because 80% of analyses exhibited partial η2 < .05, such differences are small or practically nonsignificant in magnitude. We conclude that Facebook is a viable research platform, and that recruiting Facebook users for research purposes is a promising avenue that offers numerous advantages over traditional samples.
209 Wells Hall, Murray State University, Murray, KY 42071