
Psychology of Technology Research Network

We advance research that improves the human-technology relationship

WHY WE STUDY THE PSYCHOLOGY OF TECHNOLOGY

The Psychology of Technology Research Network is a non-profit network of behavioral scientists, technology designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologies. We focus on psychology because it is the core driver of individual and collective behavior; we focus on technology because it has powerful shaping effects on psychological health at both the individual and societal levels. To solve major challenges such as climate change, pandemics, and nuclear threats, we must understand and improve the human-technology relationship. To achieve our mission, our network facilitates new partnerships and conversations across the tech ecosystem to ask better questions, find answers more quickly, and spread insights more efficiently among scientists, tech designers, policymakers, and end users.


Updates on our work

  • December 13 - In a new article, our Director, Nate Fast, our Managing Director, Ravi Iyer, and their coauthors explore the transformative potential of large language models (LLMs) to foster more inclusive and participatory online spaces. While LLMs hold immense promise—such as enabling deliberative dialogues at scale—they also pose challenges that could deepen societal divides. To address this, the authors propose a forward-looking agenda for strengthening digital public squares and ensuring the responsible use of AI. Their goal? To foster innovation while safeguarding against the misuse of these powerful technologies.

  • December 2 - The 2024 presidential election has left many of us feeling like we’re living in two separate realities. With political polarization at an all-time high, only a third of Americans believe our democracy is functioning well, and nearly 60% report feeling stressed and frustrated when discussing politics with those who hold opposing views. In his recent San Francisco Chronicle op-ed, Nate Fast, Director of the Neely Center for Ethical Leadership and Decision-Making, offers a hopeful perspective grounded in his work fostering dialogue among young people. This summer, Nate and his partners brought 438 young Americans from across the political spectrum to Washington, DC, for a unique event titled America in One Room: The Youth Vote. Participants engaged in Deliberative Polling, a process designed to encourage thoughtful, informed discussions. The outcome was remarkable: Participants communicated respectfully and deliberated earnestly, even on contentious issues like abortion, artificial intelligence, and climate policy. Breaking the stereotypes of Gen Z as disengaged or overly polarized, these young individuals demonstrated that under the right conditions, meaningful dialogue is not just possible—it’s transformative.

  • November 14-16 - The Neely Center is proud to sponsor the Build Peace Conference for both 2024 and 2025. Technology impacts societies worldwide, and some of the most profound effects—both positive and negative—are seen in global majority countries. Last year, in Nairobi, we led several events to share insights on global platform design trends and gather input from diverse communities on their ideal technology products. This year, we facilitated similar discussions on the progress of global design governance and measurement initiatives. Our goal is to inform global policymakers and engage peacebuilders on how best to design AI, social media, and mixed reality systems for the benefit of all.

  • November 14-15 - Nate Fast, Director of the Neely Center, was an invited speaker at the Impact Guild Forum 2024. Organized by the UTA Foundation—a nonprofit committed to harnessing the power of media, entertainment, and the arts for social impact—the Forum convened thought leaders to explore media's role in democracy and its potential to shape our shared future. At the event, Nate contributed to the plenary session, "Persuasion at Scale: Artificial Intelligence, Data, and the Future of Storytelling," offering insights on how emerging technologies are transforming the ways we communicate and foster connection in today's digital landscape.

  • November 6 - Recently, Ravi Iyer, Managing Director of the Neely Center, delivered a keynote presentation at Ofcom's public event on "Evaluating Effectiveness of Online Safety Measures." During his presentation, he introduced both the Neely Center Design Code for Social Media and the Neely Indices. Ravi continues to serve on Ofcom's academic panel, contributing insights to support the implementation of the United Kingdom's Online Safety Act.

  • November 5 - In a recent Tech Policy Press article examining the quality of social media newsfeeds in the lead-up to the 2024 election, the author cited the Neely Social Media Index. Emerging evidence from the index highlights the impact of declining information quality on user engagement, suggesting that, in the long term, platforms may have an incentive to invest in content integrity. According to the USC Neely Social Media Index, 30 percent of adults reported seeing content they considered “bad for the world” on social media, particularly on platforms like X and Facebook.

  • October 31 - Recently, EY invited thought leaders from business, government, and academia to join the EY.ai Global AI Advisory Council. This newly established council unites top thinkers to guide EY’s AI strategy and address the rapid technological and market shifts shaping AI today. The council’s focus spans several domains, including customer experience, talent, human behavior, and industry impact. We are thrilled to share that Nathanael Fast, Director of the Neely Center, has been invited to join the Council. Nate will contribute insights on ethical considerations in democratizing AI, joining a diverse group of leaders to help navigate the opportunities and challenges of this transformative field. Congratulations to Nate on this prestigious appointment!

  • October 16 - Our partners at Search for Common Ground and Build Up continue to champion the power of design in enhancing the global impact of technology platforms. Their efforts are particularly significant in contexts where civil society groups lack trust in governments to make fair content-related decisions. At the 2024 Online Safety Forum in Lagos, Nigeria, they showcased several innovative design ideas, including contributions from the Neely Center's Design Code for Social Media. Their session, titled "Designing for Good: The Role of Prosocial Tech Design in Ensuring Safety and Cohesion," explored how thoughtful tech design can promote safety and foster social harmony. Key takeaways from the session are available here.

  • October 12-13 - The Psychology of Technology Institute, in collaboration with the Digital Business Institute at Boston University’s Questrom School of Business, was honored to host the 8th Annual Psychology of Technology Conference, titled “New Directions in Research on the Psychology of Technology,” on October 12-13, 2024. This year’s theme, “The Quantified Society,” brought together a diverse group of industry leaders, behavioral scientists, technologists, and AI experts dedicated to fostering a healthy psychological future as AI becomes an integral part of daily life. Conference speakers included Madeleine Daepp, Microsoft Research; Johannes Eichstaedt, Stanford University; Emily Saltz, Google Jigsaw; Glenn Ellingson, Civic Health Project; Tara Behrend, Michigan State University; Andrea Liebman, Swedish Psychological Defence Agency; Chloe Autio, Autio Strategies; and Dokyun "DK" Lee, Boston University, among others. The keynote was delivered by Luis von Ahn, CEO and co-founder of Duolingo.

  • October 9-10 - USC Marshall’s Neely Center showcased its Design Code for Social Media at multiple key engagements across Europe. We participated in a European Commission-sponsored workshop focused on protecting minors and co-hosted an event in Brussels with civil society groups influencing the implementation of the Digital Services Act through the Council on Technology and Social Cohesion, which we co-chair. Additionally, the Council organized a workshop at the European University Institute in Florence on October 9-10, 2024, bringing together leading scholars advancing regulatory approaches. Across these events, the Neely Center Design Code for Social Media played a pivotal role in shaping discussions on how technology platforms can be designed to positively impact society.

  • October - Ravi Iyer, Managing Director of the Neely Center, delivered an invited talk as part of the University of Utah's Daniels Fund Lecture Series, which aims to provide students with an ethical perspective on current events. In his talk, Ravi shared insights into the challenges and opportunities for creating a more ethical and socially responsible social media environment. Drawing from his experiences at platforms like Facebook, he discussed how thoughtful design choices can mitigate negative societal impacts. Key topics included the importance of effective content moderation, incentivizing positive engagement, and addressing the influence of algorithms on user behavior.

  • September 26-27 - As part of its outreach to technologists, the USC Neely Center presented its Design Code for Social Media and Neely Indices to hundreds of technology professionals at Trustcon 2024—the world’s leading conference for the Trust and Safety community. The Neely Center also participated in the Trust and Safety Research Conference, which brings together academics and industry professionals working in trust and safety.

  • September 23 - The European Union Office in San Francisco hosted Behind the Screen: Policy Approaches to Protecting Children Online, a public event focused on safeguarding children in digital environments. At the event, Ravi Iyer, Managing Director of the Neely Center, presented insights from the Neely Design Code, highlighting the critical importance of designing digital platforms with child safety at the forefront. His discussion underscored the need for responsible technological development that balances safety with privacy, ensuring that online spaces are not only engaging but also safe and supportive of children's well-being.

  • September 23 - Ravi Iyer, Managing Director of the Neely Center, recently joined a panel with California Assemblymember Buffy Wicks and Martin Harris Hess, Head of Protection of Minors for the EU, to discuss policy approaches to protecting children online. A recording of the event is available online. The Neely Center also provided formal input into the upcoming code of practice for the Digital Services Act, participated in several workshops to support this work, and remains actively engaged with EU regulators in shaping the Act's implementation in ways that protect minors.

  • August - Under the leadership of David Evan Harris, a Senior Advisor to the Neely Center, the "AI Ethics for Leaders" course has been taught for several semesters at the University of California, Berkeley, with two sections offered this year: one for undergraduates and one for international students. Ravi Iyer, Managing Director of the Neely Center, guest lectured both last semester and this semester, and the Neely Center is continuing to refine the curriculum so it can be offered to other institutions. USC's Neely Center both conceptualized the course and funded its initial curriculum.