
Psychology of Technology Research Network

We advance research that improves the human-technology relationship

WHY WE STUDY THE PSYCHOLOGY OF TECHNOLOGY

The Psychology of Technology Research Network is a non-profit network of behavioral scientists, technology designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologies. We focus on psychology because it is the core driver of individual and collective behavior; we focus on technology because it has powerful shaping effects on psychological health, at the individual and societal level. To solve major challenges such as climate change, pandemics, and nuclear threats, we must understand and improve the human-technology relationship. To achieve our mission, our network facilitates new partnerships and conversations across the tech ecosystem in order to ask better questions, find answers more quickly, and spread insights more efficiently among scientists, tech designers, policymakers, and end users.


Updates on our work

  • May 23-26 - The Association for Psychological Science (APS) Annual Convention is the premier global event in psychological science, attracting more than 2,000 researchers from around the world each year to share the latest discoveries and developments across a variety of fields. In 2024, the convention takes place in San Francisco, California. Nathanael Fast, Director of the Neely Center, has been invited to join a panel at the event to speak about his research on AI.

  • April 8-9 - The University of Michigan organized a hybrid conference on Social Media and Society in India, featuring speakers who discussed the many ways social media is shaping contemporary life in India. Now in its fourth iteration at the University of Michigan, the event is a premier venue for these conversations. Jimmy Narang, a postdoc at the Neely Center, presented his research on the mechanics of misinformation distribution in India. The Neely Center is proud to co-sponsor the event.

  • April 1 - In the summer of 2023, a small team at Yale's Justice Collaboratory – comprising Matt Katsaros, Andrea Gately, and Jessica Araujo – collaborated with Ishita Chordia, a researcher at the University of Washington Information School, to better understand discussions about crime on Nextdoor. They leveraged both the Neely Center Social Media Index and the Neely Center Design Code in their work, which was published recently in Tech Policy Press. The paper proposes recommendations for enhancing online crime discussions.

  • March 27 - When we launched the Neely Social Media Index last year, we found that US adults who used X (Twitter) and Facebook were 2-3 times more likely to see content on those platforms that they considered bad for the world. In a Substack post, Matt Motyl delved into whether these experiences with detrimental content have evolved over the past year and how this varies across different social media and communication platforms. He found a decrease in reports of harmful content on Facebook, whereas on X (Twitter), there was an increase in reports of content potentially escalating the risk of violence.         

  • March 26 - The European Commission recently sought our feedback on guidelines to mitigate systemic risks in electoral processes on large online platforms. The Neely Center provided input on the importance of design-based solutions that address many of the known limitations of watermarking. The Commission specifically cited the Neely Center among academic stakeholders who warned against an over-reliance on watermarking and labeling. We advised exploring design-based approaches, especially for scenarios where malicious actors might circumvent detection through watermarking.

  • March 26 - The Neely Center is proud to have contributed to Jonathan Haidt's new book, The Anxious Generation. Several ideas from our Design Code appear in Chapter 10, which concerns steps that technology companies and governments can take to improve teen mental health. Ravi Iyer, Neely Center Managing Director and longtime research collaborator of Prof. Haidt, helped write parts of the chapter, including its focus on design and on device-based age authentication. The Neely Center continues to work with the team behind the book to turn its momentum into positive societal change.

  • March 22 - It is exciting to see several of our partners leveraging the Neely Center’s Design Code in engaging with global policymakers and civil society groups. Our partners at Build Up have engaged with the Kenyan and Ghanaian governments about specific ideas within our Design Code and continue to have fruitful dialogues about how these codes can be integrated into government policies.  On March 22, 2024, our partners at Search for Common Ground organized a gathering in Sri Lanka of global civil society organizations working to combat Technology Facilitated Gender Based Violence (TFGBV) and invited the Neely Center to present our design recommendations to the group, as an alternative to current content-based legislation being considered in places like Sri Lanka, which civil society organizations worry will be used to curb free expression.

  • March 13 - Ravi Iyer, Managing Director of the Neely Center, was invited to join a panel at Stanford to present academic and industry perspectives to the White House Kids Online Health and Safety Task Force.  In his remarks, he emphasized the recommendations from the Neely Center's Design Code that call for specific changes to platforms, backed by empirical evidence, that empower kids to avoid negative experiences with technology.               

  • March 12 - In an invited talk with the Federal Trade Commission (FTC), the Neely Center's Ravi Iyer discussed the impact of manipulative design patterns in social media, aligning with the FTC's focus on "Dark Patterns." His testimony emphasized our role in advocating for transparency and fairness in digital design. The Neely Design Code provides specific design recommendations for policymakers and technologists to improve the impact of social media platforms on society. We are excited to see the Neely Center's work contributing to substantive discussions on digital ethics.

  • March 11 - The Neely Center, in collaboration with the Council on Technology and Social Cohesion, hosted the Design Solutions Summit 2024 in Washington DC. This event brought together a select group of thought leaders and innovators at the forefront of technology and democracy, focusing on the critical role of design in enhancing online civic discourse. The event was kicked off by a speech from Kathy Boockvar, former Secretary of State of Pennsylvania, who discussed the effect that online discourse can have on elections, and included talks by numerous technologists with experience at Google, Facebook, and Instagram on potential design-based solutions. Among the participants in the workshop were representatives from Meta, Twitter, Google, USC, Notre Dame, Build Up, Search for Common Ground, Villanova University, the Prosocial Design Network, the National Democratic Institute, Stanford, Reset.Tech, Aspen Institute, the US State Department, Knight Georgetown, the Department of Homeland Security, the American Psychological Association, Athena Strategies, the Alliance for Peacebuilding, Protect Democracy, and India Civil Watch International. The event was co-sponsored by the Council on Technology and Social Cohesion and hosted at Search for Common Ground headquarters. Several participants leveraged the Neely Center's design code and election recommendations in their remarks. The convening proved productive and relevant for participants. As one attendee noted, "For someone working in the responsible tech field, the summit was an incredible opportunity to learn not just about new design solutions but, almost more importantly, where the field is converging on which design solutions are most powerful."

  • March 7-8 - Ravi Iyer, Managing Director of the Neely Center, spoke at Story Movements 2024, a convening supported by the MacArthur Foundation and hosted by American University's Center for Media and Social Impact. Ravi was part of a session entitled "AI, Social Media & Tech for the Future," held on March 7th in Washington DC. The conference brought together people and organizations working to repair and imagine a just world through media, storytelling, comedy, research, and technology. The event was open to the public.

  • March 5 - In a recent Substack post, Neely Center senior advisor Matt Motyl delves into the shifting dynamics of social media usage and its impact on user well-being and societal norms between 2023 and 2024. The study, supported by the Neely Social Media Index, provides a comparative look at how engagement with social media platforms has evolved since our initial survey in early 2023, revealing notable trends such as a 5.8% decrease in YouTube usage and a 2.9% decrease in LinkedIn and X usage among US adults. No platform increased its share of users in this time span.

  • March - We are excited to share our Design Code with decision-makers within the UK Government and the UK's communications regulator (Ofcom) as they implement the new Online Safety Act. Ofcom is now designing and consulting on its codes of practice to implement the Act. Ofcom also has a history of measuring user experiences online, similar to our Neely Center Indices, and there is much to be learned methodologically across both efforts. Ofcom recently added Ravi Iyer, our Managing Director, as a member of its Economics and Analytics Group Academic Panel. As Ofcom implements the Online Safety Act in the UK, Ravi Iyer will be advising them on conceptual frameworks and empirical approaches to understand, measure, and improve outcomes for people in digital communications.

  • February 27 - Following recommendations from the Neely Center, Rep. Zach Stephenson has introduced the “Prohibiting Social Media Manipulation Act” aimed at curbing design practices that undermine user autonomy and elevate risks for Minnesotans on social media platforms. Ravi Iyer, Managing Director at the Neely Center, contributed insights and testified in support of the bill, which incorporates several of the Center's proposals such as enhanced privacy settings, ethical content amplification, reasonable usage limits, and greater transparency in platform testing.

  • February 20 - In an article discussing the risk of AI-powered deepfakes for India's 2024 election, Al Jazeera talked with Ravi Iyer, the Neely Center's Managing Director, about how platforms should respond. In keeping with our previous work on algorithmic design, Ravi discussed the difficulty platforms would have in detecting deepfakes and instead suggested redesigning algorithms that currently incentivize polarizing content. The ethical implications of deepfakes are undeniable, and regulating them remains a complex issue. Yet, safeguarding the integrity of our elections and democracy is paramount.

  • February 18 - We are excited to share this recently released paper, which the Neely Center helped sponsor and co-author, illuminating industry knowledge about the tradeoffs between quality and engagement optimization within algorithms. The paper highlights one of our core design code proposals: that platforms should optimize not for engagement but for judgments of quality. In collaboration with numerous partners in academia (University of California, Berkeley; Cornell Tech), civil society (Integrity Institute), and industry (Pinterest, LinkedIn), it also discusses many concrete alternative ways that platforms have introduced signals of quality into algorithms, often by eliciting explicit preferences, with measurable results. The paper was also recently covered in Tech Policy Press's Sunday Show podcast.
  • February 6 - Ravi Iyer, Managing Director of the Neely Center, presented on “AI and Human Relationships: The Problem of Authenticity” at a symposium titled Science and Religion: Being Human in the Age of AI, organized by the Nova Forum. Ravi’s panel explored the intricate interplay between science and religion in our rapidly evolving technological landscape.

  • February 1 - In a recently released report, the Minnesota Attorney General's office leveraged the Neely Center for Ethical Leadership and Decision Making's Design Code for its comprehensive study on the impacts of social media and artificial intelligence on young people. The report not only highlights the challenges posed by digital platforms but also recommends actionable steps toward creating a safer online environment for youth, drawing on the principles outlined in the Design Code. Moreover, it cites the Neely Center's Social Media Index as a credible tool for monitoring user experiences with technology. Attorney General Ellison emphasized the report's importance in shaping policies that protect young internet users from the adverse effects of emerging technologies: "The report my office released today details how technologies like social media and AI are harming children and teenagers and makes recommendations for what we can do to create a better online environment for young people. I will continue to use all the tools at my disposal to prevent ruthless corporations from preying on our children. I hope other policymakers will use the contents of this report to do the same."

  • February 1 - NBC11 in Minneapolis spoke with Ravi Iyer, Managing Director, Neely Center for Ethical Leadership and Decision Making, about the Center's role in helping to shape the state's recommendations to safeguard social media user experiences.

  • February - We are thrilled to share an insightful essay that emerged from our collaboration with global peacebuilding organizations. Published by Conciliation Resources in “Accord: An International Review of Peace Initiatives” (Issue 30), this piece advocates for stakeholders to not only identify and address individual instances of harmful content within their communities but also to push for systemic reforms of the incentives within these digital ecosystems. The essay argues that peacebuilders and mediators must move beyond reactive moderation to proactive prevention, influencing the foundational policies that govern social media platforms.

  • January 31 - X CEO Linda Yaccarino's recent Senate testimony revealed a shift in the platform's approach to safety, with a notable increase in trust and safety staff and plans to hire more moderators. However, this move has sparked discussions about its sufficiency in ensuring user protection, especially for minors. In this Wired article, Matt Motyl, our senior advisor at the Neely Center, highlights the challenges of such measures, calling for a more genuine commitment to safety in tech.

  • January 31 - In the American Psychological Association "Speaking of Psychology" podcast (Episode 271), Nathanael Fast, Director of the Neely Center for Ethical Leadership and Decision Making, discussed how AI affects people’s decision-making and why it’s important that the potential benefits of AI flow to everyone, not just the privileged.

  • January 12 - In a recent insightful article by the American Psychological Association, the Neely Center’s Director Nathanael Fast and Managing Director Ravi Iyer shared their thoughts and recommendations on an innovative approach to integrating psychological principles into tech design. Ravi drew on his expertise in ethical tech applications to discuss how social media can be made safer and healthier for children and youth. Nate emphasized the critical need for diverse inputs in AI design, advocating for ethical frameworks to prevent biases and societal harm. This piece is a great read for anyone interested in how psychology can drive technological advancements.

  • January 11 - Artificial Intelligence is weaving its way into the fabric of our daily lives more seamlessly than ever before. According to the latest analysis from the Neely UAS AI Index, 18% of US adults have interacted with AI-driven chat tools such as ChatGPT, Bard, and Claude. With this rapid growth in adoption and an estimated generative AI market value of $1.3T by 2030, we must examine how chat-based AI tools are being taken up. In a thought-provoking Substack post, our senior advisor Matt Motyl, postdoctoral researcher Jimmy Narang, and Neely Center Director Nate Fast unpack the potential ramifications of chat-based AI tool adoption.

  • January 9 - In this article by Politico, the Neely Center's Director Nathanael Fast and Affiliate Faculty Director Juliana Schroeder were featured for their insights on AI's growing influence. The piece delves into the rapid integration of AI technologies in various industries and the ethical implications that accompany this trend. Addressing the issues around AI ethics and the challenges we face in this rapidly evolving landscape is crucial for understanding how we can navigate these advancements responsibly.

  • January 9 - Ravi Iyer, the Managing Director of the Neely Center, was a featured guest on the 375th episode of the Techdirt Podcast. In the segment, Ravi talked about the Design Code for Social Media developed by the Neely Center which proposes specific steps we can take to design social media systems that safeguard society more effectively. A lively debate ensued!

  • January 9-12 - CES 2024, the Consumer Technology Association's flagship conference and a pivotal event in the tech world, featured the Neely Center's Director, Nathanael Fast, and Managing Director, Ravi Iyer, as contributing speakers. They presented at sessions offering insights into the ethical implications of technology, covering both well-established and emerging areas. The conference represents an invaluable opportunity for attendees to delve into the rapidly evolving landscape of tech ethics and understand the critical role of leadership in navigating its complexities.