Neely Center for Ethical Leadership and Decision Making
Today's leaders are burdened with ever-growing expectations and dilemmas. The Neely Center for Ethical Leadership and Decision Making provides leaders with tools at the intersection of ethics and technology so they can make wise decisions for their organizations while feeling confident about the future. In so doing, we aim to help solve what we believe to be the most difficult, but most important, challenge of our time – how to align emerging technologies with ethical, human-centered values.
About
The Neely Center focuses on three technology-powered innovations poised to revolutionize business and society: 1) Social media platforms and their effects on individuals and communities around the world; 2) Artificial intelligence and its implications for human coordination and decision making; and 3) Immersive and mixed reality (AR/VR/XR) environments with the potential to enhance or detract from physical reality. In each case, we seek to steer these powerful tools toward their benefits and away from harms.
The Neely Center was founded thanks to a generous endowment provided by USC Trustee Jerry Neely and his wife Nancy. The Center’s mission is to guide leaders in making responsible decisions surrounding the development, implementation, and management of emerging technologies. By fostering and promoting cutting-edge research, comprehensive education, and cross-disciplinary dialogue, we strive to ensure that technological advancements contribute to both immediate and enduring societal benefits.
Neely Design Code for Social Media
The USC Neely Center remains a thought leader in advocating for specific design changes that can improve technology platforms' impact on both our democracy and our children.
Discover the Neely Design Code for Social Media HERE
Recommendations from our design code have been adopted into a new law in Minnesota as well as the UK's Code of Practice for kids. We have written pieces for TechPolicyPress and Lawfare and recorded podcasts with TechPolicyPress and Techdirt. We have hosted and/or spoken at numerous DC events, including America in One Room: The Youth Vote. We hold formal advisory roles with NIST, the Minnesota Attorney General, and the UK's Ofcom. We have also advised officials from New Mexico, California, the UN, the FTC, and the EU's Digital Services Act enforcement team.
We have collaborated on design-oriented efforts with diverse groups such as the Tech Law Justice Project, Knight Columbia, Knight Georgetown, Convergence, the UNDP, Search for Common Ground, the Center for Humane Technology, Protect Democracy, and many others. We co-chair the Council on Technology and Social Cohesion, which has used our design code in advocacy efforts in places as diverse as Mali, Kenya, Sri Lanka, and Kyrgyzstan.
We have worked on innovations based on our design code with product leaders at Nextdoor, Google, Pinterest, Twitch, and Meta. Employees at several technology companies have cited our work as influential. We have published a cross-industry collaborative paper on better-designed algorithms alongside industry, and have two other cross-industry papers in preparation.
The Neely Indices
USC Marshall’s Neely Ethics & Technology Indices include three nationally representative metrics aimed at capturing people’s usage and experiences of harm and benefit across three critical technologies: social media, artificial intelligence (AI), and mixed reality (AR/VR).
The Neely Social Media Index, Neely-UAS Artificial Intelligence (AI) Index, and Neely Mixed Reality Index are publicly available and represent America's first longitudinal, nationally representative analysis of social media, AI, and mixed reality user interactions.
We collect monthly data from a representative sample of US households on usage of and positive and negative experiences with specific social media platforms. Our hope is that this information can help stakeholders reward platforms that meaningfully connect and inform users, while holding accountable platforms that create negative experiences for users and society.
We collect monthly data from a representative sample of US households on usage of and attitudes toward artificial intelligence. We are developing questions on people’s experiences with artificial intelligence.
We collect monthly data from a representative sample of US households to measure increasing usage of and experiences with mixed reality technology, including augmented reality (AR) and virtual reality (VR) applications. Our focus extends to understanding how the growing usage of these technologies impacts daily life, productivity, and overall well-being. Through this index, we aim to shed light on the potential benefits and challenges associated with mixed reality, providing insights that can guide developers, policymakers, and consumers in making informed decisions about the adoption and regulation of these emerging technologies.
The Neely Ethics & Technology Fellows Program aims to support visionary MBA students poised to become the next generation of technology leaders. Each cohort explores and guides the development of a new area of transformative technology. The 2023-24 cohort is focusing on mixed reality (AR/VR), with implications for entertainment, gaming, collaboration, education, and healthcare.
An innovation-focused professional with 11 years of extensive experience in technology services, tech-consulting, and project leadership, specializing in new product development and data analytics. Led global teams across Europe, North America, India, and Australia, delivering transformative solutions in Big Data, AI, Cloud, and Analytics. Currently enrolled in a leadership development program, cultivating a lifelong network. Excels in driving collaborative change, commercial software development, leveraging data for decisions, and fostering collaboration within diverse global teams.
Cindy (Ningxin) Chen has over ten years of experience in cross-border technology operations and venture investment. She has worked for the publicly listed online retailer JD.COM, as well as venture capital firms such as Comet Labs and WI Harper Group, where she was responsible for tech investments in semiconductors, the Internet of Things (IoT), artificial intelligence (AI), intelligent manufacturing, among others. Additionally, she co-founded a SaaS platform for electronics manufacturing, where she successfully developed the business development and marketing team and formulated a global go-to-market strategy. Cindy holds a bachelor’s degree in physics from Fudan University and is an MBA candidate at the University of Southern California. As a Fellow at the USC Neely Center, she is dedicated to researching ethical methods for investing venture capital funds within the technology sector, particularly in VR/AR.
Tejaswa is an MBA student whose leadership roles include heading emerging technology for the graduate technology club at USC Marshall; the Neely Center's work on XR is a natural extension of this interest. He has an undergraduate degree in computer science from the College of Engineering Pune. He began his career at Credit Suisse, building technology products, and has since moved into consulting. At KPMG, he advised aerospace and defense clients on growth strategy, and he spent his MBA summer with EY's technology strategy team, where he plans to return full-time. He has also interned with Meta and HP Tech Ventures in product marketing and corporate VC roles, respectively. In his free time, he enjoys hiking and reading up on topics far removed from his profession.
Callie (Yu-Wen) Huang specializes in digital transformation and streamlining business processes with innovative solutions. She is passionate about exploring how advanced technologies, including mixed reality and AI, can boost corporate profits and drive business success. She is currently an MBA student at the USC Marshall School and a fellow at the Neely Center for Ethical Leadership and Decision Making. Callie has a strong background in the tech industry, including work with an AI-powered startup in Silicon Valley and with Big 4 consulting firms. Demonstrating adept leadership, she led a global team to upgrade operations systems across the U.S., Europe, and Asia.
Nicole "Zara" Oparaugo is a dual degree MD/MBA candidate. She is currently completing her MBA at USC Marshall School of Business and Medical school at the UCLA David Geffen School of Medicine. Zara is passionate about increasing access to health care and using technology to bridge gaps.
The Neely Center is proud to support ShiftSC, a USC student-led organization that aims to catalyze a shift toward a socially responsible technological future.
The Neely Center is proud to co-chair the Council on Technology and Social Cohesion. The Council facilitates networking and collaboration between tech sector leaders and peace builders to explore ideas on how to design and deploy technology for social cohesion.
The Prosocial Design Network (PDN) connects research to practice toward a world in which online spaces are healthy, productive, respect human dignity, and improve society. The Neely Center is delighted to sponsor PDN in identifying, curating, and translating evidence-based solutions to build a better society.
The Neely Center is excited to partner with the Psychology of Technology Research Network which is a non-profit consortium of behavioral scientists, technology designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologies...
The Psychology of Technology Institute, in collaboration with the Digital Business Institute at Boston University’s Questrom School of Business, was honored to host the 8th Annual Psychology of Technology Conference, titled “New Directions in Research on the Psychology of Technology,” on October 12-13, 2024. This year’s theme, “The Quantified Society,” brought together a diverse group of industry leaders, behavioral scientists, technologists, and AI experts dedicated to fostering a healthy psychological future as AI becomes an integral part of daily life. This year’s speakers included Madeleine Daepp, Microsoft Research; Johannes Eichstaedt, Stanford University; Emily Saltz, Google Jigsaw; Glenn Ellingson, Civic Health Project; Tara Behrend, Michigan State University; Andrea Liebman, Swedish Psychological Defence Agency; Chloe Autio, Autio Strategies; and Dokyun "DK" Lee, Boston University, among others. The keynote was delivered by Luis von Ahn, CEO and co-founder of Duolingo.
The Neely Center and the Initiative on Digital Competition gathered leading minds from industry and academia to discuss what’s now and what’s next in responsible AI.
Associate Professor of Management and Organization
Director of the Neely Center for Ethical Leadership and Decision Making
Co-Director of the Psychology of Technology Institute
Nathanael Fast studies the psychological underpinnings of power, leadership, and technology adoption. His research examines how power and status hierarchies shape decision making, how people’s identities shape their professional networks, and how AI is shaping the future.
Fast is Director of the Neely Center for Ethical Leadership and Decision Making and Co-Director of the Psychology of Technology Institute.
He received his PhD in Organizational Behavior from Stanford University and has been recognized for both teaching and research, including USC’s Golden Apple Teaching Award, the Dean’s Award for Excellence in Research, and Poets & Quants’ "Best 40 B-School Profs Under the Age of 40."
Juliana Schroeder
Affiliate Faculty Director
Juliana Schroeder is the Harold Furst Chair in Management Philosophy and Values Professor at the UC Berkeley Haas School of Business. Schroeder is a behavioral scientist who researches the psychological processes by which people think about the minds of other people, particularly in workplace contexts. The attributions that people make about others’ minds are consequential because they underlie decisions about how to interact with others, such as whether to help or harm them. For instance, determining whether a negotiation partner is trustworthy affects willingness to cooperate. Determining whether an outgroup member is competent affects moral concern for their well-being. Schroeder uses experiments to understand how people make inferences about other minds and to test the consequences of those inferences.
Managing Director, Neely Center for Ethical Leadership and Decision Making
Managing Director, Psychology of Technology Institute
Ravi Iyer is a technologist and academic psychologist working to improve technology's impact on society. He is currently the Research Director for the USC Marshall School's Neely Center.
Carsten Becker
Program Advisor, Neely Ethics & Technology Fellows Program
Carsten Becker is faculty lead for extended reality at the Iovine and Young Academy for the Arts, Technology and Business of Innovation. Joining USC in 2019 as a lecturer for design and communication, he developed the academy’s curriculum in exploring creative technologies for the development of purpose-driven narratives and impact. He has created many industry connections for the academy and USC including with Niantic, Nike, Meta, Microsoft, Qualcomm, and Snap and teaches courses related to extended reality, spatial computing and other narrative technologies for IYA bachelor’s and master’s degrees.
In this article, our Director, Nathanael Fast, Managing Director, Ravi Iyer, and their coauthors explore the transformative potential of large language models (LLMs) to foster more inclusive and participatory online spaces. While LLMs hold immense promise—such as enabling deliberative dialogues at scale—they also pose challenges that could deepen societal divides. To address this, the authors propose a forward-looking agenda for strengthening digital public squares and ensuring the responsible use of AI. Their goal? To foster innovation while safeguarding against the misuse of these powerful technologies.
The effects of technology platforms span state, federal, and global levels, requiring responses from policymakers across jurisdictions. Despite varying contexts, the challenges faced remain considerably similar. To address these, the Neely Center, in partnership with Knight Georgetown Institute and the Tech Law Justice Project, convened a gathering of state officials, legal scholars, federal regulators, and technology experts in Washington, DC. The discussion focused on identifying pathways to create effective, feasible, and constitutional policy solutions.
One key outcome of this gathering was the development of a Design Taxonomy that is now being used across jurisdictions to guide regulatory responses. Ongoing collaborations leveraging this taxonomy aim to produce syndicated policy options ready for implementation in 2025.
Recently, Ravi Iyer, Managing Director of the Neely Center, delivered a keynote presentation at Ofcom's public event on "Evaluating Effectiveness of Online Safety Measures." During his presentation, he introduced both the Neely Center Design Code for Social Media and the Neely Indices. Ravi continues to serve on Ofcom's academic panel, contributing insights to support the implementation of the United Kingdom's Online Safety Act.
The Neely Center's Design Codes were recently shared with decision-makers within the UK Government and the UK's communications regulator (Ofcom) as they develop the new Online Safety Act. Ofcom is now designing and consulting on its codes of practice to implement the Act. Ofcom also has a history of measuring user experiences online, similar to our Neely Center Indices, and there is much to be learned methodologically across both efforts. Ofcom recently added Ravi Iyer, our Managing Director, as a member of its Economics and Analytics Group Academic Panel. As Ofcom implements the Online Safety Act in the UK, Ravi Iyer will be advising them on conceptual frameworks and empirical approaches to understand, measure, and improve outcomes for people in digital communications.
In an invited talk with the Federal Trade Commission (FTC), the Neely Center's Ravi Iyer discussed the impact of manipulative design patterns in social media, aligning with the FTC's focus on "Dark Patterns." His testimony emphasized our role in advocating for transparency and fairness in digital design. The Neely Design Code provides specific design recommendations for policymakers and technologists to improve the impact of social media platforms on society. We are excited to see the Neely Center's work contributing to substantive discussions on digital ethics.
Developed in collaboration with an array of experts, the newly released Neely Center Design Code for Social Media provides recommendations for improving digital platforms' impact on society.
In a recent article by Politico, the Neely Center's Director Nathanael Fast and Affiliate Faculty Director Juliana Schroeder were featured for their insights on AI's growing influence. The piece delves into the rapid integration of AI technologies in various industries and the ethical implications that accompany this trend. Addressing the issues around AI ethics and the challenges we face in this rapidly evolving landscape is crucial for understanding how we can navigate these advancements responsibly.
By tracking usage and quality of experiences, the Social Media Index will allow the public, researchers, and policymakers to make meaningful comparisons across time, events, and platforms, for the first time.
FAST, associate professor of management and organization, writes in the San Francisco Chronicle about ways in which Americans can bridge divides and have more productive political conversations.
AI Magazine reports FAST, director of the Neely Center for Ethical Leadership, was named to EY's new AI advisory council to help guide the implementation of artificial intelligence across the company's global operations.
FAST, Director of the Neely Center, explains to Bloomberg that recent data collected at the America in One Room conference shows pushback against social media regulations among first-time youth voters.
IYER, managing director of the Neely Center, writes in The Boston Globe that Minnesota's new law, based on select Neely Center Design Codes, is an important step toward holding social media companies accountable.
IYER, managing director of the Neely Center, comments on California state legislation under appeal, noting how the center's Design Codes focus on best practices for upstream design, not content moderation.
FAST, director of the Neely Center, explains to Bloomberg that warning labels on social media may have some effect in changing behaviors and protecting youth, but more needs to be done.
IYER, managing director of the Neely Center, explains to TIME that positive content begets positive engagement and vice versa, so elevating desired forms of online speech will be a paradigm shift for social media platforms.