Economist Debates


About the capstone
The undergraduate Human-Computer Interaction major at Carnegie Mellon University culminates in a capstone course intended to synthesize what we’ve learned in the program into one end-to-end experience. Our client was The Economist Digital, and we were asked to reimagine the user experience of their online debates platform. I worked with Harold Kim, Andrew Lee, Chris Reid, and Prerna Singh, and served as the team’s Design Lead.

Problem framing
During our kickoff meeting, we found that our clients were most interested in enhancing the debates experience along three themes: (1) community-driven social engagement, (2) content integration with additional sources of knowledge, and (3) platform delivery through new mediums.


Brainstorming with our clients

Research phase
Research was divided into four domains: internal research, user research, competitive analysis, and third-party research.


Communication flow model for stakeholders involved with debate production

We conducted internal research to learn about the work involved in producing a debate and which aspects of debates had been effective or ineffective from a business standpoint. We interviewed eight individuals who had been involved in producing Economist Debates or live debates at Economist events, such as conferences. We also conducted five remote, hour-long interviews with existing users from the United States, Canada, the United Kingdom, and Finland.


Literature reviews gave us insight into the current social and technological landscape of online debates


Some competitors we examined extensively

We evaluated each competitor against our three focus themes: social, content, and platforms. For social, we looked at what tools they used for social engagement, how frequently they communicated with their community, and how well they kept returning users engaged. For content, we looked at who created the content and what kind of content was presented. For platforms, we looked at how content was delivered to the user and what new platforms they were exploring.


We consolidated our findings into several “engagement models” that emerged from our research and described how users interact with a debate-like experience. Based on these findings, we generated a list of potential features, which we sketched individually and evaluated as a team.


We used a whiteboard to rapidly communicate our design ideas and feature explorations.


We ended up with more than 30 design ideas, satisfying one of our initial project aims, and presented them to our client to rank by importance and comment on in a “speed-dating” process.


Noting the breadth of our ideas, we next prioritized them and began building them out into low-fidelity prototypes of both individual features and complete website layouts, which we shared with our clients and advisers and validated with users. A low-fidelity comment-filtering concept is shown above.


A high-fidelity prototype of the redesigned homepage.
