As helpful as Stack Overflow can be, with over 14 million programming questions, it can also be just as toxic due to malfunctioning community mechanics that leave users feeling unwelcome. Our previous work demonstrated that barriers such as jumping through onboarding hoops and the fear of negative feedback affect who feels they can participate on the site. To dismantle these barriers, we designed a just-in-time mentoring experience.
Just-in-time mentoring
We created this just-in-time mentoring experience to enhance the feedback process for new question askers on Stack Overflow. Previously, users gathered feedback through slow comment threads that could take hours or even days. In this new format, novices (users with 15 points or fewer) could get instant feedback on how to structure a well-received question in this community. Drawing on insights from communities of practice, we found that guidance from experts would help encourage new users to engage. For example, guiding novices through onboarding hoops with the help of a mentor, or reducing the feeling of an intimidating community size with a private space where users can improve their questions, can help users feel more comfortable participating in this community and others like it.
How?
To take advantage of the existing chat room feature in the community, we built five designated chat rooms: four Help Rooms and one Private Mentor Room. In the Private Mentor Room, mentors are notified of novices who need help and announce who will help each one. In the Help Rooms, novices interact with mentors about their question drafts.
Once novices entered a Help Room, a mentor greeted them and suggested improvements (often based on mentor- and author-generated guidelines) for their question. From there, novices iteratively edited their question with feedback from mentors and could choose to post it at any point. Mentors did not edit novice questions directly; we wanted to keep the ball in the novice's court. One of our design goals was to understand how experts teach novice users to fish, not how they fish for them.
By the end of our 33-day pilot experiment, we had presented 71,068 novices with the option to join a Help Room; 520 entered a Help Room, and 271 interacted with a mentor and posted a question.
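For a rough sense of the funnel, here is a quick back-of-the-envelope sketch of the conversion rates implied by those pilot counts (the counts come from the experiment above; the rate labels are just ours):

```python
# Funnel counts from the 33-day pilot
offered = 71068  # novices shown the option to join a Help Room
joined = 520     # novices who entered a Help Room
posted = 271     # novices who interacted with a mentor and posted a question

print(f"join rate:  {joined / offered:.2%}")   # share of invited novices who joined
print(f"post rate:  {posted / joined:.2%}")    # share of joiners who posted
print(f"end-to-end: {posted / offered:.2%}")   # share of invited novices who posted
```

So while under 1% of invited novices entered a Help Room, roughly half of those who did went on to post a question after working with a mentor.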
Findings
Novices asked higher-rated questions. Following Stack Overflow's question characterization framework, we saw more GOOD (positive score) questions and fewer BAD (negative score) questions. We also observed a 50% increase in the mean question score for mentored questions.
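A minimal sketch of the score-based labeling described above: GOOD for a positive score, BAD for a negative one. Treating a zero score as "NEUTRAL" is our assumption here, since the text only defines the GOOD and BAD labels:

```python
def label_question(score: int) -> str:
    """Label a question by its vote score, following the GOOD/BAD
    characterization above. The NEUTRAL label for zero-score
    questions is our assumption, not part of the framework."""
    if score > 0:
        return "GOOD"
    if score < 0:
        return "BAD"
    return "NEUTRAL"

print([label_question(s) for s in [3, -1, 0]])  # ['GOOD', 'BAD', 'NEUTRAL']
```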
Mentors offer high-fidelity improvements. Overall, mentors suggested explicit, concrete changes novices could make to improve the odds their questions would be answered. These included adding more detail about what they had already tried, adapting to the community's culture of asking by removing greetings, and finding the right home for a question that may belong in another Stack Exchange community.
Both novices and mentors were highly satisfied. It was important for us not only to conduct the experiment but also to gather feedback on the design from novices and mentors. In our follow-up surveys, novices reported that the suggestions from mentors were very helpful. In interviews, mentors were genuinely excited to make sure novices had a great experience:
“If we can get the [original poster] through the first question with a positive experience and they can see how this site really works, then we should get more good questions which feeds in to having more good answers.”
What does this mean?
Human-to-human guidance clearly paid off in this experiment. Having human mentors allowed novices to engage in an in-depth clarifying dialogue that welcomed users across language barriers and varying levels of programming experience.
The greatest takeaway from this project is that we can now enhance the quality of experience for new users. Even better, it leads to a new feature design beyond renaming the code of conduct! In fact, one direct outcome of this study, as described by the EVP of Culture and Experience, is a beginner ask page that breaks down the suggestions mentors offered into automated prompts to guide novices' questions a bit better.
Read more
The formal research paper, “We Don’t Do That Here”: How Collaborative Editing with Mentors Improves Engagement in Social Q&A Communities by Denae Ford, Kristina Lustig, Jeremy Banks, and Chris Parnin is published at CHI ’18, the ACM Conference on Human Factors in Computing Systems.
The slides from my talk and the paper are online at the ACM digital library and locally on my website. Also, Kristina, the first Stack Overflow Researcher and now Design Manager, was featured on the Stack Overflow podcast to talk about this work so check it out!
Questions? Comment below!