June 20, 2013

Creating Better Experiences Through Feedback

Designing Experiences

At brightspot, we believe in quantitative and qualitative analysis, along with continued assessment, to identify opportunities for improvement in services and spaces. Recently, during a visit to the University of Michigan’s Ann Arbor campus, we noticed an intriguing approach to gathering feedback to improve the student experience.

 

The “This Sucks” campaign was born out of the College of Literature, Science & the Arts Student Government’s (LSA-SG) desire to make the college better for students. The campaign invites students to identify areas for improvement in the student experience. Created six years ago, the campaign has been tremendously successful in generating feedback from students and translating that feedback into real, tangible improvements in the delivery of services on campus. Analyzing the campaign and its success reveals several key components of service assessment that support an iterative design approach, applicable across many sectors.

By using feedback as part of an iterative approach and actively seeking input from users or customers, organizations can ensure they are addressing the shifting desires of the populations they serve. Integrating that feedback into a responsive process, and celebrating improvements as they are made, leads not only to an improved user experience but also to an enhanced appreciation of the service provider.

1. Use feedback as part of an iterative approach

As the role of design shifts from delivering a product people use once to continuously supporting the use of products and services, design should strive to be more responsive and iterative. Take, for example, Google, which allows users to opt in to anonymously send data about their browser use. Google then uses this data to understand how people use its services, so it can address pressing needs and critical products in real time, striving to improve its service level to users. Brick-and-mortar institutions don’t often have the luxury of automatically gathering digital feedback, so they must do more to solicit feedback from the people they serve.

A feedback program like the “This Sucks” campaign encourages an iterative approach to supporting experiences by continually analyzing services. The campaign helps the LSA-SG understand how well the University is meeting students’ needs and identify gaps between actual and ideal experiences. These gaps, or opportunities for improvement, surface in the emails students send in response to the campaign and constantly inform the creation or revision of services on campus.

 

Google uses automatically sent user data to improve its service delivery on an ongoing basis.

2. Invite feedback (make it easy and visible)

Comments from users (in this case, students) are an invaluable part of any proper assessment of the services being provided. However, users often feel that their feedback goes unheard and, as a result, can be hesitant to share it. Campaigns that invite users to submit comments in a way that is engaging and visible, like MoMA’s popular “I went to MoMA and…” campaign, can spark a wide range of informative and emotive feedback on visitor experiences.

In Michigan, a “This Sucks Diag Day” was held, where students wrote their suggestions and complaints about various experiences on a banner strung across the campus’ main thoroughfare. This open-forum approach intrigued and engaged students in a way previous efforts hadn’t, and it gave the LSA-SG new insights into students’ needs and desires.

 

MoMA invites visitors to share their feedback, both positive and negative, in an open-ended way that inspires people to respond creatively, rather than simply answering predetermined questions.

3. Design a transparent and responsive process

Once feedback is received, it should be responded to, assessed, and prioritized for action. A well-designed process lets users know their comments have been heard and shows how they will have an impact on improving services and experiences. The rise of online help chats, many run on platforms like Zendesk and Desk.com, illustrates this kind of near-instantaneous response, with friendly representatives responding directly to customer queries and complaints. Companies like Zappos are renowned for their customer service, largely thanks to their transparent and responsive processes.

The feedback received via the “This Sucks” campaign is directed to the appropriate people (LSA-SG executives and committee chairs) so that concerns are heard by those who are able to make or influence the desired improvement. Students know their complaints have been heard and are being acted on through a direct, personalized email response.

 

Zappos responds to questions and complaints quickly and publicly, earning it a reputation for superior customer service.

4. Make impacts and progress visible

The importance of transparency and visibility in the assessment process also applies to the design or re-design of services and spaces. Beyond simply showing people that their feedback is being heard, organizations need to show how that feedback is spurring action, as it happens. The New York City Metropolitan Transportation Authority’s (MTA) “Improving, non-stop” campaign makes riders aware of current projects to improve the transit system through a series of posters and graphics on buses and trains.

At Michigan, students were initially doubtful of the campaign and the LSA-SG’s ability to make changes around campus. The LSA-SG recognized this challenge and made a point of sharing each change it made, or was in the process of making, with the student body on its website and in campus publications, and of updating each individual student on the progress of their issue.

 

The MTA uses signage on its buses and trains to publicize progress being made on various projects throughout the transportation system on an ongoing basis.

Conclusion

Asking users what “sucks,” or otherwise inviting open feedback, plays an important role in taking a user-centered approach to creating and providing services. But more than simply soliciting feedback, organizations should strive to structure their service delivery in a way that allows for iterative, ongoing action to address comments and criticism. By processing feedback quickly, responding in a transparent way, and showing how they’re working to improve services as a result, organizations can better serve their audiences while building reputation and trust.

By integrating users into the design process, organizations can create an environment that is both receptive and responsive. Pairing user feedback with ongoing quantitative and qualitative analysis of services and spaces will improve the experiences of students, visitors, and customers alike.
