Research is often one of the first things sacrificed in a company in times of crisis. There is no time, no budget and often no headspace when what is apparently needed is swift, decisive action. Past knowledge and experience are employed to navigate stormy waters, but it is questionable whether, without research as a compass, the ship can make its way towards calmer shores. When uncertainty is the word of the day, distributing research as a practice across multiple teams, rather than leaving it to just one or two researchers, may prove both cheaper and more reliable.
Having worked as IDP Connect's sole UX researcher for more than two years, I have seen first-hand how powerful a grassroots approach to research can be. With the higher education sector facing intense pressure at the moment, I want to share how to hack the research process in a way that makes it both cheaper and more effective, enabling institutions to better serve the needs of students. You’ll find that while some of the ideas have a quick turnaround, allowing you to start collecting user feedback in a matter of weeks, others will take a little longer but will establish a healthy research-focused culture and all the benefits that brings.
IDP Connect’s journey towards a new way of operating began with an increasing need for research across our multiple products, but no resources for an additional researcher. A lot of my time was taken up by work that could easily be delegated to colleagues after some basic training. It became obvious that diversifying the skillset of colleagues across different teams would enable a greater range of research whilst saving both time and money, so that’s what I set out to do.
I started with user testing: a fairly approachable method with which user feedback could be collected in a matter of hours. The relatively simple nature of this type of research coupled with its considerable impact meant that my colleagues were able to learn it rapidly and see results immediately, which in turn built their confidence. The research projects they assisted on also had immediate relevance to wider company objectives and led to wider company recognition, which in turn showed the value of the work and led the testers to become more enthusiastic about research in general.
If your institution is not currently doing user testing, basic survey skills or even interviewing can also go a long way. In my case, some theory, mixed in with a showcase of real tests and followed by individual practice, meant that in a few weeks I was no longer the only one who could test functionality, visual design and copy. In a university context, with access to hundreds of students, if three people learn how to ask a few non-biased questions, they could take the pulse of their student community on a weekly basis. More specifically, they could identify potential problems before they arose, as well as build a well-rounded understanding of current student generations that can help universities speak their language, meet their expectations and foster trust.
Initially, when doing interviews or moderated user testing, I roped in the same few people in our small design team to help with note taking. But at one point I switched to inviting colleagues who were either part of the same project, or were working on something related, so that the insights the interview brought would also feed into other areas of work. Sometimes, we even got to switch roles, so my colleagues could get first-hand experience interviewing and moderating users. In time, a few colleagues took the initiative to do research by themselves, booking in users for short interview sessions.
The chance of meeting the people you design or write for face-to-face (even if remotely) is very motivating and helps create empathy, which is essential especially in times of crisis. Talking to our real audience meant that our content team for instance got closely acquainted with the authentic feelings and thoughts of students and could therefore write in a more compelling way. Research stopped being only product focused, or the task of the researcher.
In a university setting, just as at IDP Connect, the needs, interests and wellbeing of students are relevant to multiple teams, even if the way they act on those insights differs. A collaboration between marketing teams and teaching staff could mean that the latter provide important insights into the emotional wellbeing of students, informing comms tactics and even strategies, as well as fine-tuning student support. Sustained over a longer period of time, this could go as far as improving retention rates, with marketing and recruitment teams working together on quick, guerrilla-style interviews or surveys. Listening to students talk about their hopes and challenges has the added benefit of actually hearing the words they use and their tone of voice, and seeing their body language: little cues that can later help tailor a message so that it comes alive.
Having different teams gathering and analysing research will undoubtedly enrich the resulting insights and bring more coherence across the different departments in terms of processes, reasoning and action.
Existing processes, limited resources and different priorities might not allow much space for such explorations as part of day-to-day business, but that doesn’t mean that, within bounds, such work cannot be conducted. Fortnightly, my colleagues and I gather for one hour to discuss novel frameworks for understanding our users and their behaviour, and brainstorm recommendations for existing or future products. In these sessions research is never sacrificed, but becomes a central piece in our reasoning about why we take action and how we can verify that what we’re planning to do will work. Left to its own devices, such a group can foster a great deal of involvement and motivation, and ultimately bring in novel insights that might otherwise be lost in the humdrum of daily practice.
What this means in practice is that our agenda always starts with objectives: we either refine already-given objectives in a way that makes them more specific and actionable, or we set our own objectives for new pieces of work. Once this is done, we employ a framework to lead us to solutions that meet those objectives. The method we’ve recently settled upon is called the ‘Opportunity-Solution Tree’, a step-by-step process meant to ensure that the solutions we develop stay in scope and on point. Sharing this method within the team and constantly checking that we stay true to it is an amazing way to uncover previously unverified assumptions or problematic shortcuts that we take in our desire to get immediate results. But the method used is not what’s important - any method can be good as long as it’s been validated in the industry. What matters is constantly paying attention to why and how we do things, and that’s where working in a team really pays off, as we not only motivate one another but keep each other accountable in a safe environment.
Project leaders (product owners, project managers etc.) have visibility over an entire project, its objectives, scope, timeline and resource needs. They are the gatekeepers of research, deciding whether to allocate time for it or not. Knowing this, I took great care to develop an understanding of their perspective on research, making sure that I could better answer their needs and communicate clearly how my work helped. I brought transparency to different research processes and methods by mapping out the kind of considerations and decision trees that project leaders should go through at the start of a new project. I also created research flow charts for older projects as case studies to act as reference points whenever similar projects come along. With these in hand, project leaders can quickly evaluate the type of research they need for every piece of work, as well as the amount of time they need to put aside for it. That means research is no longer a challenge that only the researcher can understand and tackle.
Don’t wait for a top-down approach: go grassroots.
Since undertaking the task of evangelising UX research at IDP Connect I have seen a slow but continuous ripple effect that has taken many shapes, more or less quantifiable. More colleagues are now actively involved in user testing, tackling a wider range of design challenges now that there is no longer a single person conducting all the tests. There is more best-practice sharing between designers working on different products, a direct result of increased awareness of the importance of interaction patterns and the needs of our user groups. One of the biggest gains has been in the marketing department, where campaigns are much more research-informed, with personas and user journeys playing an ever-bigger role in planning. What is more, marketing will be more actively supporting research activities in the coming months thanks to a new initiative that will allow IDP Connect to be even closer to the people it serves (more on this later). Finally, there have been totally unexpected developments as well, with one colleague convincing our management of the advantage of officially taking on UX writing tasks after being involved in research. The possibilities seem to be endless, and the ball keeps rolling.
Working in an organisation with a culture of continuous exploration enables its people to find opportunities for intervention and change even when the task seems of biblical proportions. Bringing as many people into the research process as possible grassroots style - not waiting for a top-down approach - is essential: it’s quicker, it’s cheaper and it strengthens relationships between colleagues, creating motivation and resilience in times of crisis.