
Energy Industry UX Research

Did we get our new application right?

ROLE: Lead Researcher
PROJECT TYPE: Usability Testing and Informal Observation
RESEARCH: Qualitative, Communicative, Evaluative
TIME FRAME: 1 month

Project Summary
While contracting with a large oil and gas company, I was brought into an existing project team to conduct research activities during a pre-planned site visit. The purpose of the visit was to launch a new desktop application for the site's oil field workers. The application was intended to streamline the general work process and reduce "windshield time" (i.e., the time spent driving between well pads), which would significantly increase the workers' safety on the job. The project team had been working closely with a small group of potential end users on-site for almost a year to create an application that was custom tailored to meet their needs and reduce pain points in the existing process. The team was eager to test the user experience of this new tool, but also anxious to be sure that they were meeting their end users' expectations after so many months of working together. I was asked to formally usability test the application and informally record user attitudes toward the rollout and the project team to help ensure they maintained a strong continued relationship with this end user group.

Approach

At the time I joined the project, the team had already engaged in some preliminary UX research. They had conducted a site visit and held both interviews and rough prototype usability testing as part of their application development process. Since the site was fairly small and we would most likely be speaking with the same participants from the first round of UX research, I reviewed the previous work as my starting point. The design team provided a draft testing script, which I modified to better suit task-based testing objectives, and I decided to leverage the previous researcher's core research questions with just a few minor tweaks to reflect the current application design. This way my results would be more easily comparable to the previous testing sessions and would set up a replicable pattern for any future testing sessions the project chose to pursue. In addition to preparing the usability testing materials, I planned some simple feedback gathering activities we could do on-site to collect general attitudes toward the project team, and I sat in on several trip planning meetings to give the project team feedback on their kickoff training materials and activities.

During this planning phase, aside from the normal issues that always arise, there were two particularly important barriers that had to be addressed creatively if we were going to have a successful outcome. The first was that the site trip was scheduled for only two and a half days total, with just one day dedicated to usability testing. Taking into account the unpredictable nature of our participants' work and the likelihood that we would have one or more no-shows in the testing schedule, we were almost guaranteed not to have enough data per user role to make the trip cost worthwhile. To increase our odds of success, I asked to recruit our project design lead to help administer the usability tests. She had not been involved in the original design work, so she had no attachment to the current designs; she had past experience in usability testing; and she was already planning to go on the trip, making her the perfect research partner for this project. To round out the on-site team, we also had two dev leads going to observe and provide tech support, and thankfully they agreed to help with backup note-taking duties during our testing sessions.

Our second challenge was that I knew the project team could not wait the usual turnaround time for their research results. This was doubly challenging because we were in an enterprise environment and unable to use any of the cloud-based collaborative UX software that might normally be used to analyze data and share out results quickly. To mitigate the issue, I used the tools we had immediately available and created an Excel research analysis tool on the network that was accessible to the whole team. As a group, we then committed to using this Excel tool as our data collection document, recording all the raw testing session data while on site with the intent of completing the analysis as soon as we were back in the office.

Once on-site at the field office, things progressed according to plan. The project team conducted a training session for all potential end users on Day One. During this session, I made general observations, assisted with questions and technical issues, and ended the training session by leading a brief "Rose, Thorn, Bud" activity to gather initial impressions of the application. We spent Day Two administering usability testing sessions at the field office and recording interview responses and testing data in the Excel analysis tool. At the end of both days I led a debriefing/synthesis session with the project team to keep them up to date on our observations and discuss any early trends that were emerging. Since the majority of the project team attended the kickoff events and observed at least one testing session, by the end of Day Two we already had a pretty clear picture as a group of what needed to happen next, and items were already being added to the backlog. Before we headed to the airport on Day Three, we had a quick wrap-up meeting with site leadership to thank them for their time and share some of our learnings and intended next steps.

Upon our return to the office, I quickly met with the design team to finish synthesizing our findings. Since we had already recorded all the raw test session data directly in the Excel analysis tool, we were able to very quickly code and analyze the test data for recurring pain points and usage trends. In the end, I was able to give the project team, in record time, not only a prioritized recommendation list guided by UX best practices but also all the raw anonymized data in case they ever needed to go back and review the research. The Excel tool "report" was so complete that they actually did not want me to spend time making a formal report of my findings.
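For readers curious about the analysis step, here is a minimal sketch of the same coding-and-tallying logic we worked through in the Excel tool, written in Python for illustration. The participants, tasks, theme codes, and severity values below are entirely hypothetical; the real data stayed in the shared workbook.

from collections import Counter

# Hypothetical rows mirroring the shared Excel analysis tool: one observation
# per row, tagged with the participant, the task, a theme code, and a
# 1-3 severity rating. All values here are invented for illustration.
observations = [
    {"participant": "P01", "task": "Record well readings", "code": "navigation",  "severity": 3},
    {"participant": "P02", "task": "Record well readings", "code": "navigation",  "severity": 2},
    {"participant": "P03", "task": "Plan daily route",     "code": "terminology", "severity": 1},
    {"participant": "P04", "task": "Plan daily route",     "code": "navigation",  "severity": 3},
]

# Tally how often each coded theme recurs and the worst severity seen for it.
recurrence = Counter(obs["code"] for obs in observations)
worst = {}
for obs in observations:
    worst[obs["code"]] = max(worst.get(obs["code"], 0), obs["severity"])

# Rank themes by recurrence, then by worst severity, to sketch a rough
# prioritized findings list like the one handed to the project team.
for code in sorted(recurrence, key=lambda c: (recurrence[c], worst[c]), reverse=True):
    print(f"{code}: {recurrence[code]} observations, worst severity {worst[code]}")

The point of the sketch is only that the "tool" was simple by design: raw observations went in once, and recurrence plus severity drove the priority order, so no separate reporting step was needed.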

Revised Research Questions

• Does the application meet user expectations?

• What is working and not working from a usability standpoint in the current application design?

• Based on their first-hand training experience and general first impressions, how often do users expect to be using this new tool?

Project Breakdown
• 1 Lead researcher
• 3 Assistant researchers
• 1 Three-day site visit
• ~12 Participants for one-hour usability testing sessions
• ~4 Weeks total for planning, site visit, research, and synthesis activities

Final deliverables took the form of a comprehensive Excel file (referenced above) and, at the team's request, a short verbal report with a minimal slide deck.

Project Timeline
Week 1 & 2: Planning and secondary research
Week 3: Site visit
Week 4: Synthesis and analysis, write report

Site Visit Schedule
• Tuesday: Arrive at airport, drive 2 hrs to site, train research support, observe rollout presentation, facilitate Rose-Thorn-Bud activity, meet with site leadership, end of day
• Wednesday: Usability tests all day, end of day research team synthesis meeting, end of day
• Thursday: Load up cars with luggage, final usability tests in AM, meet with site leadership, drive 2 hrs to airport, fly home, end of day

Individual Responsibilities
• Creating and maintaining the research plan
• Communicating regularly with the project team
• Recruiting and scheduling participants and coordinating the site visit research activities
• Managing data privacy practices
• Writing all interview and usability testing scripts
• Training assistant researchers
• Planning and leading on-site research activities
• Managing, analyzing, and synthesizing all collected data
• Designing and presenting final deliverables

Outcomes
Overall, the site visit was considered a complete success. We gathered a surprising amount of data in a very short period of time thanks to recruiting design team members to assist with the research activities. Through this strategy we were easily able to double our data yield without adding any additional cost to the project. My ad-hoc Excel analysis tool also proved quite successful, and I ended up sharing it with my UX research colleagues, who were keen to try it for themselves. In regard to our research questions: testing data clearly showed that the interface was easy to use and generally made sense to its intended audience despite a few easily remedied design flaws; users reported expecting to use the application much more often than originally anticipated after seeing it firsthand; and the project team was happy to validate that they were successfully maintaining a strong partnership with their end users.

Key Project Takeaways

The best projects treat users as partners, not clients. I was consistently impressed by the lengths to which the project team went to involve their end users in the development of this product, and it worked! Everyone I spoke with on-site had nothing but praise for the project team and a sense of pride and community around the development of their own tool. This was easily my favorite user centered design project from my time working in oil and gas. I came away feeling that I had seen the best possible working example of what user centered design can be, and I felt very humbled to have been a small part of their project's success.

Be willing to share the work. When I was setting up this research plan I had a choice: I could take sole responsibility for all the testing sessions and get only one or two sessions per role title (assuming everyone showed up and there were no field emergencies), or I could ask for help and collect double the amount of data. The only drawback was that I had to be OK with the work not being done exactly the way I would do it. I chose option two because I knew we had a really good team, and I believed that with only a small amount of training they could help me do the work. I was right: we got really great data with clear trends that translated into solid UX recommendations. If I had not chosen to share the load, the research would have been incomplete at best. This kind of scenario is why I believe so strongly in the value of cross-functional, collaborative teams.

Know your audience as a researcher when considering deliverables. A formalized approach to research with a set synthesis period and a final report was not appropriate in this case. The project team was already sold on the value of UX and was ready to act on any recommendations or observations I could offer; it was just a matter of getting that information to them in a timely manner. Luckily, I had developed my research collection tool just as this project was taking off, and it proved to be exactly what we needed to bridge the gap between synthesis and reporting. I was able to offer timely recommendations knowing that no corners were cut in the process, and the project team got what they needed to keep things moving. Finding novel solutions like this is a big part of why I enjoy UX research.