Elastic Grid is heavily focused on delivering the best end-to-end user experience we can (from interface to face-to-face). Transitioning the platform from an administrative utility for sales teams to a user-friendly marketing engine is an ongoing, ever-evolving process. In this series of articles on our user experience design process, I’ll describe how we work and why, with examples (good and bad) of how we’re doing.
Today, I’ll introduce our process foundations and how they fit into our overall Agile development approach.
At the start of any design project, we ensure we don’t get caught up in solution mode too early. Elastic Grid is used by marketers globally to access campaigns made available by some of the world’s largest companies. As such, we have many feature requests coming through. Along with our Business Analyst, my role as UX designer is to ask “why” as many times as required to get to the heart of the problem. It is easy to fall into the trap of implementing a solution rather than solving a problem.
Talking to users, clients and stakeholders to ensure feature priorities are matched against real user needs is paramount. It’s also important to keep the context aligned so you aren’t trying to solve a business problem with a user interface, or a user interface problem with support labor. Solutions are much more on point when the core problem is well defined, but don’t expect that definition to be obvious or handed to you.
As much as we’d like to start everything from scratch, when you have an enterprise application with a massive user base, you can’t just restart every time. As user problems arise or features are designed, keep things small; Agile development is based on this principle, and UX designers need to be part of that process, embracing the idea of failing fast. At Elastic Grid, we keep bringing things back to their core, breaking larger problems into smaller chunks and solving one at a time. Agile UX doesn’t mean not thinking about the road ahead, just not fixating on what may cross it later on. This minimizes time spent on edge cases and assumptions.
We don’t assume anything, and we don’t treat analysis as a project we’ll get to one day; it’s part of the continuous flow. Feedback is critical and gathered from many different sources, always with an eye to how it fits the context. We split feedback into two categories: subjective and objective.
Subjective feedback comes from emotion and is usually solution or feature oriented. The source is generally conversation, and the scope is extremely limited; the designer’s role is to make the wider organization aware of this. Subjective feedback can provoke a reaction before anyone gets to the core of the problem.
We don’t dismiss this feedback but we take it and validate it with objective feedback methods.
Objective feedback comes from systematized research into the effectiveness of your user experience design. It requires discipline when assessing the data you receive. For example, how many times have you heard someone say: “I’m not sure it’s intuitive,” while they subconsciously complete a task using the workflow in front of them? They may not like it, but that doesn’t mean it’s difficult to use.
It’s horses for courses when gathering objective feedback; here’s a rough guide:
Observational: We put new features in front of users and ask them to perform relevant tasks, observing the steps they take. The principal idea is to watch them interact with your experience flows and see what happens. We don’t interrupt or lead them, allowing their actions to speak for themselves. We record these sessions and review them to determine flaws in our design so we can take steps to amend them.
Behavioral: Tracking tools such as Google Analytics are useful for determining pathway effectiveness and getting an overall sense of what people are doing: which parts of your application are being accessed, on what devices, and when, where and for how long. We set up events to determine if and where we are getting drop-offs in our user flows, and we keep track of traffic patterns.
Layout Optimization: In addition to conversion funnel monitoring, when we want to see how users are performing specific actions or assess the effectiveness of page layouts, we use Crazy Egg. Another popular tool of this type is ClickTale. These tools allow you to monitor how multiple users are interacting with individual pages through heat maps and scroll reach, so you can compare the results as you make layout changes.
Surveys: As long as questions are framed correctly and your net is cast wide enough, surveys are a worthwhile method of gaining objective feedback. At Elastic Grid we have an initiative through our partner support teams (Grid Marketing Specialists) where they ask BIG (Before I Go) questions to understand more about users’ pain points. We can then objectively collate the feedback and determine if and how we can help drive solutions, whether application, campaign or support-based.
Examples of Crazy Egg snapshot reports.
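To make the behavioral side concrete, here’s a minimal sketch of the kind of drop-off calculation that event tracking feeds. The step names and user counts below are hypothetical examples for illustration, not Elastic Grid’s actual flows or numbers:

```python
# Minimal sketch of a funnel drop-off calculation.
# The step names and counts are hypothetical, not real Elastic Grid data.

def funnel_dropoffs(steps):
    """Given ordered (step_name, user_count) pairs, return the
    percentage of users lost at each transition."""
    dropoffs = []
    for (prev_name, prev_count), (name, count) in zip(steps, steps[1:]):
        lost = prev_count - count
        rate = 100.0 * lost / prev_count if prev_count else 0.0
        dropoffs.append((f"{prev_name} -> {name}", round(rate, 1)))
    return dropoffs

# Hypothetical user counts at each step of a campaign flow.
steps = [
    ("campaign_list", 1000),
    ("campaign_detail", 620),
    ("customize_assets", 310),
    ("launch_campaign", 248),
]

for transition, pct in funnel_dropoffs(steps):
    print(f"{transition}: {pct}% drop-off")
```

A sharp spike at one transition (here, the 50% loss between viewing a campaign and customizing its assets) tells you where to focus the observational sessions described above.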
If your analysis leads you to make changes after identifying a problem, re-test! Don’t assume you’ve solved the problem just because you intended to. Validate that your new ideas and features are actually having a positive impact. This will also ensure you haven’t created an unexpected side-effect. You might even uncover a deeper issue.
All of us in the software space are essentially part of the open source community. So if you discover something interesting, share it! We’re doing just that with these UX articles. Hopefully you’ll enjoy them and they’ll help with your own user experience design journey.
Lead UX designer and Design Team Lead at Elastic Grid, responsible for the user experience of the platform and the product’s visual brand. Lorenzo has extensive experience designing simple digital solutions for large, complex commercial projects, including e-commerce, marketing campaigns and content management platforms.