Getting your mobile nav right 🧭

As your product evolves, your mobile navigation has to evolve with it, and that’s where things can start to break.

As structure shifts and labels change, it’s easy for your nav to become outdated, bloated, or confusing, especially on mobile, where space is tight and user patience is even tighter.

That’s why we just launched a new Concept Page focused on Mobile Navigation, showing how to test whether your menu is keeping up with your product.

To bring it to life, we show how we tested the mobile nav for Indiana University’s online college site. Here’s how it scored using a focused stack of three UX metrics:

Source: Glare Framework - Mobile Navigation Concept

The results showed that users could quickly find core content, navigate with ease, and leave with a positive impression: exactly what you want in a high-change environment like higher ed.

The Concept Page breaks down how we tested this nav, what we learned, and how to apply this stack to your own product’s evolving menu structure.

📣 Let’s hear from you:
What types of mobile navs or menu patterns have been the hardest to get right in your work?

Drop your challenges and thoughts below 👇


Curious, what did they do next? Did IU call it quits at “good enough,” or did they continue to improve their nav score?

Also, did they start from a worse place and work up to this score?


Cool breakdown here @MoData - how do you determine which UX Metrics to use?


Depends on user needs! Mobile navs mostly just need to present information in a Findable and Efficient way, without trying to do too much selling of products or offerings to users.

With that in mind, we want metrics that focus on users’ behavioral decisions (where they click, how successful they are, and how long it takes them to find items in the nav).

Out of our group of Behavioral metrics in the image below, Success reveals where a user clicks given a certain goal, and Usability tracks that success across multiple actions in the nav, so those two metrics were chosen. And since we still wanted to ensure that the process of interacting with the nav felt good, we measured Satisfaction at the end of the test (though Effort could have just as easily been substituted).
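The Glare Framework’s actual scoring isn’t shown in this thread, but the idea of measuring Success per action, Usability across a session, and Satisfaction from a post-test rating can be sketched in a few lines. Everything below — the field names, the data shape, the normalization — is a hypothetical illustration, not the framework’s real implementation:

```python
# Hypothetical sketch of behavioral nav metrics from usability-test logs.
# Field names ("reached_target") and the 1-5 rating scale are assumptions,
# not the Glare Framework's actual scoring model.

def success_rate(tasks):
    """Share of tasks in one session where the user reached the target nav item."""
    return sum(t["reached_target"] for t in tasks) / len(tasks)

def usability_score(sessions):
    """Success tracked across multiple nav actions: mean per-session success rate."""
    return sum(success_rate(s) for s in sessions) / len(sessions)

def satisfaction(ratings, scale_max=5):
    """Mean post-test rating, normalized to the 0-1 range."""
    return sum(ratings) / (len(ratings) * scale_max)

# Two made-up test sessions, each with two nav-finding tasks.
sessions = [
    [{"reached_target": True}, {"reached_target": True}],
    [{"reached_target": True}, {"reached_target": False}],
]
print(usability_score(sessions))        # mean of 1.0 and 0.5
print(satisfaction([4, 5, 4]))          # 13 / 15
```

A real study would also log time-to-find and click paths per task, but even this minimal shape shows why Success and Usability pair naturally: the second is just the first aggregated across a whole session.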


We actually tested three versions of IU’s mobile nav at the same time, and this was the worst-performing one! The other two achieved Very Good marks on Usability because they didn’t separate the primary nav items up top from the de-emphasized nav items on the bottom. They moved forward with one of those other two versions.


Very cool. It’s interesting that there was a decision to keep improving it. Why not just stop at “good”?

We tested three versions at the same time, so they didn’t get this ‘Good’ score and then keep testing. They found that the other versions performed better (‘Very Good’ scores), and moved forward with one of those.
