
The testing went better than I expected. 


My ideal testing age group was between 25 and 40 years old, since people in that range were more likely to have worked with the Google Suite of products and possibly have their own website or work with a website (or were at least familiar with the basics of what Google Analytics provides).




My first tester was a 26-year-old female who had made a website for a class marketing project where she learned about Google Analytics. 


My second tester was a 30-year-old male software engineer who had never used Google Analytics before, but who is very familiar with the Google Suite of products and has strong computer literacy.


*Unfortunately, there was a malfunction with the computer microphone and not all of the audio was captured. Errors like this cannot always be predicted, and a larger pool of test participants would have served this study better.*



Key Takeaways


  • Though both testers were able to successfully complete all three scenarios, tester 2 struggled more with the main menu.

    • One comment he made was that items like "Behavior" were listed in different sections of the menu as well as in the secondary dimension drop-down, which was a little confusing.


  • Tester 1 pointed out in a survey comment that when changing dates and working with the dashboard, she had to click "apply" for items to be modified, rather than just hitting enter or having the view update automatically as selections were made.


  • Both testers took the longest on Scenario 2, trying to find the "Referrals" section under the Acquisition tab. The purpose of a user test is to understand what comes naturally to the user and what doesn't, letting them find their way to the "right" answer without being prescriptive, which can be difficult if they are unable to find it and become frustrated. Both of my users did a lot of clicking to find this section and gravitated toward the Behavior section first. Maybe they were unsure what "Acquisition" meant? I can't be 100% sure.


  • In the post-test survey, both users answered only one question the same way, making the results inconclusive. The indifference here leads me to believe that my hypothesis was not correct, and that the program was more difficult to use than it was intuitive.














Future Iterations


If the testing of this software were to be expanded, I would like to test users on creating custom alerts and reports, as I think that is a more complex capability of the platform. However, in setting stipulations for that test, I would want testers with prior knowledge of the platform, which can be more difficult to find in a randomized setting.




My purpose with these tests was to find areas of friction with the platform and possible areas of improvement, which I think we accomplished. I believe my hypothesis proved false: Google Analytics is not an appropriate tool for novice users. There are many aspects of labeling that could be updated, e.g., Behavior & Pages, as mentioned by the users and in the heuristic evaluation. Since this program is not intuitive, I would also suggest in-program learning aids or step-by-step tutorials that can be turned on or off and are easily accessible. As users, we don't always dig into the full functionality of a platform until we need it, and having to learn something new quickly does not serve the public and its needs.


Since Google is such a well-known service provider, they are very in tune with what their customers want, especially with a more advanced product like this, but it goes without saying that nothing is perfect. Every audience has its own skill level and needs, and the main objective is to serve the user population as well as possible.

