What is Omeka Everywhere?
Omeka Everywhere is multi-touch, open-source museum software funded by a 2013 Institute of Museum and Library Services National Leadership Grant. The project is a partnership between the UConn Digital Media and Design department, George Mason University’s Roy Rosenzweig Center for History and New Media, and Ideum, a creator of digital experiences for public spaces.
The software includes a table-kiosk interface and a supplementary mobile application that create interactive, engaging experiences for museumgoers, allowing them to explore collection items and save favorites to their own mobile devices. Omeka Everywhere combines two existing software platforms: Omeka and Open Exhibits. Together they form a powerful symbiotic relationship in which collections and their metadata are brought to the museum floor through Open Exhibits interactive templates, keeping the content flowing and visitors engaged.
My role in the software development
I applied my knowledge of human-centered and user-experience design to the iterative development of Omeka Everywhere by running the second round of usability testing. I analyzed the new user interface for the multi-user touch-table display, which was designed and developed in response to feedback from the testing and analysis of the first iteration. I also formulated and conducted a usability assessment of the new mobile application. Through this research I established and validated user performance measures, found bugs in the software, identified design concerns, and made recommendations to improve the efficiency, productivity, and satisfaction of the end-user experience. A white paper co-authored with my professor and research advisor is in the process of being published.
Step 1 - Planning & Organizing the Usability Tests
- To determine design inconsistencies and usability problem areas within the user interface and content areas. Potential sources of error include:
  - Navigation errors – respondents consistently struggling to locate functions, excessive touches to complete a function, differences between the intended screen flow and the path respondents actually took
  - Presentation errors – selection of the wrong task or material due to labeling ambiguities, ignoring critical steps, etc.
  - Control usage problems – improper filtering or entry-field usage
- To observe how users respond to the mobile app compatibility and test its performance, in order to make the process more intuitive and efficient
- To establish baseline user performance and satisfaction levels for the new interface, for comparison with past and future usability evaluations
- To reveal the user’s conceptual model of the interface and learn how it differs from the designer’s model
- To learn about features that aren’t in the application but that users desire
- To explore how the software affects the user’s understanding of the exhibit and the digital collection
Three different methodologies were fully described and thought out in the Phase II research plan after analysis of the Phase I usability test, the goals of the study, the available resources, and the constraints:
- Remote Mobile Application Testing
- Solo Moderated In-Person Table & Mobile Application Testing
- 2’s & 4’s Unmoderated In-Person Table & Mobile Application Testing
Recruiting Participants, Scheduling Test Sessions, Following Up Via Email
Step 2 - Mobile Application Usability Test
I led usability tests to analyze the intuitiveness, utility, and overall experience of the Omeka Everywhere mobile application. I collected data using Google Forms and Lookback, a service that records the user’s phone screen and, simultaneously, their facial reactions.
Sessions: Moderated In-Person Usability Test, User Satisfaction Survey
Participants filled in a demographics and background information questionnaire prior to the moderated usability test. When a participant arrived for the session, he or she started the Lookback recording, opened the Omeka Everywhere app, and began following the instructions on the Google Form. Participants were asked to think aloud while initially exploring the app and while attempting to complete a set of representative task scenarios presented to them. I probed with follow-up instructions when appropriate. Participants were also told to provide feedback on the usability and acceptability of the user interface after each task. Once all the tasks were completed, they filled in a post-session questionnaire.
I used Lookback, a UX research tool that lets a researcher record the user’s phone screen while simultaneously capturing the user’s facial expressions through the front-facing camera.
Case Study of Remote Mobile Application Usability Test
Task 1: See information about an item
Participant Notes: It gave me more information about what was in the picture as well as some history about it.
Task 2: Zoom in to see item details
Participant Notes: I didn't even realize there was a zoom function but now that I know, I think it's a great way to see the object in question in more detail.
Task 3: Save several items so that you can access them later.
Participant Notes: I like the heart icon - pretty intuitive to "like" stuff and save it away.
Task 4: Find your saved items
Participant Notes: I like that you can consolidate all your favorites into one spot for later viewing. I would have liked to see the description with the picture saved as well instead of just the image.
Task 5: Share an item with a friend
Participant Notes: I like the share function, but I might suggest an icon change to reflect sharing (like two people next to each other or something). I might also suggest more social media outlets for sharing because people really like sharing things over social media nowadays (Facebook, Twitter).
Task 6: Filter by the President's tag
Participant Notes: I like the tags feature because it suggests topics to the app user (who might not be sure what they want to look at).
Task 7: Watch Ronald Reagan Inauguration
Participant Notes: I like the video format (easy pausing/starting). Maybe include a volume feature on the video, and on the thumbnail indicate that it's a video so people aren't surprised.
Task 8: Change the way items are displayed
Participant Notes: I started on Grid View and went to List View (I think I was supposed to go the other way around) but I like that the titles come with the list view so you can identify what you're looking at.
Observed reaction moments (annotated from session recordings):
- Saves/finds saved items
- When able to pinch and zoom in a photo
- Filtered by the tag
- When a bug happened (picture not in proportion)
- When the volume started automatically on a video
Observation notes:
- Opened the layout tab in settings but didn’t realize they had to press “Grid”; went back to the home screen to check whether it had changed
- Understood the point is to save items you like; said it saves their places
- Swiped on an item photo to see if more pictures were included
- Tried to type where the active tag was to enter a search; didn’t realize they had typed it wrong, then tried to find the item in the Presidents tag
- Liked grid view more
- Wanted the change-layout button on feed pages, like Instagram
- Clicked an empty search result, which never loaded
Users expressed ambiguity about the feed’s significance, so a collection name was added to the homepage.
Users had a hard time finding the search bar, so it was given a magnifying glass icon and the “...” placeholder text.
Users had a difficult time locating the view-full-size button, so a gray underlay was added to make it more noticeable. A tap-to-view-full-size feature was also added on the item page.
Step 3 - Touch Table and Heist Connectivity Usability Test
The purpose of this phase of usability testing was to get user feedback on the new giant touch-table user interface design and, for the first time, to test the connectivity between the mobile app and the table. Testing took place at the Benton Museum of Art at UConn, where one of Ideum’s tables was installed with the Omeka Everywhere software loaded with a collection from the current museum exhibit.
Sessions: Solo Moderated In-Person Table & Heist Mobile Application Testing, User Satisfaction Survey
Pairs Unmoderated In-Person Table & Heist Mobile Application Testing
Participants arrived at the test location and were handed a mobile phone with the Omeka Everywhere application. Prior to this, they filled out an online demographics and background information questionnaire. Two facilitators recorded data on identical documents with different focuses: one noted which tasks the participant discovered on their own or with a prompt, while the other focused on the participant’s emotional responses. Both researchers also noted difficulty navigating, excessive steps, confusion, ineffective attempts, and bugs, along with the order in which the participant visited the different sections. After all the tasks were complete and participants were done exploring, they were directed to a laptop to fill out a post-test questionnaire.
Data Collection Sheet Example
About half of the users were not sure what to make of the four trays being different colors. Those who expressed confusion mostly believed that the different colors signaled that each tray held a different collection. To resolve the confusion, all trays were made the same color.
The automatic 180° rotation of an image when it crossed the “invisible” centerline was confusing to users on first occurrence, then increasingly irritating thereafter, in both single and pair use scenarios. In the new design there is no automatic rotation; users can rotate items manually through gestures.
Vertical images, particularly when enlarged, left only a small fraction of the metadata field visible to the user (see left). This proved frustrating for users, so the update will cap the height of the item container so that the text doesn’t overshoot the window.