NBCUniversal traces its roots to the founding of NBC in 1926, and the company has since become a recognizable brand worldwide with a portfolio that keeps growing. NBCUniversal currently runs NBC News, CNBC, MSNBC, Telemundo, NBC Sports, and over 218 local affiliate news stations.
Currently there is no system in place to easily share live news shots across networks. The workflow is manual and communication-heavy: each show has a printed rundown sheet of guests, a director in the control room calls commands line by line, and someone else turns a knob to route changes. In breaking news situations this workflow produces chaos.
The goal: create a new internal application from the ground up that packages live shot metadata, centralizes external systems, keeps users informed, and automates the control room workflow.
What does a live shot consist of?
Who is involved? What are their needs and pain points?
How can we remove the need for verbal/email communication?
How can we remove the need to rely on external spreadsheets and software?
What integrations will accelerate the workflow?
How can we limit human error?
What are the most important indicators to keep users well informed?
I led the UX research for this massive new application, prioritizing requirements and determining the new optimal workflow alongside my talented UX team of 5-6 people.
I facilitated an ideation session to promote collaboration and innovation with the product & engineering teams, then helped design mockups that we iterated on across multiple rounds of user testing.
I also crafted a presentation that conveyed the problem & solution to stakeholders and won their buy-in! NBCU is now investing heavily in developing my work, which is very rewarding.
The project kicked off when directors of the news group presented a very early stage prototype of their vision. Live Shot Manager, or LSM, was meant to automate the control room process by treating live shots as reusable digital objects.
Based on the prototype demo, a follow-up interview with a news director, and our field research, we analyzed all the information and documented…
Essential UI features
We framed the problem to stakeholders, dev team members, and product team members with the Prezi below during our ideation session. Here you can find all of our discovery research documented.
In the ideation session, everyone sketched and/or listed their ideas for the Live Shot Manager. There were three rounds of design charrettes: the first two were individual rounds, and in the last round participants paired up with a partner. After each round, everyone presented their ideas and we held group discussions about the strongest points. At the end of the session, everyone was given three stickers to vote for the solutions they felt strongly about. You can view the final results below...
In the first low-fidelity mockups, each show has its own live shot canvas, controlled by workers in the control room. The live shot canvas could be toggled between timeline, top-down, and list views, and in the sidebar users could create/edit live shots. Our lead UI designer at the time designed this while I fed him user requirements.
The application has a notification system and a live shot approval process for the Media Traffic group. Each live shot bundles path data that can consist of cameras, audio lines, phone lines, prompters, and monitor fill graphics. There are more views and features in this powerful design, but I won’t cover them all here.
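The "live shot as a reusable digital object" idea can be sketched as a simple data model. This is a hypothetical sketch, not NBCU's actual schema; all names here (`LiveShot`, `Path`, `PathType`, `ApprovalStatus`) are my own illustration of how the bundled path data and the Media Traffic approval state might hang together:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class ApprovalStatus(Enum):
    PENDING = "pending"      # awaiting Media Traffic review
    APPROVED = "approved"
    REJECTED = "rejected"

class PathType(Enum):
    CAMERA = "camera"
    AUDIO_LINE = "audio_line"
    PHONE_LINE = "phone_line"
    PROMPTER = "prompter"
    MONITOR_FILL = "monitor_fill"  # monitor fill graphics

@dataclass
class Path:
    path_type: PathType
    label: str  # e.g. "CAM 2" (illustrative)

@dataclass
class LiveShot:
    name: str
    hit_time: str                                   # e.g. "16:05"
    paths: List[Path] = field(default_factory=list)  # the bundled path data
    status: ApprovalStatus = ApprovalStatus.PENDING  # Media Traffic approval
```

Treating the bundle as one object is what lets a shot be routed, approved, and reused as a unit instead of line-by-line knob turns.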
Our team created a prototype in UXPin for the first round of user testing. It did not go as well as planned: this was the first time our team had access to real users, and because we didn’t fully understand the participants’ pain points, wants, and needs, our design didn’t nail it. We started off with general questions about their roles and had user testing tasks prepared, but halfway through, the sessions transformed into interviews. We took this valuable time as an opportunity to let the participants lead the conversation so we could empathize with them better.
The control room is a very fast paced environment. Users need to be able to see everything clearly and quickly.
UI CHANGES Make text bigger, make more metadata visible in the canvas, add media traffic review status, add bold pop-up notifications that users can action
The default timeline view isn’t helpful to the user. They prefer a view that replicates their live shot rundown sheet
UI CHANGES Make the top-down view the default; display the same information the live shot rundown sheet has
The schedule changes often if guests are late or there is breaking news
UI CHANGES Make live shots draggable; organize live shots into groups by hit time (which are also draggable)
The user needs to signal that a show is ready prior to airing so there are no live mistakes
UI CHANGES Add a ready-for-air checkbox on each live shot
It is very daunting when a live shot object disappears as soon as a show ends, since live shots are often reused
UI CHANGES Live shot objects should expire at 3am with the option to extend
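The 3am expiry rule could be implemented with a small helper. A minimal sketch, assuming local studio time and a 24-hour extension per request (both are my assumptions, not documented behavior):

```python
from datetime import datetime, timedelta

EXPIRY_HOUR = 3  # live shot objects expire at 3 a.m.

def next_expiry(created: datetime, extensions: int = 0) -> datetime:
    """Return the expiry timestamp for a live shot object: the next
    3 a.m. after creation, plus one day per user-requested extension."""
    expiry = created.replace(hour=EXPIRY_HOUR, minute=0, second=0, microsecond=0)
    if created >= expiry:          # created at or after 3 a.m. -> expires tomorrow
        expiry += timedelta(days=1)
    return expiry + timedelta(days=extensions)
```

A shot created during an afternoon show thus survives overnight for reuse, and the "extend" option just pushes expiry out another day.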
We made the top-down view the default screen and added metadata into the live shot cards to replicate the “Live Shot Rundown” sheet. Most of the metadata is placed in the “additional info” accordion.
Live shots were rearranged into blocks by hit time. The live shot for the host (Nicole Wallace) is now pinned to the top because the host’s hit time spans the whole show.
If users needed to edit a live shot, they could click the “i” icon which would expand the “Edit Live Shot” panel on the right of the image above.
To edit the paths of a live shot, users could click the “Path Data” button to open this modal. Previously users could only access the path data buried at the very bottom of the Edit Live Shot sidebar.
Pop-up notifications were added to inform users and give them control.
We added a “saved preset” feature to quickly fill in paths and metadata for recurring live shots.
The second round of user testing was much more successful because the mocks were more similar to the live shot rundown sheet, but we discovered more user needs we had to address. After this session we had another meeting with the stakeholders to clarify new vocabulary, get a better picture of the workflow, and determine next steps.
Users need to see as many live shots as possible in the viewport so they aren’t switching views or scrolling often. The live shot cards are too thick to fit the whole hour on screen
UI CHANGES Make the cards thinner
The path data modal obstructs the view of the canvas and requires extra clicks to access, which adds stress. Users need to be able to quickly tell what type of live shot it is and which paths it contains at a glance
UI CHANGES Add path data to the live shot cards
Some live shot metadata in the cards is more important than the rest and should stand out visually
UI CHANGES Bold the name, make ready-for-air pop out the most
The path data type “Send, Receive, and Phone” is not natural language to the users
UI CHANGES Split path data into inbound and outbound lines; the UI also needs to display the transmission info
Users rely on iNews, NBC’s worldwide database of studios and trucks, to confirm and document important information about a studio/truck
UI CHANGES Integrate iNews into the system so that selecting a studio/truck automatically fills in the relevant info
Once a live shot is ready for air, there shouldn’t be any changes, because a change can disrupt the live stream
UI CHANGES If someone edits a live shot after it is marked ready for air, the ready-for-air status is cleared and notifications are sent to the control rooms
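That invalidation rule fits in a few lines. This is my own sketch (the field names and the `notify` callback are hypothetical), showing only the rule itself: any edit to a ready-for-air shot clears the flag and pushes a notification to the control rooms:

```python
def edit_live_shot(shot: dict, field: str, value, notify) -> None:
    """Apply an edit to a live shot. If the shot was already marked
    ready for air, clear that flag and notify the control rooms.
    `notify` stands in for the app's notification system (assumption)."""
    was_ready = shot.get("ready_for_air", False)
    shot[field] = value
    if field != "ready_for_air" and was_ready:
        shot["ready_for_air"] = False
        notify(f"{shot['name']}: edited after ready-for-air, status cleared")
```

Making the flag self-clearing means no one has to remember to re-check a changed shot; the system forces a fresh sign-off.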
Survey results after the 1st round of user testing showed that users weren’t fully satisfied and there was room for improvement.
After the 2nd round of user testing, user satisfaction increased and we were closer to creating a seamless user experience.
In this iteration we explored how to condense all the information into one view without making it feel too crowded. We removed the path data modal & edit live shot sidebar by integrating all of that information into the live shot card. We prioritized live shot metadata by splitting it between the header and an additional “Object Info” dropdown.
Users need the most important information in the header of live shot cards so they can take it in at a glance
UI CHANGES Rearrange header vs. “Object Info” metadata based on user needs i.e. add studio/truck & live shot type to header, move hit time next to live shot name
Each show has a home-base studio where the host films. Multiple guests can also be filmed in the main studio. Hosts/guests in the same studio share cameras and other equipment.
UI CHANGES Distinguish between on and off-set live shots, pin the studio to the top of the page, spawn on-set live shot objects, map host routes to studios
The application needs to run in real time, and multiple users could be working on the same canvas at once
UI CHANGES Add edit/lock mode of live shots, add save button
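The edit/lock mode for concurrent users can be sketched as a per-card lock table. In the real application this would live server-side with real-time sync; every name below is illustrative, not NBCU's implementation:

```python
class CanvasLock:
    """Per-live-shot edit locks so two control-room users can't
    edit the same card at once (a client-side sketch)."""

    def __init__(self) -> None:
        self._owners: dict[str, str] = {}  # shot_id -> user holding the lock

    def acquire(self, shot_id: str, user: str) -> bool:
        """Take the lock if free (or already held by this user)."""
        holder = self._owners.setdefault(shot_id, user)
        return holder == user

    def release(self, shot_id: str, user: str) -> None:
        """Only the holder may release; saving a card would call this."""
        if self._owners.get(shot_id) == user:
            del self._owners[shot_id]
```

Pairing the lock with an explicit save button gives users a clear moment when their edits become visible to everyone else on the canvas.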
At this point in the process our lead UI designer quit, so I picked up the design updates while continuing the research.
We took the feedback from the last round of user testing and made adjustments, e.g. save/edit mode, the “New on set object” button, and the rearrangement of live shot metadata. On-set live shots now display all of the studio cameras, and the user checks which ones are directed at the guest/host.
I documented the full step-by-step workflow and prepared a prototype to demo to stakeholders. During that meeting, we came up with a few more updates, such as indenting on-set objects.
At this stage of the process our team hired a new designer and she took over the UI design, converting my latest low-fidelity mockups into the beautiful dark theme UI above.
After the dark UI theme was finalized, I took the Sketch file and created a 53-slide demo workflow that covered every feature of the application. I also prepared a script explaining how our solution solves the users’ problems.
We got buy-in from the higher-ups, and now NBCU is putting almost a million dollars into developing an MVP!