In 2019, NBCUniversal's news departments still follow an analog workflow consisting of a paper sheet and a knob. If we treat a live news shot as a digital object, how can we create revolutionary software that maximizes productivity and limits errors?

I am part of this ambitious project to digitize the live news workflow for the largest media company in the country.


Founded in 1926, NBCUniversal has become a globally recognizable brand, and its portfolio keeps growing. The company currently runs NBC News, CNBC, MSNBC, Telemundo, NBC Sports, and over 218 local affiliate news stations.

Currently there is no system in place to easily share live news shots across networks. The workflow is manual and communication-heavy: each show has a printed rundown sheet of guests, a director in the control room calls commands line by line, and someone else turns a knob to route changes. In breaking news situations, this workflow produces chaos.

Live Shot Rundown Sheet

The printout of the "Live Shot Rundown Sheet"


To create a new internal application from the ground up that packages live shot metadata, centralizes external systems, informs users, and automates the control room workflow.



  1. What does a live shot consist of?

  2. Who is involved? What are their needs and pain points?

  3. How can we remove the need for verbal/email communication?

  4. How can we remove the need to rely on external spreadsheets and software?

  5. What integrations will accelerate the workflow?

  6. How can we limit human error?

  7. What are the most important indicators to keep users well informed?


I led the UX research for this massive new application, prioritizing requirements and determining the new optimal workflow. I worked alongside my talented UX team of 5-6 people.

I facilitated an ideation session to promote collaboration and innovation with the product & engineering teams. Then I helped design mockups that we iterated on across multiple rounds of user testing.

I also crafted a presentation to convey the problem & solution to stakeholders, which won their buy-in! NBCU is now investing serious money into developing my work, which is very rewarding.


  • Interviewing

  • Usability Testing

  • Wireframing (Sketch)

  • Prototyping (UXpin)

  • Surveys

  • Design Thinking

  • Presenting

  • Ideation Charrette


The very early-stage LSM prototype that the stakeholders presented to our team

The project kicked off when directors of the news group presented a very early stage prototype of their vision. Live Shot Manager, or LSM, was meant to automate the control room process by treating live shots as reusable digital objects.


NBC news room

"There can be 50,000 e-mails a day."


Based on the prototype demo, a follow-up interview with a news director, and our field research, we analyzed all the information and documented…

  • Current workflow

  • Proposed workflow

  • Essential UI features

  • Potential Integrations

  • User Personas


The news group needs a real-time user interface to manage live shots & simplify workflows, because the current decentralized system relies heavily on verbal communication and is prone to error.

We framed the problem to stakeholders, dev team members, and product team members with the Prezi below during our ideation session. Here you can find all of our discovery research documented.


In the ideation session, everyone sketched and/or listed their ideas for the Live Shot Manager. There were three rounds of design charrettes: the first two were individual rounds, and in the last round participants paired up with a partner. After each round, everyone presented their ideas and we held group discussions about the strongest points. At the end of the session, everyone was given three stickers to vote for the solutions they felt strongly about. You can view the final results below...

  • A | Dynamic data fields and UI with “Remote or Local” being a required field to determine the necessity of Sat Ops review (5)
  • B | Unique LSM ID (1)
  • C | Timeline view with all shows on the left and different color indicators of the objects by hit time on the right (3)
  • D | Accordion view on sidebar grouped by category with user presets and the ability to rearrange category cards (4)
  • E | Sending notifications to specific users in the system (1)
  • F | Filtered list view of pending objects (1)
  • G | Action items for reviewing objects for Sat Ops: Approve or Contact TPM (2)
  • H | Toggle-able views between timeline, list view, table view (1)
  • I | Global timeline of every control room and a side panel with the status of each control room (4)
  • J | Notifications for added parameters to an object (1)
  • K | Checklist for Sat Ops reviewing each line of an object (1)
  • L | Include “Last Created” and “Last Updated” information to increase accountability and allow for easy communication (2)
  • M | Log of changes for each object (2)
  • N | CSV export for expired shots for tracking and reporting (1)
  • O | Must be multi-browser (PC and Mac) friendly (1)
  • P | Multi-shot integration on the large monitors in the control room (1)
  • Q | Color indicators for grabbed and taken and icons for Sat Ops review; have a 2 level process where producers and TPMs/TPCs are working together in the first phase before submitting an object to Sat Ops in the second phase (2)


LSM v1 mocks

In the first low-fidelity mockups, each show has its own live shot canvas, controlled by workers in the control room. The canvas could be toggled between timeline, top-down, and list views, and users could create/edit live shots in the sidebar. Our lead UI designer at the time designed this while I fed him user requirements.

LSM v1 mocks

The application has a notification system and a live shot approval process for the Media Traffic group. Each live shot has bundled path data that can consist of cameras, audio lines, phone lines, prompters, and monitor fill graphics. There are more views and features to this powerful design, but I won’t get into them all now.
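To make the "live shot as a reusable digital object" idea concrete, here is a minimal Python sketch of what such an object might look like. All field and class names are my own assumptions for illustration, not the actual LSM schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Path:
    """One routed line bundled into a live shot (hypothetical model)."""
    kind: str   # e.g. "camera", "audio", "phone", "prompter", "monitor_fill"
    label: str  # human-readable name shown in the UI

@dataclass
class LiveShot:
    """A live shot treated as a reusable digital object."""
    name: str
    hit_time: str              # e.g. "16:05"
    remote_or_local: str       # the required field that drives Sat Ops review
    ready_for_air: bool = False
    paths: List[Path] = field(default_factory=list)

# Bundle a guest's camera and phone line into one reusable object
shot = LiveShot(name="Guest A", hit_time="16:05", remote_or_local="Remote")
shot.paths.append(Path(kind="camera", label="Camera 2"))
shot.paths.append(Path(kind="phone", label="IFB line"))
```

Because the paths travel with the object, rebooking the same guest later means reusing the object rather than re-reading lines off a paper sheet.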


Our team created a prototype in UXpin for the first round of user testing. It did not go as well as planned: this was the first time our team had access to real users, and because we didn’t fully understand the participants’ pain points, wants, and needs, our design didn’t nail it. We started off with general questions about their roles. We had prepared user testing tasks, but halfway through them the sessions transformed into interviews. We took this valuable time as an opportunity to let the participants lead the conversation so we could empathize with them better.

Round 1 Interview/User Testing Script


  • The control room is a very fast paced environment. Users need to be able to see everything clearly and quickly.

  • UI CHANGES Make text bigger, make more metadata visible in the canvas, add media traffic review status, and add bold pop-up notifications that users can act on

  • The default timeline view isn’t helpful to users. They prefer a view that replicates their live shot rundown sheet

  • UI CHANGES Make the top-down view the default, and display the same information the live shot sheet has

  • The schedule changes often if guests are late or there is breaking news

  • UI CHANGES Make live shots draggable; organize live shots into groups by hit time (which are also draggable)

  • The user needs to signal that a show is ready prior to airing so there are no live mistakes

  • UI CHANGES Add a ready-for-air checkbox on each live shot

  • It is very daunting if a live shot object disappears when a show ends. Live shots are often reused

  • UI CHANGES Live shot objects should expire at 3am with the option to extend
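The 3am-expiry rule from that last finding is simple enough to sketch. The function names and the one-day extension default below are illustrative assumptions; only the "expire at 3am, with the option to extend" behavior comes from the research.

```python
from datetime import datetime, time, timedelta

def default_expiry(created: datetime) -> datetime:
    """A shot expires at the next 3:00 AM after its creation time,
    so objects survive the end of a show but don't pile up forever."""
    cutoff = datetime.combine(created.date(), time(3, 0))
    if created >= cutoff:
        cutoff += timedelta(days=1)  # created after 3 AM -> expire tomorrow
    return cutoff

def extend(expiry: datetime, days: int = 1) -> datetime:
    """Push expiry back by whole days when a shot will be reused."""
    return expiry + timedelta(days=days)

# A shot created at 4:30 PM on May 6 expires at 3:00 AM on May 7
exp = default_expiry(datetime(2019, 5, 6, 16, 30))
```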


LSM v2 mocks

We made the top-down view the default screen and added metadata into the live shot cards to replicate the “Live Shot Rundown” sheet. Most of the metadata is placed in the “additional info” accordion.

Live shots were rearranged into blocks by hit time. The live shot for the host (Nicole Wallace) is now fixed to the top because the host’s hit time is throughout the show.

If users needed to edit a live shot, they could click the “i” icon which would expand the “Edit Live Shot” panel on the right of the image above.

LSM v2 mocks

To edit the paths of a live shot, users could click the “Path Data” button to open this modal. Previously, path data was only accessible at the very bottom of the “Edit Live Shot” sidebar, where it was easy to miss.

LSM v2 mocks

Pop-up notifications were added to keep users informed and give them control.

LSM v2 mocks

We added a “saved preset” feature to quickly fill in paths and metadata for recurrent live shots.


The second round of user testing was much more successful because the mocks more closely resembled the live shot rundown sheet, but we discovered more user needs we had to address. After this session we had another meeting with the stakeholders to clarify new vocabulary, get a better picture of the workflow, and determine next steps.


  • Users need to see as many live shots as possible in the viewport so they aren’t switching views or scrolling often. The thick live shot cards are too big to fit the whole hour on screen

  • UI CHANGES Make the cards thinner

  • The path data modal obstructs the view of the canvas and takes extra clicks to access, which could cause stress. Users need to be able to tell at a glance what type of live shot it is and which paths it contains

  • UI CHANGES Add path data to the live shot cards

  • Some live shot metadata in the cards is more important than the rest and should stand out visually

  • UI CHANGES Bold the name, and make the ready-for-air indicator stand out the most

  • The path data type of “Send, Receive, and Phone” is not natural language to the users

  • UI CHANGES Split it into inbound and outbound lines. The UI also needs to display the transmission info

  • Users use iNews, a database of all NBC studios/trucks worldwide, to confirm and document important information about a studio/truck

  • UI CHANGES Integrate iNews into the system so that selecting a studio/truck automatically fills in the relevant info

  • Once a live shot is ready for air, there shouldn’t be any changes, because changes can disrupt the live broadcast

  • UI CHANGES If someone edits a live shot after it is marked ready for air, it loses its ready-for-air status and notifications are sent to the control rooms
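That last invalidation rule is essentially a small state machine: any edit to a ready-for-air shot clears the flag and fans out an alert. Here is a hedged Python sketch of that logic; the dict fields and the `notify` callback are assumptions for illustration.

```python
def edit_live_shot(shot: dict, field: str, value, notify) -> None:
    """Apply an edit; if the shot was marked ready for air, clear the
    flag and alert control rooms (the rule surfaced in user testing)."""
    shot[field] = value
    if shot.get("ready_for_air"):
        shot["ready_for_air"] = False
        notify(f"'{shot['name']}' was edited after ready-for-air; status cleared")

# A producer changes the studio on a shot already marked ready for air
alerts = []
shot = {"name": "Guest A", "ready_for_air": True, "studio": "30 Rock"}
edit_live_shot(shot, "studio", "DC Bureau", alerts.append)
# The shot is no longer ready for air, and one alert was queued
```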


LSM survey

Survey results after the 1st round of user testing showed that users weren't fully satisfied and there was room for improvement.

LSM survey

After the 2nd round of user testing, user satisfaction increased and we were closer to creating a seamless user experience.


LSM v3 mocks

In this iteration we explored how to condense all the information into one view without making it feel too crowded. We removed the path data modal & edit live shot sidebar by integrating all that information into the live shot card. We prioritized live shot metadata by separating it into the header and an additional “Object Info” dropdown.


“It can work…. Need to play around with the actual software more."

Round 3 Interview/User Testing Script


  • Users need the most important information in the header of live shot cards so they can grasp it at a glance

  • UI CHANGES Rearrange header vs. “Object Info” metadata based on user needs, e.g. add studio/truck & live shot type to the header, and move hit time next to the live shot name

  • Each show has a homebase studio where the host films. Multiple guests can also be filmed in the main studio. Hosts/guests in the same studio share cameras and other equipment.

  • UI CHANGES Distinguish between on and off-set live shots, pin the studio to the top of the page, spawn on-set live shot objects, map host routes to studios

  • The application needs to run in real time and multiple users could be working on the same canvas at once

  • UI CHANGES Add edit/lock mode of live shots, add save button
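The edit/lock mode boils down to ensuring only one editor holds a live shot at a time while several users share a canvas. A minimal sketch of that behavior, assuming a simple in-memory registry (the real system would need server-side coordination):

```python
class LockRegistry:
    """One editor per live shot at a time, so two users working on the
    same canvas can't clobber each other's changes (illustrative only)."""
    def __init__(self):
        self._locks = {}  # shot_id -> user currently editing

    def acquire(self, shot_id: str, user: str) -> bool:
        """Grant the lock if free (or already held by this user)."""
        holder = self._locks.setdefault(shot_id, user)
        return holder == user

    def release(self, shot_id: str, user: str) -> None:
        """Only the current holder may release the lock."""
        if self._locks.get(shot_id) == user:
            del self._locks[shot_id]

locks = LockRegistry()
locks.acquire("shot-42", "producer_a")  # producer_a may now edit
locks.acquire("shot-42", "tpm_b")       # denied: locked by producer_a
```

Pairing this lock with an explicit save button gives users a clear moment when their changes become visible to everyone else on the canvas.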


LSM v4 mocks

At this point in the process our lead UI designer quit, so I picked up the design updates while continuing the research.

We took the feedback from the last round of user testing and made adjustments, e.g. the save/edit mode, the “New on-set object” button, and a rearrangement of live shot metadata. On-set live shots now display all of the studio cameras, and users check off which ones are directed at the guest/host.

I documented the full step-by-step workflow and prepared a prototype to demo to stakeholders. During that meeting, we came up with a few more updates, such as indenting on-set objects.


LSM v4 mocks

At this stage of the process our team hired a new designer, and she took over the UI design, converting my latest low-fidelity mockups into the beautiful dark-theme UI above.

After the dark UI theme was finalized, I took the Sketch file and created a 53-slide demo workflow that covered every feature of the application. I also prepared a script explaining how our solution solves the users’ problems.

We got buy-in from the higher-ups, and NBCU is now putting almost a million dollars into developing an MVP!