Asset Automation Tool

Overview

The Asset Automation Tool gives the company a way to manage and automate the rendering and quality assurance (QA) testing of media assets. Futureverse, a media tech company, uses the tool for all asset production, whether producing internal content or collaborating with internationally renowned clients such as Coca-Cola, Reebok and FIFA. A key part of the product is the QA Tool, which lets QA participants test large batches of media assets quickly and efficiently before they go into production.

In this project, I had the privilege of working closely with the Asset Pipeline Team, where we took a cross-collaborative approach to designing, building and delivering the final product.

Role
Product Designer

Year
2023

Deliverables
User research

Documentation

Design system

Wireframes

Prototype

Bottleneck in the work stream

The company identified a major bottleneck in project delivery throughout the year. One phase of our work stream was incredibly manual and time consuming. Every project that required high-quality media assets had to go through setting up new scripts to render the new assets, followed by a highly manual process of testing large batches of them. QA participants were juggling two systems to test their batches: Notion to manage their list of tasks, and Google Drive to reference the assets. This way of working often created errors and inefficiencies.

Approach

I used these UX research methods to better understand the pain-points, needs and goals of the asset pipeline developers and QA participants:

  • Contextual inquiry - I observed and took part in the old manual QA process to better understand the user flow and goals

  • User interviews and feedback - I talked to developers and QA participants to discover their needs and gather design requirements

  • Focus groups - brainstorming sessions for generating new ideas, where we shared insights, suggestions and creative solutions

Three general themes were highlighted about the QA process. Below are some of the common frustrations expressed by the participants.

Initial scoping

Using the research findings, I outlined the goals, needs, pain points and solution ideas for the developers and the QA participants. For the developers, it came down to creating a pipeline dashboard that prioritised efficiency, workflow management, quality control and scalability.

With the QA participants, it really came down to being able to manage their list of tasks and being able to efficiently QA the tasks they were assigned.

Specific functionality requirements were listed:

QA participants

Visibility:

  • QA participants need visibility of their batch progress and status

    • Approved, pending or failed assets

Multi-select actions:

  • Bulk publish

  • Bulk approve

  • Bulk fail

  • Bulk edit

Fail reasons:

  • Unique trait issue(s)

  • Video

  • No 4k Image

  • No Video

Asset viewer:

  • Video

  • Still

  • All
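The multi-select requirements above can be sketched in code. This is a minimal, hypothetical Python model (the `Asset` class, `Status` enum and `bulk_update` helper are my own illustrative names, not the tool's actual implementation) showing how one bulk action applies a status and an optional fail reason to every selected asset in a batch:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Iterable, Optional, Set


class Status(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    FAILED = "failed"


@dataclass
class Asset:
    asset_id: str
    status: Status = Status.PENDING
    fail_reason: Optional[str] = None


def bulk_update(assets: Iterable[Asset], ids: Set[str],
                status: Status, fail_reason: Optional[str] = None) -> None:
    """Apply one status change to every selected asset in the batch."""
    for a in assets:
        if a.asset_id in ids:
            a.status = status
            # A fail reason only makes sense on failed assets
            a.fail_reason = fail_reason if status is Status.FAILED else None


batch = [Asset("a1"), Asset("a2"), Asset("a3")]
# One multi-select action: bulk-fail two assets with a shared reason
bulk_update(batch, {"a1", "a2"}, Status.FAILED, "No 4k Image")
```

The same helper covers bulk approve and bulk publish by swapping the status argument, which is what makes multi-select actions cheaper than per-asset edits.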

Developers

(The developers provided a list of design requirements)

Design specifications:

  • A list of tasks assigned to a user

  • The first QA task type is assigned tests

  • If you are a QA participant, you will see a list of awaiting tests

  • If you are an admin, you will see a list of errors in the pipeline

  • We should only have one pipeline review per collection at a time - other pipelines will be more automatic

  • Tasks can be grouped by pipeline ID rather than collection

Render configurations:

  • AWS credentials

  • S3 configurations

  • Render workflow

  • Batch names & prefix

  • Media issues

  • Traits



Creating the floor plan

I translated the requirements and scope into a sitemap and user flow.

Cardinality

I created diagrams to clarify important data structures, referencing GraphQL schemas as a source of truth to establish these rules:

  • A batch cannot have assets from different collections

  • One collection can have multiple batches

  • One batch can have multiple QA tasks

  • One user can have multiple QA tasks, multiple batches and multiple collections

  • One QA task can have multiple assets

  • One batch can have multiple assets

Example: “Batch 1” contains Assets 1, 2 and 3. “QA Task X” contains Assets 1 and 2, and “QA Task Y” contains Assets 1 and 3. QA Tasks X and Y both have relationships to Batch 1.
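The cardinality rules and the worked example above can be expressed as a small data model. This is an illustrative Python sketch (the class and field names are my own, not the product's GraphQL schema), with the schema rule "a QA task's assets must all come from its batch" checked as an invariant:

```python
from dataclasses import dataclass, field
from typing import Set


@dataclass
class Batch:
    batch_id: str
    collection_id: str                      # a batch belongs to exactly one collection
    asset_ids: Set[str] = field(default_factory=set)


@dataclass
class QATask:
    task_id: str
    batch_id: str                           # each QA task reviews assets from one batch
    asset_ids: Set[str] = field(default_factory=set)


# The worked example: Batch 1 holds Assets 1-3; two QA tasks overlap
# on Asset 1, and both point back to the same batch.
batch1 = Batch("Batch 1", "Collection A", {"Asset 1", "Asset 2", "Asset 3"})
task_x = QATask("QA Task X", batch1.batch_id, {"Asset 1", "Asset 2"})
task_y = QATask("QA Task Y", batch1.batch_id, {"Asset 1", "Asset 3"})

# Invariant from the cardinality rules: a task's assets are a subset of its batch's
assert task_x.asset_ids <= batch1.asset_ids
assert task_y.asset_ids <= batch1.asset_ids
```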

Ideation and exploration

I used a mix of sketching and rapid wireframing in Figma to explore different ideas. Depending on what you are designing, starting with sketching is a great way to brainstorm ideas and ideate tangible digital solutions. It’s quick, cheap and one of the best ways to iterate over different solutions.

I explored different dashboard layouts in Figma.

Validating the designs

The designs were validated by involving the developers and volunteers from the group of QA participants. For each important touch point, we checked that the core functionalities were being met. Along with the designs, I delivered annotated documents and a design system. Because the tool had to be ready for the next project delivery, the Render Configuration screens were deferred to the next release.

Refining designs post production

The Asset Automation Tool was delivered in time to be used in the production of the new Buzzies collection. Based on how the tool performed, there were adjustments that needed to be made, especially to the QA Tool. To gather feedback, I sent out a survey and led a focus group with the QA participants. I also collaborated closely with the dev team through refinement sessions to improve the pipeline dashboard.

In the first three questions of the survey, I sought to gather quantitative data capturing the QA participants' overall impression of using the tool in a real production environment, using a Likert scale from 1 to 5, where 1 represented "Strongly Disagree" and 5 represented "Strongly Agree". This would provide a good indication of the overall experience of using the product.
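Likert responses like these are typically summarised per question. A minimal sketch with hypothetical questions and ratings (the actual survey items and data are not shown in this case study):

```python
from statistics import mean

# Hypothetical 1-5 Likert responses, one list per survey question
responses = {
    "The tool was easy to use": [4, 5, 4, 3, 5],
    "I could QA batches faster than before": [5, 4, 5, 4, 4],
    "The batch status view gave enough visibility": [3, 4, 4, 5, 4],
}

for question, scores in responses.items():
    avg = mean(scores)
    # Share of respondents who agreed (rated 4 or 5)
    agree_share = sum(s >= 4 for s in scores) / len(scores)
    print(f"{question}: mean {avg:.1f}, {agree_share:.0%} agree")
```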

The second half of the survey focused on surfacing qualitative data, digging deeper into what could be improved through open-ended questions. This also established a solid starting point for the focus group discussion.

Although the overall response to the QA Tool was positive, there was a lot of frustration with the filter not persisting from page to page. Furthermore, in the focus group I was able to uncover key areas that needed refining:

  • Filter functionality

  • Button layout

  • Improved search functionality

  • Settings required for asset viewer

Improvements

I enhanced the filter system by implementing a cascading structure and redesigned the search field to take an elastic search approach. Unlike traditional auto-suggest features, the elastic search lets users select predefined fields served by the backend, enabling a more precise and efficient search experience. This refinement lets users narrow down and pinpoint the exact asset they are seeking. Additionally, the layout of the QA Tool was improved, and the Render Configuration screens were included in the subsequent release.
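The field-scoped search described above can be sketched as follows. This is an illustrative Python snippet, not the tool's implementation: the field names, sample assets and `field:value` query syntax are my own assumptions about how a backend-defined field search might behave.

```python
# Fields the (hypothetical) backend exposes for scoped search
SEARCHABLE_FIELDS = {"collection", "batch", "status", "fail_reason"}

assets = [
    {"id": "a1", "batch": "b1", "status": "failed", "fail_reason": "No Video"},
    {"id": "a2", "batch": "b1", "status": "approved", "fail_reason": None},
    {"id": "a3", "batch": "b2", "status": "failed", "fail_reason": "No 4k Image"},
]


def search(items, query):
    """Parse whitespace-separated 'field:value' terms and AND them together."""
    terms = []
    for part in query.split():
        fld, _, val = part.partition(":")
        if fld not in SEARCHABLE_FIELDS:
            raise ValueError(f"unknown field: {fld}")
        terms.append((fld, val))
    return [it for it in items if all(str(it.get(f)) == v for f, v in terms)]


# Pinpoints the failed asset in batch b1 without free-text ambiguity
results = search(assets, "batch:b1 status:failed")
```

Because queries are restricted to known fields, an ambiguous free-text term can be rejected or corrected up front instead of silently returning fuzzy matches.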

Impact and learnings

We were able to measure the success of the Asset Automation Tool by comparing the cost of past projects with that of projects that used it. With the tool saving the company time and money, it is used today for every Futureverse project involving media assets. There were a few key lessons I took away from this project:

  • User testing does not end after development

  • Talking to real users and observing how they interact with a product is key to understanding the flows they are familiar with

  • Open-ended discussions can surface unique insights that the product team alone might miss

Sang Woo Moon
