Engineering – February 01, 2022

A simpler way to implement and maintain video analytics code

Video analytics are essential to maximizing the impact and value of video content. For technical teams, however, capturing this data can often be more challenging than collecting other user events. In this article, we’ll show how mParticle’s Media SDK simplifies this process for engineering teams, and provides data stakeholders with actionable user insights.

Video content is a staple of online experiences today, and its importance is steadily increasing. Regardless of the industry you’re in or the market you serve, there’s a high likelihood that your company produces and distributes video content to tell the story of your brand. As with any other product or user experience you create, it is critical to collect insights into how these assets are performing in order to maximize the value this content delivers for your business.

For engineering teams, however, collecting analytics on video content can often be more challenging than tracking interactions with other types of UI components, since media events typically require additional metadata to be meaningful. For example, the fact that a user clicked on a media player is not very useful to marketing, analytics or product teams. In order to use this data strategically, you need information like which specific video the user clicked on, whether that click started, paused, or stopped the video player, and how long the video played. This is why engineers who implement video tracking code often need to capture separate data objects––one that describes what the user is doing, and another that describes the content itself––then map these objects together to form a complete event. 
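To make this concrete, here is a minimal plain-JavaScript sketch of the kind of merging engineers typically hand-roll without a media SDK: one object describes the user action, another describes the content, and the two must be combined into a single trackable event. All field names here are illustrative, not part of any specific SDK.

```javascript
// Hypothetical hand-rolled merge of a user action and content metadata.
const userAction = {
  type: 'play',           // what the user did: play, pause, seek, ...
  timestamp: 1643673600000,
  position_ms: 0,         // playhead position when the action occurred
};

const contentMetadata = {
  content_id: '1234567',
  content_title: 'Funny Internet cat video',
  content_duration: 120000,
};

// Combine both objects into one event that analytics tools can actually use.
function buildMediaEvent(action, content) {
  return { event_name: `media_${action.type}`, ...content, ...action };
}

const event = buildMediaEvent(userAction, contentMetadata);
// event now carries both the action ('play') and the content it applies to
```

Multiply this pattern by every player, every event type, and every downstream vendor, and the maintenance burden becomes clear.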

Additionally, the video metrics that teams can potentially evaluate––like play rate, social shares, watch time, and click-through rates, to name just a few–– are many and varied. Different teams often need to track separate metrics to measure the ROI of their specific initiatives. A growth team interested in using video to help qualify prospects, for example, might want to track all users who watch a video for more than the average amount of time. A marketing team interested in using video to build brand awareness, however, may be more interested in engagement metrics like social shares, video impressions, and view counts. For engineers, this often means implementing multiple APIs to track each separate event type, and potentially making configuration changes if a vendor happens to change their integration specifications at a later date. 

Considering these complexities, it's easy to see how implementing and maintaining video tracking code can quickly become a significant demand on engineering resources, especially without an API that abstracts these details away.

A simpler way to measure video performance

mParticle is a Customer Data Platform (CDP) that enables teams to collect customer data from any platform through secure APIs and SDKs, unify this data into customer profiles, and forward it directly to activation systems through direct integrations. Its Media SDK is an all-in-one toolkit for collecting video and audio data across web, iOS, and Android platforms. It delivers a single API for collecting common events across any media player, and automatically initializes events and objects with the relevant metadata. The Media SDK supports direct integration with Adobe Analytics for Media, enabling you to use Adobe's full range of media events without having to set them up individually. This includes the Adobe "Heartbeat", a self-firing event that records continuous snapshots of in-session media players to capture granular session data.

Getting started

Tracking video interactions can quickly result in a very large volume of data, and it is unlikely that all of this detail will be actionable in the use cases that marketing and product teams need to accomplish. This is where the mParticle Media SDK comes in. This easy-to-set-up, lightweight SDK streamlines the process of capturing view counts, watch time, average views, engagement, click performance, and more. Additionally, it provides a wrapper for the Adobe Analytics API that offers a simpler, less resource-intensive way to forward events to Adobe Analytics than installing Adobe's vendor APIs directly.

Of course, any data that the Media SDK forwards to Adobe can be sent to mParticle as custom events as well. This can be done on a selective basis, allowing teams to be strategic about the media events they capture for analytics and activation use cases, and avoid unnecessary clutter in their internal data pipelines. In this example, we'll look at how to set up the mParticle Media SDK in a web application and selectively forward events to mParticle that could benefit your marketing and product teams. To get started, we first need to install the mParticle client SDK in your project, along with packages for the Media SDK and the mParticle Adobe Client web integration:

npm i @mparticle/web-sdk @mparticle/web-media-sdk @mparticle/web-adobe-client-kit

Next, in your main JavaScript file, import these three modules, and add Adobe as a data output for your web app. Once this is configured, mParticle will load the Heartbeat SDK and send media events to it:

// Import each library
import mParticle from '@mparticle/web-sdk';
import MediaSession from '@mparticle/web-media-sdk';
import Adobe from '@mparticle/web-adobe-client-kit';

// Configure mParticle as needed for your project
const mParticleConfig = {
  // configuration items
};

// Register each kit to the configuration
Adobe.register(mParticleConfig);

// Initialize mParticle
mParticle.init('your-api-key', mParticleConfig);

The MediaSession object is the main vehicle through which the Media SDK captures metadata about user interactions with media across your site. Each time a customer begins interacting with a piece of media content (for example, starting, stopping, pausing, seeking, or scrubbing a video), a new MediaSession is instantiated, which contains the following properties by default:

const mediaSession = new MediaSession(
    mParticle,                    // mParticle SDK Instance
    '1234567',                    // Custom media ID, added as content_id for media events
    'Funny Internet cat video',   // Custom media Title, added as content_title for media events
    120000,                       // Duration in milliseconds, added as content_duration for media events
    'Video',                      // Content Type (Video or Audio), added as content_type for media events
    'OnDemand'                    // Stream Type (OnDemand or LiveStream), added as stream_type for media events
);

// optionally set percentage at which you consider content completed
mediaSession.mediaContentCompleteLimit = 90;

Each time a MediaSession is initialized, the Media SDK also creates a Session Summary event, which contains more specific information about what the user did in a specific session, including timestamps for media_session_start_time and media_session_end_time, media_content_time_spent, media_content_complete, and media_session_segment_total, among other keys. Additionally, for tracking user engagement with video advertising, the Media SDK also provides an Ad Summary event that tracks interactions beginning when the logAdStart method is called, and ending when either logAdSkip or logAdEnd is called. 
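The Ad Summary window described above (opened by logAdStart, closed by logAdSkip or logAdEnd) can be pictured with a small stand-in tracker. To be clear, this is a schematic model of the bookkeeping, not the Media SDK's internals or API; in a real app you would simply call the corresponding methods on your MediaSession instance.

```javascript
// Schematic stand-in for Ad Summary bookkeeping: a window opens on
// logAdStart and closes on either logAdEnd or logAdSkip.
class AdSummaryTracker {
  constructor() { this.summary = null; }
  logAdStart(adContent, now = Date.now()) {
    this.summary = { ad_content: adContent, start_time: now };
  }
  logAdEnd(now = Date.now()) { return this.close(false, now); }
  logAdSkip(now = Date.now()) { return this.close(true, now); }
  close(skipped, now) {
    const s = this.summary;
    this.summary = null;
    return { ...s, skipped, time_spent_ms: now - s.start_time };
  }
}

const tracker = new AdSummaryTracker();
tracker.logAdStart({ id: 'ad-123' }, 0);    // ad begins at t=0ms
const adSummary = tracker.logAdSkip(4000);  // user skips at t=4000ms
// adSummary records which ad ran, that it was skipped, and for how long
```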

Separate the signal from the noise with selective data forwarding

It is easy to see how an average user browsing a page with several video players would generate a large number of MediaSessions and accompanying events over the course of a normal session. While we want to capture all of this information in our dedicated media analytics platform, it probably doesn't make sense to forward all of these events to mParticle.

One example of video data that you would not want to forward to your CDP is the Adobe “heartbeat,” the main event that Adobe Analytics uses to capture video engagement. Every ten seconds during video playback, a heartbeat event is sent from the client to a tracking server. Each heartbeat contains a detailed snapshot of what is happening at that particular time in the playback, and a large portion of this metadata describes aspects of video performance like bitrate, dropped frames, and buffering times. While this information is essential for putting together a complete picture of how video content is performing across your sites or applications, much of it is not relevant to growth teams who are concerned with leveraging video analytics to understand customer preferences and behavior. 

In order to enable our marketing and product teams to effectively activate our video analytics data, we need to identify the events that matter to them the most, and selectively forward these events to mParticle. Luckily, the Media SDK provides a solution for easily implementing this selective event forwarding. The SDK exposes several methods based on common player functions, including mediaSession.logPlay(), mediaSession.logPause(), mediaSession.logSeekStart(), and mediaSession.logAdClick(). Each of these methods takes in an object that can be populated with custom attributes, providing a way to capture important session characteristics at the time these methods are called.

For example:

mediaSession.logPlay({
    customAttributes: {
        content_title: 'An Incredible Episode',
        content_season: 3,
        content_episode: 26,
        my_custom_attribute: 'Locutus'
    }
});
The event above would be called at the moment a user starts viewing a piece of content, capturing the title, season, and episode number of the video they are watching. Once these events begin populating in mParticle, our marketing team can use this data to deliver personalized messaging to users. For example, they could create an audience consisting of viewers who have watched a specific number of episodes of this show, and target these viewers in email and push messaging to generate excitement about the upcoming season among an engaged audience segment.
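As a rough illustration, the audience logic described above (viewers who have watched at least a certain number of episodes of a show) can be expressed as a filter over these custom attributes. This is a plain-JavaScript sketch of the idea only; in practice, this segmentation happens inside mParticle's audience builder, not in your own code, and the event shape below is simplified.

```javascript
// Hypothetical sketch: find users who played at least `minEpisodes`
// distinct episodes of a given show, based on logPlay custom attributes.
function buildEpisodeAudience(events, title, minEpisodes) {
  const episodesByUser = new Map();
  for (const e of events) {
    const attrs = e.customAttributes;
    if (attrs.content_title !== title) continue;
    if (!episodesByUser.has(e.userId)) episodesByUser.set(e.userId, new Set());
    episodesByUser.get(e.userId).add(attrs.content_episode);
  }
  // Keep only users who watched enough distinct episodes.
  return [...episodesByUser]
    .filter(([, episodes]) => episodes.size >= minEpisodes)
    .map(([userId]) => userId);
}

const events = [
  { userId: 'u1', customAttributes: { content_title: 'An Incredible Episode', content_episode: 25 } },
  { userId: 'u1', customAttributes: { content_title: 'An Incredible Episode', content_episode: 26 } },
  { userId: 'u2', customAttributes: { content_title: 'An Incredible Episode', content_episode: 26 } },
];
const audience = buildEpisodeAudience(events, 'An Incredible Episode', 2);
// only 'u1' watched two distinct episodes, so only 'u1' qualifies
```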

Measure more while maintaining less

The mParticle Media SDK greatly simplifies the process of implementing code to track video performance, as well as mapping the metadata of a video asset to the specific information describing user interactions. By streamlining the setup process and minimizing any maintenance, the Media SDK can greatly reduce the significant technical overhead that is usually associated with measuring video performance. Additionally, by providing a way to selectively forward video events to mParticle, the Media SDK helps elevate the most actionable and meaningful data for marketing and product teams to leverage. 

Watch a video walkthrough of how to get started with the mParticle Media SDK here.
