NASCAR Races to AWS Cloud for Computing, Machine Learning and AI
NASCAR (National Association for Stock Car Auto Racing) operates several media assets, including websites, mobile applications and social media channels. Through these assets, NASCAR reaches and engages its 80 million fans around the world and continues to attract new fans. To enhance its apps and websites with greater interactivity and higher-quality material for fans, NASCAR is using cloud-based machine learning and artificial intelligence to make its video archive more accessible, and therefore more profitable.
NASCAR will use AWS services to build cloud-based services and automate processes, including distribution of a new video series on NASCAR.com called ‘This Moment in NASCAR History’ powered by AWS. The series debuted during the lead-up to the Monster Energy NASCAR Cup Series race at Michigan International Speedway, sharing great historical moments in NASCAR racing with viewers.
Racing to the Cloud
NASCAR is migrating its 18 PB video archive, accumulated over 70 years of racing, to AWS and will use Amazon Rekognition to automatically tag specific video frames with metadata such as driver, car, race, lap, time and sponsors. Rekognition is an AWS API, supplied as a service, that developers use to add intelligent image and video analysis to applications. NASCAR’s staff can then search on the tags to locate specific footage from past races.
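The tagging step above can be sketched in Python. The response layout follows Rekognition’s GetLabelDetection API; the sample fragment and confidence threshold are illustrative assumptions, not real NASCAR footage or settings.

```python
# Sketch: turn Amazon Rekognition video-label output into a searchable index.
# The response shape matches the GetLabelDetection API; the sample is invented.
from collections import defaultdict

def index_labels(response, min_confidence=90.0):
    """Map each detected label name to the video timestamps (ms) where it appears."""
    index = defaultdict(list)
    for entry in response.get("Labels", []):
        label = entry["Label"]
        if label["Confidence"] >= min_confidence:
            index[label["Name"]].append(entry["Timestamp"])
    return dict(index)

# Illustrative response fragment in the GetLabelDetection format.
sample = {
    "Labels": [
        {"Timestamp": 0,    "Label": {"Name": "Car",        "Confidence": 99.1}},
        {"Timestamp": 500,  "Label": {"Name": "Race Track", "Confidence": 95.4}},
        {"Timestamp": 500,  "Label": {"Name": "Crowd",      "Confidence": 82.0}},
        {"Timestamp": 1000, "Label": {"Name": "Car",        "Confidence": 97.8}},
    ]
}

index = index_labels(sample)
print(index["Car"])  # timestamps (ms) where a car was detected -> [0, 1000]
```

In production the response would come from a boto3 `get_label_detection` call against the archive in S3; the low-confidence ‘Crowd’ label is filtered out by the threshold.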
By using AWS services, the company expects to save hours of manual search time each year, and will be able to find well-known events, like Dale Earnhardt Sr.’s 1987 ‘Pass in the Grass’ or Denny Hamlin’s 2016 Daytona 500 photo finish, much faster and deliver them to fans as video clips on NASCAR.com and social media channels.
Through the 1970s, fans watched race-day action on television, supplemented by 15- to 30-minute highlight packages on the major networks’ sports programs. In 1979, the entire Daytona 500 was broadcast live for the first time. More recently, because second-screen experiences have become a critical part of consuming live NASCAR content, a new digital platform was introduced in 2015 for online and mobile access.
Fans Behind the Wheel
Now, through the NASCAR Drive mobile app, subscribers can climb into the race car with the driver to experience the track, and access real-time race stats and predictions using in-car cameras, live audio and live leaderboards. For this app, NASCAR is using AWS Media Services, including the AWS Elemental MediaLive live video processing service and the AWS Elemental MediaStore storage service. MediaStore is optimised for media and acts as the origin store in the video workflow. These services process, package and store broadcast video content, including live races, for delivery to broadcast and industry partners via the Amazon CloudFront CDN.
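The delivery chain described above (MediaLive encodes the feed, MediaStore holds the HLS segments as the origin, CloudFront serves them) can be sketched as simple URL construction. The path scheme, race IDs and domain name here are hypothetical assumptions for illustration, not NASCAR’s real configuration.

```python
# Hypothetical sketch of the live-video delivery path: MediaStore is the
# origin store, and CloudFront fronts it for viewers. All names are invented.

def origin_path(race_id: str, camera: str, rendition: str) -> str:
    """Object path inside the MediaStore container for one HLS rendition."""
    return f"/live/{race_id}/{camera}/{rendition}/index.m3u8"

def playback_url(cdn_domain: str, race_id: str, camera: str,
                 rendition: str = "720p") -> str:
    """Viewer-facing URL: the CloudFront distribution maps onto the origin path."""
    return f"https://{cdn_domain}{origin_path(race_id, camera, rendition)}"

url = playback_url("d111111abcdef8.cloudfront.net", "daytona-500", "car-11-incar")
print(url)
```

Keeping the origin path and the CDN path identical is a common design choice: CloudFront can then forward requests to the MediaStore origin without URL rewriting.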
To enhance metadata and video analytics, NASCAR plans to use Amazon SageMaker to train deep learning models against its footage archive. Developers use SageMaker as a managed service that encompasses a complete machine learning workflow, starting with labelling and preparing data; they then choose an algorithm, train the model, and tune and optimise it for deployment. With Amazon Transcribe, an automatic speech recognition (ASR) service, NASCAR will be able to caption and time stamp each word spoken within its archived videos, so that producers can use text searches to locate source footage. aws.amazon.com
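The word-level search that Transcribe enables can be sketched as follows. The JSON layout mirrors Transcribe’s transcript output format (per-word `items` with string `start_time` values); the commentary fragment below is invented for illustration.

```python
# Sketch: search an Amazon Transcribe result for a spoken word, so a producer
# can jump straight to that point in the footage. The sample data is invented.

def find_word(transcript: dict, word: str) -> list:
    """Return the start times (in seconds) at which `word` is spoken."""
    hits = []
    for item in transcript["results"]["items"]:
        if item["type"] != "pronunciation":
            continue  # punctuation items carry no timestamps
        content = item["alternatives"][0]["content"]
        if content.lower() == word.lower():
            hits.append(float(item["start_time"]))
    return hits

# Illustrative fragment in Transcribe's output format.
sample = {
    "results": {
        "items": [
            {"type": "pronunciation", "start_time": "12.04", "end_time": "12.55",
             "alternatives": [{"content": "Earnhardt", "confidence": "0.98"}]},
            {"type": "punctuation",
             "alternatives": [{"content": ","}]},
            {"type": "pronunciation", "start_time": "13.10", "end_time": "13.40",
             "alternatives": [{"content": "takes", "confidence": "0.99"}]},
        ]
    }
}

print(find_word(sample, "earnhardt"))  # -> [12.04]
```

A producer searching archived commentary for a driver’s name would get back the timestamps to seek to in the source video.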