Posts

LGE is the First to Integrate Audioburst Deep Analysis API for Live Audio Streams

 

After several significant partnership announcements over the past twelve months (Harman, Radioline, Top Buzz & News Republic, and Nippon Broadcasting), Audioburst is closing out the year with a bang as we announce our latest product offering, Audioburst Deep Analysis, along with the offering's first collaboration: LG Electronics (LGE).

Our two companies are teaming up to build the next big thing in in-car infotainment for leading automakers. The experience, which is still in stealth mode, is built on our new Deep Analysis API, which adds a deeper layer of metadata categorization and provides a more in-depth understanding of the content being searched for and enjoyed in real time.

“Automotive makers are seeking to redefine what in-car infotainment looks like alongside the progression of voice search, in-car personal assistants, and autonomous driving,” said Vice President Lee Sang-yong, Head of LGE Vehicle Infotainment System Research Lab. “We are excited to lead the charge in developing the new gold standard and be the first to partner with Audioburst’s technology, which we will proudly debut to select partners at CES.”

“LGE is known for building innovative in-car infotainment systems, and today’s partnership showcases what’s achievable when the brand comes together with Audioburst’s unmatched AI technology to reimagine in-vehicle entertainment like never before,” said Assaf Gad, VP of Marketing and Partnerships at Audioburst. “We are excited to introduce Audioburst Deep Analysis and collaborate with brands to better understand consumer behavior and listening experiences.”

More details about the new Deep Analysis API will be unveiled at CES 2019 in Las Vegas this January. We hope to see you there!

For media inquiries, contact Jaimen Sfetko at: jaimen@audioburst.com


Artificial Intelligence (AI) is everywhere these days. From chatbots to autonomous cars to our very own search platform, computers are working hard to help improve everyday life. But how do they do it, exactly?

We thought it’d be fun to give you a short primer on how AI aids Audioburst in our mission to organize the world’s audio content.

Algorithms

 

In order to provide you with the best search results possible, our engineers have developed a set of algorithms.

An algorithm is basically a set of rules used to solve a problem, kind of like learning PEMDAS back in your high school algebra class. A mathematical algorithm, like the order of operations, gives you instructions on how to solve a math problem. Quite similarly, a programming algorithm is a set of problem-solving instructions given to a computer to help it reach a desired result in an application.
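To make this concrete, here is a toy programming algorithm that applies the very PEMDAS rule mentioned above: a step-by-step recipe a computer can follow to evaluate a flat arithmetic expression in the correct order. This is purely an illustration, not code from Audioburst's platform.

```python
def evaluate(tokens):
    """Evaluate a flat arithmetic expression, honoring the order of
    operations: multiplication and division first, then addition and
    subtraction (no parentheses in this toy version)."""
    # Pass 1: collapse every * and / into a single number.
    stack = [tokens[0]]
    i = 1
    while i < len(tokens):
        op, num = tokens[i], tokens[i + 1]
        if op == "*":
            stack[-1] *= num
        elif op == "/":
            stack[-1] /= num
        else:
            stack.append(op)
            stack.append(num)
        i += 2
    # Pass 2: apply + and - left to right.
    result = stack[0]
    for j in range(1, len(stack), 2):
        result = result + stack[j + 1] if stack[j] == "+" else result - stack[j + 1]
    return result

print(evaluate([2, "+", 3, "*", 4]))  # 14, not 20
```

Because the rules say multiplication comes first, the algorithm collapses `3 * 4` before touching the addition, which is exactly the kind of deterministic instruction-following a computer needs.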

In the case of Audioburst, that means we have developed a set of algorithms based on two branches of AI, Natural Language Processing (NLP) and Machine Learning, which take different factors (trending stories, topics of interest, listening habits, intent, etc.) into consideration when deciding which audio ‘bursts’ we provide when you query Audioburst Search or News Feed, our Google Assistant / Alexa skill.
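One common way to combine several factors like these is a weighted score. The sketch below is a hypothetical illustration of that idea — the factor names and weights are made up for this post and are not Audioburst's actual ranking model:

```python
# Illustrative weights over the kinds of signals mentioned above
# (trending stories, topic match, listening habits, recency).
WEIGHTS = {"topic_match": 0.4, "trending": 0.25, "habit_fit": 0.2, "recency": 0.15}

def score_burst(signals):
    """Combine per-burst signals (each normalized to 0..1)
    into a single relevance score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

def rank(bursts):
    """Return bursts sorted from most to least relevant."""
    return sorted(bursts, key=lambda b: score_burst(b["signals"]), reverse=True)
```

A burst that matches the query topic strongly would then outrank one that is merely trending, because of the heavier `topic_match` weight in this toy setup.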

Natural Language Processing

 

In order for the system to be able to produce search results, it needs to understand the meaning of the search terms, but even more importantly, the content of all of the possible results. This is where the use of Natural Language Processing (NLP) technology comes into play. NLP is a type of artificial intelligence that facilitates communication between people and computers using human language. A subset of NLP is Natural Language Understanding (NLU). NLU dials in on the comprehension aspects of natural language communication. It’s important because it isn’t enough to just have a basic vocabulary with the ability to complete simple conversations. You also need the ability to determine the intent of the language if you are going to successfully communicate without excessive mistakes.  
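To see why intent matters, consider a deliberately simple, rule-based sketch of intent detection. Real NLU models are far more sophisticated than keyword matching; the phrases and intent names below are invented for illustration:

```python
# Hypothetical intent patterns -- a real NLU system learns these
# rather than hard-coding them.
INTENT_PATTERNS = {
    "play_news": ["news about", "latest on", "what's happening"],
    "play_topic": ["play", "listen to", "hear"],
}

def detect_intent(utterance):
    """Return the first intent whose trigger phrase appears
    in the (lowercased) utterance."""
    text = utterance.lower()
    for intent, phrases in INTENT_PATTERNS.items():
        if any(p in text for p in phrases):
            return intent
    return "unknown"

print(detect_intent("Any news about Michael?"))  # play_news
```

Even this crude version shows the point: "Any news about Michael?" and "Play some Michael Jackson" share a keyword but carry different intents, and the system must tell them apart to respond correctly.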

By harnessing the power of NLP and NLU, Audioburst is able to listen to, understand, segment, and index millions of minutes of daily talk content from thousands of top audio sources including radio, podcasts, and TV in real time. This thorough analysis is part of what makes Audioburst so special. We don’t just take your request for audio based on a certain topic, event, or keyword and match it up with news headlines trending on text-based search engines. Instead, this deep understanding of both content and user need aids our platform in providing the right story, at the right time, from the right sources.
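The segment-and-index step can be sketched in a few lines. This is a minimal stand-in, assuming a plain-text transcript: real segmentation finds topic boundaries with NLP rather than splitting on blank lines, and a production index stores far richer metadata.

```python
from collections import defaultdict

def segment(transcript):
    """Split a transcript into 'bursts' on blank lines -- a toy
    stand-in for real topic-boundary detection."""
    return [s.strip() for s in transcript.split("\n\n") if s.strip()]

def build_index(bursts):
    """Build an inverted index: each lowercase word maps to the
    ids of the bursts that contain it."""
    index = defaultdict(set)
    for i, burst in enumerate(bursts):
        for word in burst.lower().split():
            index[word.strip(".,?!")].add(i)
    return index

bursts = segment("Hurricane Michael made landfall.\n\nMarkets rallied today.")
index = build_index(bursts)
print(index["michael"])  # {0}
```

Once an inverted index like this exists, answering a keyword query is just a set lookup, which is what makes real-time search over millions of minutes of audio feasible.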

Machine Learning

 

Machine Learning is AI that allows a computer to analyze data and use statistics in order to improve its ability to perform a specific task, or “learn” without being directly aided by programmers.

Audioburst’s platform is constantly learning. Using the NLP and NLU technology mentioned above, we already have a deep understanding of what is happening in the news. However, simply knowing what’s in the news isn’t enough to provide the best results. Just like in real human interaction, you also need to consider context, intent, and tone in order to know exactly what a listener is looking for in a response.

For example, let’s take a look at the ambiguous potential of the search term ‘Kim.’ It appears in stories about the infamous North Korean leader, Kim Jong-un, but it can also be found just as plentifully in stories about reality TV star Kim Kardashian. Our AI is well versed in world news and politics, as well as entertainment and juicy celebrity gossip. When audio content about either of the ‘Kims’ is received, the platform sorts each story into the appropriate category. That in itself is extremely helpful, but it gets better: even in a week when President Trump has both Kims appearing in his news coverage, Audioburst can still track and associate each story with its appropriate context.
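In the spirit of the ‘Kim’ example, here is a toy version of context-based disambiguation. The context-word lists are invented for illustration; a real system learns these associations from data rather than hard-coding them:

```python
# Hypothetical context vocabularies for each 'Kim'.
CONTEXT = {
    "Kim Jong-un": {"north", "korea", "pyongyang", "missile", "summit"},
    "Kim Kardashian": {"reality", "tv", "kanye", "fashion", "instagram"},
}

def disambiguate(text):
    """Pick the 'Kim' whose context words overlap most with the text."""
    words = set(text.lower().split())
    best = max(CONTEXT, key=lambda k: len(CONTEXT[k] & words))
    # If nothing overlaps, we genuinely don't know which Kim is meant.
    return best if CONTEXT[best] & words else "unknown Kim"

print(disambiguate("Kim met Trump at the summit in North Korea"))
# Kim Jong-un
```

The surrounding words, not the ambiguous name itself, carry the signal — which is exactly why a story mentioning both Trump and ‘Kim’ can still be filed under the correct context.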

That is just one example.

Machine learning is also the technology that helps us identify new stories. In this case, we’re going to focus on the search term ‘Michael.’ Most of the time, a user searching for information about Michael is going to be looking for information on a famous person, such as Michael Jackson or Michael Jordan. However, last month when a powerful hurricane formed in the Atlantic, the next name on the World Meteorological Organization’s Tropical Cyclone Names list for the region happened to also be ‘Michael.’ Within moments of Hurricane Michael news hitting the airwaves and being indexed by Audioburst, a query to our platform for “Any news about Michael?” suddenly had a completely different meaning. Through the power of machine learning, our AI made sure that users seeking pertinent information about the powerful storm were hearing the most up-to-date information about the correct Michael at a time when accessing that news was critical.
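One simple signal behind this kind of shift is a sudden spike in context words that never co-occurred with the term before. The sketch below illustrates that idea; the data, threshold, and function are all made up for this post:

```python
from collections import Counter

def detect_emerging_context(term, recent_mentions, baseline_words, min_count=3):
    """Return context words that suddenly co-occur with `term`
    but were rare or absent in the historical baseline."""
    counts = Counter(
        w for mention in recent_mentions
        for w in mention.lower().split()
        if w != term and w not in baseline_words
    )
    return [w for w, c in counts.items() if c >= min_count]

recent = ["hurricane michael landfall florida",
          "hurricane michael category four storm",
          "hurricane michael evacuation florida"]
baseline = {"jackson", "jordan", "michael", "basketball", "music"}
print(detect_emerging_context("michael", recent, baseline))  # ['hurricane']
```

A burst of the unfamiliar word "hurricane" around ‘Michael’ is a strong hint that a new entity has entered the news, so queries for Michael should momentarily favor the storm.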

As you can see, our AI is an extremely powerful tool that helps cut through the noise, surfacing only the most relevant information. With each search interaction, the platform learns more about the user, and their preferences, interests, and habits teach the system how to provide more accurate and meaningful results. When you combine this strong understanding of the individual with the ability to process audio in real time, you create the experience that has made Audioburst untouchable in the realm of audio search.

 

The #AskNewsFeed Challenge

 

We hope you enjoyed this glimpse into how AI is powering the next generation of audio here at Audioburst. We also wanted to take a moment to let you know about a fun event we are currently running.

The #AskNewsFeed Challenge is a contest that helps us challenge and stretch the boundaries of our AI by inviting you to do your best to stump our Google Assistant / Alexa skill, News Feed.

The contest is already underway; however, you can still register here, with full details on our blog. We feel pretty confident in our AI’s abilities, so hit us with your best shot!

 

Alexa is a Content and Social Media Specialist at Audioburst. Previously, Alexa held roles at companies such as Intel and the fitness watch company Basis Science.