Text analytics, also called text mining, has countless applications. Businesses are taking advantage of text analytics to update their service offerings, improve compliance, get ahead of PR disasters, and more.
Here are five industries taking advantage of text analytics in 2021.
1. Hospitality
Hotels live and die by their reviews. Reviews are not only crucial to whether someone books a stay, but they also give valuable insight into what a business is doing well – or not. And while the hospitality industry was decimated during the COVID-19 pandemic, the quickening vaccine deployment holds great promise for 2021 and beyond. Hotels use text analytics to get a deep understanding of where they excel and where they can improve, as well as what others are doing. Say some reviews mention poor wi-fi. A hotel can dig deeper into these reviews to nail down whether the wi-fi problem is hotel-wide or confined to certain rooms. Once they’ve figured it out, they can make the fix, thank the reviewers for their feedback, and be on their way to improved reviews in the future.
2. Financial Services
The financial services sector is hugely complex. There’s an enormous amount of interaction, documentation, risk analysis, and compliance involved. Financial services firms are using text analytics to analyze customer feedback, evaluate customer interactions, assess claims, and identify compliance risks. Take compliance. Staff can use an NLP-based text analytics solution to quickly and easily search internal legal documents for phrases relating to finance or fraud. This can save an enormous amount of time compared with doing the same searches manually.
3. Medical Affairs and Pharma
Medical affairs specialists help move pharmaceutical products from R&D to commercialization. This requires encyclopedic knowledge of regulations from drug regulatory bodies and governments, as well as drug compendia. Medical affairs specialists are using text analytics to parse each of these sources and automatically report back on changes. The specialists can then course-correct depending on what those changes mean for the drug they’re developing. Using text analytics rather than manual effort reduces the time spent tracking these changes, and is more accurate and far-reaching as well.
4. PR and Advertising
Text analytics is brilliant at sentiment analysis – something that PR is all about monitoring. Text analytics can run in real time to track the sentiment in mentions of a particular company, alerting it to potential brand reputation emergencies. In advertising, text analytics can help monitor the reach of a campaign and how it’s being received. For example, a leading provider of Media Monitoring and Social Influencing used the API from Lexalytics, an InMoment company, to create custom dashboards that analyze its customers’ media relations programs in terms of sentiment, engagement, perception, and performance.
5. Retail
In retail, the customer is always right. E-commerce retailers in particular need to make sure that the customer experience is as positive as possible, and with the boom in online buying during the pandemic, this is more important than ever. A poor experience means a customer is unlikely to return – even more so than with physical stores, which people often frequent simply because they’re nearby. Many e-tailers are turning to text analytics to curate, collate and analyze feedback that helps identify points of friction when using an e-commerce website or dealing with customer support.
Would you like to know how text analytics can help your business or industry? Get in touch
Healthcare databases are growing exponentially. Today, healthcare providers, drug makers and others are turning this data into value by using text analytics and natural language processing to mine unstructured healthcare data and then doing something with the results. Here are some examples.
This article explores some new and emerging applications of text analytics and natural language processing (NLP) in healthcare. Healthcare providers (HCPs), pharmaceutical companies and biotechnology firms all use these technologies to mine unstructured, text-based healthcare data and act on the results: improving patient outcomes, streamlining operations and managing regulatory compliance.
In order, we’ll talk about:
Sources of healthcare data and how much is out there
Improving customer care while reducing Medical Information Department costs
Hearing how people really talk about and experience ADHD
Facilitating value-based care models by demonstrating real-world outcomes
Guiding communications between pharmaceutical companies and patients
Even more applications of text analytics and natural language processing in healthcare
Some more things to think about, including major ethical concerns
NLP in the Healthcare Industry: Sources of Data for Text Mining
Patient health records, order entries, and physician notes aren’t the only sources of data in healthcare. In fact, 26 million people have already added their genetic information to commercial databases through take-home kits. And wearable devices have opened new floodgates of consumer health data. All told, Emerj lists 7 healthcare data sources that, especially when taken together, form a veritable goldmine of healthcare data:
1. The Internet of Things (IoT) (think FitBit data)
2. Electronic Medical Records (EMR)/Electronic Health Records (EHR) (classic)
3. Insurance Providers (claims from private and government payers)
4. Other Clinical Data (including computerized physician order entries, physician notes, medical imaging records, and more)
5. Opt-In Genome and Research Registries
6. Social Media (tweets, Facebook comments, message boards, etc.)
7. Web Knowledge (emergency care data, news feeds, and medical journals)
Just how much health data is there from these sources? More than 2,314 exabytes by 2020, says BIS Research. For reference, just 1 exabyte is 10^9 gigabytes. Or, written out, 1EB=1,000,000,000GB. That’s a lot of GB.
But adding to the ocean of healthcare data doesn’t do much if you’re not actually using it. And many experts agree that utilization of this data is… underwhelming. So let’s talk about text analytics and NLP in the health industry, particularly focusing on new and emerging applications of the technology.
Improving Customer Care While Reducing Medical Information Department Costs
Every physician knows how annoying it can be to get a drug-maker to give them a straight, clear answer. Many patients know it, too. For the rest of us, here’s how it works:
You (a physician, patient or media person) call into a biotechnology or pharmaceutical company’s Medical Information Department (MID)
Your call is routed to the MID contact center
MID operators reference all available documentation to provide an answer, or punt your question to a full clinician
Simple in theory, sure. Unfortunately, the pharma/biotech business is complicated. Biogen, for example, develops therapies for people living with serious neurological and neurodegenerative diseases. When you call into their MID to ask a question, Biogen’s operators are there to answer your inquiry. Naturally, you expect a quick, clear answer. At Biogen Japan, any call that lasts more than 1 minute is automatically escalated to an expensive second-line medical director. Before, Biogen struggled with a high number of escalated calls because their MID agents spent too long parsing through FAQs, product information brochures, and other resources.
Today, Biogen uses text analytics (and some other technologies) to answer these questions more quickly, thereby improving customer care while reducing their MID operating costs. When you call into their MID, operators use a Lexalytics-built search application that combines natural language processing and machine learning to immediately suggest best-fit answers and related resources to people’s inquiries. MID operators can type in keywords or exact questions and get what they need in seconds. Early testing already shows faster answers and fewer calls sent to medical directors, and the application also helps new hires work at the level of experienced operators, further reducing costs.
Hearing How People Really Talk About and Experience ADHD
The human brain is terribly complicated, and two people may experience the same condition in vastly different ways. This is especially true of conditions like Attention Deficit Hyperactivity Disorder (ADHD). In order to optimize treatment, physicians need to understand exactly how their individual patients experience it. But people often tell their doctor one thing, and then turn around and tell their friends and family something else entirely.
A Lexalytics (an InMoment company) data scientist used our text analytics and natural language processing to analyze data from Reddit, multiple ADHD blogs, news websites, and scientific papers sourced from the PubMed and HubMed databases. Based on the output, they modeled the conversations to show how people talk about ADHD in their own words.
The results showed stark differences in how people talk about ADHD in research papers, on the news, in Reddit comments and on ADHD blogs. Although our analysis was fairly basic, our methods show how using text analytics in this way can help healthcare organizations connect with their patients and develop personalized treatment plans.
Facilitating Value-Based Care Models by Demonstrating Real-World Outcomes
Our analysis of conversations surrounding ADHD is just one example in the large field of text analytics in healthcare. Everyone involved in the healthcare value chain, including HCPs, drug manufacturers, and insurance companies, is using text analytics as part of the drive toward value-based care models.
Within the value-based care model, and outcome-based care in general, providers and payers want to demonstrate that their patients are experiencing positive outcomes after they leave the clinical setting. To do this, more and more stakeholders are using text analytics systems to analyze social media posts, patient comments, and other sources of unstructured patient feedback. These insights help HCPs and others identify positive outcomes to highlight and negative outcomes to follow up on.
Some HCPs even use text analytics to compare what patients say to their doctors, versus what they say to their friends, to identify how they can improve patient-clinician communication. In fact, the larger trend here almost exactly follows the push in more retail-focused industries towards data-driven Voice of Customer: using technology to understand how people talk about and experience products and services, in their own words.
Guiding Communications Between Pharmaceutical Companies and Patients
Pharmaceutical marketing teams face countless challenges. These include growing market share, demonstrating product value, increasing patient adherence and improving buy-in from healthcare professionals. Lexalytics customer AlternativesPharma helps these teams by providing useful market insights and effective recommendations.
Before, companies like AlternativesPharma relied on basic customer surveys and some other quantitative data sources to create their recommendations. Using our text analytics and natural language processing, however, AlternativesPharma was able to categorize large quantities of qualitative, unstructured patient comments into “thematic maps.” The output of their analyses led to research publications at the 2015 Nephrology Professional Congress and in the Journal Néphrologie et Thérapeutiques.
Further, AlternativesPharma helped customers verify assumptions made by Key Opinion Leaders (KOLs) regarding the psychology of patients with schizophrenia. This theory was then documented in collateral and widely communicated to physicians. (Full case study)
More Applications of Text Analytics and Natural Language Processing in Healthcare
The above applications of text analytics in healthcare are just the tip of the iceberg. McKinsey has identified several more applications of NLP in healthcare, under the umbrellas of “Administrative cost reduction” and “Medical value creation”; their detailed infographic, available on McKinsey’s website, is a good explainer.
Meanwhile, this 2018 paper in The University of Western Ontario Medical Journal titled “The promise of natural language processing in healthcare” dives into how and where NLP is improving healthcare. The authors, Rohin Attrey and Alexander Levitt, divide healthcare NLP applications into four categories. These cover NLP for:
Patients – including teletriage services, where NLP-powered chatbots could free up nurses and physicians
Physicians – where a computerized clinical decision support system using NLP has already demonstrated value in alerting clinicians to consider Kawasaki disease in emergency presentations
Researchers – where NLP helps enable, empower and accelerate qualitative studies across a number of vectors
Healthcare Management – where applying NLP to qualitative data sources brings patient experience management into the 21st century
Next, researchers from Sant Baba Bhag Singh University explored how healthcare groups can use sentiment analysis. The authors concluded that using sentiment analysis to examine social media data is an effective way for HCPs to improve treatments and patient services by understanding how patients talk about their Type 1 and Type 2 diabetes treatments, drugs, and diet practices.
Finally, market research firm Emerj has written up a number of NLP applications for hospitals and other HCPs, including systems from IQVIA, 3M, Amazon and Nuance Communications. These applications include improving compliance with industry standards and regulations; accelerating and improving medical coding processes; building clinical study cohorts; and speech recognition and speech-to-text for doctors and healthcare providers.
Some More Things to Consider: Data Ethics, AI Fails, and Algorithmic Bias
If you’re thinking about building or buying any data analytics system for use in a healthcare or biopharma environment, here are some more things you should be aware of and take into account. All of these are especially relevant for text analytics in healthcare.
First: According to a study from the University of California Berkeley, advances in artificial intelligence (AI) have rendered the privacy standards set by the Health Insurance Portability and Accountability Act of 1996 (HIPAA) obsolete. We investigated and found some alarming data privacy and ethics concerns surrounding AI in healthcare.
Second: Companies with regulatory compliance burdens are flocking to AI for time savings and cost reductions. But costly failures of large-scale AI systems are also making companies more wary of investing millions into big projects with vague promises of future returns. How can AI deliver real value in the regulatory compliance space? We wrote a white paper on this very subject.
Third: The “moonshot” attitude of big tech companies comes with huge risk for the customer. And no AI project tells the story of large-scale AI failure quite like Watson for Oncology. In 2013, IBM partnered with The University of Texas MD Anderson Cancer Center to develop a new “Oncology Expert Advisor” system. The goal? Nothing less than to cure cancer. The result? “This product is a piece of sh–.”
Fourth: “Bias in AI” refers to situations where machine learning-based data analytics systems discriminate against particular groups of people. Algorithmic bias in healthcare AI systems manifests when data scientists building machine learning models for healthcare-related use cases train their algorithms on biased data from the start. Societal biases manifest when the output or usage of an AI-based healthcare system reinforces societal biases and discriminatory practices.
Improve Your Understanding: What Are Text Analytics and Natural Language Processing?
In order to put any tool to good use, you need to have some basic understanding of what it is and how it works. This is equally true of text analytics and natural language processing. So, what are they?
Text analytics and natural language processing are technologies for transforming unstructured data (i.e. free text) into structured data and insights (i.e. dashboards, spreadsheets and databases). Text analytics refers to breaking apart text documents into their component parts. Natural language processing then analyzes those parts to understand the entities, topics, opinions, and intentions within.
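To make that transformation concrete, here is a minimal Python sketch of the idea; the topic and sentiment keyword lists are invented for illustration and stand in for the much richer models a real text analytics engine uses:

```python
# Minimal sketch: turn free-text comments into structured records.
# The topic and sentiment keyword lists below are illustrative assumptions,
# not any real product's models.
TOPIC_KEYWORDS = {
    "wifi": {"wifi", "wi-fi", "internet"},
    "cleanliness": {"clean", "spotless", "dirty"},
    "food": {"breakfast", "fruit", "restaurant"},
}
POSITIVE = {"great", "spotless", "comfy", "loved"}
NEGATIVE = {"poor", "lame", "dirty", "off"}

def analyze(comment: str) -> dict:
    """Break a comment into words, then tag topics and a crude sentiment label."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    topics = [t for t, keys in TOPIC_KEYWORDS.items() if words & keys]
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"text": comment, "topics": topics, "sentiment": sentiment}

for record in map(analyze, [
    "The wi-fi was poor in my room.",
    "Room was spotless and the bed was super comfy.",
]):
    print(record)  # structured rows, ready for a dashboard or database
```

Each comment comes out the other end as a structured record that could feed a dashboard, spreadsheet or database, which is the essence of the unstructured-to-structured flow described above.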
While the impact of artificial intelligence (AI) is a bit of a mixed bag in a number of industries, we’re seeing some exciting traction in financial services. In this month’s article, I take a look at some specific examples of where machine learning and AI are helping financial services organizations improve their services, products, and processes.
AI Helps Financial Services Reduce Non-Disclosure Risk
Financial firms and banks are taking advantage of AI to ensure that their employees are meeting complex disclosure requirements.
Generally, financial advisors must make sure that their “client advice” documents include proper disclosures to demonstrate that they’re working in their client’s best interests. These disclosures may cover conflicts of interest, commission structure, cost of credit, own-product recommendations and more. For example, advisors must clearly disclose the fact that they’re encouraging a client to purchase a position in a company that the firm represents (a potential conflict of interest).
To ensure compliance, firm auditors randomly sample these documents and spot-check them by keyword or phrase searches. But this process is clunky and unreliable, and the cost of failure is high: Some estimates put the price of non-compliance as high as $39.22 million in lost revenue, business disruption, productivity loss and penalties.
To help financial services firms ensure disclosure compliance, companies like FINRA Technology, Quantiply and my company offer AI solutions that use semi-structured data parsing to analyze client advice documents and extract all of the component pieces of the document (including disclosures). Then, using natural language processing to understand the meaning of the underlying text, the AI structures this data into an easily-reviewable form (like an Excel document) where human auditors can quickly evaluate whether all necessary disclosures were made. Where before an auditor might spend hours to review 1% of their firm’s documents, AI solutions like this empower the same person to review more documents in less time.
AI Fights Elder Financial Exploitation
$1.7 billion. That’s the value of suspicious activities targeting the elderly, as reported by financial institutions in 2017 alone. In total, the United States Consumer Financial Protection Bureau (CFPB) says that older adults have lost $6 billion to exploitation since 2013. One-third of these people were aged 80 or older, some of whom lost more than $100,000.
Thankfully, tech companies and financial institutions are fighting back. The CFPB notes that “Regularly studying the trends, patterns and issues in EFE SARs [Elder Financial Exploitation Suspicious Activity Reports] can help stakeholders enhance protections through independent and collaborative work.” This is a great opportunity for machine learning and AI, which use reams of historical data to predict what is likely to happen next.
Wells Fargo, for example, uses machine learning and AI to identify suspicious transactions that merit further investigation. Ron Long, director of elder client initiatives for Wells Fargo Advisors, told American Banker earlier this year that their data scientists are constantly working to add new unstructured and structured data sources to improve their capabilities. “While a tool can’t replace human assessment,” he said, “machine-learning capabilities play an important part in our strategy to reduce the number of matters requiring a closer look so we can focus on actual cases of financial abuse.”
One example is EverSafe, an identity protection technology company founded in 2012, which draws on multiple data sources to train its AI. EverSafe places itself at the nexus of a user’s entire financial life, analyzing behavior across multiple accounts and financial advisors. This approach dramatically improves their AI’s ability to identify erratic activity or anomalous transactions. Eversafe’s founder, Howard Tischler, says he was inspired to create the company after his aging, legally blind mother was scammed multiple times, including by someone who sold her a deluxe auto club membership.
AI Adds A Crucial Competitive Edge In High-Frequency Trading
Back in the 1980s, Bloomberg built the first computer system for real-time financial trading. A decade later, computer-based high-frequency trading (HFT) had transformed professional investing. Some estimates put HFT at 1,000x faster than human-to-human trading. But since the 2010s, when trading speeds reached nanoseconds, industry leaders have been looking for a new competitive edge.
To keep up with (and ahead of) the competition, industry leaders are turning to algorithmic trading. The sheer volume of trading information available for machines to analyze makes artificial intelligence and machine learning formidable tools in financial marketplaces. Investment firms use AI to increase the predictive power of the neural networks that determine optimal portfolio allocation for different types of securities. In simpler terms: Data scientists use reams of historical prices to train computers to predict future price fluctuations.
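As a toy illustration of that last point (and nothing like a production trading model), here is a sketch that fits a linear model on lagged prices to forecast the next price; the synthetic random-walk series and five-step lag window are assumptions made purely for the example:

```python
import numpy as np

# Toy example: predict the next price from the previous `lags` prices.
# The synthetic random-walk series below stands in for real historical data.
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, size=500))

lags = 5
X = np.column_stack([prices[i : len(prices) - lags + i] for i in range(lags)])
y = prices[lags:]

# Ordinary least squares fit of next price against its lagged values.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(X)), X], y, rcond=None)

last_window = np.r_[1.0, prices[-lags:]]
next_price_estimate = last_window @ coef
print(f"Predicted next price: {next_price_estimate:.2f}")
```

Real quant funds use far larger feature sets and models, but the principle is the same: historical prices in, a forecast of future movement out.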
AI has already proven its value in HFT. Renaissance Technologies, an early adopter of AI, boasted a return of 71.8% annually from 1994 to 2014 on its Medallion Fund (paywall). Domeyard, a hedge fund, uses machine learning to parse 300 million data points in the New York Stock Exchange’s opening hour alone. And PanAgora, a Boston-based quant fund, deployed a specialized NLP algorithm to quickly decipher the cyber-slang that Chinese investors use on social media to get around government censorship. These findings give PanAgora, a firm that operates at the speed of fiber-optic cables, insights into investor sentiment fast enough to feed (and influence) its trading algorithms.
Wrapping Up: Tempering Expectations For AI In Financial Services
The value of AI in financial services is clear. But don’t get lost in the hype. For every useful AI system, you can find a dozen problematic algorithms and large-scale failures. To succeed, keep a realistic perspective of what AI can and can’t do to help.
The truth is that artificial intelligence is just a tool. Alone, AI doesn’t really “do” anything. What matters is how you combine AI with other technologies to solve a specific business problem.
How are real businesses actually using natural language processing? In this blog post we explore nine interesting business applications of text analytics and NLP across a wide range of industries.
Building an effective NLP application starts with defining a concrete use-case within a specific domain. No two companies are completely alike, and the same goes for business solutions. But this doesn’t mean that learnings from one project cannot be applied to another. With this in mind, we’ve collected case studies across nine different industries to illustrate the potential uses for natural language processing and text analytics.
Biotechnology
When someone calls into the Medical Information Department (MID) at Biogen, they’re routed to operators who search through FAQs, brochures, and product resources to answer questions. If an answer cannot be provided within a minute, the call escalates to an expensive medical director. Biogen wanted to reduce the involvement of these directors. So, they turned to InMoment for a solution to empower, not replace, their human operators.
First, we configured our core NLP to identify relevant information within Biogen’s resources. Then, we combined this solution with an open-source search engine and a custom user interface. The resulting system understands complex relationships within Biogen’s data. Now, MID operators can type in keywords or questions and get answers in seconds. Early testing by Biogen already shows faster responses and fewer calls sent to medical directors.
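For a rough sense of how that kind of operator-facing search can work, here is a minimal sketch that ranks documents against a typed question using TF-IDF and cosine similarity; the toy FAQ snippets are invented, and the actual Biogen system (an open-source search engine plus custom NLP and machine learning) is far more sophisticated:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins for FAQ entries and product resources.
documents = [
    "Recommended storage temperature for the product is 2-8 degrees Celsius.",
    "Common side effects include headache and mild nausea.",
    "Dosing schedule: one injection every two weeks.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)

def suggest(query: str, top_k: int = 2):
    """Return the documents most similar to the operator's typed question."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [(documents[i], round(float(scores[i]), 3)) for i in ranked]

print(suggest("What temperature should the drug be stored at?"))
```

An operator types a question, and the best-matching resources come back ranked in seconds, which is the experience the case study describes.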
“We’ve worked with InMoment for years on programs surrounding Voice of the Patient, Voice of the Key Opinion Leader (KoL), and social media monitoring… They’ve always been a key partner.” — Keith Ho, Director of Customer Focus and Medical Digital, Biogen
Sports & Entertainment
Brandtix delivers actionable brand performance insight for the world’s top athletes and teams by gathering data from social media and news platforms. They turned to InMoment for a powerful NLP platform that could analyze and decode the jargon-filled language of professional sports. Together, Semantria API and Brandtix’s proprietary algorithms now process fan vernacular across 19 languages. As part of this, Semantria analyzes and structures the sentiment of fan conversations as positive or negative, based on context. These capabilities allow franchise owners, player agents, and PR teams to separate meaningful mentions from general chatter and address PR problems before they get out of hand.
“Choosing InMoment over its competitors was easy — thanks to the mix of service, price, ease of use, and language packs. Further, InMoment counts extraction and sentiment analysis as one action. The other solutions we looked at bill extraction and sentiment separately, charging double the volume and double the price.” — Shahar Fogel, Vice President of Product, Brandtix
Social Media Monitoring
evolve24 is a data analytics firm that combines myriad data sources to help companies develop strategic direction. To process information and provide market intelligence in real-time, evolve24 can only employ best-in-class toolsets with the lowest possible latency and downtime. Salience, a core AI-based NLP engine, provides low-latency text analytics that processes five or more tweets every second, expediting evolve24’s time-to-value for their customers. Salience’s power and customizability give evolve24 the ability to keep up with increasing volumes while helping them maintain high standards of consistency and measurement across a range of text data sources.
“The text analytics engine is a key tool for us in conjunction with our proprietary emotion metric; this next evolution of functionality promises an even more comprehensive look into the conversations our customers’ customers are having.” — Noah Krusell, VP of Product Development, evolve24
Customer Experience Management
VOZIQ offers a suite of Predictive Customer Retention and Customer Experience Management solutions for call centers. Traditional churn prediction models rely on transaction histories and demographic data, but they fail to incorporate the consumer-generated input that carries real customer sentiment. VOZIQ turned to InMoment to fill this gap.
Through Semantria, VOZIQ categorizes the text comments and identifies customer sentiment from survey scores and keywords in each call log. Since partnering with InMoment, VOZIQ has retained thousands of customers for their clients, resulting in millions of dollars in additional revenue each year.
Industrial & Aviation Design
Gensler’s Los Angeles Aviation and Transportation Studio partnered with InMoment, leveraging sentiment analysis on customer feedback to make better-informed decisions about the planning and design of airports. The result is a data-driven voice of customer program that can help win contracts and build airports that better serve stakeholders and travelers alike.
“As a global industry leader in airport architecture, we utilize the power of Semantria’s rapid and precise data analysis to create better-informed designs for the airports of tomorrow.” — Andy Huang, AIA LEED Associate Designer, Gensler Aviation and Transportation Studio
Hospitality & Hotel Management
Revinate helps over 30,000 hospitality providers measure online presence, analyze consumer feedback, and reinvent the guest experience. With over 2,700 categories, 100 restaurant topics, 200 hotel topics, and nine languages, Revinate gives their customers the ability to measure consumer sentiment in critical categories, such as rooms, staff, service, and food. Semantria’s customizability lets Revinate’s users create lists of custom topics, follow trending topics as they evolve, and compare sentiment scores across multiple organization-specific metrics.
“The support from the team at InMoment was outstanding; they made a very complex project seem simple. With their partnership, we met our goals on time, delivered the best possible product, and were set up to ensure continued success.” — Matt Zarem, Senior Director of Product, Revinate
Technology & Electronics
A large tech company’s Customer Market Research (CMR) team helps managers across the company make better decisions regarding product and market strategy. Before, the CMR team listened to the Voice of the Customer by designing, distributing, and analyzing a wide range of surveys. As the group began working to integrate social media data, they turned to InMoment.
Their team needed to effectively filter social content in order to extract relevant data, reduce survey spend, easily configure flexible one-off analyses, and validate long-term trends. Traditional social listening tools didn’t offer the customizability and scalability that the CMR team needed, so they contacted InMoment to discuss a “semi-custom” solution.
First, the CMR team extracts a subset of social comments from an InMoment-built data warehouse, based on the products and brands they want to know more about. Then they use Spotlight to analyze this data and understand what people are saying, how they feel, and why they feel that way. Next, they validate the results and relate the net sentiment score to quantitative Likert™ scale survey data. This approach allows them to compare and contrast what people say in structured surveys versus what they say in the unstructured environment of social media.
“InMoment is the only vendor we’ve seen that can offer the flexibility that is required to support our complex product line.” — Csaba Dancshazy, Senior Market Research Manager
Fitness Lifestyle & Events
Tough Mudder Inc. has grown to become a leading active lifestyle brand and endurance event company with more than 2.5 million global participants. The Net Promoter Score (NPS) is an essential measurement for the company. However, the volume and the qualitative format of their post-event surveys make it challenging to garner insight.
Using Semantria for Excel, the Tough Mudder team reduced manual survey coding time by 90%. Working with InMoment staff, they designed custom queries to solve an industry-specific sentiment analysis problem. In total, Tough Mudder uses InMoment to process 2,000 surveys for each of the company’s 78 events per season, some 156,000 surveys total.
“By teaming with InMoment, Tough Mudder is able to report Net Promoter Scores and review participant feedback within a week of every event. The company’s ability to make strategic adjustments based on customer insights is invaluable to providing the ultimate event experience.” — Sydney Friedkin, Consumer Insights Analyst, Tough Mudder Inc.
Regulatory Compliance & Financial Services
The Australian government mandates that financial Statements of Advice (SoAs) include disclosures covering conflicts of interest, own product recommendations, and more. Financial services providers doing business in Australia use SoA templates and frequent spot-checks. This helps make sure that financial advisors aren’t modifying or deleting critical disclosures.
An average-sized firm produces hundreds of pages of SoAs each week. Manual review is costly, unreliable, and exposes the firm to high non-compliance risk. One such firm, unable to find an existing contract analysis tool that could solve this exact problem, turned to InMoment for help. First, we trained our semi-structured data parser with machine learning to understand the underlying structure of the Statement of Advice document. Then, we built a custom natural language processing configuration to extract and analyze entities and other text elements. Finally, we structured and exported the resulting data into a simple spreadsheet.
Now, in mere minutes the firm’s auditors can see whether proper disclosures were made across hundreds of documents. They can even identify where an advisor’s recommendations may go against their client’s stated goals and risk attitude. This substantially lowers the firm’s non-compliance risk even while reducing their disclosure compliance costs.
Deploying NLP in Your Business
All of the NLP applications above show how text analytics/NLP can help companies increase revenue and reduce costs. But can a natural language processing application solve your business problems?
Start by answering these questions:
What’s your need?
What’s your desired outcome?
Do you have enough data?
Do you have the right data?
Does the technology exist?
Can you build it?
Is there an established vendor you can work with?
How will you measure your outcome?
Your answers will help you figure out the most cost-effective way to solve your own business problems. Often this comes down to a question of build vs. buy. In many (most) cases, it will make more sense to partner with a reliable NLP vendor – so long as you do your homework.
The truth is that many companies flaunt shiny AI systems that promise to solve all the world’s problems. But while moon-shot projects certainly are admirable, the nature of those projects often dooms them to failure from the outset. And in the end, business users are not angel investors. They need real applications that deliver results today, not years in the future.
We can’t stress this enough: everything comes down to how applicable an NLP solution is to your business. Whether you’re in hospitality, entertainment, financial services or any other text-data-heavy industry, natural language processing and text analytics can unlock value. If you see potential for NLP within your organization, the next step is to reach out to a vendor. If you speak with InMoment, we’ll start by sitting down with you to understand precisely what you’re trying to achieve, the context you’re working in, and why other providers don’t meet your needs.
Instead of going for a high-risk moon-shot, here's how to effectively integrate AI into your business to solve tangible problems. In this article, originally posted on Forbes, Lexalytics CEO Jeff Catlin keeps it clear and concise.
When people think AI, they often think big, such as curing cancer or solving climate change. Everybody is dreaming up the biggest problems possible and attempting to solve them with AI. Or there’s the flip side: not knowing what to do with AI and avoiding it accordingly. That may be why, according to McKinsey, just 20% of surveyed executives use AI-related technologies in their businesses.
There is a middle ground that will allow you to effectively integrate AI into your business without shooting for the moon (and blowing up on the launch pad). Look for business use cases where AI is already a proven solution — or an emergent one. And ensure that you have the data ecosystem available for AI to do its work.
With the right business case and the right data, AI can deliver powerful time and cost savings, as well as valuable insights you can use to improve your business.
Let’s take a look at a handful of business problems and how AI has been employed to solve them. These are practical, pragmatic, replicable efforts. This isn’t intended to be a comprehensive list, but rather a group of examples of “right-sized” projects.
The Problem: Predicting Customer Churn And Acting On It
VOZIQ provides customer experience management software to contact and call centers. (Full disclosure: VOZIQ and AlternativesPharma are customers of Lexalytics, an InMoment company.) For these centers, reducing churn is a major KPI, and they have traditionally pursued it largely by using demographic and transaction-history data.
However, this approach fails to capture the real-time, dynamic customer data picked up over the phone, much of which is recorded in notes taken by call center workers.
Rather than letting this data sit untapped, VOZIQ made use of it. It integrated AI to analyze post-call comments, categorizing them by topic and flagging sentiment scores that indicate customer dissatisfaction and the likelihood of churn. The company’s call center clients now receive insight into customer motivations, concerns and reasons for calling and are able to use this data to quickly spot and address customer churn.
The Problem: Creating Surveys That Deliver High-Quality Responses
SurveyMonkey is a leading survey platform that lets businesses create and publish digital surveys in minutes. The system crunches an incredible 3 million responses every day. Since launching in 1999, SurveyMonkey has built up a powerful database of consumer and employee responses, and it’s now using AI to leverage this data.
It’s doing so in a few ways. One of them is by tapping into past survey results to help businesses create high-performing surveys with high completion rates. The system delivers real-time recommendations for adjusting which questions are asked, and how, in order to generate higher-quality data. The data received by SurveyMonkey comes from unpaid survey-takers, so optimizing for high-quality responses is essential.
The company is also using AI to help organizations map customer feedback via sentiment analysis and to help vet candidates for jobs, scholarships and programs. Together these changes mark SurveyMonkey’s shift toward becoming a business intelligence tool.
The Problem: Reading And Handling Online Reviews
There are countless online review sites where guests, travelers and diners post their experiences. But reading and reviewing them is no simple task. Reviews are scattered across a variety of sites, many of which use different formats. Add to this the challenge of unstructured, text-based reviews and the multilingual nature of the hospitality industry, and obtaining a comprehensive snapshot is a serious challenge.
But this is exactly the sort of situation where AI shines. For example, luxury hotel operator Dorchester Collection is using AI to monitor its own and competitor reviews to identify genuine guest needs. Using a platform called Metis, Dorchester Collection parses, summarizes and contextualizes reviews in order to gain insights, plan next steps and maintain a competitive advantage.
The Problem: Creating Messaging That Resonates With Users
What patients say in a clinical setting is different from what they say behind closed doors — or in the anonymity of the internet. AlternativesPharma is all too aware of this, which is why it uses qualitative data from web forums, social media and blogs in its efforts to help pharmaceutical marketing teams connect with both patients and doctors.
However, sourcing, collating and analyzing such data on a suitably large scale is impossible without the help of technology. To get the insights and in-depth analysis needed to improve pharmaceutical messaging and communications, AlternativesPharma turned to AI. This has allowed the company to analyze, categorize, and “theme” patients’ online discussions around particular diseases and pharmaceutical products. With new insights into how patients talk about certain ailments, AlternativesPharma has been able to help its clients more effectively communicate with patients and medical providers.
Building A Business Case For Your AI Problem
So how do you go about bridging the gap between AI as a possibility and AI as your chosen solution? Building a business case for AI isn’t so different from building one for any other business problem.
First, identify a need and a desired outcome (automation and efficiency are common drivers of successful AI projects). Then undertake a feasibility assessment. You’ll need to determine whether you have enough data to work with and whether it’s the kind of data that lends itself to pattern identification and subsequent decision making. You’ll also need to make sure that the technology is sufficiently advanced to do what you need it to do, and whether you can realistically build it yourself; if not, an existing vendor solution may be the more cost-effective option.
Finally, you’ll need to ensure that the ROI of “success” is there. How will you measure your outcomes, and how will you incorporate these new understandings into your business model?
Implementing AI can be a big undertaking. But if you start with a business problem and take an incremental approach, you’ll be able to leverage its time and cost efficiencies to stay competitive both now and in the future.
The story of this Australian financial services firm shows how new Regulatory Technology solutions for financial disclosure compliance monitoring can help firms reduce their costs and non-compliance risks by empowering, not replacing, human auditors.
Financial services firms around the world face strict regulations around disclosure compliance and monitoring. For example, the Australian government mandates that financial Statements of Advice (SoAs) include disclosures covering conflicts of interest, own-product recommendations and more. Each disclosure, in turn, may contain a dozen or more sub-components. This adds up to a major burden for the service provider. On average, globally, financial firms dedicate 10-15% of their workforces and spend a combined $270 billion on regulatory compliance annually. New Regulatory Technology solutions can help financial services firms lower the costs associated with disclosure compliance monitoring and reduce their non-compliance risk. Here’s how.
Meanwhile, a 2016 BBVA Research report found that financial services firms dedicate around 10-15% of their total workforce just to governance, risk management and compliance – and that number has almost certainly gone up in the intervening years. In the very next sentence, the same report identified “compliance costs” and “reliance on manual processes in data management” as two of the top issues facing financial institutions.
Financial Document Templates Are Great for Disclosure Compliance, But You Have to Go Further
Faced with strict disclosure mandates, many financial firms build libraries of document templates. Each template is “pre-loaded” with all of the proper disclosures and legal language. Each advisor or broker then modifies the appropriate template, such as a Statement of Advice, on a client-by-client basis. As far as reducing non-compliance risk, this strategy is certainly a good start. But it’s not enough on its own.
The problem is that an average-size financial services firm may produce thousands of pages of client-facing documents every week. In the process, important disclosures may be accidentally modified or removed entirely. What’s more, the sheer volume of data and information in each document may obscure problematic or even predatory advice.
Many firms rely on spot-checks and keyword searches to confirm disclosure compliance and ensure that their advisors are working in each client’s best interests. But this process is slow, costly and unreliable.
Consider this: Looking for individual keywords may return hundreds of irrelevant matches littered through a document. Searching for whole phrases may miss where a disclosure has been truncated or deleted. And how can you use keywords to search for bad advice?
Document templates are a start. But you have to go further.
This is where Regulatory Technology (RegTech) comes into play.
Quick Context: What Is Regulatory Technology, and Why Do Many “AI for Regulatory Compliance” Tools Fall Short?
Regulatory Technology (RegTech) is a category of systems that help companies comply with government regulations. For example, solutions like NetGuardians help companies identify, track and manage fraud incidents.
The RegTech market is hot. Between 2012 and 2017, RegTech companies raised $2.3 billion in funding, according to CBInsights. And ComplyAdvantage reports that the automation of due diligence is at the forefront of the RegTech revolution. But they caution that custom-fitting is key. Indeed, our own research supports the idea that one-size-fits-all RegTech solutions are, by their nature, more likely to fail.
The truth is, traditional data analytics tools often can’t handle legal, financial and medical documents. In short, many RegTech tools don’t have the technology they need to parse the structure and content of regulatory documents. As a result, disclosure compliance systems may leave behind valuable data or overlook important context. (More info in this client story)
Story Time: A Financial Disclosure Compliance Monitoring Solution That Empowers, Not Replaces, Human Auditors
An Australian financial services company came to Lexalytics for help reducing the time they spent auditing hundreds of pages of Statements of Advice (SoAs). Regular contract analysis tools couldn’t be customized to do exactly what they wanted. And costly failures of other large-scale AI systems had made them wary of entrusting millions to one of the Big Tech companies.
So, rather than building a high-cost, high-risk “AI for disclosure compliance,” Lexalytics focused on improving the Australian firm’s existing process.
First, we trained our semi-structured data parser to understand the underlying structure of Statement of Advice documents. This included teaching the parser how to identify where sections begin and end, such as Scope of Advice and Duty of Disclosure portions.
Then, we built a custom natural language processing configuration to extract and analyze entities and other text elements. In an SoA, important entities are things like recipients, needs, goals, product recommendations, risk attitude and the actual disclosure statements.
Finally, we built a connector to structure and export the resulting data into a simple spreadsheet. (Then, based on user feedback, we made a few tweaks to how the data is organized and displayed.)
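As a highly simplified sketch of that parse-extract-export flow (not the actual Lexalytics pipeline; the section headings, required disclosures and regular expressions below are invented for illustration), the three steps might look something like this:

```python
import csv
import re

# Hypothetical required disclosures an auditor would expect in every SoA.
REQUIRED_DISCLOSURES = ["conflict of interest", "own product recommendation"]

# Step 1 (stand-in for the semi-structured parser): split an SoA into sections
# by matching headings such as "Scope of Advice" or "Duty of Disclosure".
def split_sections(document: str) -> dict:
    pattern = r"^(Scope of Advice|Duty of Disclosure|Recommendations)$"
    sections, current = {}, None
    for line in document.splitlines():
        if re.match(pattern, line.strip()):
            current = line.strip()
            sections[current] = []
        elif current:
            sections[current].append(line)
    return {name: "\n".join(body) for name, body in sections.items()}

# Step 2 (stand-in for the NLP configuration): check which disclosures appear.
def check_disclosures(sections: dict) -> dict:
    text = sections.get("Duty of Disclosure", "").lower()
    return {d: (d in text) for d in REQUIRED_DISCLOSURES}

# Step 3: export one row per document into a spreadsheet auditors can scan.
def export(rows: list, path: str = "soa_audit.csv") -> None:
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["document"] + REQUIRED_DISCLOSURES)
        writer.writeheader()
        writer.writerows(rows)

doc = "Scope of Advice\nRetirement planning.\nDuty of Disclosure\nConflict of interest: the firm represents XYZ Ltd."
found = check_disclosures(split_sections(doc))
export([{"document": "client_001.pdf", **found}])
```

The real system relies on machine learning rather than hand-written rules, but the shape of the workflow is the same: parse the document’s structure, extract and check the elements that matter, and hand auditors a simple table.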
Using this system, the firm’s auditors can see at a glance whether proper disclosures were made across hundreds of SoA documents, and even where an advisor’s recommendations may go against their client’s stated goals and risk attitude. This substantially lowers their time spent on SoA review, reduces their non-compliance risk, and helps them demonstrate disclosure compliance whenever needed.
(See this other white paper for more on the process we followed for building this “semi-custom” regulatory compliance application.)
How to Build a Better Regulatory Compliance Solution
Regulatory compliance as a field varies by industry, by country, and even by company. This means that every compliance challenge is unique to some degree.
What’s more, the nature of the documents involved in regulatory compliance means that to build an “AI for regulatory compliance,” you need more than just AI. In fact, you need a combination of semi-structured data parsing, natural language processing, and machine learning. (More on why that is in this paper.)
Of course, not every RegTech system will necessarily need all three technologies at the same time. Our financial services disclosure compliance monitoring solution, for example, only uses semi-structured data parsing and natural language processing. (Of course, the NLP itself involves a lot of machine learning.)
Together, these factors mean that traditional data analytics techniques and one-size-fits-all compliance tools will often, by their very nature, fall short.
To really solve regulatory compliance problems, the most important thing is to choose a solution provider who combines the following characteristics:
Has all three of these technologies at their disposal (semi-structured parsing, NLP, and machine learning)
Can demonstrate that they know how, and where, to use them (and when not to use them)
Demonstrates a proven methodology for building a system that’s custom-fit to your unique needs
Feel free to contact us if you’d like to discuss your own regulatory compliance challenges and how Regulatory Technology could help reduce your costs and risk.
Net Promoter Score (NPS) is great for a quick view of customer satisfaction and brand health. But NPS can be dangerously misleading. Here's why.
Net Promoter Score (NPS) is great for a quick overview of customer satisfaction and brand health. But NPS ignores nuance. A single number can’t tell you why customers feel the way they do. The upshot? You may be making bad decisions based on misleading NPS metrics. In a world where customer experience is everything, this can be disastrous.
Key take-aways
A high Net Promoter Score doesn’t mean your brand is healthy
People often leave comments that don’t match their NPS
You can fill this consumer insights gap by analyzing open-ended survey responses, social comments and online reviews
Read on to learn more about the dangers of measuring customer satisfaction with pure-NPS, and how you can use NLP-powered BI tools to fill this customer insights gap.
What is NPS?
The benefits of NPS
Why is it bad to rely on NPS alone?
The NPS insights gap
How NPS can be misleading
Bridging the NPS insights gap
Customer review analytics in action
How to build a better Voice of Customer program
What is NPS?
Net Promoter Score (NPS) is a single-question survey designed to measure customer brand loyalty. NPS asks,
“How likely are you to recommend [Company X] to a friend or colleague?”
Customers can answer on a scale:
0-6 = Detractor
7-8 = Passive
9-10 = Promoter
Promoters are likely to buy again or generate referral business. Detractors are unlikely to buy again and may actively discourage others. Passives fall between the two.
A company’s net promoter score is a simple calculation:
Company NPS = [% Promoters] – [% Detractors]
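In code, the calculation is trivial. Here is a minimal sketch that buckets responses and computes the score; the sample responses are made up:

```python
def net_promoter_score(scores):
    """Compute NPS from 0-10 survey responses: % promoters minus % detractors."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Made-up sample: 5 promoters, 3 passives, 2 detractors -> NPS of 30.
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))
```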
As we’ll see, NPS is a versatile number that offers a wide range of benefits and practical applications. But NPS can be dangerously misleading without deeper, supplemental business intelligence.
The benefits of NPS
First and foremost, the NPS system is proven to increase survey response rates by giving customers a chance to have their voice heard, without requiring a substantial time investment.
Next, a company’s Net Promoter Score can simultaneously be taken as a snapshot and tracked over time for predictive analytics.
Third, NPS can be measured by company, product, franchise location, support agent, and across a wide range of other vectors.
Fourth, NPS can serve as a predictor of business growth. A Promoter’s customer lifecycle value (CLV) is usually higher than a Detractor’s or Passive’s. So, a higher NPS naturally correlates with higher revenue, and vice versa.
Fifth, NPS drives rapid changes in policies, products and processes. By using a simple, shared vocabulary, NPS lets you quickly share information within an organization, while being sure that everyone reaches the same conclusions. This helps companies reduce the communication delay between customers, support agents, and product managers.
Finally, you can compare your company’s aggregated Net Promoter Score against your competitors for a simple picture of your brand’s relative health. If your business has an NPS of 70, but your chief competitor boasts a 90, you know to start digging deeper to find out why.
In short, the Net Promoter Score system is a simple, easy way for businesses to paint a clear picture of consumer opinion and brand health.
For these reasons and more, NPS has become a go-to customer success metric for companies and agencies across every industry and vertical. But NPS isn’t enough on its own.
Why is it bad to rely on NPS alone?
The NPS system delivers an easy-to-understand measure of customer satisfaction. And because NPS questions generate more responses than traditional satisfaction surveys, NPS can give you more data to act on. But in the end, this is a dangerous oversimplification. A high NPS doesn’t mean your brand is healthy.
Customers don’t care about your NPS. They want to know that they’ll enjoy the experience of using your products and services. And without understanding why you’re receiving your scores, and without giving your customers the chance to tell you in their own words, you’ll never have the data you need to make informed, effective decisions.
The “why” comes from open-ended survey responses, customer comments, social media posts, and online reviews (which is an information source that is notoriously challenging and labor-intensive to utilize). As we’ll show, this is where natural language processing comes into play.
The NPS insights gap
Meet Stephanie.
Stephanie just wrapped up a four-night stay at a San Francisco hotel while in town for a conference. When asked how likely she’d be to recommend the hotel to others, Stephanie responds with an enthusiastic 9.
Sounds great, right? Another promoter created, a higher NPS for the hotel, and a happy management team. Bonuses all around!
Not so fast. Stephanie also left a free-text comment on the same survey:
“Stayed for 4 nights. The room was spotless, and the bed was super comfy. Especially loved the shampoo and conditioner in the bathroom since I forgot mine at home! I did notice the fruit in the bowl at the front desk looked off and the breakfast was kind of lame. But overall a great stay.”
Overall, Stephanie describes a positive experience and offers a generous Net Promoter Score. But her comment raises two red flags that demand attention: rotten fruit and a “lame” breakfast.
How NPS can be misleading
As we said before, NPS deliberately ignores the nuance of open-response customer surveys in favor of higher response rates and fast action. That’s a fine way to gather basic feedback. But customers often leave comments that don’t match their Net Promoter Score. Ignoring this disconnect can seriously damage your business.
Remember that Stephanie gave her hotel a Net Promoter Score of 9. In her open-ended survey comment, however, she mentioned that the fruit at the front desk looked off and the breakfast was “lame”. Both of these data points are valuable. But a traditional NPS system will totally ignore the critical feedback about the front desk and breakfast service.
And it gets worse. What happens if Stephanie posts her review on TripAdvisor, Yelp, or the hotel’s Facebook page? That shiny NPS may be quickly overshadowed by lost revenue from people turned off by her review.
Without a system in place to analyze Stephanie’s open-ended comment and identify her complaints, the hotel’s managers may never even know why business is down.
Bridging the NPS insights gap
As Stephanie’s story demonstrates, a customer’s Net Promoter Score and their actual comments can send two very different messages.
The best way to fill this “NPS insights gap” is, of course, to read survey responses, online reviews, social comments, and other sources of open-ended feedback.
But the sheer volume of this text is impossible to handle. Until recently, businesses had to comb through customer satisfaction surveys and online review sites by hand. This was a tedious process that required an enormous labor investment for minimal returns.
As a result, customers had few channels through which they could tell companies about their experiences. Companies were all-but-deaf to these stories, and everyone suffered for it.
Today, however, customer feedback analytics tools like the Lexalytics Intelligence Platform enable you to analyze thousands of open-ended survey responses and real, unstructured customer comments and reviews, all in the time it takes to brew your morning coffee.
These solutions combine natural language processing and artificial intelligence to show you how people talk about their experiences with your products, brands and services, in their own words.
Through intuitive dashboards, you can see exactly what people are talking about, how they feel about those subjects, and why they feel that way.
In short: By analyzing open-ended survey responses and real customer comments, you’ll catch the valuable, context-rich data that NPS systems would fail to pick up on.
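To make this concrete, here’s a minimal sketch of the idea in Python – not the Lexalytics platform itself, just a toy aspect-sentiment scorer whose aspect names and keyword lists are invented for illustration – applied to a comment like Stephanie’s:

```python
# A toy aspect-level sentiment scorer. The aspect keywords and sentiment
# lexicons below are invented for this example; real platforms use much
# richer NLP models.
COMMENT = ("Stayed for 4 nights. The room was spotless, and the bed was super "
           "comfy. I did notice the fruit in the bowl at the front desk looked "
           "off and the breakfast was kind of lame. But overall a great stay.")

ASPECTS = {                      # aspect -> trigger keywords (assumed)
    "room": ["room", "bed"],
    "front desk": ["front desk", "fruit"],
    "breakfast": ["breakfast"],
}
POSITIVE = ["spotless", "comfy", "great"]
NEGATIVE = ["off", "lame", "rotten"]

def aspect_sentiment(comment: str) -> dict:
    """Score each aspect by counting positive/negative cue words in the
    sentences that mention it."""
    scores = {}
    for sentence in comment.lower().split(". "):
        for aspect, keywords in ASPECTS.items():
            if any(k in sentence for k in keywords):
                score = (sum(w in sentence for w in POSITIVE)
                         - sum(w in sentence for w in NEGATIVE))
                scores[aspect] = scores.get(aspect, 0) + score
    return scores

print(aspect_sentiment(COMMENT))
# -> {'room': 2, 'front desk': -2, 'breakfast': -2}
```

Real platforms replace those keyword lists with trained NLP models, but the output shape is the same: a sentiment signal per theme, rather than a single number per survey.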
The outcome? Better customer experiences can increase customer lifetime value by 6-14x, reduce churn by up to 55%, and grow revenue by 4-8% (source).
The flexibility and customizability of these platforms make them applicable across industries and verticals, particularly in hospitality/transportation, financial services, pharmaceuticals, and retail.
For example, take a look at this dashboard built in the Lexalytics (an InMoment company) Intelligence Platform, using a data set of Facebook reviews of San Francisco International Airport (SFO).
This dashboard tells a compelling story of traveler experiences at SFO.
Overall, guests are satisfied with the airport – but there are several areas of concern that the airport’s management should investigate. For one, there’s a problem with the charging stations that needs to be addressed immediately. Travelers are complaining about flight scheduling, and mentions of this issue have been increasing over time. And Terminal 1 should be speedily modernized like Terminals 2 and 3.
Through rich, multi-layered analytics dashboards like this one, you can uncover compelling stories of customer experiences, as customers themselves tell them.
How to build a better Voice of Customer program
To be clear: Net Promoter Score can and should still have a role in your customer experience management. But as we’ve demonstrated, the NPS insights gap can lead you unwittingly into disaster.
To fill this gap, combine NPS and an NLP-powered Voice of Customer analytics tool to paint detailed pictures of customer experiences.
For example, send NPS surveys for a quick, easily-digestible snapshot of brand health. Use this information to make fast, agile changes.
Meanwhile, use your VoC platform to analyze unstructured customer comments, reviews, and open-ended survey responses at scale.
Together, this comprehensive VoC analytics program will deliver the detailed information you need to make informed, effective changes to improve your customer experience.
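As a rough illustration of how the two streams fit together, the sketch below (plain Python, with invented survey rows and a hypothetical comment-sentiment score coming from an upstream NLP step) flags responses where the NPS bucket and the comment point in different directions – exactly the Stephanie scenario:

```python
# A rough sketch of flagging the "NPS insights gap": responses whose numeric
# score and comment sentiment point in different directions. The data and the
# comment-sentiment scores (-1..+1) are invented for this example.
responses = [
    ("Stephanie", 9, -0.3),   # promoter score, but the comment raises red flags
    ("Marcus",    3, -0.8),   # detractor, and the comment agrees
    ("Priya",    10,  0.6),   # promoter, and the comment agrees
]

def nps_bucket(score: int) -> str:
    """Standard NPS buckets: 9-10 promoter, 7-8 passive, 0-6 detractor."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

for name, score, comment_sentiment in responses:
    bucket = nps_bucket(score)
    gap = ((bucket == "promoter" and comment_sentiment < 0) or
           (bucket == "detractor" and comment_sentiment > 0))
    if gap:
        print(f"Insights gap: {name} is a {bucket} (score {score}) "
              f"but their comment scores {comment_sentiment:+.1f}")
```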
It's hard to find consensus when it comes to airport rankings. Slideshows here, listicles there — it's always a matter of conjecture and PR. That's why the Lexalytics marketing team set out to define the definitive, data-driven ranking of America's 10 busiest airports. Use this as a resource when planning all your travel and layovers.
Google this: “best airports in the US 2018”
Do you see that? 660 million results. There are gigabytes upon gigabytes of articles across the internet about the pros and cons of various US airports. However, there’s rarely consensus between the many listicles, slideshows and travel blogs. So, Lexalytics, an InMoment company, mined social data from ten of the busiest airports in America. The goal? Rate these airports based on actual customer experience signal. The result: a ranked list driven by data science, not editorializations and PR.
After this project concluded, Lexalytics partnered with Gensler’s Los Angeles Aviation and Transportation Studio, the industry leader in global airport architecture. Together, the teams are completing a firm-wide research project that, in the words of Gensler, aims at “leveraging sentiment analysis to inform the planning and design of airports.”
Architects Kate O’Connor, Justin Wortman and Andy Huang from Gensler have been using Lexalytics’ Semantria Storage & Visualization (Semantria SV) to mine social media data about dozens of America’s airports, with the aim of finding the signal in the noise of customer feedback. Through Semantria’s sentiment analysis technology, the team is generating deep, data-driven insights into what travelers and staff value in their airport experience. It bears mentioning that the information and views presented in this article are Lexalytics’ alone. This list does not in any way represent the opinions of Gensler or its affiliates.
Methodology
We took 30,748 Facebook comments from ten of America’s busiest airports and ran them through our Semantria Storage & Visualization platform. A bit of perspective: 30,748 Facebook comments equate to 869,973 words, enough to fill 2,768 pages. That’s more than double the size of War and Peace!
Using natural language processing, we algorithmically sorted the airports based on real customer feedback. In other words, the ranking is based on an airport’s average customer sentiment, rather than opinion or star rating. Want to know more about the factors that influenced each airport’s ranking? Click on each airport’s name to review our deep dive into customer sentiment.
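For the curious, the ranking logic itself is simple enough to sketch in a few lines of Python. The per-comment scores below are invented placeholders; in the actual project, they came from Semantria’s sentiment engine:

```python
# A simplified sketch of the ranking approach: average each airport's
# per-comment sentiment scores, then sort. The scores below are invented
# placeholders, not the real Semantria output.
from statistics import mean

comment_sentiment = {            # airport -> per-comment sentiment scores
    "SFO": [0.6, 0.3, -0.1, 0.5],
    "LAS": [0.4, 0.2, -0.2, 0.3],
    "JFK": [-0.3, 0.1, -0.4, -0.2],
}

ranking = sorted(comment_sentiment.items(),
                 key=lambda item: mean(item[1]),
                 reverse=True)

for rank, (airport, scores) in enumerate(ranking, start=1):
    print(f"{rank}. {airport}  sentiment weight: {mean(scores):+.2f}")
```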
If you read our earlier SFO article, it should come as no surprise that San Francisco International Airport tops the list for customer satisfaction. The airport began using social listening in 2017, making improvements aligned with customer demand. Judi Mosqueda, the Director of Project Management for SFO, oversaw the investment of $7.3 million towards improving the airport’s wayfinding experience. This project addresses a major customer concern. The data show how social mentions of wayfinding at SFO jumped from very negative to very positive within a year. It’s clear that customers can be your best business advisers if you listen properly – using the right tools.
A positive review: “I fly out of SFO for work and fun roughly 30 times per year. Yes there are weather delays, but this airport is truly a pleasure to use! The terminals are all being updated or have recently been updated, their [sic] are improvements in the weather delays, and there are plenty of flight options to choose from. The food has been getting better as the terminal renovations finish, which makes the weather delays more tolerable… The staff is great at serving the customers that pack in as delays stack up. The lounges are where I’ve noticed the biggest changes. American’s Terminal 2 Admirals Club is immaculate! If you have the pleasure of flying through SFO I highly recommend it!”
A negative review: “Flights frequently get delayed due to ‘weather.’ You can play the weather card here and there, but when half your flights throughout the year are delayed due to ‘weather,’ that’s called a ‘scheduling’ problem.”
What happens in Vegas can also skew the feelings of travelers who make their way through Las Vegas McCarran International Airport. Negatively scored reviews often criticize aspects of the city well beyond the control of airport stakeholders. Using a properly tuned social listening tool like Semantria SV, we see that often the negative customer sentiment is aimed at the City of Sin itself. We also learn that many travelers seem to enjoy the idea of having slot machines in the airport, while others found them an unpleasant reminder of past decisions. When it comes to the airport services themselves, like complimentary wifi on the tarmac, customer reviews score very positively. Overall, travelers find themselves engaged, on time, and happy while at LAS.
A positive review: “Flew Virgin America for the first time. Classic funk hit ‘Flashlight’ was blasting through the speakers at ticketing! Gotta give them at least four stars for that alone! Also, won at gambling in the terminal. Even better? You get to enjoy super-fast FREE wifi!”
A negative review: “I liked it but they need to get rid of the slots. After a week n [sic] Vegas I didn’t want to see another slot. I was so ready to go home. Four days is plenty n [sic] that town.”
The deep dive into Seattle-Tacoma International Airport details how the customer experience cuts across departmental silos. Dirty bathrooms, for example, exacerbate complaints about costly food. However, despite some small challenges, Sea-Tac’s positive reviews account for nearly 40% of its data set — not bad. Topics ranging from the attitude and helpfulness of staff to the quality of the (expensive) food all score positively. By using smart NLP solutions to its advantage, Sea-Tac could quite easily make the changes needed to find its way to the top of this list.
A positive review: “A welcome stop in a long trip the food is good a little pricey but I guess that’s to be expected, train ride was nice and quick and the staff was friendly!!!”
A negative review: “Why are all airport’s food so darn expensive? $12 for water and crackers??? Additionally the terminals aren’t equal in terms of food options. Previously I’ve never had an issue with bathrooms but I gotta say this experience was DISGUSTING. Hair everywhere. Gag me. But otherwise it’s great, the staff is very friendly and very helpful!”
As the busiest airport in the world, Hartsfield–Jackson Atlanta International Airport is bound to face some challenges. Through the power of NLP and social listening, it becomes clear that its biggest problem is with the wayfinding experience, despite recent and costly renovations. The data also shows that the saving grace for Atlanta is the staff, who are reviewed as courteous and helpful. Still, no amount of good cheer and manners can make up for the navigation nightmare of trying to find the proper gate.
A positive review: “I had to navigate this huge airport with two small children, luggage, and a carseat [sic]. I can honestly say any attendant who saw me was more than helpful. Directing me to TSA, picking us up from the park and ride, and just being considerate. A large place but the staff is more than capable.”
A negative review: “Nice airport, good restaurants… only complaint — the signage could use a little bit of work. I stepped off the train to [sic] soon because of the confusing signage between terminal T and the baggage claim. (I’ve actually done this twice on two different trips :-D)”
Denver International
Sentiment Weight: +0.15
Thanks to a colorful connection with conspiracy theorists, Denver’s reviews speak of anti-gravity rooms, the Illuminati, and “Blucifer” — the giant bucking bronco sculpture at the entrance to the airport. As detailed in the linked write-up, social listening shows many of these complaints represent a potential path for customer engagement. When you tune the results, the data show that staff get twice as many positive comments as negative ones. They also reveal that while customers don’t mind the Illuminati so much, they could go for some more hooks on bathroom stalls and an improved baggage claim experience.
A positive review: “It is actually an underground Illuminati headquarters. The enormous, hideous blue horse statue with glowing red eyes next to the entrance road killed it’s [sic] sculptor before he finished it. The runway layout looks like a swastika when seen from space. There is a mural in one terminal that shows a child in a coffin. There are gargoyles on the inside of the building… Also, the people that work here are extremely nice and helpful!”
A negative review: “Fantastic airport ruined by terrible bag delays which are common. Waited more than 30 minutes for my bags. Unacceptable.”
In 2017, Dallas-Fort Worth International Airport ranked as both the best and worst airport (on different lists, of course). That it lands here on our list makes perfect sense as customer reviews score mostly neutral. Still, our sentiment analysis reveals that when travelers do get passionate about DFW, it’s usually about cleanliness. Interestingly, a PR director working for a DFW affiliated agency reached out to contest our findings; head over to the deep-dive article to see the exchange — and see the differences between AI and PR. A trove of positive comments highlight DFW’s inter-terminal tram system, Skylink. When properly maintained, Skylink is a unanimous crowd pleaser.
A positive review: “Best airport for kids! Kids play areas and the Skylink! We purposefully will always layover at DFW because it’s always a great experience.”
A negative review: “One of the grossest airports I’ve been to. Carpet in the waiting areas is absolutely filthy, as are the seats in the sky trams as well. Not sure if they even vacuum???”
Chicago O’Hare’s position on this list stands apart from the others, as its sentiment weight is skewed. The subject of a viral national news story, ORD received a flurry of one-star reviews in a short period of time. Using Lexalytics’ web dashboard, Semantria Storage & Visualization, we see occurrences of 1-star Facebook ratings jump from 11% to 58% within days of the incident. The number of 5-star ratings dropped by more than half overnight. The right social listening tool might have made all the difference to Chicago’s standing during that crucial time, a subject we explore in the article.
A positive review: “Flight on time. Security line reasonable. Was not beaten and dragged off the flight I paid for by agents of an unchecked police state. So all in all a better than average experience.”
A negative review: “I’m paying for wifi and I’m using my last amount of battery to write this glowing review of O’Hare, that’s how much I’m disappointed with this airport. As a disclaimer my flight was delayed due to not being able to clear snow from the runways. I won’t fault am airline/airport for weather, but I will fault you for not being able to handle minor precipitation. Not prepared for snow of any kind in December! Concourses are dated. Waiting areas have no outlets… and chairs look like they’re straight out of the 1970’s. Avoid O’Hare at all costs… Also, I figured out that all these large plastic bins scattered around on the floor of the airport are for the crappy leaky roof. Real great look, Chicago.”
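For readers wondering how a review bomb like O’Hare’s shows up in the data, here’s an illustrative Python sketch – with invented dates and ratings – that tracks the daily share of 1-star reviews so a sudden spike stands out immediately:

```python
# Illustrative only: monitor the daily share of 1-star ratings so a sudden
# review bomb stands out. Dates and ratings below are invented.
from collections import defaultdict
from datetime import date

reviews = [                      # (posting date, star rating)
    (date(2017, 4, 8), 5), (date(2017, 4, 8), 4), (date(2017, 4, 8), 1),
    (date(2017, 4, 10), 1), (date(2017, 4, 10), 1), (date(2017, 4, 10), 2),
]

daily = defaultdict(list)
for day, stars in reviews:
    daily[day].append(stars)

for day in sorted(daily):
    ratings = daily[day]
    one_star_share = ratings.count(1) / len(ratings)
    flag = "  <-- possible crisis" if one_star_share > 0.5 else ""
    print(f"{day}: {one_star_share:.0%} one-star across {len(ratings)} reviews{flag}")
```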
Los Angeles International
Sentiment Weight: +0.06
If first impressions last forever, then Los Angeles International Airport is in trouble. Thanks to LA’s famous traffic, the airport faces challenges before customers and staff even arrive at the facility. While some of the responsibility lies with the City of Angels itself, the airport could use — and fortunately is using — AI powered natural language processing to effect landside improvements. The airport recently began deploying text analytics to inform infrastructure changes. Now, myriad improvements are in the works, ranging from a metro link to a tram connecting arrivals to rental car companies. If they follow through, LAX might rank much higher on future lists.
A positive review: “I was pleasantly surprised by how chic the new airport remodel was! Nice little shops, a MAC counter, trendy eateries, a Tumi, and a Frederic M, plus plenty of places to charge my phone and tablet, and although I don’t drink, lots of bars for those that do!”
A negative review: “Food options are great….if you can get there threw [sic] the traffic. Once in the vicinity it is so bottle necked its [sic] a mess. The rental car return is so far away dont [sic] even bother. You teally [sic] have to leave where you are at (if youre [sic] within a 35 miles radius, longer if you are farther) at least 4-5 hours before your plane leaves, just to get to LAX. Onc e [sic] within your terminal its [sic] nice.”
Like Las Vegas McCarran, John F. Kennedy International Airport is as much a cultural landmark as a business. Nonetheless, customers come through by the millions and have many of the complaints you might expect of a giant transit hub. Last year, the New York state government earmarked $7 billion for renovations. They run the risk of throwing good money after bad if they fail to tap into the strategic knowledge afforded by text analytics and social listening. When it comes to positive customer sentiment for JFK, there isn’t much consensus. The airport faces many challenges ahead if it wishes to win customers from its two nearest competitors, LGA and EWR.
A positive review: “My favorite of the major NYC airports (JFK, LGA, EWR). Staff is somewhat friendlier and it’s an all around better environment.”
A negative review: “Beyond frustrated with the lack of breastfeeding/pumping areas anywhere! No, I don’t want to pump at an airport terminal gate or in a booth at one of the terminal restaurants, but thanks for the offer. JFK is one of the busiest airports so not a lot of low traffic areas. Why don’t airports put outlets in the family bathrooms? It is hard enough having to travel with the extra supplies as is. Ugh… Also, no free wifi?”
People really, really don’t enjoy Charlotte Douglas International Airport. The consensus narrative exposed by the analysis reveals a systemic attitude problem among staff. Data extracted from the reviews reflect personnel who don’t appear to care about customer experience. Complaints extend to a variety of other areas as well, from ADA compliance to poor signage and wayfinding design, and the recently defunct bathroom attendant program. Charlotte does stand tall with its communal spaces — central to which is a sunlit atrium appointed with trees and snow-white rocking chairs. If Charlotte begins listening to its customers, it will be better empowered to solve their core challenges. In turn, bonuses like rocking chairs will be seen more as a cherry on top, and less as a manifestation of tone-deaf customer support.
A positive review: “As a connecting airport, it has a way to go to compete with Atlanta in terms of efficiency, services, scope… but I *have* had some respectable longer layovers here that were pleasant enough in a big white rocker with someone playing piano in the terminal gently in the evening. As an embarkation airport, it still kinda sucks but is getting better.”
A negative review: “Horrible!!!! I am Stage 4 Triple Negative Breast Cancer with METS to many bones. I am wheelchair bound as walking is very difficult. One of the people was to take me to terminal 15 and instead dumped me at terminal 8 telling me the terminal has been changed. The terminal was never changed and I was simply deserted. People began asking me if I needed help, these were passengers not employees. Never again will I fly through Charlotte.”
Thank you for flying with us
And so concludes our definitive, data-driven guide to ten of America’s busiest airports. Along the way, we discovered insights about airport architecture, uncovered customer experience best practices, and even learned how to use AI to handle a viral press crisis. It’s clear that when visiting an airport, customers care most about staff attitude, cleanliness, and wayfinding (in that order).
San Francisco International Airport is the gateway to the world's tech capital. In this installment of our airport review analytics series, we see how SFO effectively listens to its customers to guide billion-dollar terminal renovations and make daily improvements.
In this series, we’ve been using text analytics to analyze the social media data around America’s busiest airports. In this installment, we broaden the scope to include Tweets as well as Facebook reviews. To begin, San Francisco International Airport is not having the greatest year. Over a six-month period, SFO had three near-miss aviation accidents, any one of which could have become one of “the worst disasters in aviation history,” according to a Business Insider report. An editorial from the local Mercury News calling for “action” from the federal government reveals that over a 14-month timespan there were two additional near-misses. SFO has been called the “worst” airport to travel through during the holidays by the New York Daily News, and a 2012 study supposedly found that SFO was the “worst” airport in the country when it comes to delayed or cancelled flights.
Nonetheless, upon closer inspection of the airport’s social media, it’s easy to see that stopping at the occasional bad press and travel-column listicle tells only a fraction of SFO’s story; San Francisco excels where many other airports in this series fall down. As an example, take wayfinding, the architectural study of how people orient themselves in physical space and navigate from place to place. Airports are complex buildings to design. In principle, they must connect large, mixed-use spaces through navigational cues intuitive enough to be grasped by cabin crew and children alike. Frequently, however, the design falls short of communicating effectively. SFO is no exception. As one disgruntled traveler put it on Twitter in 2017: “Why aren’t there helpful signs here @flysfo?”
Social listening for airports
San Francisco International Airport actively listened to this feedback, recently pivoting a negative narrative into a positive one. “Yes, we do get comments from passengers [who find themselves lost],” said Judi Mosqueda, SFO Director of Project Management. In response, the airport allocated $7.3 million to remedy the problem throughout the 2.5 million-square-foot space. That was in January, and text analytics already shows a positive shift in customer feedback. One frequent flyer to the United States stopped by the SFO Facebook page to shower praise on the new wayfinding experience: “I find SFO to be one of the easiest airports in the USA to navigate,” they wrote. “Kudos to SFO for consistently providing a super travel experience!”
Building a better delay at SFO
San Francisco International, like all airports, can spark the ire of its customers when acts of nature foil schedules – perhaps more frequently than most, courtesy of the city’s famous fog, Karl. However, the airport aims to mitigate the stress of delays by investing in lounges, a yoga room, complimentary high-speed broadband, museum installations, therapy animals, and more – all of which shows up in the topic sentiment within the social data.
What is more, recent renovations to Terminals 2 and 3 set a strong standard for other American airports to follow. The social media data set is replete with praise for the new terminals, which boast sophisticated art exhibits, stylish seating areas, strong food vendor offerings, and evocative architectural features, with one reviewer describing the airport as “architecturally stunning.”
“Terminal 2 is probably the nicest domestic terminal in the entire country. Spacious, modern, clean and plenty of places to sit + free wifi!” says one reviewer. Another echoes this sentiment on Twitter, pointing out simply that “Terminal 2 is a class act!” Recently, a customer doubled down on this sentiment, urging SFO to begin similar renovations on Terminal 1: “Terminal 2 in SFO… best terminal by far in the USA. Can’t wait for the renovations in Terminal 1!”
While considering the design for Terminal 1’s renovation, which is estimated to be a $2.4 billion project, SFO’s stakeholders and the design firms they work with ought to dig even deeper into the text. Many of SFO’s reviews specifically target the airport’s facilities. A frequent target is the connector passageway between Terminals 2 and 3 – or rather, the lack of one. In 2009, SFO developed a connector passageway between the domestic Terminal 3 and the International Terminal. However, there is no way to navigate between Terminals 1, 2, and 3 without exiting security. Says one aggravated guest, “I had to move from terminal 3 to terminal 1 and I had to get out one terminal and to get into the other one and I had to go through the already tedious, painful and unfriendly/brainless/rude TSA security checkpoint. Why don’t they have a way to move through terminals without passing security.” Being able to move freely between terminals, especially during a stressful delay or layover, makes a monumental difference in a customer’s experience at an airport.
Beating the competition by winning travelers
Text analytics helps airport stakeholders and travelers alike cut through the noise of editorialization by identifying the signal present within actual user data. With this technology, airports like San Francisco can better respond reactively — like in the case of wayfinding — while developing other proactive structural strategies to grow the customer base. What is more, SFO operates in a busy travel corridor where it competes with two other nearby airports, Oakland International and San Jose International, all the while protecting its market share from the behemoth in the south: LAX. This level of competition is not uncommon in the United States. As such, major airport brands need to get an edge where they can.
When an airport company reviews social data, it needs to be able to find the signal in the noise. Airports and the cities they serve are often confusingly interchanged on social media. In this article, we use filters and a custom configuration to see exactly what people are saying about McCarran International Airport.
Pop-quiz: what do you think of when you imagine Las Vegas?
Prostitution, gambling, hotels, and recreational marijuana might come to mind. But rather than fighting these associations, Las Vegas’ McCarran International Airport embraces them.
For example, the airport has “pot amnesty boxes”, where people can dump their legally-purchased weed in the event they’re traveling to a state or country with stricter regulations. And they’ve installed slot machines in the terminal, so travelers can get a jump on their gambling.
Or, as one Facebook reviewer put it, “try for one last hurrah.”
A city and its airport
However, this inextricable link between a city and its airport can pose a problem for a business analyst. For example, we recently sourced thousands of reviews from Las Vegas McCarran International Airport’s Facebook page. While analyzing this data set we discovered an interesting phenomenon: reviews on their Facebook page frequently criticized not just McCarran International, but also the city of Las Vegas itself.
Of course, listening to natural language reviews of Las Vegas is interesting. But it’s not useful for a business analyst tasked with understanding how customers experience the airport.
Finding the signal in the noise
To cut through the noise, we configured an analysis to extract what’s being said about McCarran International Airport based on reviews that mention both McCarran and Las Vegas.
To do this, we used Lexalytics’, an InMoment company, web-based dashboard, Semantria Storage & Visualization (SSV). SSV allows any business person to create configurations and run an analysis, even if they have no previous experience with data analytics.
To start, we used the SSV configuration builder, which lets us easily train the analysis to recognize sentiment in the text data set pertaining to other brands, such as the airlines flying into the airport, or even the city of Las Vegas itself.
First, let’s take a moment to appreciate how the sentiment surrounding “vice” is only positive. In Las Vegas, it seems, vice is virtue!
Now, let’s pull this apart. In this data set, many customers complain about construction on the highway and roads leading to the airport. If our hypothetical business analyst working at the airport doesn’t configure their analysis properly, complaints about this roadwork may impact the sentiment score for McCarran. This will skew the results of the analysis, as civic works, like road construction, are outside the purview of the airport.
However, accounting for this can be tricky. Take this Facebook comment from March 2017, in which a customer complains about road construction:
“Our experience with the airport was overall great no problems at all I just don’t understand why car rentals can’t cooperate and have transportation inside the fence. Then there’s traffic congestion and detours everywhere. A 5 minute trip takes 15-20”
A properly-configured data analytics tool can split this review into its components.
For example, our own Semantria will sort this comment as positive for the airport, while identifying the other entities involved. In this case, “Overall great” adds +0.2 to McCarran’s sentiment score, while “car rentals” and “city infrastructure” get dinged -0.16 and -0.19 respectively.
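To give a rough sense of what that entity-level split looks like, here’s a toy Python approximation. It is not Semantria; the entity keywords and phrase weights are assumptions chosen to mirror the example above:

```python
# A toy approximation of entity-level sentiment, so roadwork complaints don't
# drag down the airport's own score. Entity keywords and phrase weights are
# assumptions chosen to mirror the example above; this is not Semantria.
COMMENT = ("Our experience with the airport was overall great no problems at all "
           "I just don't understand why car rentals can't cooperate and have "
           "transportation inside the fence. Then there's traffic congestion and "
           "detours everywhere. A 5 minute trip takes 15-20")

ENTITY_PHRASES = {
    "McCarran Airport":    {"overall great": +0.2},
    "car rentals":         {"can't cooperate": -0.16},
    "city infrastructure": {"traffic congestion and detours": -0.19},
}

text = COMMENT.lower()
entity_scores = {
    entity: round(sum(w for phrase, w in phrases.items() if phrase in text), 2)
    for entity, phrases in ENTITY_PHRASES.items()
}
print(entity_scores)
# -> {'McCarran Airport': 0.2, 'car rentals': -0.16, 'city infrastructure': -0.19}
```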
Working with airport partners
Within any given airport, customers are exposed to numerous third-party vendors and agents. By tuning our analysis, we can focus on conversations about airlines, rental car agencies, and the TSA — all of which are operated by authorities independent from the airport.
Ultimately, these insights will help airport stakeholders share valuable intel with the brands that act as airport ambassadors every day. Furthermore, an analysis like this allows the airport to drill down into relevant conversations where it might effect change.
McCarran customer insights
Overall, analyzing Facebook reviews of McCarran International Airport shows us a mixed bag of opinions. There are some complaints about the cost of food and beverages (although high prices are arguably inevitable, as the airport shares the retail concession with its restaurant partners, driving prices upward).
A whopping 48% of baggage handling reviews are negative, citing lost, damaged, or delayed luggage. If baggage isn’t delayed, the customers are. Many comments focus on out-of-service doors, people movers, and more.
Says one commenter on Facebook:
“Looked great with the Welcome to Vegas signs BUT couldn’t get to baggage collection as the doors were broken, no airline or airport staff or signage to say how to take a different route. You guys may know it, but visitors don’t!”
Speaking of signage, wayfinding is a consistent problem. As we’ve learned in the past, wayfinding is crucial to the success of an airport.
There are places where McCarran outshines the rest. In 2005, the airport became one of the first to provide complimentary wifi. Thanks to an emphasis on network-friendly infrastructure and regular uptime, airline passengers are able to enjoy a complimentary, unlimited connection even while they’re on the tarmac. Stuck on a grounded flight? Now you may connect to an LAS-branded wifi hotspot and while away the delay. This brand experience goes a long way in promoting customer retention. The emphasis on wifi as a customer experience touchpoint is something an airport company can suss out using text analytics. And, as we’ve pointed out with other examples, this intel can then be baked into the very fabric of the facility.
This fact is reinforced by Samuel G. Ingalls, assistant director of aviation, information systems at LAS: “By the time we started construction on our new Terminal 3, which opened in June 2012, we had a pretty good idea about where to place the Wi-Fi antennas for maximum effectiveness.” The work on expanding network connection onto the tarmac was put to the test in 2015, when 170,000 tech-oriented conference attendees descended on Las Vegas. Mr. Ingalls and his team might’ve used text analytics to mine feedback about the experience of these power users, identifying any problem areas. “I saw many people around the airport with at least three devices,” reported Ingalls. “And we didn’t get any negative feedback from these attendees, who used the Wi-Fi system both inside and outside the terminal. I considered that a very positive sign.”
What should McCarran do with these insights?
McCarran might use this social data to design a 2019 budget aimed at solving problems real customers encounter every day. Natural language data is the single best resource for businesses to make profitable decisions. Now, with tools like Semantria Storage & Visualization, all stakeholders in a business may leverage this resource, even if they have no data analytics experience.
JFK International Airport is about to undergo a massive renovation. To understand what JFK should change and why, we analyzed thousands of traveler reviews using our web-based text analytics platform. Here's what we found.
Consider New York City: the tangy gradient of smells emerging from chocolate shops and beer halls between 18th Street and 14th Street; the dissonance of high heels and sirens pounding against the Upper West Side; plumes of steam on a cold night, seething from deep within the City’s crust. New York is the navel of civilization — a hub where all people meet. To this end, its primary ports of entry, its airports, are unique in their role as ambassadors of the City.
A business and an icon
In this series we’ve examined airports like any other business. But for John F. Kennedy International Airport (JFK), the delineations between retail space, transit hub, and cultural monument are blurred. Analyzing public comments on JFK’s official Facebook page, we found an uncanny trend of users equating the airport to the city as a whole. Unfortunately, the comparison rarely proved positive. This is true even for the locals: “I have lived in NYC for 12 years,” says one man. “This airport is an example of everything wrong in this great city.” So, how might such an airport remedy this reputation crisis?
Here’s to new beginnings
In January 2017, New York Governor Andrew Cuomo announced a $7-10 billion renovation plan for JFK International Airport. While a start date has yet to be announced, the Governor’s office is accepting proposals. How might this portal to New York City respond to customer feedback? We uncovered some rich insights by mining and structuring thousands of free-text reviews from travelers passing through JFK. The body overseeing the renovations, the Airport Master Plan Advisory Panel (AMPAP), might set some criteria based on this qualitative feedback.
Get connected
We ran this Facebook text data set through our web client, Semantria Storage & Visualization (SSV). By viewing Topics, which are query- or model-generated document classifications (in other words, known categories you’re actively looking for), we can see an immediate issue.
Figure 1: JFK topic sentiment polarity.
Notice Internet, the solid red column near the center of the visualization. What might be going on here? As we drill down, we quickly notice something all too familiar to any regular at JFK: wifi. Take it from one foreign traveler:
“How it is possible that one of The biggest airport [sic] of The world dont [sic] Provide free wifi???”
And, from a sardonic American:
“Get free wifi, this place is like a greyhound station ?”
There is, in fact, not a single neutral or positive mention of wifi in the JFK data set. It’s 100% negative. When planning future terminal renovations, AMPAP ought to consider network-optimized architecture as well as sponsored, complimentary wifi.
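For a rough sense of how query-based Topics work, the Python sketch below tags each comment with any topic whose keywords it contains, then tallies sentiment polarity per topic. The topic queries, sample comments, and sentiment labels are assumptions for illustration; in practice the labels would come from an upstream sentiment step:

```python
# A bare-bones sketch of query-based Topics: tag each comment with any topic
# whose keywords appear in it, then tally sentiment polarity per topic.
# Topic queries, comments, and sentiment labels are assumed for illustration.
TOPIC_QUERIES = {
    "Internet": ["wifi", "wi-fi", "internet"],
    "Staff":    ["staff", "agent", "attendant"],
    "Signage":  ["sign", "signage", "wayfinding"],
}

comments = [                     # (comment text, label from an upstream sentiment step)
    ("How is it possible the airport doesn't provide free wifi???", "negative"),
    ("Get free wifi, this place is like a greyhound station", "negative"),
    ("The staff were friendly and helpful", "positive"),
]

tally = {topic: {"positive": 0, "neutral": 0, "negative": 0} for topic in TOPIC_QUERIES}
for text, label in comments:
    lowered = text.lower()
    for topic, keywords in TOPIC_QUERIES.items():
        if any(keyword in lowered for keyword in keywords):
            tally[topic][label] += 1

for topic, counts in tally.items():
    print(topic, counts)
# "Internet" comes out 100% negative, mirroring the JFK finding above.
```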
Kindness is a universal language
A trend we’ve noticed across our airport experiment is how often staff attitude comes up. Staff attitude frequently plays first fiddle in the qualitative reviews.
This pie chart illustrates how much of the JFK data set is dedicated to customer-staff interactions.
All four variants — Attitude, Staff-General, Staff-General-Helpfulness, Staff-General-Attitude — constitute a volume nearly equal to the next 16 topics combined. Furthermore, we begin to notice a troubling situation when we compare this pie chart to the sentiment polarity columns from the first visualization. Frequently, customer-staff interactions result in negative feedback.
For each of these categories, sentiment skews neutral-negative. Frequently, visitors mention how JFK appears understaffed, like this American traveler:
“Not enough staff, the staff you do have are rude, shouting at the public like they’re animals. I will never fly through JFK again. End of story.”
JFK is close to two competing airports, LGA and EWR. This means JFK’s non-aeronautical facilities, such as restaurants and retail stores, are especially susceptible to churn. A disgruntled guest, like the one highlighted above, can have a sphere of social influence encompassing hundreds of potential customers.
Hearing and addressing these concerns is the only way to ensure JFK retains a dedicated user base.
The road ahead for JFK
The AMPAP renovation project will cover a broad scope. But central to the mission ought to be the loud and colorful social media manifesto issued by JFK’s many customers. Staff attitude and wifi aren’t the only discussion topics. Hundreds of JFK reviews point to broken elevators, jammed jetways, confusing signage, and more.
Keeping a finger on these real time data streams will define the projects of the future while maintaining the facilities of today.
What is natural language processing? And what does it mean for you, me and your drunk friend? Seth Redmore explains the fundamental concepts of NLP in 5 minutes or fewer.
What is natural language processing? In short, Natural Language Processing (NLP) is the study of making computers understand how humans naturally speak, write and communicate.
Now that’s out of the way, what does NLP mean for you, me and your drunk friend? Read on to find out.
As CMO at a text analytics company, I’m very interested in the latest data analytics tech. But I realize these topics can be hard to understand. All of the phrases involved can start to sound like plot devices in a Douglas Adams novel. Sentiment analysis, intention detection, machine learning, text analytics, natural language processing… the list goes on.
Still, you need to understand them if you want to bring your business apparatus into the 21st century. With that in mind, here’s a 5-minute primer for all you non-techie types out there.
So, what is natural language processing?
I’ll say it again: Natural Language Processing (NLP) is the study of making computers understand how humans naturally speak, write and communicate.
When it works, natural language processing enables us to interact with computers like we interact with other humans. Think of a customer service chat bot, or the way Google seems to know what you intended even when you type in “beset drkun food enar me”. Both cases are NLP in action (though we could spend hours talking about failing chat bots).
Think about it this way: traditionally, communicating with a computer would require giving it very precise, unambiguous, and highly structured instructions. Moreover, these had to be written in dedicated programming languages, like Java, C++ and Ruby. This meant that, realistically, only trained software engineers could have any hope of managing a computer.
In short: NLP gives you, me, and your drunk friend the ability to tell a computer what to do.
Java? Whatever, where’s that food at?
I know, right? Who wants to deal with complex computer programming first thing in the morning? (Well okay, I know some people who do that every day. But not you or me!)
Remember, we humans don’t speak to each other like we speak to computers. We don’t always follow the rules. Human communication conveys messages in ways that, while structured with grammar, can be imprecise and ambiguous. Often, like with slang or idioms, words and their meaning can vary region to region in the same country.
This can create big problems when you, me, or your drunk friend try to interact with our phones or laptops. Just think of the misspellings! Without NLP, our computers would be clueless.
With this new knowledge, go back to our original question: what is natural language processing? Here’s the answer:
In the messy landscape of human communication, NLP is the technology that bridges the vast gap between structured and natural (real) language.
NLP + ML = Natural Language Machine Learning
One last thing to touch on before we go. Modern natural language processing is based on machine learning. For example, software engineers use machine learning to examine patterns within data, and then draw conclusions on how natural human languages work. By applying these conclusions, machines are able to perform complex text analytics tasks better and more efficiently than before.
Take a sentence like “Our amazing Cloud delivers ASAP.” Here, Cloud is a reference to Cloud Computing. And of course, ASAP is the common acronym for As Soon As Possible. That’s pretty obvious to you and me – but how would a computer know that?
It used to be that a developer would’ve had to go in and manually tell the system to recognize those tangential references and acronyms. But with NLML, a modern text analytics system can figure that out on its own. Finally, the system can easily break the rest of the sentence down into its grammatical elements (“amazing” = adjective, “Cloud” = noun, “delivers” = verb, etc.).
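If you’d like to see a grammatical breakdown like that in action, here’s a quick sketch using the open-source spaCy library (not the Lexalytics engine). It assumes you’ve installed spaCy and its small English model, en_core_web_sm:

```python
# A quick look at part-of-speech tagging with the open-source spaCy library
# (not the Lexalytics engine). Assumes:
#   pip install spacy
#   python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Our amazing Cloud delivers ASAP.")

for token in doc:
    print(f"{token.text:<10} {token.pos_}")
# Typical output: "amazing" tags as an adjective, "Cloud" as a (proper) noun,
# "delivers" as a verb.

# Acronym expansion still often relies on a lookup the system learns or is given;
# a hand-rolled example (rough join, ignores spacing around punctuation):
ACRONYMS = {"ASAP": "As Soon As Possible"}
print(" ".join(ACRONYMS.get(token.text, token.text) for token in doc))
```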
Wrap it up and get some java
So, what is natural language processing? It’s the reason you can order a cup of coffee by telling a chatbot to get you exactly what you want. It’s the reason Google knows you’re looking for late-night drunk food near you, even when you butcher the spelling. And it’s the reason Lexalytics, an InMoment company, exists.
Got that all? Great! Now go order a cup of (brown liquid) java using Whatsapp or something.