In this article, we run through 10 real-life examples of artificial intelligence. Whether you’re a developer, CTO or founder, we’re sure you’ll come away with some interesting insights.
We’re covering the following points:
- Virtual Personal Assistants
- Video Games
- Smart Cars
- Purchase Prediction
- Fraud Detection
- Chat Bots
- Social Networking
- Real Estate
- Flying Drones
- Text Analytics and NLP
Virtual Personal Assistants
In recent years, there has been a surge in the development and implementation of virtual personal assistants. These take many forms and leverage various technologies such as voice recognition and text analysis; some even make decisions for you, such as automatically scheduling meetings based on incoming emails! Let’s explore some of these personal assistants.
Amazon Alexa
Developed by Amazon, Alexa is a VPA made popular by the Amazon Echo and the Amazon Echo Dot. It was released in 2014 and allows you to interact with it simply by speaking to it. Alexa is capable of the following:
- Playing music
- Creating to-do lists
- Setting alarms
- Streaming podcasts
- Playing audio books
- Providing weather updates and other real-time information such as the news
Most devices that have Alexa installed allow users to activate the functionality via a “wake word” such as “Echo” or “Alexa”. Some of the more recent developments, as of May 2017, include food ordering! Using Alexa, one can now order takeout from places such as Domino’s Pizza, Pizza Hut and Just Eat. Starbucks has also announced a private beta for placing pick-up orders.
Alexa is built on Skills. Skills give the platform the ability to understand verbal commands and instructions. They run in the cloud, meaning there is no installation required from an end user’s perspective.
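A Skill’s backend is typically an AWS Lambda function or HTTPS endpoint that receives a JSON request from Alexa and returns a JSON response containing the speech to play. A minimal sketch, assuming a custom intent with the made-up name `GetGreetingIntent`:

```python
def handle_alexa_request(event):
    """Minimal handler for an Alexa Skill request (sketch).

    Alexa sends a JSON document whose request.type is e.g.
    "LaunchRequest" or "IntentRequest"; we answer with plain-text speech.
    """
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {}).get("name")
        if intent == "GetGreetingIntent":   # hypothetical custom intent
            text = "Hello from your first Skill!"
        else:
            text = "Sorry, I don't know that one."
    else:  # LaunchRequest, SessionEndedRequest, ...
        text = "Welcome! Ask me for a greeting."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }
```

The interesting work (converting speech to that JSON request and the response back to speech) happens in the cloud on Amazon’s side; the developer only writes the handler.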
In a drive to increase developer adoption of Alexa Skills and enrich the ecosystem, Amazon has been offering prizes to developers as part of the Alexa Skills Challenge. In this challenge, developers are tasked with writing an Alexa Skill and can win up to $5,000 in cash!
Alexa is popular, but it also has its share of sceptics. This technology could listen to private conversations in the home. Amazon has reassured customers that Alexa-enabled devices only listen to conversations once the “wake word” has been uttered.
Despite this, the device must listen all the time to detect whether the “wake word” has been uttered. Another point worth mentioning is that Amazon uses past recordings to help train Alexa and improve the user experience. These recordings can be deleted, though.
At the end of the day, people who use products like Alexa are trading privacy for convenience. Internet users’ attitudes to privacy have relaxed over the past 5-10 years, driven in part by the increased adoption of social channels.
X.ai Amy
X.ai’s “Amy”, whilst still falling under the category of virtual assistant, is a completely different product to Amazon’s Alexa. If it could be summed up in one sentence, it would be:
The personal assistant who schedules meetings for you
Amy was born out of the founder’s personal pain point of scheduling 1,019 meetings in one year alone. As is often the case, these meetings bounced between the respective parties until a suitable appointment date and time was found.
He figured this pain point must be affecting not just him but other information workers too, so he set out to build a virtual agent that leverages AI to reduce the amount of email ping-pong between work colleagues when trying to schedule meetings.
Amy’s artificial intelligence can interrogate communication and determine whether humans are talking about arranging a meeting. Once it has identified this, Amy will examine each person’s diary, find non-conflicting times and present these to all parties in the email or group message.
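X.ai has not published its algorithm, but the two steps described above (detecting meeting intent, then intersecting free calendar time) can be sketched roughly as follows, assuming hour-granularity diaries:

```python
def mentions_meeting(text):
    """Very rough intent check: does the message talk about scheduling?"""
    keywords = ("meet", "meeting", "schedule", "call", "catch up")
    return any(k in text.lower() for k in keywords)

def free_slots(busy, day_start=9, day_end=17):
    """Return free (start, end) hour ranges given busy hour ranges."""
    slots, cursor = [], day_start
    for start, end in sorted(busy):
        if start > cursor:
            slots.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < day_end:
        slots.append((cursor, day_end))
    return slots

def common_slots(busy_a, busy_b):
    """Intersect both attendees' free windows to find candidate times."""
    result = []
    for a_start, a_end in free_slots(busy_a):
        for b_start, b_end in free_slots(busy_b):
            start, end = max(a_start, b_start), min(a_end, b_end)
            if start < end:
                result.append((start, end))
    return result
```

The real product layers NLP over email threads and handles time zones, preferences and back-and-forth negotiation, but the core calendar intersection looks much like this.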
Download Our Project Specification Template
Video Games
Crude forms of AI in video games have been around for decades. Take the 80s arcade game Pac-Man, for example: each ghost featured a unique form of AI to try and catch the player as they made their way around the maze.
Technology has moved on from the 80s, though, and non-player characters (or NPCs) have become so advanced that entire worlds can now be modelled, rendered and explored.
Games such as GTA and Call of Duty feature rich digital worlds with numerous NPCs that can, and often do, behave in a human-like manner. All of this is possible due to advancements in artificial intelligence.
Consider for a minute Artur Filipowicz, an AI researcher at Princeton University. Filipowicz has been developing software for autonomous vehicles; part of the problem is that the software must be able to recognise a stop sign. These signs can vary in appearance due to weather conditions, or may simply be in need of repair. When a car arrives at a stop sign, it must stop; failing to do so could result in a human fatality.
The image recognition algorithm therefore must be able to identify multiple forms of stop sign.
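One common way to achieve that robustness (a general technique, not necessarily Filipowicz’s exact pipeline) is data augmentation: generating weather-like variations of each training image so the classifier sees many forms of the same sign. A minimal NumPy sketch:

```python
import numpy as np

def augment(image, rng):
    """Produce weather-like variations of a training image (sketch).

    Random brightness shifts mimic lighting changes; Gaussian noise
    mimics rain or sensor grain. Pixel values stay in [0, 1].
    """
    brightness = rng.uniform(0.6, 1.4)          # darker/brighter scenes
    noise = rng.normal(0.0, 0.05, image.shape)  # speckle / rain
    return np.clip(image * brightness + noise, 0.0, 1.0)

# Build an augmented training set from one clean stop-sign image.
rng = np.random.default_rng(0)
clean = np.full((32, 32, 3), 0.8)   # stand-in for a real photograph
augmented = [augment(clean, rng) for _ in range(8)]
```

Rendering the sign inside a game engine under different simulated weather, as Filipowicz did with GTA V, is effectively this idea taken to its logical extreme.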
Filipowicz came up with a novel solution to this problem: GTA V.
In GTA V, the player is immersed in the fictional city of Los Santos, which is loosely based on Los Angeles. During the game’s production, Los Angeles was extensively researched. The team organised field research trips with tour guides and architectural historians, and captured around 250,000 photographs and many hours of video footage. These photographs and footage naturally made their way into the level design.
Filipowicz was then able to alter the game in such a way that his autonomous vehicle software could navigate the graphically rendered roads and, more importantly, identify and respond to stop signs as if it were in real life.
Smart Cars
Driverless cars and driverless lorries have been making headlines recently. Companies such as Google, Uber, Apple, Volkswagen and Mercedes are investing heavily in self-driving automobiles powered by artificial intelligence.
In 2016, by leveraging AI, San Francisco start-up Otto (owned by Uber) successfully delivered 50,000 cans of Budweiser. From a commercial perspective, integrating AI into long-haul trucking routes will yield cost savings and has the potential to save lives – AI routines don’t suffer from fatigue.
Gartner predicts that by 2020 there will be approximately 250 million cars connected to each other via WiFi, allowing them to communicate with each other on the roads. That’s not too far away at the time of writing this blog post (2017).
Consumer Analytics and Forecasting
Machine learning and artificial intelligence have been used for years to help businesses forecast demand and set prices dynamically. Back in 2013, Amazon patented “predictive stocking”; the idea behind this shipping system is to reduce delivery times by predicting what consumers will want before they have even bought it.
One example pre-shipping scenario:
“a method may include packaging one or more items as a package for eventual shipment to a delivery address, selecting a destination geographical area to which to ship the package, shipping the package to the destination geographical area without completely specifying the delivery address at the time of shipment, and while the package is in transit, completely specifying the delivery address for the package.”
Estimating the price-to-sales ratio (or price elasticity) can be difficult for retailers; artificial intelligence, however, makes price optimization easier. It does this by correlating pricing trends with sales trends, and can also factor in other variables such as stock levels. Data scientist Mohammad Islam wrote an article which explains this concept in more detail.
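As a sketch of the underlying idea, price elasticity can be estimated from historical price and sales data with a log-log linear fit (the figures below are invented):

```python
import numpy as np

# Hypothetical weekly observations: price charged and units sold.
prices = np.array([10.0, 11.0, 12.0, 13.0, 14.0, 15.0])
units  = np.array([200, 182, 168, 156, 146, 137])

# In the constant-elasticity model  units = a * price^e,
# the elasticity e is the slope of a linear fit in log-log space.
elasticity, intercept = np.polyfit(np.log(prices), np.log(units), 1)
print(f"estimated price elasticity: {elasticity:.2f}")
```

A production system would fold in stock levels, seasonality and competitor prices as extra regressors, but this is the correlation between pricing trends and sales trends in its simplest form.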
Fraud Detection
Business rules and reputation lists have existed for decades, and many organizations implement them today to identify fraudulent behaviour. A rule contains a statement that is both readable by a human and understandable by a computer. For example, a bank may create a rule that says something like:
“If the customer is purchasing a product that costs greater than $1,500, their location is in Yemen, and they signed up less than 24 hours ago, then block the transaction.”
Rules like this are static and, over time, can be gamed. By adopting a brute-force approach, criminals can try different combinations of location, monetary value and so on.
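Expressed in code, the example rule above might look like this (a sketch; the function and parameter names are illustrative):

```python
def block_transaction(amount_usd, country, account_age_hours):
    """Static fraud rule from the example above: block only when all
    three conditions hold at once."""
    return (amount_usd > 1500
            and country == "Yemen"
            and account_age_hours < 24)
```

The weakness is visible in the code itself: change any one hard-coded condition (wait 25 hours, spend $1,499, route through another country) and the transaction sails through.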
Artificial intelligence is changing this, though. By implementing supervised machine learning (SML), the machine can learn from historical datasets that contain fraudulent transactions. The machine can then identify specific patterns that represent a typical fraudulent transaction, whether it be the location, quantity or type of product.
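As an illustration of the idea (not any vendor’s actual system), a tiny logistic-regression scorer can be trained on labelled historical transactions with nothing more than NumPy; the data below is invented:

```python
import numpy as np

def train_fraud_model(X, y, lr=0.1, epochs=2000):
    """Tiny logistic-regression fraud scorer trained by gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted fraud probability
        grad = p - y                             # error on each example
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def fraud_probability(x, w, b):
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Toy historical data: [amount in $1000s, account age in days].
X = np.array([[0.1, 5.0], [0.2, 3.0], [2.5, 0.1], [3.0, 0.05],
              [0.3, 8.0], [2.8, 0.2]])
y = np.array([0, 0, 1, 1, 0, 1])   # 1 = known fraudulent transaction
w, b = train_fraud_model(X, y)
```

Unlike the static rule, the learned weights shift as new labelled transactions arrive, so the model adapts when criminals change their combinations.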
When applying for credit, whether it be a loan or a credit card, banks must determine whether each customer is creditworthy. Other variables are calculated too, such as the credit limit, interest rate and maximum amount the customer can obtain. Today’s consumers expect near-instant decisions, and AI and machine learning are helping drive this.
To help banks make more informed credit decisions and determine the risk of lending to a customer, FICO is using machine learning. Researchers at MIT also found that by using machine learning, banks could reduce the number of delinquent customers by up to 25%.
Chat Bots
Historically, chatbots offered rudimentary answers to simplistic questions; most of this was achieved by identifying specific keywords and returning a canned response. This was often frustrating for users, but artificial intelligence is transforming this field.
Advancements in Natural Language Processing and machine learning allow chatbots to understand the semantic orientation of each word in a sentence and derive true meaning. This allows the chatbot to build some context around what a customer is talking about, and to ask relevant questions or provide solutions to customer queries.
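Production chatbots use far richer NLP models, but the step up from single-keyword matching can be illustrated with a toy intent classifier that scores whole-sentence word overlap instead (all intent names and example phrases below are made up):

```python
def tokenize(text):
    return set(text.lower().replace("?", "").replace("!", "").split())

# Each intent is described by an example utterance.
INTENTS = {
    "check_balance": "what is my account balance",
    "reset_password": "I forgot my password please reset it",
    "opening_hours": "when are you open what are your opening hours",
}

def classify(message):
    """Pick the intent whose example wording overlaps most with the
    message (Jaccard similarity over word sets)."""
    words = tokenize(message)
    def score(example):
        ex = tokenize(example)
        return len(words & ex) / len(words | ex)
    return max(INTENTS, key=lambda name: score(INTENTS[name]))
```

Because the whole sentence is compared rather than one trigger word, “how do I reset my password” and “I can’t log in, need a new password” both land on the same intent; modern systems replace the word-set similarity with learned sentence embeddings.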
Bank of America
One of the largest U.S. banks is using a voice- and text-enabled chatbot called Erica. Erica can send customers notifications and help them make better financial decisions.
JPMorgan Chase
JPMorgan Chase launched a bot called COIN which allows the bank to analyse complex legal contracts faster and more efficiently than a human ever could.
COIN can also undertake the following tasks:
- parse emails for employees
- grant access to software systems
- reset passwords
Social Networking
With almost 2 billion users on the platform, Facebook owns one of the largest datasets on the planet. Its users share vast quantities of content, whether in text, image or video format.
Consider the uploading of a photograph: Facebook will automatically highlight faces and suggest friends to tag from within the user’s social graph.
But how can Facebook do this in near real-time? You’ve guessed it, AI.
By leveraging facial recognition software and neural networks, Facebook can identify, with reasonable accuracy, who each person is. Facebook acquired the Israeli facial recognition firm Face.com in 2012 for a reported $55-60 million, which has helped drive this. Facebook has also been investing in this technology internally.
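Facebook’s production system is proprietary, but the core matching step of any embedding-based face recogniser can be sketched as a nearest-neighbour search over stored face embeddings (the names, vectors and threshold below are invented; real embeddings have hundreds of dimensions):

```python
import numpy as np

def best_match(face_embedding, friends, threshold=0.8):
    """Suggest the friend whose stored embedding is most similar to the
    detected face (cosine similarity); return None below the threshold."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    name, score = max(((n, cosine(face_embedding, e))
                       for n, e in friends.items()),
                      key=lambda pair: pair[1])
    return name if score >= threshold else None

# Toy "social graph" of known faces.
friends = {
    "alice": np.array([0.9, 0.1, 0.2]),
    "bob":   np.array([0.1, 0.9, 0.3]),
}
```

The neural network’s job is to produce embeddings where the same person’s photos land close together; once it has, tag suggestion reduces to this similarity lookup.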
Read How We Helped a Marketing Company to Build a Back-Office Custom Ads Dashboard
In 2015, Snapchat introduced “facial filters”. These track facial movements and allow users to add digital masks that overlay and move with their faces. The AI technology behind them was originally developed by a Ukrainian company called Looksery, which holds patents on using machine learning to track movements in video.
Real Estate
Elements of artificial intelligence have been used in real estate for quite some time now; property listing platforms can match buyers to new properties within minutes of them being shared online. It goes further than simply matching on keywords.
Roof Ai sets out to change real estate by integrating artificial intelligence into the heart of all real estate activities. Some of its features include, but are not limited to:
- task automation
- lead generation
- Facebook messenger integration
“Roof Ai is an AI-powered messaging service that enables smart conversations between real estate businesses and their customers.
The messaging service is backed by a proprietary CRM. The CRM is used by the real estate teams to manage the requests and assist the chatbot in case a human intervention is needed. It’s also an analytics tool that helps them monitor everything that is happening on our messaging channels.
Most real estate websites struggle to achieve a 2% conversion rate. And the main reason for that is the lack of engagement with the visitors on these sites. Roof Ai helps increase conversion by a factor of 8.”
Broker vs Bot
Inman, a real estate publication, launched a challenge affectionately titled “Broker vs Bot”. The challenge, which was conducted in Denver, asked a local real estate journalist to play the role of “buyer” and select three homes that he liked from local real estate listings.
Then, on three separate dates, Inman asked three separate real estate brokers to compete against a bot, “Find More Genius”, to recommend homes similar to one of the homes selected by the buyer. The “buyer” was then asked which of the recommended homes he preferred. On all three dates, Find More Genius’ choices were selected.
Does this mean that AI will replace agents?
Flying Drones
You’re probably familiar with pilotless drones; they’ve been used by the military for years now. In recent years, drones have also made the switch from the military world to the civilian world.
Let’s explore some examples of how drones are using artificial intelligence.
An engineer at the Technical University of Delft in the Netherlands, one of the world’s leading drone research hubs, started to investigate whether drones could reach a heart attack patient faster than an ambulance.
By working with ambulance services in Amsterdam, Alec Momont established that the typical response time for a cardiac arrest call is approximately 10 minutes.
Momont went on to build a drone prototype that ships with a DIY defibrillator, aiming to reach the patient within six minutes. Momont’s vision is for drones to be part of a wider emergency services response team: someone witnessing a heart attack could call 112 (the equivalent of 911 in the US or 999 in the UK) and the call handler would dispatch a drone. Using a two-way video link on the drone, a medic could talk the witness through the necessary steps of using the defibrillator.
One can see the obvious advantage of having such technology in rural or difficult-to-reach locations.
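The arithmetic behind the six-minute target is simple. Assuming a cruise speed of around 100 km/h (reported figures for Momont’s prototype are in this range) and ignoring take-off time and wind, flight time scales linearly with distance:

```python
def response_time_minutes(distance_km, speed_kmh):
    """Straight-line flight time, ignoring take-off, landing and wind."""
    return distance_km / speed_kmh * 60

# A patient 6 km away could be reached in roughly 3.6 minutes,
# well under the 10-minute average ambulance response time.
print(response_time_minutes(6, 100))
```

In practice obstacles, regulations and dispatch time eat into that margin, which is why the stated goal is six minutes rather than the raw flight time.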
Zero Zero Robotics’ “Lily Camera”, which pitches itself as a “throw and shoot” camera, hovers in the air and is powered by bespoke artificial intelligence which has been coined “Embedded AI”.
The firm developed proprietary technology that fuses a suite of state-of-the-art AI with a PCB the size of two US quarters!
Weighing only 238 grams, the self-flying camera can be carried around; it’s like having your own self-flying personal photographer. Once in the air, the drone automatically finds and follows you (its owner), while recording your everyday life from a completely new angle. It’s all possible thanks to advanced artificial intelligence and facial recognition algorithms.
In December 2016, British farmer Richard Barnes received an order placed on Amazon for a bag of popcorn and an Amazon Fire TV Stick.
What was different about this delivery? It was made by an autonomous Amazon Prime Air drone, which dropped the package off just 13 minutes after the order was placed.
Text Analytics and NLP
Text Analytics and NLP are intertwined: without Natural Language Processing (NLP), the machine can’t determine the semantic orientation of words (make sense of their order and what they mean).
NLP allows humans to communicate with machines in natural language. Let’s explore some examples of text analytics and NLP.
Customer Reviews and Sentiment Analysis
Consumers often leave comments or reviews on the products or services they purchase. Quite often, the volume of user-generated data being created is vast and simply cannot be reviewed by a human at scale. This sort of text is also unstructured, which only adds to the problem. Text analytics, NLP and AI, however, are ideal for these sorts of tasks.
By applying techniques such as sentiment analysis and Part-of-Speech (POS) tagging, businesses can find out how consumers feel about their product, brand or service.
In recent years, we have seen the democratisation of sentiment analysis, in that it’s now offered as-a-service. Some of the companies that offer this sort of functionality include, but aren’t limited to:
- Monkey Learn
- Social Opinion
They offer REST APIs that integrate easily with your existing software applications. For example, using the publicly available Sentiment Analysis REST API from UK start-up Social Opinion, we can pass in the text “this phone is awesome”. The API identifies the text as expressing positive emotion, with a 64% probability of that being true.
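The exact request and response formats vary by provider, but an exchange of this kind typically looks like the following sketch (the field names here are hypothetical, not Social Opinion’s documented schema):

```python
import json

# Hypothetical request body: the text we want scored.
request_body = json.dumps({"text": "this phone is awesome"})

# A canned response of the kind such an API might return.
raw_response = '{"sentiment": "positive", "probability": 0.64}'
result = json.loads(raw_response)

print(f'{result["sentiment"]} ({result["probability"]:.0%})')
```

Because the heavy lifting happens server-side, integrating sentiment analysis reduces to building one JSON payload and parsing one JSON response.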
This is probably one of the more mature forms of artificial intelligence and machine learning in operation online. Have you ever looked at products on Amazon, then moments later noticed similar products being displayed in your Facebook or Twitter feed?
By tracking what you “Like”, what you’ve viewed, and the comments you post and share, machine learning can, with relative accuracy, place marketing creatives in your news feed on social channels, thereby improving conversion rates for businesses.
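A heavily simplified sketch of that matching step: represent a user’s tracked activity as interest tags and rank candidate ads by tag overlap (all names and tags below are invented; real systems learn dense embeddings and predict click probability):

```python
def ad_score(user_interests, ad_tags):
    """Score an ad by how many of its tags match the user's tracked
    interests (likes, views, shares) -- a toy relevance model."""
    return len(set(user_interests) & set(ad_tags))

def pick_ad(user_interests, ads):
    """Choose the most relevant ad from a candidate pool."""
    return max(ads, key=lambda ad: ad_score(user_interests, ads[ad]))

# Toy candidate pool of ads and their targeting tags.
ads = {
    "running_shoes":  ["fitness", "running", "shoes"],
    "guitar_lessons": ["music", "guitar"],
}
```

Swap the tag overlap for a learned click-through-rate model and run an auction over the scored candidates, and you have the skeleton of a modern ad-serving pipeline.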
AdTech is such a lucrative vertical that companies such as Twitter have launched developer initiatives like #Promote to encourage the development of AI based software to drive online sales.
In this post, we’ve covered 10 examples of artificial intelligence in the real world, from virtual personal assistants to smart cars to flying drones and much more.
Businesses will continue to build innovative solutions to complex problems, and developments in artificial intelligence show no signs of slowing down.
Thanks for reading this blog and, if you’ve enjoyed it, please feel free to share it with your colleagues, friends or anyone else you think might be interested in it. You can also subscribe to our blog to get the latest updates.
Latest posts by Jamie Maguire (see all)