Introduction
Natural language processing (NLP) is a subset of artificial intelligence used to bridge the communication gap between computers and humans. It was born from the idea of machine translation (MT) during World War II. The original goal was to convert one human language into another, for example translating Russian into English with a computer, but from that grew the idea of converting human language into computer language and vice versa, so that communicating with a machine becomes easy. In simple terms, a language can be understood as a set of rules and symbols: the symbols are combined to convey information, and the rules govern how the symbols are manipulated. The field of natural language processing is divided into two sub-areas, Natural Language Generation and Natural Language Understanding, which, as the names suggest, deal with generating and understanding text. The graphic below outlines these points. Don't be confused by new terms like phonology, pragmatics, morphology, syntax and semantics; let's examine them briefly (a short code sketch after the list illustrates two of these levels):
- Phonology - This science deals with the patterns of sound in language, treating sound as a physical entity.
- Pragmatics - This science examines the various uses of language in context.
- Morphology - This science deals with the structure of words and the systematic relationships between them.
- Syntax - This science deals with the structure of sentences.
- Semantics - This science deals with the literal meaning of words, phrases and sentences.
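To make the syntax and morphology levels above a little more concrete, here is a minimal sketch using the NLTK library. The library choice and the example sentence are illustrative assumptions, not part of the original article: tokenization splits the text into word-level symbols, and part-of-speech tagging labels each token with its grammatical role.

```python
# Minimal illustration of two NLP analysis levels: tokenization (morphology) and POS tagging (syntax).
# Assumes NLTK is installed (pip install nltk); resource names may differ slightly across NLTK versions.
import nltk

nltk.download("punkt", quiet=True)                       # tokenizer model
nltk.download("averaged_perceptron_tagger", quiet=True)  # part-of-speech tagger model

sentence = "Computers understand human language through natural language processing."

tokens = nltk.word_tokenize(sentence)   # split the raw text into word-level symbols
tags = nltk.pos_tag(tokens)             # assign a part-of-speech tag to each token

print(tokens)
print(tags)  # e.g. [('Computers', 'NNS'), ('understand', 'VBP'), ...]
```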
The ability of machines to understand and interpret written or spoken human language. Click here to explore NLP techniques and applications
What is the history of natural language processing?
The history of natural language processing is described below:
The beginning
As mentioned above, the idea arose from the need for machine translation in the 1940s, when the original language pair was English and Russian. Work on other languages, such as Chinese, emerged in the early 1960s. Then came a bad time for MT/NLP: the 1966 ALPAC report was so critical of the research of the time that the field almost died out. The situation improved again in the 1980s, when related products started delivering results to clients. After its near-death in the 1960s, the field was given new life as the idea of, and need for, artificial intelligence emerged.
LUNAR, created in 1978 by W.A. Woods, could analyze, compare and evaluate chemical data on the composition of moon rock and soil obtained during the Apollo missions and answer related questions. In the 1980s, computational grammar became a very active area of research, tied to the science of reasoning about meaning and to accounting for the user's beliefs and intentions. In the 1990s, the pace of growth in NLP increased, and grammars, tools and practical resources related to NLP became available along with parsers.
Research into central, forward-looking topics such as word sense disambiguation and statistically colored NLP, together with work on the lexicon, set a new research direction. Along the way, other essential topics were added, such as statistical language processing, information extraction and automatic summarization.
The first chatbot - ELIZA
The discussion of this history cannot be considered complete without mentioning ELIZA, a chatbot program created by Joseph Weizenbaum that ran at the MIT Artificial Intelligence Laboratory from 1964 to 1966. It used a script called DOCTOR, which simulated a Rogerian psychotherapist and applied rules to respond to users' psychological queries. It was one of the few chatbots of its time capable of attempting the Turing test.
The government is gradually incorporating modern emerging technologies into its architecture. Click here to explore the role and use of NLP in government
What are the Applications of Natural Language Processing?
Nowadays everyone wants machines to talk, and the only way a computer can do so is through natural language processing. Take the example of Alexa, Amazon's conversational product. You send it a voice query and it responds in the same way, i.e. by voice. It can be asked to answer questions, search for something, play music or even book a taxi. It looks like magic, but it isn't; the simple diagram below demonstrates the natural language processing pipeline behind Alexa.
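The diagram itself cannot be reproduced here, but the flow it describes, where speech is transcribed to text, the text is matched to an intent and a response is generated, can be sketched in a few lines. This is a hypothetical, rule-based illustration and not Alexa's actual implementation; the transcript string below stands in for the output of a real speech-to-text step.

```python
# Hypothetical sketch of the query flow behind a voice assistant: transcript -> intent -> response.
# A real system would use a speech-to-text model and a trained intent classifier instead of rules.

def detect_intent(transcript: str) -> str:
    """Very small rule-based intent matcher, used only for illustration."""
    text = transcript.lower()
    if "play" in text and "music" in text:
        return "play_music"
    if "book" in text and "taxi" in text:
        return "book_taxi"
    if text.startswith(("what", "who", "when", "where")):
        return "answer_question"
    return "fallback"

def respond(intent: str) -> str:
    responses = {
        "play_music": "Playing your music.",
        "book_taxi": "Booking a taxi for you.",
        "answer_question": "Here is what I found.",
        "fallback": "Sorry, I did not understand that.",
    }
    return responses[intent]

transcript = "Play some music"              # stands in for a speech-to-text result
print(respond(detect_intent(transcript)))   # -> Playing your music.
```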
Alexa is not a unique example; these talking machines, popularly known as chatbots, can even manage complicated interactions and streamline business-related processes. Historically, chatbots were only used for customer interaction with limited conversational skills, as they were generally rule-based, but with the advent of natural language processing and its integration with machine learning and deep learning, chatbots can now cover areas such as human resources and healthcare. This isn't the only use case where NLP comes across as a game changer; there are other examples too. Let's take a quick look at them. Below you will find descriptions of some use cases.
- Medical assistance
- Sentiment analysis
- Cognitive analysis
- Spam detection
- Recruitment
- Conversational frameworks
Medical assistance
Amazon Comprehend Medical is a service that can extract diseases and medications and monitor treatment outcomes from clinical study reports, electronic medical records and patient notes. This is an example of NLP in health analytics, where language processing and pattern-recognition methods applied to the patient's speech and electronic medical record make it possible to predict various diseases.
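As a hedged illustration of how such a service is typically called, the sketch below uses the boto3 client for Amazon Comprehend Medical to pull medical entities out of a short clinical note. It assumes AWS credentials and region are already configured, and the note text is invented.

```python
# Sketch: extracting medical entities from a clinical note with Amazon Comprehend Medical.
# Assumes boto3 is installed and AWS credentials/region are configured for the account.
import boto3

client = boto3.client("comprehendmedical")

note = "Patient reports chest pain and was prescribed aspirin 81 mg daily."

result = client.detect_entities_v2(Text=note)

for entity in result["Entities"]:
    # Each entity carries its text span, a category (e.g. MEDICAL_CONDITION, MEDICATION)
    # and a confidence score.
    print(entity["Text"], entity["Category"], round(entity["Score"], 2))
```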
Sentiment analysis
Businesses and organizations now focus on different ways of getting to know their customers in order to give them a personal touch. Sentiment analysis, which NLP makes possible, can be used to determine the mood behind the words. It offers a wealth of insight into customer behaviour and choices, which can be treated as key decision-making inputs.
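A minimal sketch of sentiment analysis, here using NLTK's VADER analyzer as one of many possible tools; the review texts are invented. Each text receives a polarity score that can be aggregated across customer feedback.

```python
# Sketch: scoring the sentiment of customer feedback with NLTK's VADER analyzer.
# Assumes NLTK is installed; the review texts are hypothetical.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

reviews = [
    "The support team was quick and very helpful.",
    "Delivery was late and the product arrived damaged.",
]

analyzer = SentimentIntensityAnalyzer()
for review in reviews:
    scores = analyzer.polarity_scores(review)  # neg / neu / pos / compound
    label = "positive" if scores["compound"] > 0 else "negative"
    print(f"{label}: {review}")
```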
Cognitive analysis
This is a good example of different technologies working together under the artificial intelligence umbrella. Conversational frameworks that can receive voice or text commands are now possible, and using cognitive analysis, various technical processes can be automated, e.g. creating a ticket for a technical problem and then processing it automatically or semi-automatically. Working together, these technologies can fully automate the handling of technical issues within an organization, or give customers automated resolutions for some technical problems.
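A rough, hypothetical sketch of the command-to-ticket flow described above: a free-text request is matched to an issue category and turned into a structured ticket record. A production system would replace the keyword rules with a trained classifier and a real ticketing API.

```python
# Hypothetical sketch: turning a free-text request into a structured support ticket.
from datetime import datetime, timezone

ISSUE_KEYWORDS = {
    "password": "account_access",
    "login": "account_access",
    "crash": "application_error",
    "slow": "performance",
}

def create_ticket(message: str) -> dict:
    """Map a user message to an issue category and return a ticket record."""
    category = next(
        (cat for keyword, cat in ISSUE_KEYWORDS.items() if keyword in message.lower()),
        "general",
    )
    return {
        "category": category,
        "description": message,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }

print(create_ticket("The app keeps crashing when I open a report."))
```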
Provide more useful results from your text, enable smart search, and facilitate sentiment analysis. Source: Amazon NLP
Spam detection
Tech giants like Google and Yahoo use NLP to classify and filter out suspicious spam emails. This process is known as spam detection or spam filtering, and it results in an automated pipeline that classifies an email as spam and prevents it from reaching your inbox.
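A toy-scale sketch of how such a filter can be built: a bag-of-words model with a Naive Bayes classifier trained on labeled examples. The example messages and labels are invented, and real filters train on far larger corpora and use additional signals.

```python
# Toy sketch of a spam filter: TF-IDF features + Naive Bayes classifier (scikit-learn).
# The training messages and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "Win a free prize now, click this link",
    "Limited offer, claim your reward today",
    "Meeting rescheduled to 3pm tomorrow",
    "Please review the attached project report",
]
labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["Claim your free reward link"]))               # likely ['spam']
print(model.predict(["Can we review the report at the meeting?"]))  # likely ['ham']
```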
Recruitment
NLP can also be used in the sourcing and screening phases of the hiring process. A chatbot can even handle entry-level, job-related inquiries, which includes identifying the skills required for a specific role and managing placement tests and exams.
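A small, hypothetical sketch of the screening step: the skills required for a role are compared against the skills mentioned in a candidate's text, giving a simple match score. Real systems use richer parsing and ranking than plain keyword overlap.

```python
# Hypothetical sketch: matching a candidate's text against the skills required for a role.
REQUIRED_SKILLS = {"python", "sql", "machine learning", "communication"}

def skill_match(candidate_text: str) -> float:
    """Return the fraction of required skills mentioned in the candidate's text."""
    text = candidate_text.lower()
    found = {skill for skill in REQUIRED_SKILLS if skill in text}
    return len(found) / len(REQUIRED_SKILLS)

resume = "Experienced in Python and SQL, with a background in machine learning projects."
print(f"Match score: {skill_match(resume):.0%}")  # -> Match score: 75%
```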
Conversational frameworks
This technology and the devices associated with it are very popular nowadays. The Alexa pictured above is one of them, but there are also Apple's Siri and Google's OK Google, which are examples of the same type of use case.
What does the future of natural language processing look like?
With exponential growth, the market is expected to hit the $16 billion mark by 2021 at a compound annual growth rate of 16%. The reasons behind this growth are the rise of chatbots, the desire to mine customer insight, the shift in messaging technology from manual to automated, and the many other tasks that need to be automated and will eventually involve speech or text. As mentioned above, the functionality revolves around language, i.e. words in their basic raw form. Regardless of the means of communication, oral or written, words are the fundamental unit. But there does seem to be a gap in NLP performance between text and voice; that challenge will undoubtedly be addressed in the near future. Let's look at the different scenarios.
A subset of artificial intelligence whose demand increases as its sub-technologies improve each day. Click here to explore business applications of NLP
Evolution from human-computer interaction to human-computer conversation
In the case of pure interaction, a single medium can be used, which may be either verbal or non-verbal communication. But for conversation, the verbal and non-verbal media need to work together. It is believed that, with the development of natural language processing and biometrics, machines such as humanoid robots will acquire the ability to read facial expressions and body language as well as words. Achieving this requires integrating several modern technologies, such as human user recognition, sentiment analysis, recommendation analysis, and techniques for dynamic participation in conversations.
The first critical part of the NLP advancements - Biometrics
Non-verbal communication includes body language, touch, gestures and facial expressions. So, to bring non-verbal communication into play, biometrics such as face recognition, fingerprint scanners and retina scanners are needed. These biometrics are already a main security feature on laptops, tablets and smartphones, and the same techniques can be used to find patterns in human facial expressions in order to recognize feelings and emotions. Just as different words are combined to form a complete sentence, different micro-expressions are combined to show feelings in a conversation. These micro-expressions are the key to distinguishing between feelings and emotions, and if they can be combined with natural language processing units, this integration could unlock a whole new layer of interaction between humans and machines.
The second critical part of the NLP advancements - Humanoid Robotics
Every soul needs a body to express itself. Likewise, a physical entity is needed to carry the advancement of NLP into proprietary and commercial environments. Thinking devices such as iPads, interactive televisions and dedicated conversational devices (such as Siri and Google Home) have started to cover this area, but they are still scratching the surface, limited to certain senses (hearing, speaking, some sight, but not touch). The interaction must become bidirectional, and the fourth sense, touch, must also be included, as when one person talks face to face with another. Humanoid robots are the premise for this type of communication, as they can be the body for a programmed artificial soul. As NLP and biometrics gain pace and precision, these technologies could take research on humanoid robots to a new level, allowing them to express themselves through movements, postures and expressions.
Leverage AI-powered NLP capabilities to offer businesses sentiment analysis, insight mining, intent recognition, and text categorization solutions. Discover XenonStack NLP services
A holistic strategy
To learn more about the use and application of natural language processing in different fields, we recommend the steps below:
- Learn more about Google Cloud natural language solutions
- Know the differences between NLP, NLU and NLG