Category: 07. Subsets of AI

  • NLP Tutorial

    This NLP tutorial provides basic and advanced concepts of Natural Language Processing. It is designed for both beginners and professionals.

    • What is NLP?
    • History of NLP
    • Advantages of NLP
    • Disadvantages of NLP
    • Components of NLP
    • Applications of NLP
    • How to build an NLP pipeline?
    • Phases of NLP
    • Why is NLP Difficult?
    • NLP APIs
    • NLP Libraries
    • Difference between Natural language and Computer language

    What is NLP?

    NLP stands for Natural Language Processing, a field at the intersection of computer science, human language, and artificial intelligence. It is the technology that machines use to understand, analyse, manipulate, and interpret human languages. It helps developers organize knowledge for tasks such as translation, automatic summarization, Named Entity Recognition (NER), speech recognition, relationship extraction, and topic segmentation.

    History of NLP

    (1940-1960) – Focused on Machine Translation (MT)

    Work on natural language processing began in the 1940s.

    1948 – The first recognisable NLP application was introduced at Birkbeck College, London.

    1950s – In the 1950s, there was a conflicting view between linguistics and computer science. Chomsky then published his first book, Syntactic Structures, and claimed that language is generative in nature.

    In 1957, Chomsky also introduced the idea of Generative Grammar, which consists of rule-based descriptions of syntactic structures.

    (1960-1980) – Flavored with Artificial Intelligence (AI)

    Between 1960 and 1980, the key developments were:

    Augmented Transition Networks (ATN)

    An Augmented Transition Network is an extension of the finite state machine, augmented with recursion and registers so that it can recognize language structures beyond plain regular languages.

    Case Grammar

    Case Grammar was developed by linguist Charles J. Fillmore in 1968. Case Grammar uses languages such as English to express the relationship between nouns and verbs by means of prepositions.

    In Case Grammar, case roles can be defined to link certain kinds of verbs and objects.

    For example: “Neha broke the mirror with the hammer.” Here, case grammar identifies Neha as the agent, the mirror as the theme, and the hammer as the instrument.

    Between 1960 and 1980, the key systems were:

    SHRDLU

    SHRDLU is a program written by Terry Winograd in 1968-70. It lets users communicate with the computer and move objects. It can handle instructions such as “pick up the green block” and also answer questions like “What is inside the black box?” The main importance of SHRDLU is that it shows that syntax, semantics, and reasoning about the world can be combined to produce a system that understands natural language.

    LUNAR

    LUNAR is the classic example of a natural language database interface system; it used ATNs and Woods’ Procedural Semantics. It was capable of translating elaborate natural language expressions into database queries and handled 78% of requests without errors.

    1980 – Current

    Until the 1980s, natural language processing systems were based on complex sets of hand-written rules. After 1980, machine learning algorithms were introduced for language processing.

    In the early 1990s, NLP started growing faster and achieved good processing accuracy, especially for English grammar. Electronic text collections were also introduced in the 1990s, providing a good resource for training and evaluating natural language programs. Other factors included the availability of computers with fast CPUs and more memory. A major factor behind the advancement of natural language processing was the Internet.

    Modern NLP now consists of various applications, like speech recognition, machine translation, and machine text reading. Combining all these applications allows artificial intelligence to gain knowledge of the world. Consider the example of Amazon Alexa: you can ask Alexa a question, and it will reply to you.


    Advantages of NLP

    • NLP helps users ask questions about any subject and get a direct response within seconds.
    • NLP offers exact answers to questions; it does not provide unnecessary or unwanted information.
    • NLP helps computers communicate with humans in their own languages.
    • It is very time efficient.
    • Many companies use NLP to improve the efficiency and accuracy of documentation processes and to identify information in large databases.

    Disadvantages of NLP

    A list of disadvantages of NLP is given below:

    • NLP systems may fail to capture context.
    • NLP systems can be unpredictable.
    • NLP may require more keystrokes.
    • NLP systems struggle to adapt to new domains; they have a limited function, which is why an NLP system is typically built for a single, specific task.

    Components of NLP

    There are two components of NLP –

    1. Natural Language Understanding (NLU)

    Natural Language Understanding (NLU) helps the machine to understand and analyse human language by extracting the metadata from content such as concepts, entities, keywords, emotion, relations, and semantic roles.

    NLU is mainly used in business applications to understand the customer’s problem in both spoken and written language.

    NLU involves the following tasks –

    • It is used to map the given input into useful representation.
    • It is used to analyze different aspects of the language.

    2. Natural Language Generation (NLG)

    Natural Language Generation (NLG) acts as a translator that converts the computerized data into natural language representation. It mainly involves Text planning, Sentence planning, and Text Realization.

    Note: NLU is more difficult than NLG.

    Difference between NLU and NLG

    • NLU is the process of reading and interpreting language; NLG is the process of writing or generating language.
    • NLU produces non-linguistic outputs from natural language inputs; NLG produces natural language outputs from non-linguistic inputs.

    Applications of NLP

    The following are applications of NLP –

    1. Question Answering

    Question Answering focuses on building systems that automatically answer the questions asked by humans in a natural language.

    2. Spam Detection

    Spam detection is used to detect unwanted e-mails getting to a user’s inbox.

    3. Sentiment Analysis

    Sentiment Analysis is also known as opinion mining. It is used on the web to analyse the attitude, behaviour, and emotional state of the sender. This application is implemented through a combination of NLP and statistics: values (positive, negative, or neutral) are assigned to the text in order to identify the mood of the context (happy, sad, angry, etc.).

    4. Machine Translation

    Machine translation is used to translate text or speech from one natural language to another natural language.

    Example: Google Translator

    5. Spelling correction

    Microsoft provides word processing software, such as MS Word and PowerPoint, with built-in spelling correction.

    6. Speech Recognition

    Speech recognition is used to convert spoken words into text. It is used in applications such as mobile phones, home automation, video retrieval, dictation in Microsoft Word, voice biometrics, voice user interfaces, and so on.

    7. Chatbot

    Implementing chatbots is one of the important applications of NLP. Many companies use chatbots to provide chat services to their customers.

    8. Information extraction

    Information extraction is one of the most important applications of NLP. It is used for extracting structured information from unstructured or semi-structured machine-readable documents.

    9. Natural Language Understanding (NLU)

    It converts large sets of text into more formal representations, such as first-order logic structures, that are easier for computer programs to manipulate.


    How to build an NLP pipeline

    The following steps are used to build an NLP pipeline –

    Step 1: Sentence Segmentation

    Sentence segmentation is the first step in building the NLP pipeline. It breaks a paragraph into separate sentences.

    Example: Consider the following paragraph –

    Independence Day is one of the important festivals for every Indian citizen. It is celebrated on the 15th of August each year ever since India got independence from the British rule. The day celebrates independence in the true sense.

    Sentence Segment produces the following result:

    1. “Independence Day is one of the important festivals for every Indian citizen.”
    2. “It is celebrated on the 15th of August each year ever since India got independence from the British rule.”
    3. “The day celebrates independence in the true sense.”
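
    The segmentation step above can be sketched in a few lines of Python. The rule here is a toy (split after sentence-ending punctuation); production segmenters such as NLTK's sent_tokenize also handle abbreviations like "Dr." and "e.g.":

```python
import re

def segment_sentences(paragraph):
    # Naive rule: a sentence ends at '.', '!' or '?' followed by whitespace.
    parts = re.split(r'(?<=[.!?])\s+', paragraph.strip())
    return [p for p in parts if p]

text = ("Independence Day is one of the important festivals for every Indian citizen. "
        "It is celebrated on the 15th of August each year. "
        "The day celebrates independence in the true sense.")
for s in segment_sentences(text):
    print(s)
```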

    Step 2: Word Tokenization

    Word Tokenizer is used to break the sentence into separate words or tokens.

    Example:

    JavaTpoint offers Corporate Training, Summer Training, Online Training, and Winter Training.

    Word Tokenizer generates the following result:

    “JavaTpoint”, “offers”, “Corporate”, “Training”, “Summer”, “Training”, “Online”, “Training”, “and”, “Winter”, “Training”, “.”
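
    A minimal word tokenizer can be sketched with a regular expression that keeps punctuation as separate tokens; library tokenizers handle many more edge cases (contractions, hyphenation, and so on):

```python
import re

def tokenize(sentence):
    # Match runs of word characters, or any single non-space punctuation mark.
    return re.findall(r"\w+|[^\w\s]", sentence)

tokens = tokenize("JavaTpoint offers Corporate Training, Summer Training, "
                  "Online Training, and Winter Training.")
print(tokens)
```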

    Step 3: Stemming

    Stemming is used to normalize words into their base or root form. For example, the words celebrates, celebrated, and celebrating all originate from the single root word “celebrate.” The big problem with stemming is that it sometimes produces a root word that has no meaning.

    For example, intelligence, intelligent, and intelligently all reduce to the single root “intelligen.” In English, the word “intelligen” has no meaning.
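
    A toy stemmer makes the point concrete. The suffix list below is an illustrative assumption, not a real algorithm such as the Porter stemmer, and it reproduces exactly the problem described above: the stem it returns ("celebrat") is not a real word:

```python
def naive_stem(word):
    # Toy stemmer: strip the first matching suffix. Real stemmers apply
    # ordered rewrite rules; note the result need not be a dictionary word.
    for suffix in ("ing", "ed", "es", "s", "ly"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

for w in ("celebrates", "celebrated", "celebrating"):
    print(w, "->", naive_stem(w))  # all three map to "celebrat"
```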

    Step 4: Lemmatization

    Lemmatization is quite similar to stemming. It is used to group different inflected forms of a word into a single form, called the lemma. The main difference between stemming and lemmatization is that lemmatization produces a root word that has a meaning.

    For example: in lemmatization, the words intelligence, intelligent, and intelligently map to the root word intelligent, which has a meaning.
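
    A lemmatizer can be sketched as a dictionary lookup. The lookup table below is a tiny hand-written illustration; real lemmatizers such as NLTK's WordNetLemmatizer are backed by a full dictionary:

```python
# Toy lemma table (illustrative); a real lemmatizer covers the whole lexicon.
LEMMAS = {
    "intelligence": "intelligent",
    "intelligently": "intelligent",
    "celebrates": "celebrate",
    "celebrated": "celebrate",
    "celebrating": "celebrate",
}

def lemmatize(word):
    # Unknown words fall back to themselves.
    return LEMMAS.get(word, word)

print(lemmatize("intelligently"))
```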

    Step 5: Identifying Stop Words

    In English, there are a lot of words that appear very frequently like “is”, “and”, “the”, and “a”. NLP pipelines will flag these words as stop words. Stop words might be filtered out before doing any statistical analysis.

    Example: He is a good boy.

    Note: When you are building a search engine for a rock band, you should not ignore the word “The.”
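
    Stop-word filtering is a simple set-membership test. The stop-word list below is a small illustrative sample; NLP libraries ship much longer lists per language:

```python
STOP_WORDS = {"is", "a", "an", "and", "the", "of", "to"}  # sample list

def remove_stop_words(tokens):
    # Keep only tokens that are not stop words (case-insensitive check).
    return [t for t in tokens if t.lower() not in STOP_WORDS]

print(remove_stop_words(["He", "is", "a", "good", "boy"]))
```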

    Step 6: Dependency Parsing

    Dependency parsing is used to find how all the words in a sentence are related to each other.

    Step 7: POS tags

    POS stands for parts of speech, which include noun, verb, adverb, and adjective. A POS tag indicates how a word functions, in meaning as well as grammatically, within a sentence. A word can have one or more parts of speech depending on the context in which it is used.

    Example: “Google” something on the Internet.

    In the above example, Google is used as a verb, although it is a proper noun.
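
    A toy POS tagger shows the idea of context-dependent tags. The lexicon and the single context rule below are illustrative assumptions; real taggers use statistical models trained on corpora:

```python
# Illustrative lexicon of default tags; anything unknown defaults to NOUN.
LEXICON = {"google": "PROPN", "something": "PRON", "on": "ADP",
           "the": "DET", "internet": "NOUN"}

def tag(tokens):
    tags = []
    for i, tok in enumerate(tokens):
        pos = LEXICON.get(tok.lower(), "NOUN")
        # Context rule: a proper noun starting an imperative with an object
        # is read as a verb, as in "Google something".
        if pos == "PROPN" and i == 0 and len(tokens) > 1:
            pos = "VERB"
        tags.append((tok, pos))
    return tags

print(tag(["Google", "something", "on", "the", "Internet"]))
```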

    Step 8: Named Entity Recognition (NER)

    Named Entity Recognition (NER) is the process of detecting named entities such as person names, movie names, organization names, or locations.

    Example: Steve Jobs introduced iPhone at the Macworld Conference in San Francisco, California.
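
    The simplest form of NER is a gazetteer (dictionary) lookup, sketched below with a hypothetical entity list; production systems use statistical or neural models instead:

```python
# Hypothetical gazetteer mapping known entity strings to labels.
GAZETTEER = {
    "Steve Jobs": "PERSON",
    "iPhone": "PRODUCT",
    "Macworld Conference": "EVENT",
    "San Francisco": "LOCATION",
    "California": "LOCATION",
}

def find_entities(sentence):
    # Report every gazetteer entry that occurs in the sentence.
    return [(name, label) for name, label in GAZETTEER.items()
            if name in sentence]

sentence = ("Steve Jobs introduced iPhone at the Macworld Conference "
            "in San Francisco, California.")
print(find_entities(sentence))
```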

    Step 9: Chunking

    Chunking is used to collect individual pieces of information and group them into bigger phrases.
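
    Chunking can be sketched as grouping runs of determiner/adjective/noun tags into noun phrases, similar in spirit to NLTK's RegexpParser with a pattern like NP: {&lt;DT&gt;?&lt;JJ&gt;*&lt;NN&gt;}. The tagged sentence below is illustrative:

```python
def chunk_noun_phrases(tagged):
    # Collect maximal runs of determiner (DT), adjective (JJ), noun (NN)
    # tokens into noun-phrase chunks; any other tag closes the chunk.
    chunks, current = [], []
    for word, pos in tagged:
        if pos in ("DT", "JJ", "NN"):
            current.append(word)
        elif current:
            chunks.append(" ".join(current))
            current = []
    if current:
        chunks.append(" ".join(current))
    return chunks

tagged = [("the", "DT"), ("little", "JJ"), ("dog", "NN"),
          ("barked", "VBD"), ("at", "IN"), ("the", "DT"), ("cat", "NN")]
print(chunk_noun_phrases(tagged))
```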


    Phases of NLP

    There are five phases of NLP:

    1. Lexical and Morphological Analysis

    The first phase of NLP is lexical analysis. This phase scans the source text as a stream of characters and converts it into meaningful lexemes. It divides the whole text into paragraphs, sentences, and words.

    2. Syntactic Analysis (Parsing)

    Syntactic analysis is used to check grammar and word arrangement, and it shows the relationships among the words.

    Example: “Agra goes to the Poonam.” In the real world, this sentence does not make any sense, so it is rejected by the syntactic analyzer.

    3. Semantic Analysis

    Semantic analysis is concerned with the meaning representation. It mainly focuses on the literal meaning of words, phrases, and sentences.

    4. Discourse Integration

    Discourse integration depends on the sentences that precede a given sentence and also invokes the meaning of the sentences that follow it.

    5. Pragmatic Analysis

    Pragmatic analysis is the fifth and last phase of NLP. It helps you discover the intended effect of an utterance by applying a set of rules that characterize cooperative dialogues.

    For Example: “Open the door” is interpreted as a request instead of an order.


    Why is NLP difficult?

    NLP is difficult because ambiguity and uncertainty exist in language.

    Ambiguity

    There are three types of ambiguity –

    • Lexical Ambiguity

    Lexical ambiguity exists when a single word has two or more possible meanings.

    Example:

    Manya is looking for a match.

    In the above example, the word match may refer either to a partner Manya is looking for or to a match in the sporting sense (a cricket match, for example).

    • Syntactic Ambiguity

    Syntactic Ambiguity exists in the presence of two or more possible meanings within the sentence.

    Example:

    I saw the girl with the binoculars.

    In the above example, did I have the binoculars? Or did the girl have the binoculars?

    • Referential Ambiguity

    Referential ambiguity exists when you refer to something using a pronoun.

    Example: Kiran went to Sunita. She said, “I am hungry.”

    In the above sentence, you do not know who is hungry, Kiran or Sunita.


    NLP APIs

    Natural Language Processing APIs allow developers to integrate human-to-machine communication and perform several useful tasks, such as speech recognition, chatbots, spelling correction, sentiment analysis, etc.

    A list of NLP APIs is given below:

    • IBM Watson API
      IBM Watson API combines different sophisticated machine learning techniques to enable developers to classify text into various custom categories. It supports multiple languages, such as English, French, Spanish, German, Chinese, etc. With the help of IBM Watson API, you can extract insights from texts, add automation in workflows, enhance search, and understand the sentiment. The main advantage of this API is that it is very easy to use.
      Pricing: Firstly, it offers a free 30 days trial IBM cloud account. You can also opt for its paid plans.
    • Chatbot API
      Chatbot API allows you to create intelligent chatbots for any service. It supports Unicode characters, text classification, multiple languages, etc. It is very easy to use. It helps you create a chatbot for your web applications.
      Pricing: Chatbot API is free for 150 requests per month. You can also opt for its paid version, which starts from $100 to $5,000 per month.
    • Speech to text API
      Speech to text API is used to convert speech to text.
      Pricing: Speech to text API is free for converting 60 minutes per month. Its paid version starts from $500 to $1,500 per month.
    • Sentiment Analysis API
      Sentiment Analysis API is also called ‘opinion mining’; it is used to identify the tone of a user (positive, negative, or neutral).
      Pricing: Sentiment Analysis API is free for fewer than 500 requests per month. Its paid version starts from $19 to $99 per month.
    • Translation API by SYSTRAN
      The Translation API by SYSTRAN is used to translate the text from the source language to the target language. You can use its NLP APIs for language detection, text segmentation, named entity recognition, tokenization, and many other tasks.
      Pricing: This API is available for free. But for commercial users, you need to use its paid version.
    • Text Analysis API by AYLIEN
      Text Analysis API by AYLIEN is used to derive meaning and insights from textual content. It is available both free and paid, from $119 per month. It is easy to use.
      Pricing: This API is free for 1,000 hits per day. You can also use its paid version, which starts from $199 to $1,399 per month.
    • Cloud NLP API
      The Cloud NLP API is used to improve the capabilities of an application using natural language processing technology. It allows you to carry out various natural language processing functions, like sentiment analysis and language detection. It is easy to use.
      Pricing: Cloud NLP API is available for free.
    • Google Cloud Natural Language API
      Google Cloud Natural Language API allows you to extract beneficial insights from unstructured text. This API allows you to perform entity recognition, sentiment analysis, content classification, and syntax analysis, with more than 700 predefined categories. It also allows you to perform text analysis in multiple languages, such as English, French, Chinese, and German.
      Pricing: After performing entity analysis for 5,000 to 10,000,000 units, you need to pay $1.00 per 1000 units per month.

    NLP Libraries

    Scikit-learn: It provides a wide range of algorithms for building machine learning models in Python.

    Natural Language Toolkit (NLTK): NLTK is a complete toolkit for all NLP techniques.

    Pattern: It is a web mining module for NLP and machine learning.

    TextBlob: It provides an easy interface for basic NLP tasks like sentiment analysis, noun phrase extraction, or POS tagging.

    Quepy: Quepy is used to transform natural language questions into queries in a database query language.

    SpaCy: SpaCy is an open-source NLP library which is used for Data Extraction, Data Analysis, Sentiment Analysis, and Text Summarization.

    Gensim: Gensim works with large datasets and processes data streams.


    Difference between Natural language and Computer Language

    • Natural language has a very large vocabulary; computer language has a very limited vocabulary.
    • Natural language is easily understood by humans; computer language is easily understood by machines.
    • Natural language is ambiguous in nature; computer language is unambiguous.

    Prerequisite

    Before learning NLP, you must have the basic knowledge of Python.

    Audience

    Our NLP tutorial is designed to help beginners.

    Problem

    We hope you will not find any problems in this NLP tutorial. But if there is any mistake or error, please report it using the contact form.

  • What is an Expert System?

    Another field of artificial intelligence (AI) is expert systems, which are intended to imitate the ability of human experts to make decisions. They analyze information and complex problems using a knowledge base that contains domain-specific facts and rules.

    An example of this can be found in healthcare, where a patient may have their symptoms analyzed by an expert system to propose possible diagnoses or treatments, or in finance, where an expert system may analyze market trends to give investment advice.

    Why are Expert Systems Important?

    Expert systems are transformative in the field of AI. Here are some reasons why expert systems are important in AI:

    Preserve Knowledge

    They capture the knowledge of human professionals in digital form so that useful information is not lost when the experts retire or leave.

    Enhance Decision-Making

    They provide rational and objective advice using structured information and rules.

    Save Time and Costs

    They save costs by automating tasks that would otherwise require human skills, thus increasing efficiency.

    Increase Accessibility

    They bring the knowledge of the specialists to the level of non-specialists and expand the possibilities of accessing specialized knowledge.

    Elements of an Expert System

    There are a number of significant elements comprising an expert system, all of which enable the system to function. Here’s a breakdown:

    Knowledge Base: The Core Repository

    The knowledge base is the pillar of the system, where facts, rules, and domain expertise are stored. It can be viewed as an online encyclopedia of professional knowledge, studies, and best practices. The effectiveness of the system’s recommendations depends on the quality and accuracy of this knowledge; outdated or incomplete information may result in poor outcomes.

    Inference Engine: The Decision-Maker

    The inference engine, often called the brain of the system, applies reasoning procedures to the knowledge base to draw conclusions or propose actions. It uses methods such as:

    • Forward Chaining
      • The process of drawing conclusions based on facts you know.
      • For example, when a patient coughs and has a fever, the system might come to the conclusion that he/she has a respiratory infection.
    • Backward Chaining
      • It begins with a goal and seeks evidence.
      • For example, to diagnose diabetes, it looks at such signs as frequent urination and high blood sugar.

    User Interface: Linking the Systems and the Users

    The user interface enables free interaction between the expert system and the user. It is crafted to be user-friendly enough to allow non-experts to ask questions or enter problems and get straightforward advice or solutions.

    Module of Explanation: Transparency

    This element generates trust by clarifying the rationality of the decisions made by the system. It also gives a stepwise breakdown of the process by which a conclusion was drawn, just as a teacher demonstrating the steps to a solution does.

    Example: The patient is diagnosed with pneumonia due to the presence of fever, cough, and an abnormal chest X-ray.

    Knowledge Acquisition Module: Constant Updating

    The knowledge base of the system needs to be updated on a regular basis to remain effective. The knowledge acquisition module gathers new facts, rules, and insights, and keeps the system up to date with the latest developments to ensure that the recommendations of the system are not obsolete.

    Reasoning Strategies Used by the Inference Engine

    In order to process the information and solve problems, the inference engine of an expert system depends on two main reasoning approaches: Forward Chaining and Backward Chaining.

    Forward Chaining

    Forward chaining is a data-driven form of logical analysis. The system starts with known facts and then applies rules to derive new information or conclusions. It is usually applied in prediction and outcome determination.

    Example: Stock market projections based on financial data.
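
    Forward chaining can be sketched as a loop that fires if-then rules until no new facts appear. The rule table below is a hypothetical illustration of the fever/cough example mentioned earlier:

```python
# Hypothetical rules: (set of required conditions, conclusion).
RULES = [
    ({"fever", "cough"}, "respiratory infection"),
    ({"respiratory infection", "chest pain"}, "possible pneumonia"),
]

def forward_chain(facts):
    # Repeatedly fire any rule whose conditions are all satisfied,
    # adding its conclusion as a new fact, until nothing changes.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"fever", "cough", "chest pain"}))
```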

    Backward Chaining

    Backward chaining is a goal-oriented methodology. The system begins with an assumption or objective and works backward to verify the facts or conditions that confirm it. It is particularly helpful in diagnostics.

    Example: To verify a diagnosis such as dengue or blood cancer, the system searches for symptoms such as stomach pain, fever, or abnormal test results.
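
    Backward chaining can be sketched as a recursive check: a goal holds if it is a known fact, or if all the conditions of some rule concluding it hold. The rules below are illustrative:

```python
# Hypothetical rules: goal -> list of alternative condition sets.
RULES = {
    "respiratory infection": [{"fever", "cough"}],
    "pneumonia": [{"respiratory infection", "abnormal x-ray"}],
}

def backward_chain(goal, facts):
    # A goal is proven if it is a known fact, or if every condition
    # of some rule for it can itself be proven recursively.
    if goal in facts:
        return True
    for conditions in RULES.get(goal, []):
        if all(backward_chain(c, facts) for c in conditions):
            return True
    return False

print(backward_chain("pneumonia", {"fever", "cough", "abnormal x-ray"}))
```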

    The Interplay of these Components

    As an example, take a medical expert system used to diagnose diseases:

    • Input: A patient enters symptoms such as fever, cough, and fatigue through the user interface.
    • Processing: The inference engine applies the rules of the knowledge base to analyze the symptoms.
    • Output: The system proposes a potential ailment, such as pneumonia.
    • Explanation: The explanation module justifies the decision: “The diagnosis is based on fever, cough, and abnormal X-ray changes in the chest.”
    • Update: The knowledge acquisition module incorporates the latest developments, including new pneumonia treatment methods, and keeps the system up to date.

    Types of Expert Systems in AI

    There are a few categories of expert systems, which differ in terms of their structure and usability:

    Rule-Based Expert Systems

    These are the most common type; they depend on if-then rules for decision-making. Domain experts design the rules, which serve as the reasoning engine of the system.

    For example, MYCIN, an early medical system that was created to diagnose bacterial infections.

    Frame-Based Expert Systems

    Such systems model knowledge in the form of frames, similar to objects in programming. Frames store the attributes and values of particular entities and are useful for knowledge representation and for tasks like natural language processing.

    Fuzzy Logic Systems

    Fuzzy logic systems are designed to deal with uncertainty and vagueness; instead of requiring strict true/false values, they work with degrees of truth. They are widely used in household appliances, such as washing machines and air conditioners, to maximize performance across various settings.
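
    The notion of degrees of truth can be made concrete with a membership function. The "warm water" set below is a hypothetical example of the kind of fuzzy set an appliance controller might use:

```python
def triangular_membership(x, low, peak, high):
    # Degree (0..1) to which x belongs to a fuzzy set shaped as a
    # triangle rising from `low` to `peak` and falling back to `high`.
    if x <= low or x >= high:
        return 0.0
    if x < peak:
        return (x - low) / (peak - low)
    if x == peak:
        return 1.0
    return (high - x) / (high - peak)

# Hypothetical "warm water" set for a washing machine controller.
for temp in (20, 30, 35, 50):
    print(temp, triangular_membership(temp, 20, 35, 50))
```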

    Expert Systems based on Neural Networks

    These systems use artificial neural networks to recognize patterns, learn from information, and improve decisions. They have particular applications in image recognition, speech interpretation, and complex pattern recognition.

    Neuro-Fuzzy Expert Systems

    This hybrid approach integrates the learning strength of neural networks with the adaptive reasoning ability of fuzzy logic. It is particularly useful in areas such as financial forecasting and automated control, where flexibility and handling of uncertainty are necessary.

    Examples of AI Expert Systems

    There are some well-known expert systems that were created over the years in diverse domains. Here are some key examples:

    MYCIN

    One of the earliest medical expert systems, which was based on backward chaining and could be used to diagnose bacterial infections such as meningitis and bacteremia. It posed a series of questions to doctors regarding patient symptoms and test results so as to determine the probable bacteria.

    Significance: MYCIN never actually found clinical use, but played a major role in shaping the design of subsequent medical expert systems.

    DENDRAL

    DENDRAL was one of the earliest AI systems in chemistry: it used spectrographic data to predict the structure of molecules. It was actually created to analyze the mass spectrometry data and name chemical substances.

    Significance: It revolutionized the field of chemical research as it automated a time-consuming and complicated process.

    R1/XCON

    Digital Equipment Corporation (DEC) created R1 (also called XCON) in the late 1970s; it became a very successful commercial expert system. It configured new computer systems by choosing the appropriate hardware and software based on customer needs.

    Significance: It made the system easier to configure, reduced errors, and saved DEC millions of dollars.

    PXDES

    PXDES was an expert system developed in the field of oncology to assist in diagnosing lung cancer. It examined patient history, including imaging data, to identify the form and stage of the malignancy, on which treatment decisions were based.

    Significance: It increased the quality of diagnosis and treatment planning in treating cancer.

    CaDet

    An early cancer detection clinical decision support system. It compared patient symptoms and medical data with known cancer indicators, thereby identifying early warning signs of the disease.

    Significance: CaDet’s early detection improved patients’ likelihood of survival by enabling timely medical care.

    Expert System Use Cases

    Medical Diagnosis

    Expert systems assist physicians in interpreting a symptom list and patient history to provide a possible diagnosis or treatment.

    Case Study: MYCIN diagnosed bacteria and recommended antibiotics.

    Financial Services

    They find application in credit scoring, fraud detection, and investment advice in the field of finance. They analyze financial data and trends in order to provide credible insights and suggestions.

    Technical Support

    Expert systems offer troubleshooting services that help users step through the problem-solving process, guided by expert rules and knowledge, and eliminate dependency on human resources.

    Manufacturing

    They automate business processes, perform quality control, and manage inventory by analyzing data and making recommendations that improve efficiency and reduce spending.

    Benefits of Expert Systems

    Expert systems have some benefits that render them useful in industries, including the medical field, finance, and more.

    Consistency

    These systems provide uniform and dependable recommendations across the board, in contrast with human decision-making, which can vary. This ensures that the same problems always receive the same solutions, which increases reliability.

    Availability

    Expert systems, unlike human experts, are available 24 hours a day. They can work on multiple queries simultaneously and answer them quickly, without interruptions or rest.

    Cost-Effectiveness

    Automating decisions at the expert level allows organizations to save a lot of money in terms of employment, training, and maintenance of experts. This renders them a viable option to manage large-scale complex tasks.

    Knowledge Preservation

    Expert systems serve as stores of useful experience. They store and transfer knowledge when human professionals leave, retire, or are unavailable, providing long-term continuity of valuable knowledge.

    Limitations of Expert Systems

    Although expert systems are useful in the decision-making process, they have limitations that restrict their flexibility.

    Knowledge Limitation

    An expert system depends on the quality and the thoroughness of its knowledge base to perform. When information is old, incomplete, or wrong, the system can provide bad or inaccurate results.

    Lack of Flexibility

    Because expert systems work within the confines of what is coded in the program and what the system knows, they tend to fail when new, unforeseen, or unclear situations occur that were not outlined in the development process.

    Maintenance Effort

    In order to be effective, expert systems need to have their knowledge base updated and revised on a regular basis. It can be a time-consuming and expensive process, costing a significant amount of resources and professional expertise.

  • Subsets of Artificial Intelligence

    Artificial intelligence is an area of computer science concerned with the development of machines or programs that perform tasks that would normally require human intelligence. AI involves the development of algorithms and models that enable machines not just to understand and analyze data, but also to make decisions and learn from it.

    AI aims to develop machines able to imitate or simulate human intelligence and complete a job as accurately, efficiently, and independently as humans, or even better.

    AI has a number of subfields, each focused on a particular aspect of AI research and application. Each of these subsets has its own problems, techniques, and applications, and together they form a rich and diverse area of work under AI.

    Subsets of AI

    The following are the most common subsets of AI:

    • Machine Learning
    • Deep Learning
    • Natural Language Processing
    • Expert System
    • Robotics
    • Neural Networks
    • Machine Vision
    • Speech Recognition

    Machine Learning

    Machine learning is a part of AI that gives machines the ability to learn automatically from experience without being explicitly programmed.

    • It is primarily concerned with the design and development of algorithms that allow the system to learn from historical data.
    • Machine Learning is based on the idea that machines can learn from past data, identify patterns, and make decisions using algorithms.
    • Machine learning algorithms are designed in such a way that they can learn and improve their performance automatically.
    • Machine learning helps in discovering patterns in data.

    Types of Machine Learning


    Machine learning can be subdivided into three main types:

    Supervised Learning:

    Supervised learning is a type of machine learning in which machines learn from known datasets (sets of training examples) and then predict the output. A supervised learning agent needs to learn the function that maps the inputs in a given sample set to their outputs.

    Supervised learning further can be classified into two categories of algorithms:

    • Classification
    • Regression
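    The idea above can be illustrated with a minimal classification sketch: a 1-nearest-neighbour classifier that learns from a tiny labelled dataset (the feature values and class names are invented for illustration).

    ```python
    # A minimal supervised-learning sketch: 1-nearest-neighbour classification.
    # The labelled training data below is made up purely for illustration.

    def nearest_neighbour(train, query):
        """Return the label of the training example closest to the query point."""
        best_label, best_dist = None, float("inf")
        for features, label in train:
            dist = sum((f - q) ** 2 for f, q in zip(features, query))
            if dist < best_dist:
                best_label, best_dist = label, dist
        return best_label

    # Labelled training set: (height_cm, weight_kg) -> class
    train = [((150, 50), "small"), ((160, 60), "small"),
             ((180, 80), "large"), ((190, 95), "large")]

    print(nearest_neighbour(train, (185, 85)))  # -> large
    print(nearest_neighbour(train, (155, 55)))  # -> small
    ```

    Because the training data is labelled, the algorithm can check each prediction against a known answer, which is the defining property of supervised learning.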

    Reinforcement Learning:

    Reinforcement learning is a type of learning in which an AI agent is trained by interacting with an environment: for each action it takes, the agent receives a reward as feedback. Using this feedback, the agent improves its performance.

    Reward feedback can be positive or negative, which means that for each good action, the agent receives a positive reward, while for a wrong action, it gets a negative reward.

    Reinforcement learning is of two types:

    • Positive Reinforcement learning
    • Negative Reinforcement learning
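    The reward-feedback loop described above can be sketched in a few lines. This is a minimal, hand-rolled example with an invented two-action environment ("left" usually earns a negative reward, "right" usually a positive one); the agent keeps a running value estimate per action and gradually prefers the action that pays off.

    ```python
    import random

    # A minimal reinforcement-learning sketch: the agent acts, receives a
    # positive or negative reward, and updates its value estimates.
    random.seed(0)
    values = {"left": 0.0, "right": 0.0}   # estimated value of each action
    alpha = 0.1                            # learning rate

    def reward(action):
        # Hidden environment (invented): "right" is good 90% of the time,
        # "left" is bad 90% of the time.
        return 1 if (action == "right") == (random.random() < 0.9) else -1

    for step in range(200):
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < 0.1:
            action = random.choice(["left", "right"])
        else:
            action = max(values, key=values.get)
        values[action] += alpha * (reward(action) - values[action])

    print(values["right"] > values["left"])  # the agent learns "right" pays off
    ```

    The positive rewards pull the estimate for "right" upward and the negative rewards push "left" downward, which is exactly the positive/negative reinforcement distinction made above.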

    Unsupervised Learning:

    Unsupervised learning is learning without supervision: the algorithms are trained on data that is neither labelled nor classified, and the agent must discover patterns in the data without corresponding output values.

    Unsupervised learning can be classified into two categories of algorithms:

    • Clustering
    • Association
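    Clustering, the first category above, can be sketched with a tiny k-means example on unlabelled 1-D data (the data points and the choice of two clusters are illustrative assumptions).

    ```python
    # A minimal unsupervised-learning sketch: k-means clustering of unlabelled
    # 1-D points into two groups. No labels are given; the groups emerge from
    # the data itself.

    def kmeans_1d(points, iters=10):
        c1, c2 = min(points), max(points)          # initial centroids
        for _ in range(iters):
            # Assign each point to its nearest centroid.
            g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
            g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
            # Move each centroid to the mean of its group.
            c1 = sum(g1) / len(g1)
            c2 = sum(g2) / len(g2)
        return sorted(g1), sorted(g2)

    data = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]          # two obvious clumps
    low, high = kmeans_1d(data)
    print(low)   # -> [0.8, 1.0, 1.2]
    print(high)  # -> [7.9, 8.0, 8.3]
    ```

    Nothing told the algorithm which points belong together; it found the two clumps on its own, which is what distinguishes unsupervised from supervised learning.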

    Natural Language Processing

    Natural language processing is a subfield of computer science and artificial intelligence. NLP enables a computer system to understand and process human language, such as English.

    NLP plays an important role in AI: without NLP, AI agents cannot work from human instructions, but with its help we can instruct an AI system in our own language. Today AI is all around us, and thanks to NLP we can simply ask Siri, Google Assistant, or Cortana for help in our own language.

    Natural language processing application enables a user to communicate with the system in their own words directly.

    The input and output of NLP applications can take two forms:

    • Speech
    • Text
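    For the text form, the usual first processing step is to break raw text into tokens and count them. A minimal sketch using only the standard library (the example sentence is invented):

    ```python
    # A minimal NLP sketch: tokenizing raw text and counting word frequencies,
    # a typical first step before any deeper language understanding.
    import re
    from collections import Counter

    def tokenize(text):
        """Lowercase the text and split it into word tokens."""
        return re.findall(r"[a-z']+", text.lower())

    sentence = "NLP enables a computer to understand human language, and NLP is everywhere."
    tokens = tokenize(sentence)
    counts = Counter(tokens)

    print(tokens[:4])     # -> ['nlp', 'enables', 'a', 'computer']
    print(counts["nlp"])  # -> 2
    ```

    Real NLP libraries perform far richer analysis (part-of-speech tagging, parsing, named-entity recognition), but they all start from a tokenization step like this one.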

    Deep Learning

    Deep learning is a subset of machine learning that gives machines the ability to perform human-like tasks without human involvement, enabling an AI agent to mimic the human brain. Deep learning can use both supervised and unsupervised learning to train an AI agent.

    • Deep learning is implemented through neural network architecture, hence also called a deep neural network.
    • Deep learning is the primary technology behind self-driving cars, speech recognition, image recognition, automatic machine translation, etc.
    • The main challenge for deep learning is that it requires lots of data with lots of computational power.

    How Does Deep Learning Work?

    • Deep Learning Algorithms work on deep neural networks, so it is called deep learning. These deep neural networks are made of multiple layers.
    • The first layer is called an Input layer, the last layer is called an output layer, and all layers between these two layers are called hidden layers.
    • In the deep neural network, there are multiple hidden layers, and each layer is composed of neurons. These neurons are connected in each layer.
    • The input layer receives input data, and the neurons propagate the input signal to the layers above them.
    • The hidden layers perform mathematical operations on their inputs, and the processed results are forwarded to the output layer.
    • The output layer returns the output to the user.
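    The steps above can be sketched as a forward pass through a small network: an input layer, one hidden layer, and an output layer. The weights and biases below are arbitrary illustrative numbers, not a trained model.

    ```python
    import math

    # A sketch of the forward pass described above:
    # input layer -> hidden layer -> output layer.

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def layer(inputs, weights, biases):
        """Each neuron computes a weighted sum of its inputs plus a bias,
        then applies a non-linear activation function."""
        return [sigmoid(sum(w * i for w, i in zip(ws, inputs)) + b)
                for ws, b in zip(weights, biases)]

    x = [0.5, -1.0]                                            # input layer (2 features)
    hidden = layer(x, [[0.1, 0.8], [-0.4, 0.2]], [0.0, 0.1])   # hidden layer (2 neurons)
    output = layer(hidden, [[1.2, -0.7]], [0.05])              # output layer (1 neuron)

    print(len(hidden), len(output))  # -> 2 1
    print(0.0 < output[0] < 1.0)     # -> True (sigmoid keeps outputs in (0, 1))
    ```

    A real deep network simply stacks many such hidden layers and learns the weights from data instead of hard-coding them.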

    Expert Systems

    An expert system is an application of artificial intelligence. In artificial intelligence, expert systems are computer programs that rely on obtaining the knowledge of human experts and programming that knowledge into a system.

    Expert systems emulate the decision-making ability of human experts. These systems are designed to solve complex problems through bodies of knowledge rather than conventional procedural code.

    One example of an expert system is the spelling-correction suggestion shown while typing in the Google search box.

    The following are some characteristics of expert systems:

    • High performance
    • Reliable
    • Highly responsive
    • Understandable
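    The core of an expert system, a knowledge base of facts plus if-then rules and an inference engine, can be sketched in a few lines. The medical-style facts and rules below are invented purely for illustration.

    ```python
    # A minimal expert-system sketch: facts, if-then rules, and a simple
    # forward-chaining inference loop. The rules here are invented examples.

    facts = {"fever", "cough"}

    # Each rule: if all conditions are known facts, the conclusion becomes a fact.
    rules = [
        ({"fever", "cough"}, "flu_suspected"),
        ({"flu_suspected"}, "recommend_rest"),
    ]

    changed = True
    while changed:                       # forward chaining: repeat until no rule fires
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print("recommend_rest" in facts)     # -> True (derived via the chained rules)
    ```

    This illustrates the point made above: the problem is solved through a body of knowledge (the rules) rather than conventional procedural code, and adding expertise means adding rules, not rewriting the program.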

    Robotics

    • Robotics is a field of artificial intelligence and engineering concerned with designing and manufacturing robots.
    • Robots are programmed machines that are able to perform a series of actions semi-automatically or automatically.
    • AI gives robots the intelligence needed to operate and carry out their jobs; performing more complex tasks requires AI algorithms.
    • Currently, AI and machine learning are being applied to robots to build intelligent machines that can also interact socially as humans do. A well-known example of AI in robotics is the humanoid robot Sophia.

    Neural Networks

    Neural networks, or artificial neural networks (ANNs), are a class of computational models inspired by the architecture and function of the biological nervous system. ANNs are a subfield of AI and are used very successfully in data processing and analysis, pattern detection, and prediction across a number of applications.

    A neural network is made up of layers of connected nodes, or neurons. Each node accepts data, processes it, and passes the resulting output signals through weighted connections to other nodes. In a procedure called training, the network is fed labelled data and its connection weights are adjusted over time so that its performance improves.

    There are different types of neural networks depending on their structure, such as feedforward neural networks, recurrent neural networks (RNNs), convolutional neural networks (CNNs), and long short-term memory (LSTM) networks. Each type has its own characteristics and is suited to a particular kind of data.

    Neural networks have been used in a lot of applications, including financial forecasting, self-driving cars, natural language processing, drug discovery, image and voice recognition, and recommendation systems. Their relevance as a field of AI research and development keeps increasing, and they are successfully used in a broad variety of fields.
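    The training procedure described above, adjusting connection weights on labelled data, can be sketched with the simplest possible network: a single neuron (a perceptron). Teaching it the logical AND function is an illustrative choice.

    ```python
    # A sketch of neural-network training: a single neuron whose connection
    # weights are adjusted on labelled data until its predictions are correct.
    # Task (illustrative): learn the logical AND of two inputs.

    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b, lr = [0.0, 0.0], 0.0, 0.1   # weights, bias, learning rate

    def predict(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

    for _ in range(20):                      # repeated passes over the data
        for x, target in data:
            error = target - predict(x)      # misclassified -> non-zero error
            w[0] += lr * error * x[0]        # nudge each weight toward the target
            w[1] += lr * error * x[1]
            b += lr * error

    print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
    ```

    Deep networks generalize this same idea, many neurons, many layers, and gradient-based weight updates, but the principle of adjusting weights to reduce error is identical.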

    Machine Vision

    • Machine vision is an application of computer vision that enables a machine to recognize an object.
    • Machine vision captures and analyses visual information using one or more video cameras, analog-to-digital conversion, and digital signal processing.
    • Machine vision systems are programmed to perform narrowly defined tasks such as counting objects, reading the serial number, etc.
    • Computer systems do not see in the same way human eyes do, but they are also not bound by human limitations, such as the inability to see through a wall.
    • With the help of machine learning and machine vision, an AI agent can perceive things that lie beyond human vision.

    Speech Recognition

    Speech recognition is a technology that enables a machine to understand spoken language and translate it into a machine-readable format. It is also called automatic speech recognition or computer speech recognition. It is a way to talk to a computer: on the basis of a spoken command, the computer can perform a specific task. It builds on the branch of AI called natural language processing (NLP), which centres on the relationship between computers and human language.

    We need to train a speech recognition system to understand our language. Earlier, these systems were designed only to convert speech to text, but now various devices can directly convert speech into commands.

    Speech recognition systems take audio signals and process and analyze them through a host of algorithms and techniques before translating them into text. These systems can be used for various purposes, such as speech-to-text functionality, voice assistants like Siri and Alexa etc., call centre automation, transcription service, car systems and many more. Also, they can recognize various languages, accents and speaking manners.

    Speech recognition is a process that consists of a number of steps. First, the system records sound through an input device such as a microphone. The sound signal is then preprocessed to remove noise, normalize the volume level, and make other improvements.

    Then features are extracted from the sound signal, capturing its relevant characteristics (for example, spectral and temporal features). These features are fed into the speech recognition algorithm, which converts the spoken language to text using statistical models, machine learning, or deep learning methods.
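    The first two stages of that pipeline, splitting the signal into short frames and extracting a feature per frame, can be sketched as follows. The "signal" here is a synthetic sine wave, and per-frame energy is used as a deliberately simple stand-in for the spectral features real systems compute.

    ```python
    import math

    # A sketch of the speech pipeline's front end: frame the signal, then
    # extract one simple feature (energy) per frame. Real systems use richer
    # features (e.g. spectral coefficients); the sine-wave signal is synthetic.

    signal = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]  # 1 s at 8 kHz

    def frame_energy(signal, frame_size=200):
        """Split the signal into fixed-size frames and return each frame's energy."""
        return [sum(s * s for s in signal[i:i + frame_size])
                for i in range(0, len(signal), frame_size)]

    features = frame_energy(signal)
    print(len(features))                 # -> 40 frames
    print(all(e > 0 for e in features))  # -> True
    ```

    The resulting feature sequence, not the raw waveform, is what the recognition model (statistical, machine learning, or deep learning) actually consumes.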

    Speech recognition systems can be used in the following areas:

    • System control or navigation system
    • Industrial application
    • Voice dialing system

    There are two types of speech recognition:

    • Speaker Dependent
    • Speaker Independent

    Conclusion

    Artificial intelligence, with its subsets, is a specialized domain that enables AI systems to copy the traits of human intelligence and perform complicated tasks. Machine learning, deep learning, natural language processing, robotics, and neural networks are key subsets that enable capabilities such as learning from data, understanding language, recognizing patterns, and interacting with the environment. These components often work together in real-world applications in healthcare, finance, and transportation.

    The growth of each subset depends on improvements in its algorithms and in the computational power at its disposal, and these improvements in turn enable future innovations and grow the potential of AI. Building efficient, intelligent systems that can be configured to solve diverse, domain-specific challenges remains difficult, and these subsets are critical to the development of such systems.