Rasa NLU

Rasa is an open source solution for developing „AI assistants“ or chatbots. Rasa provides a stack consisting of the modules „Rasa NLU“ and „Rasa Core“. „Rasa NLU“ determines the user intention from the received text message (intent recognition) and returns all intentions found in the message, sorted by their „confidence score“. Training data is required so that the user's intentions can be recognized. Furthermore, Rasa NLU supports „entity recognition“ to extract relevant terms from the text. Rasa Core is a dialogue engine that uses machine-learning-trained models to decide which response to send to the user, such as greeting the user. In addition, the Core provides „session management“ and „context handling“. Within the project only the component „Rasa NLU“ is used, because only the functionality to extract entities from a text message and to determine the user intention is needed.


Understanding Rasa-NLU

Rasa NLU allows the processing of natural language to classify user intentions and extract entities from text.

For example, consider the message:

„Wie ist das Wetter morgen?“ ("What will the weather be like tomorrow?")

Rasa NLU then determines the user intention from the text:

"intent": {
   "name": "Wetter",
   "confidence": 0.9989299178
}
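
If the pipeline also extracts entities, the response additionally contains an „entities“ array. The following fragment is only an illustrative sketch; the entity name (here a hypothetical „datetime“ entity) and the extracted values depend on the training data and the configured pipeline:

"entities": [
   {
     "entity": "datetime",
     "value": "morgen",
     "start": 19,
     "end": 25
   }
]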

Training data is needed so that Rasa can identify the intention of a text. Training data can be created in the form of Markdown or JSON. You can define this data in a single file or in multiple files in a directory.
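
A minimal sketch of what such a training file in Markdown format could look like; the example sentences and the „datetime“ entity are assumptions, while the intent name „Wetter“ matches the example above (square brackets mark an entity value, the parentheses its entity name):

## intent:Wetter
- Wie ist das Wetter morgen?
- Wie wird das Wetter am [Montag](datetime)?
- Regnet es [heute](datetime)?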

To create a trained model from the Markdown or JSON data, Rasa offers a REST API. Alternatively, you can install Rasa on your local machine and create the model with the command „rasa train nlu“. In both cases, Rasa produces the trained model as a tar.gz archive from the Markdown or JSON data.
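
A sketch of the local variant, assuming Rasa 1.x with the spaCy pipeline and the file names nlu.md and config.yml used in the examples here; the exact flags may differ between Rasa versions:

# install Rasa with spaCy support and download the German spaCy model
pip install 'rasa[spacy]'
python -m spacy download de_core_news_md
# depending on the spaCy version, the model may also need to be linked to the short name "de"
python -m spacy link de_core_news_md de

# train an NLU-only model from the training data;
# the resulting tar.gz is written to the models/ directory
rasa train nlu --nlu nlu.md --config config.yml --out models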

Furthermore, Rasa NLU is configured via pipelines. A pipeline defines how the model is generated from the training data and which entities are extracted. For this project, a preconfigured pipeline based on „spaCy“ is used; spaCy provides pre-trained language models.
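
A minimal sketch of such a pipeline configuration (config.yml), assuming Rasa 1.x and its preconfigured spaCy template:

language: "de"
pipeline: "pretrained_embeddings_spacy"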

Getting Started

The following instructions are intended to help the user run the Rasa-NLU-component on the local machine for development.

Prerequisites

You will need to install Docker in order to use the Docker-Compose-file for running the application.

Installation instructions for Docker

Installing

After the repository has been cloned and the prerequisites have been fulfilled, you can run the Docker-Compose-file.


# build and start Rasa-NLU-Container && serve at localhost:5005
docker-compose up

# stop and remove rasa-container, volumes, images and networks
docker-compose down

# do the same steps as "docker-compose down" 
# additionally remove declared volumes in Docker-Compose-file
docker-compose down -v

# lists running containers
docker ps

# open a bash shell inside the running container
docker exec -it <Container-ID> bash

Overview

As part of the chatbot project, the microservices are supposed to run in Docker-Containers. In order to start several different services in containers at the same time, a Docker-Compose-File is used. For Rasa NLU, the official Rasa Docker-Image is used:

version: '3.0'
services:
  rasa:
    # Rasa image including spaCy and the German language model
    image: rasa/rasa:1.5.2-spacy-de
    ports:
      # expose the Rasa HTTP-API at localhost:5005
      - 5005:5005
    volumes:
      # mount the local project directory (training data, models) into the container
      - ./rasa-app:/app
    command:
      # start the Rasa server with the HTTP-API enabled and CORS allowed for all origins
      - run
      - --enable-api
      - --cors
      - "*"

The most important file for Rasa is the machine-learning-trained model (.tar.gz), which is placed in the volume that is mounted into the Docker container. When the Rasa container is running, this model is needed to recognize user intentions.
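
One possible way to produce such a model with the same Docker-Image is sketched below; it assumes that ./rasa-app contains a config.yml and the NLU training data in its data/ directory (the Rasa default paths):

# train the NLU model in a one-off container; the image's entrypoint is the rasa CLI,
# so only the subcommand and its arguments need to be passed
docker run -v $(pwd)/rasa-app:/app rasa/rasa:1.5.2-spacy-de train nlu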

HTTP-API

Rasa offers an HTTP-API with several REST endpoints, for example for server information or for training models. The endpoints used in the project are listed here:

- Server information:

You can query the Rasa server to check whether it is still running and which Rasa version it provides. You can also check which model Rasa is currently using.

- Model:

You can send requests to the REST API of the Rasa server to create a trained model or to load a model into Rasa. You can also send text to the server, and Rasa then determines the user's intention together with the confidence score. Example requests are sketched below.
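
The following curl calls sketch how these endpoints can be used against the container started above (localhost:5005). The endpoint paths refer to the Rasa 1.x HTTP-API; the model file name in the last request is a placeholder:

# server information
curl http://localhost:5005/version   # installed Rasa version
curl http://localhost:5005/status    # currently loaded model and its fingerprint

# determine the user intention (and entities) for a text message
curl -X POST http://localhost:5005/model/parse \
     -H "Content-Type: application/json" \
     -d '{"text": "Wie ist das Wetter morgen?"}'

# load a previously trained model into the running server
curl -X PUT http://localhost:5005/model \
     -H "Content-Type: application/json" \
     -d '{"model_file": "models/<model-name>.tar.gz"}'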

Links:

- HTTP-API (Retrieved 12.12.2019)
- OpenAPI-specification (Retrieved 12.12.2019)

Further Development

For further development, it is important that the existing training data be expanded and improved.

Further Reading

- Rasa Documentation (Retrieved 12.12.2019)
- Running Rasa with Docker (Retrieved 12.12.2019)

Built With

- Docker-Compose (Retrieved 12.12.2019)
- Docker Hub Rasa (Retrieved 12.12.2019)

Author

- Abirathan Yogarajah
