Rasa 3 NLU

The first piece of a Rasa assistant is an NLU model. NLU stands for Natural Language Understanding, which means turning user messages into structured data. To do this with Rasa, you provide training examples that show how Rasa should understand user messages, and then train a model by showing it those examples.

Run the code cell below to see the NLU training data created by the rasa init command:


cat data/nlu.md


## intent:greet
- hey
- hello
- hi
- good morning
- good evening
- hey there

## intent:goodbye
- bye
- goodbye
- see you around
- see you later

## intent:affirm
- yes
- indeed
- of course
- that sounds good
- correct

## intent:deny
- no
- never
- I don't think so
- don't like that
- no way
- not really

## intent:mood_great
- perfect
- very good
- great
- amazing
- wonderful
- I am feeling very good
- I am great
- I'm good

## intent:mood_unhappy
- sad
- very sad
- unhappy
- bad
- very bad
- awful
- terrible
- not very good
- extremely sad
- so sad

## intent:bot_challenge
- are you a bot?
- are you a human?
- am I talking to a bot?
- am I talking to a human?
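The format above is simple: each `## intent:` header names an intent, and each hyphenated line under it is a training example. As a quick sanity check on your own data, a short Python sketch (the `nlu_md` string here is a hand-copied slice of the file above, not read from disk) can count examples per intent:

```python
# Count training examples per intent in Rasa's Markdown NLU format.
# nlu_md mirrors a slice of data/nlu.md shown above.
nlu_md = """
## intent:greet
- hey
- hello
- hi

## intent:goodbye
- bye
- goodbye
"""

counts = {}
current = None
for line in nlu_md.splitlines():
    line = line.strip()
    if line.startswith("## intent:"):
        # A new intent section begins; start its counter at zero.
        current = line[len("## intent:"):]
        counts[current] = 0
    elif line.startswith("- ") and current:
        # A training example belonging to the current intent.
        counts[current] += 1

print(counts)  # {'greet': 3, 'goodbye': 2}
```

A check like this is handy for spotting intents with too few examples before you train.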


Rasa NLU: Language Understanding for Chatbots and AI assistants

Rasa NLU is an open-source natural language processing tool for intent classification, response retrieval and entity extraction in chatbots. For example, taking a sentence like

“I am looking for a Mexican restaurant in the center of town”

and returning structured data like


{
  "intent": "search_restaurant",
  "entities": {
    "cuisine": "Mexican",
    "location": "center"
  }
}

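That structured result maps directly onto a Python dict. A minimal sketch (the keys follow the example above; a real Rasa parse result also includes confidence scores and entity character spans):

```python
# The structured NLU result from the example above, as a Python dict.
parsed = {
    "intent": "search_restaurant",
    "entities": {
        "cuisine": "Mexican",
        "location": "center",
    },
}

# Downstream dialogue logic reads fields from this structure.
print(parsed["intent"])                 # search_restaurant
print(parsed["entities"]["cuisine"])    # Mexican
```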
Using NLU Only

If you want to use Rasa only as an NLU component, you can!

Training NLU-only models

To train an NLU model only, run:

rasa train nlu

This command looks for NLU training data files in the data/ directory and saves a trained model in the models/ directory. The name of the model will start with nlu-.

To try out your NLU model on the command line, use the rasa shell nlu command:

rasa shell nlu

This will start the rasa shell and ask you to type in a message to test. You can keep typing in as many messages as you like.

Alternatively, you can leave out the nlu argument and pass in an nlu-only model directly:

rasa shell -m models/nlu-20190515-144445.tar.gz

Running an NLU server

To start a server with your NLU model, pass in the model name at runtime:

rasa run --enable-api -m models/nlu-20190515-144445.tar.gz

You can then request predictions from your model using the /model/parse endpoint. To do this, run:

curl localhost:5005/model/parse -d '{"text":"hello"}'
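If you prefer Python over curl, the sketch below builds the same request body and shows how to read the response. Because it should run without a live server, the HTTP call itself is only noted in a comment, and `sample_response` is an illustrative stand-in (the field names follow Rasa's parse output; the confidence value is made up):

```python
import json

# Request body for POST /model/parse, equivalent to the curl command above.
payload = json.dumps({"text": "hello"})

# With a server running, you would send it with e.g. urllib.request:
#   urllib.request.urlopen(
#       urllib.request.Request("http://localhost:5005/model/parse",
#                              data=payload.encode(), method="POST"))

# Illustrative response body; a real one comes back from the server.
sample_response = (
    '{"text": "hello",'
    ' "intent": {"name": "greet", "confidence": 0.95},'
    ' "entities": []}'
)
result = json.loads(sample_response)
print(result["intent"]["name"])  # greet
```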