The first piece of a Rasa assistant is an NLU model. NLU stands for Natural Language Understanding, which means turning user messages into structured data. To do this with Rasa, you provide training examples that show how Rasa should understand user messages, and then train a model by showing it those examples.
Run the code cell below to see the NLU training data created by the rasa init command:
cat data/nlu.md
lizhe@ubuntu:~/rasa_workspace$ cat data/nlu.md
## intent:greet
- hey
- hello
- hi
- good morning
- good evening
- hey there
## intent:goodbye
- bye
- goodbye
- see you around
- see you later
## intent:affirm
- yes
- indeed
- of course
- that sounds good
- correct
## intent:deny
- no
- never
- I don't think so
- don't like that
- no way
- not really
## intent:mood_great
- perfect
- very good
- great
- amazing
- wonderful
- I am feeling very good
- I am great
- I'm good
## intent:mood_unhappy
- sad
- very sad
- unhappy
- bad
- very bad
- awful
- terrible
- not very good
- extremely sad
- so sad
## intent:bot_challenge
- are you a bot?
- are you a human?
- am I talking to a bot?
- am I talking to a human?
lizhe@ubuntu:~/rasa_workspace$
Rasa NLU: Language Understanding for Chatbots and AI assistants
Rasa NLU is an open-source natural language processing tool for intent classification, response retrieval and entity extraction in chatbots. For example, taking a sentence like
“I am looking for a Mexican restaurant in the center of town”
and returning structured data like
{
  "intent": "search_restaurant",
  "entities": {
    "cuisine": "Mexican",
    "location": "center"
  }
}
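Training data for an intent like this pairs example sentences with the intent name and marks entity values inline. Below is a rough sketch in Rasa's Markdown format; the intent name search_restaurant and the sentences are illustrative and not part of any generated project:
## intent:search_restaurant
- I am looking for a [Mexican](cuisine) restaurant in the [center](location) of town
- show me a [Chinese](cuisine) place in the [north](location)
- any [Italian](cuisine) restaurants near the [station](location)?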
Using NLU Only
If you want to use Rasa only as an NLU component, you can!
Training NLU-only models
To train an NLU model only, run:
rasa train nlu
This will look for NLU training data files in the data/ directory and save a trained model in the models/ directory. The name of the model file will start with nlu-.
To try out your NLU model on the command line, use the rasa shell nlu command:
rasa shell nlu
This will start the rasa shell and ask you to type in a message to test. You can keep typing in as many messages as you like.
Alternatively, you can leave out the nlu argument and pass in an nlu-only model directly:
rasa shell -m models/nlu-20190515-144445.tar.gz
Running an NLU server
To start a server with your NLU model, pass in the model name at runtime:
rasa run --enable-api -m models/nlu-20190515-144445.tar.gz
You can then request predictions from your model by sending a POST request to the /model/parse endpoint of the running server.
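For example, assuming the server above is listening on Rasa's default port 5005 and that the requests package is installed, a request from Python might look like the sketch below (the message text "hello" is just an illustration):
import requests  # third-party HTTP client, installed separately

# Ask the running Rasa server to parse one message; /model/parse is the
# endpoint mentioned above, and 5005 is assumed to be the default port.
response = requests.post(
    "http://localhost:5005/model/parse",
    json={"text": "hello"},
)
print(response.json())  # structured result: intent, confidence, entities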
The first step is to create a new Rasa project. To do this, run:
rasa init --no-prompt
The rasa init command creates all the files that a Rasa project needs and trains a simple bot on some sample data. If you leave out the --no-prompt flag, you will be asked some questions about how you want your project to be set up.
This creates the following files:
__init__.py                  an empty file that helps python find your actions
actions.py                   code for your custom actions
config.yml *                 configuration of your NLU and Core models (a rough sketch follows this list)
credentials.yml              details for connecting to other services
data/nlu.md *                your NLU training data
data/stories.md *            your stories
domain.yml *                 your assistant's domain
endpoints.yml                details for connecting to channels like fb messenger
models/<timestamp>.tar.gz    your initial model

The most important files are marked with a *. You will learn about all of these in this tutorial.
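As a first taste of one of these files, the config.yml generated for a Rasa 1.x project looked roughly like the sketch below; the exact pipeline and policies depend on your Rasa version, so treat it as illustrative rather than exact:
# config.yml (illustrative; defaults vary between Rasa versions)
language: en                      # language of the NLU training data
pipeline: supervised_embeddings   # NLU pipeline used to train the model

policies:                         # Core dialogue policies
  - name: MemoizationPolicy
  - name: KerasPolicy
  - name: MappingPolicy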
tf.pack(values, axis=0, name="pack")
Packs a list of rank-R tensors along the axis dimension into one rank-(R+1) tensor.
# 'x' is [1, 4]
# 'y' is [2, 5]
# 'z' is [3, 6]
pack([x, y, z]) => [[1, 4], [2, 5], [3, 6]] # Pack along first dim.
pack([x, y, z], axis=1) => [[1, 2, 3], [4, 5, 6]]
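The snippet below is a runnable sketch of the same behaviour. It assumes TensorFlow 1.x (to match the Session-style code later in this post), where tf.pack had already been renamed to tf.stack; on a pre-1.0 install, use tf.pack instead.
import tensorflow as tf  # assumes TensorFlow 1.x

x = tf.constant([1, 4])
y = tf.constant([2, 5])
z = tf.constant([3, 6])

# Stack three rank-1 tensors into one rank-2 tensor.
along_axis0 = tf.stack([x, y, z])          # [[1, 4], [2, 5], [3, 6]]
along_axis1 = tf.stack([x, y, z], axis=1)  # [[1, 2, 3], [4, 5, 6]]

with tf.Session() as sess:
    print(sess.run(along_axis0))
    print(sess.run(along_axis1))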
tf.matmul(a, b) multiplies matrix a by matrix b. The inputs must be two-dimensional matrices, with matching inner dimensions, possibly after transposition.
Both matrices must be of the same type. The supported types are: float, double, int32, complex64.
Either matrix can be transposed on the fly by setting the corresponding flag to True. This is False by default.
If one or both of the matrices contain a lot of zeros, a more efficient multiplication algorithm can be used by setting the corresponding a_is_sparse or b_is_sparse flag to True. These are False by default.
# -*- coding: utf-8 -*-
import tensorflow as tf

# a is a 2x2 matrix, b is a 2x1 matrix and c is a scalar bias variable.
a = [[1., 2.], [3., 4.]]
b = [[1.], [2.]]
c = tf.Variable(2.1, name="bias")

with tf.Session() as sess:
    # Variables must be initialized before they can be read.
    tf.initialize_all_variables().run()
    print(sess.run(tf.to_float(a)))                    # a as a float tensor
    print(sess.run(tf.to_float(b)))                    # b as a float tensor
    print(sess.run(tf.matmul(a, b)))                   # matrix product, shape (2, 1)
    print(sess.run(tf.matmul(a, b) + tf.to_float(c)))  # product plus the scalar bias
    # Sum of squared differences between the plain and the biased product;
    # every element differs by c, so this evaluates to 2 * c**2.
    print(sess.run(tf.reduce_sum(tf.squared_difference(
        tf.matmul(a, b), tf.matmul(a, b) + tf.to_float(c)))))
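The transposition flags described above can be exercised in the same Session style. This is a small sketch, again assuming TensorFlow 1.x, with shapes chosen purely for illustration:
import tensorflow as tf  # assumes TensorFlow 1.x

a = tf.constant([[1., 2.], [3., 4.]])            # shape (2, 2)
b = tf.constant([[1., 2.], [3., 4.], [5., 6.]])  # shape (3, 2)

# b is transposed on the fly to shape (2, 3) so that the inner
# dimensions match; the result has shape (2, 3).
product = tf.matmul(a, b, transpose_b=True)

with tf.Session() as sess:
    print(sess.run(product))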