Have you ever been stuck at work while a pulsating cricket match was going on? You need to meet a deadline but you just can't concentrate because your favorite team is locked in a fierce battle for a playoff spot. Sound familiar?
I've been in this situation a lot in my professional career, and checking my phone every 5 minutes was not really an option! Being a data scientist, I looked at this challenge through the lens of an NLP enthusiast. Building a chatbot that could fetch me the scores from the ongoing IPL (Indian Premier League) tournament would be a lifesaver.
So I did just that! Using the awesome Rasa stack for NLP, I built a chatbot that I could use on my computer anytime. No more looking down at the phone and getting distracted.
And the cherry on top? I deployed the chatbot to Slack, the de facto platform for team communication. That's right – I could check the score anytime without having to visit any external site. Sounds like too good an opportunity to pass up, right?
In this article, I will guide you on how to build your own Rasa chatbot in minutes and deploy it in Slack. With the ICC Cricket World Cup around the corner, this is a great time to get your chatbot game on and feed your passion for cricket without risking your job.
The Rasa Stack is a set of open-source NLP tools focused primarily on chatbots. In fact, it’s one of the most effective and time efficient tools to build complex chatbots in minutes. Below are three reasons why I love using the Rasa Stack:
These features differentiate Rasa from other chatbot building platforms, such as Google’s DialogFlow. Here’s a sneak peek into the chatbot we’ll soon be building:
Let’s understand how our Rasa powered IPL chatbot will work before we get into the coding part. Understanding the architecture of the chatbot will go a long way in helping us tweak the final model.
There are various approaches we can take to build this chatbot. How about simply using the quickest and most efficient method? Check out a high-level overview of our IPL chatbot below:
Let's break down this architecture (keep referring to the image above as you go): the user's message first goes to Rasa NLU, which identifies the intent and extracts any entities; Rasa Core then looks at the conversation so far and decides which action to take next; if that action is a custom one (like fetching match updates), it is executed by the action server, which calls CricAPI and sends the result back to the user.
I have created two versions of the project on GitHub: a practice_version with skeleton files for you to fill in as you follow along, and a complete_version with the full working code for reference.
So, go ahead and clone the ‘Practice Version’ project from GitHub:
git clone https://github.com/mohdsanadzakirizvi/iplbot.git && cd iplbot
And cd into the practice_version:
cd practice_version
A quick note on a couple of things you should be aware of before proceeding further:
conda create -n rasa python=3.6
conda activate rasa
You can use the code below to install all the dependencies of the Rasa Stack:
pip install -r requirements.txt
This step might take a few minutes because there are quite a few files to install. You will also need to install a spaCy English language model:
python -m spacy download en
Let’s move on!
The first thing we want to do is figure out the intent of the user. What does he or she want to accomplish? Let’s utilize Rasa and build an NLU model to identify user intent and its related entities.
Look into the practice_version folder you downloaded earlier:
The two files we will be using are highlighted above: data/nlu_data.md (the NLU training data) and nlu_config.yml (the NLU model configuration).
As you can see, the format of training data for 'intent' is quite simple in Rasa. You just have to give each intent a name (like ## intent:current_matches) and list example sentences under it, one per line.
Let's write some intent examples for the scenario where the user wants to get IPL updates:
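Here is a minimal sketch of what these examples in data/nlu_data.md could look like. The exact phrasings below are illustrative; the intent names greet and current_matches are the ones used later in the stories:

## intent:greet
- hi
- hey there
- hello

## intent:current_matches
- what is happening in the IPL?
- which cricket match is going on right now?
- any IPL updates?
- tell me the latest scores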
You can include as many examples as you want for each intent. In fact, make sure to include slangs and short forms that you use while texting. The idea is to make the chatbot understand the way we type text. Feel free to refer to the complete version where I have given plenty of examples for each intent type.
Let’s choose the former as it suits our example:
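Assuming we go with the pre-built spaCy pipeline (which is why we downloaded the spaCy English model earlier), the nlu_config.yml can be as small as this sketch. Depending on the Rasa NLU version installed by requirements.txt, the pipeline template may be named spacy_sklearn or pretrained_embeddings_spacy; check the file in the complete_version for the exact configuration used in the project:

language: "en"
pipeline: "spacy_sklearn"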
If you have made it this far, you have already done most of the work for the intent extraction model. Let’s train it and see it in action!
You can train the classifier by simply following the command below:
make train-nlu
Using Windows? You can run the full Python command instead:
python -m rasa_nlu.train -c nlu_config.yml --data data/nlu_data.md -o models --fixed_model_name nlu --project current --verbose
Let's test how well our model performs by giving it a sample text it hasn't been trained on. You can open an iPython/Python shell and run the following:
>>> from rasa_nlu.model import Interpreter
>>> nlu_model = Interpreter.load('./models/current/nlu')
>>> nlu_model.parse('what is happening in the cricket world these days?')
Here is what the output looks like:
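If you can't run it right away, the parsed result is a dictionary along these lines. The confidence values below are made up for illustration and yours will differ:

{
  'intent': {'name': 'current_matches', 'confidence': 0.82},
  'entities': [],
  'intent_ranking': [
    {'name': 'current_matches', 'confidence': 0.82},
    {'name': 'greet', 'confidence': 0.07},
    ...
  ],
  'text': 'what is happening in the cricket world these days?'
}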
Not only does our NLU model perform well on intent extraction, but it also ranks the other intents based on their confidence scores. This is a nifty little feature that can be really useful when the classifier is confused between multiple intents.
One of the most important aspects of a chatbot application is its ability to be interactive. Think back to a chatbot you've used before. Our interest is naturally piqued if the chatbot can hold a conversation, right?
The chatbot is expected to extract all the necessary information needed to perform a particular task using the back and forth conversation it has with the end user.
Take a moment to think of the simplest conversation our chatbot can have with a user. What would be the flow of such a conversation? Let’s write it in the form of a story!
Me: Hi
Iplbot: Hey! How may I help you?
Me: What was the result of the last match?
Iplbot: Here is some IPL quick info:
1. The match between Rajasthan Royals and Delhi Capitals was recently held and Delhi Capitals won.
2. The next match is Warriors vs Titans on 22 April 2019
Iplbot: Did that help you?
Me: yes, thank you!
Iplbot: Glad that I could help! :)
Let’s see how we can teach a simple conversation like that to Rasa:
The general format is:
## news path 1           <--- story name for debugging purposes
* greet                  <--- intent detected from the user
  - utter_greet          <--- what action the bot should take
* current_matches        <--- the following intent in the conversation
This is called a user story path. I have provided a few stories in the data/stories.md file for your reference. This is the training data for Rasa Core.
The way it works is: Rasa Core looks at the sequence of intents and actions in each story and learns to predict the most likely next action, given the conversation so far.
Check out the data/stories.md file in the complete_version of the project for more such examples. Meanwhile, here is a nice visualization of the basic story paths generated by Rasa for our IPL chatbot:
The above illustration might look complicated, but it’s simply listing out various possible user stories that I have taught Rasa. Here are a few things to note from the above graph:
Write the following in your stories.md file:
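Here is a sketch of what a couple of these stories could look like, modeled on the conversation shown earlier. The intent names (greet, current_matches, affirm) and the custom action action_match_news are the ones used in this project; the utter_* names are illustrative, so make sure whatever you write here matches your domain.yml and the stories in the complete_version:

## news path 1
* greet
  - utter_greet
* current_matches
  - action_match_news
  - utter_did_that_help
* affirm
  - utter_gratitude
  - utter_goodbye

## news path 2
* current_matches
  - action_match_news
  - utter_did_that_help
* affirm
  - utter_gratitude
  - utter_goodbye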
Now, generate a similar graph for your stories using the following command:
python -m rasa_core.visualize -d domain.yml -s data/stories.md -o graph.html
This is very helpful when debugging the conversational flow of the chatbot.
Now, open up the domain.yml file. You will be familiar with most of the features mentioned here:
The domain is the world of your chatbot. It contains everything the chatbot should know, including: the intents and entities it can understand, the actions it can perform (both plain utterances and custom actions), the slots it can keep track of, and the response templates it can use to reply.
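To make that concrete, here is a trimmed-down sketch of what a domain.yml for this bot could contain. The utter_* names are illustrative; action_match_news is the custom action we will implement later, and the actual file in the project is more complete:

intents:
  - greet
  - current_matches
  - affirm
  - goodbye

actions:
  - utter_greet
  - utter_did_that_help
  - utter_gratitude
  - utter_goodbye
  - action_match_news

templates:
  utter_greet:
    - text: "Hey! How may I help you?"
  utter_did_that_help:
    - text: "Did that help you?"
  utter_gratitude:
    - text: "Glad that I could help! :)"
  utter_goodbye:
    - text: "Talk to you later!"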
Rasa Core generates the training data for the conversational part using the stories we provide. It also lets you define a set of policies to use when deciding the next action of the chatbot. These policies are defined in the policies.yml file.
So, open that file and copy the following code:
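The exact configuration ships with the project files; as a representative sketch, a policies.yml built from the policies Rasa Core provides out of the box could look like this (the choice of policies here is an assumption, so cross-check it against the file in the complete_version):

policies:
  - name: MemoizationPolicy
  - name: KerasPolicy
  - name: FallbackPolicy
    fallback_action_name: 'action_default_fallback'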
Here are a few things to note about the above policies (taken from Rasa Core's policy documentation):
You can train the model using the following command:
make train-core
Or if you are on Windows, you can use the full Python command:
python -m rasa_core.train -d domain.yml -s data/stories.md -o models/current/dialogue -c policies.yml
This will train the Rasa Core model and we can start chatting with the bot right away!
Before we proceed further, let’s try talking to our chatbot and see how it performs. Open a new terminal and type the following command:
make cmdline
Once it loads up, try having a conversation with your chatbot. You can start by saying “Hi”. The following video shows my interaction with the chatbot:
I got an error message when trying to get IPL updates:
Encountered an exception while running action 'action_match_news'. Bot will continue, but the actions events are lost. Make sure to fix the exception in your custom code.
The chatbot understood my intent to get news about the IPL. So what went wrong? It’s simple – we still haven’t written the backend code for that! So, let’s build up the backend next.
We will use the CricAPI for fetching IPL related news. It is free for 100 requests per day, which (I hope) is more than enough to satiate that cricket-crazy passion you have.
You need to first signup on the website to get access to their API:
https://www.cricapi.com/
You should be able to see your API Key once you are logged in:
Save this key as it will be really important for our chatbot. Next, open your actions.py file and update it with the following code:
from datetime import datetime

import requests
from rasa_core_sdk import Action

API_URL = "https://cricapi.com/api/"
API_KEY = ""  # paste the API key you got from CricAPI here


class ApiAction(Action):
    # Custom action that fetches the most recent and the upcoming IPL match from CricAPI

    def name(self):
        # This name must match the action listed in domain.yml and used in the stories
        return "action_match_news"

    def run(self, dispatcher, tracker, domain):
        # Call the "matches" endpoint of CricAPI
        res = requests.get(API_URL + "matches" + "?apikey=" + API_KEY)
        if res.status_code == 200:
            data = res.json()["matches"]
            recent_match = data[0]
            upcoming_match = data[1]
            upcoming_match["date"] = datetime.strptime(upcoming_match["date"], "%Y-%m-%dT%H:%M:%S.%fZ")
            next_date = upcoming_match["date"].strftime("%d %B %Y")

            out_message = "Here is some IPL quick info:\n1. The match between {} and {} was recently held and {} won.".format(
                recent_match["team-1"], recent_match["team-2"], recent_match["winner_team"])
            dispatcher.utter_message(out_message)

            out_message = "2. The next match is {} vs {} on {}".format(
                upcoming_match["team-1"], upcoming_match["team-2"], next_date)
            dispatcher.utter_message(out_message)

        return []
Fill in the API_KEY with the one you got from CricAPI and you should be good to go. Now, you can again try talking to your chatbot. This time, be prepared to be amazed.
Open a new terminal and start your action server:
make action-server
This will activate the server that is running on the actions.py file and will be working in the background for us. Now, restart the chatbot in the command line:
make cmdline
And this time, it should give you some IPL news when asked. Isn’t that awesome? We have already built a complete chatbot without doing any complex steps!
So we have the chatbot ready. It’s time to deploy it and integrate it into Slack as I promised at the start of this article. Fortunately for us, Rasa handles 90% of the deployment part on its own.
Note: You need to have a workspace in Slack before proceeding further. If you do not have one, then you can refer to this.
Now that we have a workspace to experiment with, we need an application to attach our bot to. Create the app at the link below:
https://api.slack.com/apps
1. Click on “Create App”, give a name to the app, and select your workspace:
This will redirect you to your app dashboard. From there, you can select the “Bots” option:
2. Click “Add a Bot User” –> Give a name to your bot. In my case, I have named it “iplbot”. Now, we need to add it to our workspace so we can chat with it! Go back to the above app dashboard and scroll down to find the “Install App to Workspace” option:
Once you do that, Slack will ask you to “authorize” the application. Go ahead and accept the authorization.
3. Before we are able to connect any external program to our Slack bot, we need to have an “auth token” that we need to provide when trying to connect with it. Go back to the “app dashboard” and select the “OAuth & Permissions” option:
4. This will open the permissions settings of the app. Select the "Bot User OAuth Access Token" and save it (I have hidden mine for security reasons). This token is instrumental in connecting to our chatbot.
Our work isn’t over yet. We need another useful tool to deploy our chatbot to Slack. That’s ngrok and you can use the following link to download it:
https://ngrok.com/download
We are now one step away from deploying our own chatbot! Exciting times await us in the next section.
We need only five commands to get this done as Rasa takes care of everything else behind the scenes.
make action-server
ngrok http 5055
This will give an output like the below image:
The highlighted link is the internet-facing URL that is connected to your computer's port 5055. This is what ngrok does: it exposes programs running locally on your computer to the internet. In a way, it's a quick alternative to deploying your app on a cloud service.
Note down the forwarding URL ngrok gives you for port 5055; your action server's webhook is now reachable at: your_ngrok_url:5055/webhook
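This URL goes into your endpoints.yml file (so Rasa Core knows where the action server lives), and the Slack token you saved earlier goes into slack_credentials.yml; both files are passed to the run command in the next step. Here are minimal sketches assuming the standard Rasa Core formats for the action endpoint and the Slack connector (replace the placeholder values with your own):

# endpoints.yml
action_endpoint:
  url: "your_ngrok_url/webhook"   # the ngrok forwarding URL for port 5055

# slack_credentials.yml
slack:
  slack_token: "xoxb-your-bot-user-oauth-access-token"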
python -m rasa_core.run -d models/current/dialogue -u models/current/nlu --port 5002 --connector slack --credentials slack_credentials.yml --endpoints endpoints.yml
You will get a message like this:
Notice that the Rasa Core server is running at port 5002.
ngrok http 5002
Finally, go back to your Slack app dashboard, open the "Event Subscriptions" section, enable events, and set the Request URL using your ngrok URL for port 5002 in the following format:
your_ngrok_url/webhooks/slack/webhook
Once you’ve added the events, click the Save Changes button at the bottom of the screen.
Now you can just refresh your Slack page and start chatting right away with your bot! Here’s a conversation with my chatbot:
You'll find the below links useful if you are looking for similar challenges. I have built a Zomato-like chatbot for a restaurant search problem using both the Rasa Core and Rasa NLU models. I teach this in much more detail in our course on Natural Language Processing.
The links to the course are below for your reference:
I would love to see different approaches and techniques from our community. Try to use different pipelines in Rasa Core, explore more Policies, fine-tune those models, check out what other features CricAPI provides, etc. There are so many things you can try! Don’t stop yourself here – go on and experiment.
Feel free to discuss and provide your feedback in the comments section below. The full code for my project is available here.
You should also check out these two articles on building chatbots:
Thanks, Sanad! That's really a great article which actually includes all the information about the package and the application deployment as well.
Hey Dheeraj, glad that you liked it! My goal was to cover everything that goes into building a chatbot end to end, including how one would deploy something like that.
Hi Rizvi, nice article! How about the chat responding with the ongoing match score?
Thanks Rahul, you can actually do that. I encourage you to go ahead and explore CricAPI's other endpoints. They even provide a detailed summary of every match, player stats, and what not!
Hi Sanad, I was going through the code in all the files and wondering about the conversation flow between * current_matches and - action_match_news. I observed that action_match_news is missing from the templates in domain.yml, so how does the flow know that the file named actions.py needs to be called for "action_match_news"?
Hey Rahul, 1. action_match_news is actually present in the domain.yml file, under actions (it doesn't need to be under templates, since it isn't a plain text response). 2. It is read by Rasa Core, and since it's a custom action (one that is not built into Rasa Core by default), Rasa looks for it in the actions.py file served by the action server. This is the behavior of Rasa Core itself. 3. You can define multiple custom actions like these in the actions.py file, add them to domain.yml, and Rasa will pick them up 🙂