In this blog, you will learn,

  • How to create a docker compose file
  • How to add multiple services to the docker-compose file
  • How to run multiple services together with docker-compose in just one command

In the previous blog, you learned how to link the action server to the Rasa server and run both services together by creating a network with Docker. In this blog, you will learn how to run multiple services (each an independent Docker container) together with just a single command.

Create a docker-compose file

In the previous blog, I showed how you can create a Dockerfile and update the commands with respect to the Docker images. Now it's time to use each independent Docker container as a separate service that can be added to the docker-compose file. Let's create a docker-compose file and add the services to it. Go to the project directory and create a new file with the name docker-compose.yml

nano docker-compose.yml

Now add the below code to the file,

version: '3.4'
services:
  rasa:
    image: rasa/rasa:1.10.8-full
    ports:
      - 5005:5005
    volumes:
      - ./:/app
    command:
      - run
  action-server-test:
    image: rasa/rasa-sdk:1.10.2
    volumes:
      - ./actions:/app/actions
    ports:
      - 5055:5055
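Any additional container can be added as one more service in the same file. As a purely hypothetical illustration (not required for this setup), Rasa's Duckling entity extractor, whose image is published on Docker Hub, could be added under services like this:

```yaml
# Hypothetical extra service: goes under the existing "services:" key
# in docker-compose.yml. Duckling listens on port 8000 by default.
  duckling:
    image: rasa/duckling:latest
    ports:
      - 8000:8000
```

Every service added this way is started and stopped together with the rest by the same docker-compose commands.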

In the above docker-compose file, you can see that I have set two top-level keys: one is version and the other is services. Under services there are two services, and each service name is linked to an independent Docker image: "rasa" is linked to "rasa/rasa:1.10.8-full" and "action-server-test" is linked to "rasa/rasa-sdk:1.10.2". These images have been exposed on unique ports, 5005 and 5055 respectively.

Also, to link one service to another, you can use the service name that you mentioned in the docker-compose file. To link the action server to the Rasa server, update the action_endpoint URL with the given service name, as we did in the previous blog.

 url: "http://action-server-test:5055/webhook"
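For reference, this URL sits under the action_endpoint key in endpoints.yml. The hostname works because docker-compose puts both services on a shared default network, where each service name resolves via Docker's internal DNS:

```yaml
# endpoints.yml
# "action-server-test" is the service name from docker-compose.yml;
# Docker's internal DNS resolves it to the action server container.
action_endpoint:
  url: "http://action-server-test:5055/webhook"
```
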

Once you are done with this, train your model (if you have made any changes to the Rasa chatbot) with the given command,

For Ubuntu/Mac:
docker run --user 1000 -v $(pwd):/app rasa/rasa:1.10.8-full train --domain domain.yml --data data --out models

For Windows:
docker run --user 1000 -v %cd%:/app rasa/rasa:1.10.8-full train --domain domain.yml --data data --out models

Once you have the latest trained model, you can run all the services that you added to the docker-compose file with a single command in your project directory,

docker-compose up

Also, if you want to run all the services in the background (detached mode), then use the given command,

docker-compose up -d

The above command will run all the services at once, which means you can access any Docker container you mentioned in the docker-compose file. Now that you have all the services running (the Rasa server and the action server) and linked together, it's time to verify the connection and the working of the Rasa chatbot. Run the given command to verify the working of the chatbot with docker-compose,

curl -XPOST http://localhost:5005/webhooks/rest/webhook \
-H "Content-type: application/json" \
-d '{"sender": "test", "message": "hello"}'
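If you prefer to script this check, here is a sketch of the same request in Python using only the standard library. It assumes the rasa service from the compose file above is listening on localhost:5005; the server is only contacted when you actually call the function:

```python
import json
import urllib.request

# Python equivalent of the curl command above: POST a message to the
# Rasa REST channel and return the parsed JSON reply.
def ask_bot(message, sender="test",
            url="http://localhost:5005/webhooks/rest/webhook"):
    payload = json.dumps({"sender": sender, "message": message}).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The REST channel replies with a list of messages, each
        # typically containing "recipient_id" and "text" keys.
        return json.loads(resp.read().decode("utf-8"))

# Example (requires the services to be up):
# print(ask_bot("hello"))
```
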

Once you run the given command, your bot will reply based on the Docker services that you connected.


Also, in the future, if you update your project and want to verify the changes, you have to rerun all the services for the changes to take effect. For that, run the given commands to stop all the services and start them again.

docker-compose down   # stop and remove the running containers
docker-compose up     # recreate and start them again

This is all about HOW TO RUN MULTIPLE SERVICES WITH DOCKER COMPOSE. I hope you now have a crystal-clear understanding of it. If you are still facing any difficulties in understanding or implementing it, feel free to leave a comment below in the comments section.

For more understanding and clarification on the Rasa chatbot, you can check out the official Rasa website and Docker Hub. You can also check these videos on deploying a Rasa chatbot on a live server with Google Cloud Platform and linking it to a domain name.

Stay Tuned and Happy Learning. 🙂

