A simple chat architecture for your MVP


When we think about developing an MVP, the goal is to deliver real value in a short time, avoiding unnecessary complexity and solving problems in the simplest way possible.

Then, on a beautiful sunny summer day, the Project Manager comes up and says:

“We need to create an MVP for a chat system in just one month!”

It’s chaos! It sounds far too complex for only a month of work, and some questions immediately emerge:

  • What are the requirements?
  • How do we estimate the effort?
  • How do we build a simple chat app that still adds value?

After brainstorming a bit, the developers started looking at open-source alternatives, like jsxc and pyxmpp, that could fit the business model and the project’s needs.

None of the alternatives we found was a perfect fit: the team would have had to spend a lot of time learning a new library whose features would mostly go unused in our MVP.

After extensive research and no solution, a voice from heaven says:

“Why not create your own architecture?”

It’s just an MVP! We need to validate the product before we think about scalability, right? That was when we decided to build our own architecture on top of WebSockets. Here’s the breakdown of what we came up with.

The Project

Our starting point was a basic architecture: a Back-end built with Python and Django on top of a PostgreSQL database, and ReactJS with Redux on the Front-end. With baby steps in mind, we began with a chat system without any real-time behavior, in other words, just a CRUD of messages. After modeling the database, we ended up with only two tables: Chat and Message. Simple!

Database Models for Chat and Messages
Database Models
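
Just to make this concrete, here is roughly how those two records look from the Front-end’s point of view. The field names below are an assumption for illustration, not the exact Django schema:

```typescript
// Hypothetical shape of the two tables as the Front-end receives them from the API.
// Field names are illustrative; the real Django models may differ.
interface Chat {
  id: number;
  participants: number[]; // ids of the two users in the conversation
  created_at: string;
}

interface Message {
  id: number;
  chat: number;   // foreign key to Chat
  sender: number; // id of the user who sent the message
  text: string;
  created_at: string;
}
```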

With the database ready, we created all the endpoints needed to communicate with the Front-end. Our API turned out to be very simple, with just two endpoints, /chats and /chats/:idChat/messages, and the following HTTP actions (a client-side sketch follows the list):

  • POST: /api/v1/chats/ — Creates a new chat between two users.
  • GET: /api/v1/chats/ — Lists all chats the logged-in user belongs to.
  • POST: /api/v1/chats/:idChat/messages/ — Adds a new message to a specific chat.
  • GET: /api/v1/chats/:idChat/messages/ — Lists all interactions between users on a specific chat.
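
Consuming these endpoints from the Front-end is nothing more than plain HTTP. Here is a minimal sketch of two of them; the paths match the API above, while the request body and response fields are assumptions:

```typescript
// Minimal sketch of how the Front-end could call two of these endpoints.
const API = "/api/v1";

// POST /api/v1/chats/ — create a new chat between the logged-in user and another user.
async function createChat(otherUserId: number): Promise<unknown> {
  const response = await fetch(`${API}/chats/`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ participant: otherUserId }), // payload field is an assumption
  });
  return response.json();
}

// GET /api/v1/chats/:idChat/messages/ — list all messages of a specific chat.
async function getMessages(chatId: number): Promise<unknown[]> {
  const response = await fetch(`${API}/chats/${chatId}/messages/`);
  return response.json();
}
```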

On the Front-end, using Redux, we created two stores: one for the chats and another for the messages, mirroring exactly what was received from the API. To keep the reducers in sync with the API, four actions were necessary (sketched after the list):

  • CREATE_CHAT: sends a POST request to /chats/ and merges the new chat into the store.
  • GET_ALL_CHATS: sends a GET request to /chats/ and adds the returned chats to the store.
  • SEND_MESSAGE: sends a POST request to /chats/:idChat/messages/ and adds the new message to the store.
  • GET_ALL_MESSAGES: sends a GET request to /chats/:idChat/messages/ and adds the returned messages to the store.
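
As a rough sketch, here is one way the message side of this could look. The action names come from the list above; the state shape and field names are assumptions, and the chat store follows the same pattern:

```typescript
// Sketch of the message store: the two message actions plus a plain reducer.
// Types are re-declared here to keep the sketch self-contained.
type Message = { id: number; chat: number; sender: number; text: string };

type MessageAction =
  | { type: "SEND_MESSAGE"; payload: Message }        // message returned by POST /chats/:idChat/messages/
  | { type: "GET_ALL_MESSAGES"; payload: Message[] }; // messages returned by GET /chats/:idChat/messages/

interface MessageState {
  byChat: Record<number, Message[]>; // messages grouped by chat id
}

const initialState: MessageState = { byChat: {} };

function messagesReducer(state = initialState, action: MessageAction): MessageState {
  switch (action.type) {
    case "SEND_MESSAGE": {
      const existing = state.byChat[action.payload.chat] ?? [];
      return { byChat: { ...state.byChat, [action.payload.chat]: [...existing, action.payload] } };
    }
    case "GET_ALL_MESSAGES": {
      const [first] = action.payload;
      if (!first) return state;
      return { byChat: { ...state.byChat, [first.chat]: action.payload } };
    }
    default:
      return state;
  }
}
```
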
Diagram showing the communication between database, webserver and the user.
Simple Chat Architecture

The image above represents the implementation after the first phase of the project. At this point, two users could already hold a conversation. However, they still had to refresh the page to receive new messages.

Thinking about real-time

To evolve the application into a real-time architecture, we needed to add two key pieces: a publish/subscribe (Pub/Sub) mechanism and a WebSocket server.

For Pub/Sub, we chose Redis – an open source (BSD licensed), in-memory data structure store used as a database, cache and message broker – as our team was already very familiar with it. This piece is central to the new architecture: it allows messages to be published almost instantaneously without hurting the performance of our web service.

To listen to the Redis publications and expose a WebSocket server, we decided to build a microservice with NodeJS. When it starts, it creates a WebSocket server using the socket.io library and subscribes to Redis as a listener.

The microservice works like a message router. When it receives a message from Redis, it checks whether the recipient is connected via WebSocket and, if so, forwards the content. Messages addressed to users who are not connected are simply discarded; those users will receive their messages through regular HTTP requests when they open the chat again.
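
To give an idea of how small this service is, here is a minimal sketch assuming node-redis v4 and socket.io; the channel name, the userId query parameter and the message fields are assumptions:

```typescript
// Sketch of the NodeJS microservice: a socket.io server that relays Redis publications.
import { createServer } from "http";
import { Server } from "socket.io";
import { createClient } from "redis";

const httpServer = createServer();
const io = new Server(httpServer, { cors: { origin: "*" } });

// Map of connected users: userId -> socket id.
const connectedUsers = new Map<string, string>();

io.on("connection", (socket) => {
  const userId = String(socket.handshake.query.userId);
  connectedUsers.set(userId, socket.id);
  socket.on("disconnect", () => connectedUsers.delete(userId));
});

async function main() {
  const subscriber = createClient(); // defaults to redis://localhost:6379
  await subscriber.connect();

  // The Django back-end publishes every saved message on this channel (name is an assumption).
  await subscriber.subscribe("chat:messages", (raw) => {
    const message = JSON.parse(raw); // e.g. { chat, sender, recipient, text }
    const socketId = connectedUsers.get(String(message.recipient));
    if (socketId) {
      // Recipient is online: forward the message over the WebSocket.
      io.to(socketId).emit("new_message", message);
    }
    // Offline recipients are ignored here; they will load missed
    // messages through the regular HTTP endpoints when they log in.
  });

  httpServer.listen(3001);
}

main().catch(console.error);
```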

After adding these two pieces, our architecture looks like this:

Diagram showing the communication between the database, Redis, WebSocket, webserver and the user.
Real-Time Chat Architecture

Now it was time to fix the page-refresh issue and make conversations more fluid. For this, we needed bidirectional communication between the server and the client. With our architecture in place, we just had to add the socket.io library on the Front-end and connect it to the microservice. This gave us a persistent connection between client and server, allowing both sides to send and receive messages at any moment.

On the Back-end side, we added two things: the connection with Redis and the publication of each message right after it is saved in the database. On the Front-end, we just created a new action called ADD_MESSAGE_FROM_SOCKET and a <WebSocket /> component.

The action has the simple responsibility of adding received messages to the correct store. The component is in charge of opening and closing the WebSocket connection and listening for incoming events. When a new message arrives via the WebSocket, it just dispatches ADD_MESSAGE_FROM_SOCKET.
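
Here is a sketch of what such a component could look like using socket.io-client, react-redux and hooks; the socket URL, event name and props are assumptions:

```typescript
// Sketch of the <WebSocket /> component: opens the socket connection on mount,
// dispatches ADD_MESSAGE_FROM_SOCKET for every incoming message, and closes on unmount.
import { useEffect } from "react";
import { useDispatch } from "react-redux";
import { io } from "socket.io-client";

interface WebSocketProps {
  userId: number; // used by the microservice to route messages to this client
}

export function WebSocket({ userId }: WebSocketProps) {
  const dispatch = useDispatch();

  useEffect(() => {
    const socket = io("http://localhost:3001", { query: { userId } });

    socket.on("new_message", (message) => {
      // Hand the message to Redux; the reducer puts it in the right store.
      dispatch({ type: "ADD_MESSAGE_FROM_SOCKET", payload: message });
    });

    // Close the connection when the component unmounts.
    return () => {
      socket.disconnect();
    };
  }, [dispatch, userId]);

  // The component renders nothing; it only manages the socket connection.
  return null;
}
```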

Some important decisions were made:

  • The connection between the WebSocket server and the client should not be the responsibility of the web server.
  • Messages should be saved in PostgreSQL and then published to Redis.
  • The client would only receive new messages via WebSockets; sending new messages would still happen via HTTP.

Using this simple approach, we were able to turn our CRUD-only chat into fluid, real-time conversations.

Wrapping up

Through this solution, we created a simple and maintainable chat application for our MVP and showed that, at this scale, the approach is entirely feasible. However, we have to keep in mind that as the project grows, concerns such as scalability and maintainability will need to be revisited.

About the author.

Daniel Leite

A Front-End developer passionate about open source culture. Trying to help people by sharing code.