By Hugo Vaz and Anish Singh Walia

MQTT brokers are essential for modern IoT infrastructure and automation systems, where a centralized, unified, and fast data hub is key to system interoperability and data exchange. Coreflux is a powerful, low-code MQTT broker that extends the traditional MQTT broker into a system with advanced features for real-time data processing, transformation, and seamless integration with DigitalOcean managed databases, including MongoDB, PostgreSQL, MySQL, and OpenSearch.
In this comprehensive DevOps tutorial, you will deploy a complete IoT automation pipeline using Coreflux MQTT broker integrated with MongoDB on DigitalOcean. This scalable storage and processing solution enables you to collect, transform, and store IoT data efficiently while maintaining enterprise-grade reliability and performance.
Before diving into the step-by-step deployment process, here is what you'll learn: how to deploy the Coreflux MQTT broker on a DigitalOcean Droplet, connect it to a managed MongoDB cluster, and build a low-code data pipeline with the Language of Things (LoT). This tutorial provides a production-ready foundation for IoT applications that need real-time messaging combined with persistent data storage.
By the end of this automation guide, you will have deployed a Coreflux MQTT broker on a DigitalOcean Droplet, a managed MongoDB cluster, and a low-code LoT pipeline that simulates, transforms, and stores IoT data.
Coreflux provides a lightweight MQTT broker and data pipeline tools through the Language of Things programming language for efficient IoT communication on DigitalOcean.
MQTT (Message Queuing Telemetry Transport) is a lightweight, publish-subscribe network protocol widely adopted in IoT ecosystems. Designed for constrained devices and low-bandwidth, high-latency, or unreliable networks, MQTT enables efficient, real-time messaging in bandwidth-constrained environments.
Coreflux offers a lightweight MQTT broker to facilitate efficient, real-time communication between IoT devices and applications, including the real-time data transformation capabilities each use case requires. Built for scalability and reliability, Coreflux is tailored for environments where low latency and high throughput are critical.
Coreflux provides the robust messaging backbone to ensure smooth data flow between devices, whether developing a small-scale IoT project or deploying a large-scale industrial monitoring system.
With Coreflux on DigitalOcean, you get:
Data Processing: Centralize your data processing where your data lives, enabling real-time processing.
Data Integration: Easily integrate with other DigitalOcean services like Managed Databases, ensuring a single and simple ecosystem for all your data needs.
Scalability: Easily handle growing amounts of data and devices without compromising performance.
Reliability: Ensure consistent and dependable messaging across all connected devices.

Before you begin this MQTT broker deployment tutorial, you'll need a DigitalOcean account, an SSH client, Visual Studio Code (for the LoT Notebook extension used later), and optionally MongoDB Compass to verify stored data.
First, you’ll create a Virtual Private Cloud (VPC) to ensure secure communication between your IoT services and MQTT broker, without the need for public access.

Configure your VPC for IoT automation:
Click Create VPC Network
The VPC will provide isolated networking for all your IoT resources, ensuring secure communication between the Coreflux MQTT broker and managed databases.
DigitalOcean managed databases provide automated backups, monitoring, and maintenance, making them ideal for production IoT workloads and scalable storage requirements.

From the DigitalOcean control panel, navigate to Databases
Click Create Database Cluster
Configure your MongoDB cluster for IoT automation:
Choose your plan based on your IoT requirements:
Click Create Database Cluster
The managed database creation process typically takes 1-5 minutes. Once complete, you’ll be redirected to the database overview page, where you can see the connection details and perform administrative actions.
You’ll be prompted with Getting Started steps, where your connection details are shown and you can configure the inbound access rules (recommended to limit to your IP and VPC-only).

For connection details, you’ll be able to see two options: Public Network and VPC Network. The first is for external access for tools like MongoDB Compass, while the second will be used by the Coreflux service to access the database.

You can test the MongoDB connection using MongoDB Compass or the provided connection string, using public access credentials:
mongodb://username:password@mongodb-host:27017/defaultauthdb?ssl=true
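Before pasting the string into Compass, it can help to confirm which part is which. The following sketch uses only the Python standard library to split the placeholder string from above into its components (the hostname, port, and credentials here are the placeholder values, not real ones):

```python
from urllib.parse import urlsplit

# Placeholder connection string following the format shown above;
# replace these values with your cluster's actual details.
uri = "mongodb://username:password@mongodb-host:27017/defaultauthdb?ssl=true"

parts = urlsplit(uri)
print(parts.hostname)            # the managed database host
print(parts.port)                # 27017, MongoDB's default port
print(parts.username)            # the database user
print(parts.path.lstrip("/"))    # the default authentication database
print(parts.query)               # ssl=true -> TLS is required
```

If any component comes out empty or unexpected, re-copy the string from the DigitalOcean connection details panel before troubleshooting further.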

For better security and organization, create a dedicated user and database for your IoT automation application. This can also be done through MongoDB Compass or CLI, but DigitalOcean provides a user-friendly approach:

Configure your droplet for MQTT broker deployment:

Choose Size for your IoT workload:
Choose Authentication Method:
Finalize Details:
Click Create Droplet
Open the Droplet's home page and wait for the deployment to finish.

Using the same approach as for the Coreflux Droplet, select Docker as the Marketplace image.
Once your droplet is running, connect to it via SSH with the defined authentication method or the web console available in the Droplet home page:
ssh root@your-droplet-ip

Run the Coreflux MQTT broker using Docker:
docker run -d \
--name coreflux \
-p 1883:1883 \
-p 1884:1884 \
-p 5000:5000 \
-p 443:443 \
coreflux/coreflux-mqtt-broker-t:1.6.3
This Docker command runs the broker in detached mode (-d), names the container coreflux, and maps the broker's ports to the host, including 1883, the standard MQTT port.
Verify the MQTT broker is running:
docker ps
You should see the coreflux container listed in the output.

You can access the MQTT broker through an MQTT client like MQTT Explorer to validate access to the broker, regardless of the approach taken to deploy it.

For production IoT automation deployments, configure firewall rules to restrict access:
Navigate to Networking → Firewalls
Click Create Firewall
Configure inbound rules for MQTT broker security:
Apply the firewall to your droplet
For detailed firewall configuration, refer to DigitalOcean’s firewall quickstart guide.
The LoT (Language of Things) Notebook extension for Visual Studio Code provides an integrated low code development environment for MQTT broker programming and IoT automation.

Configure the connection to your Coreflux MQTT broker using the default credentials, either when prompted in the top bar or by clicking the MQTT button at the left of the bottom bar:
Assuming no errors, you’ll see the status of the MQTT connectivity to the broker in the bottom bar, on the left.

For this use case, we will build an integration of raw data, through a transformation pipeline, into a database. However, as we are not connected to any MQTT devices in this demo, we will take advantage of LoT's capabilities and use an Action to simulate device data.
In LoT, an Action is an executable logic that is triggered by specific events such as timed intervals, topic updates, or explicit calls from other actions or system components. Actions allow dynamic interaction with MQTT topics, internal variables, and payloads, facilitating complex IoT automation workflows.
As such, we can use an Action that publishes data to certain topics at a defined time interval, which can then be consumed by the rest of the pipeline defined below.
You can download the GitHub repository with the sample project.
Create an Action to generate simulated sensor data using the low code LoT (Language of Things) interface:
DEFINE ACTION RANDOMIZEMachineData
ON EVERY 10 SECONDS DO
PUBLISH TOPIC "raw_data/machine1" WITH RANDOM BETWEEN 0 AND 10
PUBLISH TOPIC "raw_data/station2" WITH RANDOM BETWEEN 0 AND 60
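For comparison, the value generation the Action performs can be mirrored in plain Python. This is only a sketch of the payload logic, not Coreflux's implementation; publishing to the broker would additionally require an MQTT client library, which is omitted here so the example stays self-contained (integer values are assumed for RANDOM BETWEEN):

```python
import random

def simulate_readings():
    """Mirror of the RANDOMIZEMachineData Action's payload generation (sketch)."""
    return {
        "raw_data/machine1": random.randint(0, 10),  # RANDOM BETWEEN 0 AND 10
        "raw_data/station2": random.randint(0, 60),  # RANDOM BETWEEN 0 AND 60
    }

# In the Action, this runs every 10 seconds; here we generate one batch.
for topic, value in simulate_readings().items():
    print(topic, value)
```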
The provided Notebook also includes an Action that uses an incremental counter to simulate data, as an alternative to the Action above.

When you run this Action, it will publish a random value between 0 and 10 to raw_data/machine1 and a random value between 0 and 60 to raw_data/station2 every 10 seconds.
Models in Coreflux are used to transform, aggregate, and compute values from input MQTT topics, publishing the results to new topics. They serve as the foundation for creating your system's Unified Namespace across your various data sources.
This way, a Model allows you to define how raw device data should be structured and transformed, for a single device or for multiple devices simultaneously (through the use of the wildcard +). A Model also serves as the key data schema used for scalable storage in the managed database.
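The single-level wildcard + matches exactly one segment of a topic path. As a sketch of that matching behavior (simplified to the + wildcard only; the full MQTT specification also defines the multi-level # wildcard, omitted here):

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Simplified MQTT topic matching supporting the single-level '+' wildcard."""
    p_parts = pattern.split("/")
    t_parts = topic.split("/")
    # '+' matches exactly one segment, so segment counts must be equal.
    if len(p_parts) != len(t_parts):
        return False
    return all(p == "+" or p == t for p, t in zip(p_parts, t_parts))

print(topic_matches("raw_data/+", "raw_data/machine1"))        # True
print(topic_matches("Simulator/Machine/+/Data",
                    "Simulator/Machine/station2/Data"))        # True
print(topic_matches("raw_data/+", "raw_data/machine1/extra"))  # False
```

This is why a single Model defined on "raw_data/+" applies to both simulated machines at once.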
DEFINE MODEL MachineData WITH TOPIC "Simulator/Machine/+/Data"
ADD "energy" WITH TOPIC "raw_data/+" AS TRIGGER
ADD "energy_wh" WITH (energy * 1000)
ADD "production_status" WITH (IF energy > 5 THEN "active" ELSE "inactive")
ADD "production_count" WITH (IF production_status EQUALS "active" THEN (production_count + 1) ELSE 0)
ADD "stoppage" WITH (IF production_status EQUALS "inactive" THEN 1 ELSE 0)
ADD "maintenance_alert" WITH (IF energy > 50 THEN TRUE ELSE FALSE)
ADD "timestamp" WITH TIMESTAMP "UTC"
This low-code model uses the wildcard + to apply to all machines automatically, replacing + with each topic that matches the format of the trigger/source data. As we generated two simulated sensors/machines with the Action, we can see the Model structure being applied automatically to both, generating both a JSON object and the individual topics.
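The derived fields in the Model are simple expressions over the incoming energy value. Mirrored in Python as a sketch (this is the same arithmetic, not Coreflux's implementation; the stateful production_count is expressed as a function of its previous value):

```python
def transform(energy: float, prev_count: int = 0) -> dict:
    """Mirror of the MachineData model's derived fields (sketch, not Coreflux code)."""
    status = "active" if energy > 5 else "inactive"
    return {
        "energy": energy,
        "energy_wh": energy * 1000,                                  # energy * 1000
        "production_status": status,                                 # IF energy > 5
        "production_count": prev_count + 1 if status == "active" else 0,
        "stoppage": 1 if status == "inactive" else 0,
        "maintenance_alert": energy > 50,                            # IF energy > 50
    }

print(transform(36, prev_count=30))
```

With energy 36 and a previous count of 30, this yields energy_wh 36000, status "active", count 31, and no maintenance alert, matching the sample document stored later in MongoDB.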

Routes define how processed real-time data flows to external systems like managed databases. They are defined with the following low code format:
DEFINE ROUTE mongo_route WITH TYPE MONGODB
ADD MONGODB_CONFIG
WITH CONNECTION_STRING "mongodb+srv://<username>:<password>@<cluster-uri>/<database>?tls=true&authSource=admin&replicaSet=<replica-set>"
WITH DATABASE "admin"
Replace the placeholders with your MongoDB connection details from DigitalOcean and run the Route in your LoT Notebook.
Modify your LoT model to use the database route for scalable storage, by adding this to the end of the Model:
STORE IN "mongo_route"
WITH TABLE "MachineProductionData"
Additionally, add a field derived from the topic, so that each entry in your managed database has a unique device identifier.
DEFINE MODEL MachineData WITH TOPIC "Simulator/Machine/+/Data"
ADD "energy" WITH TOPIC "raw_data/+" AS TRIGGER
ADD "device_name" WITH REPLACE "+" WITH TOPIC POSITION 2 IN "+"
ADD "energy_wh" WITH (energy * 1000)
ADD "production_status" WITH (IF energy > 5 THEN "active" ELSE "inactive")
ADD "production_count" WITH (IF production_status EQUALS "active" THEN (production_count + 1) ELSE 0)
ADD "stoppage" WITH (IF production_status EQUALS "inactive" THEN 1 ELSE 0)
ADD "maintenance_alert" WITH (IF energy > 50 THEN TRUE ELSE FALSE)
ADD "timestamp" WITH TIMESTAMP "UTC"
STORE IN "mongo_route"
WITH TABLE "MachineProductionData"
After you deploy this updated Model, all data will be automatically stored in the database whenever it updates.
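The REPLACE "+" WITH TOPIC POSITION 2 directive pulls a segment out of the matched topic path to use as device_name. The equivalent string operation in Python (assuming zero-based segment indexing, so position 2 of "Simulator/Machine/station2/Data" is "station2", as seen in the stored documents):

```python
def device_name(topic: str, position: int = 2) -> str:
    """Extract one segment from an MQTT topic path (zero-based indexing assumed)."""
    return topic.split("/")[position]

print(device_name("Simulator/Machine/station2/Data"))  # station2
print(device_name("Simulator/Machine/machine1/Data"))  # machine1
```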
Connect to your MongoDB managed database using MongoDB Compass to verify scalable storage:

You should see real-time data documents with structure similar to:
{
  "_id": { "$oid": "68626dc3e8385cbe9a1666c3" },
  "energy": 36,
  "energy_wh": 36000,
  "production_status": "active",
  "production_count": 31,
  "stoppage": 0,
  "maintenance_alert": false,
  "timestamp": "2025-06-30 10:58:11",
  "device_name": "station2"
}
As we’ve seen before, all of the data is available in the MQTT Broker for other uses and integrations.
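Once the documents land in MongoDB, they can be analyzed like any other collection. As a pure-Python sketch (the documents below are hypothetical samples shaped like the one above; a real query would fetch them with a MongoDB client such as pymongo), here is a quick status summary:

```python
from collections import Counter

# Hypothetical sample documents shaped like the MachineProductionData entries.
docs = [
    {"device_name": "station2", "production_status": "active", "energy_wh": 36000},
    {"device_name": "machine1", "production_status": "inactive", "energy_wh": 3000},
    {"device_name": "station2", "production_status": "active", "energy_wh": 41000},
]

# Count how many readings are active vs. inactive across all devices.
status_counts = Counter(d["production_status"] for d in docs)
print(status_counts)  # Counter({'active': 2, 'inactive': 1})
```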

You integrate an MQTT broker with MongoDB by creating a database route in Coreflux that connects to your MongoDB instance. The route uses a connection string with authentication credentials and automatically stores MQTT message payloads into MongoDB collections. Coreflux handles the connection management, allowing you to focus on data transformation logic using the Language of Things (LoT) syntax.
Yes, Coreflux provides a low-code solution through its MongoDB route feature. You define a route with your MongoDB connection string, then add a STORE IN directive to your data models. This automatically persists all processed MQTT messages to MongoDB without writing custom database integration code. The route handles connection pooling, error handling, and retry logic automatically.
MongoDB’s flexible document structure makes it ideal for IoT data because sensor readings often have varying schemas. Unlike relational databases, MongoDB doesn’t require predefined tables, allowing you to store different device types and data formats in the same collection. Combined with MQTT’s lightweight messaging protocol, this creates an efficient pipeline for high-volume, real-time IoT data streams. DigitalOcean’s managed MongoDB adds automated backups, scaling, and monitoring to simplify operations.
When deploying on DigitalOcean, you can use VPC networking to keep all communication between your Coreflux MQTT broker and MongoDB database private. The VPC isolates your resources from public internet access, and DigitalOcean managed databases support TLS encryption for connections. Additionally, you can create dedicated database users with limited permissions for your Coreflux application, following the principle of least privilege.
Yes, this architecture is production-ready. DigitalOcean managed databases provide automated backups, high availability, and monitoring out of the box. Coreflux MQTT broker scales horizontally and handles high message throughput. For production deployments, ensure you configure firewall rules, use strong authentication credentials, enable TLS for MQTT connections, and set up monitoring alerts. Start with appropriate resource sizing based on your expected message volume and scale up as needed.
Integrating Coreflux MQTT broker with DigitalOcean’s managed MongoDB service provides a powerful solution for real-time IoT data processing and scalable storage. Following this tutorial, you have set up a seamless automation pipeline that allows you to collect, process, and store IoT data efficiently using low code development practices.
With Coreflux’s scalable architecture and MongoDB’s robust document storage capabilities, you can handle large volumes of real-time data and gain valuable insights instantly. Whether you are monitoring industrial systems, tracking environmental sensors, or managing smart city infrastructure, this IoT automation integration empowers you to make data-driven decisions quickly and effectively.
The Language of Things (LoT) notebook approach combined with DigitalOcean’s managed services creates an IoT & DevOps-ready foundation that scales with your business needs. Your MQTT broker deployment is now ready for production workloads and can be extended to support PostgreSQL, MySQL, OpenSearch, and other database technologies as your requirements evolve.
Now that you’ve deployed your Coreflux MQTT broker with MongoDB, explore these resources to extend your IoT automation capabilities:
Explore Related Tutorials: Learn more about deploying OpenSearch with Coreflux in our deploying OpenSearch with Coreflux MQTT broker tutorial, or dive deeper into MongoDB with our MongoDB tutorial collection.
Try DigitalOcean Managed Databases: Simplify your database operations with DigitalOcean Managed Databases, which provide automated backups, scaling, and high availability for MongoDB, PostgreSQL, MySQL, and Redis.
Deploy Coreflux from Marketplace: Get started quickly with the Coreflux MQTT Broker available in the DigitalOcean Marketplace, or explore other Marketplace applications for additional IoT and automation tools.
Review Coreflux Documentation: Deepen your understanding of Language of Things and advanced Coreflux features in the Coreflux documentation.
Try the Sample Project: Experiment with the sample IoT automation project to see additional use cases and integration patterns.
Note: While this tutorial uses the Coreflux or Docker Marketplace image for simplified deployment, you can also install the Coreflux MQTT broker directly on Ubuntu. For manual installation instructions, visit the Coreflux Installation Guide.
Thanks for learning with the DigitalOcean Community. Check out our offerings for compute, storage, networking, and managed databases.