From Custom Code to Clicks: Simplifying Data Handling with a No-Code Tool

Data Mining Extraction Oil

Apr 1, 2025

When working with IoT devices—especially in LPWAN setups like LoRaWAN—data often comes in bursts, wrapped in nested payloads, and must be parsed, processed, stored, and potentially republished for downstream services. The traditional approach involves writing custom code to connect to an MQTT broker, extract data, transform it, store it in a database, and publish it elsewhere.

But what if there was a simpler way?

In this post, we’ll walk through the custom code approach to handling this pipeline. Then we'll show how the same process can be done in a few clicks using ThingDash, a data extraction platform that can handle any JSON payload; in this case, one arriving over MQTT.

🔧 The Custom Code Approach

Let’s assume we have a LoRaWAN payload like this coming in on the topic lorawan/devices/device123/uplink:

{
  "end_device_ids": {
    "device_id": "device123"
  },
  "uplink_message": {
    "decoded_payload": {
      "temperature": 21.5,
      "humidity": 58,
      "battery": 3.7
    },
    "rx_metadata": [
      {
        "gateway_ids": {
          "gateway_id": "gateway001"
        },
        "rssi": -78
      }
    ]
  }
}

We want to:

  • Connect to the broker

  • Subscribe to this topic

  • Extract temperature, humidity, and gateway_id

  • Merge into a new payload

  • Save to a database

  • Re-publish the new payload to another broker

Here’s how we’d do it in Python with paho-mqtt, jsonpath-ng, and sqlite3:
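A minimal sketch follows. The broker hostnames, topic names, database file, and the `run_pipeline()` entry point are illustrative placeholders, and for brevity the JSONPath lookups are written as plain dict access (with jsonpath-ng they would be path expressions such as `$.uplink_message.decoded_payload.temperature`):

```python
import json
import sqlite3

SOURCE_TOPIC = "lorawan/devices/+/uplink"   # wildcard over all devices
DEST_TOPIC = "processed/uplinks"            # placeholder republish topic


def extract(payload: dict) -> dict:
    """Pull temperature, humidity, and gateway_id out of the nested
    uplink and merge them into one flat payload."""
    uplink = payload["uplink_message"]
    decoded = uplink["decoded_payload"]
    gateway = uplink["rx_metadata"][0]["gateway_ids"]
    return {
        "temperature": decoded["temperature"],
        "humidity": decoded["humidity"],
        "gateway_id": gateway["gateway_id"],
    }


def save(conn: sqlite3.Connection, data: dict) -> None:
    """Persist one merged reading to SQLite."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings "
        "(temperature REAL, humidity REAL, gateway_id TEXT)"
    )
    conn.execute(
        "INSERT INTO readings VALUES (:temperature, :humidity, :gateway_id)",
        data,
    )
    conn.commit()


def run_pipeline(source_host: str, dest_host: str) -> None:
    """Subscribe, extract, store, and republish. Needs reachable brokers
    and `pip install paho-mqtt`; call run_pipeline(...) to start."""
    import paho.mqtt.client as mqtt  # third-party dependency

    conn = sqlite3.connect("uplinks.db")
    publisher = mqtt.Client()
    publisher.connect(dest_host)

    def on_message(client, userdata, msg):
        merged = extract(json.loads(msg.payload))
        save(conn, merged)
        publisher.publish(DEST_TOPIC, json.dumps(merged))

    subscriber = mqtt.Client()
    subscriber.on_message = on_message
    subscriber.connect(source_host)
    subscriber.subscribe(SOURCE_TOPIC)
    subscriber.loop_forever()
```

Even this stripped-down version still needs reconnect logic, schema management, and real error handling before it is production-ready, and that is exactly the boilerplate the next section removes.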


🧩 Using ThingDash – Build Data Extraction Flows Visually

With ThingDash, you can replace all of the above with:

  1. Connect to your MQTT broker via Broker Connections on the platform.

  2. Subscribe to the topic lorawan/devices/+/uplink using a Subscription Node.

  3. Use a Payload Extractor Node to pull out temperature, humidity, and gateway_id.

  4. Merge the extracted keys together using a Merge Node.

  5. Check the "store payload" box to save data.

  6. Export to CSV with the extracted keys as headers.

  7. Publish to another topic with a Publisher Node.
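For step 3, assuming the Payload Extractor's built-in helper accepts standard JSONPath, the three expressions against our sample payload would look like:

```
$.uplink_message.decoded_payload.temperature
$.uplink_message.decoded_payload.humidity
$.uplink_message.rx_metadata[0].gateway_ids.gateway_id
```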

Here's an example of what a Data Extraction flow could look like.

No Python, no dependencies, no boilerplate error handling, and no code at all: the whole flow took about 15 minutes. Plus, the MQTT broker is hosted for you. Here's how it compares with custom code.

⚖️ Custom Code vs ThingDash: A Quick Comparison

| Feature | Custom Code | ThingDash |
| --- | --- | --- |
| MQTT Broker Connectivity | Manual setup with Paho | One-click connection |
| Topic Subscription | Wildcard matching via code | Enter topic in input field |
| JSON Extraction | Requires jsonpath-ng | Clicks with built-in JSONPath helper |
| Payload Construction | Hard-coded merge | Select extracted payloads to merge in the UI |
| Database Integration | Write queries manually | Built-in DB, or connect a preferred external DB visually |
| MQTT Republish | Manual broker/client handling | Configured via UI |
| Dev Time | Hours | Minutes |

🚀 Conclusion

Custom code offers flexibility and control—but at the cost of development time, maintenance, and debugging. For teams that want to move fast, ThingDash abstracts the MQTT boilerplate and lets you focus on what you want to do with your data, not how you have to code it.

Whether you're deploying to the edge, monitoring LoRa devices, or managing smart sensors at scale, ThingDash simplifies your data workflows without sacrificing quality. Reach out to us and we'll help you get started!

Get Started with ThingDash Today.

Transform, filter and save your MQTT payloads easily.