Protocol Buffers, also known as Protobuf, is a tool developed by Google that offers a platform-neutral approach for serialising structured data. It is similar to JSON but smaller, faster, and can automatically generate bindings in your preferred programming language. AWS IoT Core is a managed service that enables you to connect billions of IoT devices and route trillions of messages to AWS services, allowing for seamless scaling of your application to millions of devices. By integrating AWS IoT Core with Protobuf, you can take advantage of Protobuf’s efficient data serialisation protocol and automated code binding generation. This week’s article covers using Protobuf with AWS IoT Core and explores the benefits of doing so, such as smaller message size and better performance.
Protobuf’s code generator provides a significant benefit: simpler, less error-prone program creation. The communication between the various parts of your application is described in a schema. A code generator (such as protoc) reads the schema and produces the code you need to encode and decode data. Because Protobuf’s generated code is used extensively and well maintained, it is reliable. Generating code automatically spares programmers the tedious task of writing encoding and decoding functions by hand, and it ensures cross-language support.
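To make the workflow concrete, the sketch below assumes a schema file msg.proto defining a Telemetry message with device_id and temperature fields (the message and field names are illustrative assumptions, not a published schema) and that protoc has already generated the msg_pb2 module:

```python
# Sketch only. Assumes msg.proto contains something like:
#   message Telemetry { string device_id = 1; double temperature = 2; }
# and that `protoc --python_out=. msg.proto` has generated msg_pb2.py.
import msg_pb2

# Encoding: the generated class exposes the schema's fields as attributes.
outgoing = msg_pb2.Telemetry(device_id="sensor-042", temperature=23.5)
wire_bytes = outgoing.SerializeToString()  # compact binary representation

# Decoding: a consumer in any supported language rebuilds the same structure.
incoming = msg_pb2.Telemetry()
incoming.ParseFromString(wire_bytes)
print(incoming.device_id, incoming.temperature)
```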
With the recent launch of AWS IoT Core Rules Engine support for the Protocol Buffers message format, you can have a producer application written in C running on your device and an AWS Lambda function consumer written in Python, both using generated bindings. Other advantages of using Protocol Buffers instead of JSON with AWS IoT Core include:
- Schema and validation: Protobuf enforces the schema on both the sender and the receiver, keeping the two sides properly integrated. Because the auto-generated code handles encoding and decoding, hand-written parsing errors are avoided.
- Adaptability: The schema is mutable, allowing for changes in message content while maintaining backward and forward compatibility.
- Bandwidth optimisation: Protobuf messages are smaller than JSON messages because they carry compact numeric field tags instead of repeating field names as text, resulting in better device autonomy and lower bandwidth usage. Recent research on messaging protocols and serialisation formats has shown that Protobuf-formatted messages can be up to 10 times smaller than their equivalent JSON-formatted messages, so fewer bytes are needed to transmit the same content (see the size comparison sketched after this list).
- Efficient decoding: Decoding Protobuf messages is more efficient than decoding JSON, allowing recipient functions to run more quickly. A benchmark conducted by Auth0 showed that Protobuf can be up to 6 times more performant than JSON for equivalent message payloads.
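To see the size difference concretely, the short comparison below serialises the same reading as JSON and as Protobuf, reusing the hypothetical Telemetry message from the earlier sketch; the exact ratio depends on your schema and payload.

```python
import json

import msg_pb2  # generated bindings; Telemetry and its fields remain illustrative assumptions

reading = {"device_id": "sensor-042", "temperature": 23.5}
json_bytes = json.dumps(reading).encode("utf-8")

proto_msg = msg_pb2.Telemetry(device_id="sensor-042", temperature=23.5)
proto_bytes = proto_msg.SerializeToString()

# Protobuf encodes numeric field tags instead of repeating field names as text,
# so the binary payload is typically a fraction of the JSON size.
print(f"JSON: {len(json_bytes)} bytes, Protobuf: {len(proto_bytes)} bytes")
```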
How to Use Protocol Buffers with AWS IoT Core
Protobuf can be used with AWS IoT Core in various ways. The easiest method is to publish the message as a binary payload and let the recipient applications decode it. The AWS IoT Core Rules Engine already supports this, and it works for any binary payload, including Protobuf.
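As a minimal sketch of this approach, the snippet below publishes a Protobuf-encoded payload with boto3’s iot-data client; a real device would more typically use an MQTT client from the AWS IoT Device SDK, and the topic name and message fields here are assumptions for illustration.

```python
import boto3

import msg_pb2  # generated bindings; the Telemetry message and its fields are illustrative assumptions

# The payload is opaque binary to AWS IoT Core, so every subscriber on this topic
# must decode it with the same generated bindings (or via the Rules Engine decode function).
telemetry = msg_pb2.Telemetry(device_id="sensor-042", temperature=23.5)

iot_data = boto3.client("iot-data")  # resolves to the account's IoT data endpoint
iot_data.publish(
    topic="device/sensor-042/telemetry",  # example topic, not taken from the sample project
    qos=1,
    payload=telemetry.SerializeToString(),  # raw Protobuf bytes as the message body
)
```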
For a more advanced approach, you can ingest Protobuf-encoded data from IoT devices and decode it into JSON using the decode function of the AWS IoT Core Rules Engine. From there, you can route the data to a variety of AWS and third-party services. Here are the steps to follow:
- Download the sample code and install the Python requirements.
  - https://github.com/aws-samples/aws-iotcore-protobuf-sample
  - pip install -r requirements.txt
- Configure your AWS_REGION and IOT_ENDPOINT environment variables.
  - export AWS_REGION=<AWS_REGION>
  - export IOT_ENDPOINT=$(aws iot describe-endpoint --endpoint-type iot:Data-ATS --query endpointAddress --region $AWS_REGION --output text)
- Generate the Python bindings and the message file descriptor with protoc.
  - protoc --python_out=. msg.proto
  - protoc -o filedescriptor.desc msg.proto
  - These commands produce msg_pb2.py (the Python bindings) and filedescriptor.desc (the file descriptor used by the Rules Engine).
- Run the simulated device using Python and the generated Protobuf code bindings.
  - python3 simulate_device.py
- Create the AWS resources with AWS CloudFormation and then upload the Protobuf file descriptor.
- Take a look at the AWS IoT Rule that matches, filters, and republishes Protobuf messages as JSON (a sketch of such a rule follows these steps).
- Confirm that the transformed messages are being republished.
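To illustrate the kind of rule involved, the sketch below creates a topic rule with boto3 that decodes a Protobuf payload and republishes it as JSON. The bucket, descriptor key, proto file name, message type, topics, and role ARN are all placeholders, and the exact argument order of the Rules Engine decode() function should be checked against the AWS IoT SQL reference before use.

```python
import boto3

iot = boto3.client("iot")

# Placeholders throughout: bucket, descriptor key, proto name, message type, topics,
# and role ARN must match your own deployment. This sketch assumes the form
# decode(value, 'proto', bucketName, s3Key, protoName, messageType); verify it against
# the AWS IoT SQL reference.
rule_sql = (
    "SELECT VALUE decode(*, 'proto', 'my-descriptor-bucket', "
    "'msg/filedescriptor.desc', 'msg', 'Telemetry') "
    "FROM 'device/+/telemetry'"
)

iot.create_topic_rule(
    ruleName="protobuf_to_json",
    topicRulePayload={
        "sql": rule_sql,
        "awsIotSqlVersion": "2016-03-23",
        "actions": [
            {
                "republish": {
                    "topic": "device/decoded/telemetry",  # hypothetical republish topic
                    "qos": 0,
                    "roleArn": "arn:aws:iam::123456789012:role/iot-republish-role",
                }
            }
        ],
    },
)
```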
Once you complete these steps, you’ll be able to convert Protobuf messages from your device to JSON using AWS IoT Core Rules Engine and trigger a Lambda function. The Lambda function will perform an intelligent action based on the message content. For instance, if your device’s temperature exceeds a specific threshold, the Lambda function can send an SNS notification or write data to a DynamoDB table.
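As a rough sketch of such a consumer, the handler below assumes the Rules Engine has already decoded the Protobuf message to JSON and invokes the function with that JSON document as the event; the field names, threshold, and SNS topic ARN are illustrative assumptions.

```python
import json
import os

import boto3

sns = boto3.client("sns")

# Illustrative assumptions: the IoT rule passes the decoded JSON document as the event,
# the message carries device_id and temperature fields, and ALERT_TOPIC_ARN is configured
# as an environment variable on the function.
TEMPERATURE_THRESHOLD = 25.0
ALERT_TOPIC_ARN = os.environ.get("ALERT_TOPIC_ARN", "")


def lambda_handler(event, context):
    device_id = event.get("device_id", "unknown")
    temperature = float(event.get("temperature", 0.0))

    if temperature > TEMPERATURE_THRESHOLD and ALERT_TOPIC_ARN:
        # Notify operators; writing to DynamoDB would follow the same pattern with boto3.
        sns.publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject=f"High temperature on {device_id}",
            Message=json.dumps(event),
        )
    return {"device_id": device_id, "temperature": temperature}
```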
How Does Protobuf Improve Scalability in Large IoT Deployments?
Scalability is a key consideration when expanding IoT solutions across thousands or millions of devices. Protocol Buffers play a significant role in achieving this by optimising message efficiency and reducing operational overhead. Here’s how Protobuf enhances scalability:
- Compact Message Format: Protobuf encodes data in a highly compact binary format, minimising the size of each transmitted message and reducing overall network traffic.
- Resource Efficiency: Devices require less processing power to encode and decode Protobuf messages, making it possible for even low-powered sensors to participate in large-scale networks.
- Lower Bandwidth Consumption: Smaller messages mean less data sent over the network, which is particularly valuable for remote sites or areas with limited connectivity.
- High Throughput: Combined with Protobuf, AWS IoT Core can handle a significantly higher volume of device messages on the same bandwidth than an equivalent JSON-based setup, supporting rapid expansion.
- Cost Savings: Efficient message transmission leads to lower cloud infrastructure costs, as less data is stored and processed.
With these benefits, UK organisations can confidently scale their IoT deployments without compromising on speed, reliability, or cost-effectiveness.
Are There Any Industry Use Cases Where Protobuf with AWS IoT Core Excels?
Protocol Buffers, used with AWS IoT Core, are proving invaluable across several key industries in the UK:
- Smart Manufacturing: Factories deploy hundreds of sensors for real-time monitoring. Protobuf’s efficiency ensures fast, accurate transmission of temperature, humidity, and equipment status data, facilitating predictive maintenance and minimising downtime.
- Energy Sector: Smart meters installed in homes and businesses send frequent usage reports. With Protobuf, energy providers can process this data swiftly, supporting dynamic pricing models and optimised energy distribution.
- Logistics and Supply Chain: Tracking devices on shipments and vehicles utilise Protobuf for compact, reliable status updates, enabling real-time route adjustments and improved delivery accuracy.
- Healthcare: Medical devices can securely transmit sensitive patient data using Protobuf, helping maintain regulatory compliance while supporting telehealth and remote patient monitoring solutions.
These examples highlight how the integration drives efficiency and reliability in sectors that demand robust, scalable IoT infrastructure.
What Are the Best Practices for Managing Protobuf Schemas in an IoT Environment?
Managing Protobuf schemas effectively is vital for maintaining reliability in evolving IoT environments. Consider the following best practices:
- Version Control: Use systems like Git to track every change to your .proto files, ensuring all stakeholders can see what has changed and why.
- Backward Compatibility: Whenever possible, update schemas in ways that support both old and new devices, avoiding disruption in communication.
- Testing in Staging: Always test new schema versions in a controlled environment before rolling out changes to production.
- Central Schema Registry: Maintain a dedicated repository or cloud service to store and document all schema versions, preventing fragmentation across teams or projects.
- Comprehensive Documentation: Clearly document every schema change, including intended use cases and compatibility notes.
- Automated Testing: Implement continuous integration workflows to catch issues early, ensuring schemas work as expected across all devices (a small sketch of such a check follows this list).
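As one way to automate this, the sketch below compiles every .proto file in a repository so that schema errors fail the build; it assumes protoc is on the PATH and that schemas live under a proto/ directory, both of which are assumptions about your project layout.

```python
"""Minimal CI check: compile every .proto file so schema errors fail the build."""
import pathlib
import subprocess
import sys

PROTO_DIR = pathlib.Path("proto")  # assumed location of the team's .proto files

failures = []
proto_files = sorted(PROTO_DIR.glob("*.proto"))
for proto_file in proto_files:
    # Producing a descriptor set is enough to validate the schema; the output is discarded.
    result = subprocess.run(
        ["protoc", f"--proto_path={PROTO_DIR}", "-o", "/dev/null", str(proto_file)],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        failures.append(f"{proto_file}: {result.stderr.strip()}")

if failures:
    print("\n".join(failures))
    sys.exit(1)
print(f"{len(proto_files)} schema(s) compiled cleanly")
```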
Following these practices helps UK organisations maintain a resilient, scalable, and future-proof IoT infrastructure.
Can Protobuf Be Used With Other AWS Services Beyond IoT Core?
Certainly, the flexibility of Protocol Buffers allows UK businesses to benefit across the AWS ecosystem—not just with IoT Core. Here’s how Protobuf fits in with other AWS services:
- AWS Lambda: After decoding IoT messages, Protobuf data can be processed in Lambda for real-time analytics, automation, or event-driven workflows.
- Kinesis Data Streams: Use Protobuf for ingesting high-volume data streams, enabling efficient processing and real-time insights (a short sketch follows this list).
- Amazon S3: Store historical Protobuf-encoded data in S3, balancing efficiency with the ability to reprocess or audit messages as needed.
- AWS Glue and Analytics: Protobuf-formatted data can be integrated into data lakes, making it accessible for big data analytics, machine learning, and reporting tools.
- Event-Driven Architectures: Protobuf’s language-neutral format supports seamless integration with microservices or event-driven pipelines across AWS.
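As one example of this broader use, the sketch below writes a Protobuf-encoded record to a Kinesis data stream with boto3; the stream name and the Telemetry message reused from the earlier sketches are assumptions for illustration.

```python
import boto3

import msg_pb2  # same generated bindings as earlier; message and field names remain illustrative

telemetry = msg_pb2.Telemetry(device_id="sensor-042", temperature=23.5)

kinesis = boto3.client("kinesis")
kinesis.put_record(
    StreamName="iot-telemetry",          # hypothetical stream name
    Data=telemetry.SerializeToString(),  # compact binary record; consumers decode with the same schema
    PartitionKey=telemetry.device_id,    # keeps each device's readings ordered within a shard
)
```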
By extending Protobuf usage beyond IoT Core, UK organisations can create unified, high-performance data architectures that remain efficient at every stage.
Final Words
This article has explored the advantages of using Protocol Buffers in conjunction with AWS IoT Core: smaller message sizes, better decoding performance, schema enforcement and validation, adaptability, and optimised bandwidth. We have also covered the process of using Protocol Buffers with AWS IoT Core, including the steps required to convert Protobuf messages from your device to JSON with the AWS IoT Core Rules Engine and how to trigger a Lambda function.