
IoT

IoT, or the Internet of Things, is a technological field that makes it possible for users to connect devices and systems and exchange data over the internet. Through DZone's IoT resources, you'll learn about smart devices, sensors, networks, edge computing, and many other technologies — including those that are now part of the average person's daily life.

Latest Refcards and Trend Reports

Trend Report: Edge Computing and IoT
Refcard #214: MQTT Essentials
Refcard #263: Messaging and Data Infrastructure for IoT

DZone's Featured IoT Resources

Unlocking the Potential of IoT Applications With Real-Time Alerting Using Apache Kafka Data Streams and KSQL

By Kiran Peddireddy
IoT devices have revolutionized the way businesses collect and utilize data. They generate enormous volumes of data that can provide valuable insights for informed decision-making. However, processing this data in real time can be a significant challenge, particularly when managing large data volumes from numerous sources. This is where Apache Kafka and Kafka data streams come into play.

Apache Kafka is a distributed streaming platform that can handle large amounts of data in real time. It is a messaging system commonly used for sending and receiving data between systems and applications, and it can also serve as a data store for real-time processing. Kafka data streams provide a powerful tool for processing and analyzing data in real time, enabling real-time analytics and decision-making.

One of the most important applications of Kafka data streams is real-time monitoring. IoT devices can monitor various parameters, such as temperature, humidity, and pressure. By using Kafka data streams, this data can be processed and analyzed in real time, allowing for early detection of issues and immediate response. This can be particularly beneficial in manufacturing, where IoT devices can monitor machine performance and alert maintenance personnel to potential problems.

Another application of Kafka data streams is predictive maintenance. By analyzing IoT data in real time, it is possible to predict when maintenance will be required on devices. This helps prevent downtime and reduce maintenance costs. For instance, sensors in vehicles can monitor engine performance and alert the driver to potential problems before they cause a breakdown.

Energy management is another area where IoT devices can be leveraged with Kafka data streams. IoT devices can monitor energy consumption in real time, and the resulting data can be analyzed to identify energy-saving opportunities and optimize usage. For example, smart buildings can use sensors to monitor occupancy and adjust heating and cooling systems accordingly.

Smart cities are another application of Kafka data streams for IoT devices. IoT devices can monitor and control various aspects of city life, such as traffic flow, air quality, and waste management. With Kafka data streams, this data can be processed and analyzed in real time, allowing for quick response to changing conditions and improved quality of life for residents. For example, sensors in smart traffic lights can adjust the timing of the lights to reduce congestion and improve traffic flow.

One of the advantages of using Kafka data streams for IoT devices is that it enables real-time analytics and decision-making. This matters because it allows businesses to respond quickly to changing conditions and make informed decisions based on current data. The real-time nature of Kafka data streams means businesses can monitor and analyze data as it is generated rather than waiting for batch processing to occur. This makes them more agile and responsive.
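Before looking at the ingestion pipeline below, it helps to see how a sensor reading typically lands on a Kafka topic in the first place. The following is a minimal Java producer sketch; the broker address, topic name, and JSON payload are illustrative assumptions (the "iot-data" topic simply matches the one consumed by the alerting example later in this article), not part of the original pipeline.

Java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class SensorReadingProducer {
    public static void main(String[] args) {
        // Placeholder broker address; point this at your own cluster.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by device ID keeps all readings from one sensor in one
            // partition, preserving their order for downstream processing.
            String deviceId = "sensor-42";
            String reading = "{\"device_id\":\"sensor-42\",\"pressure\":101}";
            producer.send(new ProducerRecord<>("iot-data", deviceId, reading));
        }
    }
}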
We are using Apache Camel to consume IoT data and write it to a Kafka topic:

Java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.kafka.KafkaConstants;
import org.apache.camel.model.dataformat.JsonLibrary;

public class RestApiToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Log messages arriving on the Kafka topic as a sanity check.
        // The {{...}} placeholders resolve from application properties.
        from("kafka:{{kafka.topic}}?brokers={{kafka.bootstrap.servers}}")
            .routeId("kafka")
            .to("log:received-message");

        // Poll the REST API on a timer and fan the readings out to Kafka.
        from("timer://rest-api-timer?period={{rest.api.timer.period}}")
            .routeId("rest-api")
            .to("rest:get:{{rest.api.url}}")
            .unmarshal().json(JsonLibrary.Jackson, DeviceData.class)
            .split(body())
            .process(exchange -> {
                // Extract the device ID and route each reading to a
                // per-device topic; the TOPIC header overrides the URI topic.
                DeviceData deviceData = exchange.getIn().getBody(DeviceData.class);
                exchange.getMessage().setHeader(KafkaConstants.TOPIC, deviceData.getDeviceId());
            })
            .marshal().json(JsonLibrary.Jackson)
            .to("kafka:{{kafka.topic}}?brokers={{kafka.bootstrap.servers}}");
    }
}

KSQL is a streaming SQL engine for Apache Kafka. It enables real-time data processing and analysis by providing a simple SQL-like language for working with Kafka data streams. KSQL makes it easy to create real-time dashboards and alerts that can be used for monitoring and decision-making.

Real-time dashboards are an important tool for monitoring IoT devices with Kafka data streams. Dashboards can display key performance indicators (KPIs) in real time, allowing businesses to monitor the health and performance of their IoT devices. They can also visualize data trends and patterns, making it easier to identify opportunities for optimization and improvement.

Alerts are another important tool for monitoring IoT devices with Kafka data streams. Alerts can notify businesses when certain conditions are met, such as when a device exceeds a threshold or when a potential issue is detected. They can be sent via email, SMS, or other means, allowing businesses to respond quickly to potential issues.

Here is a sample set of KSQL statements for an IoT data alert dashboard (note that a row-level filter over a stream produces a stream, not a table):

SQL
-- Filter readings above the threshold into an alert topic.
CREATE STREAM pressure_alerts AS
  SELECT device_id, pressure
  FROM iot_data_stream
  WHERE pressure > 100;

-- Declare a stream over the alert topic, including an alert type field.
CREATE STREAM pressure_alerts_stream (device_id VARCHAR, pressure INT, alert_type VARCHAR)
  WITH (kafka_topic='pressure_alerts', value_format='JSON');

-- Count alerts per type in one-minute tumbling windows.
CREATE TABLE pressure_alert_count AS
  SELECT alert_type, COUNT(*)
  FROM pressure_alerts_stream
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY alert_type;

SELECT * FROM pressure_alert_count;

KSQL also provides a real-time dashboard for monitoring and visualizing data in Kafka data streams. The dashboard can display real-time data streams and visualizations and can be used to track performance metrics and detect anomalies as they happen, enabling users to gain real-time insights and make informed decisions based on the data. Below is a sample program that consumes data from a Kafka topic and issues an alert when a predetermined threshold is exceeded, as in the example above where the pressure level passes 100.
Java
import com.twilio.Twilio;
import com.twilio.rest.api.v2010.account.Message;
import com.twilio.type.PhoneNumber;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class AlertTrigger {
    // Twilio account SID and auth token
    public static final String ACCOUNT_SID = "your_account_sid_here";
    public static final String AUTH_TOKEN = "your_auth_token_here";

    // Twilio sender number and the number that receives alerts
    public static final String TWILIO_PHONE_NUMBER = "+1234567890";
    public static final String ALERT_PHONE_NUMBER = "+1098765432";

    public static void main(String[] args) {
        // Kafka Streams configuration
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "alert-trigger");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        // Set up the Twilio client
        Twilio.init(ACCOUNT_SID, AUTH_TOKEN);

        // Build the topology: read readings, keep those over the threshold,
        // turn them into alert messages, send an SMS, and write to an alert topic.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("iot-data");

        input.filter((key, value) -> {
                // Minimal threshold check for the sample payload; in practice,
                // parse the reading with a JSON library or push the filter into KSQL.
                return value.contains("\"pressure\":") && extractPressure(value) > 100;
            })
            .mapValues(value -> "Pressure has exceeded threshold value of 100!")
            .peek((key, value) -> {
                // Send the alert notification via SMS
                Message.creator(new PhoneNumber(ALERT_PHONE_NUMBER),
                        new PhoneNumber(TWILIO_PHONE_NUMBER), value).create();
            })
            .to("alert-topic", Produced.with(Serdes.String(), Serdes.String()));

        // Start processing and shut down gracefully on exit
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    private static int extractPressure(String json) {
        // Crude extraction for the sample payload only.
        return Integer.parseInt(json.replaceAll(".*\"pressure\":(\\d+).*", "$1"));
    }
}

Overall, Apache Kafka and Kafka data streams, combined with Kafka Connect and KSQL, offer a powerful toolset for processing and analyzing real-time data from IoT devices. By integrating IoT devices with Kafka data streams, organizations can gain real-time insights and improve decision-making, leading to significant improvements in efficiency, cost savings, and quality of life. The KSQL dashboard provides a powerful way to visualize and monitor the data in real time, allowing users to quickly identify trends, anomalies, and potential issues. With the continued growth of IoT devices and the increasing demand for real-time analytics, Kafka data streams and KSQL are likely to become even more important in the years to come.
Shaping the Future of IoT: 7 MQTT Technology Trends in 2023

By Zaiming (stone) Shi
Message Queuing Telemetry Transport (MQTT) is the standard messaging protocol for the Internet of Things (IoT). MQTT follows an extremely lightweight publish-subscribe messaging model, connecting IoT devices in a scalable, reliable, and efficient manner. It has been over 20 years since MQTT was invented in 1999 by IBM, and 10 years since the popular open-source MQTT broker, EMQX, launched on GitHub in 2012.

As we move into 2023 and look to the years ahead, we can anticipate 7 developing trends in MQTT technology, as the use of MQTT in IoT grows tremendously and diversely, driven by the progress of emerging technologies.

MQTT Over QUIC

Quick UDP Internet Connections (QUIC) is a new transport protocol developed by Google that runs over UDP and is designed to reduce the latency of establishing new connections, increase data transfer rates, and address the limitations of TCP. HTTP/3, the latest version of the HTTP protocol, uses QUIC as its transport layer; thanks to QUIC, HTTP/3 has lower latency and a better loading experience on web applications than HTTP/2.

MQTT over QUIC is the most innovative advancement in the MQTT protocol since the first release of the MQTT 5.0 specification in 2017. With multiplexing and faster connection establishment and migration, it has the potential to become the next generation of the MQTT standard.

The MQTT 5.0 protocol specification defines three types of transport: TCP, TLS, and WebSocket. MQTT over TLS/SSL is widely used in production to secure communications between MQTT clients and brokers, as security is a top priority for IoT applications. However, it is slow and has high latency, requiring seven handshakes in total (three for TCP and four for TLS) to establish a new MQTT connection. MQTT over QUIC, with 1-RTT connection establishment and 0-RTT reconnection, is faster and has lower latency than MQTT over TLS.

The QUIC stack can be customized for various use cases, such as keeping connections alive in poor networking conditions or serving scenarios that need low client-to-server latency. It will benefit connected cars on unreliable cellular networks and low-latency industrial IoT applications. The adoption of MQTT over QUIC is expected to play a vital role in the future of IoT, Industrial IoT (IIoT), and the Internet of Vehicles (IoV). EMQX has introduced MQTT over QUIC support in its latest version, 5.0. And, like HTTP/3, the next version of the MQTT protocol, 5.1 or 6.0, will use QUIC as its primary transport layer in the near future.

MQTT Serverless

The serverless trend in cloud computing marks a groundbreaking paradigm shift in how applications are designed, developed, deployed, and run. This paradigm enables developers to focus on their application's business logic instead of managing infrastructure, resulting in enhanced agility, scalability, and cost-effectiveness.

The serverless MQTT broker emerges as a cutting-edge architectural innovation for 2023. In contrast to traditional IoT architectures, which require minutes to hours to create MQTT-hosted services on the cloud or deploy them on-premises, serverless MQTT enables rapid deployment of MQTT services with just a few clicks. Moreover, the true value proposition of serverless MQTT lies not in its deployment speed but in its unparalleled flexibility. This flexibility manifests in two key aspects: the seamless scaling of resources in response to user demands and the pay-as-you-go pricing model that aligns with this elastic architecture.
As a result, serverless MQTT is poised to drive broader adoption of MQTT, reducing operational costs and spurring innovation and collaboration across diverse industries. We might even see a free serverless MQTT broker for every IoT and Industrial IoT developer. In March 2023, EMQX Cloud launched the world's first serverless MQTT service, offering users an incredibly fast deployment time of just 5 seconds along with the exceptional flexibility that sets serverless MQTT apart.

MQTT Multi-Tenancy

Multi-tenancy architecture is a vital aspect of a serverless MQTT broker. IoT devices from different users or tenants can connect to the same large-scale MQTT cluster while keeping their data and business logic isolated from other tenants.

SaaS applications commonly use multi-tenancy architecture, where a single application serves multiple customers or tenants. There are usually two ways to implement multi-tenancy in SaaS:

- Tenant isolation: A separate application instance is provided to each tenant, running on a server or virtual machine.
- Database isolation: Multiple tenants share a single application instance, but each tenant has its own database schema to ensure data isolation.

In the multi-tenancy architecture of an MQTT broker, each device and tenant is given a separate, isolated namespace. This namespace includes a unique topic prefix and access control lists (ACLs) that define which topics each user can access, publish to, or subscribe to (a minimal client-side sketch appears at the end of this article).

An MQTT broker with multi-tenancy support reduces management overhead and allows greater flexibility for complex scenarios or large-scale IoT applications. For example, departments and applications in a large organization could use the same MQTT cluster as different tenants.

MQTT Sparkplug 3.0

MQTT Sparkplug 3.0 is the latest version of MQTT Sparkplug, the open standard specification designed by the Eclipse Foundation. It defines how to connect industrial devices, including sensors, actuators, programmable logic controllers (PLCs), and gateways, using the MQTT messaging protocol. MQTT Sparkplug 3.0 was released in November 2022 with some key new features and improvements:

- MQTT 5 support: MQTT Sparkplug 3.0 supports the MQTT 5 protocol, which includes several new features such as shared subscriptions, message expiry, and flow control.
- Optimized data transmission: MQTT Sparkplug 3.0 includes several optimizations for data transmission, including more compact data encoding and compression algorithms.
- Expanded data model: MQTT Sparkplug 3.0 introduces an expanded data model, which allows more detailed device information to be communicated, as well as additional information such as configuration data and device metadata.
- Improved security: MQTT Sparkplug 3.0 includes several improvements to security, including support for mutual TLS authentication and improved access control mechanisms.
- Simplified device management: MQTT Sparkplug 3.0 includes several improvements to device management, including automatic device registration and discovery, simplified device configuration, and improved diagnostics.

MQTT Sparkplug aims to simplify connecting and communicating with disparate industrial devices and to achieve efficient industrial data acquisition, processing, and analysis. With the new version released, MQTT Sparkplug 3.0 has the potential to be more widely adopted in the Industrial IoT.

MQTT Unified Namespace

Unified Namespace is a solution architecture built on the MQTT broker for Industrial IoT and Industry 4.0.
It provides a unified namespace for MQTT topics and a centralized repository for messages and structured data. Unified Namespace connects industrial devices, sensors, and applications, such as SCADA, MES, and ERP, in a star topology around a central MQTT broker. Unified Namespace dramatically simplifies the development of industrial IoT applications with an event-driven architecture.

In traditional IIoT systems, OT and IT systems have generally been separate, operating independently with their own data, protocols, and tools. By adopting Unified Namespace, OT and IT systems can exchange data more efficiently and finally unify in the IoT era. In 2023, with the EMQX or NanoMQ MQTT broker combined with Neuron Gateway, the latest open-source IIoT connectivity server, building a UNS architecture on the most advanced technology from the IT world is within grasp.

MQTT Geo-Distribution

MQTT Geo-Distribution is an innovative architecture that allows MQTT brokers deployed in different regions or clouds to work together as a single cluster. Using Geo-Distribution, MQTT messages can be automatically synchronized and delivered across MQTT brokers in different regions. In 2023, we can expect two approaches to implementing MQTT Geo-Distribution:

- Single cluster, multi-region: A single MQTT cluster with brokers running in different regions.
- Multi-cluster, multi-cloud: Multiple MQTT clusters connected with Cluster Linking in different clouds.

We can combine the two approaches to create a reliable IoT messaging infrastructure across geographically distributed MQTT brokers. By adopting MQTT Geo-Distribution, organizations can build a global MQTT access network across multiple clouds, where devices and applications connected locally from the closest network endpoint can communicate with each other regardless of their physical location.

MQTT Streams

MQTT Streams is an expected extension of the MQTT protocol that enables the handling of high-volume, high-frequency data streams in real time within an MQTT broker. This feature enhances the capabilities of traditional MQTT brokers, which were originally designed for lightweight publish/subscribe messaging. With MQTT Streams, clients can produce and consume MQTT messages as streams, similar to how Apache Kafka works. This allows historical message replay, which is essential for event-driven processing, eventual data consistency, auditing, and compliance.

Stream processing is crucial for extracting real-time business value from the massive amounts of data generated by IoT device sensors. Previously, this required an outdated, complex big data stack involving the integration of an MQTT broker with Kafka, Hadoop, Flink, or Spark. With built-in stream processing, MQTT Streams streamlines the IoT data processing stack, improves data processing efficiency and response time, and provides a unified messaging and streaming platform for IoT. By supporting features such as message deduplication, message replay, and message expiration, MQTT Streams enables high throughput, low latency, and fault tolerance, making it a powerful tool for handling real-time data streams in MQTT-based IoT applications.

Conclusion

Overall, these 7 trends in MQTT technology reflect the progress of emerging technologies and their role in advancing the IoT. As a standard messaging protocol that has evolved for over two decades, MQTT's importance continues to grow.
With the increasing use of IoT across industries, the MQTT protocol is evolving to meet new challenges and demands: faster, lower-latency connections; more rapid deployment of MQTT services; greater flexibility for complex scenarios or large-scale IoT applications; and broader support for connecting various industrial devices. With these developments, MQTT will become the nervous system of IoT and an even more crucial player in IIoT and the Internet of Vehicles (IoV) in 2023 and beyond.
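To make the multi-tenancy idea above concrete, here is a minimal, hypothetical Java sketch using the Eclipse Paho MQTT client. The broker URL, tenant identifier, credentials, and topic layout are illustrative assumptions; real tenant isolation is enforced by broker-side ACLs, not by the client.

Java
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

public class TenantPublisher {
    public static void main(String[] args) throws MqttException {
        String tenantId = "tenant-a";                    // hypothetical tenant identifier
        String broker = "ssl://broker.example.com:8883"; // placeholder broker URL

        MqttClient client = new MqttClient(broker, tenantId + "-sensor-01", new MemoryPersistence());
        MqttConnectOptions options = new MqttConnectOptions();
        options.setUserName(tenantId + "/sensor-01");    // tenant-scoped credentials (assumption)
        options.setPassword("secret".toCharArray());
        client.connect(options);

        // Every topic carries the tenant prefix; the broker's ACLs would
        // confine this client to the "tenant-a/#" subtree.
        String topic = tenantId + "/factory1/line2/pressure";
        client.publish(topic, new MqttMessage("101.3".getBytes()));
        client.disconnect();
    }
}

Because every topic carries the tenant prefix, a broker-side ACL limiting tenant-a clients to the tenant-a/# subtree keeps tenants from reading or writing each other's data.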
Handling Bad Messages via DLQ by Configuring JDBC Kafka Sink Connector
By Gautam Goswami

Securing MQTT With Username and Password Authentication
By Kary Ware

Real-Time Analytics for IoT
By David G. Simmons
Get Up to Speed With the Latest Cybersecurity Standard for Consumer IoT

With growing concern regarding data privacy and safety today, Internet of Things (IoT) manufacturers have to up their game if they want to maintain consumer trust. This is the shared goal of the latest cybersecurity standard from the European Telecommunications Standards Institute (ETSI). Known as ETSI EN 303 645, the standard for consumer devices seeks to ensure data safety and achieve widespread manufacturer compliance. So, let's dive deeper into this standard as more devices enter the home and workplace.

The ETSI Standard and Its Protections

It carries a long name but heralds an important era of device protection. ETSI EN 303 645 is a standard and method by which a certifying authority can evaluate IoT device security. Developed as an internationally applicable standard, it offers manufacturers a security baseline rather than a comprehensive set of precise guidelines. The standard may also lay the groundwork for various future IoT cybersecurity certifications in different regions around the world.

For example, look at what's happening in the European Union. Last September, the European Commission introduced a proposed Cyber Resilience Act, intended to protect consumers and businesses from products with inadequate security features. If passed, the legislation, a world first on connected devices, will bring mandatory cybersecurity requirements for products with digital elements throughout their whole lifecycle. The prohibition of default and weak passwords, guaranteed support of software updates, and mandatory testing for security vulnerabilities are just some of the proposals. Interestingly, these same rules are included in the ETSI standard.

IoT Needs a Cybersecurity Standard

Shockingly, a single home filled with smart devices could experience as many as 12,000 cyberattacks in a single week. While most of those attacks will fail, the sheer number means some inevitably get through. The ETSI standard strives to keep those attacks out with basic security measures, many of which should already be common sense but unfortunately are not always in place today.

For example, one of the basic requirements of the ETSI standard is no universal default passwords. In other words, your fitness tracker shouldn't have the same default password as every other fitness tracker of that brand on the market, and your smart security camera shouldn't have a default password that anyone who owns a similar camera could exploit. It seems like that would be common sense for IoT manufacturers, but plenty of breaches have occurred simply because individuals didn't know to change the default passwords on their devices.

Another basic requirement of ETSI is allowing individuals to delete their own data. In other words, the user has control over the data a company stores about them. Again, this is pretty standard stuff in the privacy world, particularly in light of regulations like Europe's General Data Protection Regulation (GDPR) and California's Consumer Privacy Act (CCPA). However, this is not yet a universal requirement for IoT devices. Considering how much health- and fitness-related data many of these devices collect, consumer data privacy needs to be more of a priority.

Several more rules in ETSI concern the software installed on such devices and how the provider manages its security. For example, there needs to be a system for reporting vulnerabilities, and the provider needs to keep the software up to date and ensure software integrity.
We would naturally expect these kinds of security measures for nearly any software we use, so the standard is essentially a minimum for data protection in IoT. Importantly, the ETSI standard covers pretty much everything that could be considered a smart device, including wearables, smart TVs and cameras, smart home assistants, smart appliances, and more. The standard also applies to connected gateways, hubs, and base stations; in other words, it covers the centralized access point for all of the various devices.

Why Device Creators Should Implement the Standard Today

Just how important is the security standard? Many companies are losing customers today due to a lack of consumer trust. There are plenty of stories of big companies like Google and Amazon failing to adequately protect user data, and IoT in particular has been in the crosshairs multiple times due to privacy concerns. An IoT manufacturer that doesn't want to lose business, face fines and lawsuits, and damage its reputation should consider implementing the ETSI standard as a matter of course.

After all, these days a given home might have as many as 16 connected devices, each an entry point into the home network. A company might have one laptop per employee but two, three, or more other smart devices per employee. And again, each smart device is a point of entry for malicious hackers. Without a comprehensive cybersecurity standard like ETSI EN 303 645, people who own unprotected IoT devices need to worry about identity theft, ransomware attacks, data loss, and much more.

How to Test and Certify Based on ETSI

Certification is fairly basic and occurs in five steps:

1. Manufacturers have to understand the 33 requirements and 35 recommendations of the ETSI standard and design devices accordingly.
2. Manufacturers also have to buy an IoT platform built with the ETSI standard in mind, since the standard will fundamentally influence the way devices are produced and how they operate within the platform.
3. Next, any IoT manufacturer trying to meet the ETSI standard has to fill out documents that provide information for device evaluation. The first document is the Implementation Conformance Statement, which shows which requirements and recommendations the IoT device does or doesn't meet. The second is the Implementation eXtra Information for Testing, which provides design details for testing.
4. A testing provider will then evaluate and test the product based on the two documents and give a report.
5. The testing provider will issue a seal or other indication that the product is ETSI EN 303 645-compliant.

With new regulations on the horizon, device manufacturers and developers should see it as best practice to get up to speed with this standard. Better cybersecurity is important not only for consumer protection but also for brand reputation. Moreover, this standard can provide a basis for stricter device security certifications and measures in the future. Prepare today for tomorrow.

By Carsten Rhod Gregersen
Your Pi-Hole Is a Rich Source of Data

While a lot of my inspiration for blog posts comes from talking with New Relic users, it's hard to share those conversations as examples because they're so specific and often confidential. So I find myself struggling to find a generic "for instance" that's easy to understand and accessible to everyone. Which should explain why I use my home environment as the sample use case so often. Even if you don't have exactly the same gear or setup I do, it's likely you have something analogous. On top of that, if you don't have the specific element I'm discussing, many times I believe it's something you ought to consider.

That brings us to my example today: Pi-Hole. Pi-Hole acts as a first-level DNS server for your network. But what it REALLY does is make your network faster and safer by blocking requests to malicious, unsavory, or just plain obnoxious sites. If you're using Pi-Hole, it'll be most noticeable in the way advertisements on a webpage load.

BEFORE: pop-overs and hyperbolic ads
AFTER: No pop-overs, spam ads blocked

But under the hood, it's even more significant.

BEFORE: 45 seconds to load
AFTER: 6 seconds to load

Look in the lower-right corner of each of those images. Load time without Pi-Hole was over 45 seconds. With it, the load time was 6 seconds. You may doubt there are many pages like this, but the truth is web pages link to these sites all the time. Here are the statistics from my house on a typical day.

How Does the Pi-Hole API Work?

If you have Pi-Hole running, you get to the API by going to http://<your pi-hole url>/admin/api.php?summaryRaw. The result will look something like this:

{"domains_being_blocked":115897,"dns_queries_today":284514,"ads_blocked_today":17865,"ads_percentage_today":6.279129,"unique_domains":14761,"queries_forwarded":216109,"queries_cached":50540,"clients_ever_seen":38,"unique_clients":22,"dns_queries_all_types":284514,"reply_NODATA":20262,"reply_NXDOMAIN":19114,"reply_CNAME":16364,"reply_IP":87029,"privacy_level":0,"status":"enabled","gravity_last_updated":{"file_exists":true,"absolute":1567323672,"relative":{"days":"3","hours":"09","minutes":"53"}}}

Let's format the JSON data so it looks a little prettier:

{
  "domains_being_blocked": 115897,
  "dns_queries_today": 284514,
  "ads_blocked_today": 17865,
  "ads_percentage_today": 6.279129,
  "unique_domains": 14761,
  "queries_forwarded": 216109,
  "queries_cached": 50540,
  "clients_ever_seen": 38,
  "unique_clients": 22,
  "dns_queries_all_types": 284514,
  "reply_NODATA": 20262,
  "reply_NXDOMAIN": 19114,
  "reply_CNAME": 16364,
  "reply_IP": 87029,
  "privacy_level": 0,
  "status": "enabled",
  "gravity_last_updated": {
    "file_exists": true,
    "absolute": 1567323672,
    "relative": { "days": "3", "hours": "09", "minutes": "53" }
  }
}

The point is, once we have access to all that JSON-y goodness, it's almost trivial (using the Flex integration, which I discussed in this series) to collect it and send it into New Relic, to provide further insight into how your network is performing. At that point, you can start to include the information in graphs like this:

Assuming you have the New Relic infrastructure agent installed on any system on the network that can access your Pi-Hole (and once again, if you need help getting that set up, check out my earlier blog post here), you have relatively few steps to get up and running. First, the YAML file would look like this (you can also find it on the New Relic Flex GitHub repo in the examples folder).
integrations:
  - name: nri-flex
    config:
      name: pihole_simple
      apis:
        - name: pihole_simple
          url: http://pi.hole/admin/api.php?summaryRaw&auth= # <your API key here>
          headers:
            accept: application/json
          remove_keys:
            - timestamp

Next, the NRQL you'd need to set up the two charts is as follows.

For the "Query Volume" chart:

FROM pihole_simpleSample SELECT average(dns_queries_all_replies), average(dns_queries_today), average(queries_forwarded), average(queries_cached), average(dns_queries_all_types) TIMESERIES

For the "Blocking Activity" chart:

FROM pihole_simpleSample SELECT average(ads_blocked_today), average(domains_being_blocked) TIMESERIES

This is, of course, only the start of the insights you can gain from your Pi-Hole server (and by extension, ANY device or service that has an API with endpoints that provide data). If you find additional use cases, feel free to reach out to me in the comments below, on social media, or when you see me at a conference or meet-up.
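If you'd rather poll the endpoint yourself before wiring up Flex, here is a minimal Java sketch that fetches the same summaryRaw payload. The pi.hole host name is the same placeholder used in the Flex config above; substitute your own Pi-Hole address.

Java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PiHoleStats {
    public static void main(String[] args) throws Exception {
        // The summaryRaw endpoint is the one the Flex config above polls.
        URI uri = URI.create("http://pi.hole/admin/api.php?summaryRaw");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri)
                .header("Accept", "application/json")
                .GET()
                .build();

        // Print the raw JSON summary; a real integration would parse it
        // and forward the metrics to a monitoring backend.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}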

By Leon Adato
High Throughput vs. Low Latency in Data Writing: A Way to Have Both

This article is about how Apache Doris helps you import data and conduct change data capture (CDC) from upstream databases like MySQL to Doris based on Flink streaming. But first of all, you might ask: what is Apache Doris, and why would I bother? Apache Doris is an open-source real-time analytical data warehouse that supports both high-concurrency point queries and high-throughput complex analysis. It provides sub-second analytic query capabilities and comes in handy in multi-dimensional analysis, dashboarding, and other real-time data services.

Overview

This article covers:

- How to perform end-to-end data synchronization within seconds
- How to ensure real-time data visibility
- How to smoothen the writing of massive numbers of small files
- How to ensure end-to-end exactly-once processing

Real-Timeliness: Stream Write

The Flink-Doris Connector used to follow a "cache and batch write" method for data ingestion. However, that requires a careful choice of batch size and batch write interval; otherwise, things can go wrong. For example, if the batch size is too large, OOM errors could occur, while overly frequent writes generate too many data versions. To avoid such troubles, Doris implements a Stream Write method, which works as follows (a bare-bones sketch of such a request appears at the end of this article):

1. A Flink task, once started, asynchronously initiates a Stream Load HTTP request.
2. The data is transmitted to Doris via the chunked transfer encoding mechanism of HTTP.
3. The HTTP request ends at the checkpoint, which means the Stream Load task is completed. Meanwhile, the next Stream Load request is asynchronously initiated.
4. Repeat the above steps.

Transaction Processing

Quick Aggregation of Data Versions

Highly concurrent writing of small files can generate too many data versions in Doris and slow down data queries. Doris has therefore enhanced its data compaction capability in order to aggregate data quickly.

Firstly, Doris introduced Quick Compaction: compaction is triggered as soon as data versions increase. Meanwhile, by scanning the metadata of tablets, Doris can identify tablets with too many data versions and compact them accordingly. Secondly, for the writing of small files, which happens at high concurrency and frequency, Doris implements Cumulative Compaction. It isolates these compaction tasks from the heavyweight Base Compaction from a scheduling perspective to avoid mutual interference. Last but not least, Doris adopts a tiered data aggregation method, which ensures that each aggregation only involves files of similar sizes. This greatly reduces the total number of aggregation tasks and the CPU usage of the system.

Exactly-Once

Exactly-once semantics means that the data is processed once and only once; it prevents the data from being reprocessed or lost even if the machine or application fails. Flink implements a two-phase commit (2PC) protocol to realize exactly-once semantics for sink operators. Based on this, the Flink-Doris Connector implements Stream Load 2PC to deliver exactly-once processing. The details are as follows:

1. A Flink task initiates a Stream Load PreCommit request once it is started. A transaction is then opened, and data is continuously sent to Doris via the chunked mechanism of HTTP.
2. The HTTP request ends at the checkpoint, and the Stream Load is completed. The transaction status is set to Pre-Committed: the data has been written to BE but is still invisible to users.
3. The checkpoint initiates a request and changes the transaction status to Committed. After this, the data becomes visible to users.
4. In the case of Flink application failures, if the previous transaction is in Pre-Committed status, the checkpoint initiates a rollback request and changes the transaction status to Aborted.

Performance of Doris in High-Concurrency Scenarios

Scenario: Import data from Kafka using Flink. After ETL, use the Flink-Doris Connector for real-time data ingestion into Doris.

Requirements: The upstream data is written into Doris at a high frequency of 100,000 records per second. To achieve real-time data visibility, the upstream and downstream data need to be synchronized within around 5 s.

Flink configuration: concurrency of 20, checkpoint interval of 5 s.

Here's how Doris does it:

- Compaction real-timeliness: As the results show, Doris manages to aggregate data quickly and keeps the number of data versions in tablets below 50, while the compaction score remains stable.
- CPU usage: After optimizing the compaction strategy for small files, Doris reduces CPU usage by 25%.
- Query latency: By reducing CPU usage and the number of data versions, Doris arranges the data more orderly and thus achieves much lower query latency.

Performance of Doris in Low-Latency Scenarios (High-Level Stress Test)

Description: a single-BE, single-tablet Stream Load stress test on the client side, with data real-timeliness under 1 s. Here are the compaction scores before and after optimization:

Suggestions for Using Doris

- Low-latency scenarios: For scenarios requiring real-time data visibility (such as data synchronization within seconds), the files in each ingestion are usually small. It is therefore recommended to reduce cumulative_size_based_promotion_min_size_mbyte from the default value of 64 to 8 (measured in MB). This can greatly improve compaction performance.
- High-concurrency scenarios: For highly concurrent writing, it is recommended to reduce the frequency of Stream Load by increasing the checkpoint interval to 5-10 s. This not only increases the throughput of Flink tasks but also reduces the generation of small files and thus avoids extra pressure on compaction. In addition, for scenarios with less strict real-timeliness requirements (such as data synchronization within minutes), it is recommended to increase the checkpoint interval to 5-10 minutes; the Flink-Doris Connector can still ensure data integrity via the 2PC + checkpoint mechanism.

Conclusion

Apache Doris achieves real-timeliness through its Stream Write method, transaction processing capability, and quick aggregation of data versions. These techniques reduce memory and CPU usage, enabling lower latency. In addition, for data integrity and consistency, Doris implements Stream Load 2PC to guarantee that all data is processed exactly once. This is how Doris facilitates quick and safe data ingestion.
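To make the Stream Write mechanism more concrete, here is a minimal, hypothetical Java sketch of a Stream Load request that uses HTTP chunked transfer encoding, as described above. The FE host, credentials, database, and table names are placeholders; a production pipeline would use the Flink-Doris Connector, which also handles labels, 2PC, and the FE's redirect to a backend node.

Java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class StreamLoadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder FE endpoint, database, and table.
        URL url = new URL("http://doris-fe:8030/api/demo_db/sensor_data/_stream_load");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);

        // Chunked transfer encoding: data is streamed to Doris as it arrives
        // instead of being buffered into one large batch.
        conn.setChunkedStreamingMode(4096);

        String auth = Base64.getEncoder().encodeToString("root:".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("label", "sketch-" + System.currentTimeMillis());
        conn.setRequestProperty("format", "csv");
        conn.setRequestProperty("column_separator", ",");

        try (OutputStream out = conn.getOutputStream()) {
            // In a real Flink task this loop would run until the next checkpoint.
            for (int i = 0; i < 3; i++) {
                out.write(("sensor-" + i + ",101." + i + "\n").getBytes(StandardCharsets.UTF_8));
            }
        }
        // NB: the FE may reply with a 307 redirect to a BE node; the official
        // Stream Load clients and the Flink-Doris Connector handle this for you.
        System.out.println("Stream Load response code: " + conn.getResponseCode());
    }
}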

By Frank Z
Using AI To Optimize IoT at the Edge

As more companies combine Internet of Things (IoT) devices and edge computing capabilities, people are becoming increasingly curious about how they could use artificial intelligence (AI) to optimize those applications. Here are some thought-provoking possibilities.

Improving IoT Sensor Inference Accuracy With Machine Learning

Technology researchers are still in the early stages of investigating how to improve the performance of edge-deployed IoT sensors with machine learning. Some early applications include using sensors for image-classification tasks or those involving natural language processing. One example shows how people are making progress.

Researchers at IMDEA Networks recognized that using IoT sensors for specific deep-learning tasks may mean the sensors cannot guarantee specific quality-of-service requirements, such as latency and inference accuracy. The people working on this project developed a machine learning algorithm called AMR² to help with this challenge. AMR² utilizes an edge computing infrastructure to make IoT sensor inferences more accurate while enabling faster responses and real-time analyses. Experiments suggested the algorithm improved inference accuracy by up to 40% compared to the results of basic scheduling tasks that did not use it. The researchers found that an efficient scheduling algorithm such as this one is essential for helping IoT sensors work properly when deployed at the edge.

A project researcher pointed out that the AMR² algorithm could address execution delays if a developer used it for a service similar to Google Photos, which classifies images by the elements they include. A developer could deploy the algorithm to ensure the user does not notice such delays when using the app.

Reducing Energy Usage of Connected Devices With AI at the Edge

A 2023 study of chief financial officers at tech companies determined that 80% expect revenue increases in the coming year. However, that's arguably most likely to happen if employees understand customers' needs and provide products or services accordingly. The manufacturers of many IoT devices intend for people to wear those products almost constantly. Some wearables detect if lone workers fall or become distressed, or if people in physically demanding roles are becoming too tired and need to rest. In such cases, users must feel confident that their IoT devices will work reliably through their workdays and beyond.

That's one of the reasons why researchers explored how using AI at the edge could improve the energy efficiency of IoT devices deployed to study the effects of a sedentary lifestyle on health and how correct posture could improve outcomes. Any IoT device that captures data about how people live must collect that data continuously, with few or no gaps where information gathering stops because the device runs out of battery.

In this case, subjects wore wireless devices powered by coin-cell batteries, each equipped with inertia sensors to collect accurate data about how much people moved throughout the day. The main problem was that the batteries lasted only a few hours due to the large volume of data transmitted: research showed a nine-channel motion sensor that reads 50 samples per second produces more than 100 MB of data daily. However, the researchers recognized that machine learning could enable the algorithms to transfer only critical data from edge-deployed IoT devices to smartphones or other devices that help people analyze the information.
They proceeded to use a pre-trained recurrent neural network and found the algorithm achieved real-time performance, improving the IoT devices' functionality.

Creating Opportunities for On-Device AI Training

Edge computing advancements have opened opportunities to use smart devices in more places. For example, people have suggested deploying smart street lights that turn on and off in response to real-time traffic levels. Tech researchers and enthusiasts are also interested in the increased opportunities associated with AI training that happens directly on edge-deployed IoT devices. This approach could increase those products' capabilities while reducing energy consumption and improving privacy.

An MIT team studied the feasibility of training AI algorithms on intelligent edge devices. They tried several optimization techniques and arrived at one that required only 157 KB of memory to train a machine-learning algorithm on a microcontroller. Other lightweight training methods typically require between 300 and 600 MB of memory, making this innovation a significant improvement.

The researchers explained that any data generated for training stays on the device, reducing privacy concerns. They also suggested use cases where training happens throughout normal use, such as algorithms learning from what a person types on a smart keyboard.

This approach had some undoubtedly impressive results. In one case, the team trained the algorithm for only 10 minutes, which was enough to allow it to detect people in images. This example shows optimization can go in both directions: although the first two examples here focused on improving how IoT devices work, this approach enhanced the AI training process. And when developers train algorithms on the very IoT devices that will eventually use them, the approach mutually benefits the AI algorithms and the IoT edge devices.

How Will You Use AI to Improve How IoT-Edge Devices Work?

These examples show some of the things researchers focused on when exploring how artificial intelligence could improve the functionality of IoT devices deployed at the edge. Let them provide valuable insights and inspiration about how you might get similar results. It's almost always best to start with a clearly defined problem you want to solve. Then, start exploring how technology and innovative approaches could help meet that goal.

By Devin Partida
UUID: Coordination-Free Unique Keys

Let's build an IoT application with weather sensors deployed around the globe. The sensors collect data, and we store the data along with the IDs of the sensors. We run multiple database instances, and the sensors write to the geographically closest database. All databases regularly exchange data, so each database eventually has data from all the sensors.

We need each sensor to have a globally unique ID. How can we achieve that? For example, we could run a service assigning sensor IDs as part of the sensor installation procedure. It would mean additional architectural complexity, but it's doable: sensor IDs are immutable, so each sensor needs to talk to the ID service only once, right after installation. That's not too bad.

What if we need to store a unique ID for each data reading? Hitting the centralized ID service whenever we need to store data is not an option. That would stress the ID service too much, and when the ID service is unavailable, no sensor could write any data. What are the possible solutions? In the simplest case, each sensor could talk to the remote ID service and reserve a block of IDs it could then assign locally without further coordination; when it exhausts the block, it asks the ID service for a new one. This strategy would reduce the load on the ID service, and sensors could function even when the ID service is temporarily unavailable. We could also generate local reading IDs and prefix them with our unique, immutable sensor ID. Or we could be smart and use fancy ID algorithms like FlakeIDs.

The strategies mentioned aim to minimize the need for coordination while ensuring the IDs are globally unique. The ultimate goal is to generate unique IDs without any coordination at all. This is what we call coordination-free unique IDs.

UUID Enters the Scene

Flip a coin 128 times and write down 1 for each head and 0 for each tail. This gives you a sequence of 128 1s and 0s: 128 bits of randomness. That space is so large that the probability of generating the same sequence twice is extremely low; for practical purposes, you can rule it out.

How is that related to UUIDs? If you have ever seen a UUID, you know they look similar to this: 420cd09a-4d56-4749-acc2-40b2e8aa8c42. This format is just a textual representation of 128 bits. How does it work? The UUID string has 36 characters in total. If we remove the 4 dashes, which are there just to make it a bit more human-readable, we are left with 32 hexadecimal digits, 0-F. Each digit represents 4 bits, and 32 * 4 bits = 128 bits. So UUIDs are 128-bit values; we often represent them as strings, but that's just a convenience.

UUIDs have been explicitly designed to be unique and generated without coordination. With a good random generator, 128 random bits are enough to practically guarantee uniqueness. At the same time, 128 bits are not too much, so UUIDs do not occupy much space when stored.

UUID Versions

There are multiple versions of UUIDs. Versions 1-5 are defined in RFC 4122, and they are the most widely used. Versions 6-8 are currently in draft status and might be approved in the future. Let's take a brief look at the different versions.

Version 1

Version 1 is generated using a MAC address and time as inputs. The MAC address ensures uniqueness across multiple machines; the time ensures uniqueness across multiple processes on the same machine. Using the MAC means that generated UUIDs can be tracked to a specific machine.
This can be useful occasionally, but it might not be desirable in other cases, as a MAC address can be considered private information. Interestingly, the time portion is not based on the usual Unix epoch: it counts 100-nanosecond intervals since 00:00:00.00 on October 15, 1582. What is special about October 1582? It's the Gregorian calendar reform. See version 7 for a UUID with a standard Unix epoch.

Version 2

Version 2 is similar to version 1 but adds a local domain ID to the UUID. It's not widely used.

Versions 3 and 5

These versions use a hash function to generate the UUID. The hash function is seeded with a namespace UUID and a name. The namespace UUID ensures uniqueness across multiple namespaces; the name ensures uniqueness within a namespace. Version 3 uses MD5 as the hash function, while version 5 uses SHA-1. SHA-1 generates 160 bits, so the digest is truncated to 128 bits.

Version 4

Version 4 is probably the most popular one. It relies solely on a random generator, similar to the coin-flip example above. This means the quality of the random generator is critical.

Version 6

Version 6 is similar to version 1 but has a different byte ordering: it encodes the time from the most significant to the least significant bits. This allows sorting UUIDs correctly by time when you sort just the bytes representing them.

Version 7

Version 7 uses a 48-bit timestamp and random data. Unlike versions 1, 2, or 6, it uses the standard Unix epoch in milliseconds, and it uses a random generator instead of a MAC address.

Version 8

Version 8 is meant for experimental and private use.

Security Considerations

UUIDs are designed to be unique, but they are not designed to be secret. What's the difference? If you generate a UUID, you can assume it differs from any other UUID generated before or after, but you should not treat it as a password or a secret session identifier. This is what RFC 4122 says about it:

"Do not assume that UUIDs are hard to guess; they should not be used as security capabilities (identifiers whose mere possession grants access), for example. A predictable random number source will exacerbate the situation."

UUID in QuestDB

UUIDs are popular synthetic IDs because they can be generated without any coordination and do not use too much space. QuestDB users often store UUIDs, but until recently, QuestDB did not have first-class support for them. Most users stored UUIDs in a string column, which makes sense because, as we have seen, UUIDs have a canonical textual representation. Storing UUIDs in a string column is possible, but it's inefficient. Let's do some math:

- Each UUID has 128 bits, that is, 16 bytes.
- The canonical textual representation of a UUID has 36 characters.
- QuestDB uses UTF-16 encoding for strings, so each ASCII character uses 2 bytes.
- There is also a fixed cost of 4 bytes per stored string.

So it takes 36 * 2 + 4 = 76 bytes to store a single UUID that contains just 16 bytes of information! And it's not just wasted disk space: QuestDB must read those bytes when evaluating a SQL predicate, joining tables, or calculating an aggregation. Storing UUIDs as strings also makes your queries slower. That's why QuestDB 6.7 implemented UUID as a first-class data type. Applications can declare a column as UUID, each stored UUID then uses only 16 bytes, and SQL queries get faster.
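Before the demo, here is a quick standard-library Java illustration of the ideas above: a freshly generated version 4 UUID, its version and variant fields, and the raw 128 bits behind the 36-character string.

Java
import java.util.UUID;

public class UuidDemo {
    public static void main(String[] args) {
        // Version 4: 122 random bits plus 6 fixed version/variant bits.
        UUID id = UUID.randomUUID();

        System.out.println("uuid:    " + id);           // canonical 36-character form
        System.out.println("version: " + id.version()); // prints 4
        System.out.println("variant: " + id.variant()); // prints 2 (RFC 4122 layout)

        // The canonical string is just a rendering of two 64-bit halves.
        System.out.printf("bits:    %016x%016x%n",
                id.getMostSignificantBits(), id.getLeastSignificantBits());
    }
}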
Demo Time

The demo creates tables occupying just under 100 GB of disk space, so make sure you have enough disk space available. You might also need to increase the query timeout via the query.timeout.sec property; see Configuration for more details. Alternatively, you can change the long_sequence() function to create a smaller number of rows.

Let's create a table with a single string column and populate it with 1 billion random UUIDs. The column is defined as the string type, so the UUIDs will be stored as strings.

SQL
CREATE TABLE tab_s (s string);
INSERT INTO tab_s SELECT rnd_uuid4() FROM long_sequence(1000000000);

Let's try to query it:

SQL
SELECT * FROM tab_s WHERE s = 'ab632aba-be36-43e5-a4a0-4895e9cd3f0d';

It takes around 2.2 s. That's not terrible given it's a full-table scan over one billion strings, but we can do better. How much better? Let's see. Create a new table with a UUID column:

SQL
CREATE TABLE tab_u (u uuid);

Populate it with the UUID values from the first table:

SQL
INSERT INTO tab_u SELECT * FROM tab_s;

The newly created table has the same values as the first table, but the column is defined as UUID instead of string, eliminating the waste we discussed above. Let's see how the predicate performs now:

SQL
SELECT * FROM tab_u WHERE u = 'ab632aba-be36-43e5-a4a0-4895e9cd3f0d';

This query takes around 380 ms on my test box. That's almost 6x better than the original 2.2 seconds! Speed is the key to any real-time analysis, so this is certainly important.

Let's check disk space. The du command shows the space used by each table. First, the table with strings:

Shell
$ du -h
79G ./default
79G .

The table with UUIDs:

Shell
$ du -h
15G ./default
15G .

Declaring the column as UUID saved 64 GB of disk space! Using UUID optimizes query performance and is cost-effective. Last but not least, predicates on UUID values will become even faster in future QuestDB versions, as we are looking at vectorizing them with SIMD instructions.

Conclusion

We use UUIDs to generate globally unique IDs without any coordination. They are 128 bits long, so they do not use too much space. This makes them suitable for distributed applications, IoT, cryptocurrencies, or decentralized finance. When your application stores UUIDs, tell your database it's a UUID; do not store them in a string column. You will save disk space and CPU cycles.

By Jaromir Hamala
How To Test IoT Security

Though the Internet of Things (IoT) has redefined our lives and brought many benefits, it has a large attack surface and is not safe until it is secured. IoT devices are an easy target for cybercriminals and hackers if not properly secured, and you may face serious problems with financial and confidential data being invaded, stolen, or encrypted.

It is difficult to spot and discuss risks for organizations, let alone build a comprehensive methodology for dealing with them, without practical knowledge of what IoT security is and how to test it. Realizing the security threats and how to avoid them is the first step, as Internet of Things solutions require significantly more testing than before. Integrated security is frequently lacking when it comes to introducing new features and products to the market.

What Is IoT Security Testing?

IoT security testing is the practice of evaluating cloud-connected devices and networks to reveal security flaws and prevent devices from being hacked and compromised by a third party. The biggest IoT security risks and challenges can be addressed through a focused approach to the most critical IoT vulnerabilities.

Most Critical IoT Security Vulnerabilities

There are typical issues in security analysis that are missed even by experienced companies. Adequate testing of Internet of Things (IoT) security in networks and devices is required, as any hack into the system can bring a business to a standstill, leading to a loss in revenue and customer loyalty. The top ten common vulnerabilities are as follows:

1. Weak, Easy-to-Guess Passwords: Absurdly simple and short passwords that put personal data at risk are among the primary IoT security risks and vulnerabilities for most cloud-connected devices and their owners. Hackers can co-opt multiple devices with a single guessable password, jeopardizing the entire network.

2. Insecure Ecosystem Interfaces: Insufficient encryption and verification of the user's identity or access rights in the ecosystem architecture (the software, hardware, network, and interfaces outside of the device) enable the devices and associated components to get infected by malware. Any element in the broad network of connected technologies is a potential source of risk.

3. Insecure Network Services: The services running on the device should be given special attention, particularly those that are open to the Internet and carry a high risk of unauthorized remote control. Do not keep ports open, keep protocols updated, and ban any unusual traffic.

4. Outdated Components: Outdated software elements or frameworks leave a device unprotected from cyberattacks. They enable third parties to interfere with the performance of the gadgets, operating them remotely or expanding the attack surface for the organization.

5. Insecure Data Transfer/Storage: The more devices are connected to the network, the higher the level of data storage and exchange security should be. A lack of secure encryption of sensitive data, whether at rest or in transit, can be a failure point for the whole system.

6. Bad Device Management: Bad device management happens because of poor perception of and visibility into the network. Organizations have a bunch of different devices they do not even know about, which are easy entry points for attackers. IoT developers are simply unprepared in terms of proper planning, implementation, and management tools.
7. Poor Secure Update Mechanism
The ability to securely update the software, which is the core of any IoT device, reduces the chances of it being compromised. The gadget becomes vulnerable every time cybercriminals discover a weak point in its security, and if that weakness is not fixed with regular updates, or if there are no regular notifications of security-related changes, the device can become compromised over time.

8. Inadequate Privacy Protection
Personal information is gathered and stored in larger amounts on IoT devices than on smartphones. In case of improper access, there is always a threat of your information being exposed. This is a major privacy concern because most Internet of Things technologies are somehow related to monitoring and controlling gadgets at home, which can have serious consequences later.

9. Poor Physical Hardening
Physical hardening is one of the major aspects of IoT device security, since these devices are connected technology that operates without human intervention. Many of them are intended to be installed in public spaces rather than private homes. As a result, they are built in a basic manner, with no additional level of physical security.

10. Insecure Default Settings
Some IoT devices come with default settings that cannot be modified, or there is a lack of alternatives for operators when it comes to security adjustments. The initial configuration should be modifiable. Default settings that are identical across multiple devices are insecure: once guessed, they can be used to hack into other devices.

How To Protect IoT Systems and Devices

Easy-to-use gadgets designed with little regard for data privacy make IoT security on smart devices tricky. Software interfaces are often unsafe, and data storage and transfer are not sufficiently encrypted. Here are the steps to keep networks and systems safe and secure (a minimal sketch automating two of these checks follows this list):

Introduce IoT security during the design phase: An IoT security strategy has the greatest value if it is introduced from the very beginning, at the design stage. Most concerns and threats that pose risks to an Internet of Things solution can be avoided by identifying them during preparation and planning.

Network security: Since networks pose the risk of any IoT device being remotely controlled, they play a critical role in a cyber protection strategy. Network stability is ensured by port security, anti-malware, firewalls, and blocking IP addresses that a user does not normally use.

API security: Sophisticated businesses and websites use APIs to connect services, transfer data, and integrate various types of information in one place, making them a target for hackers. A hacked API can result in the disclosure of confidential information. That is why only approved apps and devices should be permitted to send requests and responses with APIs.

Segmentation: It is important to segment a corporate network if multiple IoT devices connect directly to the web. Each device should use its own small local network (segment) with limited access to the main network.

Security gateways: These serve as an additional layer of security in IoT infrastructure before data produced by a device is sent out to the Internet. They help track and analyze incoming and outgoing traffic, ensuring that no one can reach the gadget directly.

Software updates: Users should be able to apply changes to software and devices by updating them over a network connection or through automation. Improved software means incorporating new features as well as identifying and eliminating security defects in the early stages.

Integrating teams: Many people are involved in the IoT development process, and they are equally responsible for ensuring the product's security throughout the full lifecycle. It is preferable to have IoT developers work with security experts to share guidance and establish the necessary security controls right from the design stage.
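To make this concrete, here is a minimal, hedged sketch of two automated checks a tester might script against a device on a lab network: flagging open ports that commonly carry unencrypted or unauthenticated services, and verifying that a TLS endpoint presents a certificate the client's trust store accepts. The device address and port list are illustrative assumptions, not part of any standard.

Python
# Illustrative IoT security checks using only the Python standard library.
# DEVICE_HOST and RISKY_PORTS are assumptions for a lab setup you control.
import socket
import ssl

DEVICE_HOST = "192.168.1.50"            # hypothetical device under test
RISKY_PORTS = [21, 23, 80, 1883, 8080]  # FTP, Telnet, plain HTTP, plain MQTT

def scan_risky_ports(host, ports, timeout=1.0):
    """Flag listening services that often ship unencrypted or unauthenticated."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            try:
                if s.connect_ex((host, port)) == 0:
                    open_ports.append(port)
            except OSError:
                pass  # unreachable host, timeout, etc.: treat as closed
    return open_ports

def check_tls(host, port=8883, timeout=3.0):
    """Return True if the device completes a TLS handshake with a trusted cert."""
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.getpeercert() is not None
    except (ssl.SSLError, OSError):
        return False

if __name__ == "__main__":
    print("Open risky ports:", scan_risky_ports(DEVICE_HOST, RISKY_PORTS))
    print("Trusted TLS on 8883:", check_tls(DEVICE_HOST))

In a real engagement, checks like these would be one small part of a broader test plan covering firmware, APIs, and physical access, but they show how easily the "insecure network services" and "insecure data transfer" items can be screened automatically.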
Our team consists of cross-functional experts who are involved from the beginning to the end of the project. We support clients by developing digital strategies based on requirements analysis, planning an IoT solution, and performing IoT security testing services so they can launch a glitch-free Internet of Things product.

Conclusion

To create trustworthy devices and protect them from cyber threats, you have to maintain a defensive and proactive security strategy throughout the entire development cycle. I hope you take away some helpful tips and tricks that will help you test your IoT security. If you have any questions, feel free to comment below.

By Anna Smith
What Is an IoT Gateway? Is It Important?

An IoT (Internet of Things) gateway is a device that acts as a bridge between connected IoT devices and other networks, such as the Internet. It provides a centralized platform for managing and processing data from multiple IoT devices and securely transmitting that data to the cloud or other systems for analysis, storage, and further processing. The IoT gateway can perform various functions, such as data aggregation, protocol translation, security management, and device management.

An IoT gateway builds connections to connected IoT devices through various communication protocols, such as Wi-Fi, Ethernet, Zigbee, Z-Wave, or others. The gateway uses these protocols to communicate with the IoT devices and receive data from them. The gateway can also establish connections to other networks, such as the Internet, through Wi-Fi or Ethernet, to transmit the data it collects from IoT devices to the cloud or other systems for further processing. To ensure the secure transmission of data, the IoT gateway typically employs encryption and authentication methods. Additionally, the gateway can be configured to perform data processing and storage locally to reduce the amount of data transmitted to the cloud or other systems.

Why IoT Gateways Are Important

IoT gateways are important for several reasons:

Connectivity: IoT gateways provide a central platform for connecting and communicating with multiple IoT devices, which may use different communication protocols. The gateway acts as a bridge, allowing these devices to communicate with each other and with other systems, such as the cloud or a local network.

Data processing: IoT gateways can perform data processing tasks such as data aggregation, protocol translation, data filtering, and data compression, reducing the amount of data transmitted to the cloud and improving the efficiency of the IoT network.

Security: IoT gateways provide a secure connection between IoT devices and other systems, using encryption and authentication methods to protect transmitted data. This ensures the privacy and security of the IoT network and the connected devices.

Device management: IoT gateways can manage and control connected IoT devices, updating their firmware, configuring their settings, and monitoring their performance. This simplifies the management of a large number of connected devices and reduces the maintenance overhead.

Cost savings: By performing data processing and storage locally, IoT gateways can reduce the amount of data transmitted to the cloud, reducing the cost of data storage and transmission.

Overall, the IoT gateway is an essential component of an IoT network, providing a centralized platform for connecting, managing, and processing data from connected devices.

How Does an IoT Gateway Work?

An IoT gateway works by serving as a communication hub between IoT devices and other systems, such as the cloud or a local network. It acts as a bridge, connecting devices that use different communication protocols and enabling them to communicate with each other. The following are the key steps involved in the working of an IoT gateway (a minimal code sketch of this loop follows below):

Data collection: The IoT gateway collects data from the connected IoT devices using communication protocols such as Wi-Fi, Ethernet, Zigbee, Z-Wave, or others.

Data processing: The gateway can perform data processing tasks such as data aggregation, protocol translation, data filtering, and data compression, among others.

Data transmission: The processed data is transmitted to the cloud or other systems for further analysis and storage.

Security: The IoT gateway employs security measures, such as encryption and authentication, to protect the transmitted data and ensure secure communication between the devices and the cloud or other systems.

Device management: The IoT gateway can manage and control connected IoT devices, updating their firmware, configuring their settings, and monitoring their performance.

Overall, the IoT gateway plays a crucial role in the functioning of an IoT network, enabling connected devices to communicate with each other and with other systems and providing a platform for data processing and management.
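To ground these steps, here is a minimal, illustrative sketch of the collect-aggregate-transmit loop a simple gateway runs. The sensor reader, batching policy, and upstream endpoint are all stand-in assumptions, not a specific product's API.

Python
# Illustrative gateway loop: collect readings, aggregate, compress, forward.
# Everything here is a stand-in; real gateways speak Zigbee/Z-Wave/Modbus etc.
import gzip
import json
import random
import time

UPSTREAM_URL = "https://cloud.example.com/ingest"  # hypothetical endpoint

def read_local_sensors():
    # Data collection: stand-in for protocol-specific device reads.
    return [{"device": f"sensor-{i}",
             "temp_c": round(random.uniform(18.0, 25.0), 2),
             "ts": time.time()} for i in range(3)]

def aggregate(readings):
    # Data processing: average per device so less data goes upstream.
    by_device = {}
    for r in readings:
        by_device.setdefault(r["device"], []).append(r["temp_c"])
    return {dev: sum(vals) / len(vals) for dev, vals in by_device.items()}

def prepare_payload(summary):
    # Data compression before transmission reduces bandwidth costs.
    return gzip.compress(json.dumps(summary).encode("utf-8"))

if __name__ == "__main__":
    batch = []
    for _ in range(10):          # poll the local devices repeatedly
        batch.extend(read_local_sensors())
        time.sleep(0.1)
    payload = prepare_payload(aggregate(batch))
    # Data transmission would POST `payload` to UPSTREAM_URL over TLS;
    # it is omitted here to keep the sketch dependency-free.
    print(f"Would send {len(payload)} compressed bytes to {UPSTREAM_URL}")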
How Many Types of IoT Gateways Are There?

IoT gateways come in different types based on their form factor, connectivity options, processing capabilities, and other factors. Some of the common types of IoT gateways are:

Industrial IoT gateways: These gateways are designed for industrial and commercial applications, such as factory automation and building management systems. They are rugged, have multiple connectivity options, and can operate in harsh environments.

Home automation gateways: These gateways are designed for use in residential environments to control and manage connected home devices, such as smart locks, lighting systems, and thermostats.

Wireless IoT gateways: These gateways are designed for wireless communication with connected devices, using protocols such as Wi-Fi, Zigbee, Z-Wave, or others. They provide a low-power, low-cost solution for connecting devices in a small area.

Embedded IoT gateways: These gateways are integrated into the connected devices themselves, providing a compact and integrated solution for small IoT networks.

Multi-protocol IoT gateways: These gateways can communicate with devices using multiple communication protocols, such as Wi-Fi, Ethernet, Zigbee, Z-Wave, and others. They provide a flexible solution for connecting a variety of devices to a network.

Cloud-based IoT gateways: These gateways are hosted in the cloud, providing a remote-access solution for managing and processing data from connected devices.

Each type of IoT gateway has its own advantages and disadvantages, and the right choice depends on the specific requirements of the IoT network and the connected devices.

By Paridhi Dhamani
Are Industrial IoT Attacks Posing a Severe Threat to Businesses?

What Is the Industrial Internet of Things (IIoT)?

IIoT refers to using interconnected devices, sensors, and machines in industrial settings. These devices can monitor and analyze data from various systems, giving businesses real-time insights into their operations. For example, a factory might have IIoT sensors installed throughout its assembly lines. Each sensor collects information about what's happening in that area of the factory, such as temperature levels or product quality. This information is then collected by a server (or "hub") that aggregates the data from each sensor and displays it on an interactive map for easy viewing. This allows factory managers to better understand what's happening at each stage of production, and when something goes wrong, so they can respond quickly and effectively. IIoT has the potential to revolutionize various industries, including manufacturing, transportation, and energy, by making operations more efficient, reducing downtime, and improving product quality.

What Are IIoT Attacks?

IIoT attacks are malicious activities aimed at disrupting, damaging, or taking control of IIoT systems. These attacks can be carried out by hackers, cybercriminals, or even disgruntled employees. The main goal of these attacks is to damage the systems, steal sensitive data, or compromise the business's operations. Some common types of IIoT attacks include:

Ransomware: This type of attack uses malware to encrypt the data on the IIoT devices, making it inaccessible to the business until a ransom is paid.

Distributed Denial of Service (DDoS): DDoS attacks overwhelm the IIoT systems with a flood of traffic, rendering them unusable. This attack makes an online service, network resource, or machine unavailable to its intended users.

Man-in-the-Middle (MITM) Attack: This type of attack intercepts the communication between IIoT devices and alters it to gain access to sensitive data or take control of the systems.

Malware: Malware can infect IIoT devices, enabling attackers to steal data, take control of the systems, or cause damage.

Physical Attacks: Attackers can physically access IIoT devices and systems to steal, modify, or destroy them.

Why Are IIoT Attacks a Severe Threat to Businesses?

IIoT attacks pose a severe threat to businesses that rely on these systems, and their consequences can be severe and long-lasting. IIoT attacks can impact enterprises in several ways, including:

Financial Loss: An IIoT attack can lead to significant financial losses for businesses, including lost revenue, damage to equipment, and the cost of remediation.

Reputation Damage: If a business suffers an IIoT attack, its reputation may be severely damaged, losing customers and trust.

Regulatory Compliance: Many industries have regulatory compliance requirements that businesses must meet. An IIoT attack can result in a breach of these regulations, leading to penalties and fines.

Safety Concerns: In some cases, IIoT attacks can have severe safety implications, such as disrupting critical infrastructure or systems essential for public safety.

Intellectual Property Theft: Businesses that rely on IIoT systems may have valuable intellectual property stored on those systems. An IIoT attack can result in the theft of this intellectual property, compromising the competitiveness of the business.

How Can Businesses Protect Themselves From IIoT Attacks?

Businesses can take several steps to protect themselves from IIoT attacks. Some best practices include:

Develop a Cybersecurity Plan: A cybersecurity plan should be developed that takes into account the unique risks associated with IIoT. This plan should identify potential threats and risks, assess vulnerabilities, and outline appropriate responses.

Conduct Regular Risk Assessments: Regular risk assessments are necessary to identify vulnerabilities in the IIoT environment. The assessments should include identifying weaknesses in hardware and software, identifying potential attack vectors, and evaluating the effectiveness of existing security measures.

Implement Appropriate Access Controls: Access to IIoT systems should be limited to authorized personnel. This can be achieved through robust authentication mechanisms, such as multi-factor authentication, and by restricting access to sensitive data and systems on a need-to-know basis.

Use Secure Communication Protocols: IIoT devices should use secure communication protocols, such as SSL/TLS, to ensure that data is transmitted securely. Devices should also be configured to accept communications only from authorized sources (a minimal example follows this list).

Implement Security Measures at the Edge: Edge computing can help secure IIoT systems by allowing security measures to be implemented closer to the data source. This can include using firewalls, intrusion detection systems, and antivirus software.

Ensure Software and Firmware Are Up to Date: Keeping software and firmware up to date is essential to ensure that known vulnerabilities are addressed. This includes not just the IIoT devices themselves but also any supporting software and infrastructure.

Implement Appropriate Physical Security Measures: Physical security measures, such as access control and monitoring, should be implemented to protect IIoT devices from physical tampering.

Develop an Incident Response Plan: An incident response plan should be developed to ensure appropriate action is taken during an IIoT attack. This plan should outline steps to be taken to minimize damage, contain the attack, and restore normal operations.

Provide Employee Training: Employees should be trained on the risks associated with IIoT and how to recognize and respond to potential threats. This includes educating employees on best practices for secure passwords, safe browsing habits, and identifying suspicious activity.
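As one concrete instance of the secure-communication advice above, here is a minimal, hedged sketch of an industrial sensor publishing a reading over MQTT with TLS and per-device credentials, using the widely used paho-mqtt client (1.x API assumed). The broker address, topic, and certificate paths are placeholder assumptions for your own PKI and plant network.

Python
# Hedged sketch: authenticated, encrypted MQTT publish from an IIoT device.
# Assumes the paho-mqtt 1.x API (pip install "paho-mqtt<2").
import ssl
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"            # hypothetical plant broker
client = mqtt.Client(client_id="press-sensor-01")

# Authenticate the device and encrypt the channel; the CA bundle and
# client certificate paths are placeholders for your own PKI.
client.username_pw_set("line3-sensor", "a-strong-unique-password")
client.tls_set(ca_certs="ca.pem",
               certfile="device.crt",
               keyfile="device.key",
               cert_reqs=ssl.CERT_REQUIRED)  # reject untrusted brokers

client.connect(BROKER, port=8883)        # 8883 is the conventional MQTT/TLS port
client.loop_start()
info = client.publish("plant1/line3/pressure", payload="42.1", qos=1)
info.wait_for_publish()                  # block until the QoS 1 publish completes
client.loop_stop()
client.disconnect()

Combined with broker-side access control lists, a setup like this keeps unauthenticated devices off the bus and makes MITM interception of the telemetry channel substantially harder.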
To Conclude

The rapid adoption of industrial IoT has increased efficiency but has also broadened the threat vector in the IoT landscape. Protecting against IIoT attacks requires a multi-faceted approach that includes strong access controls, secure communication protocols, regular risk assessments, and a comprehensive incident response plan. By taking these steps, businesses can minimize the risks associated with IIoT and protect themselves from potentially devastating consequences.

By Deepak Gupta
Five Arguments for Why Microsoft Azure Is the Best Option for Running Industrial IoT Solutions

The current technological landscape demands digital transformation, and the industrial internet of things (IoT) is undoubtedly one of the best routes to it. Industrial IoT solutions leverage connected sensors, actuators, and other smart devices to monitor, track, and analyze available data and make the best use of it for enhanced efficiency and minimized costs. However, that leaves us with an important question: which cloud computing platform is the best option for running industrial IoT solutions?

According to research by Statista, over 70% of organizations are using Microsoft Azure for their cloud services. Moreover, Gartner recently reported that Microsoft Azure is one of the key leaders among the 16 top global companies considered for industrial IoT platforms. With these stats and facts, you probably have the answer already. Still, you might be wondering what makes Microsoft Azure a popular choice over others. This blog outlines five significant reasons why Azure is the preferred cloud IoT option in industrial IoT infrastructure, along with how it can help development teams optimize their operational efficiency.

Microsoft Azure as a Cloud IoT Platform: An Overview

Launched in 2010, Microsoft Azure is one of the three leading private and public cloud computing platforms worldwide. Although Azure arrived comparatively late, its features have made it a strong contender in the AWS vs. Azure vs. Google Cloud debate. The global revenue growth of Microsoft Azure stood at 40% in the last quarter of 2022, and its total revenue in terms of public cloud platform as a service (PaaS) reached $111 billion as of last year.

It's the versatility of this cloud IoT platform that attracts software developers, engineers, and designers. Azure's IoT solutions cover almost every aspect of industrial IoT development, from linking devices and systems to providing decision-makers with valuable insights. The following section highlights some of the benefits of Microsoft Azure cloud IoT solutions.

Benefits of Azure Cloud IoT

1. Simplicity and Convenience
One of the best things about Microsoft products is that they are convenient for all types of users, irrespective of their skills. From integrating app templates to leveraging SDKs, everything requires minimal coding. In addition, the platform provides users with several shortcuts for easy wireframing, prototyping, and deployment.

2. Robust Network of Partners
Just like Amazon Web Services, Microsoft Azure has an ever-growing list of globally acclaimed IoT partners, including a vast community of software developers and IoT hardware manufacturers.

3. Interactive Service Integrations
In Azure IoT Central, one of Microsoft Azure's core IoT solutions, you will find a plethora of useful tools and services. For instance, with the help of AccuWeather, you can get insights in the form of weather intelligence reports. Similarly, developers can build a virtual representation of their physical IoT environment with Azure Digital Twins, which can also help identify the dependencies and correlations between different parts of the environment.

4. Top-Notch Security
Keeping cybersecurity threats in mind, Microsoft has focused specifically on the security aspects of all its products and services.
Each Azure cloud IoT service is equipped with its own security features to help protect data and prevent code files from getting infected with viruses.

Reasons Why Microsoft Azure Is the Best Option for Running Industrial IoT Solutions

Azure IoT Central: A Robust SaaS Platform

Although developers can build end-to-end IoT products using the basic Microsoft Azure services, the plumbing involved can feel complex. In such cases, Azure IoT Central can be an ideal way to link your existing devices and manage them from the cloud without building a custom solution. Azure IoT Central is a SaaS product that abstracts Azure's fundamental IoT PaaS capabilities, making it convenient to extract value from the linked devices. Besides this, the public-facing APIs help provide a seamless user experience throughout the development process, whether you are creating dashboards or connecting IoT devices.

A Rich and Vibrant Partner Ecosystem

One crucial thing to understand about creating a successful IoT solution is that it is not just about writing code and deploying software; it is more about how efficiently the solution can manage devices and analyze data. For this, you need a professional system integration team that can pick the right hardware and incorporate it with legacy OT technologies. Microsoft Azure cloud IoT solutions provide users with a massive range of software and hardware offerings. For instance, consider one of its product ranges, Azure Stack Edge: developers can choose between a rugged, battery-powered device and a standard server-grade alternative with 32 vCPUs, 204 GB of RAM, 2.5 TB of local storage, and 2 NVIDIA A2 GPUs. This is one of the reasons why several popular industrial IoT players, like Schneider, ABB, PTC, and Siemens, have built their platforms on the Microsoft cloud. All these examples show that Microsoft Azure has a rich and vibrant partner ecosystem delivering intuitive industrial IoT solutions.

A DevOps-Friendly Platform

Edge computing plays a significant role in developing industrial IoT solutions, and Azure IoT Edge, a robust edge computing platform, performs that function well, making developers and system operators more efficient. Azure IoT Edge can run on both AMD64 and ARM64 platforms, and it can be used to form a channel between the public cloud and local devices. Developers can write business logic as standard Docker containers, which are then installed as modules in Azure IoT Edge. Operators can also incorporate Kubernetes, an open-source tool for automating deployment and management, with Azure IoT Edge to continuously monitor the deployments.

An Ultimate Level of Security

As discussed earlier, Microsoft has invested both time and money in enhancing the security aspects of all its products and services. Here are some of the security services you can avail of with Microsoft Azure:

Azure Sphere: A one-stop solution for users seeking protection for cloud-to-edge and edge-to-cloud integration. Since the device is securely integrated with Azure IoT Central and Azure IoT Hub, users can easily and quickly build secure connected solutions.

Azure Defender for IoT: Provides end-to-end security for IoT devices. Azure Defender for IoT leverages features such as behavioral analytics and threat intelligence to constantly monitor IoT devices for unauthorized and unwanted activity.
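To make the device-to-cloud path above concrete, here is a minimal, hedged sketch of a device sending one telemetry message to Azure IoT Hub with the azure-iot-device Python SDK (v2); the connection string is a placeholder you would copy from your own hub's device registry.

Python
# Minimal sketch: send one JSON telemetry message to Azure IoT Hub using the
# azure-iot-device SDK (pip install azure-iot-device). The connection string
# is a placeholder obtained from your IoT Hub's device registry.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONN_STR = "HostName=<your-hub>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()

msg = Message(json.dumps({"temperature": 22.4, "humidity": 41}))
msg.content_type = "application/json"   # lets IoT Hub routing inspect the body
msg.content_encoding = "utf-8"
client.send_message(msg)

client.shutdown()                        # closes the connection cleanly

From there, the hub can route the message to the analytics services described in the next section, such as Azure Stream Analytics or a data lake.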
Easy Integration With AI and Data Analytics

To make an IoT solution more functional and efficient, it is crucial to integrate it with advanced technologies like artificial intelligence, machine learning, and big data analytics. Incorporating these technologies simplifies processes and saves developers both effort and time. With Azure Stream Analytics, developers can quickly store and process local telemetry data. Azure Data Lake or Azure Cosmos DB can also be used to store data ingested from sensors; that data can then be passed through Azure ML and Power BI to perform predictive analytics and build predictive models.

Wrapping Up

Several experts describe Microsoft Azure as an IoT cloud platform with 'limitless potential' and 'unlimited possibilities,' and now you probably know why. In fact, observing Azure's rapid growth, some report that the day is not far when it will surpass the dominance of Amazon Web Services (AWS). Microsoft Azure has everything in the package, including services for data management and analysis. Its features and cloud computing capabilities can help developers construct an industrial IoT environment in the best way possible.

By Shikhar Deopa
Key Characteristics of Web 3.0 Every User Must Know

Web 3.0 is the future version of the present-day Internet, built on public blockchains: the record-keeping systems best known for carrying out crypto transactions. Unlike its predecessors, the key feature of Web 3.0 is its decentralized mechanism: instead of accessing the Internet through services governed by major tech players, individuals and users themselves govern the services, and users gain the privilege of controlling various parts of the Internet.

Web 3.0 doesn't necessarily demand any form of "permissions," meaning that governing bodies have no role in deciding the accessibility of Internet services, nor is any "trust" required. Hence, no intermediary body is necessary to carry out virtual transactions among the different parties involved. Since such intermediaries handle most of today's data collection, Web 3.0 will protect user privacy in a better manner.

Decentralized Finance, or DeFi, is an integral component of Web 3.0 and has gained significant traction recently. It involves executing real-world financial transactions over blockchain technology without any assistance from banks or the government. Larger enterprises across different industries are now investing in Web 3.0, although it is not easy to assume that their engagement won't drive results toward some form of centralized authority.

What Is Web 3.0?

Web 3.0, also called the Semantic Web or the read-write-execute web, is the web era starting from 2010 that describes the future of the web. Technologies like artificial intelligence and machine learning allow user systems to analyze data the same way humans do, which assists in the smart generation and distribution of valuable content tailored to the user's needs.

There are many differences between Web 2.0 and Web 3.0, with decentralization at the core of Web 3.0. Web 3.0 developers do not always create and deploy applications that run on a single server or store their data in a single database (hosted on and managed by a cloud service provider). Rather, Web 3.0 applications are built on blockchains, decentralized networks of multiple servers, or a hybrid of the two. These programs are also called decentralized apps, or DApps. In the Web 3.0 ecosystem, network participants and developers are recognized and rewarded for delivering the best services toward creating a stable and secure decentralized network.

Benefits of Web 3.0 Over Its Predecessors

Since no intermediaries are involved in Web 3.0, they no longer control user data. This also eliminates the possibility of government or corporate restrictions and damage from denial-of-service (DoS) attacks.

Compared to previous web versions, getting accurately refined results from search engines has long been challenging. However, search engines have significantly improved their ability to discover semantically relevant results based on users' search intent and context. This has made web browsing more convenient than before, allowing users to easily find the specific piece of information they need.

Customer service has also been important for driving a positive user experience on websites and web applications. However, leading web-driven organizations find it difficult to scale their customer operations due to high expenditures.
Users can get a better experience while engaging with support teams through AI-driven chatbots that can 'talk' to multiple customers simultaneously, a capability backed by the emergence of Web 3.0 and its use of artificial intelligence and machine learning technologies.

Significant Characteristics of Web 3.0

The transition to Web 3.0 is taking place at a very slow pace and might go unnoticed by the general web audience. Web 3.0 applications strongly resemble Web 2.0 applications in look and feel; however, their back end differs fundamentally. The future of Web 3.0 is headed toward universal applications that can be easily read and used by many types of devices and software, making the end user's commercial activities seamless.

Decentralization of data and the establishment of transparent and secure environments will emerge with the advent of next-gen technologies like distributed ledgers and blockchain, dissolving Web 2.0's centralized surveillance and bombardment of advertisements. In a decentralized web like Web 3.0, individual users get complete control of their data, with decentralized infrastructure and application platforms displacing centralized, tech-based organizations.

The following are some major properties of Web 3.0 that help in understanding the complexities and intricacies of this emerging web version.

Semantic Web

The concept of the Semantic Web, a critical element of Web 3.0, was coined by Tim Berners-Lee to describe a web of data that machines can analyze. In layman's terms, the syntax of two phrases can differ while their semantics remain similar, and semantics is centered on the meaning conveyed by the facts. Two cornerstones are linked with Web 3.0: the semantic web and artificial intelligence. The semantic web will help computer systems understand what data means, while AI will assist in creating real-world use cases that enhance the use of that data. The primary concept is to create a knowledge loop across the Internet that supports understanding words and then generating, sharing, and connecting content via search and analytics tools. Web 3.0 will boost data communication thanks to semantic metadata. As a result, the user experience is elevated to a higher level of connectivity that benefits from all the available data being easily accessible.

Artificial Intelligence

Owing to artificial intelligence technology, websites can now filter out and surface the most relevant facts. In the present Web 2.0 era, enterprises have started soliciting customer feedback to better understand product or service quality, and peer reviews are one of the major contributors to this present-day web. However, human recommendations and opinions can be biased toward a particular service. Various AI models are now being trained to differentiate between good and bad data and to offer suggestions backed by relevant and accurate information.

Ubiquitous

The ubiquitous characteristic of Web 3.0 refers to the concept of existing or being present everywhere simultaneously; this feature is already present in Web 2.0 as well. For instance, on social media platforms, users share their photos online with everyone, which makes the sharer the intellectual property owner of the shared media. Once shared online, the photo becomes available everywhere, making it ubiquitous.
With the increasing number of mobile devices and growing Internet penetration across them, Web 3.0 will be accessible from anywhere, at any time. Unlike previous web versions, the Internet won't be restricted to desktops or smartphones. With everything around us getting interconnected in a digital ecosystem called the Internet of Things, Web 3.0 can be seen as the web of everything, everywhere.

3D Graphics

Web 3.0 will shape the future of the Internet as it transitions from a two-dimensional web to a more realistic three-dimensional digital world. Web 3.0 services and websites in sectors such as eCommerce, online gaming, and real estate will make extensive use of three-dimensional design.

By Rishabh Sinha

Top IoT Experts

Frank Delporte
Java Developer - Technical Writer, CodeWriter.be

Frank Delporte is a technical writer at Azul, blogger on webtechie.be and foojay.io, author of "Getting started with Java on Raspberry Pi" (https://webtechie.be/books/), and contributor to Pi4J. Frank blogs about his experiments with Java, sometimes combined with electronic components, on the Raspberry Pi.
Tim Spann
Principal Developer Advocate, Cloudera

https://github.com/tspannhw/SpeakerProfile/blob/main/README.md
Tim Spann is a Principal Developer Advocate in Data In Motion for Cloudera. He works with Apache NiFi, Apache Pulsar, Apache Kafka, Apache Flink, Flink SQL, Apache Pinot, Trino, Apache Iceberg, DeltaLake, Apache Spark, big data, IoT, cloud, AI/DL, machine learning, and deep learning. Tim has over ten years of experience with IoT, big data, distributed computing, messaging, streaming technologies, and Java programming. Previously, he was a Developer Advocate at StreamNative, Principal DataFlow Field Engineer at Cloudera, a Senior Solutions Engineer at Hortonworks, a Senior Solutions Architect at AirisData, a Senior Field Engineer at Pivotal, and a Team Leader at HPE. He blogs for DZone, where he is the Big Data Zone leader, and runs a popular meetup in Princeton and NYC on big data, cloud, IoT, deep learning, streaming, NiFi, blockchain, and Spark. Tim is a frequent speaker at conferences such as ApacheCon, DeveloperWeek, Pulsar Summit, and many more. He holds a BS and MS in computer science.
Carsten Rhod Gregersen
Founder and CEO, Nabto

Carsten Rhod Gregersen is the CEO and Founder of Nabto, a P2P IoT connectivity provider that enables remote control of devices with secure end-to-end encryption.
Emily Newton
Editor-in-Chief, Revolutionized

Emily Newton is a journalist who regularly covers stories for the tech and industrial sectors. She loves seeing the impact technology can have on every industry.

The Latest IoT Topics

Data Encryption: Benefits, Types, and Methods
This post explains data encryption and lists its benefits, types, and the common encryption methods found in different tools.
April 20, 2023 by Alex Tray · 1,515 Views · 1 Like

Understanding Angular Route Resolvers
Better manage your components in a few, simple steps.
April 20, 2023 by Anastasios Theodosiou · 112,967 Views · 7 Likes

Benefits of React V18: A Comprehensive Guide
This article will cover three key features of React v18: automatic batching, transition, and suspense on the server.
April 20, 2023 by Beste Bayhan · 2,095 Views · 1 Like

External Data Services Plugin Design in IBM Content Navigator and a Sample Scenario of Implementation
In this article, the reader will learn and understand the basic concepts of External Data Services of IBM Content Navigator and a sample scenario of implementation.
April 19, 2023 by Ravikiran Kandepu · 1,384 Views · 3 Likes

Step-By-Step Guide to Building a High-Performing Risk Data Mart
We aim to serve four needs in our data platform development: monitoring and alerting, query and analysis, dashboarding, and data modeling.
April 19, 2023 by Jacob Chow · 1,498 Views · 1 Like

Diving Into Cloud Infrastructure: An Exploration of Its Different Components
In this blog, we'll explore the building blocks of cloud infrastructure, including virtualization, containers, microservices, and serverless computing.
April 19, 2023 by Ruchita Varma · 1,747 Views · 1 Like

Apache Kafka for Data Consistency
Apache Kafka ensures data consistency across legacy batch, request-response mobile apps, and real-time streaming in a data mesh architecture.
April 18, 2023 by Kai Wähner CORE · 1,769 Views · 1 Like

Demystifying Data Fabric Architecture: A Comprehensive Overview
This article will provide a comprehensive overview of the data fabric architecture, its key components, and how it works.
April 18, 2023 by Amlan Patnaik · 2,698 Views · 1 Like

Usability Testing: A Comprehensive Guide With Examples and Best Practices
Learn how usability testing is a great way to discover unexpected bugs, find what is unnecessary, and have unbiased opinions from an outsider.
April 18, 2023 by Kavita Joshi · 1,539 Views · 1 Like

Prompt Engineering: Unlocking the Power of Generative AI Models
Prompt engineering is the art of crafting effective input prompts for AI models like GPT-4 or Google AI Bard, enabling accurate and context-aware results.
April 18, 2023 by Navveen Balani · 2,172 Views · 3 Likes

Embedded Systems Security Vulnerabilities and Protection Measures
Some tips and important considerations for embedded devices, which are often connected to the internet and can be vulnerable to various types of cyberattacks.
April 17, 2023 by Sreekanth Yalavarthi · 2,548 Views · 3 Likes

Python Stack Data Structure: A Versatile Tool for Real-Time Applications
In this article, we will explore the Python stack data structure, its implementation, and real-time use cases.
April 17, 2023 by Amlan Patnaik · 2,099 Views · 1 Like

Securing MQTT With Username and Password Authentication
This article will explain how authentication works in MQTT, what security risks it solves, and introduce password-based authentication.
April 17, 2023 by Kary Ware · 3,329 Views · 1 Like

5 DNS Troubleshooting Tips for Network Teams
DNS is a critical but often ignored component of the networking stack. Monitoring DNS query anomalies can help you detect and correct underlying issues.
April 17, 2023 by Terry Bernstein · 3,668 Views · 2 Likes

Getting Started With Prometheus Workshop: Exploring Basic Queries
In this tutorial, you'll continue your open-source observability journey by exploring basic Prometheus queries using PromQL.
April 15, 2023 by Eric D. Schabell · 5,573 Views · 3 Likes

How Incorporating NLP Capabilities Into an Existing Application Stack Is Easier Than Ever
With the use of pre-trained models, developers can code and query using everyday language. Read on to learn how.
April 14, 2023 by Jorge Torres · 4,281 Views · 1 Like

Optimizing Cloud Performance: An In-Depth Guide to Cloud Performance Testing and Its Benefits
This article explains cloud performance testing and its types, different forms, benefits, and commonly used tools.
April 13, 2023 by Waris Husain · 5,804 Views · 2 Likes

Data Encryption Is the First Line of Defense Against Identity Theft and Cybercrime
Protect your data and identity from cybercriminals with data encryption, the first line of defense against identity theft and cybercrime. Stay secure online!
April 13, 2023 by Ryan Kh · 4,349 Views · 1 Like

5 NAS Backup Strategies: Pros and Cons Explained
In this article, we explain the NAS backup definition, reasons to have a data protection strategy, and five main strategies to back up NAS devices.
April 12, 2023 by Alex Tray · 2,541 Views · 1 Like

Building an Optimized Data Pipeline on Azure Using Spark, Data Factory, Databricks, and Synapse Analytics
This article will explore how Apache Spark, Azure Data Factory, Databricks, and Synapse Analytics can be used together to create an optimized data pipeline in the cloud.
April 11, 2023 by Amlan Patnaik · 3,884 Views · 2 Likes
