The Connection Conundrum - Things (Devices, Sensors, Protocols)

The world around us is connected in more ways than one. The number of devices, sensors, and their applications is continuously growing. One result of all this activity is that huge quantities of data are generated. To make sense of all this data and derive insights and value from it, data connectivity and data ingestion are pivotal activities.

Accessing data across different locations, settings, and connectivity situations is a complex process. In this article, I will outline the available options and how this problem can be approached.

It is practical to approach the problem in terms of the protocols involved.

There are two ways in which a thing (device/sensor) will communicate:

  • Manufacturer-specific (custom): This is on the decline and will soon be obsolete.
  • An industry-accepted protocol: This is the way forward, considering the push towards collaboration and standardization.

In the latter case, there can again be two different approaches.

  • Build your own custom libraries for handling all the different protocols
  • Use readily available packages/tools in the industry, such as Node-RED, to handle the connectivity part (a minimal sketch follows below)
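
As an illustration of the second option, here is a minimal sketch of ingesting readings over MQTT using the off-the-shelf paho-mqtt package. Note that the protocol choice, broker address, and topic are assumptions made purely for illustration; the article does not prescribe a specific protocol or library.

```python
# Minimal MQTT ingest sketch using the paho-mqtt package (v2.x callback API).
# Broker address and topic are placeholders -- adjust for your setup.
import paho.mqtt.client as mqtt

BROKER = "localhost"              # assumed local broker for illustration
TOPIC = "plant/line1/sensors/#"   # hypothetical topic hierarchy


def on_connect(client, userdata, flags, reason_code, properties):
    # (Re)subscribe whenever the connection is established.
    client.subscribe(TOPIC)


def on_message(client, userdata, msg):
    # Hand each payload off to downstream ingestion; here we just print it.
    print(f"{msg.topic}: {msg.payload.decode(errors='replace')}")


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()  # blocking loop; the library handles reconnects
```

A tool like Node-RED wraps this same kind of connectivity into configurable flow nodes, so the trade-off is essentially code-level control versus configuration-level speed.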

Let's look at each of these approaches in a little more detail.

Building your custom libraries

Adopting this approach for handling different protocols may seem like reinventing the wheel at first, but it makes sense when the existing, readily available solutions are not reliable or lack some essential features. For example, the provided libraries may have:

  • identified anomalies
  • performance-related issues
  • environmental issues or restrictions

Building your own custom libraries will entail effort and cost, and they will need to be rigorously tested and qualified. Once the libraries have been developed, maintaining them and keeping them up to date with any changes must also be factored in.
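
To make this concrete, the sketch below shows what a small piece of such a custom library might look like: a parser for a hypothetical manufacturer-specific frame format. The header byte, field layout, and XOR checksum are all invented for illustration, but it is precisely this kind of code that needs the rigorous testing and ongoing maintenance mentioned above.

```python
# Sketch of a custom parser for a hypothetical device frame format:
#   [0xAA][device_id:1][payload_len:1][payload:N][checksum:1]
# The checksum is an XOR over all preceding bytes (invented for this example).
import functools
import struct

FRAME_HEADER = 0xAA  # hypothetical start-of-frame marker


def xor_checksum(data: bytes) -> int:
    return functools.reduce(lambda a, b: a ^ b, data, 0)


def parse_frame(frame: bytes) -> dict:
    if len(frame) < 4 or frame[0] != FRAME_HEADER:
        raise ValueError("invalid header")
    device_id, payload_len = frame[1], frame[2]
    if len(frame) != 3 + payload_len + 1:
        raise ValueError("length mismatch")
    if frame[-1] != xor_checksum(frame[:-1]):
        raise ValueError("checksum mismatch")
    return {"device_id": device_id, "payload": frame[3:-1]}


# Quick self-check: build a frame carrying two big-endian 16-bit readings
# and parse it back.
readings = struct.pack(">HH", 512, 731)
body = bytes([FRAME_HEADER, 0x07, len(readings)]) + readings
print(parse_frame(body + bytes([xor_checksum(body)])))
```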

Some of the apparent advantages are:

  • Complete control over the library, its functionality, upgrades, and future direction
  • Build what you want, when you want
  • Build only the functionality you need

On the other hand, this approach does require:

  • Time, effort, and budget
  • Specific governance for maintenance and upgrades
  • Continuous validation with testing and qualification

Going with an existing solution

Adopting this approach mainly involves evaluation, qualification, and selection. There are multiple solutions available, and newer ones keep entering the market. Once a particular solution is selected, the key focus area that remains is its actual integration and implementation.
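
As an illustration of how little connectivity code an existing solution leaves you to write, here is a minimal read from an OPC UA server using the off-the-shelf asyncua library. OPC UA is used here only as an example industrial protocol, and the endpoint URL and node id are placeholders, not details from the article.

```python
# Minimal OPC UA read using the asyncua package -- session handling,
# security negotiation, and encoding all come with the library.
import asyncio

from asyncua import Client

ENDPOINT = "opc.tcp://localhost:4840"  # placeholder server endpoint
NODE_ID = "ns=2;i=2"                   # placeholder node id of a sensor value


async def main() -> None:
    async with Client(url=ENDPOINT) as client:
        node = client.get_node(NODE_ID)
        value = await node.read_value()
        print(f"{NODE_ID} = {value}")


if __name__ == "__main__":
    asyncio.run(main())
```

With this approach, the effort shifts from writing protocol code to verifying that the chosen solution behaves well against your specific devices and environments.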

Some of the apparent advantages are:

  • The solution is ready to go
  • It has been tested in the industry
  • The upgrade path ahead is clearly defined

On the other hand:

  • No control over the releases and forward path
  • It may not be the best possible fit for all scenarios
  • May involve architecture changes

It is apparent that a single approach may not work in all possible scenarios.

To conclude, each situation will demand a hybrid approach, with varying degrees of implementation of the two approaches mentioned above.
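
In practice, a hybrid setup can be as simple as routing each data source to the appropriate handler. The sketch below is a hypothetical illustration only: the source types, handler names, and sample payloads are invented, and the handlers are stubs standing in for an off-the-shelf library path and a custom one.

```python
# Sketch of a hybrid ingest layer: standard-protocol devices go through an
# off-the-shelf library path, legacy devices through a custom handler.
from typing import Callable, Dict


def ingest_via_standard_library(source: str, payload: bytes) -> None:
    # In practice this would delegate to e.g. an MQTT or OPC UA client library.
    print(f"[standard] {source}: {payload!r}")


def ingest_via_custom_handler(source: str, payload: bytes) -> None:
    # In practice this would call a custom parser for a legacy frame format.
    print(f"[custom]   {source}: {payload!r}")


HANDLERS: Dict[str, Callable[[str, bytes], None]] = {
    "mqtt": ingest_via_standard_library,
    "opcua": ingest_via_standard_library,
    "legacy-serial": ingest_via_custom_handler,
}


def ingest(source_type: str, source: str, payload: bytes) -> None:
    handler = HANDLERS.get(source_type)
    if handler is None:
        raise ValueError(f"no handler registered for {source_type}")
    handler(source, payload)


# Example usage with made-up readings.
ingest("mqtt", "plant/line1/temp", b"23.4")
ingest("legacy-serial", "COM3", b"\xaa\x07\x02\x00\x17\xba")
```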

Also, it should be remembered that resolving connections is just the first step of the process. We must then think about handling, processing, and storage models for all the data being acquired.

I shall touch upon these topics in my subsequent posts.

