If Data is the Hero, is the Device the Superhero?


To monetize any engagement or project, data is a critical element. Data needs to be cleaned, aggregated, ingested, modeled, and analyzed to provide actionable business insights. A project addressing a defined business use case is meaningful only if it yields such insights. IoT projects are multi-dimensional: when it comes to data and its flow, they cover layers such as Device-to-Device, Device-to-Server, and Server-to-Server.

A typical IoT project is represented as a complex system of networks, platforms, interfaces, protocols, devices, and data. IoT devices include sensors, actuators, gateways, and embedded hardware/software within products and assets.

The number and type of IoT devices, as well as the associated use cases for apps and services, will grow exponentially within leading industry verticals. One of the critical success factors for IoT operations, which could well be termed the Operational Support System (OSS) for IoT, is IoT Device Management.

If devices are not managed well, the output data becomes suspect and the project could well fail. The need for IoT device management is paramount: it can be a matter of concern, or an opportunity to differentiate if addressed with care.

Fundamentally, IoT Device Management covers areas such as:

  1. Provisioning and Authentication
  2. Configuration and Control
  3. Software Updates and Maintenance
  4. Monitoring and Diagnostics

I would not slot Security here, as it is a subject in its own right.

Of the four areas listed, the first three can be handled with a degree of control as we initiate the project. The last, Monitoring and Diagnostics, is the most critical because the level of control it demands is the most complex, given the diversity and scale of devices, the corresponding protocols, and the associated challenges of interoperability. Then there is the need to replicate problems and take corrective or predictive actions.
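To make the monitoring-and-diagnostics idea concrete, here is a minimal sketch of the simplest possible health check: flag any device whose last heartbeat is older than an agreed timeout. The device names and the five-minute timeout are hypothetical, chosen purely for illustration.

```python
from datetime import datetime, timedelta

# Illustrative timeout; real fleets would tune this per device class.
HEARTBEAT_TIMEOUT = timedelta(minutes=5)

def stale_devices(last_seen, now):
    """Return device IDs whose last heartbeat is older than the timeout."""
    return sorted(dev for dev, ts in last_seen.items()
                  if now - ts > HEARTBEAT_TIMEOUT)

# Hypothetical last-heartbeat timestamps reported by field devices.
now = datetime(2017, 6, 22, 12, 0)
last_seen = {
    "truck-07-temp": datetime(2017, 6, 22, 11, 58),  # healthy
    "truck-12-temp": datetime(2017, 6, 22, 11, 40),  # silent for 20 minutes
    "gateway-3":     datetime(2017, 6, 22, 11, 59),  # healthy
}
print(stale_devices(last_seen, now))  # ['truck-12-temp']
```

A real OSS would of course add escalation, retry, and diagnostics on top, but even this trivial check is the difference between noticing a silent sensor in minutes and discovering it from corrupted data weeks later.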

Data may have the flamboyance of a hero in the IoT story, but the real workhorses are the devices working at the edge of the IoT system, the “Things” in the Internet of Things. The devices spew out the data, and a healthy device provides honest data.

Devices out in the field are either generating and transmitting data to a centralized platform (a one-way movement) or performing automated tasks that generate data. A mundane job, perhaps, yet the overall performance of a system often hinges on the health of field devices.

Imagine running a critical operation: managing a fleet of trucks in a cold chain shipping perishable goods. If a device, sensor, embedded agent, or gateway begins to falter, and, more importantly, has not been monitored well enough and corrective action has not been taken, the consequences could be dire and the contractual impact disastrous.

The challenge of maintaining devices may sound basic compared with aggregating and analysing data, but it’s essential to a successful IoT strategy.

  • So what is the factor that makes the devices vulnerable, and hence the output data unreliable?

The key factor here is RF technology.

Most IoT devices rely on radio frequency (RF) technologies such as Bluetooth, ZigBee, and Wi-Fi for communications. Also known as far-field transmission, RF is great for communicating over long distances, but becomes problematic when applied to short-range, isolated IoT ecosystems such as wireless personal area networks.

Research reports suggest that link and network security become increasingly difficult as the number of RF devices increases. The relentless push to decrease power consumption in devices leaves less room for handshake and encryption protocols. These issues are reflected in Bluetooth’s increasingly poor reliability and security record.

RF-based devices are also seen shutting each other down due to interference, a situation that will only worsen as the IoT industry grows into the billions of devices.

  • So how can these vulnerabilities be addressed?

An alternative to RF that is likely to go mainstream is Near Field Magnetic Induction (NFMI). NFMI uses modulations of magnetic fields to transfer data wirelessly between two points. Its main strength is its attenuation: the signal decays a thousand times faster than RF signals, which eliminates much of the interference and the security issues attributed to technologies such as Bluetooth.
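The attenuation argument can be illustrated with a toy free-space model (an idealized assumption for this sketch, not a vendor specification): far-field RF amplitude falls off roughly as 1/r with distance, while a near-field magnetic signal falls off roughly as 1/r³, so an NFMI signal becomes negligible, and hence hard to intercept or interfere with, much sooner.

```python
def relative_field(r, exponent):
    """Field strength relative to r = 1, falling as 1/r**exponent."""
    return 1.0 / r ** exponent

# Assumed decay exponents for this illustration:
# far-field RF amplitude ~ 1/r, near-field magnetic induction ~ 1/r**3.
for r in (1, 2, 5, 10):
    rf = relative_field(r, 1)
    nfmi = relative_field(r, 3)
    print(f"r={r:2d}  RF={rf:.4f}  NFMI={nfmi:.6f}  RF/NFMI={rf / nfmi:.0f}x")
```

At ten times the reference distance the RF signal has dropped to a tenth, while the NFMI signal has dropped to a thousandth, which is why an NFMI link effectively stays inside its intended bubble.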

NFMI will prove its worth in a new way in the age of IoT as it marches to mainstream adoption.

  • How will the Operational Support System metamorphose?

As device numbers explode, humans cannot control a billion nodes connected across a wide area, however centralized the remote management may be.

Here, deploying machine learning to build a dynamic, automated network management framework will be key. The industry is coming up with proprietary algorithms that provide real-time distributed system control, along with self-management and self-healing capabilities, for huge long-range IoT networks consisting of billions of smart devices sprawling across millions of square miles. Such systems use trained neural networks and Bayesian methods to optimize the interaction of nodes and IoT gateways on the network.
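Those proprietary algorithms are not public, but the core idea of automated self-monitoring can be sketched with something as simple as a z-score test on device telemetry. This is a minimal illustration with made-up readings, not the industrial systems described above.

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations away from the mean of the series."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, x in enumerate(readings)
            if abs(x - mu) > threshold * sigma]

# Hypothetical temperature telemetry from one cold-chain sensor (deg C).
temps = [4.0, 4.1, 3.9, 4.2, 4.0, 9.5, 4.1, 3.8]
print(flag_anomalies(temps, threshold=2.0))  # [5] -- the 9.5 C spike
```

In a self-healing network, a flagged reading would trigger an automated response, such as re-routing around the node, re-provisioning it, or scheduling maintenance, without a human in the loop.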

In conclusion, the implications of the scale of an increasingly connected world could be scary: whoever masters the management of this explosion of devices will be the winner, and those who cannot could well be buried under their weight. A new service line, the AI-infused Operational Support System, will develop.


Phones are a wonder of interoperability; can “Things” emulate them?

Posted on June 22, 2017 by somjitamrit

Ever wondered why, be it a wired (landline) phone or a wireless (cellular) phone, once a call is made it just works, effortlessly? Then why are we making such a big deal of the seamless connectivity of the “Things” in IoT when it comes to interoperability?

Before we get to interoperability, let us deliberate for a moment on standards. Many a time, these two words are lazily used interchangeably. But there is a difference, and it is a massive one.

Let us deliberate why equating a standard with interoperability is a fallacy of sorts. Reaching for an analogy, a standard is like a language. So we could define English, German, or Mandarin as standards. The standards bodies would then claim that everyone who speaks the same language is interoperable. But the language only defines the grammar and the vocabulary; by itself it does nothing to promote interoperability.

Interoperability is about working together seamlessly.  To achieve that requires more than just a standard.  It needs a set of interoperability tests and the testing tools to confirm compliance with those tests.  These don’t generally come with a standard – they need to be put in place to support it.  That entails time and money, which means most standards can’t support them until they’re already fairly well established.

Now let us come back to our example of the phone. This is a simpler interoperability problem than it might first appear, because phones do not connect directly to each other. Instead, each connects with the network infrastructure, which transfers the data between the two (or more) appropriate handsets. Handsets and base stations have to conform to industry standards. For most phones those are defined by ETSI, the European Telecommunications Standards Institute, which is responsible for the widely used GSM and 3G/4G standards. It is not the only standard, but it accounts for over 8 billion connections throughout the world.

Despite those numbers, there are not that many different mobile phones, and even fewer different base stations for them to connect to, because the industry is controlled by a relatively small number of companies, thanks to consolidation. One of the reasons for the small number of companies is the cost of implementing the standards, plus the cost of testing them. Before a mobile phone is brought to market it needs to pass a stringent set of qualification tests, which can cost up to $1 million. At that point, it becomes legal to sell it. However, before a network operator will sell it to you, they insist on it passing a further set of interoperability tests.

Now let us bring the focus to the “Things” and why Interoperability is a must for IoT to go mainstream.

As the number of things connecting the physical world to the digital world on the Internet starts to grow, the task of testing everything against everything else becomes impossible, as the number of combinations grows combinatorially. Major manufacturers will still perform extensive testing of their flagship products, but in general interoperability starts to take a nosedive. The other thing that happens is that as more and more manufacturers write protocol stacks and profiles, each tends to deviate slightly from the standard because of minor differences in interpretation and implementation. Rather than testing these rigorously against the specification, effort tends to go into ensuring interoperability with what each manufacturer sees as the market-leading product. That results in more patches to make their stack work. If they in turn become successful, other manufacturers will do the same thing against that product, running the risk that de facto implementations start to diverge from the specification.
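The growth is easy to quantify: pairwise interoperability tests alone grow quadratically, as n(n-1)/2, and testing larger groups of devices together grows faster still. A quick sketch:

```python
from math import comb

def pairwise_tests(n_devices):
    """Number of device pairs, each of which would need its own
    interoperability test: C(n, 2) = n * (n - 1) / 2."""
    return comb(n_devices, 2)

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} devices -> {pairwise_tests(n):>12,} pairwise tests")
```

Ten devices need 45 tests; ten thousand devices need nearly fifty million, which is why exhaustive cross-testing stops being feasible long before the IoT reaches billions of things.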

This has resulted in half a dozen standards bodies or consortia, each of which would like to set the standard; their success lies in roping in more manufacturers of the physical devices that will get connected. The more members they have, the better placed they are to make theirs the leading standard to which others have to “interoperate”.

Therefore we find: the Industrial Internet Consortium (IIC), led by GE (a manufacturing behemoth), but without its competitor Honeywell in the consortium.

The Internet of Things Consortium (IOTC), led by Verizon (a global telecom giant), but with its peer AT&T absent.

The Open Internet Consortium (OIC), led by Cisco and Intel (leading network and chip manufacturers), while Juniper and AMD do not make the list.

AllSeen, which started with Qualcomm and Microsoft as principals, has a healthy partnership of appliance/white-goods companies like LG and Sharp, but not Samsung or Electrolux.

Interestingly, we will see consolidation here sooner rather than later, and a good sign is the merger of sorts between OIC and IIC.

Will this blur the line between interoperability and standards? Let us see, but it will not be far into the future!



Why are Smart Homes not as wildly popular as Smart Phones?


Talk to anyone about IoT and the discussion veers toward Smart Homes and how one can turn a staid home into a smart home. But having gone deeper into this in the last six months, I see that many of the hackneyed examples talked about as use cases have not moved beyond an enthusiast’s realm. This has hardly gone mainstream. To go mainstream, four areas have to be mastered as we look at deploying smart devices to make smart homes. These are easy to mention but hard to master.

The areas to be mastered are:

  1. Should be child’s play – Users do not want to be burdened with more than the click of a button. Calling someone like the cable guy to install (costs time, money, and effort), installing apps (they become one too many and bothersome as they take up memory), creating an account (takes time and effort), pairing and syncing devices, and joining a home Wi-Fi network (a challenge given our sometimes congested networks) all add friction. Add to this interoperability (can the fridge work with the voltage stabilizer?).
  2. Should fit into the income, NOT just disposable income – If sensors and actuators have to be embedded in today’s home appliances, costs go up. While the IPv6 protocol will take care of IP addresses, deployment of an IPv6 address per device is still a couple of years down the line. Today, smart refrigerators in countries like the US and UK can add a significant cost, to the tune of $200 or equivalent. The value derived for that cost is suspect.
  3. Should not invite security breaches – Security and its cousin, privacy, have not been perfected. Reports suggest 75% of Bluetooth smart locks can be hacked. With regular reports of IoT devices leaking data, the industry is not instilling much confidence in the consumer. As companies like Amazon and Google try to take control of the home, to be the home hub of sorts, no one tells us how transparent they will be in handling private data. No one likes to be spied on. Least of all at home!
  4. Is it a “Want” or a “Need”? – Value needs to be created for a price to be paid. The price-value equation is important. Some of us may have watched an awkward interview between Mark Zuckerberg and Jerry Seinfeld.


The pair were talking about home automation and Seinfeld said: “Isn’t it funny how we work so hard to just eliminate a little bit of effort?”

The question is: are we solving something that needs to be solved, or are we putting technology first and pushing the business use case to the background?

I am not concluding that IoT is not ripe enough to be put to use, but the business use case needs to justify it. “Is the smart home the place where you place your bets?” is the billion-dollar question.