Should Telecom companies be “shovel providers” or “gold miners” in the gold rush of IoT?

Let us do a Google search on the “Top 10 IoT platforms”:

http://internetofthingswiki.com/top-10-iot-platforms/634/.

The Top 10 are:

Amazon Web Services, Microsoft Azure IoT, ThingWorx from PTC, Watson IoT from IBM, Jasper from Cisco, Salesforce.com, Carriots, Oracle, Predix from GE, and KAA.

In my last blog, “If Data is the Hero, is Device the Super Hero?”, I elucidated the importance of the poor cousin, the Device, which at times matters even more than Data to the effectiveness of an IoT program.

As we talk about devices and data, the device “creates” the data and the data has to be “communicated”. The carrier of that data is a Communication/Telecom Service Provider (TSP).

If creation and communication are the two key elements in the successful implementation and deployment of an IoT project, why is it that none of the Telecom companies are in the top bracket when it comes to building and promoting an IoT platform?

This is intriguing, to say the least, as all would agree.

In this list, we have start-ups, equipment manufacturers, software companies, storage companies, business process management companies, and large manufacturing companies. Telecom companies are conspicuously absent from this august list!

This is not to say that Verizon with its “ThingSpace” and AT&T with M2X are not making headway, but more needs to be done to occupy center stage.

My premise here is –

“Are not Telecom companies, with their inherent capabilities as service providers built over decades, natively best suited to lead this rather than be bit players?”

Let us do some quick fact-finding on why this is so. A few factors stand out –

A) Telecom companies can provide the following services:

  1. Act as a connectivity provider, offering communication as a service
  2. Help manage devices and their integration
  3. Distribute content as a service
  4. Provide maintenance and support through monitoring and control
  5. Help deploy development environments

 

The following diagram is courtesy of https://www.ericsson.com/en/internet-of-things/telecom-service-provider

Drawing on the Gold Rush analogy: if the scale and speed of growth in IoT can be compared to a gold rush, then while the gold miners are trying to find gold, someone else is providing the shovels. Today the telecom companies seem to be doing the latter.

They have to shake up their approach from being a connectivity or infrastructure provider to someone providing information value through service enablement and service creation.

B) IoT infrastructure demands are different – the demand for services is from machines/devices and not humans.

While we could use the terms connectivity and communication with a broad brush, we need to understand how communication is handled today by Telecom companies and how different the communication needs of the IoT ecosystem are.

While the IoT ecosystem requires low bandwidth over a widely dispersed area for a massive number of field devices, potentially several multiples of current capacity, the service needs that Telecom companies currently address are for high bandwidth in high-density regions, with correspondingly higher power requirements.

So, with the increase in the volume of connected devices, service providers need to factor in that IoT devices may communicate very differently from smartphones and computers, which are mainly operated by humans.

Some IoT devices exchange relatively small amounts of data and connect to and disconnect from the network very infrequently. Examples are smart meters (e.g. gas or electricity) reporting their latest values to a centralized repository. In contrast, a connected car may exchange diagnostics information with a central hub while also offering mobile broadband services for in-car entertainment, thereby exchanging a lot of data over the mobile connection for a longer period of time.

This difference in ‘IoT endpoint’ behavior places very different demands on both the network as well as the data center responsible for processing and hosting this information. For example, a 4G network is very suitable for the connected car use case, but may not be the best choice for the smart metering scenario.

Smart metering only requires a low bandwidth channel that can be accessed with minimal power consumption.
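
To make the contrast concrete, the sketch below (in Python, assuming the paho-mqtt package and a hypothetical broker address and topic) shows the smart-meter pattern: wake up, publish one tiny reading, disconnect. It illustrates the traffic profile only, not any vendor's actual API.

```python
# Minimal sketch of a low-bandwidth "smart meter" endpoint: wake up,
# publish one small reading, then go back to sleep.
# Assumptions: the paho-mqtt package is installed; the broker address
# and topic below are hypothetical placeholders.
import json
import time

import paho.mqtt.publish as publish

BROKER = "broker.example.com"    # hypothetical broker
TOPIC = "meters/gas/device-001"  # hypothetical topic scheme

def send_reading(cubic_metres: float) -> None:
    # The whole payload is a few dozen bytes -- a very different profile
    # from a connected car streaming in-car entertainment.
    payload = json.dumps({"ts": int(time.time()), "m3": cubic_metres})
    publish.single(TOPIC, payload, hostname=BROKER, port=1883, qos=1)

if __name__ == "__main__":
    send_reading(1234.56)
```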

C) Need for Short Range and Long Range Communication in IoT is best addressed by Telecom Service Providers.

Telecom service providers (TSPs) are currently rolling out low-power wide-area networks (LP-WAN) such as LoRa or Sigfox, which will work alongside traditional 3G/4G networks and which cater to those IoT applications that require very low bandwidth and low power consumption, so that the battery of an IoT device can last several years.
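
The multi-year battery claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses entirely assumed figures (battery capacity, sleep and transmit currents, one uplink per hour) rather than the specification of any actual LoRa or Sigfox device.

```python
# Rough, illustrative battery-life estimate for a low-power LP-WAN node.
# Every figure here is an assumption for the sake of the arithmetic,
# not a specification of any particular LoRa or Sigfox device.
BATTERY_MAH = 2400.0      # e.g. two AA-class lithium cells
SLEEP_CURRENT_MA = 0.010  # ~10 microamps while sleeping
TX_CURRENT_MA = 40.0      # draw during a transmit burst
TX_SECONDS = 2.0          # length of each burst
TX_PER_HOUR = 1           # one small uplink per hour

avg_ma = SLEEP_CURRENT_MA + TX_CURRENT_MA * (TX_SECONDS * TX_PER_HOUR) / 3600.0
hours = BATTERY_MAH / avg_ma
print(f"Average draw ~{avg_ma * 1000:.1f} uA, battery life ~{hours / 8760:.1f} years")
```

Even with generous margins for self-discharge and retries, the same arithmetic shows why a radio that stays connected and draws tens of milliamps, as a smartphone does, could never approach that lifespan.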

On the data center side, adopting cloud technologies is critical. The ability to quickly spin up a virtual environment delivering both the network functionalities and the IoT platform functionalities specific to each IoT use case is crucial. Indeed, due to the wide variety of IoT use cases, there is no one-size-fits-all approach.

As the promised land of the Internet of Things approaches, TSPs are best positioned to become the facilitators and engine room of this super-connected world.

Connecting IoT devices is one thing; securing them and the applications they connect to is another. TSPs have become much more security-aware in recent years as cyber and DDoS attacks have impacted other areas of their business.

Given that TSPs,

  1. Can handle short-range and long-range communication through cloud-based infrastructure, and
  2. Have better knowledge of handling security and privacy, given the years of experience behind them,

it would not be a surprise if TSPs took the lead here in building comprehensive IoT platforms and aspired to be market leaders rather than fast followers and bit players.


If Data is the Hero, is Device the Super Hero?

 

To monetize any engagement or project, data is a critical element. Data needs to be cleaned, aggregated, ingested, modeled, and analyzed to provide actionable business insights. A project addressing a defined business use case is meaningful only if it provides such insights. IoT projects are multi-dimensional: when it comes to data and its flow, they cover layers like Device-Device, Device-Server, and Server-Server.

A typical IoT project is a complex system of networks, platforms, interfaces, protocols, devices, and data. IoT devices range across sensors, actuators, gateways, and embedded hardware/software within products and assets.

The number and type of IoT devices, as well as the associated use cases for apps and services, will grow exponentially within leading industry verticals. One of the critical success factors for IoT operations, which could well be termed the Operational Support System (OSS) for IoT, is IoT Device Management.

If devices are not managed well, the output data becomes suspect and the project could well fail. The need for IoT device management is paramount; it can be a matter of concern, or an opportunity to differentiate if addressed with care.

Fundamentally, IoT Device Management covers areas like:

  1. Provisioning and Authentication
  2. Configuration and Control
  3. Software Updates and Maintenance
  4. Monitoring and Diagnostics

I would not slot Security here, as it is a separate subject in its own right.

Of the four areas listed, 1, 2, and 3 can be handled with a degree of control as we initiate the project. The last, Monitoring and Diagnostics, is the most critical, as the level of control that needs to be brought in is the most complex, given the diversity and scale of devices, the corresponding protocols, and the associated challenges of interoperability. Then there is the need to reproduce problems and take corrective or predictive actions.
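
As a flavour of what the monitoring piece involves at its simplest, here is a minimal sketch in Python of a heartbeat check that flags devices that have gone quiet. The device IDs, threshold, and in-memory store are hypothetical; a real Monitoring and Diagnostics capability layers diagnostics, alerting, and remote actions on top of signals like this.

```python
# Minimal sketch of a heartbeat-based health check for field devices.
# Device IDs, the staleness threshold and the in-memory store are
# illustrative assumptions, not any particular platform's API.
import time
from typing import Dict, List, Optional

STALE_AFTER_SECONDS = 15 * 60  # assume devices should report every 15 minutes

# last-seen timestamps keyed by device ID (a database in practice)
last_seen: Dict[str, float] = {}

def record_heartbeat(device_id: str) -> None:
    """Called whenever a device checks in or sends data."""
    last_seen[device_id] = time.time()

def stale_devices(now: Optional[float] = None) -> List[str]:
    """Return devices that have not reported within the threshold."""
    now = time.time() if now is None else now
    return [dev for dev, seen in last_seen.items()
            if now - seen > STALE_AFTER_SECONDS]

if __name__ == "__main__":
    record_heartbeat("truck-042-temp-sensor")   # hypothetical device IDs
    record_heartbeat("truck-017-door-sensor")
    last_seen["truck-017-door-sensor"] -= 3600  # simulate a device gone quiet
    print("Needs attention:", stale_devices())
```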

Data may have the flamboyance of a hero in the IoT story, but the real workhorses are the devices that work at the edge of the IoT system – the “Things” in the Internet of Things. The devices spew out the data, and hence a healthy device provides honest data.

Devices out in the field are either generating and transmitting data to a centralized platform (one-way movement) or performing automated tasks that generate data. A mundane job, perhaps, yet the overall performance of a system often hinges on the health of field devices.

Imagine running a critical operation: managing a fleet of trucks in a cold chain shipping perishable goods. If a device, sensor, embedded agent, or gateway begins to falter, and, more importantly, has not been monitored well enough for corrective action to be taken, the consequences could be dire and the contractual impact disastrous.

The challenge of maintaining devices may sound basic compared with aggregating and analysing data, but it’s essential to a successful IoT strategy.

  • So what is the factor that makes devices vulnerable, and hence the output data potentially unreliable?

The key factor here is RF technology.

Most IoT devices rely on radio frequency (RF) technology such as Bluetooth, ZigBee and Wi-Fi for communications. Otherwise known as far-field transmission, RF is great when communicating over long distances, but becomes problematic when applied to short-range, isolated IoT ecosystems, like the wireless personal area network.

Research reports suggest that link and network security become increasingly difficult as the number of RF devices increases. The relentless requirement for decreasing power consumption in devices translates into less room for handshake and encryption protocols. These issues are clearly reflected in Bluetooth’s increasingly poor reliability and security record.

RF-based devices are also seen shutting each other down due to interference, a situation that will only grow worse as the IoT industry grows by the billions.

  • So how would these vulnerabilities be addressed?

An alternative to RF that is likely to go mainstream is Near Field Magnetic Induction (NFMI). NFMI uses the modulation of magnetic fields to transfer data wirelessly between two points. Its main strength is its attenuation: the signal decays a thousand times faster than RF signals, which eliminates much of the interference and many of the security issues attributed to technologies such as Bluetooth.

NFMI will prove its worth in a new way in the age of IoT as it marches to mainstream adoption.

  • How will the Operational Support System metamorphose?

As devices explode in number, humans cannot control a billion nodes connected over a wide area, however centralized the remote management may be.

Here, the deployment of machine learning to build a dynamic, automated network-management framework will be key. The industry is coming up with proprietary algorithms that provide real-time distributed system control, self-management, and self-healing capabilities for huge long-range IoT networks consisting of billions of smart devices sprawling across millions of square miles. Such systems use trained neural networks and Bayesian methods to optimize the interaction of nodes and IoT gateways on the network.
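
Those proprietary algorithms are not public, but to give a flavour of automated, learning-based monitoring, here is a deliberately crude sketch (Python, standard library only) that learns a device's typical reporting interval and flags large deviations. It is a stand-in illustration, not the neural-network or Bayesian machinery referred to above.

```python
# Crude sketch of learning-based monitoring: learn a device's typical
# reporting interval, then flag intervals that drift far from that baseline.
# A stand-in illustration only -- real systems use far richer models.
from statistics import mean, stdev
from typing import List

def is_anomalous(history: List[float], latest: float, z_threshold: float = 3.0) -> bool:
    """Flag the latest reporting interval if it is a z_threshold-sigma outlier."""
    if len(history) < 10:       # not enough data to learn a baseline yet
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

if __name__ == "__main__":
    # hypothetical device that usually reports roughly every 60 seconds
    typical = [58.0, 61.5, 60.2, 59.8, 62.1, 60.0, 59.5, 61.0, 60.7, 58.9]
    print(is_anomalous(typical, 61.0))   # False: within normal variation
    print(is_anomalous(typical, 600.0))  # True: the device has gone quiet
```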

In conclusion, the implications of scale in an increasingly connected world could be scary: whoever masters the management of this explosion of devices will be the winner, and those who cannot could well be buried under their weight. A new service line, the AI-infused Operational Support System, will develop.

 

Phones are a wonder of Interoperability; can “Things” emulate them?

Posted on June 22, 2017 by somjitamrit

Ever wondered why, be it a wired (landline) phone or a wireless (cellular) phone, once a call is made it just works, effortlessly? Then why are we making such a big deal of the seamless connectivity of the “Things” in IoT when it comes to interoperability?

Before we go to interoperability, let us deliberate for a moment on standards. Many a time, these two words are lazily used interchangeably. But there is a difference, and it is a massive one.

Let us deliberate on why equating a standard with interoperability is a fallacy of sorts. Reaching for an analogy, a standard is like a language. So we could define English, German, or Mandarin as standards. The standards bodies would then claim that everyone who speaks the same language is interoperable. The language defines the grammar and the vocabulary but does nothing to promote interoperability.

Interoperability is about working together seamlessly.  To achieve that requires more than just a standard.  It needs a set of interoperability tests and the testing tools to confirm compliance with those tests.  These don’t generally come with a standard – they need to be put in place to support it.  That entails time and money, which means most standards can’t support them until they’re already fairly well established.

Now let us come back to our example of the phone. This is a simpler interoperability problem than it might first appear, because phones do not connect directly to each other. Instead, each connects to the network infrastructure, which transfers the data between the two (or more) appropriate handsets. Handsets and base stations have to conform to industry standards. For most phones those are defined by ETSI – the European Telecommunications Standards Institute, which is responsible for the widely used GSM and 3G/4G standards. It is not the only standards body, but its standards account for over 8 billion connections throughout the world.

Despite those numbers, there are not that many different mobile phones, and even fewer different base stations that they connect to, because the industry is controlled by a relatively small number of companies thanks to consolidation. One of the reasons for the small number of companies is the cost of implementing the standards, plus the cost of testing them. Before a mobile phone is brought to market it needs to pass a stringent set of qualification tests, which can cost anything up to $1 million. At that point, it becomes legal to sell it. However, before a network operator will sell it to you, they insist on it passing a further set of interoperability tests.

Now let us bring the focus to the “Things” and why Interoperability is a must for IoT to go mainstream.

As the number of things getting connected from the physical world to the digital world on the Internet starts to grow, the task of testing everything against everything else becomes impossible, as it grows combinatorially. Major manufacturers will still perform extensive testing of their flagship products, but in general, interoperability starts to take a nosedive. The other thing that happens is that as more and more manufacturers start to write protocol stacks and profiles, each tends to deviate slightly from the standard, because of minor differences in interpretation and implementation. Rather than testing these rigorously against the specification, effort tends to be put into ensuring interoperability with what each manufacturer sees as the market-leading product. That results in more patches to make their stack work. If in turn they become successful, other manufacturers will do the same thing against that product, running the risk that de facto implementations start to diverge from the specification.
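
To put a rough number on that combinatorial growth, the quick calculation below counts the pairwise interoperability tests that full cross-testing would demand as the number of distinct device models grows (the device counts are arbitrary illustrations).

```python
# Back-of-the-envelope: pairwise interoperability tests needed if every
# device model were tested against every other one (n choose 2).
from math import comb

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} device models -> {comb(n, 2):>12,} pairwise tests")
```

At ten thousand device models that is already roughly fifty million test pairs, before counting firmware versions or configuration variants, which is why exhaustive cross-testing is abandoned in practice.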

This has resulted in half a dozen standards bodies or consortia that would like to set the standards; their success lies in roping in more manufacturers of the physical devices that will get connected. The more members they have, the more they can make theirs a leading standard with which others have to “interoperate”.

Therefore we find: the Industrial Internet Consortium (IIC), led by GE (a manufacturing behemoth), but without its competitor Honeywell in the consortium.

The Internet of Things Consortium (IOTC), led by Verizon (a global telecom giant), but its peer AT&T is absent.

The Open Interconnect Consortium (OIC), led by Cisco and Intel (leading network and chip manufacturers), but Juniper and AMD do not make the list.

AllSeen, which started with Qualcomm and Microsoft as principals, has a healthy roster of appliance/white-goods companies like LG and Sharp, but not Samsung or Electrolux.

Interestingly, we will see consolidation here sooner rather than later, and a good sign is the merger of sorts between OIC and IIC.

Will this blur the line between interoperability and standards? Let us see, but it will not be far into the future!


Why are Smart Homes not as wildly popular as Smart Phones?

https://www.linkedin.com/pulse/smart-homes-wildly-popular-phones-somjit-amrit

Talk to anyone about IoT and the discussion veers towards Smart Homes and how one can turn a staid home into a smart home. But having gone deeper into this over the last six months, I see that many of the hackneyed examples talked about as use cases have not moved beyond an enthusiast’s realm. This has hardly gone mainstream. To go mainstream, four areas have to be mastered as we look at deploying smart devices to make smart homes. These are easy to mention but hard to master.

The areas to be mastered are:

  1. Should be child’s play – Users do not want to be burdened with more than the click of a button. Yet today there is calling someone like the cable guy to install (costs time, money, and effort), installing apps (one too many, and bothersome as they take up memory), creating an account (takes time and effort), pairing and syncing devices, and joining a home Wi-Fi network (a challenge given that our networks are sometimes congested). Add to this interoperability (can the fridge work with the voltage stabilizer?).
  2. Should fit into the income and NOT the disposable income – If sensors and actuators have to be embedded in today’s home appliances, costs go up. While the IPv6 protocol will take care of IP addresses, deployment of an IPv6 address per device is still a couple of years down the line. Today, smart refrigerators in countries like the US and UK can add a significant cost, to the tune of $200 or equivalent. The value derived for that cost could be suspect.
  3. Should not invite security breaches – Security and its cousin, privacy, have not been perfected. Reports suggest 75% of Bluetooth smart locks can be hacked. With regular reports of IoT devices leaking data, the industry is not instilling much confidence in the consumer. As companies like Amazon and Google try to take control of the home and be the home hub of sorts, no one tells us how transparent they will be in handling private data. No one likes to be spied on, least of all at home!
  4. Is it a “Want” or a “Need”? – Value needs to be created for a price to be paid. The price-value equation is important. Some of us may have watched an awkward interview between Mark Zuckerberg and Jerry Seinfeld

(http://www.popsci.com/joins-mark-zuckerbergs-facebook-live-stream.)

The pair were talking about home automation and Seinfeld said: “Isn’t it funny how we work so hard to just eliminate a little bit of effort?”

The question is: are we solving something that needs to be solved, or are we putting technology first and pushing the business use case into the background?

I am not concluding that IoT is not ripe enough to be put to use, but the business use case needs to justify it. “Is the smart home the place where you place your bets?” is the billion-dollar question.