
by Alok Batra


 

Last week, after I quoted Jane, my co-founder at Atomiton, saying “head in the cloud, feet on the ground” about IoT (see my previous post, “IoT is missing the ‘O’”), she told me another story.

She is Chinese, and she said that in her culture there was originally no concept of the lobster, only the concept of shrimp. So when the Chinese first saw lobster, they considered it just another type of shrimp. Hence, she explained, in the Chinese language a “Lobster” is called a “Dragon Shrimp” (龙虾), meaning a Very Big Shrimp.

Yes… I started to wonder out loud – aren’t the “Things” in IoT a Very Big Source of Data? As a software engineer, I have always built applications in three layers: data, logic, and presentation. When IoT happened and I saw physical “Things” coming at my applications, they looked to me like just another source of data. Is that why the popular concept of Big Data (versus its original definition) has spread so quickly along with IoT?

When paradigms shift and new things emerge, we often grasp them through the lens of our old framework. This is why we as humans learn fast. But such short-cut learning can become an impediment to greater realizations, because we only see what we recognize and leave out what does not fit into the current framework. The Internet of Things will fundamentally change the architecture of our software applications. But if we stay within the existing framework, we will only see quantitative changes, not qualitative shifts at a tectonic scale.

I call the new applications of the IoT era “Thing-based applications”, in contrast to the traditional “Data-based applications”.

Data-based applications do one or both of two things:

  1. Help people organize or process information (examples are ERP solutions from Oracle and dashboard applications from Pentaho).
  2. Help people organize themselves using information (examples are CRM applications such as Salesforce.com and collaboration software such as Webex). In this category, people convert their activities into data by updating the software (CRM), or interact with information in the virtual world (Webex).

Thing-based applications do one or both of two things:

  1. Help people organize and interact with Things (for instance, an application that lets a city operator manage street light intensity).
  2. Help Things organize themselves (for example, an application that lets wind turbines adjust themselves to optimize the wind farm).

Many data-based applications contain data about things or data from things, but they are not thing-based applications. An ERP system is a data-based application containing data about things. A health monitoring application connected to personal health devices is an application containing data from things (devices).

Thing-based applications are drastically different in three aspects:

1. Physical Impact: do software “actions” create physical-world impact by themselves? In data-based applications, a command creates new information in the logical world without changing any physics outside the computing machine. Humans bridge the gap by taking actions according to the information. In thing-based applications, some commands change the state or behavior of physical things. Such impact could be mechanical, chemical, or biological in nature. In fact, most applications embedded in functional machines are thing-based applications, albeit limited in their scope. Think about the software that runs your printer.

2. Physical Stickiness (feet on the ground): what is the significance, and what are the means, of staying in sync with the states of physical reality? A CRM application could have an out-of-date customer address. An electronic medical record may have a wrong patient diagnosis. In data-based applications, when data is inconsistent with reality, it causes anything from inconvenience to serious errors in people’s work, but the application itself is unaware.

In thing-based applications, being out of sync with reality means the software itself fails to function. If your visiting cousin has manually locked your garage door and you try to open it from the garage-door app on your iPhone, the app may mistakenly think the door is now open while in fact it remains closed. This renders the app dysfunctional unless it has a parallel mechanism to verify physical state, such as sensors. In contrast, as a Java developer of data-based applications, I assume the state of an object changes accordingly when an “operation” is carried out on it. I don’t worry about “double-checking” because everything I deal with lives in the logical world. In thing-based applications, states cannot be taken for granted.
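
To make the difference concrete, here is a minimal Java sketch of the two habits. Every name in it (GarageDoor, DoorState, sendOpenCommand, readPositionSensor) is made up for illustration and stubbed out; the point is only that a thing-based application verifies its state against a sensor instead of trusting its own model.

```java
// A minimal sketch (not a real API): the data-based habit of trusting the model
// versus the thing-based habit of verifying state against the physical world.
public class GarageDoor {

    public enum DoorState { OPEN, CLOSED, UNKNOWN }

    private DoorState assumedState = DoorState.CLOSED;

    // Data-based habit: send the command and trust the logical model.
    public void openNaively() {
        sendOpenCommand();
        assumedState = DoorState.OPEN;            // may now disagree with reality
    }

    // Thing-based habit: send the command, then verify against a sensor.
    public DoorState openAndVerify() {
        sendOpenCommand();
        DoorState actual = readPositionSensor();  // ground truth from the physical world
        if (actual != DoorState.OPEN) {
            // e.g. a manual lock kept the door closed despite the command
            System.out.printf("Out of sync: expected OPEN, sensor reports %s%n", actual);
        }
        assumedState = actual;
        return actual;
    }

    // Hypothetical device I/O, stubbed out for the sketch.
    private void sendOpenCommand()         { /* talk to the door controller */ }
    private DoorState readPositionSensor() { return DoorState.UNKNOWN; /* read the sensor */ }
}
```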

Many data-based applications rely on humans to acquire and input up-to-date information. Thing-based applications cannot afford this. In Atomiton’s work in smart cities, we run into parking facility operators who maintain occupancy counts using cameras, by cumulatively counting the cars entering and exiting. But once the count drifts out of sync due to occasional errors, there is no way to recover the truth. Short of implementing parallel sensors, the only remedy seems to be sending a person to “recalibrate” the lot occupancy every morning – physically.

3. Physical Abstraction: how are Things abstracted? The history of computer science can be seen as the perfection of the art of abstraction. We all do abstraction in Java programming. To abstract is to hide something – the complexity that we do not want everyone to deal with every time. But if you write applications in Java, most likely you are occupied with logical abstraction, not physical abstraction. The latter is the hiding of complex lower-level functions that deal with the hardware or the operating system. Hardware primitives such as handling states or I/O are not what you worry about when you create or store a model. This is because you are standing on decades of achievements in computer science:

Figure 1. A hierarchy of abstract machines

Ironically, throughout the history of computing, we have only thoroughly abstracted one device: the computing machine. Now, with IoT, we are setting out to abstract the whole world – billions of devices and sensors. The concerns of the previous two sections – actions and states – should be exactly the subjects of IoT abstraction.

In a properly abstracted IoT application world, most developers only need to worry about opening or closing the garage door, not maintaining state consistency with the physical world. That task is taken care of by the few developers who have abstracted the physical-world interactions into logical services. In a non-abstracted IoT world, what goes from the bottom (things) to the top (applications) is only data, not interactions. Instead of abstraction, it becomes “omission”.
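
As a rough illustration of that division of labor, in Java, with names I am inventing for the purpose (GarageDoorService, SensorBackedGarageDoorService): the interface is what most developers program against, while the implementation behind it is where the few developers doing the physical abstraction live.

```java
// A rough sketch of the division of labor. All names are invented for illustration.

// What most application developers see: a logical service for the Thing.
public interface GarageDoorService {
    void open();        // express the intent; physical follow-through is the service's job
    void close();
    boolean isOpen();   // answered from verified physical state, not a cached flag
}

// What a few developers write: the physical abstraction behind the interface.
class SensorBackedGarageDoorService implements GarageDoorService {

    @Override
    public void open() {
        driveMotor(true);
        if (!positionSensorReportsOpen()) {
            // Reconcile with reality here (retry, wait, alert an operator).
            // The application above never sees this step.
        }
    }

    @Override
    public void close() {
        driveMotor(false);
    }

    @Override
    public boolean isOpen() {
        return positionSensorReportsOpen();   // ground truth, read on demand
    }

    // Hypothetical device I/O, stubbed out for the sketch.
    private void driveMotor(boolean open)       { /* command the actuator */ }
    private boolean positionSensorReportsOpen() { return true; /* read the sensor */ }
}
```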

So far, the abstraction efforts for the world of Things have been limited to individual devices (e.g. Philips Hue), not systems. While some may prefer to look at our conventional computing machine as the future logical home of all physical Things, I look at the world of Things as the physical home of all the computing we will do. In this sense, we still have a lot of physical abstraction to do. One day, when you can write a software command asking a wind farm to distribute its workload by local wind conditions, without having to know the mechanics of the wind turbines, proper physical abstraction will have been done for the world of Things.
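
As a thought experiment, that command might look something like the sketch below. WindFarm, balanceLoadByLocalWind, and FarmOperatorApp are all hypothetical names, not a real product API; what matters is that the application asks the system for an outcome and never touches turbine mechanics.

```java
// A thought experiment: a system-level abstraction of a wind farm, where the
// application expresses outcomes and the abstraction layer decides how
// individual turbines respond. No real product API is implied.
public interface WindFarm {
    // One logical command: redistribute generation according to local wind conditions.
    void balanceLoadByLocalWind(double targetMegawatts);

    // A verified, sensor-backed reading of the whole system's output.
    double currentOutputMegawatts();
}

class FarmOperatorApp {
    // The application developer's entire involvement with the physical system:
    static void rebalance(WindFarm farm) {
        farm.balanceLoadByLocalWind(42.0);   // no turbine mechanics in sight
        System.out.printf("Farm now producing %.1f MW%n", farm.currentOutputMegawatts());
    }
}
```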

We are in the Lobster moment of software applications. We can be content with a Very Big Shrimp, or we can call it what it is – a Lobster – and do something different about Thing-based applications.

 


 

 
