by Alok Batra


 

There are three types of things we do in IoT. They sound similar, but are actually quite different from each other: making smart things, making things smart, and being smart about things. If we run a fleet of trucks and want more intelligence and efficiency, which direction should we choose?

  1. Make smart things. Creating new products from scratch that are smart. For example, the Nest thermostat or the family robot Jibo. Here we recognize that smartness comes from the combination of hardware and software, not just the latter.
  2. Make things smart. Taking existing things – machines, devices, or even non-electronic stuff – and adding sensors, software, or sometimes controllers to make them smart or smarter. For example, smart buildings that regulate climate controls based on people's activities. Here we know not all smartness is pre-planned and pre-loaded, nor can we replace all our assets overnight with new products.
  3. Be smart about things. Gathering as much data about things as possible. Studying and understanding their conditions and behavior patterns, and even anticipating what's going to happen in the future. For example, we can collect data and become smart about a jet engine, and predict when it is going to have faults.


From the perspective of enabling technology, the first category is smart products, the second is smart systems, and the third is data and analytics.

You probably already have the answer for a fleet of trucks: likely not #1 – we will leave that to Tesla or Ford (to come in 2020). Between #2 and #3, it is important to make a distinction. A lot of effort has been put into making us smart about the truck, and not enough into making the truck smart.

To make the difference clearer, replace "truck" with "person". We have a set of criteria to judge whether someone is smart, which is very different from us being smart about that person (e.g., knowing what he is doing at any moment). Let me borrow three common-sense "intelligence criteria" for people and explain three things we should do differently about trucks.

Be alert and oriented.

This means a person is aware of and responsive to the present environment. Conversely, if someone knows everything going on inside his body but has no clue about his current context – he may lack the five senses and be cut off from the external world – he cannot be smart. When we monitor trucks with only Onboard Diagnostics (OBD) data, we are missing the truck's contextual information. Is the driver with the truck? Is the container open? What is the current weather condition? Has a crash just happened? To make the truck smart, we want it to be aware of and well-oriented to such context. Only then can it respond quickly to situations such as a crash, theft, or driver issues.

OBD may be sufficient for us to be smart about trucks. Many applications collect and store OBD data. Analyzing the data can generate good intelligence, such as when the next engine problem might happen. But this is "statistical intelligence" that resides with the data scientist and is based on the past, not contextual intelligence that resides with the truck and is based on the present.

What to do differently? Take advantage of the falling cost of sensors. Make the truck aware of its context by adding environmental sensors such as cameras, weather sensors, beacons, and light sensors.
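As a concrete sketch, the truck's context could be represented as a small structure that the onboard software checks alongside OBD data. This is a hypothetical Python illustration – the field names, thresholds, and alert strings are assumptions made for clarity, not any product's actual API:

```python
from dataclasses import dataclass

@dataclass
class TruckContext:
    """Contextual readings that OBD alone does not provide (hypothetical fields)."""
    driver_present: bool
    container_open: bool
    weather: str        # e.g. "clear", "rain", "ice"
    impact_g: float     # peak accelerometer reading, in g

def assess_context(ctx: TruckContext) -> list[str]:
    """Flag situations visible only through contextual sensors, not OBD."""
    alerts = []
    if ctx.impact_g > 4.0:                          # illustrative crash threshold
        alerts.append("possible crash")
    if ctx.container_open and not ctx.driver_present:
        alerts.append("possible theft")
    if ctx.weather == "ice":
        alerts.append("hazardous road conditions")
    return alerts
```

For instance, an open container with no driver present yields a "possible theft" alert – a situation that no engine diagnostic code could ever reveal.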

Think independently.

There is clearly a difference between taking the data away and extending the intelligence out. If these were our kids, the former would be to (remotely) monitor their every move and watch for every single aberration – being smart about them; the latter would be to teach them good judgment and let them think independently – making them smart.

All fleet management applications today push data from the truck to the cloud, rather than pushing intelligence to the truck. Data processed next to the truck versus data processed in the cloud – is there a difference? The answer is yes if you want to make the truck smart in "real time". Can it detect possible theft and raise an alert? Can it know it is being delayed and adjust the container temperature? To achieve this, pushing all data to the cloud is infeasible:

  1. It will be too expensive to rely on every single bit of data being transmitted to the cloud in real time.
  2. The network will have disruptions, dead zones and delays.
  3. Loss of context. The sequence of events and the identity of the truck are lost when data is sent over a network and aggregated in the cloud. They would have to be carefully tracked and reconstructed to make sense of the context.

What to do differently? In addition to OBD, we need OBC – Onboard Computing – edge processing units that push intelligence to each truck. With IoT, distributed computing has become cheaper in cost and lower in size, weight, and power. Atomiton, for example, has a Neuron computer for each truck.
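A minimal sketch of what such an Onboard Computing unit might do: evaluate a rule locally and transmit only the interpreted event, never the stream of raw readings. All names and thresholds below are illustrative assumptions, not Atomiton's Neuron API:

```python
class OnboardComputer:
    """Toy edge processor: decides locally, sends only interpreted events."""

    def __init__(self, temp_setpoint_c: float):
        self.temp_setpoint_c = temp_setpoint_c
        self.outbox: list[str] = []   # messages actually sent to the cloud

    def on_reading(self, container_temp_c: float, eta_delay_min: float) -> None:
        # Local decision: a long delay means refrigeration must compensate now,
        # without waiting for a round trip through the cloud.
        if eta_delay_min > 60 and container_temp_c > self.temp_setpoint_c:
            self.adjust_cooling(self.temp_setpoint_c - 1.0)
            # One interpreted event replaces many raw readings.
            self.outbox.append("delayed; cooling adjusted")

    def adjust_cooling(self, new_setpoint_c: float) -> None:
        self.temp_setpoint_c = new_setpoint_c
```

The design point is that the actuation loop (sense delay, adjust cooling) closes at the truck; the cloud only hears about the outcome, which sidesteps the cost, disruption, and context-loss problems above.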

Communicate intelligently.

We humans understand the world through three sequential activities: 1. sense; 2. perceive/interpret; and 3. think. Sensing is the sensory capture of information, perceiving is the immediate and intuitive interpretation of what is happening, and thinking is the cognitive analysis involving conscious effort. There is a "short circuit" in which someone goes directly from sensing to communicating, before perceiving or thinking occurs. When that happens we don't consider that person to be very smart (at least not for the moment). Consider the following communication from someone (pretend you are receiving text messages from him):

  • It’s 10:05:10 AM, itchiness on left forearm, severity level 5; seeing bushes around
  • It’s 10:05:20 AM, itchiness on left forearm, severity level 6; seeing bushes around
  • It’s 10:05:30 AM, itchiness on left forearm, severity level 6; seeing red spot on left forearm, 1cm in diameter
  • It’s 10:05:40 AM, itchiness on left forearm, severity level 4; scratch action happens
  • It’s 10:09:30 AM, itchiness on left forearm, severity level 8; seeing red spot on left forearm, 1.5 cm in diameter

Don’t be surprised if you receive similar kinds of communication from a monitored truck, or other monitored assets today. We are attempting to be smart about these assets by gathering as much data as possible. We have not made the assets themselves smart.

If this poor guy had taken the step of perceiving, he would have communicated more intelligently: "I'm walking in bushes and might have been bitten by a bug".

A truck made smart would not communicate raw data to the rest of the world – the operators, the service managers, the fleet management system. We have seen enough truck telematics data accumulate in databases without meaningful action, because it is very difficult to act on discrete parameters. Instead, the truck's Onboard Computing unit should perform the first level of interpretation of what is happening, and communicate intelligently: "possible crash just happened", or "my driver has left me for 2 hours". Only after perceiving happens can thinking and decisions take place.

What to do differently? Create software meta models to correlate and interpret events and make sense of what’s happening. Don’t just report discrete parameters. Humans get smarter in the same way. We build mental models of bug bites, which integrate the various sensory inputs into one coherent situation.

In sum, a few things to make trucks smart: use sensors to gather contextual information; feed them into models to recognize the situations; and maintain such distributed intelligence with each truck. 
