>> In the last few years, actually,
I've been slowly getting obsessed with
this particular topic of Smart Cities.
It was just accidental that I started into this,
but I think it's turning out to be a very interesting,
I think rich space for
a lot of interesting research problems,
which is what we are interested in, but also,
an opportunity to really shape
how these things will get implemented in India right now.
So, I'll just try to present
a coherent story on what we could do as
Technologists for enabling
really futuristic Smart Cities solutions in India.
So, really one of the main things which we
have been trying to make a case
for is like an open stack,
which will really allow for a rich set
of applications to develop and emerge in the future.
So, to motivate this,
what I'll do is I'll go a little
bit into the detail of this recent RFP,
which has been put out by the City of Agra,
which I'm sure Betel Nolkia would be able to tell you
in much greater detail of
many of the other aspects of this RFP.
We'll just focus on
the technology related aspects of this particular RFP.
I presume that this is
a prototypical RFP for a Smart City,
which is similar to a lot of what other cities
are doing or want to be done.
If you see out here, one of the key elements
out here is this integrated control and command center.
So, clearly they want visibility into
the overall operations of the city.
There are a number of components which can plug
into this command and Control Center.
Transportation,
intelligent traffic management system,
solid waste management,
surveillance, interactions with the citizens of the city,
being able to communicate through
digital signages, and so on.
So, these are all the various components,
which they would like to have
executed in the RFP which they have put out.
So, what we thought was, let us go into this RFP
in a little more detail to understand,
from a technology perspective,
what are the key elements
which will need to be supported.
So, the RFP also says that you need to have
a layered approach and this is common sense.
In some sense, especially with
all the sensors being deployed like cameras etc.,
they advocate having these multiple layers
out here, sensor layer,
communication layer, data layer and application layer.
What I would like to focus on,
as we go forward in this talk
is essentially this layer, the data layer,
which I believe is like
the core fundamental layer which
we really need to make sure
it's designed and constructed well
because it will really enable
all kinds of interesting things to happen on top,
once that is done well.
So, if you look at
this adaptive traffic management system,
what they would really like to do is to be able
to manage the traffic in a reactive, in a smart way.
For example, they want to be able to
control the traffic signals optimally,
so that they can change as required based on the duration,
the time of day, and the traffic conditions.
I guess this is
a big deal in India; we all know how it goes.
So, they want to be able to
automatically create these corridors for VIP vehicles,
as well as emergency, and police vehicles.
They would like to actually have
incident messaging being done in real time.
Also, they are a little
forward-looking in the sense that they're
saying let us collect this data
of all the traffic conditions,
make it available, so that
it can be used for further analysis,
and improvement of the road networks.
So, if we look at the implication on the data layer,
this real-timeness is
an important requirement, which emerges when you want to
be able to react quickly to various things.
Also, of course,
it's not immediately apparent from here,
but if you look at the RFP in detail,
it's apparent that you want to be able to
ingest data from a diversity of sources,
cameras, crowdsource, police database systems, etc.
Similarly, you want to make the data available to
a diversity of applications,
and it's not just this adaptive
traffic management system application,
but maybe other applications.
They have this,
yeah?
>> [inaudible].
>> That is a very good point
and I don't have an answer to that.
I mean, if you don't have enough capacity of the road,
I mean there's not much you can do in terms of that.
But, hopefully, you can at
least do some demand response type thing.
You can stagger how
people travel so that you don't exceed the capacity,
by creating incentive schemes, and so on.
>> You don't study [inaudible].
Because you are not. You are already spending
money on something that is zero result.
>> So, that is one view is that why do all
these when we can't even make a proper footpath.
We're not even able to walk properly and we want to fly.
But, maybe we can
bypass this, like what we did with wireless,
where we went directly to smartphones,
bypassing the intermediate stages.
So, the other thing is that,
I think one of
the things also which is embedded in the RFP turns out,
is that just making the information available
hopefully will allow citizens
to say, "Look, what the heck are you doing?
Are you doing your job?
Why aren't you fixing this?" and so on.
So, hopefully there'll also be some positive
outcomes just from making the data available.
>> [inaudible]
>> Right. Yes. Agra, I
don't think- as it's said everything-
>> Bike.
>> Bicycles, okay. So, you
have to also address the appropriate challenges there.
>> [inaudible]
>> There are a lot of interesting, okay.
So, I'll point out something funny out here.
So, intelligent traffic management under the command and control center.
So, traffic comes again in two different places,
adaptive traffic management as well as part of this,
but I'm just reporting what they set out there.
Here, they really want
the system to support the police.
So, this is from
the police's perspective:
they want the traffic police to
be able to give fines,
do traffic management, monitor suspicious people,
vital places, video analytics.
vital places, video analytics.
They want to be able to detect
no-helmet riding, whether someone is smoking in the car,
and, this is funny, wrong-way driving detection.
So, they want that. I don't know why.
They would actually also like to
detect whether people have long hair or short hair.
It's there in this RFP.
So, this is interesting.
They would like to have video coming from ad-hoc sources,
so, say, you just set up a video network,
and then data should come from there,
or wearables, like a police officer wearing a video camera,
and data should be coming from there, and so on.
Then, of course, they want all of this to
happen over secure communications.
So, what this implies is for the data layer,
you need to support end-to-end security,
and also a diversity of video sources:
mobile, wearable, ad-hoc type sources.
You need to really support a diversity of analytics,
some of which might not be doable right now.
See, the point is, some of these, like the long or short hair detection,
might take time, but they want that,
and in the future,
you'll have more requirements for analytics,
and then of course
strong authorization and authentication.
Solid waste management: they
have this problem in probably most of the cities.
They don't know what is going on,
whether people are actually collecting the waste,
whether it's being disposed of properly, and so on.
So, you just need more visibility by putting
sensors to track vehicles,
monitor waste collection sites,
manage the routes of these trucks, etc.,
look at the handover from
one small truck to the bigger truck,
what is going on, things like that,
and really increase automation.
Again, out here, you have data
ingestion from a diversity of sources,
followed by video analytics.
They also want video analytics.
They want to see what is happening in
the monitoring waste disposal sites,
are things being done properly or not.
They would like a summary of all of this
to be reported to
the Integrated Command and Control Center.
GIS is a big deal for this,
I guess most of the smart city RFPs,
where they would like to use this
for better revenue management, and planning,
and urban development, and
maintenance of public places, and so on.
They would like this to get integrated with
other smart solutions like in surveillance.
So, GIS is an important requirement for the data layer,
being able to provide GIS data
and space data; I'll come to what I
mean by this a little later in the talk.
So, that you can do a correlation analysis of data from
all these different sensors in the context of GIS data.
Air quality, noise, light,
and being able to report that
through messaging displays is something they would like,
and they would also want these to be tamper-proof.
So, this is where it's
interesting: some of the requirements are
coming up in the specific components,
but they are general and cut
across the entire implementation of smart cities.
So, secure onboarding of devices is important,
so that you can trust
the device which is sending the data,
and detect if the trust is broken.
You would like to know how
healthy these devices are and then really,
again, coming back to this exchanging data
between sensors and other applications.
>> [inaudible]?
>> I guess, they want to measure the amount
of noise decibel levels in these places,
just monitor that, I guess.
I guess, if you had these microphones you could also do
some analytics with that in conjunction with the video,
though they're not asking for it,
but you could do
some interesting things with that I guess.
>> [inaudible]?
>> Yeah. I don't know. I'm sure
there are certain positive laws, I don't know.
I don't know actually. See, it's
like the air quality, right?
I mean, there are certain safe air
quality levels to be maintained,
but we're kind of not really maintaining that,
so probably, the same happens with these limits.
>> [inaudible].
>> Taj? Okay. I see.
Yeah. So, Taj Mahal,
so that is driving some of these requirements.
I guess, once you start having
data, see that is the thing,
is that maybe you can kind of use
that to drive some actions,
policy actions after that, around that.
So, for example, this kind
of data would also be useful to correlate
with the health implications
of the people living in those areas.
So, we don't have data but
once we start having this kind of data,
those kinds of studies can also be done, and then we
could inform the city to
do the appropriate things in terms of- yes.
>> [inaudible].
>> For noise?
>> Yes.
>> Noise, I don't know, but I do know
that maybe the same people would be,
because for air quality-related studies,
some of the doctors in some of
these hospitals are interested,
and they're kind of doing that kind of
study along with some scientists at IISc.
So, I guess, noise is kind of related,
so probably, they would also be interested in that.
>> [inaudible].
>> So, Saint John's is something which is-.
>> [inaudible] in New York City [inaudible] there
is city code about what is allowed for noise,
and anything that is above 10dB above [inaudible]
which was informally stated as 55dB is prosecutable.
Then they have very specific prosecutions
for different types of the offense,
and it is in fact treated like a health hazard.
>> Health hazard.
>> It is known to also affect educational outcome.
So, kids who are studying near, say, railway lines [inaudible].
So, there are many agencies which are interested in managing noise.
>> On air quality, for example,
the study which the IISc scientist, Professor Satish,
did along with some of his colleagues from
the hospital was to look at the asthma levels of
children in a school before and after
some change in the traffic routing.
There was a noticeable change in
the asthma levels when of course they changed some
traffic- the paths for some of these heavy vehicles.
Over a six-month period,
the asthma level actually went up
and then it came down
when the old routing was reestablished.
So, clearly, there is a major connection
to health, and even educational outcomes:
too much noise can also
prevent children from focusing, and so on.
So, they want this latency of 15 milliseconds round trip,
which I think might be kind of
aggressive, with certain jitter specs and loss specs and
so on for the communication network and authentication
of devices through the certificates
and being able to look at the health of all these things.
So, I think the message here is that real time is
important; whether anyone can meet
these specs or not is a different issue.
But I think that makes sense: if
there is an ambulance coming
in, you want to detect
it and change the signals and so on.
>> [inaudible].
>> I think it's PricewaterhouseCoopers [inaudible].
Yeah, exactly.
So, it could also be that
certain big corporates also helped come up with these specs,
so that their equipment could meet these standards.
I don't know how this works. But-.
>> [inaudible] I think it just covers one part of
[inaudible] there's no quality of
service, no spec
on how good your inferences have to be or how accurate-.
>> Absolutely. In fact, that is the point I kind of
want to make is that if you look at these RFPs,
in some sense, this is at the level
at which specs are written.
It's already like a 300-page document.
You know, this technical stuff.
Then from there, to derive even more detailed specs,
that's where I want to make
a case that we really need to get down
to those details, to properly
get to the next level of technical
specifications for each of these-.
>> Tasks specification, I mean at the end,
this is not the specific [inaudible] what you get out of it. [inaudible].
>> Yeah.
Yeah. That's why I kind of put this up, so that
it comes out that this is the kind of
spec which is there right now in the RFPs;
this is what they're asking for.
>> [inaudible].
>> Sorry, yes.
>> What is the [inaudible].
>> See, they just specify that the
end-to-end,
round-trip latency needs to be 15 milliseconds.
Now, I presume
they probably had applications in mind like
traffic light signaling control,
creating a green corridor
when a VIP vehicle or an emergency vehicle comes.
So, maybe that is driving this particular spec.
Yeah.
>> [inaudible] 15 milliseconds to exactly what?
>> Fifteen milliseconds from the time the sensor says,
"Look, I'm emergency vehicle. I am here."
to going and changing probably the traffic light
to set the current green corridor up in that local space.
I'm kind of speculating that is what is driving,
one possible example use case for this kind of a spec.
So, from the sensor,
it goes to the communication network.
It has to go through the app.
So, here, they are not even
mentioning anything about the app,
even though there is a complete application stack
through which it goes and comes back.
Everything has to be within 15 milliseconds,
and the jitter is two percent of 7.5 milliseconds,
that is, a jitter of two percent of
the one-way latency, probably
for video or something; I don't know where that is
coming from or what the motivation for it is.
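Just to sanity-check the arithmetic behind these figures, here is a tiny sketch; reading the jitter spec as two percent of the one-way latency is my own interpretation, not something the RFP spells out.

```python
# Back-of-the-envelope check of the RFP's latency figures.
# Assumption: the "two percent" jitter applies to the one-way latency.

round_trip_ms = 15.0            # end-to-end round-trip budget from the RFP
one_way_ms = round_trip_ms / 2  # 7.5 ms each way
jitter_ms = 0.02 * one_way_ms   # two percent of the one-way latency

print(f"one-way budget: {one_way_ms} ms")
print(f"jitter budget: {jitter_ms:.2f} ms")
```

Under that reading, the jitter allowance works out to roughly 0.15 ms, which shows just how aggressive the spec is.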
>> [inaudible].
>> Yes. So this is the- yeah.
The integrated command and
control is like the key piece of the smart cities,
and really, they have an aspiration
to break the silos between departments.
They want a single window visibility
of all that is going on in
the city and they want to have
this integrated operations platform
which is like the core of that,
which will expose data through API,
support distributed deployment of workflows,
Stream Analytics,
correlate and handle multiple data streams, developer support,
vendor-agnostic Software Development Kits and APIs,
security APIs, and they want to have analytics
for loitering detection, abandoned objects, etc.
I'm kind of bringing these things up because this is
how the RFPs are also written, in some sense.
There are these consultants who
come in and write these RFPs.
There are some elements which are
captured which are what the cities want,
plus there are certain other additional things
which they put in there.
>> [inaudible].
>> See, this is what they have
written in one of those places.
So, what we want to actually- we
seized on this and we're saying, "Let's make this.
Let's try to get this thing,
achieve this as an open stack, an open API framework."
Now, this RFP has been
put out and, in the normal course of operations,
what will happen is that there will be
one master system integrator, like Sterlite or
L&T or someone, who will pull
together all the different vendors for surveillance,
waste management, etc., etc., and probably,
each of the vendor will have
their own complete stack from
sensors all the way to the dashboarding.
Then, someone will integrate it into
the- there'll be a single Command and Control
Center where you have all these little snippets
of video running from these different applications.
So, it's really siloed in some sense.
So, that's probably what will happen.
There is really nothing more
specified in terms of
standardizing
the framework for exchanging
data, so that you
can really exchange data across different applications,
devices, and even cities, in some sense.
You want to go to that level.
Also, there is really no support for an economy
based on data, which is kind of
a recurring theme in any of these.
Data is the resource;
they have to figure out how to make money off data.
So, there is really no support for that.
So that is a little bit future thinking we need to do.
There is no mention of privacy.
You know that is one of
the most important things
right now, but there is really
no mention of privacy in these.
So, what we really need is kind of a platform
which standardizes APIs and
the data models so that it helps application developers.
It should be targeted at
application developers instead of
just having complete end-to-end applications.
Then, it should really support
AI type solutions which are emerging and
really unleash the innovation and
entrepreneurship for new applications to develop.
>> [inaudible].
>> These applications?
>> Yes.
>> Yeah. There are a lot of I think
already Pune has implemented something
and some of the cities
which are a little bit early have implemented that.
So you can have vendors,
for example, which will provide
you a solutions for surveillance.
So, there's someone called Videonetics who will provide
you that, complete with the cameras and VMS and all that.
Parking, I guess, has a lot of players, startups and so on,
and also lighting management-.
>> [inaudible].
>> Pardon?
>> [inaudible]
>> Right.
>> [inaudible].
>> Radiolytics, yes.
>> Is there any law in India that protects privacy?
>> I think there is something which is there in
this Data Empowerment and Protection Architecture, DEPA.
I think that is something which has been- Yes.
I think MeitY has adopted it.
So, I think those guidelines are supposed to apply.
That's what someone who was involved with that told me
that that will be legally binding.
>> [inaudible].
>> No. Yes. Because this privacy thing is like
a very recent thing which has kind
of been brought to our attention in some sense.
>> [inaudible] ?
>> I'll just come to that next, yeah.
So, now what we are saying is that,
let us focus on the data layers.
We really need a well defined data exchange layer.
It needs to be an open platform,
with open APIs and data models.
So that, instead of the siloed approach, it gets replaced by,
what I'm showing here, an hourglass model.
At the neck of the hourglass
is the key piece, that is, the,
a little verbosely, city data exchange
and Edge Analytics stack.
I'll tell you why Edge Analytics is important shortly.
So, this data exchange stack,
what it does is, it's an intermediary
between the producers of data,
which is all the sensors,
and the consumers of data, all the applications.
The sensors could be hardware sensors,
but it could also be software sensors.
Other applications could do
further refinement of data
and produce alerts and messages and so on.
Really this kind of an open stack will
enable very complex applications in the future.
Emergency response, where you splice together information from
multiple diverse sources and have
multiple different ways of connecting with citizens;
crowd monitoring; accident detection
as well as prediction, and preventing accidents.
So, that is what we really need going forward.
To do a lot of these things you need to integrate
analytics across video and other sources.
If you see, currently, video is
a completely separate beast by itself.
Video is separate, everything else is separate.
So, we really need to integrate:
as an application developer, you need to be able to consume
a video stream just as easily as a stream
coming from other sensors nearby.
Really, this architecture is inspired by others;
this is what they do elsewhere.
You identify a core kernel,
which enables a whole set
of ecosystems to be built around it.
If we do it right,
we can have portability across vendors for devices,
as well as portability across cities.
An application developed in Bangalore should
work in Surat, et cetera.
So, I think that is what we should
be aiming for as a country,
not have all these cities independently just
doing their own little thing; then,
at least, let's see if we can do
the smart city solutions correctly.
That is our hope. Yeah.
>> Is there [inaudible]
Or will it just be lost after whatever happens?
>> That is a good question. So, nothing
explicit has been mentioned in the RFP.
At least, none that I could notice, but
certain numbers are mentioned.
For example, for the surveillance-related application,
we know that people want to store
the video data
for at least 30 days or something, I believe.
The other sensor data is really
not much I guess compared to video.
Video is really the beast in some sense.
They are saying that you have to archive the data to
do further analysis and so on.
But it's not really been quantified.
I think that is the next level,
where, as I said, you should be able to say,
okay, you need this much
storage to store these kinds of data.
So, for video, it is there,
because certain requirements require you to
store so many days' worth of data, but-.
>> [inaudible]
>> See, I think for video from
a forensic perspective they definitely want the raw data.
In fact, you should not be touching the raw data.
It should be as it is.
But there could be a lot of other cameras which are not
really part of the surveillance
system; you just put them up.
For those, I guess,
it's up to how you implement them.
I mean, do you really care for the raw data coming
in when nothing is happening, like at
a waste monitoring site?
You have a camera; you
just want to know if someone is actually-
if people have shown up to work or not, maybe.
Maybe that's the alert you want,
so the raw data may not be that useful there.
So, I think that is also open in some sense.
>> Actually users forget to store the use [inaudible]
>> So, that is really why the Edge Analytics comes in.
In fact,
I'll just say why we need
Edge Analytics: it is essentially for what you pointed out.
The amount of video data is
just too much to stream to the cloud.
Data has to be locally stored and locally consumed.
>> Localized or local data?
>> Local is what? So, local
could be a ward, maybe a locality;
again, it depends on the kind of
investment a city is able to put in.
I feel that it should be local at
the level of a ward, a locality,
or a campus, or some such thing.
Then that's what I'll kind of
discuss in the next few slides.
So that you don't
really ship these huge amounts of data all over the place;
the data just gets consumed there,
and that's why you need the
Edge Analytics also running there.
It's only the alerts that get transmitted up.
So, what are the attributes
this data exchange layer needs to have?
We need to have very strong security,
handle heterogeneous data across video,
and other IoT data.
These are the key functionality of supporting exchange
of data between various applications and devices.
Be a repository of data models,
because just raw data by itself is not useful.
What is the metadata about the data?
That is something
we insisted on: that every source of
data comes with metadata information so
that you know what it means.
Support low latency; have APIs and SDKs.
The need for local processing is mostly because of video;
if it is not video, Edge Analytics might not be
that useful, other than maybe for latency reasons.
But definitely, video requires,
I think, Fog/Edge Analytics.
Now, for what was missing here,
we would like to also add these three requirements.
There should be strong protections for privacy.
Say,
you should know as a citizen
all the cameras or video feeds you appear in;
I mean, that would be nice to know.
Then you need to have
a marketplace to be able to buy and sell data,
and of course support the AI type applications.
I'll kind of briefly mention what I mean by that.
>> [inaudible] There is no privacy anymore.
You are telling me I have the right information.
>> Correct. Yeah. At least be able to know.
You see, because if we take that forward,
suppose cameras are supported, but
maybe you lock the data
in which you appear or which you own.
That is unlocked only based on- yeah.
>> Sorry. [inaudible]. Is data privacy
supposed to be delivered in index form,
taking into account
the fact that GDPR at one end
is extremely strict and in the US we have control?
Are they supposed to be in [inaudible]
and, on top of that protection,
is there going to be a law?
That's the end of my question.
>> Just additional information.
There are these cases where, in Seattle,
a system was abused by people asking for
so much information that
the city authority struggled to satisfy those requests.
Then there's the issue, I think, of economics,
where if you have access to excessive data,
you are putting a cost on someone to deliver it to you.
So, is there any thinking on how we will manage
access and the sustainability of costs?
>> I think those are all opportunities
for startups to offer
services along those lines. That's what I think.
So for example,
Analytics-as-a-Service, Annotation-as-a-Service,
all these are interesting opportunities
to think of coming up with solutions for those.
That's where these kinds of
data stacks which are very open will
support those kinds of new kinds of services to come up.
So, this is the stack,
basically what we're saying,
and it's not very
different from what any of
these IoT type stacks look like,
except there are a few things out here.
Of course, you have the brokers for
streaming messages and alerts and so on.
There is an archive,
so you are able to access archived versions
of these messages.
Also, there is support for real-time messaging.
The video and audio,
the media brokers and databases, are
also part of the stack, so that you can access these.
Basically, there is a directory of
data resources out here, which is
a catalog that has all the meta-information.
I'll give you an example of how that might look.
There is support for certain contexts, because if you
want to really interpret the data coming in,
you need to know what the context is,
so that you can do your inferences and so on.
A specific example, I'll just talk about the spaces.
Then, what is also very
important in the context of video and Edge
Analytics is that you need to be able to
support the running of analytics,
not just centrally, but
at the Fog and the Edge layer.
Then, there is a framework for authentication,
authorization, provisioning, et cetera, as part of this.
So, this is basically the stack,
and all of this is exposed as APIs,
and each of them runs as services or
microservices, in some sense,
from an implementation perspective.
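To make the producer/consumer decoupling concrete, here is a minimal in-memory sketch of such a data exchange; all names here (DataExchange, register, subscribe, publish, the resource ID) are illustrative, not any real city-stack API.

```python
# Toy sketch of a city data exchange: producers register a data source
# (with its metadata) in a catalog, publish observations to a broker,
# and consumers subscribe by resource ID.
from collections import defaultdict

class DataExchange:
    def __init__(self):
        self.catalog = {}                     # resource ID -> metadata
        self.subscribers = defaultdict(list)  # resource ID -> callbacks

    def register(self, resource_id, metadata):
        """A producer registers its data source along with metadata."""
        self.catalog[resource_id] = metadata

    def subscribe(self, resource_id, callback):
        """A consumer subscribes to a registered resource."""
        if resource_id not in self.catalog:
            raise KeyError(f"unknown resource: {resource_id}")
        self.subscribers[resource_id].append(callback)

    def publish(self, resource_id, observation):
        """The producer pushes an observation; all subscribers get it."""
        for cb in self.subscribers[resource_id]:
            cb(observation)

# Usage: a street light publishes lux readings; an app consumes them.
exchange = DataExchange()
exchange.register("ward12/streetlight/42", {"type": "streetlight"})
readings = []
exchange.subscribe("ward12/streetlight/42", readings.append)
exchange.publish("ward12/streetlight/42", {"ambient_lux": 310})
```

A real deployment would of course add persistent brokers, authentication, and schema validation; the shape of the API is the point here.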
So, if you look at it from a deployment perspective,
you have the stack at the center,
and currently, what will
probably happen is that you already
might have applications running.
Waste management, say, is an application
someone has developed, et cetera.
So, with adapters, you put data into the stack, and
the Command and Control Center will pull data off
the stack, in some sense,
and also communicate through this.
So, this is probably how things will work out in
the immediate future, with legacy or ongoing applications.
But maybe, going forward, you can now decouple
the data-producing portion
from the data-consuming portion.
The data production is
done by one bunch of companies,
the consumption is done by some other bunch of companies,
and all of them work through
the Data Exchange Stack, in some sense.
So, what you can see from
the Command and Control Center is
not just the data but also
the health of the devices and
health of all the things producing the data.
That is essentially through the management systems: the VMS,
Element Management System, Network Management System.
They also report the health of
all these different things
to the Command and Control Center.
Okay. That was kind
of the software perspective.
When you look at the hardware perspective,
what you will have is
a collection of gateways, or
IoT gateways, which are deployed out there publicly,
possibly on street poles,
because those are something which are
all over the city and have access to power.
So you have these street poles where
each street pole has this IoT gateway with a bunch of
locally wired connected sensors
and actuators and perhaps also ingesting
data from nearby sensors wirelessly.
All these street poles are connected through
a WAN to the local Micro Data Center.
So this could be at the level of a campus
or a ward or locality.
This Micro Data Center actually has the WAN terminator,
network switches and then compute
and storage services for it.
So, this becomes what is called the Fog node;
some people call it the fog,
they have different terminologies.
But this is the Micro Data Center sitting
there serving a locality out here.
So we expect this
to be the hardware realization
of the smart city solutions.
Anyway, if you do it that way, you
can do it in a modular fashion;
each locality gets lit up, in some sense, or IoT-fied.
Then, you have all these Micro Data Centers,
which are then managed,
or made visible in a unified way,
through the Integrated Command and Control Center.
So, this way, you slowly light up different portions
of the city on a step-by-step basis.
The software itself, you
would, of course, want to
make run in a distributed fashion.
So, what I'm showing here is that the core,
the API gateways and
the authentication, et cetera, is in
some sense unified and centralized.
But the stack itself the brokers
et cetera are run in a distributed fashion
each in the local Micro Data Center.
So that way the whole thing
is actually distributed, okay.
So, there are a few concepts which
we want to bring out here which
are I think interesting
and maybe some of them are new in some sense.
So, one is, of course, a Registry of Data Sources.
Now, there are multiple approaches or efforts,
OCF being one of them,
to try to come up with a schema,
a standard framework for capturing the metadata.
So, we looked at what OCF did,
and we are proposing something along those lines.
Any IoT source or
any data source has a schema
which has four pieces of information.
There's some static information,
there is a schema for
the observation data which comes in, a schema for
the control data which goes to the device, and
then a schema for the configuration data.
As an example out here, one of the elements
is the output of a lux sensor,
which is of type number and is
measuring the LED output intensity, in units of lux.
What we're also saying is that there is
an access modifier to
control the privacy setting out here.
So you have private data,
you have protected data, you have public data.
Private data is only accessible to the owner of the data,
whereas protected is data
which is available to someone who is authorized,
and public data is available to everyone.
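As a small illustration of how those three access modifiers might be enforced, here is a sketch; the function and the requester model are my own, not part of the proposal.

```python
def can_read(access, requester, owner, authorized):
    """Return True if `requester` may read a field with this modifier."""
    if access == "public":
        return True                       # open to everyone
    if access == "protected":
        return requester == owner or requester in authorized
    if access == "private":
        return requester == owner         # only the data owner
    raise ValueError(f"unknown access modifier: {access}")

print(can_read("public", "anyone", "city", set()))        # True
print(can_read("private", "vendor", "city", {"vendor"}))  # False
```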
Now, with this kind of schema, the idea is that
any data source, when it gets registered
in the system, also has to put in a schema
which tells you about the format of the data coming in.
Then, the system can validate the pieces
of actual data which come in against the schema;
that is what is shown out here on the right side.
So, as an example, if you take a street light,
it goes through a schema,
a proposed schema which we
have been experimenting with in our reference implementation,
which gives you some meta information;
this is the static portion.
Where is the street light located? Who owns it?
How do you access the data from the street light,
that is, what is the subscription endpoint?
What is the API endpoint for getting archival data?
Then, for the data from the street light
itself, what are the different pieces of data
which come in, essentially, whenever
there is a sampling instant?
That is the temperature of
the casing, the power consumption,
the ambient lux, et cetera,
and then some meta information describing each of those.
Then there is the control information,
that is, how do you go and change
the brightness level, for example, of the street light.
So, the actual data from the street light will of course
be JSON-type documents,
key-value pairs which adhere to
that particular schema out there.
So, the first thing is that
every piece of data has to have
this data model, which is uploaded.
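To make the registration-then-validation flow concrete, here is a minimal sketch. The four-section layout follows the talk (static, observation, control, configuration), but the field names, units, and the hand-rolled type check are hypothetical; a real implementation would likely use a proper schema language.

```python
# Sketch of a registered street-light schema with the four sections
# mentioned in the talk. Field names and units are hypothetical.

STREETLIGHT_SCHEMA = {
    "static": {"location": str, "owner": str},
    "observation": {"casing_temp_C": (int, float),
                    "power_W": (int, float),
                    "ambient_lux": (int, float)},
    "control": {"brightness_pct": (int, float)},
    "configuration": {"sampling_interval_s": (int, float)},
}

def validate(section, document, schema=STREETLIGHT_SCHEMA):
    """Check a JSON-like document against one section of the registered schema."""
    for field, typ in schema[section].items():
        if field not in document:
            return False, f"missing field: {field}"
        if not isinstance(document[field], typ):
            return False, f"bad type for {field}"
    return True, "ok"
```

Each incoming key-value document is checked against the schema its source registered, so malformed data can be rejected at ingest.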
Second is video itself, right?
So now with video,
we would like to also treat video as an IoT element.
What does it really mean for video to be an IoT element?
Well, we need to have some metadata about the video,
and metadata on the semantic content of the video
not just the frame rate and resolution and so on.
But what is inside the video?
Just like we had a description that this is
a lux sensor and the units are lux,
is there some similar information we can put,
maybe about the semantic content
or some associated information?
I'll give you a simple example.
So, for example, if I look out here, someone could do
analytics on this video and give out
this metadata stream
which is associated with the video stream.
It tells me, are there
any autos in this video sequence?
How many diesel vehicles are there? Et cetera.
The thing is these kinds of analytics then
help the consumer to
work off the analytics data and not worry
about the raw data anymore or the encoded video stream.
Now you're working on a derived
information as you go forward.
This will really enable complex applications to be
built where you build on this kind of analytics.
It'll also allow analytics service providers to develop
very specific analytics and offer
them as analytics streams.
So, the formatting is a little messed up here.
So for example, a simple use case:
let us say you have a mobile video source like a drone,
and the drone is giving out
the encoded video stream, but you'd
like to know the Geo-location
as a time series.
With that extra meta information,
the Geo-location, and maybe the pose of the drone,
you can do much more
interesting inferences with the video stream.
So, it is just a simple schema to describe that,
very similar to the schema of the street light.
Here I'm just looking at the data portion
of the schema, which is telling me
that this is the timestamp of the metadata.
You need to be able to refer into the video stream:
what portion of the video stream is this piece of
metadata actually matching to?
A start and end pointer into the video?
Then, what is the actual data coming in?
It is just the latitude and longitude
in this particular example, saying
that for this portion of
the video, this is the Geo-location.
So you have now a stream of,
let's take this picture again; it tells you you have
the video stream which is flowing out here.
You have this metadata corresponding to
the video stream, which is flowing on the side of it.
So, in this Meta-Data Stream,
each metadata packet
actually refers to some portion of the video and
then contains some information about that video stream.
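A metadata packet of this kind can be sketched as follows. The field names (timestamp, frame span, latitude/longitude) are hypothetical, chosen only to mirror the drone example above; the start/end pointers here are frame numbers, though a real stream might use timestamps or byte offsets.

```python
# Sketch of a metadata stream running alongside a video stream.
# Each packet points into a span of the video and carries derived
# information such as a geo-location. Field names are hypothetical.

packets = [
    {"ts": 1000, "start_frame": 0,   "end_frame": 249, "lat": 12.97, "lon": 77.59},
    {"ts": 1010, "start_frame": 250, "end_frame": 499, "lat": 12.98, "lon": 77.60},
]

def metadata_for_frame(frame, stream):
    """Return the metadata packet whose video span covers the given frame."""
    for p in stream:
        if p["start_frame"] <= frame <= p["end_frame"]:
            return p
    return None
```

A consumer can then answer "where was the camera during this portion of the video?" without ever decoding the video itself.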
So, this is one thing: you have
this Meta-Data stream, which is also in the catalog.
So, any consumer will know what
derived information is available,
so you can build some new applications on top of it.
Secondly, if you want to subscribe to
the video stream itself, you need a way,
just like you would subscribe to
an IoT Data Stream, to
be able to subscribe to Video Streams.
So you need to have streaming servers,
be able to access archived streams, et cetera,
and the corresponding metadata which comes in out here.
So, one of the things
is that a lot of the video, especially from
the surveillance cameras, has only
local spatio-temporal relevance.
No one else outside really cares about it.
So, one problem
we are thinking about is how you create
this localized, efficient video distribution system
for the local region.
So, this I think is where you have load balancing,
caching at these brokers, et cetera, done.
For example, you're stuck in a traffic junction,
or you're stuck in the road and you want to know
what is happening ahead. You're just curious.
With your city app, you're able to subscribe to
a video camera which is just ahead and you'll see,
okay this is why there is a problem there.
You're just curious to know.
You want to be able to support this kind of
local consumption of video
from a local source.
Also, you want to be able to adapt
the QoS of the local network.
For example, if I have
some incident happening in a local region,
you probably want to allocate
more bandwidth to the cameras around that region.
Maybe you want to track a crowd as
it's going past, right?
So, adaptive QOS for those things.
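One very simple way to think about this adaptive-QoS idea is a weighted split of a fixed uplink budget. The camera IDs, regions, budget, and boost factor below are hypothetical; this is only a sketch of the policy, not a network implementation.

```python
# Sketch of adaptive QoS: when an incident is reported in a region,
# boost the bandwidth share of cameras in that region and scale the
# rest down to fit a fixed uplink budget. All numbers are hypothetical.

def allocate_bandwidth(cameras, incident_region, budget_mbps, boost=3.0):
    """Weight cameras in the incident region `boost`x, then split the budget."""
    weights = {cid: (boost if region == incident_region else 1.0)
               for cid, region in cameras.items()}
    total = sum(weights.values())
    return {cid: budget_mbps * w / total for cid, w in weights.items()}
```

A crowd-tracking variant would simply move the boosted region along with the crowd, re-running the allocation as it goes.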
So these are some of the,
I would say, interesting R&D problems.
I'm not saying this will be part of
the stack which we're doing but these are
some interesting problems which
are worth looking at as we go forward.
The thing with the video is that you want
to get and consume the video,
and analyze the video, locally.
Right now, of course, you do it at
the Fog layer because those are Standard Compute Servers,
but I guess with technologies which you guys
are developing at MSR, you can also
push that to the Edges out here on the IoT gateways.
Now the thing is that this collective
of the Micro Data Center
and this IoT gateway in some sense forms a cloud.
I call it the street cloud.
So, the street cloud
has these heterogeneous compute resources:
a little compute in the IoT gateways and
more compute in the compute servers.
So, basically you want
to locally terminate the video
both for privacy and bandwidth,
and you want to run this analytics
out here in the street cloud.
So, how do you do that?
What is the programming model
to allow you to develop applications like this,
in a general way and what kind
of runtime is required to support that?
So, this is something again I think is
an interesting systems problem
to look at as you go forward.
As an example, we have been looking at how you
could take advantage of this and do
Distributed Video Analytics,
where you have
a camera, you have
some compute on the pole right next to the camera,
and then you have heavier-duty
compute out in the Micro Data Center.
So here what we have explored is how you can do
the tracking locally and
the object detection out in the Micro Data Center.
So, this is some work we did last year,
where you have multiple cameras
and then you have
an object detector running in the Micro Data Center.
The D-frames are the frames on which you
do the object detection in the data center,
whereas the T-frames are processed
locally next to the camera to do the tracking.
So, the detector, of course, is
a much heavier-duty detector running one of
these neural nets, and
locally you do the tracking, et cetera.
So, there is some benefit to doing this,
which we reported last year.
But I think the more interesting question is how
you write this, because
it's always a moving target:
the amount of compute you have is
improving every year, and even in the Edge Node
you can do more and more compute.
So, you can't hard-partition your application, because
what you can't do today, maybe
two years from now you can do
easily in the Edge compute.
So, how do we write
these applications and break it so that these pieces are
able to move around easily
and collectively they do the job you want to?
So that, I think, is also a very interesting problem.
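The edge/data-center split described above can be sketched as a simple frame-labelling policy with one tunable knob. The function names, the D/T labelling, and the capacity heuristic are my hypothetical illustration; the point is that the partition is a parameter, not a hard-coded split, so it can shift as edge hardware improves.

```python
# Sketch of the detection/tracking split: run the heavy detector on
# every Nth frame (at the Micro Data Center) and cheap tracking on
# the frames in between (on the pole). Numbers are hypothetical.

def plan_frames(n_frames, detect_every):
    """Label each frame D (detect at the MDC) or T (track at the edge)."""
    return ["D" if i % detect_every == 0 else "T" for i in range(n_frames)]

def choose_interval(edge_fps_capacity, stream_fps, max_interval=30):
    """If the edge can track at full rate, detections can be spaced out more."""
    if edge_fps_capacity >= stream_fps:
        return max_interval
    # Weaker edge: detect more often so tracking drift is corrected sooner.
    return max(1, int(max_interval * edge_fps_capacity / stream_fps))
```

As edge compute grows year on year, `choose_interval` simply returns a larger interval, and more of the pipeline migrates toward the pole without rewriting the application.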
So, there is this
third concept which we want to
articulate out here: the concept of space as an asset,
Space Regions. This is something which
Intel, who are also among our collaborators,
actually brought to our attention.
So, this particular conference room
is the space region which will be there in the database.
Because if you have sensors out
there for example the smoke detectors,
you should be able to pin it to
this particular space region
which means this space region needs to have an entity,
a presence in your database
with a specific ID so that you can tie to that.
It can't just be the Geo-location,
because Geo-location really does not work well indoors.
So that is a concept where
you have spaces articulated as assets.
The thing is that, for example,
if you take this space,
this is the map just outside, around
this vicinity, where you have your Microsoft building,
you have ITC Gardenia, et cetera.
So I've just marked out three space regions.
There is SR1, which is this piece of road,
SR2 is your Microsoft building,
and SR3 is another building out here.
These regions have to be captured somehow.
Of course, in the database you will have
SR1 as a region which has a unique ID,
it's described by the street segment
owned by the city of Bangalore,
administered by BBMP et cetera.
Similarly, you have associated
information for every region,
but more importantly you have
a structure which connects these space regions together.
Saying that, SR1 is adjacent to SR2, SR3, et cetera,
and SR1 is part of this bigger Space Region called road,
RajaRamMohanRoad and so on.
Now, why is this interesting?
Because you can use
this structure to build your application.
For example, suppose there is an incident
which happens in this SR1
which is the road segment just in front of you.
Who are the people who need to be informed about that?
Now, I want to build a general application
without customizing it for this particular location.
So, what you would do
is look at the network
in this graph structure,
traverse it, and figure out
all the affected people who should
get notified about that accident.
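The traversal just described can be sketched as a breadth-first search over the space-region graph. The adjacency, subscriber lists, and hop limit below are hypothetical stand-ins for what the database would actually hold.

```python
# Sketch of the space-region graph: SR1 (a road segment) is adjacent
# to SR2 and SR3. To notify people about an incident in SR1, traverse
# adjacency up to `hops` and collect every region's subscribers.
# Region names and subscribers are hypothetical.

from collections import deque

ADJACENT = {"SR1": ["SR2", "SR3"], "SR2": ["SR1"], "SR3": ["SR1"]}
SUBSCRIBERS = {"SR1": ["traffic-police"],
               "SR2": ["building-mgr-2"],
               "SR3": ["building-mgr-3"]}

def notify_list(incident_region, hops=1):
    """BFS over the adjacency graph, collecting subscribers within `hops`."""
    seen, out = {incident_region}, set(SUBSCRIBERS.get(incident_region, []))
    frontier = deque([(incident_region, 0)])
    while frontier:
        region, d = frontier.popleft()
        if d == hops:
            continue
        for nbr in ADJACENT.get(region, []):
            if nbr not in seen:
                seen.add(nbr)
                out.update(SUBSCRIBERS.get(nbr, []))
                frontier.append((nbr, d + 1))
    return sorted(out)
```

Because the application only walks the graph, the same code works for any incident in any region; the city-specific knowledge lives entirely in the database.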
So, this is the context:
what I meant by saying supporting AI services
is that you capture the context.
This context is most likely in the form of a graph.
It need not just be the context of
the space regions; it could also be
context which is related
to organizational structures.
So there are many things out here which are useful,
and they should probably be captured.
So, I think that the challenge here is to populate
the database with such kind of information in
a maybe automated way perhaps a semi-automated way.
So that you have a rich database which you can now
start using to do more and more
interesting automated AI-type applications.
But at a minimum, at
least what we are saying is, let us
start with the space regions and make
those available as a database.
These links one should be able
to create automatically by analyzing
the shape files and so on.
Finally, the security, privacy, and manageability.
Regarding privacy, I will just briefly mention that,
there is work which is going on in India Stack
as part of privacy, supporting privacy.
They have something called a Consent
layer which they're proposing.
So, one can look at that and see how we can adopt it.
But the idea is that,
there is a clear ownership of data,
who owns data, and the data can only
be used with the permission of the owner of the data.
So that is basically what we want
to be able to support out there.
Also, it will be good to be able to track.
If I share data with you,
you don't share it with someone
else without my knowledge.
So you need to also have a mechanism for doing that.
So I think all of those can be done now with
modern encryption and public-key (PK) technologies.
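One piece of this, the tamper-evident tracking of who accessed shared data, can be sketched as a hash chain over access records. This is only the chaining idea, with hypothetical record fields; a real system would add digital signatures on top of this, and this is not a description of India Stack's actual consent layer.

```python
# Sketch of a tamper-evident audit trail: each data-access record is
# chained to the previous one by a hash, so deleting or editing an
# entry breaks the chain. Record fields are hypothetical.

import hashlib
import json

def append_record(log, record):
    """Append a record whose hash also covers the previous entry's hash."""
    prev = log[-1]["hash"] if log else "genesis"
    payload = json.dumps(record, sort_keys=True) + prev
    log.append({"record": record,
                "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return log

def verify(log):
    """Recompute the chain and report whether it is intact."""
    prev = "genesis"
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True) + prev
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```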
So, what we have done is
these are all concepts at various levels of maturity.
Some of it is actually work in progress,
and we have actually been working on
implementing some of these. Yes, go ahead.
>> [inaudible]
>> Yeah. So, basically every device
in the system needs to,
"in some sense," have something similar
to our biometric identification.
Every device needs to have an identification which
is somehow tied to some root of trust of some kind.
I think some of these things
the enterprise level people
have already solutions for that.
The question is, are they? Like, you have these TPMs,
you have these TrustZones, et cetera.
>> [inaudible] from the device. The mechanism [inaudible] ?
>> The access to the data is,
you have to get access to the data.
So, there's the owner of the data.
In the case, it could be the city.
So, if you want to access those streams,
you need to have a relationship
with the city to get access to the data.
Now, if you do some bad things with
the data, that could happen,
but there is some very clear,
shall we say, audit trail.
Once you have taken the data, it's recorded somewhere.
>> I provide some [inaudible]
>> Right.
Yeah. I guess, yeah, it's possible.
I don't know. I mean, security is a big topic.
There could be ways of hacking around the system.
But the thing is that at least
the point is there is an end-to-end encryption
with also strong identification
of devices and humans, right?
So, there's identity management system in place,
and the data is only
given to authorized people essentially.
The owner of the data authorizes the use of the data.
So, that is the kind of framework we can put in.
>> Sometimes [inaudible] security and privacy.
Do you see that those [inaudible].
>> See, right now,
it is at a very high level. If you remember,
they will just say, use certificates and so on;
that is the level.
So, the next level of
detail is what it should be in more precise terms.
>> This is the precise, I mean,
this is an implementation detail, right?
Why do I care whether it's
using one certificate or another certificate?
I want to know what my end-to-end guarantees are,
and then work backwards and
figure out what cryptographic protocols are needed.
>> Right.
>> [inaudible] has started themes.
>> You're right. The certificates
are one way of implementing security. I agree with you.
So, what we are actually doing is that,
at a high level, you want encrypted communication.
That is a high level requirement.
But how do you achieve
that encrypted communication, right?
So, what we want to actually propose now, essentially,
what we are doing is, I'll just come to that,
but there is the consortium we are
forming, where we're coming up
with these more detailed specifications.
Some of it, on the security side, is at the level
of implementation, or at least the flavor of the implementation:
use this, and so on. So that's what it is, yes.
So, there's a lot of POC work
which we are doing with some
of these devices we're making.
This is a concept smart pole which we have made,
which has this smart street light
on the pole,
and there is a gateway out here,
which has, of course, a Raspberry Pi on it.
Then, we have a network of these poles,
and this is just an example application
where very toy application,
where you could use video data,
have an antiques runoff the video data,
and then do something to manage your street lights.
This is where handling heterogeneous
things through the same platform
allows you to do that kind of stuff.
So, what we have here is a pole with a camera pointing.
>> How much power will be consumed if you had
a Raspberry PI [inaudible] every device?
>> I don't think the Raspberry Pi
consumes that much power, because
the street lights themselves are
about 30 watts to 90 watts.
The Raspberry Pi will be a few watts.
So, it's not going to be much.
The cameras, though,
can consume quite a bit of power
compared to that, because cameras are
pretty much in a nice box with the optics, and so on.
>> So, it's a very [inaudible] yeah.
It's the Chicago Array of Things.
They were trying to deploy a significant set of sensors,
very strategically packaged, around the city.
The concept was to put thousands
of these, tens of thousands of these.
They were funded for that.
Five years later, it turns out,
there are about 150 nodes,
and the deployment cost in Chicago,
which is a different situation, is
about three to $5,000 per pole.
So, what's been the experience doing
all of this, power source [inaudible]?
>> I think there's not much experience, frankly speaking.
So, in fact, this is probably related: one cannot just go ahead;
you have to see if it is even
worth putting this compute on every pole, everywhere.
I think some of that we have to learn as we go forward.
So, we are not there.
Yeah, I agree with you, yeah.
In fact, we have had some discussions
with the Array of Things people,
and so on, and we wanted to use that,
but then we realized what they have right now
is too expensive for us.
So, yeah. So anyway,
you've probably seen this particular video,
so I'll just move on.
All this is showing is just a people detector
which turns on the street lights,
so it's only a simple thing.
We are actually setting up
this test bed in Electronic City.
They have been very willing to work with us,
so there is a fiber loop out here:
this is Infosys Avenue, Wipro Avenue,
and this other road,
and the stars are
the junctions where there are some cameras
which have been set up, and all this data
is going into a Micro Data Center
which is set up in their office.
So, like that, there are two other rings.
So, there are these three rings
which have been set up now,
and all the data is coming in,
and this reference stack
which we are developing can be put out there,
and we can develop
the stack as we go along, with
actual data coming in from these sources.
Of course, they are interested in traffic management,
so we wanted to see if we can help them with that.
Hopefully, with this kind of activity,
we can have a solid stack,
which is tested out and made available to people.
>> So, portal data is collected here?
>> Basically, right now,
you see in Electronic City,
they have these cameras out here,
then they have a solid waste management system,
and they're trying hard to set up
a smart street light management system,
and parking is something which they're concerned with.
Parking management is something they want to do.
>> Does that mean infrared sensors out there?
>> Right now, they're doing infrared sensors;
these are just standard cameras,
but we'll put one or two thermal imagers also,
as an experiment in
this test bed, to see what we can do with this, yeah.
This is the network:
there is an optical fiber loop
which is based on GPON,
Gigabit Passive Optical Network.
Each of the poles in
Electronic City has connectivity to this GPON.
There is a LoRa network also, to pick up data
from some of the sensors, LoRa-based sensors.
So, they are also very interested in water management.
So, smart water management is another use case for them.
So, we are also looking at how to create an API interface to
the network itself, so that some of
the things I mentioned, about adaptive QoS
and so on, can be experimented with as part of this.
So, the application can go and change the network settings.
Of course, going forward,
these kinds of interfaces in the street
become interesting to support autonomous vehicles,
so connected autonomous vehicles.
These are cases where you can have these poles
with beacons and embedded sensors, radars, et cetera.
This again is where Edge Compute becomes very interesting,
for low-latency control and everything.
This is a different application:
it's not video analytics from a privacy or
bandwidth-reduction perspective; it is more from a latency perspective.
Again, this is I think the open problem challenge of
how do you do this reliable real-time control,
especially for mobile things as things are moving around.
This is also in sync with the 5G activity.
So, this kind of a setup is also
planning to use to create a 5G test bed for that.
So, I've been talking about different things,
some of it more research,
and some of it more mundane,
but the thing is that for the city stack, the data stack,
the way we are going about it now is we are forming
an academia-industry consortium
with companies like Intel,
Dell, VMware, Tejas, Bosch, and Cisco.
These are all interested to be part of this consortium.
So, now we're working together to develop
the more detailed specification
of this particular data layer.
I've also left room for others out here.
So hopefully, they'll be interested,
so that Microsoft can also participate in this,
and other companies, as we go forward.
The idea is that this stack,
the data layer stack which we are seeing,
is articulated,
and a reference implementation is created and made available.
It should be open reference implementation,
completely open-source technology.
Hopefully, that gets tested out in Electronic City.
So, it's a solid stack,
and hopefully, that can become
the kernel for all smart city implementations
as we go forward in India.
So, that is the aspiration.
So, just to summarize,
that data layer is what we are
focusing on as part of the smart city solutions,
because that's the key foundational layer,
and it needs to be open with SDKs,
APIs, and so on,
and the reference implementation
will be open-source and made available,
and it should be looking at the future,
enabling future applications in AI,
and distributed computing, and things like that.
It can't be done by one entity,
and really needs a consortium.
A lot of these problems, probably,
people in the US have also figured out,
but I think with companies,
this is where academia comes in:
it can be a neutral entity which can help get
this kind of activity going.
This would probably be something unique in India,
and this happens routinely in the West,
but this will be something interesting
if you can do it in India.
So, with that, I would like to
thank the really good team
we have in the Robert Bosch Centre at IISc,
and also the members of the India Stack Consortium,
which is evolving, and I hope
that some of you can also participate in that.
So, thank you for your attention.