Wednesday, August 1, 2018

Youtube daily report Aug 1 2018

JUST IN Trump Rocks DC WITH MASSIVE ANNOUNCEMENT

A 94-year-old WWII vet asks President Trump to grant his greatest wish.

President Trump will do anything to ensure our heroes are respected.

Former President Obama, on the other hand, couldn't be bothered to address the staggering

abuse occurring in the Department of Veterans Affairs.

The Obama administration's negligence and incompetence caused the VA to become a cesspool

of corruption, and many of our veterans died while waiting for the healthcare they rightfully

earned.

But Trump promised things would be different when he was running for President.

Trump's campaign focused on making sure this poor treatment of our veterans never

happened again.

"Our debt to you is eternal," Trump said.

"Yet our politicians have totally failed you."

Veterans ultimately voted 2-1 for Trump over Hillary, and he has rewarded their trust in

him by enacting sweeping reforms to improve and modernize veterans care.

The president recently put his unwavering commitment to our service members on full

display; by honoring one of our nation's heroes in person.

From Western Journal:

"A 94-year-old World War II veteran got the thrill of a lifetime when he was invited

on stage by President Donald Trump at the Veterans of Foreign Wars national convention

in Kansas City, Missouri, on Tuesday.

After giving special recognition to Sergeant Alan Q. Jones of Pennsylvania from the stage,

Trump asked, 'Should we bring him up?'

The crowd responded enthusiastically.

'This is one of the highlights of this 94-year-old man,' Jones said, after the president turned

over the microphone to the long-time VFW member."

Jones asked if he could say a few things while he had the microphone, and the president replied,

"I've got time."

Jones then spoke about his four brothers who served alongside him in World War II, and shared

how his eldest brother died in combat in Italy.

In a tearjerking moment, Jones described how he continues to grieve for his brother, telling

the crowd that:

"Many times I just wished he could have come back to the land of the free and the

home of the brave again."

This statement clearly moved both the president and the crowd, but Jones quickly made the

exchange more lighthearted by asking for a favor.

"I have been told that I could never enter the Oval Office in Washington, D.C.

I am going to be 95 years of age April 11 of next year.

Hopefully, you will allow me to bring my family into the Oval Office to meet you."

The words were barely out of Jones' mouth before the president answered:

"Yes.

Any time you want!"

Jones then asked Trump to sign a photo of the two of them taken during the 2016 campaign,

and the President happily and immediately complied.

This heartwarming moment shows that President Trump understands that the White House is

the People's House, and that he has the highest regard for our veterans.

America is truly great again, thanks

to one man!

Wouldn't you agree?

For more information >> JUST IN Trump Rocks DC WITH MASSIVE ANNOUNCEMENT - Duration: 13:38.

-------------------------------------------

Giorgos Angelopoulos: This is the one and only photo he posted from his nephew's christening - Duration: 1:57.

A special weekend for the Angelopoulos family, as on Saturday afternoon Giorgos Angelopoulos's brother, Haris, christened his son.

See also: Giorgos Angelopoulos: In Skiathos for his nephew's christening. Naturally, Giorgos Angelopoulos could not be absent from the religious ceremony in Skiathos.

The star of the series "To Tatouaz", out of respect for the tragic hours our country is going through, chose to keep the news of the christening out of the public eye.

For that reason, he chose to make the related post a few days later.

Posting a photo with his nephew, Danos wrote: "Life always gives us reasons to keep going.

Words cannot contain the love I have for this little one, my nephew! I wish that our Panagiotis, our Eleni, and all the children of the world are always blessed ❤️ 🙏🏻 Thank you all very much for your wishes."

For more information >> Giorgos Angelopoulos: This is the one and only photo he posted from his nephew's christening - Duration: 1:57.

-------------------------------------------

Here's What I Think About the Toyota Land Cruiser and More - Duration: 3:21.

Rev up your engines!

Dao says, what do you think about Land Cruisers, ok Land Cruisers are well made, Toyota makes

them, now the Land Rovers are rolling piles of junk, the English stuff, but the Land Cruisers

are really excellently made vehicles and can last a long time,

but realize, they are gas hogs, those things get terrible gas mileage because

they're huge and they weigh a lot, I mean the newer ones are better than the

older ones, I had customers with the old ones when they were big straight 6 cylinders with

giant pistons and standard transmissions and they got like 7 miles a gallon in town with

the AC on, but they can last a really long time,

Atom bomb says, Scotty my 1999 Buick Century misfires only between 35 and 45 miles an hour,

I checked plugs, cylinder 3 the plug was worn badly, replaced that and it still happens,

what now, ok if you find that it's only misfiring at

that speed, what you want to do is have a mechanic do what's called a wet and dry compression

test of your engine, because it could be that the number 3 cylinder

is just flat wearing out, if that plug is worn badly and the others aren't,

it generally means you have a problem in the number 3 cylinder, and if a mechanic says

yes, your wet and dry compression test shows you've got a seriously worn part inside

your number 3 cylinder and that's a 1999 Buick, you might either live with it or get another

car, because to rebuild the engine is going to

cost you thousands of dollars to do it right so you know,

Hanson says, big fan Scotty, you recommended an automatic transmission over CVT but the

Corolla uses CVT thanks, well for a while you could have a choice,

you could get a regular or CVT and I'd say get the regular, but my son bought one with

a CVT that was a year old and he loves it, he says it gets better gas mileage than it

was even rated for so, you know I'm not a big fan of how CVT's drive and I personally

like a standard anyways, but Toyota makes pretty good stuff, that said,

do not buy a Subaru CVT, do not buy a Nissan CVT because you're going to have problems,

they have lots of problems with those, Dexbolt says, Scotty do you think it's a bad

idea to put a turbo into a 2000 Honda Civic with more than 220,000 miles, well it is if

you don't want to rebuild the engine, realize Honda makes very good engines, they

were originally a motorcycle company so their engines are top notch, their automatic transmissions

no, and their CVT transmission no, but their engines are really strong engines,

but that said, they're made for the size and power they have,

so you take a small Civic engine that's got 220,000 miles on it and you put a turbo on

it, you'll probably end up blowing the engine, that said if you want to rebuild the engine

or put a later model engine in it and a turbo, go right ahead, they're strong engines,

but with that kind of mileage you're kind of pushing your luck adding a turbocharger now,

so if you never want to miss another one of my new car repair videos, remember to ring

that bell!

For more information >> Here's What I Think About the Toyota Land Cruiser and More - Duration: 3:21.

-------------------------------------------

Bits and Joules: Empowering energy consumers through IoT & AI - Duration: 55:27.

>> Tanuja is a co-founder of DataGlen, which is an early-stage startup in the space of solar energy. Tanuja was earlier a technical staff member at IBM Research. She finished a master's at MIT. It's a pleasure to have you, and she's also part of the summer workshop that we have here, so she'll be around for the next couple of weeks.

>> Thanks Patrick. Good morning, everyone.

It's great to present some of the work that we're doing, and I think it's very relevant to the Azure Analytics part that we are talking about as part of the workshop. So, as the name suggests, I'm going to talk about Bits and Joules: some of the use cases in the energy vertical, what the challenges are, how the paradigm is shifting, and how modern digital technologies of IoT and AI are relevant for some of the problems that we're trying to solve in the energy vertical.

So, as all of us would agree, energy is a fundamental necessity of modern society, and when we have access to clean and cheap energy, we can do a lot in other sectors, including healthcare, education, and various other areas.

So, it's important to

basically look at the problem of energy,

which is the driving factor for

various other industries and technologies,

in terms of the innovations.

Energy vertical right now is

going through a paradigm shift,

and we call that paradigm shift as Democratization.

So, on the left-hand side,

you would see the democratization of media.

So, few years back or let say 20,

30 years back, the media was pretty centralized,

so there was content which was created in a centralized way, and the content was basically disseminated in a distributed manner. So, it's like a national broadcaster such as the BBC creating the content, and the mass population basically just absorbing the content.

But now in last 10 years,

there is with the social media like Facebook and YouTube,

everyone can create the content

and everyone can consume that content as well,

so it's completely changing from

the centralized to decentralized manner,

very similar thing which is happening

in the energy vertical right now,

which is democratization of energy.

So, previously, there was

a central generation which was

happening at the form of the hydro or thermal energy.

It was, the distribution and transmission was

happening through central entities.

Again, in India, most of the utilities are doing

the transmission and distribution right now,

and the consumers who are

basically the residential or industrial consumers,

are just consuming the energy without

participating in the entire model,

but now we are shifting,

very similar to social media,

the change is happening in the energy vertical as well.

So, there is distributed

solar and batteries which is coming into picture,

there are electric vehicles which

are basically getting commercialized right now.

So, it's completely changing

the way the energy was generated and consumed.

So now, the commercial or residential consumers

are able to generate their own energy,

part of that they might be consuming,

part of that they can basically

inject back into the grid as well.

So, it's not a one way flow but now

there is a two-way flow of energy which is happening.

Obviously, the traditional power plants

and traditional transmission distribution network

is not designed to handle

this two-way type of communication

or two-way type of energy flow,

and hence there are a lot of

interesting socioeconomical and technical challenges

which are getting created because of this.

So, we will go through some of

the issues in energy vertical and

how the computer systems like IoT and AI,

can help solve some of those issues.

So, the first issue is of power shortage.

So, on the left-hand side,

this is the power consumption,

basically the power demand from New York over the year.

So, basically, this axis shows,

over the year, how there are

variations in the power demand.

This particular axis shows during a particular day,

how the demand is varying.

So, during a day, again,

there are one or two peaks depending upon

the power consumption needs

and depending upon the season,

whether it's the summer versus winter,

there's a variability in the demand.

So now, if we look at these plots,

these are again from the New York City.

So, there was the highest peak load demand here, in 2006 versus the 1980s. If you look at the 1980s, there are no long tails in that kind of distribution; these are basically hourly loads in terms of megawatt hours, and as we move to 2006 and again to 2018, we see very long tails of demand.

So, the demand is increasing, there are short periods of high demand, and to cater to these kinds of demands there is no point in adding infrastructure just for that peak demand load, so definitely some other or alternative methods are required to curtail or address these kinds of demands which are coming onto the networks.

The second problem is of energy scarcity.

So, in general, there is a problem

of access to clean and cheap energy,

so this is something which we

have very much experienced in India

where there are regular power cuts or there are places

where there is no access to energy as of now.

So, this figure is from 2018,

which shows that across the world,

how much population is without electricity right now.

As we speak right now there is still

approximately 300 million people in

India who do not have access to energy,

and that's because of various reasons

of the cost which is required for

the transmission and distribution of

the energy which is generated at the central level,

as well as the other problem

of the availability of energy itself.

So, the current energy generation, which is happening through thermal and coal power plants in India, is not sufficient for the increasing demand for energy at various places.

The third issue that we'll talk about is mainly the good part which is happening: there are a lot of renewable and alternative energy sources coming into the picture right now, in the form of solar, wind, as well as battery storage, et cetera.

The good part is that they are quite local, so there are no transmission and distribution losses involved here. They are sustainable and scalable, so you can scale these plants as required.

Most of these, solar especially, are reaching grid parity, which means that the cost of generating energy from solar is becoming almost the same as the cost of getting energy from thermal or other existing conventional plants.

But there are some disadvantages

in using these renewable energy resources.

The first problem is generation variability.

So, this plot shows the generation

across the year from one, single wind turbine.

So, as we see that there is a huge variability

across the year as well as during one given day,

there is a huge variability in

the generation which is

happening from single wind turbine.

So, when we are taking into account

the large adoption of renewable energy resources,

if we are not able to predict how much amount of

energy that will be generated in next few hours,

then we cannot do the unit commitment and

we cannot rely especially on these kinds of sources.

So, it's very important that how reliably we can

predict how much amount of energy that would be

generated from these renewable energy resources,

and it includes the

accurate weather prediction and on top of that,

accurate prediction of the energy from these sources.

The second problem is that

if you take into account, let's say,

solar, then the solar

is available only during few hours of day.

So, it's available only from, let's say,

6.00 AM to 6.00 PM of the day,

and again there is a variability in amount

of generation which is happening from a solar plant.

So, we cannot rely on

this renewable energy resources completely,

we need to have additional systems in place to

backup whenever these resources are not available,

and that's where basically

we are looking into, storage technologies,

and how storage and other

technologies can be integrated with renewables,

so that we can reliably provide power supply

through these renewable energy resources.

So, that's where now we're

going to talk about the Participatory Grid.

So now, the consumer is not just the endpoint,

consumer is not the dumb or static consumer,

but consumer is participating in the entire process

of energy generation and consumption in various manners.

So, the four main points that I'm going to talk about is

first is how we can address

the power shortage problem by reducing the peak demand?

The second is how we can address the energy scarcity?

On one side there is energy scarcity,

which means that amount of energy which is available,

that is not sufficient to cater to most

of the energy resources,

but on the other hand we see lot of

energy wastage which happens in the residential,

commercial and industrial places.

So, can we identify

the energy wastage and

curtail those energy wastage points?

The third one is how we can

basically maximize the distributed generation,

or basically adoption of

distributed energy resources with

the additional resources like

solar and other technologies,

and the final one is looking at

optimal utilization of resources,

which means that basically including

all the points of preventive maintenance

and predictive analytics.

So, just going back very quickly to

IoT technologies and why

IoT analytics is very suitable

to solve some of these problems.

So, we are looking at basically the things, which are the equipment set up at most of these renewable or conventional energy plants. Most of these are capable of communicating data with standard protocols. So, in most cases, we do not have to add any new sensors as such; the existing batteries, inverters,

wind turbines or even

the smart meters which

are installed at the home premises,

they are capable of sending

the data and the context information as well.

There are some existing processes

in place which need to be

adopted to basically look into the new technologies.

So, I like to basically look at

the IoT stack similar to brain architecture.

So, as we look into the brain,

there are different systems which are

responsible for different types

of actions which are performed in our body.

So, if we look at the Reptilian Brain,

it's basically governing

all the important or survival instincts

of human body including breathing, heart rate,

and balance, which are like

low-latency response events which

are required by the body.

So, those events are basically being

handled very close to the sensor systems.

The second type of brain is Limbic System which

is looking at the feeling and memory formation.

So, looking at more data and generally forming

the patterns or memories through

that and the last one is basically,

Neocortex which is doing

the reasoning forward planning language and other things.

So, this is basically looking at more holistically,

the data from multiple sensors

and making their decisions on top of that.

So, can we look at

the IoT hierarchy of systems in similar view.

So, we have sensors and actuators at the level which are,

many times provided by the equipment.

Then, they have the data loggers or

edge devices which are sensing

and collecting that data

and capable of taking certain actions.

So, the actions which are required as

a low-latency response actions or decisions which

are crucial to be taken in the very near real time,

it's good to have those decisions in

the system which are as close to edge as possible.

Then, there are certain decisions

which require information from multiple resources.

So, let's say, if we are looking at the load across

the community and some

decisions that need to be taken on top of that.

Obviously, we would not be able to take

those decisions at edge devices but it

might be in the Fog or the Local Data

Center or the small computing which is at,

let say, community level where we

can access that and obviously, in the Cloud.

So, there was some initial work we did while I was at

IBM on looking at hierarchical decision system.

So, can we do the analytics and the hierarchical manner,

where one node is doing

the computation which is possible and

essential by that particular node and

passing on information to the next level of nodes?

So, depending upon the problem,

can we decide where the analytics should reside for

that particular problem and can

there be a connection between those nodes?

So, we have not completed this work

after I left IBM research but

we are interested in taking it up.

So, if there are any interest

in the similar kind of systems,

we'll be happy to collaborate on this.

So obviously, the Neocortex is doing

various types of analysis on imaging time-series data.

So, getting back to the Participatory Grid,

I'll start with the load shifting problem which is

basically looking at Peak Power Demand.

So, why reducing the peak is important.

So as we see,

the X-axis is the time of day and

Y-axis is the power production or their demand.

So, there are multiple peaks during the day and

the power is supplied, as in generated, by various different types of generation sources. One is basically baseload, which could be hydro or thermal.

So, which is a constant load

which we are getting through out the day.

On top of that there is an intermediate load,

which is again provided by

some conventional energy resources

and then, there are Peak Loads,

which are typically provided by

diesel generators which have low response times,

quickly can be started and run through.

But if we look on the right side plot,

it shows the amount of basically,

the price versus the power generation

and the price does not increase linearly.

As we go to the peak demand,

the price increases exponentially.

So, if we can reduce this particular demand by even a very small, tiny amount, it significantly helps in reducing the price in this particular band where it increases exponentially.

So, there are historical approaches which have been

taken into account previously to shift these loads.

Some of these approaches have been

Daylight Saving Time or

the Tokyo Brownouts and CA Rolling Blackouts.

Also in India, we see that there

are these kind of load shedding which

happens like a few hours a day

at different localities to cater to the demand.

So, some part of the community would be completely

shut down during a particular point of time.

So, this is more like a top-down approach where

the utility is deciding, "I'm not able to cater to the peak demand, so I will shut down part of the community," and that is basically the rolling load shedding that we see often during the summer.

There are lot of other approaches

which are being tried out in

countries including US and Europe

with variable degree of success,

which are called Demand Response.

So basically, if you look at the left-hand side,

it shows the typical residential demand and what

part of the residential demand

is the deferrable or time shiftable loads.

So, if we're looking at electrical vehicle charging,

that could possibly be

a time shiftable load or if we're looking

at water heater or washing machines and dryers.

Those are time shiftable loads because

it's not like you need it immediately,

but it can be done over a period of three hours of time.

So, there is a large proportion

of loads in the residential,

commercial and industrial loads which

are kind of time shiftable and can be taken

advantage of to shift those loads from peak time to off-peak time, so that we can reduce the demand and basically do the peak shaving during those times.

So, when we started looking into basically,

the problem of peak demand and demand response for

India or the developing countries.

The first challenge is that, in the way it's being tried out in the US and Europe, the utility collects the data from the smart meters or individual appliances; let's say a washing machine is in demand response, then there is an IoT device sitting on the washing machine.

It sends the data to the Cloud, and the utility sends the demand response signal, either for manual intervention or for automatically dispatching the load to some other time.

So, it requires basically

complete infrastructure for communication,

centralize optimization techniques where

you can send the signals to

the consumers and the consumers

would participate in demand response.

There are few success stories in certain utilities in

US who have successfully done

the demand response process for shifting the load,

but we started looking at the same problem for let's say,

India and Africa, we do not even have

the sufficient infrastructure

for the network communication.

So, we cannot rely on the network communications to

send the signal from the utility to individual equipment.

Also, there are lot of equipments

which are already existing.

So, we cannot change those equipments,

which means that there has to be something which is

sitting outside of the appliance,

but still taking into account the appliance preferences.

The third thing that we wanted was for it to be as cheap as possible, so it has to reside on a small device, and the final thing is that the same device should be applicable to a range of appliances.

So, it should not be that

one type of appliance is only running

for water heaters versus washing machines.

So, it should be able to handle

multiple different types of time shiftable appliances.

So, taking these multiple things into account,

we built something called nPlug.

Again, this is work which

I did while I was at IBM Research.

So, what this nPlug does is,

it's like a smart socket, or like the power strip that we generally use. It sits between the wall socket and the appliance: the appliance plugs into the nPlug, and the nPlug is built using a 16-bit microcontroller with very little RAM and four MB of flash memory.

So, what it does is,

it basically does local sensing and decision-making.

So, it senses the grid conditions

especially of voltage and frequency,

to identify whether there

is a high load on the grid or not.

It takes the preferences from the consumer.

So, let me take the example of a water heater. Most of the time with water heaters, the water can be kept heated because there is insulation, so it stays hot for three to four hours in the morning.

So, most of the times what happens is we all get

up at six o'clock and start the water heaters.

Water heater takes from

two kilowatt to four kilowatt of energy.

Multiple such appliances come in

the activation and we see the morning peak.

So, can we instead of starting,

everyone starting and manually

switching the appliance at 6:00 AM,

can we switch the appliance

anytime between 3:00 to 6:00 AM,

because water can be any way kept

hot and ready for three hours.

So, the user can set the preference that,

"I want the water to be ready at 6:00 AM",

and it needs to be run

for 30 minutes let's say for water to get heated.

So, then we can actually shift this demand from 6:00 to 6:30, or whatever time the user gets up; six to 7:00 AM, let's say, is the peak demand time, but we can actually do it anytime from 3:00 to 6:00 AM, because that's the off-peak time and there is not much load on the grid.

The user can set the preferences

that 30 minutes is what I want to run the appliance

and 6:00 AM is

the time by which I want the water to be ready.

Obviously, there are "Override" buttons,

so some days you don't want to set that schedule,

so you can always override and say that I

want to switch it on at whatever time I want.

So, without having any external infrastructure

and just having the sensing

of voltage and frequency inside the appliance itself,

we wanted to see how we can

identify the peak and off-peak demand times.

So, the first part is an autonomous operation or

decision making of identifying

when there is high demand on the grid.

So, this plot shows the time of day versus the voltage,

and each color is a different day of the week.

So, this is the data

which was collected from my house in [inaudible].

So, you would see that at night the voltage is high, because these are off-peak times; the nominal voltage we are expected to have is about 230 volts, but typically at night we see it around 245 or 250 volts.

Whereas in the evening peak times around six to nine,

you would see significant drop which is like 220 volts.

This was in Bangalore where

this local transformer was very close to my house.

We did similar experiments at

various different parts in India,

and we see huge variability.

So, if you go to some of the rural places in [inaudible], you will have seen the voltage dropping to 180 volts or 175 volts. So, we cannot just fix a particular range, because the range varies as you go from one location to another. We cannot just say that anything above 235 is the off-peak time and anything below 220, let's say, is the peak time; depending upon the location, you have to adaptively identify what the peak and off-peak voltage range is.

So, we basically collect the data at every 30 seconds or one minute, and put a compression on top of it, piecewise aggregate approximation, aggregating the signals at every five-minute interval.

Then we look at basically doing the clustering

of these particular time series

into three clusters of peak,

off-peak, and the median time.

Then we look at the patterns,

so let's say the algorithm looks at

the last 10 days pattern

to identify what are the peak and off-peak times.

So, based on that, it first of all identifies the thresholds, the lower threshold and upper threshold for voltage, and then, based on the last 10 days' pattern, identifies which times are peak times and which are off-peak, when it needs to take control.
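
To make that pipeline concrete, here is a minimal sketch in Python (an illustration only, assuming per-minute voltage samples over roughly the last 10 days; the actual nPlug runs equivalent logic in a streaming manner on a 16-bit microcontroller):

import numpy as np
from sklearn.cluster import KMeans

def learn_voltage_thresholds(voltage_1min, window=5):
    """Learn peak/off-peak voltage thresholds from ~10 days of per-minute readings."""
    # Piecewise aggregate approximation: mean voltage over each 5-minute window.
    n = len(voltage_1min) // window * window
    paa = np.asarray(voltage_1min[:n], dtype=float).reshape(-1, window).mean(axis=1)

    # Cluster the aggregated readings into three levels: peak, median, off-peak.
    km = KMeans(n_clusters=3, n_init=10).fit(paa.reshape(-1, 1))
    centers = np.sort(km.cluster_centers_.ravel())

    # Low voltage corresponds to a heavily loaded grid (peak), high voltage to off-peak.
    return centers[0], centers[-1]   # (lower_threshold, upper_threshold)

def grid_state(voltage_now, lower_threshold, upper_threshold):
    if voltage_now <= lower_threshold:
        return "peak"          # grid is loaded, defer the appliance if the deadline allows
    if voltage_now >= upper_threshold:
        return "off-peak"      # safe to run
    return "intermediate"

Because the thresholds are re-learned from local data, a house near a strong transformer in Bangalore and a rural feeder that sags to 180 volts end up with different bands, which is the adaptivity described above.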

Previously, there were a lot of techniques which looked into the frequency data to identify these gaps between supply and demand, there was a lot of work on this, but no one had looked into the voltage signals and patterns, so that was one of the contributions. Doing this on these small microcontrollers, in a streaming manner, is what we did.

The second thing is looking at frequency. Ideally, frequency should be 50 hertz and close to that; if we look at data from the US, we would not see this much variability, it's pretty close to, let's say, 60 hertz over there, but here we see a huge variability, from let's say 49.2 to 50.2. But still, there are events when it goes beyond the normal threshold.

So, we identify what the normal behavior for frequency is, and at which times there is an imbalance.

So, if the demand is higher than generation,

then the frequency will drop.

So, we identify those events and again use them for shifting the load.
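
A correspondingly simple way to flag those frequency imbalance events (again only a sketch, assuming per-minute frequency samples on a 50 Hz system and a normal band learned from recent history):

import numpy as np

def under_frequency_events(freq_hz, history=14400, k=2.0):
    """Return indices of samples where frequency drops well below its recent normal band.
    history: number of recent samples used as the baseline (e.g. 10 days of minutes)."""
    f = np.asarray(freq_hz, dtype=float)
    baseline = f[-history:]
    mu, sigma = baseline.mean(), baseline.std()
    low_limit = mu - k * sigma        # demand exceeding generation pulls the frequency down
    return np.where(f < low_limit)[0]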

So, now we're able to identify

the off-peak and peak times

and we know that when we

should not be running the appliance.

But let's say, now there is no central controllers,

as such every device is going

to make the decision independently.

So, we should not create a problem of just shifting the load from one time to another by all of them starting at once. Let's say there are a thousand nPlugs and all of them sense that it's an off-peak time, and all of them shift to another time and create another peak demand.

>> So, what's the disadvantage of what the [inaudible].

>> Basically, you are adding to the peak demand.

So, if you are running the water heater

and since utility is not able to supply during

that time because there is a cap on

how much energy utility can

supply at any given point of time.

So, you would have experience let's say

morning and evening power cuts, right?

So, they typically do the power cuts during peak time,

because they are not able to cater

to the demand at that time.

So, that's like a top-down approach,

they force everyone to shut it off.

But what this one is trying

to do is like a bottom-up approach

that the device itself says that I need

not run only during the peak time,

so I can shift my demand.

There are other loads, right? Let's say there are lights and air-conditioners and other things which are not time shiftable, so we have to run those appliances during peak time. So, it's about giving a chance to the appliances which definitely need to run during that time, and shifting the loads which can be shifted, kind of voluntarily, rather than the utility forcibly saying that the entire community will go down from 6:00 to 7:00 AM.

>> [inaudible] ?

>> So basically, we then discussed with the utilities

on how they can include as part of the business model.

So, for industrial consumers,

there is time of use pricing, right?

So, even in Karnataka, there is time-of-use pricing, where industries are charged a higher rate if they are running at peak time. So, we discussed various business models, whether they can give a rebate if someone is shifting the load. Let's say this particular device is able to send an audit log at the end of the month which shows that the appliance was not running during the peak time; then there could be discounts and rebates, and if there is time-of-use pricing, there is a straightforward use case for the consumer to shift the load to off-peak times.

>> [inaudible].

>> So, that's part of the user preferences.

So, there are certain loads which are long-running loads. If you take the example of EV charging, that would be running over four hours, and you can basically chop that four hours up within, let's say, six hours. You can say that my window size needs to be 30 minutes, so it should not be switching too often; the appliance gets a chance to run for at least a minimum of 30 minutes at a time. So, that's part of the user preferences.

User can say, "Do I want to run it at one go,

which is a case for let say,

washing machine that I want to run it at one go?"

versus if it's the inverters that we charge

at home or the

electrical vehicles or even the water heaters,

where it can be chopped that

the entire usage can be chopped at multiple time windows.

>> [inaudible].

>> No. So, the way we had designed it, the user has to provide the preferences. We wanted to keep things minimal in the device, taking into account its small size.

But that was a requirement from a few discussions that we had: if the device is capable of observing the usage pattern, let's say we just collect the consumption data of an appliance, can we categorize what type of device it is? Can we classify, basically, water heaters from washing machines, and then automatically set the preferences for that category of device?

>> Is there a different nPlug for each device?

>> It's the same model,

but if you are running it

for multiple different appliances,

you would be putting one for each.

>> [inaudible].

>> In this model, it was not.

But then there was some conversation on,

let's say there are 10 nPlugs within one industry.

Can they talk to each other and do

some even better optimization in terms of distribution?

>> [inaudible]

>> Yeah. So, we did these experiments on data collected from,

let's say, US houses as well.

As I said, the range was small.

So, even in case of frequency and voltage,

the range is very tight,

but still, you would see the pattern.

So, the pattern is available over that tight range.

>> [inaudible] ?

>> Yes. So, you would still see a small variability. The variability might be within, let's say, plus or minus two, but the variability is still there. Basically, to control this, a lot of utilities put in additional devices for voltage stabilization and frequency stabilization.

So, in India with the utilities, or I think in a few countries in Europe as well, in the pricing of the real-time energy market, the price that you get paid for providing energy at a particular point of time, which is called ABT, availability-based tariff, is dependent on the frequency at that time. So, if the frequency is low and you're supplying energy at that time, you get paid more than when you are supplying in the normal region.

>> [inaudible].

>> Yes. So, again,

it's not widely available as of now.

It was initially done as part of the pilots.

Now, as they're putting smart meters,

the demand response is generally done if there are smart meters available. But a smart meter is giving you timestamped data already, and it is being pushed to the Cloud almost in real time, so that's where the analysis is happening, in the Cloud.

>> [inaudible] ?

>> So, I'll go to the other problem, where we do look at the power consumption of the appliance. But in this case, it was mainly for identifying the condition of the grid, whether the grid is overloaded or not. If you look only at the power consumption part, you will just get to know how your appliance is working, but you would not get to know that your transformer is overloaded right now and that you need to curtail your demand at this particular point of time.

So, basically, once the problem of identifying the grid condition was solved, then we wanted to look at how we can basically spread that demand over time.

So, we looked at

various decentralized scheduling techniques.

We basically looked at computer networks,

CSMA kind of protocols,

which tried to back off

when there is a high demand on the network.

So, we call this algorithm as

a Grid Sense Multiple Access similar to

Carrier Sense Multiple Access.

There are a few differences in this one. In those computer network algorithms, there is no deadline for completing the packet transfer; you can basically keep attempting to send the packets, whereas in this problem there is a deadline.

So, if I'm saying that I want

the water to be ready at 6:00 AM,

then 6:00 AM is my deadline.

No matter what, we want to give preference to the user's deadline, which means that the water has to be ready by the deadline which is given. So, we had to make some modifications to the CSMA algorithms to suit deadline-based scheduling.

So, the first thing that we did was a contention window, depending upon the available time slots. Let's say I am at 3:00 AM and I need a 30-minute run; I have three hours, which means six 30-minute slots are available, or so many five-minute slots, whatever number of slots are available in that particular 3:00-6:00 AM window, versus how much time I need to run.

So, depending upon that, every device basically tries to identify its contention window. As time elapses and you start going toward the deadline, your contention window size becomes smaller and smaller. What the device does is, let's say the contention window size is seven, the device randomly chooses to sense in any of those initial seven slots, senses the grid, and identifies whether it is loaded or not. By doing this, we are first of all spreading out when the nPlugs sense and make their decision, rather than all of them doing it at the same time.

The second thing is the probabilistic joining, depending upon the voltage level. Let's say the voltage is below the lower threshold; then it says that you should not be joining, because it's a peak time. If it's above the upper threshold, which means that it's an off-peak time, the device can definitely join at any point of time. If it's in between, then it computes probabilistically with what probability it should join the grid, whether it should be starting. So, by doing these probabilistic adjustments, not all nPlugs would again join at the same time; they distribute probabilistically and would not increase the load beyond a critical point.
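
Putting those pieces together, the scheduling rule can be sketched roughly as below (my own compact illustration of the Grid Sense Multiple Access idea under the stated assumptions: discrete 5-minute slots, a hard user deadline, and the voltage thresholds learned earlier; it is not the published nPlug code):

import random

def gsma_should_start(slots_left, slots_needed, voltage,
                      lower_threshold, upper_threshold):
    """Decide, at the start of a 5-minute slot, whether to switch the appliance on."""
    if slots_left <= slots_needed:
        return True                      # deadline pressure: must run now regardless of grid state

    # Contention window shrinks as the deadline approaches, spreading devices out in time.
    contention_window = max(1, slots_left - slots_needed)
    if random.randrange(contention_window) != 0:
        return False                     # back off this slot and sense again later

    # Probabilistic joining based on the sensed grid condition.
    if voltage <= lower_threshold:
        return False                     # peak time: stay off
    if voltage >= upper_threshold:
        return True                      # off-peak: join
    headroom = (voltage - lower_threshold) / (upper_threshold - lower_threshold)
    return random.random() < headroom    # in between: join with probability proportional to headroom

For the water-heater example, a 30-minute (6-slot) job evaluated every 5 minutes from 3:00 AM has 36 slots before the 6:00 AM deadline, so thousands of such devices pick different slots instead of all starting together.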

>> [inaudible] ?

>> So, we did some experiments where, let's say, there are two controllers, two nPlug controllers, in one house.

Obviously, if they can communicate between each other,

there would be better chance

of optimization than this one.

If you are able to do it centrally,

that would be a complete optimal solution.

So, as we go up in the hierarchy, from the edge to the cloud, it might give you a bit more.

Because you have higher information available with you,

you would be able to do better optimization.

But taking into account the constraints, we didn't want to require any communication infrastructure, not even a Wi-Fi connection or other links. So, because of that, we decided that we would make the decisions locally, and we would see how much of a difference it makes.

If there are not many devices, if there are not hundreds or thousands of devices being controlled at one location, this still gives the same performance whether it's two, 10, or 100.

>> For households, this is still okay then?

>> Well, for households and industries which have fewer than 100 or 200 devices, this still works as well as if you had taken the centralized decision.

>> [inaudible] ?

>> So, that's another piece of work; I can point you to the paper where, looking at users' loads, we tried to identify the preferences for running their devices, based on the context of how they would like to run their devices over weekends versus, let's say, weekdays, and other things.

So, these are the results of,

we tried it in simulation and

some controlled experiments in real world.

So, basically here it's

creating two peaks during the day.

There are different types of inputs

like water pumps and washing machines,

water heaters and inverters,

which have variable preferences

in terms of their time window,

how much they want to run?

The starting times and durations and other things.

So, we are considering, let's say, close to, I think, 400 or 4,000 such appliances. Even when they are taking these independent decisions in a decentralized manner, because they can now shift this particular peak over the three or four hours before the actual time, it's able to significantly spread the demand over time.

>> is this simulation or is it actual measurement?

>> This was in simulation.

So, we did various simulation and

then we tried it in

the control experiments of 100 devices.

So, 100 devices we ran on

multiple different appliances and

users set their preferences as per their need. So, we didn't have control over how they set the preferences.

So, now we're moving from the load

shifting to energy efficiency problem.

So, one thing is that there is high demand which the utility is not able to cater to, but on the other side, we see that there is a lot of wastage of electricity happening.

The wastage could be because of aging appliances, like aging refrigerators or cooling systems. In the US itself, 30 to 45 percent of HVAC systems operate below their efficiency levels.

This could again be because of two reasons: one is malfunctioning or aging of the appliance, the other is inefficient use of the appliance. It could be that the air conditioner is running while the doors are open, which means that it's not running at optimum levels.

To address this problem, what we did was something similar, which is called SocketWatch. It again sits between the appliance and the wall socket, and now, instead of looking at the grid, it looks at the power consumption of that particular appliance. So, we collect data on active and reactive power and various other parameters which can be sensed from outside the appliance.

Then we try to build a benchmark model for that particular appliance. Here is an example of a washing machine, which goes through different modes depending upon the settings that you choose on the washing machine; it has different signatures of active and reactive power. So, we identify these states of the appliance, whatever the appliance runs in during the benchmark time.

So, this is a learning phase when we collect

the data and build a benchmark model.

So, here it's basically showing that there are different states of power consumption of the appliance and how it transitions from one state to another, as a Markov model, with the probabilities of going from one state to another depending upon various factors.

So, there are two phases. One is the learning phase, when we identify the modes depending upon the active and reactive power consumption, and the states depending upon the time spent in each mode. Sometimes the same power consumption running for different time intervals means something different, so it's a different state. Then we look at the periodicity between those modes.

So, for example if we are taking example of refrigerator,

the defrost heating might be running once every day.

So, that's the periodicity of that.

Or, if we see the cycling of compressor,

it might be running at specific time intervals,

which is a periodicity and then the state transition.

So, how the device

transitions from one state to another state.

Once you have the benchmark model, in the monitoring phase we do two things. One is standby mode correction: if we identify that the appliance is running on standby, then it is switched off. And if we identify that there is a change from the benchmark states, then we basically raise an alert for a malfunction or inefficient use of the appliance.
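
A toy version of that two-phase idea looks roughly like this (my own sketch, assuming a stream of active and reactive power readings and simple k-means modes; the actual SocketWatch model also encodes time-in-state and periodicity):

import numpy as np
from sklearn.cluster import KMeans

def learn_benchmark(active_w, reactive_var, n_modes=4):
    """Learning phase: cluster (P, Q) samples into operating modes and estimate
    the mode-to-mode transition probabilities (a small Markov model)."""
    pq = np.column_stack([active_w, reactive_var])
    km = KMeans(n_clusters=n_modes, n_init=10).fit(pq)
    labels = km.labels_
    trans = np.zeros((n_modes, n_modes))
    for a, b in zip(labels[:-1], labels[1:]):
        trans[a, b] += 1
    trans /= np.maximum(trans.sum(axis=1, keepdims=True), 1)
    return km, trans

def monitor(km, trans, active_w, reactive_var, standby_watts=5.0, min_prob=1e-3):
    """Monitoring phase: flag standby waste and transitions the benchmark never saw."""
    pq = np.column_stack([active_w, reactive_var])
    labels = km.predict(pq)
    alerts = []
    if np.mean(active_w) < standby_watts:
        alerts.append("standby: consider switching the appliance off")
    for a, b in zip(labels[:-1], labels[1:]):
        if a != b and trans[a, b] < min_prob:
            alerts.append("unexpected state transition: possible malfunction or inefficient use")
            break
    return alerts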

So, every appliance, while it consumes power, consumes it in two forms; basically there is a breakdown of your power consumption, in terms of a phase angle, and it gives you two parameters: how much is active power and how much is reactive power. Depending upon whether your load is inductive or capacitive, the amounts of active and reactive power change. So, the combination of active and reactive power defines, in some sense, what kind of operation is happening in your appliance.

>> [inaudible]

>> So, basically, if you measure any appliance's power consumption, you get these two factors of active and reactive power. It's just sitting outside and measuring power consumption, that's all it's doing. There is a lag, as in a power factor, between them, which is the phase angle.
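
For reference, the two quantities come directly from the measured RMS voltage, RMS current, and the phase angle between them; a one-function illustration:

import math

def power_components(v_rms, i_rms, phase_angle_rad):
    apparent = v_rms * i_rms                          # volt-amperes
    active = apparent * math.cos(phase_angle_rad)     # watts, the real work being done
    reactive = apparent * math.sin(phase_angle_rad)   # VAR, set by how inductive or capacitive the load is
    return active, reactive

# Example: 230 V, 2 A, 30-degree lag -> roughly 398 W active and 230 VAR reactive.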

So, here are some examples of identifying issues. This particular one is a refrigerator, and this is the benchmark model. You see that there are typically states like compressor cycling, the compressor going on and off. If you look into the clusters, it basically identifies different clusters, which are states in terms of power consumption, and there is a periodicity, how long it stays in the compressor-on mode versus the compressor-off mode. For this refrigerator, after building the benchmark model, we identified that it was basically changing its states; then they looked back and found out that there was a gasket leakage.

So, what's happening is, since there is a gasket leakage, it's not cooling sufficiently. It's not able to reach its cooling set point, so it was not identified by the user, because it seems to be working fine, but it was never going into the compressor-off mode, because the thermostat is not able to trigger the compressor-off mode. So, if you see here, it's always on and there are no on-off cycles. If we look into the clustering or the states, here was the compressor-off state, which was coming periodically, and which was completely missing from the actual operational model.

So, just moving next

to the distributed generation part.

So, basically, these are called Microgrids,

so the examples which I talked about earlier.

Because they are doing the generation

and consumption at the same location,

so they are called Microgrids.

There are various forms of microgrids, like offline ones, which work completely independently, or backup or online ones, which are typically the solar and storage at a house where you are getting grid power as well as having your own system alongside the grid. Historically, we can say, there has been consumption and generation happening at the same place, so there have historically been different microgrids, but obviously those are not efficient.

So, for the problem that we discussed earlier: previously, the user was just doing consumption, so it was a simple on and off of a switch. Whereas now the consumer is called the prosumer, where there is local generation and consumption, and the complexity of the problem is increasing on the consumer side.

So, this is like a time of

use pricing which is provided by the utility.

So, depending upon whether it's summer, or whether it's a weekday versus a weekend, there are different per-unit prices. You have the solar generation, over which we don't have much control; the generation happens depending upon your weather, and you can only predict how much generation there will be. You have a battery, so the control that you have is how much you charge and discharge the battery.

Again, the problem at

the small scale is

being handled at multiple different stages as well.

So, there is something called virtual power plants, where you have multiple different types of power generation equipment; it could be wind, solar, diesel generators, or solar plus batteries. So, how can you activate them? Basically, virtually, they create a power plant, and let's say you have the requirement that this virtual power plant needs to supply a particular amount of demand over time; how can you choose the right set of generation at the right time so that the overall cost of generation is minimized?

So, starting from the individual household level

to the large-scale generation,

the same problem is being

tackled at multiple different levels.
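
A minimal version of that selection problem (my own sketch: a greedy merit-order dispatch that fills demand from the cheapest available source first and ignores ramp rates and other real-world constraints mentioned in the talk):

def dispatch(demand_kw, generators):
    """generators: list of dicts like {"name": "solar", "available_kw": 40, "cost_per_kwh": 3.0}."""
    plan, remaining = [], demand_kw
    for g in sorted(generators, key=lambda g: g["cost_per_kwh"]):
        take = min(g["available_kw"], remaining)
        if take > 0:
            plan.append((g["name"], take))
            remaining -= take
        if remaining <= 0:
            break
    return plan, remaining   # remaining > 0 means unmet demand

# dispatch(100, [{"name": "solar", "available_kw": 40, "cost_per_kwh": 3.0},
#                {"name": "battery", "available_kw": 30, "cost_per_kwh": 6.0},
#                {"name": "diesel", "available_kw": 100, "cost_per_kwh": 14.0}])
# -> ([("solar", 40), ("battery", 30), ("diesel", 30)], 0)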

>> [inaudible].

>> So, in the earlier two problems, we're still running with the conventional sources but tackling the power shortage and electricity wastage. Here, we are mainly taking into account the integration of other, non-conventional sources, and that's happening at multiple different levels.

So, you can choose any subset of the power generators which are available.

Each of them would have

different pricing for per unit generation,

and they have different availability times

and different constraints.

So, taking all these constraints into account,

how can you fulfill your demand in

real time while minimizing the total cost of generation?

So, that's happening at individual

household industry level or the utility level.

The interesting part here is that you need to make the decisions of which generators to switch on or off in real time. This requires computations which are as close to the generators as possible. So, if you are, let's say, collecting data from wind turbines and solar together, obviously you need to collect both sets of data and make the decisions on the cloud. But let's say you're making these decisions for a house; then it's better to make the decision at the individual house level rather than collecting data in the cloud and making the decision there, because of latency and network issues.

>> [inaudible] what's the cost of each liters

that you need documents of it?

>> So, basically, it's called the levelized cost of energy, which takes into account the additional costs of maintenance and all other things over whatever period of time. So, now, let's say we are taking into account a battery's levelized cost of energy; you need to take into account how long the battery life is. Whatever cost you are putting in, you would be expensing it over, say, four years' time.

Depending upon how many

charge and discharge cycles you make,

the battery life is going to go down.

So, that is another constraint on

when you decide how much to

charge and discharge the batteries,

that how many charging-discharging cycles

that you are taking into account.

Each of these resources

have similar kinds of constraints.

So, there are times, with a wind turbine, when the wind is too low: it would still generate energy, but it would run very sub-optimally during those times. Your levelized cost of energy during those times would be higher, so it might be better to run some other generator rather than running it on low wind.
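
For concreteness, a minimal levelized-cost-of-energy calculation (a sketch with a single discount rate; the assets above would each get their own capital, maintenance, and lifetime figures):

def lcoe(capex, annual_om, annual_kwh, years, discount_rate=0.08):
    """Levelized cost of energy: lifetime discounted cost divided by lifetime discounted energy."""
    cost = capex + sum(annual_om / (1 + discount_rate) ** t for t in range(1, years + 1))
    energy = sum(annual_kwh / (1 + discount_rate) ** t for t in range(1, years + 1))
    return cost / energy   # currency units per kWh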

So, here, basically the problem is forecasting generation from each of these assets. Generally, now even utilities in India have started imposing this on every plant which has more than five megawatts of capacity: they need to provide, one day in advance, how much generation will be happening from the plant in every 15-minute time slot, and then there are revisions during the day.

Similarly for small houses,

if you are running it for

the individual small house or a commercial entity,

we need to know how much generation is

expected from the sources in real time,

a day in advance so that you can do

the optimization and if there are changes,

you need to be able to adjust to those changes.
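
A deliberately simple baseline for that day-ahead, 96-slot schedule (my own sketch: average the last few days of generation in each 15-minute slot and scale by an expected weather factor; production forecasters layer numerical weather models on top of something like this):

import numpy as np

def day_ahead_forecast(history_kw, days=7, weather_factor=1.0):
    """history_kw: past generation sampled every 15 minutes, i.e. 96 values per day."""
    slots = 96
    recent = np.asarray(history_kw[-days * slots:], dtype=float).reshape(days, slots)
    return recent.mean(axis=0) * weather_factor   # one forecast value per 15-minute slot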

The second is similar to the generation forecasting,

you need to do the load forecasting.

So, how much demand is coming from

your residential or commercial place

and how would you be able to cater to that demand.

Next is optimization problem to

comply with the policy. So, yeah.

>> [inaudible] ?

>> So, it depends upon how dividing is done inside house,

but that's something which is possible.

So, because of cost constraints on something,

generally, some people do that,

let's say solar power is supplied to

only few appliances and

grid is supplied to all the houses kind of a thing,

but the combination and mix is possible depending

upon the wiring that you do of your house.

Again, depending upon three-phase

versus one-phase loads and

other things there are few constraints.

>> [inaudible]

depend on the time of day?

>> Yes. So, there are two types of that.

So, one is the consumption tariff, where the consumer is charged depending upon how much energy the user is consuming at a specific time of day.

Then there is other type of

tariff which is called feed-in tariff,

so depending upon how much amount of

energy you are injecting into the grid.

Sometimes it's constant throughout the day,

sometimes there is a time of use as well.

So, this is the other problem

where depending upon the pricing

and the policies which take into account

the additional constraints of

let say levelized cost of charging,

discharging batteries, and battery life,

the optimization needs to be done on the grid.

So, this is a problem which we are working on right now.

So, right now, we do the forecasting ahead of time and push the results to the box, or Edge. Then it does a small, simple optimization if there are any changes. We are basically working on how we can do the revisions on the box itself, in case there is network disconnectivity or other issues, and how we can basically provide the optimization and run this model on the Edge.
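
As a toy version of that on-box optimization (my own sketch: a greedy rule that charges the battery in the cheapest time-of-use slots and discharges in the most expensive ones, ignoring the battery-life and feed-in constraints raised above):

import numpy as np

def battery_schedule(prices, capacity_kwh, power_kw, slot_hours=0.25):
    """Return a per-slot plan: +power to charge, -power to discharge, 0 to idle."""
    prices = np.asarray(prices, dtype=float)
    n_slots = int(capacity_kwh / (power_kw * slot_hours))   # slots needed for a full charge
    n_slots = min(n_slots, len(prices) // 2)                # keep charge and discharge slots disjoint
    schedule = np.zeros_like(prices)
    if n_slots <= 0:
        return schedule
    order = np.argsort(prices)
    schedule[order[:n_slots]] = power_kw      # charge in the cheapest slots
    schedule[order[-n_slots:]] = -power_kw    # discharge in the most expensive slots
    return schedule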

So, there are penalties associated if your generation forecast is off by more than 15 percent for any of the 15-minute points. These are the policies in India right now, and we need to be able to cater to those.

So, another problem, which is not on the Edge but which we are trying right now: this is the Microsoft Research building. If we want larger adoption of renewable energy resources, the user needs to be able to identify how much solar potential there is on their particular roof and how much energy can be generated.

So, basically, this identifies the area of the roof and a classification of the roof. Depending upon whether it's a south-north roof, an east-west roof, or a flat roof, the capacity of solar that you can put on it, and its overall potential, would vary.
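
Once the roof area and orientation are known, the capacity estimate itself is simple arithmetic; a rough sketch with illustrative rule-of-thumb constants (about 10 square metres of usable roof per kWp and orientation factors that are assumptions, not measured values):

def rooftop_potential(usable_area_m2, orientation, specific_yield_kwh_per_kwp=1500):
    """Very rough solar potential from roof area and facing direction."""
    area_per_kwp = 10.0                                  # m^2 of roof per kWp, an assumption
    orientation_factor = {"flat": 1.0, "south": 1.0,
                          "east-west": 0.85, "north": 0.6}.get(orientation, 0.8)
    capacity_kwp = usable_area_m2 / area_per_kwp
    annual_kwh = capacity_kwp * specific_yield_kwh_per_kwp * orientation_factor
    return capacity_kwp, annual_kwh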

So, we are doing some work on

the image optimization on there,

and there is already some open source work on

that core loop and solar map if somebody is interested.

That was mainly for the French datasets.

We are looking at Indian datasets for this.

Finally, efficient operations. This is mainly about predictive maintenance and root-cause analysis. Since we collect data from various equipment in solar plants, there are different types of failures which are possible across various devices, including weather sensors, inverters, MPPTs, and other things. So, we try to identify those faults proactively and reactively and alert the users.

This is the cycle for data

collected from the IoT platform right now,

and this is how the Real-Time Portfolio View looks like.

So, there is something that we are doing, like an online-offline module, where the central user wants to look at the entire data. If I'm an independent power producer who has 25 plants across India, I would want to see in real time how things are performing over my entire set of plants. But many times these large plants are at remote locations, so there are always problems of network connectivity and other things. If the entire model is sitting only on the cloud, the site engineers who are sitting on the ground are not able to access the system. So, we have something like an offline sync model, where a subset of functionalities is available on the fog at the plant itself, and opportunistically it syncs with the cloud datacenters.

Then identifying various under-performing assets.

So, we have shared some data with [inaudible] team as well,

so we would like to explore

some of the techniques that they have used here.

So, this is about cleaning schedules. Cleaning is one of the important issues in solar, which drives inefficiencies. Without adding any additional sensors, just by looking at the power generation traces, we are able to identify when cleaning is required and how much the soiling losses are.
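
One simple way to surface that from data the plant already has (my own sketch: compare each day's performance ratio against a baseline from just after the last cleaning, using irradiation from the existing weather sensor, and flag sustained drops as probable soiling):

import numpy as np

def soiling_alert(daily_energy_kwh, daily_irradiation, clean_days=7, drop=0.08):
    """Flag when the performance ratio falls well below its post-cleaning baseline."""
    pr = np.asarray(daily_energy_kwh, dtype=float) / np.asarray(daily_irradiation, dtype=float)
    baseline = pr[:clean_days].mean()          # days right after the last cleaning
    loss = 1.0 - pr[-3:].mean() / baseline     # fractional loss over the last three days
    return loss > drop, loss                   # (needs_cleaning, estimated soiling loss)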

Again, this one we are doing on the cloud right now, and we're trying to push it to the Edge as part of the workshop experiment that we're doing.

The last one is combining time-series and image data.

So, one thing is that we are getting this time-series data of power generation and other parameters from individual equipment, but we're also trying to get some data from either thermographic imaging or normal cameras, so we can combine these two to identify the right fault, decide what type of dust or soiling it is, and activate the auto-cleaning mechanism appropriately.

So, there are a lot of other interesting startups and corporates which are working on energy problems using AI and IoT technologies across the world. If there are any problems which are of interest to you, we'll be happy to collaborate with you. Thank you.
