Hi, my name is J. Weathers, AWS Solutions Architect Professional and MCSA: Cloud Platform, and today we're doing Getting Started with AWS SageMaker. We will deploy an AWS SageMaker instance, define IAM roles to work with S3, and review the Jupyter notebook and the algorithms that are deployed with the AWS SageMaker instance. At the end we'll overview what we learned and talk about next steps. So first things first, let's log in to our AWS account.
re:Invent was this year and it was awesome; Andy Jassy was looking great, guys, and he did an awesome job at re:Invent. Okay, so we're going to sign into our console. There are a couple of ways you can get to AWS SageMaker: you can scroll down under Machine Learning and click SageMaker, it will show up under your recently visited items, or you can just type in "SageMaker". The first thing you want to do is create a notebook instance, so we're going to click Create notebook instance and name it deep-learning-demo. For this demo we're going to use an ml.t2.medium instance; you can select ml.m4.xlarge or ml.p2.xlarge, but for demo purposes the t2.medium will be just fine.
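If you'd rather script this step than click through the console, the same notebook-instance creation can be sketched with boto3. The role ARN and account number below are placeholders, and the actual API call assumes AWS credentials are already configured:

```python
# Parameters for the same "create notebook instance" step, done via the API.
# The RoleArn here is a placeholder, not a real role.
params = {
    "NotebookInstanceName": "deep-learning-demo",
    "InstanceType": "ml.t2.medium",  # or ml.m4.xlarge / ml.p2.xlarge for heavier work
    "RoleArn": "arn:aws:iam::123456789012:role/SageMakerExecutionRole",
}

def create_notebook_instance(params):
    """Create the notebook instance; requires configured AWS credentials."""
    import boto3  # imported lazily so the sketch loads without the SDK installed
    sagemaker = boto3.client("sagemaker")
    return sagemaker.create_notebook_instance(**params)
```

Calling `create_notebook_instance(params)` from a machine with credentials does the same thing as the console button we just clicked.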
For the execution role, you can create a new role, enter the Amazon Resource Name (ARN) of an existing role, or use an existing IAM role, but right now we're going to create a new role. For a specific S3 bucket, what you need to do is get the name of a bucket that already exists. So let me go here; I'm going to log into AWS again.
All right, we're going to go to S3. S3 is our storage, where our data, our CSV files, will be kept. The bucket I'm going to choose is deepapp1; you see this bucket right here, deepapp1, and I'm going to type that name in. So an S3 policy will be created for this bucket specifically, and we're going to go through that. deepapp1 is a bucket that I already have, so I'm going to click Create role.

Now here is the role that was created with my policy. When I click on it, it takes me to Identity and Access Management (IAM): here's my role, and here's the policy that was created for it. You can see the resource is the S3 bucket deepapp1, and I'm allowed to get objects, which means I can pull objects from S3; put objects, which means I can put objects into the bucket; and delete objects from the deepapp1 bucket. So that looks great.
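The policy the console generates for the role looks roughly like the following. This is a sketch, built here as a Python dict so you can inspect it, assuming the bucket name deepapp1 from the demo; the real console output may include additional statements:

```python
import json

bucket = "deepapp1"

# Approximate shape of the auto-generated role policy scoped to one bucket:
# get, put, and delete objects, exactly the three permissions shown in IAM.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        }
    ],
}

print(json.dumps(policy, indent=2))
```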
Now we're going to talk about the Virtual Private Cloud (VPC). We're selecting the default VPC for this particular demonstration. Amazon recommends that if you're building a production app, you create your own VPC, and you can do that with CloudFormation, a tool you can use to automate all your infrastructure.
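As a rough sketch of that CloudFormation route, you could define a minimal VPC template and launch it with boto3. The stack name, resource name, and CIDR block here are made up for illustration, a real production VPC would also need subnets, route tables, and gateways, and the launch function assumes configured AWS credentials:

```python
import json

# Minimal CloudFormation template declaring a custom VPC (sketch only).
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DemoVPC": {
            "Type": "AWS::EC2::VPC",
            "Properties": {"CidrBlock": "10.0.0.0/16"},
        }
    },
}

def create_vpc_stack():
    """Launch the stack; requires configured AWS credentials."""
    import boto3  # lazy import so the sketch loads without the SDK installed
    cfn = boto3.client("cloudformation")
    return cfn.create_stack(
        StackName="deep-learning-demo-vpc",
        TemplateBody=json.dumps(template),
    )
```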
So now we're going to select the subnets in this Availability Zone, in the us-east-1 region, and I'm going to use the default security group.

For encryption keys: you have to protect your data with an encryption key. You can go to Encryption Keys and use a key that you already have, like this key right here, or you can create a brand-new key. You give the key an alias, deep-learning-demo; a description, "demo key"; and under advanced options choose KMS, then next step. You can add a tag, say app-env with the value dev; that's the tag, next step. Then you choose what can access the key: we're going to choose our SageMaker execution roles, both of those, click next step, and click Finish. Here is everything that can access the key with those SageMaker execution roles we just selected, and you can see all the actions: create, describe, enable, list, put, delete. Click Finish, and we have our deep-learning-demo key. We can copy its ARN, go back to SageMaker, paste the ARN here, and there we have our encryption key. Create the notebook instance.
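The same key setup can be sketched with the boto3 KMS client. The alias, description, and tag below mirror the ones used in this walkthrough (the tag key name is my own choice), and the function itself assumes configured AWS credentials:

```python
# KMS aliases must start with "alias/"; the tag mirrors the app-env=dev
# tag from the console walkthrough.
ALIAS = "alias/deep-learning-demo"
TAGS = [{"TagKey": "app-env", "TagValue": "dev"}]

def create_demo_key():
    """Create the key and its alias; requires configured AWS credentials."""
    import boto3  # lazy import so the sketch loads without the SDK installed
    kms = boto3.client("kms")
    key = kms.create_key(Description="demo key", Tags=TAGS)
    key_id = key["KeyMetadata"]["KeyId"]
    kms.create_alias(AliasName=ALIAS, TargetKeyId=key_id)
    return key["KeyMetadata"]["Arn"]  # this is the ARN you paste into SageMaker
```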
As you can see, deep-learning-demo is pending; that means it's deploying. I have one instance in service and one instance stopped. The great thing is that if you're not working on your SageMaker instance, or in your Jupyter notebook, you can stop the instance and start it again later, and all your data will still be there. So while this one is pending, we're going to go ahead and open an instance that I've already created. I'm going to open that, and here we see the sample notebooks.
When we take a look at the sample notebooks, you can see the models that are deployed with AWS SageMaker: advanced functionality, an introduction to Amazon algorithms, and an introduction to applying machine learning. These are pre-installed samples; the data and algorithms are already there, and you can start to train and predict breast cancer using the linear learner model, with features derived from images of breast mass. So you can walk through this right now and start learning how to do modeling for breast cancer, which is amazing.
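To give a feel for what a linear learner does under the hood, here is a from-scratch sketch of binary classification with logistic regression on made-up two-feature data. This only illustrates the idea; it is not SageMaker's implementation, and the "tumor feature" values are invented, not the breast cancer data set:

```python
import math

def train(points, labels, lr=0.5, epochs=200):
    """Fit a linear decision boundary with logistic-loss gradient descent."""
    w = [0.0] * len(points[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            # Sigmoid of the linear score gives the predicted probability.
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify by which side of the learned hyperplane x falls on."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) + b > 0)

# Two made-up features per sample; label 1 stands in for "malignant".
X = [[1.0, 2.0], [2.0, 1.5], [6.0, 7.0], [7.0, 6.5]]
y = [0, 0, 1, 1]
w, b = train(X, y)
```

After training, `predict(w, b, x)` separates the two clusters; SageMaker's linear learner is the same basic idea, trained at scale with its own optimizer.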
Now if we go back here, we'll take a look at a few more. Let's go back; that was Introduction to Applying Machine Learning, so let's look at Introduction to Amazon Algorithms. If we take a look at these, we have factorization machines, an image classification model, LDA topic modeling, and the linear learner with MNIST, which is the data set for image recognition, and then you have sequence-to-sequence and XGBoost with MNIST. Let's go back a little bit more, to where we were, and now we have the SageMaker Python SDK. Here you have your Gluon notebooks for MNIST and sentiment analysis, and then you have your TensorFlow models for distributed MNIST, so you can start to learn TensorFlow and work in TensorFlow if that is your framework of choice.

Okay, we're going to go ahead and wrap up this particular session on the Amazon algorithms. In part two we'll start to look into the introduction to Amazon machine learning, and the first model we'll run is the breast cancer prediction. As promised, for the overview: we deployed the SageMaker instance, we defined the IAM role to work with S3, and we reviewed the notebooks and algorithms. For our next steps, we encourage you to join our Slack channel; you can go to our website and click on our Slack channel button. We want you to be a part of our community; we want to help you learn deep learning and answer any questions you might have. You might want to join our Kaggle team and help us solve and win Kaggle competitions, you might form your own Kaggle team from the deep learning team's resources and talent, or you might just want to become a part of the deep learning team and start teaching how you build algorithms. Regardless of what you want to do, we want to hear from you. We want to get to know you, and we want to create great change with this new technology. Okay, see you in the next video.