AWS Services Security Deep Dive

Sundog Education by Frank Kane
A free video tutorial from Sundog Education by Frank Kane
Founder, Sundog Education. Machine Learning Pro

Learn more from the full course

AWS Certified Data Analytics Specialty 2021 - Hands On!

Practice exam included! AWS DAS-C01 certification prep course with exercises. Kinesis, EMR, DynamoDB, Redshift and more!

13:06:11 of on-demand video • Updated April 2021

  • Maximize your odds of passing the AWS Certified Data Analytics Specialty exam
  • Move and transform massive data streams with Kinesis
  • Store big data with S3 and DynamoDB in a scalable, secure manner
  • Process big data with AWS Lambda and Glue ETL
  • Use the Hadoop ecosystem with AWS using Elastic MapReduce
  • Apply machine learning to massive data sets with Amazon ML, SageMaker, and deep learning
  • Analyze big data with Kinesis Analytics, Amazon Elasticsearch Service, Redshift, RDS, and Aurora
  • Visualize big data in the cloud using AWS QuickSight
So let's do a review of security in all the AWS services that we've seen so far. It's going to be long and boring, but trust me, it's pretty much needed for the exam. So let's get started.

First, Kinesis. Kinesis is comprised of Kinesis Data Streams, and Kinesis Data Streams has SSL endpoints, so we can use the HTTPS protocol and do encryption in flight; that means sending the data to Kinesis securely. There is also KMS integration to provide server-side encryption, and that gives us encryption at rest. On top of it we can do client-side encryption, but we must use our own encryption libraries; there is no support for it in the Kinesis service. We would use the KPL, the producer library, to send data, but we need to provide the encryption on our own. There is a supported interface VPC endpoint, which means we can access the Kinesis service from our private EC2 instances privately. We can use the KCL to read from Kinesis, but remember: if you do use the KCL to read from Kinesis streams, then you must also grant read and write access to a DynamoDB table. Why? Because the KCL uses that DynamoDB table to do checkpointing and to share the work between different KCL instances. So remember that.

Now, for Kinesis Data Firehose, we attach IAM roles so it can deliver data to S3, Elasticsearch, Redshift, and Splunk; hopefully you remember these four destinations. On top of it, the entire delivery stream can be encrypted using KMS; that is server-side encryption. There is also support for interface VPC endpoints (PrivateLink) to access it privately.

Finally, for Kinesis Data Analytics, we attach an IAM role to it so it can read from the Kinesis data streams we need and from reference sources (for reference data), and write to an output destination; for example, that destination may be either a Kinesis stream or a Kinesis Firehose. So that is all the security on Kinesis. Let's get to the next technology: SQS.
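To make the KCL point concrete: a consumer's IAM policy needs both read permissions on the stream and read/write permissions on the DynamoDB checkpoint table. Here is a minimal sketch of such a policy document built in Python; the ARNs and the exact action list are illustrative assumptions, not an exhaustive set.

```python
import json

def kcl_consumer_policy(stream_arn, table_arn):
    """Build an IAM policy document for a KCL consumer.

    The KCL must read the Kinesis stream *and* read/write the
    DynamoDB table it uses for checkpointing and shard leases.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "kinesis:DescribeStream",
                    "kinesis:GetShardIterator",
                    "kinesis:GetRecords",
                ],
                "Resource": stream_arn,
            },
            {
                "Effect": "Allow",
                "Action": [
                    "dynamodb:CreateTable",
                    "dynamodb:DescribeTable",
                    "dynamodb:GetItem",
                    "dynamodb:PutItem",
                    "dynamodb:UpdateItem",
                    "dynamodb:Scan",
                ],
                "Resource": table_arn,
            },
        ],
    }

# Hypothetical ARNs, for illustration only.
policy = kcl_consumer_policy(
    "arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",
    "arn:aws:dynamodb:us-east-1:123456789012:table/my-app-checkpoints",
)
print(json.dumps(policy, indent=2))
```

If the DynamoDB statement is missing, the KCL fails at startup when it tries to create or read its lease table, which is a common source of confusion.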
SQS gives us encryption in flight using the HTTPS endpoints, so we can transfer data to SQS securely. We also have server-side encryption using KMS, and you must set an IAM policy to allow the usage of SQS. There is also a second level of security we can set using SQS queue access policies, ensuring that our users do have access to the SQS service; this is very similar to, say, an S3 bucket policy. If you want to do client-side encryption, as usual, you must do it manually and do it yourself; this is not something that's supported directly by the service. And there is an interface VPC endpoint provided if you want to access the SQS service privately. All right, next.

Next is AWS IoT. IoT has many different layers of security; I don't know if you remember, but the first one is policies. Basically, we deploy X.509 certificates or Cognito identities onto our devices, and using the IoT policies we can control everything; we're able to revoke any device at any time. The IoT policies are JSON documents, just like IAM policies, and they can be attached to groups instead of individual things, so we can group the things together and just manage group policies instead of thing policies. So that was for thing security, but then we have IAM policies to control the access of users, groups, or roles within the IoT service. So IAM policies get attached to users, groups, or roles, as I said, and you can use them to control access to the IoT APIs at a high level; for example, creating a rule or a rule action, that kind of stuff. And then you attach roles to the rules engine so that it can perform its actions. So if your rules engine action is sending data to Kinesis, you need to attach an IAM role to that rule so it can perform its action and actually send data to Kinesis.
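As a sketch of the "second level of security" mentioned above, an SQS queue access policy is a resource-based JSON policy attached to the queue itself, much like a bucket policy. Here is a minimal example built in Python; the queue ARN and account ID are hypothetical.

```python
import json

def sqs_queue_policy(queue_arn, allowed_account_id):
    """Build a queue access policy letting another account send messages.

    Like an S3 bucket policy, this is resource-based: it lives on the
    queue, not on the caller's IAM identity.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCrossAccountSend",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{allowed_account_id}:root"},
                "Action": "sqs:SendMessage",
                "Resource": queue_arn,
            }
        ],
    }

# Hypothetical ARN and account ID, for illustration only.
policy = sqs_queue_policy(
    "arn:aws:sqs:us-east-1:123456789012:my-queue",
    "999999999999",
)
print(json.dumps(policy, indent=2))
```

You would attach it by passing `json.dumps(policy)` as the `Policy` queue attribute, for example via boto3's `set_queue_attributes`.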
OK, now S3. We've seen the security of S3 so many times, but one more time: we get IAM policies, S3 bucket policies, access control lists, encryption in flight using HTTPS, and encryption at rest of many different kinds. We get server-side encryption: SSE-S3, SSE-KMS, and SSE-C. We get client-side encryption, such as with the Amazon S3 encryption client. Then we get versioning and MFA Delete, basically to make sure the data doesn't get deleted by mistake. We have CORS for protecting websites and making sure only a few websites get access to our S3 buckets. We get a VPC endpoint, provided as a gateway endpoint, basically to access S3 securely from a private subnet. And we have Glacier included in there; there's a bunch of stuff, but notably these things called lock policies, which are very helpful if you want to prevent deletes, for example for regulatory reasons; that's also called a WORM policy: write once, read many.

Then we have DynamoDB. DynamoDB data will be encrypted in transit using TLS, so HTTPS, and DynamoDB can be encrypted at rest using KMS, basically for base tables and secondary indexes, but that's only for new tables. If you have an unencrypted table, you need to create a new table and then copy the data into it; this is not something you can enable in place. You have to migrate the table from unencrypted to encrypted. Encryption cannot be disabled once you enable it, so this is a setting you choose when you create the table. Then you can control access to the tables, the API, or a DynamoDB DAX cluster using IAM policies. DynamoDB Streams currently do not support encryption, but I'm sure they will in the near future. Finally, there is a VPC endpoint provided for DynamoDB as a gateway endpoint, to allow your EC2 instances, or whatever is in your private subnets, to access DynamoDB directly. All right, see you in the next part for more overviews of security in other technologies.
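Since DynamoDB encryption at rest has to be chosen when the table is created, here is a minimal sketch of the `create_table` request parameters with an `SSESpecification`; the table name and key schema are hypothetical placeholders.

```python
def encrypted_table_request(table_name, kms_key_id=None):
    """Build create_table parameters for a KMS-encrypted DynamoDB table.

    Encryption at rest is chosen at creation time; an existing
    unencrypted table must be migrated by copying into a new one.
    """
    params = {
        "TableName": table_name,
        "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
        "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
        "BillingMode": "PAY_PER_REQUEST",
        "SSESpecification": {"Enabled": True, "SSEType": "KMS"},
    }
    if kms_key_id:
        # Omit to use the AWS-managed key for DynamoDB.
        params["SSESpecification"]["KMSMasterKeyId"] = kms_key_id
    return params

# Hypothetical table name, for illustration only.
params = encrypted_table_request("orders")
# These params would be passed to boto3:
#   boto3.client("dynamodb").create_table(**params)
print(params["SSESpecification"])
```

Passing a customer-managed CMK via `kms_key_id` gives you control over key rotation and key policies, at the cost of KMS API charges.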