Data Engineering with AWS Cookbook

A recipe-based approach to help you tackle data engineering problems with AWS services

Product type: Paperback | Published: Nov 2024 | Publisher: Packt | ISBN-13: 9781805127284 | Length: 528 pages | Edition: 1st
Authors (4): Viquar Khan, Gonzalo Herreros González, Huda Nofal, Trâm Ngọc Phạm
Table of Contents (16)

Preface
Chapter 1: Managing Data Lake Storage
Chapter 2: Sharing Your Data Across Environments and Accounts
Chapter 3: Ingesting and Transforming Your Data with AWS Glue
Chapter 4: A Deep Dive into AWS Orchestration Frameworks
Chapter 5: Running Big Data Workloads with Amazon EMR
Chapter 6: Governing Your Platform
Chapter 7: Data Quality Management
Chapter 8: DevOps – Defining IaC and Building CI/CD Pipelines
Chapter 9: Monitoring Data Lake Cloud Infrastructure
Chapter 10: Building a Serving Layer with AWS Analytics Services
Chapter 11: Migrating to AWS – Steps, Strategies, and Best Practices for Modernizing Your Analytics and Big Data Workloads
Chapter 12: Harnessing the Power of AWS for Seamless Data Warehouse Migration
Chapter 13: Strategizing Hadoop Migrations – Cost, Data, and Workflow Modernization with AWS
Index
Other Books You May Enjoy

Controlling access to S3 buckets

Controlling access to S3 buckets through policies and IAM roles is crucial for maintaining the security and integrity of your objects and data stored in Amazon S3. By defining granular permissions and access controls, you can ensure that only authorized users or services have the necessary privileges to interact with your S3 resources. You can restrict permissions according to your requirements by precisely defining who can access your data, what actions they can take, and under what conditions. This fine-grained access control helps protect sensitive data, prevent unauthorized modifications, and mitigate the risk of accidental or malicious actions.

AWS Identity and Access Management (IAM) allows you to create an entity referred to as an IAM identity, which is granted specific actions on your AWS account. This entity can be a person or an application. You can create this identity as an IAM role, which is designed to be attached to any entity that needs it. Alternatively, you can create IAM users, which represent individual people and are usually used for granting long-term access to specific users. IAM users can be grouped into an IAM group, allowing permissions to be assigned at the group level and inherited by all member users. IAM policies are sets of permissions that can be attached to the IAM identity to grant specific access rights.
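To make these building blocks concrete, here is a minimal sketch (not part of the recipe itself) that uses Python and boto3 to create a group, create a user, and add the user to the group so that it inherits the group's policies; the group and user names are hypothetical placeholders:

import boto3

# Hypothetical names for illustration only
GROUP_NAME = "data-engineers"
USER_NAME = "jane.doe"

iam = boto3.client("iam")

# Create a group that will hold shared permissions
iam.create_group(GroupName=GROUP_NAME)

# Create an individual IAM user
iam.create_user(UserName=USER_NAME)

# Add the user to the group so it inherits any policies attached to the group
iam.add_user_to_group(GroupName=GROUP_NAME, UserName=USER_NAME)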

In this recipe, we will learn how to create a policy that lets us view all the buckets in the account, grants read access to the contents of one specific bucket, and grants write access to one of its folders.

Getting ready

For this recipe, you need to have an IAM user, role, or group to which you want to grant access. You also need to have an S3 bucket with a folder to grant access to.

To learn how to create IAM identities, go to https://docs.aws.amazon.com/IAM/latest/UserGuide/id.html.
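If you still need a test bucket and folder, the following is a minimal boto3 sketch, assuming a hypothetical bucket name and folder prefix (in S3, a folder is simply a key prefix):

import boto3

# Hypothetical names; replace with your own
BUCKET_NAME = "my-data-engineering-bucket"
FOLDER_NAME = "landing-zone"

s3 = boto3.client("s3")

# Create the bucket (outside us-east-1, you must also pass
# CreateBucketConfiguration={"LocationConstraint": "<region>"})
s3.create_bucket(Bucket=BUCKET_NAME)

# "Create" the folder by writing a zero-byte object whose key ends with a slash
s3.put_object(Bucket=BUCKET_NAME, Key=f"{FOLDER_NAME}/")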

How to do it…

  1. Sign in to the AWS Management Console (https://console.aws.amazon.com/console/home?nc2=h_ct&src=header-signin) and navigate to the IAM console.
  2. Choose Policies from the navigation pane on the left and choose Create policy.
  3. Choose the JSON tab to provide the policy in JSON format and replace the existing JSON with this policy:
    {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Sid": "AllowListBuckets",
              "Effect": "Allow",
              "Action": [
                  "s3:ListAllMyBuckets"
              ],
              "Resource": "*"
          },
          {
              "Sid": "AllowBucketListing",
              "Effect": "Allow",
              "Action": [
                  "s3:ListBucket",
                  "s3:GetBucketLocation"
              ],
              "Resource": [
                  "arn:aws:s3:::<bucket-name>"
              ]
          },
          {
              "Sid": "AllowFolderAccess",
              "Effect": "Allow",
              "Action": [
                  "s3:GetObject",
                  "s3:PutObject",
                  "s3:DeleteObject"
              ],
              "Resource": [
                  "arn:aws:s3:::<bucket-name>/<folder-name>/*"
              ]
          }
      ]
    }
  4. Provide a policy name and, optionally, a description of the policy in the respective fields.
  5. Click on Create policy.
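If you prefer to script steps 2 to 5 instead of using the console, the same policy can be created with boto3; this is a minimal sketch, with the bucket name, folder name, and policy name as placeholders:

import json

import boto3

BUCKET_NAME = "<bucket-name>"    # placeholder, as in the JSON above
FOLDER_NAME = "<folder-name>"    # placeholder, as in the JSON above

# Same three statements as the policy shown in step 3
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "AllowListBuckets", "Effect": "Allow",
         "Action": ["s3:ListAllMyBuckets"], "Resource": "*"},
        {"Sid": "AllowBucketListing", "Effect": "Allow",
         "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
         "Resource": [f"arn:aws:s3:::{BUCKET_NAME}"]},
        {"Sid": "AllowFolderAccess", "Effect": "Allow",
         "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
         "Resource": [f"arn:aws:s3:::{BUCKET_NAME}/{FOLDER_NAME}/*"]},
    ],
}

iam = boto3.client("iam")
response = iam.create_policy(
    PolicyName="S3FolderAccessPolicy",          # hypothetical name
    PolicyDocument=json.dumps(policy_document),
    Description="List buckets, read one bucket, write to one folder",
)
policy_arn = response["Policy"]["Arn"]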

Now, you can attach this policy to an IAM role, user, or group. However, exercise caution and ensure access is granted only as necessary; avoid providing admin access policies to regular users.
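For example, a minimal boto3 sketch of attaching the policy might look as follows; the policy ARN, role, user, and group names are hypothetical placeholders, and you would normally attach the policy to only the identity types you actually use:

import boto3

iam = boto3.client("iam")

# ARN returned when the policy was created (e.g., by create_policy above)
policy_arn = "arn:aws:iam::<account-id>:policy/S3FolderAccessPolicy"

# Attach the policy to a role, a user, or a group
iam.attach_role_policy(RoleName="analytics-etl-role", PolicyArn=policy_arn)
iam.attach_user_policy(UserName="jane.doe", PolicyArn=policy_arn)
iam.attach_group_policy(GroupName="data-engineers", PolicyArn=policy_arn)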

How it works…

An IAM policy comprises three key elements:

  • Effect: This specifies whether the policy allows or denies access
  • Action: This details the specific actions being allowed or denied
  • Resource: This identifies the resources to which the actions apply

A single statement can apply multiple actions to multiple resources. In this recipe, we’ve defined three statements:

  • The AllowListBuckets statement gives access to list all buckets in the AWS account
  • The AllowBucketListing statement gives access to list the content of a specific S3 bucket
  • The AllowFolderAccess statement gives access to upload, download, and delete objects in a specific folder (a short sketch of exercising these permissions follows this list)
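The following is a minimal sketch, assuming hypothetical bucket and folder names, of the calls a principal holding this policy should be able to make:

import boto3

BUCKET_NAME = "<bucket-name>"   # placeholder
FOLDER_NAME = "<folder-name>"   # placeholder

s3 = boto3.client("s3")

# AllowListBuckets: list every bucket in the account
print([b["Name"] for b in s3.list_buckets()["Buckets"]])

# AllowBucketListing: list the contents of the specific bucket
s3.list_objects_v2(Bucket=BUCKET_NAME)

# AllowFolderAccess: upload, download, and delete objects in the folder
key = f"{FOLDER_NAME}/test.txt"
s3.put_object(Bucket=BUCKET_NAME, Key=key, Body=b"hello")
s3.get_object(Bucket=BUCKET_NAME, Key=key)
s3.delete_object(Bucket=BUCKET_NAME, Key=key)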

There’s more…

If you want to make sure that no access is granted to a specific bucket or object in your bucket, you can use a deny statement, as shown here (an explicit deny always overrides any allow):

{
    "Sid": "DenyListBucketFolder",
    "Action": [
        "s3:*"
    ],
    "Effect": "Deny",
    "Resource": [
        "arn:aws:s3:::<bucket-name>/<folder-name>/*"
    ]
}

Instead of using an IAM policy to set up permissions for your bucket, you can use S3 bucket policies, which are located on the Permissions tab of the bucket. Bucket policies are useful when you want to set up access at the bucket level, regardless of the IAM role or user making the request.
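As a minimal sketch of that alternative, the following applies a simple bucket policy with boto3; the bucket name and the role ARN used as the principal are hypothetical placeholders:

import json

import boto3

BUCKET_NAME = "<bucket-name>"   # placeholder

# Example bucket policy: allow a specific (hypothetical) role to read objects
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowRoleReadAccess",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::<account-id>:role/analytics-etl-role"},
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{BUCKET_NAME}/*",
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET_NAME, Policy=json.dumps(bucket_policy))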

See also
