Cloud Walkthrough Part 1

Alparslan Akyıldız academy
May 2, 2023

Recently I have started learning and practicing AWS pentesting. I have already solved a challenge called flaws.cloud, and I am sharing the solution to the first part of the challenge.

What is an S3 bucket?

An Amazon S3 bucket is a public cloud storage resource available in the Amazon Web Services (AWS) Simple Storage Service (S3) platform. It provides object-based storage, where data is stored inside S3 buckets in distinct units called objects instead of files.

On the first screen we can see the starting point of the challenge. We need to enumerate the domain and IP address to understand what is running in the cloud behind this domain. First I used the host and dig commands.
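A rough sketch of that enumeration (the IP address below is just a placeholder for one of the addresses returned):

host flaws.cloud
dig +short flaws.cloud
# a reverse lookup of one of the returned IPs reveals the S3 website endpoint
host <returned-ip-address>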

As you can see, the site is served from an S3 bucket on the cloud side. When we combine the domain with the S3 website endpoint, we can successfully send an HTTP request via curl. After that, I tried to list the bucket contents with:

aws s3 ls s3://flaws.cloud --no-sign-request

I saw the secret-dd02c7c.html file, which looks interesting. It is worth copying it down to retrieve its contents.

aws s3 cp s3://domain/file-path . --no-sign-request

When we open the secret file, we get a flag like this:

second level flag

When I jump to level 2, I see a message like a warning, and the author of the CTF gives an example incident about it. He says that by default S3 buckets are private, but if you turn on static website hosting and grant everyone the “s3:GetObject” privilege, the content becomes public and stays exposed unless you change those permissions.
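For illustration, this is roughly the kind of bucket policy that grants “s3:GetObject” to everyone and makes the content world-readable (the bucket name is a placeholder, not the challenge bucket's actual policy):

aws s3api put-bucket-policy --bucket <bucket-name> --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "PublicReadGetObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::<bucket-name>/*"
  }]
}'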

So I tried to get the contents of the S3 bucket.

aws s3 ls s3://domain/file-path --profile profile-name
The aws s3 sync command syncs directories and S3 prefixes: it recursively copies new and updated files from the source directory to the destination, and only creates folders in the destination if they contain one or more files.
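For example, pulling the level 2 bucket down to a local folder might look like this (the bucket name and profile name are placeholders):

aws s3 sync s3://<level2-bucket> ./level2 --profile <profile-name>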

I checked robots.txt and the secret file to find the next flag. I followed the new link located in the new secret file and tried to list the new S3 bucket's contents with this command:

aws s3 ls s3://level3-bucket-link

It seems a .git directory exists under the bucket. That's great, because if we can download the .git/ directory we can reach the application's source code, and we may even find access key IDs, secret access keys, Lambda function source code and more… I also realized that, similar to granting permissions to “Everyone”, permissions can be set to “Any Authenticated AWS User”, which leaves the S3 bucket just as exposed. So I created a free-tier Amazon account, created a new user, and configured my AWS keys with the aws configure command. Now any user authenticated with any Amazon account can list or download the bucket's contents because of this misconfiguration. Let's go;
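A minimal sketch of that setup, assuming “flaws” as an arbitrary profile name and a placeholder for the level 3 bucket address:

aws configure --profile flaws
# enter the new user's access key ID, secret access key, default region and output format
aws s3 ls s3://<level3-bucket> --profile flaws
# pull everything down locally, including the exposed .git directory
aws s3 sync s3://<level3-bucket> ./level3 --profile flaws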

After downloading the Git repository of the application located under the S3 bucket, I examined the git log and the changes between revisions;
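The inspection itself is plain git; something like the following (commit hashes are placeholders):

cd level3
git log --oneline                     # list the commits in the repository
git show <commit-hash>                # inspect what a suspicious commit changed
git diff <old-commit> <new-commit>    # compare two revisions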

I installed the tig tool to make examining the Git history and source code easier;
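On a Debian-based system the install and usage are simply (assuming the repository was synced into ./level3):

sudo apt install tig
cd level3
tig    # browse commits and diffs interactively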

Tig detected some differences between revisions of the hint1.html file.

It's cool, we have an access key ID and a secret access key. Let's use them;
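I configured the leaked credentials as another profile; a sketch, with “flaws-level3” as an arbitrary profile name:

aws configure --profile flaws-level3
# paste the access key ID and secret access key recovered from the git history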

We can now list the account's buckets.
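Listing the buckets visible to the leaked credentials (profile name as above):

aws s3 ls --profile flaws-level3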

When we use the new flag, we level up to stage 4 of the challenge;

Thank you for reading.
