Public cloud storage is full of crazy stuff. You can find anything from privacy-relevant data up to complete backups and keys for all kinds of services, including keys for AWS, Google Cloud and Azure.

Some of the keys or tokens would have allowed a complete organization takeover! As a lot of companies still do not set a cost limit at their cloud provider, an attacker can cause a lot of damage.

To identify most of these findings, we are almost exclusively using

Exposed sensitive data everywhere


This is not a new issue and has been around for a long time. I am providing most of the search links and only redact the most critical parts, as the typical reader here could restore them within seconds anyway, and out of respect for our time the links are included.

Reporting this stuff is very difficult, as in most cases the owner is not clear. And breaching an AWS org just to get the billing address might be a little bit out of scope.

I came up with the idea of abusing the GitHub secret scanner to at least invalidate leaked API keys by simply uploading them to GitHub. According to this blogpost, AWS will attach a quarantine policy to the leaked user. However, this still does not block access to all resources immediately, and therefore it is not sufficient for reporting leaked keys, as I would just be collecting them for others…
If anybody knows a way to securely report leaked API Keys, please let me know
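To make the idea concrete, here is a minimal sketch of the "report by upload" approach. The repo name and remote are hypothetical, and the key used is AWS's documented example key, not a real find:

```shell
# Sketch: commit a found key to a public GitHub repo so secret scanning
# notifies the provider (AWS then attaches a quarantine policy to the user).
# "leak-report" and the remote below are hypothetical.
mkdir -p leak-report && cd leak-report
git init -q .
# AKIAIOSFODNN7EXAMPLE is AWS's documented example access key id
echo 'AKIAIOSFODNN7EXAMPLE' > found-key.txt
git add found-key.txt
git -c user.name=reporter -c user.email=reporter@example.com \
    commit -q -m 'upload leaked key to trigger secret scanning'
# git remote add origin git@github.com:reporter/leak-report.git  # hypothetical
# git push -u origin main   # scanning only acts on public repositories
```

Note that scanning is only triggered once the commit lands in a public repository, which is exactly the "collecting them for others" problem mentioned above.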

The following chapters are a sample gallery of what can be found, with a brief explanation of why it shouldn't be there. Keep in mind that this is always only a small fraction of what is out there.

It is important to mention that not everything which can be found is still valid; some findings might also be honeycreds or otherwise heavily monitored.

Private Stuff

Let’s start with the privacy relevant stuff.

responsible handling of data

Outlook accounts and emails (pst, eml, msg)

PST files can be imported into Outlook and may contain complete email accounts.

Outlook Accounts

Outlook Accounts imported

The same goes for msg and eml files, which are single emails. Why would you back up a single email? Because it is important…


Personalausweise (German / Austrian ID Cards)

Storing passports and ID documents is illegal in several countries.

German ID Page front and back

Austrian ID


Passport #1

Passport #2

Certificate of Birth

Indian birth certificate

Puerto Rico birth certificate


Some US Social Security Numbers…

SSN #1

SSN #2


Medical data and prescriptions

It seems that quite a big bunch of company data is available, including video call recordings, prescriptions, medical records, …

Medical data

Bills and customer data

Of course, bills and customer data are also directly available. This is bad on its own, but it gets worse when it comes to database dumps.

Customer data

Policies

Insurance policies

Licenses and stuff

There are also a lot of license files and keys available, for example a Windows Server 2016 key from an unattended.xml.

unattended.xml with some passwords and a Windows Server key

Key is valid

Another one, with additional domain join credentials


When it comes to credentials, it is just a matter of finding juicy files. It is possible to get some inspiration from tools like Snaffler, which does an excellent job of finding secrets on-prem.

Default file names with credentials

AWS - credential.csv

If a new IAM user is added in AWS, a credentials.csv file will be generated with the credentials.

CSV file as built when adding an IAM user in AWS
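The layout of such a credentials.csv is trivial to parse. A self-contained sketch (column names as the AWS console generated them at the time of writing; user and keys are AWS's documented example values):

```shell
# Recreate the shape of an AWS-generated credentials.csv (example values only)
cat > credentials.csv <<'EOF'
User name,Access key ID,Secret access key
example-user,AKIAIOSFODNN7EXAMPLE,wJalrXUtnFEMI/K7MDENG/bPxRcfiCYEXAMPLEKEY
EOF
# Pull the key pair out, ready for `aws configure`
awk -F, 'NR==2 {print "AccessKeyId:", $2; print "SecretKey:  ", $3}' credentials.csv
```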


AWS credentials

A quick look by my friend and personal AWS magician @rootcathacking, as I have no clue about AWS, revealed that the IAM roles are wrongly configured and the complete organization could get compromised.
If you want some deeper random AWS knowledge, visit his blog:

Personal AWS Magician

Of course, they are working, and a complete organization could get compromised

AWS Keys

AWS credentials

But those credentials are surely invalid, right? Sure…

Bucket listing

Another set
Just another set

AWS SMTP Credentials


Extension for mounting an S3 Bucket

Azure Token

I also found a bunch of Azure service accounts, e.g. for the blob service. To verify those credentials, we can use the Azure CLI and query them like this.

az storage share list --account-name "frowt######" --account-key "YfCL+0LjmmtdJ92BMZJHpTbeO+BB0n0tIdy+bvOE5xiJuMUG2ItHFyNou1ehl75u4p######################"

az storage file list --share-name "assets-dev" --account-name "frowt######" --account-key "YfCL+0LjmmtdJ92BMZJHpTbeO+BB0n0tIdy+bvOE5xiJuMUG2ItHFyNou1ehl75u4p######################"

Valid Blob credentials for Azure

Google Cloud

We can, for example, search for gha-creds.json files. Those files hold Google Cloud service account tokens.

Google cloud credentials
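The exact contents vary, but a minimal sketch of what such a service account file looks like (field names assumed from the standard GCP service-account key layout; real files additionally carry key ids and token URIs):

```shell
# Hypothetical minimal gha-creds.json (GCP service-account key layout)
cat > gha-creds.json <<'EOF'
{
  "type": "service_account",
  "project_id": "example-project",
  "client_email": "deploy@example-project.iam.gserviceaccount.com",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
}
EOF
# The client_email alone already reveals the project and the account
jq -r '.client_email' gha-creds.json
```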

The gcp_scanner is really handy if you want to enumerate a token and its access.
And of course we again get some valid tokens…

Valid credentials and a report from the gcp scanner

CSCFG - Azure Cloud Services (classic) Config Schema

Those cscfg files hold a crazy amount of different credentials. Most, but not all, are related to Azure resources.

cscfg file sample #1

cscfg file sample #2


Over 30k hits for .pfx and .p12 certificates. Why those two types? In contrast to .der, .pem and .key files, we know the purpose and usually a URL for those certificates. Getting a private certificate or key is nice, but quite useless if we do not know where to use it…

30k certificates

Generally, it is possible to try some passwords against the certificates, and there might be some interesting ones.
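Trying a candidate password against a .p12 is a one-liner with openssl. The sketch below builds a throwaway certificate first, so it is self-contained and does not need a real find:

```shell
# Build a throwaway .p12 protected with the password "test" ...
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
  -days 1 -nodes -subj "/CN=demo" 2>/dev/null
openssl pkcs12 -export -inkey key.pem -in cert.pem -out demo.p12 -passout pass:test
# ... then check a candidate password and dump the subject on success;
# a wrong password makes openssl exit non-zero, so this loops well over wordlists
openssl pkcs12 -in demo.p12 -passin pass:test -nokeys 2>/dev/null \
  | openssl x509 -noout -subject
```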

Most of the certificates with a weak password are for client authentication and secure email, e.g. S/MIME.

We can easily get over 200 valid(!) certificates, like this one.

Typical securemail and identity certificate

CA chain

There are some other quite interesting certificates, e.g. Code Signing for Apple Developers.

kligo-cert.p12         PW: #####
Code Signing (
C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Developer ID Certification Authority
PS > $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
PS > $cert.Import("$pwd\kligo-cert.p12",'#####','DefaultKeySet') 
PS > $cert.ToString()
  C=US, O=Medeo, OU=5TJZ8J82S5, CN=Developer ID Application: Medeo (5TJZ8J82S5), OID.0.9.2342.19200300.100.1.1=5TJZ8J82S5
  C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Developer ID Certification Authority
[Serial Number]
[Not Before]
  04/07/2023 14:36:04
[Not After]
  01/02/2027 23:12:15

Code Signing certificate

Remote Tools


WinSCP search

WinSCP config files are well known to be insecurely encrypted, meaning it is possible to simply recover the passwords.

WinSCP sample

Another WinSCP sample


<Pass encoding="base64"> I guess you can see where this is going

FileZilla sample

FileZilla sample #2
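Decoding such a `<Pass>` element takes a single pipe, since base64 is an encoding, not encryption. The XML below is a hypothetical minimal sitemanager.xml fragment:

```shell
# Hypothetical fragment of a FileZilla sitemanager.xml
cat > sitemanager.xml <<'EOF'
<Server>
  <Host>sftp.example.com</Host>
  <User>deploy</User>
  <Pass encoding="base64">aHVudGVyMg==</Pass>
</Server>
EOF
# Extract the base64 blob and decode it -> prints: hunter2
sed -n 's/.*<Pass encoding="base64">\([^<]*\)<.*/\1/p' sitemanager.xml | base64 -d
```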


The Sublime config does sometimes hold SFTP credentials.

Sublime sample #1

Sublime sample #2

Password Manager

Some people invent their own file extensions, like .pass or .creds, and then store their credentials inside in cleartext. Additionally, there are some KeePass databases to be found; however, these would require cracking the master password, which is no fun as it is really slow.

Sample of a self-made password “safe”


OpenVPN config files are interesting, as some of them do not require user/password authentication. This allows anyone with the file to connect to the VPN…

Sample ovpn config without user/password authentication
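A quick way to triage a pile of downloaded .ovpn files is to check for the auth-user-pass directive. The config below is a hypothetical, stripped-down cert-only sample:

```shell
# Hypothetical .ovpn that authenticates with the embedded key material only
cat > sample.ovpn <<'EOF'
client
remote vpn.example.com 1194
<ca>...</ca>
<cert>...</cert>
<key>...</key>
EOF
# No auth-user-pass line means the file alone is enough to connect
grep -q '^auth-user-pass' sample.ovpn && echo "needs user/pass" || echo "cert-only"
```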


Some companies also store their ticket data in these buckets, and there might be juicy information inside.

Credentials for an ISP Portal #1

Credentials for an ISP Portal #2

Software configurations


GitLab config files might also be juicy, as there is typically an SMTP server configured.

Gitlab config


Not really a surprise: PHP config files are crazy, and a lot of credentials can be found.

PHP Config #1

PHP Config #2


There are also complete backups publicly available, sometimes several terabytes in size. Of course, they might still be encrypted, but yeah, let's keep up the hope.

Can’t lose the backup if it’s publicly available

Virtual disks

Virtual disks found

Virtual disks can easily be opened with 7-Zip.

Opened virtual disk

Browsing the backup

MailStore is a solution for email archiving, so this might not be good.

IMG files

The same goes for IMG files; they often also serve as backups.

IMG files

Veeam & Acronis Backups

Veeam and Acronis backups are normally password protected. But if the password can be cracked, they will mostly be juicy.

Forensic Images

Forensic images are full disk images, used during incident response. With the correct tools they can be opened, for example the forensic plugins for 7-Zip.

Archive files

A lot of data available


SQL backups are crazy, as they might leak complete application data, including user credentials and even payment data. And there are a lot of files…

BAK files typically serve as MSSQL backups

MySQL Dumps

MySQL dumps also contain the complete data.

MySQL dumps as backup

BSON files

Binary JSON files can also be used as backups, and there are some with quite interesting data.

BSON file with tokens for Firebase


Also no surprise: we can find a lot of the typical credentials in script files like vbs, cmd, bat, ps1 and sh. The difficult part here is finding a file name which might indicate credentials, as we cannot grep through the files without downloading them.
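Once a batch of scripts has been downloaded, simple patterns already catch a lot. A sketch against a fabricated sample script (the key is AWS's documented example key):

```shell
# Fabricated sample script with a hardcoded key (AWS's documented example key)
cat > deploy.sh <<'EOF'
#!/bin/sh
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
mysql -u root -phunter2 appdb < seed.sql
EOF
# AKIA followed by 16 uppercase alphanumerics is the AWS access key id format
grep -En 'AKIA[0-9A-Z]{16}' deploy.sh
```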

Bash script with credentials

Bash script with credentials

Bash script with AWS credentials

Powershell script with credentials

Excursus 1: Scraping data easily with Excel

Excel nowadays offers a nice feature for quickly gathering data from a web service. Let's say we want to perform some checks on S3 buckets ourselves and therefore want a list.

List of buckets

We can now easily import this data into Excel with a query.

Import Data from web

Select the data you want


But wait, now we only have the first page of data …

Adjust the Query

    let
        // Define a function to get data for a specific page
        GetDataForPage = (page) =>
            let
                Source = Web.Page(Web.Contents("" & Text.From(page))),
                Data0 = Source{0}[Data],
                #"Changed Type" = Table.TransformColumnTypes(Data0,{{"#", Int64.Type}, {"Bucket", type text}, {"Files", Int64.Type}, {"Container", type text}})
            in
                #"Changed Type",

        // Use List.Generate to create the page numbers 1 to 50
        Pages = List.Generate(() => 1, each _ <= 50, each _ + 1),
        // Use Table.Combine to combine the data for all pages
        CombinedData = Table.Combine(List.Transform(Pages, GetDataForPage))
    in
        CombinedData

Now we crawl 50 pages or 1000 rows.

1000 rows within 50 requests crawled

This is not a rate-limit bypass, meaning if you run into that issue, you will need to tinker with this a little more, maybe using catspin or fireprox!

Excursus 2: AWS key validation

Check if the credentials are still valid.

root@ ~ [1]# aws configure --profile tmp
AWS Access Key ID [None]: AKIAIUH2AI65LLS3Y###
AWS Secret Access Key [None]: rINcfgU63NBUFaFJFP3kO9cR4sVX############
Default region name [None]: ap-south-1
Default output format [None]:
root@ ~# aws sts get-caller-identity --profile tmp
{
    "Account": "140444460056",
    "Arn": "arn:aws:iam::140444460056:user/chetan.awate"
}
root@ ~# aws s3 ls --profile tmp
2022-12-14 12:54:34 63moons-map-migrated
2022-12-14 12:55:14 a-presto-test-bucket-1
2022-12-14 12:55:46 amplify-myamplifytest-dev-163751-deployment
2022-12-14 12:56:15 astha-wfh-ec2
2022-12-14 12:56:45 astha-wfh-ec2-prod-serverlessdeploymentbucket-crtvtwylvb7o
2022-12-14 15:52:07 wave2-binaries
2022-12-14 15:52:10 zohobucket1
root@ ~# aws ec2 describe-instances --profile tmp
{
    "Reservations": [
        {
            "Groups": [],
            "Instances": [
                {
                    "AmiLaunchIndex": 0,
                    "ImageId": "ami-7d5f2412",
                    "InstanceId": "i-0280648b8540ccbc0",
                    "InstanceType": "t2.medium",
                    "KeyName": "asterisk_server",
                    "LaunchTime": "2022-02-03T05:53:46.000Z",
                    "Monitoring": {
                        "State": "disabled"
                    },
                    "Placement": {
                        "AvailabilityZone": "ap-south-1a",
                        "GroupName": "",
                        "Tenancy": "default"
                    },
                    "PrivateDnsName": "ip-172-31-23-174.ap-south-1.compute.internal",
                    "PrivateIpAddress": "",

Excursus 3: Download snippet

During my search it was quite handy to simply download a batch of files. A command for this might look like the following. And yeah, there is quite a lot of room for improvement, but hey, it works!

curl --request GET \
  --url '' \
  --header 'Authorization: Bearer ###########' \
  | jq | grep "url" | cut -d ":" -f 2- \
  | sed 's/,//g' | xargs -i wget --no-check-certificate {}
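One of the obvious improvements is to let jq extract the URLs directly instead of the grep/cut/sed surgery. The response shape below is an assumption (a JSON document with objects carrying a "url" field), so adjust the filter to the real API:

```shell
# Assumed response shape: a list of objects, each with a "url" field
cat > response.json <<'EOF'
{"files":[{"url":"https://example.com/a.txt"},{"url":"https://example.com/b.txt"}]}
EOF
# -r prints raw strings, one clean URL per line
jq -r '.files[].url' response.json
# | xargs -n1 wget --no-check-certificate   # then hand the clean list to wget
```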