
Add option for own wordlist and Dockerfile #2

Open · wants to merge 4 commits into master

Conversation

janmasarik

Here are some small changes I needed in order to work with your tool. :-)

I'll also use this PR to ask two questions:

  1. Why complicate things with subprocesses instead of threads? You aren't doing any heavy CPU lifting, just waiting on IO, which is a perfect use case for threads (e.g. ThreadPoolExecutor; a minimal sketch is below this list).
  2. Why not extend an existing S3 bucket tool? There are too many similarities between DigitalOcean / S3 / GCP buckets to justify a separate tool for each of them.
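
To illustrate the first point, here's a minimal sketch of the thread-based pattern I have in mind; `check_bucket` and `wordlist` are placeholder names I made up, not code from this repo:

```python
# Minimal sketch of an IO-bound scan using a thread pool; check_bucket and
# wordlist are placeholders, not this repository's actual code.
from concurrent.futures import ThreadPoolExecutor

import requests


def check_bucket(name):
    # One blocking HTTP request per candidate name; the thread spends nearly
    # all of its time waiting on the network rather than the CPU.
    r = requests.head("https://storage.googleapis.com/{}".format(name), timeout=10)
    return name, r.status_code


def scan(wordlist, workers=10):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() keeps the calling code shaped like a plain loop, but runs
        # the requests concurrently across the worker threads.
        for name, status in pool.map(check_bucket, wordlist):
            if status != 404:
                print("Possible bucket: {} ({})".format(name, status))
```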

Otherwise, thanks again and have a great day!

@SpenGietz
Contributor

Thanks for your PR! Will get it merged shortly.

  1. I was using subprocesses because, in my original thread-based version, you couldn't Ctrl+C out of a scan to cancel it, and I couldn't find a good solution for that. With subprocesses it's simple to exit early, but it does introduce other complications (a rough sketch of the thread-side issue is below this list).

  2. I chose to do a separate script for GCP because there are some minor differences between the two, and I wanted this to work exactly as I needed for my use case. Not a great reason I guess haha, but it didn't fit well into the S3 enumeration tool that Rhino has released in the past.
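
For context, the interrupt problem looks roughly like this. The sketch below uses placeholder names rather than this tool's actual code: the main thread catches KeyboardInterrupt and cancels the queued futures, but requests already in flight still have to finish before the pool shuts down, which is the kind of complication I ran into.

```python
# Rough sketch of Ctrl+C handling with a thread pool; placeholder names,
# not this tool's actual code.
import concurrent.futures

import requests


def check_bucket(name):
    r = requests.head("https://storage.googleapis.com/{}".format(name), timeout=10)
    return name, r.status_code


def scan(wordlist, workers=10):
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(check_bucket, name) for name in wordlist]
        try:
            for future in concurrent.futures.as_completed(futures):
                name, status = future.result()
                if status != 404:
                    print("Possible bucket: {} ({})".format(name, status))
        except KeyboardInterrupt:
            # Cancel anything that hasn't started yet; threads that are
            # already mid-request cannot be killed and will run to
            # completion before the executor finishes shutting down.
            for future in futures:
                future.cancel()
            raise
```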

@SpenGietz
Contributor

I'm having trouble getting the Docker setup to work. Does the link in the first command need to be https://github.com/RhinoSecurityLabs/GCPBucketBrute/blob/master/Dockerfile instead of just the regular repository link?

@janmasarik
Author

Apologies, I had a typo there, since I'm using my pre-built image s14ve/gcpbucketbrute, just with docker run s14ve/gcpbucketbrute .... The other problem right now is that the Dockerfile isn't in master yet, so docker can't find it and the build fails.

Ideally you'd skip the boring build step entirely and provide a pre-built image. As it's your repo, I didn't want to put my image in the README, but feel free to do so if you think it's okay :-)

Another option is to set up automated builds; it's quite simple and described here: https://docs.docker.com/docker-hub/builds/. In that case you would skip the manual build step on updates completely, as it would be handled automatically by Docker Hub.

@janmasarik
Author

Also, thanks for the answers!

I believe a huge percentage of the code could be shared among S3, Azure, DO Spaces, and GCP bucket enumeration tools. Are you aware of any tool that rules them all (or at least attempts to)?
