LevelUp 0x02 - JHaddix, Bug Bounty Hunter Methodology v3

#1

Have a question for JHaddix about his talk at LevelUp 0x02? Post it here!

Bug Bounty Hunter Methodology v3


#2

Hello all! Just started watching the videos and looking through the wealth of resources available here, and I am hooked! I am curious if there are any updates regarding the release of Bugcrowd University as discussed in the video?


#3

Hi @DeLorean! We don't have any news right now… but we will very, very soon :slight_smile: :wink:


#4

Thanks for the response Sam. I am very much looking forward to it!


#5

Hi! I tried to access the reverse.report website, but it was unreachable. Is there a specific way to access that site?
Location: India


#6

Hello,

Is there a list somewhere with all the tools you mentioned in this presentation? Thanks,

D



#8

Hi,

I’ve been doing research about bug bounty hunting: reading the advised books (The Hacker’s Handbook, etc.), reading some write-ups, and watching and analysing the bug bounty methodology videos (thanks to everyone for the help and great info you shared).
After this I wanted to go hunting myself. As advised for starters, I looked for a kudos bounty program with a large scope, searched on Bugcrowd, and picked Netgear.

I started with identifying IPs, subdomain scraping, sub-bruting, and link discovery.
But when using the Burp spider for link discovery, I noticed that it really takes a long time, and the Burp file gets pretty huge because the scope is so large.

I let the spider run overnight and it has currently made 235,611 requests, there are 681,512 requests queued, and my Burp file is already 62.3 GB.
I’ve got the feeling I’m doing something wrong. The size of the file is not really an issue as long as it doesn’t get too big, but I feel this really takes too long.

Does spidering always take so long? Are there some preferred settings I’ve missed? (I’ve increased the number of threads, set my maximum link depth to 4, and disabled the passive scanner.)

Should I maybe first run EyeWitness on the subdomains found by scraping/bruting and only spider the hosts that look interesting to me?
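To make the triage idea above concrete, here is a minimal sketch (not JHaddix's exact workflow, and only a stand-in for an EyeWitness-style check): probe each discovered subdomain for a live web service first, and feed only the responsive hosts to the spider. The hostnames are placeholders, and the probe function is injectable so the filtering logic can be tested without network access.

```python
import socket

def is_live(host, ports=(443, 80), timeout=3):
    """Return True if the host accepts a TCP connection on a common web port."""
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            continue  # port closed, filtered, or host unresolvable
    return False

def filter_live(hosts, probe=is_live):
    """Keep only hosts the probe reports as live; 'probe' is injectable for testing."""
    return [h for h in hosts if probe(h)]

if __name__ == "__main__":
    # Placeholder hostnames, purely illustrative.
    candidates = ["www.example.com", "dead.example.com"]
    print(filter_live(candidates))
```

Spidering only the hosts that survive this filter keeps the Burp project focused on targets that actually serve content, instead of queueing requests for every enumerated name.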

Could someone maybe advise me on this?

Thanks a lot.

Yves
