How do you avoid duplicates in a Bug Bounty program?

Over the past year I’ve spoken with researchers and heard many different strategies and approaches to avoiding dupes. This topic can often piggyback off of the “How do you approach a target” discussion, but I’m interested in hearing about anything specific or special that you do to avoid the dreaded duplicate.

Some quick things I’ve heard from folks:

  • Avoid the "low-hanging fruit": the stuff people are most likely to find and go after
  • If you have a feel for where other researchers have looked, look elsewhere. Or try to build off of something someone else has found in the past.
  • Go after bounties that other researchers may have overlooked lately
  • Hit a target that was recently updated (new software updates, features, etc)

What else would you add?

Here are some additional tactics that we shared back in 2013 in @caseyjohnellis’s blog post “Three ways to avoid duplicates in bug bounty programs”:

Duplicates are a necessary aspect of bug bounty programs, but they can be a bit of a bummer… Here are some tips on how to avoid dupes which apply to all bounty programs, both Bugcrowd and those out in the wild:

  • Be the first to find the “easy to test for and easy to find” issues (e.g. XSS in an input on the Contact Us page of a web app). This requires planning to be around for the kickoff. Bugcrowd will usually send notifications ahead of time so that you can prepare if you want to.
  • Go after the “easy to test but difficult to find” issues (e.g. XSS in a really obscure part of the application, or on another in-scope system which is hard to find). The more obscure the location the better. The goal here is to look in places the other ninjas probably haven’t looked yet. Bugs like these often come as a result of HTML source review for clues which point to unlinked pages (a quick sketch of this follows the list).
  • Think like a bad guy (e.g. finding an unlinked page that renders tweets for a logged in user, use Twitter to pipe exploit code to POC an XSS, use the XSS to steal the user’s session cookie). This is a different approach to testing… Instead of asking “is this field vulnerable to XYZ bug” for all applicable inputs, think about where you are, what the best objective is (e.g. super-admin with admin portal access), what the controls are between you and that objective, and how best to bypass them.
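
To make the HTML source review tip concrete, here is a minimal sketch in Python; the URL is a placeholder, and anything like this should only ever be run against targets you are authorized to test:

```python
# Minimal sketch: pull a page and surface HTML comments and linked paths
# that might hint at unlinked or obscure areas of an in-scope application.
# The URL below is a placeholder, not a real target.
import re
import urllib.request

def find_clues(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    # HTML comments often leak paths, TODOs, or disabled features
    comments = re.findall(r"<!--(.*?)-->", html, re.DOTALL)
    # href/src values, including ones not reachable from the navigation
    paths = set(re.findall(r'(?:href|src)=["\']([^"\']+)["\']', html))
    return comments, sorted(paths)

comments, paths = find_clues("https://example.com/")
for comment in comments:
    print("comment:", comment.strip()[:80])
for path in paths:
    print("path:", path)
```

Most of what this turns up will be noise, but it’s a cheap first pass before reviewing the source by hand.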

2 Likes

The longer a bounty has been around, the more likely you’re going to hit duplicates. Everyone has a different definition of low-hanging fruit; some may see XSS as low-hanging fruit. Of course, if the low-hanging fruit goes unexamined, you miss the chained exploits built from that “low-hanging fruit”, chains few people look for, which can add up to something that deserves a higher priority. I haven’t run into any dupes with that, but now that I’ve posted it, I most likely will.

I’ve learned some things I want to present in a talk. People say SQL injections are less likely to get you a duplicate, but my research shows that SQL injection is exactly what most people focus on reporting, so you run into duplicates when you concentrate on that very important area.

There are so many important areas to focus on that there is more than enough room to go around; we just need to talk to one another and figure out what others aren’t focusing on.

1 Like

I don’t try to avoid duplicates. I do my testing independently and report everything I find. It is very likely that the XSS in the search box on the main page of the only site in scope has already been reported, but how can I be sure? After 10 minutes, 2 hours, or 2 days, the chances that the vulnerability is a duplicate are of course higher.

Low-hanging fruit got me into the top 3 in May. Grabbing all the low-hanging fruit while looking for high-impact vulnerabilities is a good strategy for me. E.g. if I see a parameter which is fetching a file from the server, such as www.site.com/edit/?file=image.jpg, the first thing I would check for is XSS, but at the same time I would open 2 new tabs for SQLi and path traversal (and others).
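
As a rough sketch of that parallel-probing habit (the endpoint and payloads below are purely illustrative, and this should only run against in-scope targets), something like this fires one quick indicator payload per vulnerability class at the same parameter:

```python
# Illustrative only: probe a file-fetching parameter with one quick
# indicator payload per vulnerability class and summarize the responses.
import requests

BASE = "https://www.site.com/edit/"  # hypothetical in-scope endpoint
probes = {
    "path traversal": "../../../../etc/passwd",
    "xss": '"><script>alert(1)</script>',
    "sqli (error-based)": "image.jpg'",
}

for name, payload in probes.items():
    r = requests.get(BASE, params={"file": payload}, timeout=10)
    print(f"{name}: status={r.status_code} length={len(r.text)}")
    # The numbers alone prove nothing; look for reflected payloads, SQL
    # error strings, or file contents such as "root:" in the body.
```

The status codes and body lengths won’t confirm anything on their own; the point is to queue up several vulnerability classes against the same parameter at once.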

I can have the feeling that other researchers have looked at this or that part of the target, but it is just a feeling, and maybe they didn’t find what I will find, or haven’t reported it yet.

Yes, this is valid for me.

I think Bugcrowd could implement a feature which informs the hunter at the moment of submitting a new vulnerability report:
“A vulnerability [vulnerability type, e.g. SQLi] for [component where the vulnerability is reflected, e.g. www.example.com/category/?id=1] has already been submitted on [date/s] by [hunter’s nickname] [X number of times]. Are you sure you want to report this vulnerability?”

If the hunter decides to report the vulnerability anyway, he should receive 2 points, but the vulnerability should be marked as a duplicate automatically, or displayed as a potential duplicate to Bugcrowd staff, so that it is easier to get rid of some of the duplicates.
I would introduce an impact factor to help Bugcrowd decide what to do with potential duplicates (a rough sketch of this triage logic follows the list below).

  • Low-impact vulnerabilities should be marked as duplicates automatically.
  • Medium-impact vulnerabilities should consider similarity with other reports (vulnerability type, where the vulnerability is reflected, parameters involved) and the number of times very similar reports have been submitted. If the vulnerability has already been submitted e.g. 3-4 times for the same component, I would consider it a duplicate automatically.
  • High-impact vulnerabilities should be marked as potential duplicates and checked by Bugcrowd staff manually.
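
A minimal sketch of that triage logic in Python, with invented report fields and an arbitrary threshold; nothing here reflects how Bugcrowd actually works:

```python
# Invented data model: each report has a vuln_type, component, parameter,
# and impact ("low" / "medium" / "high"). The threshold is arbitrary.
AUTO_DUPE_THRESHOLD = 3  # e.g. 3-4 very similar reports for one component

def triage(new_report, previous_reports):
    similar = [
        p for p in previous_reports
        if p["vuln_type"] == new_report["vuln_type"]
        and p["component"] == new_report["component"]
        and p["parameter"] == new_report["parameter"]
    ]
    if not similar:
        return "new"
    if new_report["impact"] == "low":
        return "auto-duplicate"
    if new_report["impact"] == "medium" and len(similar) >= AUTO_DUPE_THRESHOLD:
        return "auto-duplicate"
    if new_report["impact"] == "high":
        return "potential duplicate: route to staff for manual review"
    return "needs normal triage"
```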

I think this could avoid a great number of duplicates being reported or analysed by the staff.

2 Likes

I’ve updated the original post to include three tips that we published on the Bugcrowd blog back in 2013. Are there any other tactics or strategies that you use to avoid dupes?

The easiest way to try and mitigate this problem is on the vendor side. Reward a bug and push a code update immediately so other researchers won’t find this bug because it’s patched.

Researchers can’t really do much on duplicates, apart from the heuristic approach in the first post.

1 Like

I thought I’d chip in (having seen my umpteenth dupe) :slight_smile:

The reality is that the real advice for avoiding duplicates starts with Bugcrowd recognising that every submission already provides the URL, the parameter, and possibly other info that might be useful for building dupe indicators.

The last thing I want to see, having just received the bad news of a dupe, is tips on how not to find dupes that are quite generic (no offence, Casey!). All bugs are bugs, easy and hard ones alike.

If Bugcrowd used the information it has (and researchers don’t), it could run small queries on submission that look for the parameter, the URL, or both, and give a confidence score regarding dupe likelihood.
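
Something like this, as a toy sketch (the field names and weights are made up; the point is only that a URL/parameter match is cheap to compute at submission time):

```python
# Toy dupe-likelihood score: compare a new submission's URL and parameter
# against existing reports. Weights are arbitrary placeholders.
from urllib.parse import urlparse

def dupe_confidence(new, existing):
    best = 0.0
    new_path = urlparse(new["url"]).path
    for old in existing:
        score = 0.0
        if old["url"] == new["url"]:
            score += 0.5
        elif urlparse(old["url"]).path == new_path:
            score += 0.3  # same path, different query string
        if old.get("parameter") == new.get("parameter"):
            score += 0.5
        best = max(best, score)
    return best  # e.g. >= 0.8 could warn the hunter "likely duplicate"
```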

That would be 10000% more useful than being told “don’t look for easy-to-find issues” or whatnot :slight_smile:

Reducing duplicates is more a Bugcrowd problem than a researcher one; every web app testing methodology tells you to aim for coverage, not to skip the easy places.

Especially if you’ve been invited to private gigs that have been active for 200 days… where to start? What’s the point? The advice is bad advice… I’m only writing this because I don’t like being told to “be the first” when the program is in its Nth week/month. Who does this advice apply to?

The dupes problem should be revisited by Bugcrowd engineers as an opportunity to minimise wasted time for all parties.

No? :slight_smile:

2 Likes

I am learning more and more each day, and have seen reports closed as informative and also as won’t fix. I think all reports closed this way should be publicly disclosed, so as not to waste researchers’ and vendors’ time. I mean, if there is no risk (no vulnerability), why not? Wouldn’t that help prevent duplicates too?

1 Like