When It's A Good Idea To Invite An Army Of Hackers To Attack You

Last month, Wired had a disturbing scoop for anyone who has posted an embarrassing revelation on the app Secret: a hacker named Benjamin Caudill had come up with a way to identify Secret's anonymous users. The fear and thrill of learning about the hack was short-lived, though. Readers couldn’t rush to their smartphones and start pulling the digital masks off those whose lips had been loosened by the promise of anonymity. The hole had already been patched. Before Rhino Security Labs’ Caudill went to the press, he had disclosed the vulnerability to Secret through its six-month-old bug bounty program on HackerOne. It was resolved before the Wired story was published.

Since Secret started paying out bug bounties in February, the company has addressed 50 security problems identified by 37 good-hearted, but money-seeking, hackers. Contrast that with Apple. A security researcher sought to bring attention to a flaw in the company’s security around the FindMyiPhone API that Apple seemed to be ignoring: Someone could try as many passwords as they wanted to access a Machead’s FindMyiPhone service until they got the right one. Showing how serious the problem was, researcher Alexey Troshichev posted code to GitHub for a tool he called iBrute, which could be used to crack someone’s iCloud password. According to some reports, the vulnerability and the tool may have played a part in the transfer of many celebrities’ intimate photos from iCloud to the home computers of Reddit and 4chan users. Troshichev told my colleague Thomas Fox-Brewster that he would have disclosed the flaw to Apple were it running a bug bounty program. Apple does invite people to alert it to security issues, but there is no formal bounty program.
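
The underlying weakness was a familiar one: the endpoint apparently accepted an unlimited number of password guesses, with no lockout or throttling to slow an attacker down. As a rough, generic sketch (not Apple's code or its eventual fix; every name and limit below is an illustrative assumption), server-side throttling along these lines is the kind of protection that blunts brute-force tools like iBrute:

```python
# Hypothetical sketch of server-side login throttling -- the kind of check
# whose absence lets a brute-force tool guess passwords indefinitely.
# All names and limits here are illustrative assumptions, not Apple's code.
import time

MAX_ATTEMPTS = 5          # failed guesses allowed per window
WINDOW_SECONDS = 15 * 60  # 15-minute lockout window

_failed_attempts = {}     # account -> timestamps of recent failures


def allow_login_attempt(account):
    """Return False once an account has used up its failure budget."""
    now = time.time()
    recent = [t for t in _failed_attempts.get(account, []) if now - t < WINDOW_SECONDS]
    _failed_attempts[account] = recent
    return len(recent) < MAX_ATTEMPTS


def authenticate(account, password, check_password):
    """check_password stands in for whatever credential check the service already does."""
    if not allow_login_attempt(account):
        return False                         # locked out: don't even evaluate the guess
    if check_password(account, password):
        _failed_attempts.pop(account, None)  # reset the counter on success
        return True
    _failed_attempts.setdefault(account, []).append(time.time())
    return False
```

The point of the sketch is simply that failed guesses are counted and cut off long before an attacker can cycle through a password list.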

“The philosophy of some bug bounty hold-outs is, ‘We don’t negotiate with terrorists,’” says Jake Kouns of the Open Security Foundation. “Why would we incent you to try to attack and break our stuff?”

Whether Apple's having a bounty program would actually mean that a bunch of pervs wouldn’t know what Jennifer Lawrence looks like naked is debatable. But Apple is one of the notable tech hold-outs: it has not instituted a bug bounty program that incentivizes security researchers to find problems in its products and report them. “Bug bounty programs are all the rage at the moment,” says Kouns. “If you’re not doing a bug bounty program, you’re perceived as not really caring about problems.”

Google and Facebook have been running bug bounty programs for years. As of August 2013, Google had paid out $2 million in rewards. Facebook has given out as much as $33,500 as a bounty for a critical bug. Twitter joined the bug bounty train this summer, and has already used it to squash 55 bugs.

Kouns and co-researcher Carsten Eiram of Risk Based Security say there are now 300 documented bug bounty programs in operation, 75 of which involve cash (or Bitcoin) rewards; others, such as Etsy, simply offer a t-shirt and a shout-out. The field has exploded over the last year, but it’s far from a new idea. The first bug bounty was offered up in October 1995 by Netscape. The prizes for those who could find bugs in the browser's beta products? "A nifty Netscape Mozilla mug or a snazzy Netscape polo shirt."

But the security approach is going more mainstream now thanks to a host of start-ups, all founded in 2012 and 2013, that offer specialized platforms for running bug bounty programs, including HackerOne, Bugcrowd, CrowdCurity, and Synack. They act like security matchmakers: hackers sign up as members of their sites and are then invited to find weaknesses in the clients who want to crowdsource searches for security flaws. HackerOne, whose executive DNA includes former managers of Microsoft’s and Facebook’s bug bounty programs, has over 9,000 security researchers on its site and over $9 million in venture funding from Benchmark. Bugcrowd, which was started by former penetration testers from Australia, has over 10,000 hackers and $1.7 million in funding. CrowdCurity, a Danish company that has relocated to San Francisco, has $1.5 million from funders including Tim Draper to focus on making Bitcoin companies more secure. Synack, a newer entrant to the field founded by two former NSA vulnerability finders with $9 million in funding, has several hundred researchers. The philosophy behind the start-ups is the same: you’re more likely to find security and privacy flaws if you throw lots of different people at your code, have an established process for how they inform you about those flaws, and make clear what they will get in return.

Katie Moussouris joined HackerOne this year as its head of policy after kickstarting Microsoft’s in-house bug bounty program last year. Before joining the tech giant, she was a penetration tester for seven years, and so is intimately familiar with the process of trying to tell companies they have problems. “It was always a shot in the dark what kind of reaction you’d get when you told them about a security vulnerability. I had great experiences and unnerving ones, where companies threatened legal action against me,” she said. She describes it as the five stages of vulnerability disclosure emotion: grief, denial, anger, bargaining and finally acceptance. What was historically difficult for security researchers was that they weren’t getting paid to walk companies through that long and difficult process; they were just trying to do a good deed and get a company to fix an issue. That’s why security researchers sometimes just went to the press instead, as when Andrew “weev” Auernheimer went to Gawker with the revelation that AT&T had a flaw that would reveal the email addresses and unique IDs of people with iPads. “We simplify and streamline the communication process. We replace the messy and outdated security inbox,” said Moussouris of the ticketing and bounty pay-out system HackerOne provides. “We make it harder for the process to break down or fail, or for reports to fall through the cracks.”

The different start-ups have slightly different approaches:

  • HackerOne, a free platform that charges companies 20% on the bounties they pay out, differs from the other programs in that it encourages companies to publicize the process. Its website has a running timeline of who has found bugs in which company’s products, and many companies elect to publish how much they paid out for a given bug. HackerOne vets the researchers before paying them (ensuring, for example, that they’re “not in Iran or on a terrorist list”) and takes care of the actual payout. “It’s a headache to deal with how you pay,” says Moussouris. “We send PayPal, bitcoin, Western Union. Hackers in Russia need to be paid in real weird ways sometimes, like a Russian Western Union.”
  • Bugcrowd CEO Casey Ellis, a tall Australian with orange-blonde hair, started out as a pen tester, working on assignment for companies that wanted newly released products and code vetted. “I enjoyed breaking into stuff more than building it,” he said. After hearing that Facebook had launched a bug bounty program, he had a “light bulb” moment that the work he was doing would be better if he had 100 other guys doing it with him. He built a site listing all existing bug bounty programs and put a call out on Twitter for security researchers interested in bounty hunting to sign up at Bugcrowd. Then he went back to his old penetration testing clients and asked them if they wanted to try the new crowdsourced model. Now he’s got 10,000 researchers on his site, all of whom have public profiles and a place on a leaderboard that recognizes their prowess at finding bugs. His start-up offers flex programs, in which a company can ask the army of attackers to come at it for a set amount of time, say, 24 hours or two weeks, as well as a subscription model for ongoing assessments. “This new model is going to fundamentally disrupt companies that do pen testing and vulnerability scans,” says Ellis.
  • Synack was founded by Jay Kaplan and Mark Kuhr, who worked together for four years at the NSA doing offensive hacking for counterterrorism missions. “At the NSA, we saw a common theme across multiple application technology stacks where software was highly susceptible to compromise and appropriate protection mechanisms were not in place," said Kaplan. “Companies are pushing out code all the time but they’re not taking the time to reassess to make sure it’s secure. We saw that bug bounty programs were starting to develop at places like Google and PayPal. We decided to take the model and make it more accessible to the broader world.” Synack charges companies a flat monthly subscription fee. Rather than a bug disclosure service, it thinks of itself as outsourced penetration testing. “We’re not a cheap service,” says Kaplan. Synack, which has raised over $9 million in venture funding, pays researchers based on the severity of the vulnerabilities they find, from $100 for finding outdated software to $5,000 to $10,000 for a database compromise or full access to a server. The company says it has paid out hundreds of thousands of dollars in bounties. It has several hundred researchers whom it has “extensively vetted,” including background checks, skills assessments and, in some cases, Skype interviews. Researchers connect to customers’ products through a Synack VPN so the company knows the probes are coming from people it wants doing the probing.

Wesley Wineberg, 28, is a Vancouver security consultant who joined Synack’s platform in the last year, saying he chose it because the payouts were higher than on other platforms. To sign up, he had to take a test that involved multiple choice questions and long-form essays, evaluate a sample app with 30 vulnerabilities that he had a week to find, and go through a background check. “I fit Synack testing in as my schedule permits,” he said, estimating that he spends 20 hours a week on average looking for vulnerabilities in the products of Synack’s clients on top of his full-time job. “Half of the reason I do it is that I enjoy security work and testing systems. It's my career but it's also my hobby,” said Wineberg. “It’s financially worth my time to continue to do the testing. It could be tough as a full-time job. You only get paid when you find issues. Say you spend a whole weekend testing something and don’t find an issue.”

Another issue for researchers is finding a vulnerability that someone else has already identified. When that happens at Bugcrowd, the site gives the later arrivals to a flaw “karma points,” but those points don’t pay a mortgage.

Moussouris of HackerOne actually warns companies against rushing into bug bounty programs. “If you are an organization not ready to receive reports from the outside, throwing out a lot of cash is not a good idea until you have your house in order,” she said. In other words, if you haven’t already done a thorough vetting of your security before signing up, you’re going to be dropping a lot of cash really fast on problems.

Kouns says the major problem bug bounty programs might face is not being monetarily competitive enough. “When serious money gets on the table, you get the attention of all sorts of researchers,” he says, which is good insofar as bounties attract researchers who help improve our privacy and security by improving the products we use. “But the only thing we continue to hear is that in some cases, it’s not worth it for researchers for the amount of money they pay. Companies are paying $500 to $600 on average as a bounty. If you spend 40-50 hours to find that, it’s not a good payout.”

HackerOne has paid as much as $15,000 for a single bug, including one for the infamous OpenSSL flaw Heartbleed. Bugcrowd has paid $13,500. But those big payouts are rare.

“Before bounty programs got big, there were other markets,” says Kouns. “You could go through a vulnerability broker who would find the highest bidder. There was more money to be made selling to grey markets and three-letter agencies or even the black market, which would buy exploits to assist cybercrime. People reportedly made hundreds of thousands of dollars off a bug. One broker even says he brokered a deal of over a million dollars. If you’re all about making money, going through these [bug bounty] platforms isn’t the way you’ll make the most money. But if you’re doing it on the side, or want to do good in the world, these programs are for you.”

Kouns says the future of bounties is still uncertain. “Security is hard,” says Kouns. “This is just one piece of the puzzle. Having more eyes incented to look at your products definitely leads to fixing bugs, but it doesn’t solve the problem of developing insecure code in the first place."

Though Bugcrowd's Casey Ellis has a suggestion for that: if his army of hackers doesn't find a bug in a company's code, the company should pay a bonus to its developers.