A company has to be mature enough to implement a responsible disclosure policy, or at least its own tailor-made program. Implementing a responsible disclosure policy can demonstrate your security consciousness, yet if you do it wrong, the effects can be detrimental.
In our latest responsible disclosure blog post we examined the topic of bug hunting from an ethical hacker's point of view. Now it is time to look at the other side: the things companies need to consider before letting white-hat hackers test their services.
We need to emphasize that responsible disclosure policies and bug bounty programs are great initiatives if a company is mature enough. They make a company's security work more transparent while encouraging talented researchers to engage with security. At the same time, bug bounty programs can significantly reduce the cost of vulnerability discovery. No system is perfect, however, and companies need to provide a systematic way for researchers to report vulnerabilities.
This systematic way is what we call a vulnerability disclosure policy or VDP.
A VDP has to define a communication process by which ethical hackers can reach the organization and report potential vulnerabilities.
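One standardized way to publish such a contact channel is a security.txt file (RFC 9116), served at /.well-known/security.txt on the organization's domain. A minimal sketch is below; the addresses and URLs are placeholders, not real endpoints:

```text
# Served at https://example.com/.well-known/security.txt
Contact: mailto:security@example.com
Expires: 2026-12-31T23:00:00.000Z
Policy: https://example.com/responsible-disclosure
Preferred-Languages: en
```

The Contact and Expires fields are required by the RFC; the Policy field can point researchers to the full vulnerability disclosure policy described in this post.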
But not all companies are ready for one.
There are a few more elements a good vulnerability disclosure policy must contain. First and foremost, one has to assess whether the company and its software engineering teams are ready to work together with ethical hackers. Maturity and a proper security training culture are key.
Do you have enough resources?
Once you have encouraged third-party participants to test your website or services, you have to process the incoming reports and fix the reported issues in a timely manner. After a vulnerability is reported, finders usually wait for a reply and want to see that your company is taking the issue seriously. Before releasing a vulnerability disclosure policy you need to make sure that there are adequate processes and responsible people for handling incoming issues. There has to be a responsible team member who follows up on issues and prioritizes the reported bugs. They have to identify whether a vulnerability is relevant (or already fixed, not actually an issue, or not reproducible) and have to be able to escalate the important problems to technical experts.
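The triage step above can be sketched in a few lines. This is a hypothetical minimal model, not a real tool: it drops reports already marked as duplicates or non-issues and orders the rest by CVSS base score so the responsible team member sees the most severe problems first.

```python
from dataclasses import dataclass

@dataclass
class Report:
    title: str
    cvss: float          # CVSS base score, 0.0-10.0
    status: str = "new"  # new | duplicate | not-an-issue

def triage(reports):
    """Keep only actionable reports and order them by severity, highest first."""
    actionable = [r for r in reports if r.status == "new"]
    return sorted(actionable, key=lambda r: r.cvss, reverse=True)

queue = triage([
    Report("Reflected XSS in search", 6.1),
    Report("SQL injection in login", 9.8),
    Report("Clickjacking (already fixed)", 4.3, status="duplicate"),
])
print([r.title for r in queue])
# → ['SQL injection in login', 'Reflected XSS in search']
```

In practice the severity rating would come from the security team's own evaluation rather than a single number, but the principle is the same: filter first, then prioritize.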
Yet solving the issue is not enough. After fixing the bug, the contact person has to update the finder and coordinate the publication of the issue. In bigger corporations, an internal action plan details the process described above, including responsibilities, deadlines, and best practices, and the company has to make sure that all the affected colleagues are aware of their roles. The team needs people with good communication and project management skills, as well as the technical expertise to rate and evaluate the reported issues.
Is your team skilled enough?
Security is often lacking in higher education. According to 2016 research by CloudPassage, a software engineering student can graduate from the top 10 US-based universities without ever taking a security course; moreover, three of those universities did not even offer an elective course in the field.
The skillset needed to handle security issues differs from the skillset of a developer, and implementing a responsible vulnerability disclosure policy also requires the team to have the expertise to understand the reports and fix the relevant bugs.
Even if you have a dedicated security team, you have to build a security mindset across the whole company. Security is everybody's job, and training all engineers makes them capable of handling vulnerability reports and releasing fixes for security issues.
We at Avatao are building a platform to educate developers in security. Avatao offers a rich library of hands-on IT security exercises for software engineers, teaching secure programming from design to deployment in a fun and intuitive way. Topics cover web security and secure coding in Java, C/C++, and Python, and also include hot topics like GDPR, payment systems, secure API design, DevSecOps, and more.