
The security industry needs to re-align its training expectations for QA

I've been involved in the security community for over 10 years and have worked for small, medium, and
large companies. I have also worked in Quality Assurance, and I base my comments here on my experience as a QA tester and on conversations with QA testers as an outsider. I've seen articles and conference talks discussing the need for security training for development and, in the last 4-5 years, for quality assurance.

QA understands the business use cases provided to them and ensures that those use cases work (positive testing). Good QA people add negative testing to this mix, typically to generate errors or crash things and confirm the platform is reasonably stable. The majority of QA people aren't interested in becoming security engineers or in gaining a thorough understanding of vulnerabilities such as SQL injection, OS commanding, or HTTP response splitting. You may be lucky at your company and have a few who do care about these details, but as a general rule they are in short supply, and relying on them is rarely sustainable.

Good training programs should use wording that makes sense to QA. For example, most security-related input validation testing would be classified as negative testing. Something as small as terminology can go a long way toward communicating why an organization needs to test for a given issue.
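To make that terminology concrete, here is a minimal sketch of a positive test and a negative test for the same input field, using pytest and requests purely as stand-ins for whatever your QA team already uses. The URL, parameter name, and expected responses are hypothetical placeholders, not a prescribed check.

```python
# A minimal sketch of positive vs. negative tests for one input field. The URL,
# parameter name, and expected responses are hypothetical placeholders; use the
# real values from your own application's business use cases.
import requests

BASE_URL = "https://qa.example.com/profile"  # hypothetical QA environment


def test_display_name_accepts_valid_input():
    """Positive test: the documented business use case works."""
    resp = requests.post(BASE_URL, data={"display_name": "Jane Doe"})
    assert resp.status_code == 200


def test_display_name_handles_hostile_input():
    """Negative test: hostile input neither crashes the app nor comes back
    unencoded. A failure flags a problem for review; it does not prove
    exploitability."""
    payload = "<script>alert('canary')</script>"
    resp = requests.post(BASE_URL, data={"display_name": payload})
    assert resp.status_code < 500          # the platform stays stable
    assert payload not in resp.text        # markup is not echoed back verbatim
```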

Much of the QA-focused security training discussed in the industry involves training QA on the weaknesses/attacks/vulnerabilities specified in a top 10/25 list. While top 25 lists can provide good insight into the issues you may need to be concerned with, I don't think this is always the best approach. A better approach, in my opinion, is to identify the top 10/25/x attacks/weaknesses/vulnerabilities that are likely to affect your own organization and then:

– Identify which issues require a human to find
    – Can this be identified by QA?
    – Can this only be identified by development?
– Identify which ones can be tested in a repeatable automated fashion (a sketch follows this list)
    – Using existing QA tools
    – Using security tools in the QA department
    – Identify which vulns are automatically identifiable in a reliable fashion based on the tools available to you, after a proper tool evaluation.
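As a sketch of the "repeatable automated fashion" item above, here is one way the same negative checks could be parameterized and re-run on every build, again using pytest and requests as stand-ins for your existing QA tooling. The endpoint, parameter names, and payloads are hypothetical examples, not a complete attack list.

```python
# A sketch of a repeatable, automated negative-testing sweep using existing QA
# tooling (pytest + requests as stand-ins). The endpoint, parameter names, and
# payloads are hypothetical examples, not a complete attack list.
import pytest
import requests

BASE_URL = "https://qa.example.com/search"   # hypothetical QA environment

CANARY_PAYLOADS = [
    "<script>alert('canary')</script>",      # reflected XSS canary
    "' OR '1'='1",                           # SQL injection canary
    "; id",                                  # OS commanding canary
]

PARAMETERS = ["q", "sort", "page"]           # hypothetical input parameters


@pytest.mark.parametrize("param", PARAMETERS)
@pytest.mark.parametrize("payload", CANARY_PAYLOADS)
def test_hostile_input_does_not_crash_or_reflect(param, payload):
    """Runs the same checks on every build; a failure is a flag for review,
    not a confirmed vulnerability."""
    resp = requests.get(BASE_URL, params={param: payload})
    # Stability: hostile input should never cause a server-side error.
    assert resp.status_code < 500
    # Reflection: script markup should never come back unencoded.
    if "<script>" in payload:
        assert payload not in resp.text
```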

For manual issues, having good test plan templates (when possible) for certain classes of flaws can go a long way. QA testers speak test plans, and test plans cover steps and expected behaviors. I am a firm believer that many of the vulnerabilities in software that pen testers gloat about finding are actually rather trivial and can be taught to anyone. Writing sample test cases for identifying OS commanding, Reflected XSS, or XML Injection is achievable. Keep in mind that 'average' QA testers won't necessarily be able to exploit a flaw, but given the right instruction they can flag something as needing further review.
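As one illustration of such a template, here is a hypothetical test case for OS commanding written in the same steps-and-expected-behavior style QA already works in. The endpoint, field name, payload, and timing threshold are all made up for the example; the key point is that the tester only flags the result for review rather than trying to exploit it.

```python
# A hypothetical test-plan-style case for OS commanding, written so an 'average'
# QA tester can follow the steps and flag the result for review rather than
# exploit it. The endpoint, field name, payload, and threshold are made up.
import time
import requests

PING_URL = "https://qa.example.com/tools/ping"   # hypothetical feature under test


def test_host_field_flags_possible_os_commanding():
    """Test plan:
    1. Submit a normal hostname and note the response time (baseline).
    2. Submit the same hostname with an appended shell sleep command.
    3. Expected behavior: the second response takes about as long as the first
       and the input is rejected or handled safely.
    4. If the second request is noticeably slower, stop and flag the result
       for review by your security contact.
    """
    start = time.monotonic()
    requests.post(PING_URL, data={"host": "qa-host-01"})
    baseline = time.monotonic() - start

    start = time.monotonic()
    requests.post(PING_URL, data={"host": "qa-host-01; sleep 10"})
    delayed = time.monotonic() - start

    # A big timing difference is a signal, not proof; it only needs review.
    assert delayed < baseline + 5, "possible OS commanding - flag for security review"
```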

Last but not least, inform QA who in your organization is available to answer questions about these security tests.

Comments welcome 🙂