“Open and collaborative knowledge: that is the OWASP way.”
The OWASP testing guide is one of the most commonly used standards for web application penetration testing and for testing software throughout the development life cycle. The testing framework was created to help people understand how, where, when, and why to test web applications. While other frameworks focus on creating an exhaustive checklist of tasks, OWASP focuses on creating a framework that testers can use when developing their own programs or methodologies.
The Testing Guide is broken up into distinct phases. Within Dradis, each testing phase is given a section in our methodology template with the individual tasks needed to complete each section. Below is an overview of each phase of testing. For more details, visit the OWASP website.
During the information gathering phase, the tester gets a high-level view of the server and the application, and gathers information for the later phases of the test. The tests in the information gathering phase should allow the tester to collect basic data like the type and version of the running web server, figure out which web applications are hosted on the server and the frameworks used for those applications, understand typical requests/responses and map the application, and check search engines, robots.txt files, folder paths, comments, metadata, and more for information leakage.
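As a small illustration of the robots.txt portion of this phase, the standard library can parse a robots.txt file and reveal which paths a site asks crawlers to avoid; those "hidden" paths are often the most interesting to a tester. The file content below is made up for illustration.

```python
from urllib import robotparser

# Hypothetical robots.txt content, as might be retrieved during
# information gathering (paths are illustrative, not from a real site).
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /backup/
Allow: /public/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallowed paths can hint at admin panels or forgotten files.
for path in ("/admin/", "/backup/", "/public/"):
    allowed = rp.can_fetch("*", path)
    print(path, "allowed" if allowed else "disallowed")
```

In a real engagement the tester would fetch `/robots.txt` from the target and feed the response body to the same parser.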
This phase builds on the information gathered previously to start digging deeper. The tester has already mapped out the application, now they dig into how the infrastructure identified impacts the application security (e.g. testing for known vulnerabilities).
Next, the focus switches back to the server, looking at and testing aspects like the platform configuration and architecture, then testing how the server handles different file extensions, and finally checking "forgotten" files for important data. The tester also looks for administrator interfaces in the server or the web application that can be exploited.
Finally, the tester puts their focus back on the web application itself by testing to see what HTTP methods are supported by the web server, testing whether HSTS header is present, and testing for cross-site or cross-domain policies that they can exploit.
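A sketch of the HSTS portion of that check, assuming the response headers have already been captured into a dict (the header values below are illustrative):

```python
def check_hsts(headers: dict) -> list:
    """Return findings about the Strict-Transport-Security header."""
    findings = []
    hsts = headers.get("Strict-Transport-Security")
    if hsts is None:
        findings.append("HSTS header missing")
        return findings
    directives = [d.strip().lower() for d in hsts.split(";")]
    max_age = next((d for d in directives if d.startswith("max-age=")), None)
    if max_age is None:
        findings.append("HSTS max-age directive missing")
    elif int(max_age.split("=", 1)[1]) < 31536000:  # less than one year
        findings.append("HSTS max-age shorter than one year")
    if "includesubdomains" not in directives:
        findings.append("HSTS does not cover subdomains")
    return findings

print(check_hsts({"Strict-Transport-Security": "max-age=31536000; includeSubDomains"}))
print(check_hsts({}))  # header missing entirely
```

The thresholds here (one year, subdomain coverage) are common recommendations, not a fixed OWASP requirement.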
This section deals with accounts, privileges, and access. The tester spends most of their time during this phase on the login page, working to understand how the application allows users to sign up and whether this system can be exploited if part of the login information (like the username) is already known.
During configuration and deployment management testing, the tester looked for administrator interfaces. During identity management testing, all possible application roles (user, administrator, author, etc.) are reviewed to understand what access or privileges come with each role.
Next, the tester checks the requirements and the process to create an account and how accounts are deleted. Finally, the tester digs into the system to prepare for future tests by checking whether error messages give clues about existing usernames and trying to find username patterns to help them find those existing usernames and accounts.
Identity management testing is all about understanding the user accounts, usernames, and roles. During authentication testing, however, the tester is almost completely focused on passwords.
The tester spends time looking at default and simple username/password combinations. How many systems can you log in to using the username/password combination of administrator/password123 or something similarly simple? You'd be surprised. Because of this, the tester also checks password strength rules during this phase of testing, because without rules that force complexity, the average user will default to passwords like "password" and "qwerty".
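A minimal sketch of the kind of complexity policy a tester checks for. The exact rules (length, character classes, banned words) are assumptions for illustration; real policies vary.

```python
import re

# A small deny-list of the most common weak passwords (illustrative).
COMMON_PASSWORDS = {"password", "password123", "qwerty", "123456"}

def is_strong(password: str) -> bool:
    """Sample policy: 12+ chars, mixed case, a digit, a symbol,
    and not on the common-password list."""
    if password.lower() in COMMON_PASSWORDS:
        return False
    return (
        len(password) >= 12
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(is_strong("password123"))       # weak: on the deny list
print(is_strong("Correct-Horse-42!")) # passes this sample policy
```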
Test lockout mechanisms. Strong security measures should include a lockout measure so that multiple incorrect login attempts kick the user out and prevent them from trying to log in again for a period of time. This measure prevents a brute-force attack where an attacker bombards the application with password guesses until they guess the correct password and gain access.
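The lockout behavior described above can be sketched like this. The class name, thresholds, and in-memory storage are all illustrative assumptions, not a specific product's implementation.

```python
import time

MAX_ATTEMPTS = 5        # consecutive failures before lockout (assumed)
LOCKOUT = 15 * 60       # lockout duration in seconds (assumed)

class LockoutTracker:
    """Track failed logins and lock accounts after repeated failures."""

    def __init__(self):
        self.failures = {}      # username -> consecutive failure count
        self.locked_until = {}  # username -> unlock timestamp

    def is_locked(self, user, now=None):
        now = time.time() if now is None else now
        return self.locked_until.get(user, 0) > now

    def record_failure(self, user, now=None):
        now = time.time() if now is None else now
        self.failures[user] = self.failures.get(user, 0) + 1
        if self.failures[user] >= MAX_ATTEMPTS:
            self.locked_until[user] = now + LOCKOUT

    def record_success(self, user):
        # A successful login resets the failure counter.
        self.failures.pop(user, None)
```

A tester probes exactly this logic: does the counter reset too eagerly, does the lockout apply per account, and can it be bypassed via an alternative login channel?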
Most applications have security questions to help verify your identity in case you need to reset your password or if you log in from a new system. These questions can be an important security measure but if the answers are easily guessable (e.g. "what's your favorite color?"), they aren't going to be particularly effective. The tester looks at the strength of the existing questions to see whether they can be exploited to give an attacker access.
The tester also looks at more technical aspects, like whether a user's login data is transmitted via an encrypted channel or in non-secure clear text. Or, whether it is possible to bypass the login process altogether. They check whether the browser cache and history store any sensitive data. If they do, this data is easily accessible through something as simple as the "Back" button. They also examine how passwords are stored to make sure they aren't in a clear text form that is vulnerable to attackers. And, the tester examines the password reset process to see whether any aspects of the process are insecure.
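For the password-storage point, a sketch of what a tester hopes to find instead of clear text: a salted, slow key-derivation function. The iteration count below is illustrative; current guidance changes over time.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    # Constant-time comparison avoids leaking info via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret-example")
print(verify_password("s3cret-example", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))     # False
```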
Finally, the tester repeats this whole process for any alternative login channels. In some cases, users may be able to log in through the main website, a mobile-optimized version, a mobile application, or a host of other similar alternative channels. All of the different channels need to be tested for security vulnerabilities.
These tests focus on how web applications authenticate access to file systems.
The tester looks for common vulnerabilities like path traversal or file include flaws. The tester also tries to bypass authorization schemes and verifies how every function of the application is affected by user role, authentication status, and other authorization factors.
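Path traversal succeeds when the server skips the kind of check sketched below: resolve the requested path and confirm it stays inside the web root. The base directory and file names are made up for illustration.

```python
import os.path

def safe_resolve(base_dir: str, user_path: str):
    """Return the resolved path, or None if it escapes base_dir."""
    full = os.path.normpath(os.path.join(base_dir, user_path.lstrip("/")))
    # After normalization, the path must still live under base_dir.
    if os.path.commonpath([base_dir, full]) != base_dir:
        return None
    return full

print(safe_resolve("/var/www/files", "report.pdf"))
print(safe_resolve("/var/www/files", "../../etc/passwd"))  # blocked -> None
```

A tester probes this from the other side, submitting sequences like `../../etc/passwd` (and encoded variants) to see whether the application performs such a check at all.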
The tester also checks for vertical privilege escalation (e.g. gaining Administrator access as an Author) and horizontal privilege escalation (e.g. gaining access to another Author's account).
After spending a good amount of time on the login process, the tester checks the logout process in more depth during this phase of testing. Applications allow users to stay logged in for a certain amount of time but if the cookies or session tokens aren't secure, an attacker could hijack legitimate sessions. The tester also checks that session time-out is in place so that a user is automatically logged out after a certain period of time without activity. And, the tester checks the entire logout process to make sure that sessions are effectively terminated.
The tester also checks for common problems related to user sessions. The first is session variable overloading. If the application uses the same session variable for multiple purposes, an attacker could exploit this and gain access to unintended (more privileged) locations. The tester also looks for potential cross-site request forgery (CSRF), which can force a logged-in user to execute actions within the application. Often, links are sent via email or social media and, if clicked, the user has no control over the actions performed (like changing their email address or even transferring money). The tester also looks to see whether session tokens like cookies or session IDs are exposed. If they are, an attacker may be able to impersonate a user to access the application. Then, the tester checks the specific attributes of the cookies to ensure they are adequately protected.
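The standard defense against CSRF is a per-session token that must accompany every state-changing request; a sketch of that pattern, with the session represented as a plain dict for illustration:

```python
import hmac
import secrets

def issue_csrf_token(session: dict) -> str:
    """Store a fresh random token in the session and return it,
    so the server can embed it in each form it renders."""
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def validate_csrf(session: dict, submitted: str) -> bool:
    """Reject the request unless the submitted token matches the session's."""
    expected = session.get("csrf_token", "")
    # compare_digest avoids leaking information through timing differences.
    return bool(expected) and hmac.compare_digest(expected, submitted)

session = {}
token = issue_csrf_token(session)
print(validate_csrf(session, token))     # legitimate form submission
print(validate_csrf(session, "forged"))  # cross-site request without the token
```

A tester checks whether such a token exists at all, whether it is tied to the session, and whether the server actually rejects requests that omit it.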
In this phase, the tester goes through a total of 15 different input validation tests looking at everything from cross-site scripting (XSS) to SQL injection. Why? Input validation is the most common web application security weakness. If all of the data coming from the client or from the environment isn't being validated before it's used, the application is vulnerable to a host of different issues.
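The SQL injection case is the classic illustration. With an in-memory SQLite database standing in for the application's backend (table and data made up), string-built SQL is injectable while a parameterized query treats the same input as plain data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"

# Vulnerable: the attacker's quote breaks out of the string literal,
# and the OR clause makes the WHERE match every row.
rows = conn.execute(
    "SELECT name FROM users WHERE name = '" + malicious + "'"
).fetchall()
print("concatenated query returned:", rows)

# Safe: the driver binds the value, so the input is just data
# and matches no user named "nobody' OR '1'='1".
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()
print("parameterized query returned:", rows)
```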
In the words of Michael Howard, "All input is evil."
The way that errors are handled by the application can reveal useful information to an attacker. The tester checks whether it is possible to access any stack traces or find relevant information within them. They also look at all of the error codes they come across while testing to try to get more information about the technologies used in the application, bugs, or databases.
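The pattern a tester hopes to find is sketched below: full details go to the server-side log while the client sees only a generic message with a correlation ID. The function and message wording are illustrative assumptions.

```python
import logging
import traceback
import uuid

log = logging.getLogger("app")

def handle_error(exc: Exception) -> str:
    """Log the stack trace server-side; return a generic client message."""
    error_id = str(uuid.uuid4())
    # Stack trace and exception details stay in the server log...
    log.error("error %s:\n%s", error_id,
              "".join(traceback.format_exception(type(exc), exc, exc.__traceback__)))
    # ...while the client response reveals nothing an attacker can mine.
    return "Something went wrong (reference %s)." % error_id
```

When the application instead returns raw stack traces or database error codes, each one is a small information leak worth recording as a finding.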
The tests in this phase can be summarized with the question: "is sensitive data protected?". The tester checks whether and how sensitive data is being protected during transmission and whether it is possible for an attacker to decrypt the encrypted data.
The tests in this phase require the tester to "think outside the box" and try to break the application security measures by bypassing the normal processes or patterns. These tests cannot be automated like many other tests can be. Instead, the tester has to try to "outsmart" the application design. This set of tests also draws heavily from the information gathered in earlier phases of testing. The better the tester understands the logic and processes of the application, the better chance they will have to identify creative ways to "break" it.
The final phase of testing involves executing code within the browser rather than on the server.
The tester looks at a variety of different client-side aspects of the application to check for common vulnerabilities. Many of the vulnerabilities tested in this phase are related to cross-site scripting (XSS) or injection.
The Dradis Framework is a collaboration and reporting platform for InfoSec teams that will cut your reporting time in half.
We connect with 19+ different tools including Burp, Nessus, Nmap, and Qualys. Track your progress, split tasks, and share screenshots and evidence with your team.
Dradis Professional Edition includes extra features designed for organizations working with bigger teams and multiple projects at a time.
Community Edition package
Professional Edition package
|HTML report template|
|Word report template|
|Issue, Evidence, and Note templates|
These instructions are also available in the instructions.txt file in your Compliance Package.
This project methodology creates a step-by-step checklist of all of the tasks required for an OWASPv4 test. This methodology can also be useful independently (like for teams that want to structure their projects by IP).
templates/methodologies/folder of your local install.
See the Using Methodologies page of the Working with Projects guide.
This project template is ready to be updated with your results. Unlike the full project export, the Issue #[Status]# fields need to be updated before any will export into the report template(s). Placeholder Issues and instances of Evidence are already created for each section of the OWASPv4 testing guide. Simply update each Issue in the project with the findings from your tests, update the corresponding Evidence for the Issues, and then export using one of the report template options.
In the header, click Upload output from tool and upload the project template file as
Create a new (blank) project. In the header, click Upload output from tool and upload the project template file as
This is a full project export ready for you to explore and test. This project comes pre-populated with 6 Failed status Issues, 6 Passed status Issues, and 7 Unknown status Issues. Each pre-populated Issue also has an instance of Evidence associated with it.
After uploading the project using the instructions above, try the following:
This HTML template will generate a report with the following sections:
This custom Dradis template has the following sections:
(Advanced) Edit the report template properties to filter by the Order field to display the findings in the same order they appear in the OWASPv4 testing guide. See the Report Template Properties page of the Administration guide for details.
Filenames: issue-template.txt, evidence-template.txt, note-template.txt
Use the templates to configure the Plugin Manager so that you can quickly and easily integrate external tool data (Nessus, Burp, Qualys, etc) to match the format of this report template. Or, add any of the templates to your instance as Note templates to painlessly pre-populate manually-created findings with the correct field names.