Automatic Penetration Tester

Created as a final project for "Magshimim - National Cybersecurity Program". This project aims to make the internet a safer place by helping protect websites against common vulnerabilities.

We are able to scan your website for the following vulnerabilities:

  • Cross Site Scripting (XSS)
  • Command Injection (CI)
  • SQL Injection (SQLI)
  • Cross Site Request Forgery (CSRF)
  • Hidden/Forgotten Files (.git, .gitignore, 8.3 filenames, editor swap files); a probe sketch follows this list
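
The hidden-file scan is only listed above, so as a rough illustration (this is a hypothetical sketch, not the project's actual code), such a probe could boil down to requesting a handful of well-known leftover paths with Requests and flagging any that respond successfully:

# Hypothetical hidden-file probe -- illustrative sketch, not the project's implementation
import requests

COMMON_LEFTOVERS = [".git/HEAD", ".gitignore", "index.php~", ".index.php.swp"]

def probe_hidden_files(base_url: str, timeout: int = 10) -> list[str]:
    findings = []
    for path in COMMON_LEFTOVERS:
        url = f"{base_url.rstrip('/')}/{path}"
        try:
            response = requests.get(url, timeout=timeout)
        except requests.RequestException:
            continue
        if response.status_code == 200:
            # A direct 200 for a leftover file suggests it is publicly exposed
            findings.append(url)
    return findings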

Prerequisites

Your system should have the following programs installed:

  • A recent version of Google Chrome
  • Python 3.11+

How Do I Start?

git clone https://gitlab.com/OriLev1/auto-pentester.git
cd auto-pentester
git checkout release-v1.0

Now, open the config.json file and change the url field to your desired site. If for some reason this file does not exist, feel free to copy the recommended configuration from below.

Execute the program using the following commands:

Linux

python3 -m pip install -r src/requirements.txt
./main.py config.json

Windows

python -m pip install -r src/requirements.txt
python main.py config.json

The Approach

We protect websites using a variety of relevant tools and skills:

  • Python: Selenium, Requests, BeautifulSoup, SQLModel, Scapy
  • Docker: run the application easily, with no installation of dependencies required
  • Object-Oriented Design: an abstract, easy-to-use API following OOP paradigms
  • SQLite: a database of cherry-picked payloads that trigger vulnerabilities
  • Backend Development: Flask, Node.js
  • Penetration Testing
  • Parallel Programming

All of this combines into a single main.py file that, when run, scans your website and alerts you to potential security holes.

The Injection Algorithm

Used in the XSS and Command Injection scans; a simplified sketch follows the steps below.

  1. Parse the JSON configuration file provided by the user.
  2. Scan the website and find all the routes (pages) using Requests.
  3. Scan each page and parse the elements it contains using BeautifulSoup.
  4. For every input element found, attempt to inject a special payload using Selenium.
  5. If the payload is echoed back in its escaped form, this input box is safe. Move on to the next one.
  6. If the payload is reflected without HTML escaping, the program alerts you with a warning.
  7. Attempt injecting several other payloads and check the results.
  8. Report the final findings to the user.
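
As a rough, simplified sketch of this loop (not a verbatim excerpt of the project's code, and using plain Requests POSTs instead of the Selenium-driven browser to keep the example self-contained):

# Simplified sketch of the injection loop -- illustrative only, not the project's code
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAYLOAD = "<script>alert('probe')</script>"  # example payload, not from the payload database

def scan_page_for_reflected_xss(page_url: str) -> list[str]:
    warnings = []
    page = requests.get(page_url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")

    # Step 4: try the payload against every named input of every form on the page
    for form in soup.find_all("form"):
        action = urljoin(page_url, form.get("action") or page_url)
        fields = {tag.get("name"): PAYLOAD
                  for tag in form.find_all("input") if tag.get("name")}
        if not fields:
            continue
        response = requests.post(action, data=fields, timeout=10)

        # Step 6: the raw payload came back unescaped -- warn the user
        if PAYLOAD in response.text:
            warnings.append(f"Possible reflected XSS via form at {action}")
        # Step 5: if only the HTML-escaped form of the payload is echoed, the input is treated as safe
    return warnings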

The Proof-Of-Defense Algorithm

Used in the CSRF scan; a sketch of these checks follows the steps below.

  1. Check whether the CSRF token is strictly scoped to the given site
  2. Check whether a CSRF token is issued to the user at all
  3. Check for HTTP protection headers
  4. Report any missing security best practices to the user
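
A rough sketch of what these checks could look like with Requests and BeautifulSoup (the header list and the token heuristic are illustrative assumptions, and the site-binding check from step 1 is omitted for brevity):

# Hypothetical sketch of the proof-of-defense checks -- illustrative only
import requests
from bs4 import BeautifulSoup

PROTECTION_HEADERS = ["Content-Security-Policy", "X-Frame-Options"]  # example header set

def check_csrf_defenses(url: str) -> list[str]:
    findings = []
    response = requests.get(url, timeout=10)

    # Step 2: is any CSRF token handed to the user in a hidden form field?
    soup = BeautifulSoup(response.text, "html.parser")
    has_token = any(
        "csrf" in (tag.get("name") or "").lower()
        for tag in soup.find_all("input", type="hidden")
    )
    if not has_token:
        findings.append("No CSRF token found in hidden form fields")

    # Step 3: are common HTTP protection headers present?
    for header in PROTECTION_HEADERS:
        if header not in response.headers:
            findings.append(f"Missing protection header: {header}")

    # Step 4: report any missing best practices back to the user
    return findings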

The Configuration File

The configuration file is the one and only way to provide input to the program. It consists of several customizable options that alter the flow of the program to your needs. Our suggestion is to leave all the fields at their defaults and only change the url; if you do insist on tweaking more, the fields you are most likely to care about are marked with IMPORTANT in the example below. The scans field specifies which scans will be run; a sketch of how these module paths might be loaded follows the examples below.

{
    "url": "<website url>",
    "headers": {"optional": "provide HTTP headers"},
    "cookies": {"optional": "provide HTTP cookies"},
    "auth": ["optional: provide HTTP authentication creds"],
    "use_requests_interface": "specify interface",
    "hidden": "show browser windows",
    "timeout": "set max timeout of crawler",
    "blacklist": ["IMPORTANT: specify pages to avoid"],
    "max_pages": "IMPORTANT: specify max amount of pages to scan",
    "recursive": "IMPORTANT: specify whether or not to recurse crawler to subpages",
    "plugins": [
        "src.analyzer.web_elements.a",
        "src.analyzer.web_elements.area",
        "src.analyzer.web_elements.base",
        "src.analyzer.web_elements.button",
        "src.analyzer.web_elements.img",
        "src.analyzer.web_elements.link",
        "src.analyzer.web_elements.script"
    ],
    "scans": [
        "src.scan_manager.scans.xss",
        "src.scan_manager.scans.csrf",
        "src.scan_manager.scans.sqli",
        "src.scan_manager.scans.ci"
    ]
}

Our recommended configuration is:

{
    "url": "<website url>",
    "headers": {},
    "cookies": {},
    "auth": [],
    "use_requests_interface": true,
    "hidden": true,
    "timeout": 10,
    "blacklist": [],
    "max_pages": 100,
    "recursive": true,
    "plugins": [
        "src.analyzer.web_elements.a",
        "src.analyzer.web_elements.area",
        "src.analyzer.web_elements.base",
        "src.analyzer.web_elements.button",
        "src.analyzer.web_elements.img",
        "src.analyzer.web_elements.link",
        "src.analyzer.web_elements.script"
    ],
    "scans": [
        "src.scan_manager.scans.xss",
        "src.scan_manager.scans.csrf",
        "src.scan_manager.scans.sqli",
        "src.scan_manager.scans.ci"
    ]
}
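
The plugins and scans entries look like dotted Python module paths, which suggests they are imported dynamically at run time. As a minimal sketch of how that could work (an assumption about the mechanism, not a description of the actual code), the configuration might be loaded and its scan modules resolved with importlib:

# Hypothetical sketch of loading config.json and resolving its scan modules -- illustrative only
import importlib
import json

def load_config(path: str) -> dict:
    with open(path, encoding="utf-8") as config_file:
        return json.load(config_file)

def load_scans(config: dict) -> list:
    # Each entry such as "src.scan_manager.scans.xss" is treated as an importable module path
    return [importlib.import_module(module_path) for module_path in config.get("scans", [])]

if __name__ == "__main__":
    config = load_config("config.json")
    scan_modules = load_scans(config)
    print(f"Scanning {config['url']} with {len(scan_modules)} scans")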
