I made some changes to the wmiexec.py script because of some annoyances I encountered during penetration tests.
The use case for these modded scripts is when the password contains special characters like @ or : and you can't use it with the default wmiexec.py/psexec.py/smbexec.py scripts (maybe it's just me who can't figure out how to use them :P).
These three scripts (wmiexec.py/psexec.py/smbexec.py) are common tools for getting a remote host to execute a Meterpreter exe file generated via Veil-Evasion.
Using the modded scripts, you can run a single command (e.g. ipconfig) against a list of hosts with one command line.
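The host-list handling can be sketched roughly like this. This is a minimal illustration of the idea, not the actual code from the modded scripts; the function names here are mine.

```python
# Illustrative sketch of running one command against a list of hosts.
# load_targets/run_on_all are hypothetical names, not the real functions.
def load_targets(path):
    """Read one IP/host per line, skipping blank lines and comments."""
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.startswith("#")]

def run_on_all(path, command, run_single):
    """Invoke the single-host execution routine once per target."""
    results = {}
    for host in load_targets(path):
        results[host] = run_single(host, command)
    return results
```

With a file like ips.txt containing one address per line, the same command is simply replayed against every host in turn.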
The source code for the modded scripts can be found here
Special thanks to CoreLabs for making these scripts. The Impacket scripts can be found at https://code.google.com/p/impacket/.
Screenshot of wmiexec.py
Examples of how you can use the modded wmiexec.py
python wmiexec.py -d testdomain -u user -p pass -ip 192.168.2.1 -command ipconfig
python wmiexec.py -d testdomain -u user -p pass -ip 192.168.2.1 -f ips.txt -command ipconfig
Screenshot of smbexec.py
Examples of how you can use the modded smbexec.py
python smbexec.py -d testdomain -u user -p pass -ip 192.168.2.1
python smbexec.py -d testdomain -u user -p pass -f ips.txt
Screenshot of psexec.py
Examples of how you can use the modded psexec.py
python psexec.py -d testdomain -u user -p pass -ip 192.168.2.1 -command ipconfig
python psexec.py -d testdomain -u user -p pass -f ips.txt -command ipconfig
Cirt.net is a useful resource that contains the default credentials for various devices.
I wrote a script that crawls cirt.net, parses the pages, and extracts the credentials, outputting them in the "combo" format required by Medusa. Medusa is a brute-force tool for numerous services such as MySQL, SMB, SSH, and Telnet.
Currently, only SSH and Telnet related credentials are extracted from cirt.net.
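Medusa's combo format is one host:username:password entry per line; leaving the host field empty tells Medusa to use the hosts supplied via -H. A minimal sketch of the output step (the function names are mine, not the actual script's):

```python
# Sketch: turn scraped (username, password) pairs into Medusa combo lines.
def to_combo_lines(creds):
    """creds: iterable of (username, password) pairs scraped from cirt.net.
    The leading empty host field defers to the hosts given via -H."""
    return [":%s:%s" % (user, password) for user, password in creds]

def write_combo(creds, path):
    """Write one combo entry per line to the word list file."""
    with open(path, "w") as f:
        for line in to_combo_lines(creds):
            f.write(line + "\n")
```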
You can download the “combo” word lists for ssh and telnet via the direct links below.
SSH combo list for Medusa
Telnet combo list for Medusa
Combined users.txt and passwords.txt that you can use with Patator (https://code.google.com/p/patator/), which is another awesome brute-force tool.
Sample command for medusa “combo” SSH attack.
medusa -M ssh -C wordList_ssh.txt -H port22.txt
If you would like to play around with the python script, you can download the file at the below location.
Patator is another awesome tool that you can use for brute forcing SSH logins.
Sample command for a Patator SSH attack
patator.py ssh_login host=10.0.0.1 user=FILE0 password=FILE1 0=users.txt 1=passwords.txt -x ignore:mesg='Authentication failed.'
Special shoutout to Cirt.net for maintaining and providing the extensive database of default credentials at cirt.net/passwords
I wrote a script to extend the functionality of the Burp plugin Carbonator.
Carbonator is an awesome script by Integris Security. Carbonator uses Jython which is easy for me to understand.
It's similar to Sodapop by Redspin. However, the Sodapop script seems to be broken now.
Below is a link to Sodapop by Redspin
Below is a description for Carbonator from their website.
Carbonator’s purpose is to enable the ability to automate the vulnerability scanning of a large number of web applications.
A single command from a command line can now produce volumes of vulnerability information.
Carbonator can be found here
I made some additional tweaks to the original carbonator.py script as well as created my own launch_burp.py run script.
The additional functionalities that I have included are
1. Allows you to run Burp/Carbonator against a file containing a list of domain names/IPs/URLs. Below is a screenshot of the file format.
2. Runs a Bing lookup against the IP address of the domain name to find other websites hosted on the same IP address (using the IP:x.x.x.x keyword in Bing), and runs Burp/Carbonator against these additional websites. There seem to be some false positives in the Bing search results, so the script checks that each domain name actually resolves to the same IP address.
3. Searches Google for links belonging to the domain name (using the site:domain.com keyword) and runs Burp/Carbonator against these links. You might find additional website content/links compared to crawling http://www.domain.com.
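The false-positive check from point 2 can be sketched as a simple DNS comparison. This is an illustration of the idea, assuming Python's standard socket module; the helper names are mine, not the actual functions in the repo.

```python
import socket

# Sketch of the check from point 2: only keep Bing "IP:x.x.x.x" hits
# whose domain name really resolves to that IP address.
def resolves_to(domain, ip):
    """True if `domain` resolves to `ip` (filters Bing false positives)."""
    try:
        # gethostbyname_ex returns (hostname, aliases, ip_list)
        return ip in socket.gethostbyname_ex(domain)[2]
    except socket.error:
        return False

def filter_bing_hits(candidates, ip):
    """Keep only candidate domains that actually live on the target IP."""
    return [d for d in candidates if resolves_to(d, ip)]
```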
My GitHub repo for the code is at https://github.com/milo2012/carbonator. Please feel free to send me your feedback/comments. Thank you for reading.
I came across GDS Burp API which seems like a very useful tool for parsing Burp Proxy logs. The GDS Burp API exposes a Python object interface to requests/responses recorded by Burp. The below link provides a very good introduction to the API.
I wrote a simple script that uses the API to parse the Burp proxy logs and send the requests to SQLMap, automating SQL injection testing for all GET and POST parameters while skipping all URLs without any parameters.
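The core skip-logic can be sketched like this. It is a simplified illustration, not the actual burpSQL.py code (which uses the GDS Burp API to parse the log); the function names are mine, and the sqlmap flags used are -u, --data, and --batch.

```python
from urllib.parse import urlparse

# Sketch: only requests with a query string or POST body parameters
# are worth handing to sqlmap; everything else is skipped.
def has_parameters(url, body=""):
    """True if the request carries GET or POST parameters to test."""
    return bool(urlparse(url).query) or "=" in body

def sqlmap_command(url, body=""):
    """Build the sqlmap invocation for one logged request."""
    cmd = ["sqlmap", "-u", url, "--batch"]
    if body:
        cmd += ["--data", body]  # POST parameters go via --data
    return cmd
```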
1. Clone the GDSSecurity burpee repository git clone https://github.com/GDSSecurity/burpee.git
2. Download burpSQL.py from https://github.com/milo2012/burpSQL into the burpee folder
3. Next, we will have to configure logging in Burpsuite
4. Change the proxy settings of your browser to 127.0.0.1:8080
5. Crawl the website with the OWASP Ajax Crawling Tool, spider it with Burpsuite, or do it the manual way.
Below are the command line options for burpSQL
6. The above is pretty self-explanatory. If your Burp proxy log is cluttered with URLs from multiple domains, you can filter the SQL injection testing to specific domains using the --domain switch.
Drop me a message if you have any suggestions or comments. Thank you !
OWASP Ajax Crawling Tool (ACT) is an awesome companion to Burpsuite. It allows you to crawl Ajax websites, which is a feature missing from Burpsuite. Both are must-have tools for penetration testing of modern Ajax websites.
The official website for ACT is https://www.owasp.org/index.php/OWASP_AJAX_Crawling_Tool
The current version of ACT, 0.1a, seems to have trouble crawling some Ajax websites due to issues in its dependencies.
I have submitted the bugfixes to the website but it will take some time for the changes to be committed.
Below are screenshots of the crawl results before and after the patch.
After the patch
As shown in the screenshot below, 4 extra links were discovered after the patch.
Below is a temporary download link for ACT if you can't wait for the changes to be committed at the main site.