
Friday, November 20, 2015

Automater Update .21

Keeping Automater up to date:

One of the more notable additions in version .21 of Automater is that users no longer need to watch the GitHub site to make sure their local Python modules are the latest versions. By adding -Vv to the command-line arguments, Automater will check whether the local Python modules match the modules on the TekDefense Automater GitHub site. The -V (--vercheck) argument tells Automater to check the modules, and the lowercase -v (--verbose) is required to make Automater report the outcome. If the files don't match, Automater notifies the user via stdout which modules have been modified, so up-to-date modules can be pulled if desired. The -v (--verbose) switch controls whether Automater sends any informational messages to stdout at all.
Arguably an even better addition is another "version" check of sorts. The sites.xml file is still required locally so that Automater knows which sites to check and which regexes to report on. However, a new tekdefense.xml file is also checked for and used if it is found locally. The significance is that with the -r (--refreshxml) switch included on the command line, Automater will check the TekDefense Automater GitHub site and pull down the tekdefense.xml file for use. If -r is used and the local tekdefense.xml differs from the copy on GitHub, the updated file is pulled and utilized. This lets you define your own checks in sites.xml while also keeping current with the sites and checks maintained by the TekDefense crew. Together this gives the Automater user the best coverage with no manual modifications or processes required.
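As an example, a single run that checks module versions, reports the results, and refreshes tekdefense.xml from GitHub (1.1.1.1 here is just a placeholder target) would look like:

python Automater.py 1.1.1.1 -Vv -r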

New Requirements and what they mean:

The new Automater has several updates. Out of the blocks, the requests module (version 2.7 or above) is now required to run Automater. For instructions on installing the requests module if you don't already have it, see http://docs.python-requests.org/en/latest/user/install/. This gives us better control over returned HTML and sets us up for further upgrades in the near future when we begin using JSON APIs and data-collection capabilities; more on this as things progress. Using requests, the default timeout for get calls to web sites is now set to 5 seconds, which lets Automater move on after 5 seconds of waiting for a response. However, if a web site does respond and provide some input, but is simply slow in its response time, the get request will not time out; the timeout only applies to sites that never respond at all. Further refinements on this subject will continue in future upgrades. There are also several bug fixes and other modifications (for instance, the delay feature was fixed), and we will soon thread Automater to provide better response times.
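To illustrate the behavior (this is a standalone sketch, not Automater's actual code), a requests call with a 5-second timeout abandons sites that never answer but will still complete against a slow site that is actively responding:

import requests

try:
    # timeout=5 bounds how long we wait for the server to respond;
    # a slow but responsive site will still finish the transfer.
    response = requests.get('http://www.example.com', timeout=5)
    print(response.status_code)
except requests.exceptions.Timeout:
    print('No response within 5 seconds, moving on.')

The full help output for the new version is shown below: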
.\Automater.py -h
usage: Automater.py [-h] [-o OUTPUT] [-b] [-f CEF] [-w WEB] [-c CSV]
                    [-d DELAY] [-s SOURCE] [--proxy PROXY] [-a USERAGENT] [-V]
                    [-r] [-v]
                    target
IP, URL, and Hash Passive Analysis tool
positional arguments:
  target                List one IP Address (CIDR or dash notation accepted),
                        URL or Hash to query or pass the filename of a file
                        containing IP Address info, URL or Hash to query each
                        separated by a newline.
optional arguments:
  -h, --help            show this help message and exit
  -o OUTPUT, --output OUTPUT
                        This option will output the results to a file.
  -b, --bot             This option will output minimized results for a bot.
  -f CEF, --cef CEF     This option will output the results to a CEF formatted
                        file.
  -w WEB, --web WEB     This option will output the results to an HTML file.
  -c CSV, --csv CSV     This option will output the results to a CSV file.
  -d DELAY, --delay DELAY
                        This will change the delay to the inputted seconds.
                        Default is 2.
  -s SOURCE, --source SOURCE
                        This option will only run the target against a
                        specific source engine to pull associated domains.
                        Options are defined in the name attribute of the site
                        element in the XML configuration file. This can be a
                        list of names separated by a semicolon.
  --proxy PROXY         This option will set a proxy to use (eg.
                        proxy.example.com:8080)
  -a USERAGENT, --useragent USERAGENT
                        This option allows the user to set the user-agent seen
                        by web servers being utilized. By default, the user-
                        agent is set to Automater/version
  -V, --vercheck        This option checks and reports versioning for
                        Automater. Checks each python module in the Automater
                        scope. Default, (no -V) is False
  -r, --refreshxml      This option refreshes the tekdefense.xml file from the
                        remote GitHub site. Default (no -r) is False.
  -v, --verbose         This option prints messages to the screen. Default (no
                        -v) is False.
Specific sites (already defined in the sites.xml or tekdefense.xml file) can be called in case the user only wants responses from those sites. While previous versions of Automater allowed this for a single site using the -s (--source) switch, the new version allows multiple sites to be queried by separating the required sites with a semicolon. In the past, if the user had a sites.xml file with the totalhash_ip entry, he could call Automater with -s totalhash_ip and only receive information about totalhash. Now, if the user wants more than totalhash output, but not everything in the sites.xml or tekdefense.xml file(s), he could enter something like Automater -s totalhash_ip;robtex to get totalhash and robtex information. Any sites within sites.xml or tekdefense.xml can be joined in this way using the semicolon separator.
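Note that in most shells the semicolon is a command separator, so it is safest to quote the source list (the target and site names here are just placeholders):

python Automater.py 1.1.1.1 -s "totalhash_ip;robtex"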

 

Lastly, there is now a bot output mode for those who want friendlier output for bots. For instance, here is the output from Automater with -b used in a Skype bot.

 

Sunday, July 20, 2014

Over a year with Kippo

UPDATE: After posting, @ikoniaris of HoneyDrive and BruteForce Lab fame recommended running these. Here are the results of kippo-stats.pl, created by Tomasz Miklas and Miguel Jacq.

As many of you know from previous posts, I am a big fan of honeypots, particularly Kippo. My main Kippo instance sitting in AWS has been online for over a year now. Let's take a look at what we have captured and learned over this past year. If you want to validate any of these statistics I have made the raw logs available for download.

General Stats:

Unique values (135526 connections):

*csv with geo location

*Map Generated with JCSOCAL's GIPC

Top 11 Countries

China: 699

United States: 654

Brazil: 76

Russian Federation: 69

Germany: 65

Korea, Republic of: 57

Romania: 56

Egypt: 52

Japan: 50

India: 41

Indonesia: 41

Unique Usernames: 8600 (Username list)

Unique Passwords: 75780 (wordlist)

Unique Sources: 1985 (list of IPs)

Passwords:

One of my favorite uses of Kippo data is generating wordlists from login attempts. I wrote a quick script to parse the Kippo logs, pull out all the passwords, and unique them into a wordlist. Feel free to grab it. Additionally, I have made the wordlists available for download.
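A minimal sketch of that idea, assuming Kippo's usual "login attempt [user/password]" log lines (the actual script may differ):

import re
import sys

# Kippo auth lines look roughly like:
#   ... login attempt [root/123456] failed
LOGIN_RE = re.compile(r'login attempt \[([^/\]]*)/([^\]]*)\]')

passwords = set()
for path in sys.argv[1:]:
    with open(path) as log:
        for line in log:
            match = LOGIN_RE.search(line)
            if match:
                passwords.add(match.group(2))

# One unique password per line; redirect stdout to build the wordlist.
for password in sorted(passwords):
    print(password)

Run it against one or more kippo.log files and redirect the output to a file to produce the wordlist.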

Using Pipal, I performed an analysis of all the login attempts over the year:

Two items of note here: over 60% of password attempts were 1-8 characters long, and 40% of attempts used lowercase alpha characters only. The most-used password was 123456, which is the default password for Kippo.

If a user attempts to create an account or change the root password in a Kippo session, those passwords are captured and added to the allowed credentials list. The following credentials were created:

root:0:albertinoalbert123
root:0:fgashyeq77dhshfa
root:0:florian12eu
root:0:hgd177q891999wwwwwe1.dON
root:0:iphone5
root:0:kokot
root:0:nope
root:0:picvina
root:0:scorpi123
root:0:test
root:0:xiaozhe
root:0:12345
root:0:bnn318da9031kdamfaihheq1fa
root:0:ls
root:0:neonhostt1
root:0:wget123

Downloads:

When an attacker attempts to download a tool via wget, Kippo allows the file to be downloaded, although the attacker cannot interact with it. With this we are able to get a copy of whatever is being downloaded. In most cases these are IRC bots, but not all. I have made them all available for download.
Here is a listing of all the files:
*Duplicates and obviously legitimate files have been removed from the list.
20131030113401_http___198_2_192_204_22_disknyp
20131103183232_http___61_132_227_111_8080_meimei
20131104045744_http___198_2_192_204_22_disknyp
20131114214017_http___www_unrealircd_com_downloads_Unreal3_2_8_1_tar_gz
20131116130541_http___198_2_192_204_22_disknyp
20131129165151_http___dl_dropboxusercontent_com_s_1bxj9ak8m1octmk_ktx_c
20131129165438_http___dl_dropboxusercontent_com_s_66gpt66lvut4gdu_ktx
20131202040921_http___198_2_192_204_22_disknyp
20131207123419_http___packetstorm_wowhacker_com_DoS_juno_c
20131216143108_http___www_psybnc_at_download_beta_psyBNC_2_3_2_7_tar_gz
20131216143208_http___X_hackersoft_org_scanner_gosh_jpg
20131216143226_http___download_microsoft_com_download_win2000platform_SP_SP3_NT5_EN_US_W2Ksp3_exe
20131217163423_http___ha_ckers_org_slowloris_slowloris_pl
20131217163456_http___www_lemarinel_net_perl
20131222084315_http___maxhub_com_auto_bill_pipe_bot
20140103142644_http___ftp_gnu_org_gnu_autoconf_autoconf_2_69_tar_gz
20140109170001_http___sourceforge_net_projects_cpuminer_files_pooler_cpuminer_2_3_2_linux_x86_tar_gz
20140120152204_http___111_39_43_54_5555_dos32
20140122202342_http___layer1_cpanel_net_latest
20140122202549_http___linux_duke_edu_projects_yum_download_2_0_yum_2_0_7_tar_gz
20140122202751_http___www_ehcp_net_ehcp_latest_tgz
20140201131804_http___www_suplementar_com_br_images_stories_goon_pooler_cpuminer_2_3_2_tar_gz
20140201152307_http___nemo_rdsor_ro_darwin_jpg
20140208081358_http___www_youtube_com_watch_v_6hVQs5ll064
20140208184835_http___sharplase_ru_x_txt
20140215141909_http___tenet_dl_sourceforge_net_project_cpuminer_pooler_cpuminer_2_3_2_tar_gz
20140215142830_http___sourceforge_net_projects_cpuminer_files_pooler_cpuminer_2_3_2_tar_gz
20140219072721_http___www_psybnc_at_download_beta_psyBNC_2_3_2_7_tar_gz
20140328031725_http___dl_dropboxusercontent_com_u_133538399_multi_py
20140409053322_http___www_c99php_com_shell_c99_rar
20140409053728_http___github_com_downloads_orbweb_PHP_SHELL_WSO_wso2_5_1_php
20140413130110_http___www_iphobos_com_hb_unixcod_rar
20140416194008_http___linux_help_bugs3_com_Camel_mail_txt
20140419143734_http___www_activestate_com_activeperl_downloads_thank_you_dl_http___downloads_activestate_com_ActivePerl_releases_5_18_2_1802_ActivePerl_5_18_2_1802_x86_64_linux_glibc_2_5_298023_tar_gz
20140419144043_http___ha_ckers_org_slowloris_slowloris_pl
20140420104056_http___downloads_metasploit_com_data_releases_archive_metasploit_4_9_2_linux_x64_installer_run
20140420104325_http___nmap_org_dist_nmap_6_46_1_i386_rpm
20140505073503_http___116_255_239_180_888_007
20140505093229_http___119_148_161_25_805_sd32
20140505111511_http___112_117_223_10_280_1
20140515091557_http___112_117_223_10_280__bash_6_phpmysql
20140519193800_http___www_unrealircd_com_downloads_Unreal3_2_8_1_tar_gz
20140523120411_http___lemonjuice_tk_netcat_sh
20140610174516_http___59_63_183_193_280__etc_Test8888
20140614200901_http___kismetismy_name_ktx
20140625032113_http___ftp_mirrorservice_org_sites_ftp_wiretapped_net_pub_security_packet_construction_netcat_gnu_netcat_netcat_0_7_1_tar_gz
20140720005010_http___www_bl4ck_viper_persiangig_com_p8_localroots_2_6_x_cw7_3
To see the full source for some of the scripts downloaded by the attackers, you can go to this GitHub repo. Here are a couple of my favorites.

TTY Replay Sessions:

My absolute favorite feature of Kippo is the ability to replay interactive sessions of attacker activity. Watching these replays gives us an idea of what attackers do once inside a session. For instance, almost every session begins with a "w", which shows logged-in users and uptime, followed by a "uname -a" to show system details. I made a YouTube series called The Kippo Kronicles a while back to showcase some of these sessions. While I don't have the time necessary to continue putting up videos for each session, I have put the output of each session up at this GitHub repo.
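If you are running Kippo yourself, individual sessions can also be replayed locally with the playlog utility bundled in the Kippo source tree (the path and log filename below are assumptions based on a standard checkout):

python ./utils/playlog.py log/tty/20140720-120000-0000.log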
Here is a fun example:
AWSWeb:~# adduser
adduser: Only one or two names allowed.
AWSWeb:~# useradd
adduser: Only one or two names allowed.
AWSWeb:~# ls
AWSWeb:~# pwd
root
AWSWeb:~# cd /cat /etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/bin/sh
bin:x:2:2:bin:/bin:/bin/sh
sys:x:3:3:sys:/dev:/bin/sh
sync:x:4:65534:sync:/bin:/bin/sync
games:x:5:60:games:/usr/games:/bin/sh
man:x:6:12:man:/var/cache/man:/bin/sh
lp:x:7:7:lp:/var/spool/lpd:/bin/sh
mail:x:8:8:mail:/var/mail:/bin/sh
news:x:9:9:news:/var/spool/news:/bin/sh
uucp:x:10:10:uucp:/var/spool/uucp:/bin/sh
proxy:x:13:13:proxy:/bin:/bin/sh
www-data:x:33:33:www-data:/var/www:/bin/sh
backup:x:34:34:backup:/var/backups:/bin/sh
list:x:38:38:Mailing List Manager:/var/list:/bin/sh
irc:x:39:39:ircd:/var/run/ircd:/bin/sh
gnats:x:41:41:Gnats Bug-Reporting System (admin):/var/lib/gnats:/bin/sh
nobody:x:65534:65534:nobody:/nonexistent:/bin/sh
libuuid:x:100:101::/var/lib/libuuid:/bin/sh
richard:x:1000:1000:richard,,,:/home/richard:/bin/bash
sshd:x:101:65534::/var/run/sshd:/usr/sbin/nologin
AWSWeb:~# user
bash: user: command not found
AWSWeb:~# adduser obz
Adding user `obz' ...
Adding new group `obz' (1001) ...
Adding new user `obz' (1001) with group `obz' ...
Creating home directory `/home/obz' ...
Copying files from `/etc/skel' ...
Password: 
Password again: 
Changing the user information for obz
Enter the new value, or press ENTER for the default
        Username []: 
Must enter a value!
        Username []: obz
        Full Name []: ladmin obz
        Room Number []: 1
        Work Phone []: 1234567890
        Home Phone []: 
Must enter a value!
        Home Phone []: 0
        Mobile Phone []: 0
        Country []: cn
        City []: xang
        Language []: mand
        Favorite movie []: 1
        Other []: 1
Is the information correct? [Y/n] y
ERROR: Some of the information you entered is invalid
Deleting user `obz' ...
Deleting group `obz' (1001) ...
Deleting home directory `/home/obz' ...
Try again? [Y/n] y
Changing the user information for obz
Enter the new value, or press ENTER for the default
        Username []: obx
        Full Name []: obx toor
        Room Number []: 1
        Work Phone []: 19089543121
        Home Phone []: 9089342135
        Mobile Phone []: 9089439012
        Country []: cn
        City []: xang
        Language []: manenglish
        Favorite movie []: one
        Other []: twofour
Is the information correct? [Y/n] y
ERROR: Some of the information you entered is invalid
Deleting user `obz' ...
Deleting group `obz' (1001) ...
Deleting home directory `/home/obz' ...
Try again? [Y/n] n
AWSWeb:~# cat adduser obz user cat /etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/bin/sh
bin:x:2:2:bin:/bin:/bin/sh
sys:x:3:3:sys:/dev:/bin/sh
sync:x:4:65534:sync:/bin:/bin/sync
games:x:5:60:games:/usr/games:/bin/sh
man:x:6:12:man:/var/cache/man:/bin/sh
lp:x:7:7:lp:/var/spool/lpd:/bin/sh
mail:x:8:8:mail:/var/mail:/bin/sh
news:x:9:9:news:/var/spool/news:/bin/sh
uucp:x:10:10:uucp:/var/spool/uucp:/bin/sh
proxy:x:13:13:proxy:/bin:/bin/sh
www-data:x:33:33:www-data:/var/www:/bin/sh
backup:x:34:34:backup:/var/backups:/bin/sh
list:x:38:38:Mailing List Manager:/var/list:/bin/sh
irc:x:39:39:ircd:/var/run/ircd:/bin/sh
gnats:x:41:41:Gnats Bug-Reporting System (admin):/var/lib/gnats:/bin/sh
nobody:x:65534:65534:nobody:/nonexistent:/bin/sh
libuuid:x:100:101::/var/lib/libuuid:/bin/sh
richard:x:1000:1000:richard,,,:/home/richard:/bin/bash
sshd:x:101:65534::/var/run/sshd:/usr/sbin/nologin
AWSWeb:~# cat /etc/shadow
cat: /etc/shadow: No such file or directory
AWSWeb:~# /etc/init.d\D/ssh start
bash: /etc/init.D/ssh: command not found
AWSWeb:~# /etc/init.D/ssh startd
bash: /etc/init.d/ssh: command not found
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# 
AWSWeb:~# exit
cConnection to server closed.
localhost:~# exit
Connection to server closed.
localhost:~# bye
bash: bye: command not found
localhost:~# exit
Connection to server closed.
localhost:~# admin
bash: admin: command not found
localhost:~# su
localhost:~# ls -l
drwxr-xr-x 1 root root 4096 2013-02-03 17:11 .
drwxr-xr-x 1 root root 4096 2013-02-03 17:11 ..
drwxr-xr-x 1 root root 4096 2009-11-06 11:16 .debtags
-rw------- 1 root root 5515 2009-11-20 09:08 .viminfo
drwx------ 1 root root 4096 2009-11-06 11:13 .aptitude
-rw-r--r-- 1 root root  140 2009-11-06 11:09 .profile
-rw-r--r-- 1 root root  412 2009-11-06 11:09 .bashrc
localhost:~# pwd
/root
localhost:~# cd /
localhost:/# ls -l
drwxr-xr-x 1 root root  4096 2013-02-03 17:11 .
drwxr-xr-x 1 root root  4096 2013-02-03 17:11 ..
drwxr-xr-x 1 root root     0 2009-11-20 08:19 sys
drwxr-xr-x 1 root root  4096 2009-11-08 15:42 bin
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 mnt
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 media
lrwxrwxrwx 1 root root    25 2009-11-06 11:16 vmlinuz -> /boot/vmlinuz-2.6.26-2-686
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 opt
lrwxrwxrwx 1 root root    11 2009-11-06 11:08 cdrom -> /media/cdrom0
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 selinux
drwxrwxrwx 1 root root  4096 2009-11-20 08:19 tmp
dr-xr-xr-x 1 root root     0 2009-11-20 08:19 proc
drwxr-xr-x 1 root root  4096 2009-11-08 15:41 sbin
drwxr-xr-x 1 root root  4096 2009-11-20 08:20 etc
drwxr-xr-x 1 root root  3200 2009-11-20 08:20 dev
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 srv
lrwxrwxrwx 1 root root    28 2009-11-06 11:16 initrd.img -> /boot/initrd.img-2.6.26-2-686
drwxr-xr-x 1 root root  4096 2009-11-08 15:46 lib
drwxr-xr-x 1 root root  4096 2009-11-06 11:22 home
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 var
drwxr-xr-x 1 root root  4096 2009-11-08 15:46 usr
drwxr-xr-x 1 root root  4096 2009-11-08 15:39 boot
drwxr-xr-x 1 root root  4096 2009-11-20 09:08 root
drwx------ 1 root root 16384 2009-11-06 11:08 lost+found
localhost:/# cd /home
localhost:/home# ls -l
ldrwxr-xr-x 1 root root 4096 2013-02-03 17:11 .
drwxr-xr-x 1 root root 4096 2013-02-03 17:11 ..
drwxr-xr-x 1 1000 1000 4096 2009-11-06 11:22 richard
localhost:/home# exit
Connection to server closed.
localhost:~# 
localhost:~# 
localhost:~# 
localhost:~# 
localhost:~# 
localhost:~# 
localhost:~# ssh -D root@http://60.250.65.112/ 1337
The authenticity of host '60.250.65.112 (60.250.65.112)' can't be established.
RSA key fingerprint is 9d:30:97:8a:9e:48:0d:de:04:8d:76:3a:7b:4b:30:f8.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '60.250.65.112' (RSA) to the list of known hosts.
root@60.250.65.112's password: 
Linux localhost 2.6.26-2-686 #1 SMP Wed Nov 4 20:45:37 UTC 2009 i686
Last login: Sat Feb  2 07:07:11 2013 from 192.168.9.4
localhost:~# uname -a
Linux localhost 2.6.24-2-generic #1 SMP Thu Dec 20 17:36:12 GMT 2007 i686 GNU/Linux
localhost:~# pwd
/root
localhost:~# cd /
localhost:/# ls -l
drwxr-xr-x 1 root root  4096 2013-02-03 17:19 .
drwxr-xr-x 1 root root  4096 2013-02-03 17:19 ..
drwxr-xr-x 1 root root     0 2009-11-20 08:19 sys
drwxr-xr-x 1 root root  4096 2009-11-08 15:42 bin
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 mnt
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 media
lrwxrwxrwx 1 root root    25 2009-11-06 11:16 vmlinuz -> /boot/vmlinuz-2.6.26-2-686
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 opt
lrwxrwxrwx 1 root root    11 2009-11-06 11:08 cdrom -> /media/cdrom0
drwxr-xr-x 1 root root  4096 2009-11-06 11:08 selinux
drwxrwxrwx 1 root root  4096 2009-11-20 08:19 tmp
dr-xr-xr-x 1 root root     0 2009-11-20 08:19 proc
drwxr-xr-x 1 root root  4096 2009-11-08 15:41 sbin
drwxr-xr-x 1 root root  4096 2009-11-20 08:20 etc
drwxr-xr-x 1 root root  3200 2009-11-20 08:20 dev
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 srv
lrwxrwxrwx 1 root root    28 2009-11-06 11:16 initrd.img -> /boot/initrd.img-2.6.26-2-686
drwxr-xr-x 1 root root  4096 2009-11-08 15:46 lib
drwxr-xr-x 1 root root  4096 2009-11-06 11:22 home
drwxr-xr-x 1 root root  4096 2009-11-06 11:09 var
drwxr-xr-x 1 root root  4096 2009-11-08 15:46 usr
drwxr-xr-x 1 root root  4096 2009-11-08 15:39 boot
drwxr-xr-x 1 root root  4096 2009-11-20 09:08 root
drwx------ 1 root root 16384 2009-11-06 11:08 lost+found
localhost:/# cd /root
localhost:~# ls -l
ldrwxr-xr-x 1 root root 4096 2013-02-03 17:19 .
drwxr-xr-x 1 root root 4096 2013-02-03 17:19 ..
drwxr-xr-x 1 root root 4096 2009-11-06 11:16 .debtags
-rw------- 1 root root 5515 2009-11-20 09:08 .viminfo
drwx------ 1 root root 4096 2009-11-06 11:13 .aptitude
-rw-r--r-- 1 root root  140 2009-11-06 11:09 .profile
-rw-r--r-- 1 root root  412 2009-11-06 11:09 .bashrc
localhost:~# cd /hocd /home/
localhost:/home# ls -l
drwxr-xr-x 1 root root 4096 2013-02-03 17:20 .
drwxr-xr-x 1 root root 4096 2013-02-03 17:20 ..
drwxr-xr-x 1 1000 1000 4096 2009-11-06 11:22 richard
localhost:/home# exit
Connection to server closed.
localhost:~# exit
Connection to server closed.
localhost:~# 

Conclusion:

After a year with Kippo, I have learned a lot about what these basic attackers do when connecting to seemingly open SSH hosts. There is plenty more to learn, though. I have plans to build out a larger honeypot infrastructure and to automate some of the data collection and parsing. Additionally, I would like to spend more time analyzing the sessions and malware for further trends. I'll keep you all posted!

*Big thanks to Bruteforce Labs for their tools and expertise in honeypots.

Wednesday, June 18, 2014

Automater version 2.1 released - Proxy capabilities and a little user-agent modification

It has been a little while since some of our posts on Automater and its capabilities. However, we haven't stopped moving forward on the concept and are proud to announce that Automater has been included in the latest release of REMnux and has also made the cut for ToolsWatch. Of course, you should get your copy from our GitHub repo, since we'll be updating GitHub just prior to getting the updates to other repositories. Okay, enough back-patting and proverbial "glad-handing": we are excited to let everyone know that Automater has a new user-agent string that is configurable by the user and now fully supports proxy-based requests and submissions! Thanks go out to nullprobe for taking an interest in the code and pushing us forward on getting the proxy capability completed. Although we didn't use the exact submission he provided, we definitely used some of his code and ideas. Thanks again, nullprobe!

The New Stuff

Okay, for a quick review of some of the old posts if you're new to Automater, or need to refresh yourself with the product, please go here, here, and here to read about Automater, its capabilities and extensibility, as well as its output formats. As you probably know, Automater is an extensible OSINT tool with quite a few capabilities. To get straight to the point, Automater can now be run with new command-line switches to enable proxy functionality and to change the user-agent submitted in the header of the web requests made by the tool.

User-Agent Changes

Prior to this upgrade, Automater sent a default user-agent string based on the browser settings of the device hosting the application. While this is probably fine, it just... well... wasn't good enough for us. By default, Automater now sends the user-agent string 'Automater/2.1' with requests and posts (if post submissions are required). However, you now have the ability to change that user-agent string to one of your liking by using the command-line parameter -a or --agent followed by the string you'd like to use. A new Automater execution line using this option would look something like:

python Automater.py 1.1.1.1 -a MyUserAgent/1.0

or some such thing that you'd like to send as a user-agent string in the header.

Proxy Capabilities

A significant modification in this version is the ability to utilize a network proxy. To enable this functionality, all that is needed is the command-line argument --proxy followed by the address and port the proxy device is listening on. For instance, if my network proxy is at IP address 10.1.1.1 and is listening on port 8080, I would execute Automater by typing:

python Automater.py 1.1.1.1 --proxy 10.1.1.1:8080

Of course, if you only know the name of your network proxy, your system will use standard DNS resolution to resolve the IP address automatically. So, if the proxy is known as proxy.company.com and listens on port 8080, you would type:

python Automater.py 1.1.1.1 --proxy proxy.company.com:8080

It's as simple as that!

Further Movement

We are still working on other submissions and requests, so please keep them coming; we will continue to upgrade as requests come in and as we find more efficient ways to do things. We appreciate the support and would love to answer any questions you may have, so give us a yell if you need anything.

p4r4n0y1ng and 1aN0rmus.....OUT!

Wednesday, December 11, 2013

Automater Output Format and Modifications

Our recent post on the extensibility of Automater called for a few more posts discussing other options the program has available. In particular, we want to show off some of the different output options Automater provides and discuss the sites.xml modifications that control output formatting. Please read the extensibility article to get caught up on sites.xml modifications if you are not aware of the options provided by that configuration file.

Automater offers a few possibilities for printouts outside of the standard output (screen-based output) that most users are aware of. By running:

python Automater.py 1.1.1.1 -o output.txt

We tell Automater to run against target 1.1.1.1 and to create a text file named output.txt in the current directory. You can see here that after Automater does its work and writes the standard report information to the screen, it also tells you that it has created the text file you requested.

Once opened, it is quite obvious that this is the standard output format you see on your screen, now saved to a text file for storage and further use later.

While this text format is useful, we thought it would be better to also provide a CSV format as well as something that would render in a browser. To retrieve a CSV-formatted report, use the -c command-line switch; to retrieve an HTML-formatted report, use the -w command-line switch. These options can all be run together, so if we ran the command:

python Automater.py 1.1.1.1 -o output.txt -c output.csv -w output.html

We would receive three reports in addition to the standard screen output: one standard text file, one comma-separated file, and one HTML-formatted file. Each report is different and can be utilized based on your requirements.

Since we've already seen the text file, I wanted to show you the layout of the HTML and comma-separated outputs. Below you can see them, and I think you'll find each of these quite useful for your research endeavors.

You will notice that I've called out a specific "column" in each of the files, marked with the header "Source". This is where modification of the sites.xml file comes into play. Again, if you need to take a look at how to use the sites.xml file for adding other sites and modifying output functionality, please see this article. For now, let's take a look at what we can do to change the HTML and comma-separated report formatting by changing one simple entry in the sites.xml file. Below is a good look at the robtex.com site element information within the config file. This is obviously the entry we want to modify, since both of our outputs show RobTex DNS in the Source "column." Looking at the sites.xml file, we can easily see that this entry is defined within the <sitefriendlyname> XML element.

Let's change our sites.xml file to show how modifying the <sitefriendlyname> XML element can change our report output. We will change the <entry> element within the <sitefriendlyname> element to say "Changed Here" as seen below:
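As a rough sketch of just that portion of the entry (the surrounding robtex <site> element contains more elements than shown here):

<sitefriendlyname>
    <entry>Changed Here</entry>
</sitefriendlyname>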

Now we will run Automater again with the same command line as before:

python Automater.py 1.1.1.1 -o output.txt -c output.csv -w output.html

And we’ll take a look again at our output.csv and output.html files. Notice that the Source “column” information has been changed to represent what you want to see based on the sites.xml configuration file.

As you’ll see when you inspect the sites.xml format, you can change these <entry> elements within the <sitefriendlyname> elements for each regular expression that you are looking for on those sites that have multiple entries. This allows you to change the Source output string in the file based on specific findings. For instance, if you look at the default sites.xml file that we provide you at GitHub you will find that our VirusTotal sites have multiple entries for the Source string to be reported. This allows you full autonomy in reporting information PER FINDING (regex) so that your results are easily read and understood by you and your team.

Tuesday, December 10, 2013

The Extensibility of Automater

With the recent release of version 2.0 of Automater, we hoped to save you a significant amount of time by making the tool a sort of one-stop shop for that first stage of analysis. The code as provided on GitHub will certainly accomplish that, since we have given the tool the ability to utilize sites such as VirusTotal, Robtex, AlienVault, IPVoid, ThreatExpert, and a slew of others. However, our goal was to make this tool more of a framework that you can modify based on your or your team's needs. 1aN0rmus posted a video (audio is really low... sorry) on that capability, but we wanted to provide an article on the functionality to help you get the tool working based on your requirements.

One of the goals of the version upgrade was to ensure the Python code could be easily modified if necessary, but truthfully our hope was to create the tool so that no modification to the code would be required. To accomplish this, we provided an XML configuration file called sites.xml with the release. We utilized XML because we thought it was a relatively universal, easily understood file format that could also be utilized for a future web-based application of the tool. When creating the file, we made the layout purposefully simple and flat so that no major knowledge of XML is required. The following discusses sites.xml manipulation, where we will assume a new requirement for whois information.

Our scenario will be built around the networksolutions.com site, where we will gather a few things from their whois discovery tool. Our first step is to look in detail at the site and decide what we want to pull from it when we run Automater. In this case, we determine that we want to retrieve the NetName, NetHandle, and Country that the tool lists based on the target we are researching. Notice also that we need to capture the full URL required for our discovery, including any query strings, etc.

Now that we know what we want to find each time we run Automater, all we have to do is create some regular expressions to find the information when the tool retrieves the site. I left the regexes purposely loose for readability here; see our various tutorials on regex if you would like to learn more. In this case, we will use:


• NetName\:\s+.+
• NetHandle\:\s+.+
• Country\:\s+.+


which will grab the NetName, NetHandle, and Country labels as well as the information reported on the site. The more restrictive your regex is, the better your results will be. This is just an example, but once you have the regex you need to get the information you desire, you are ready to modify the sites.xml file and start pulling the new data.
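If you want to sanity-check the expressions before touching sites.xml, a quick standalone test against made-up whois text might look like this:

import re

# Hypothetical whois snippet, purely for testing the expressions.
sample = '''NetName:        SAMPLE-NET
NetHandle:      NET-11-0-0-0-1
Country:        US'''

for pattern in (r'NetName\:\s+.+', r'NetHandle\:\s+.+', r'Country\:\s+.+'):
    for match in re.findall(pattern, sample):
        print(match)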

Our first step is to add a new XML <site> element by copying and pasting an entire existing <site> entry within the current sites.xml file to use as a skeleton: just copy everything from an opening <site> element to its closing </site> element. Since you're adding the site, you can place it anywhere in the file, but in our case we will put it at the top.

Once this is done, we need to modify the new entry with the changes we currently know. Let's come up with a common name that we can use. The <site> element's "name" parameter is what the tool uses to find a specific site, and it is what we pass with the -s argument to the Automater program. For instance, let's run python Automater.py 11.11.11.11 -s robtex_dns. Here you can see that Automater used the IP address 11.11.11.11 as the target, but it only did discovery on the robtex.com website. This was accomplished by using the -s parameter with the site's friendly name.

 

We will use ns_whois for our friendly name and continue to make modifications to our sites.xml file. We know that this site uses IP addresses as targets, so it will be an ip sitetype. A legal entry for the <sitetype> XML element is one of ip, md5, or hostname; if a site can be used for more than one of these, you can list the extras as additional <entry> XML elements. (You can see an example of this in the standard sites.xml file in the Fortinet categorization site entry.) We also know that the parent domain URL is http://networksolutions.com. The <domainurl> XML element is not functionally used yet, but will be in later versions, so just list the parent domain URL. With this information, we can modify quite a bit of our file as shown.

 

Now let's move down the file to the regex entries, since we know this information, as well as the full URL information. In the <regex> XML element, we list one regex per <entry> XML element. In this case, we want to find three separate pieces of information with the regexes we already defined, so we will have three <entry> elements within our <regex> element. We also know our full URL based on the networksolutions site we visited, and this information is placed in the <fullurl> XML element. However, we can't hard-code the IP address we saw in the full URL, because that would not allow the tool to change the target based on your requirements. Therefore, whenever a target IP address, MD5 hash, or hostname is needed in a query string, or within any post data, you must use the keyword %TARGET%. Automater will replace this text with the required target, in this case 11.11.11.11. Now we have the full URL and regex entries of:


• http://www.networksolutions.com/whois/results.jsp?ip=%TARGET%
• NetName\:\s+.+
• NetHandle\:\s+.+
• Country\:\s+.+


A requirement of Automater is that the <reportstringforresult>, <sitefriendlyname>, and <importantproperty> XML elements have the same number of <entry> elements as the <regex> XML element, which in this case is three. This "same number of <entry> elements" requirement is true for all sites other than a site requiring a certain post; I will post another document discussing that later. For now, we will just copy the current reportstringforresult, sitefriendlyname, and importantproperty entries a couple of times and leave the current information there so you can see what happens. Then we'll modify them based on our assumed requirements.
Our new site entry in the sites.xml file currently looks like the following:

 

Here you can see the use of the %TARGET% keyword in the <fullurl> element as well as the new regex entries in the <regex> element. You can also see that I just copied the <sitefriendlyname> and <reportstringforresult> element information from the robtex entry that we copied and pasted. We did the same for the <importantproperty> XML element, but the entries here will be "Results" most of the time; I will post more on what this field allows later. Let's take a look at running Automater with the current information in the sites.xml file and ensure we only use the networksolutions site by using the -s argument as before, with the new ns_whois friendly name as the argument. Our call will be:


python Automater.py 11.11.11.11 -s ns_whois

Once we run this command, we receive the following information:

 

Notice that the <reportstringforresult> element is shown with the report string. Also notice that the %TARGET% keyword has been replaced with the target address. Now we need to change the <reportstringforresult> element so that we get a better report string for each entry. In this case, we will change the report strings to [+] WHOIS for each entry just to show the change. We will also change the <sitefriendlyname> entries to NetName, NetHandle, and Country so that they are correct; the <sitefriendlyname> element is used in the other reporting capabilities (web and CSV), and I will post something on that later as well. For now, change your sites.xml <reportstringforresult> entries and then see what your report looks like! It should look something like the following screenshot, except that in my case I have also added a few more <entry> elements.
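Pulling the pieces together, a sketch of the finished ns_whois entry, limited to the elements discussed in this post (a real sites.xml entry may carry additional elements), would look roughly like:

<site name="ns_whois">
    <sitetype>
        <entry>ip</entry>
    </sitetype>
    <domainurl>http://networksolutions.com</domainurl>
    <fullurl>http://www.networksolutions.com/whois/results.jsp?ip=%TARGET%</fullurl>
    <regex>
        <entry>NetName\:\s+.+</entry>
        <entry>NetHandle\:\s+.+</entry>
        <entry>Country\:\s+.+</entry>
    </regex>
    <reportstringforresult>
        <entry>[+] WHOIS</entry>
        <entry>[+] WHOIS</entry>
        <entry>[+] WHOIS</entry>
    </reportstringforresult>
    <sitefriendlyname>
        <entry>NetName</entry>
        <entry>NetHandle</entry>
        <entry>Country</entry>
    </sitefriendlyname>
    <importantproperty>
        <entry>Results</entry>
        <entry>Results</entry>
        <entry>Results</entry>
    </importantproperty>
</site>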

Hopefully this helps you understand how extensible the Automater application is now. Simple modifications to sites.xml will give you the ability to collect a massive amount of information from multiple sites based on what you or your team needs, with no Python changes required. Let us know.