Monday, May 13, 2013

Reverse & Hardware Hacking: Hacking apple accessories to pown iDevices

Wake up, Neo! Your phone got pwned!

Apple accessories, especially dock stations and alarm clocks, are becoming more and more popular. Nowadays, it is common to find such devices in hotel rooms. The question is: can we really trust this kind of device? This paper tries to answer that question by analyzing the iDevices' connectivity features, the evasi0n jailbreak and the features of an Apple MFi (Made for iPhone / iPad) accessory.
This talk discusses the most interesting Apple services (from an attacker's point of view) and describes how they can be exploited to retrieve confidential information or to deploy the evasi0n jailbreak. Finally, the author presents the analysis of an MFi dock station and how it can be weaponized to allow an automated jailbreak.

Hacking apple accessories to pown iDevices [Slides][Paper]

Saturday, November 3, 2012

Pentest & Reverse: iOS Application Hacking

iOS Application Hacking

Last month I gave a lecture about iOS application hacking, first at GreHack (Grenoble, France) and then at Hack.Lu (Luxembourg). Here you will find a quick summary of the talk, along with the slides and paper. Don't hesitate to send me your questions.


This talk demonstrates how professional applications such as Mobile Device Management (MDM) clients, confidential content managers (sandboxes), professional media players and other applications handling sensitive data are attacked and sometimes easily breached. It is designed to demonstrate many of the techniques attackers use to manipulate iOS applications in order to extract confidential data from the device. The audience will see examples of the worst practices we deal with every day when pentesting iOS applications, and will learn how to mitigate the risks and avoid the common mistakes that leave applications exposed. Attendees will gain a basic understanding of how these attacks are executed, along with many examples and demonstrations of how to code more securely in ways that won't leave applications exposed to such attacks.

[Paper | Slides]

Saturday, July 28, 2012

Forensics: Recovering data from heavily damaged / scratched DVD, CD

How many times have you lent your CD or DVD collections of movies, pictures or applications to your friends, only to discover when you got them back that they were badly damaged and barely readable?

CDs and DVDs are fragile media. A few scratches here and there and they can easily become unreadable. This blog post presents some of the tools that can help recover data from those spoilt CDs/DVDs.

Recoverdm Toolkit - (recoverdm)
This program helps you recover disks with bad sectors. When it finds sectors that simply cannot be recovered, it writes an empty sector to the output file and continues. If you're recovering a CD or a DVD and the program cannot read a sector in "normal mode", it will try to read the sector in "RAW mode", meaning without error checking etc.

This toolkit also includes a utility called 'mergebad': mergebad merges multiple images into one. This can be useful when you have, for example, multiple CDs with the same data that are all damaged. In such a case, you can first use recoverdm to retrieve the data from each damaged CD into an ISO image file and then combine the images into one with mergebad.

ISO Image

An ISO image is an archive file (disk image) of an optical disc in a conventional ISO (International Organization for Standardization) format. The name "ISO" is taken from the ISO 9660 file system used with CD-ROM media, but an ISO image can also contain a UDF file system.
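A quick way to sanity-check a recovered image is to look for the ISO 9660 magic bytes: the primary volume descriptor sits at sector 16 (byte offset 32768) and starts with a type byte followed by the identifier "CD001". The sketch below forges a dummy image just to illustrate the check (file names are made up):

```shell
# Build a sparse dummy image and plant the ISO 9660 volume descriptor
# identifier, then read it back the way you would check a real image.
dd if=/dev/zero of=dummy.iso bs=1 count=0 seek=32774 2>/dev/null
printf '\x01CD001' | dd of=dummy.iso bs=1 seek=32768 conv=notrunc 2>/dev/null
dd if=dummy.iso bs=1 skip=32769 count=5 2>/dev/null   # prints: CD001
```

On a real recovered image, substitute backupdvd.iso for dummy.iso; if "CD001" is missing, the descriptor sector itself was damaged and carving is your best remaining option.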

Data carving 

According to Wikipedia, "Carving is the practice of searching an input for files or other kinds of objects based on content, rather than on metadata. File carving is a powerful tool for recovering files and fragments of files when directory entries are corrupt or missing, as may be the case with old files that have been deleted or when performing an analysis on damaged media.

Most file carvers operate by looking for file headers and/or footers, and then "carving out" the blocks between these two boundaries." In order to retrieve the files stored on our damaged media, we need a tool able to handle the ISO 9660 and/or UDF file system. At the time of writing, there are only a few file carvers that handle these file systems. Dares is one of them.
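As a toy illustration of the header-based approach (not a real carver, and the data below is fabricated), one can locate a JPEG start-of-image marker (FF D8 FF) in a raw dump and cut from that offset with standard tools:

```shell
# Find the JPEG SOI marker in a raw dump and carve from its offset.
# Real carvers also hunt for the matching footer (FF D9).
printf 'JUNKJUNK\xff\xd8\xff\xe0FAKEJPEGDATA' > dump.bin
offset=$(LC_ALL=C grep -abo $'\xff\xd8\xff' dump.bin | head -n 1 | cut -d: -f1)
echo "SOI marker at byte offset $offset"
dd if=dump.bin of=carved.jpg bs=1 skip="$offset" 2>/dev/null
```

This is exactly the boundary-search idea described above, minus all the robustness a tool like Dares adds (footer matching, fragment handling, file-type coverage).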

Dares - (Windows - Linux)

Dares scans a CD/DVD (or a CD/DVD image) and tries to find files. It does not depend on file system information; instead, it uses the Magic library to identify files. This way, Dares can recover files even when the file system (ISO 9660 or UDF) on the disc is damaged and can no longer be mounted.
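The standard file(1) utility relies on the same Magic signature database, so you can get a feel for content-based identification independently of Dares (the sample bytes below are a hand-made JPEG/JFIF header, not a real picture):

```shell
# libmagic identifies content by signature, not by name or file system:
# these bytes are recognized as JPEG despite the meaningless file name.
printf '\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01\x01' > mystery
file mystery   # reports JPEG image data
```

This is why carving works on a dead file system: the signatures live inside the data itself.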

Case study

In our case we have a heavily damaged DVD. This DVD is supposed to be a backup of pictures but it can't be read anymore. Let's see what we can restore.

Step 1 - Dumping the data

recoverdm -t 30 -i /dev/sr0 -o backupdvd.iso -s 1 -l badsectors.lst -r 1

Step 2 - Carving

dares -i backupdvd.iso -s outputdir

Step 3 - Renaming the files

Since we are looking for JPEG files, we can rename all the extracted files with the following command:

ls -d *.bin | sed 's/\(.*\).bin$/mv "&" "\1.jpg"/' | sh
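To preview what that pipeline will do before running it, drop the trailing `| sh`; sed then only prints the generated mv commands. A sketch with dummy file names:

```shell
# Dry run in a scratch directory: print the mv commands without executing.
mkdir -p renamedemo
touch renamedemo/00000001.bin renamedemo/00000002.bin
(cd renamedemo && ls -d *.bin | sed 's/\(.*\).bin$/mv "&" "\1.jpg"/')
```

Once the output looks right, append `| sh` to actually perform the renames.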

Friday, March 25, 2011

Pentest: Audit your website security with Metasploit WMAP plugin and the HTTP Crawler Module

WMAP is implemented as a Metasploit plugin and depends on an active database to function. The database is used to store a list of target URLs as well as the results of the WMAP modules. To get started with WMAP, the database needs to be configured and at least one target must be added. In most situations, you would bring target data into WMAP through a spider, proxy, or export from another tool. In the example below we will use the msf http crawler module to add a target and demonstrate the process.

Install these packages:

sudo apt-get install libxml-ruby libxml2-dev
sudo apt-get install libxslt-ruby libxslt-dev
sudo apt-get install libnokogiri-ruby
gem install robots
gem install nokogiri 
sudo gem install anemone

Start the Metasploit Framework

Open the Metasploit Framework Console (msfconsole):
$ ./msfconsole

#    # ###### #####   ##    ####  #####  #       ####  # #####
##  ## #        #    #  #  #      #    # #      #    # #   #
# ## # #####    #   #    #  ####  #    # #      #    # #   #
#    # #        #   ######      # #####  #      #    # #   #
#    # #        #   #    # #    # #      #      #    # #   #
#    # ######   #   #    #  ####  #      ######  ####  #   #

       =[ metasploit v3.7.0-dev [core:3.7 api:1.0]
+ -- --=[ 669 exploits - 345 auxiliary
+ -- --=[ 217 payloads - 27 encoders - 8 nops
       =[ svn r12131 updated today (2011.03.25)

msf > 
Select the database driver (for this tutorial I use the sqlite3 driver, but if you have a PostgreSQL instance up you can use it with db_driver postgresql).
msf > db_driver sqlite3
[*] Using database driver sqlite3
Create a database
msf > db_connect wmap_test
[-] Note that sqlite is not supported due to numerous issues.
[-] It may work, but don't count on it
[*] Creating a new database file...
[*] Successfully connected to the database
[*] File: wmap_test

Crawling the target

Load the scanner
msf > use scanner/http/crawler
msf auxiliary(crawler) > show options

Module options (auxiliary/scanner/http/crawler):

   Name         Current Setting  Required  Description
   ----         ---------------  --------  -----------
   MAX_MINUTES  5                yes       The maximum number of minutes to spend on each URL
   MAX_PAGES    500              yes       The maximum number of pages to crawl per URL
   MAX_THREADS  4                yes       The maximum number of concurrent requests
   Proxies                       no        Use a proxy chain
   RHOST                         yes       The target address
   RPORT        80               yes       The target port
   URI          /                yes       The starting page to crawl
   VHOST                         no        HTTP server virtual host

msf auxiliary(crawler) > 

Define target
msf auxiliary(crawler) > set RHOST
msf auxiliary(crawler) > set RPORT 443

Launch the scan
msf auxiliary(crawler) > run
[*] Crawling
[*] [00001/00500]    200 - -
[*]                         FORM: POST /index.asp
[*] [00002/00500]    200 - -
[*]                         FORM: GET /index.asp
[*]                         FORM: POST /index.asp
[*] [00003/00500]    200 - -
[*]                         FORM: GET /index.asp
[*]                         FORM: POST /index.asp
[*] [00004/00500]    200 - -
[*]                         FORM: POST /index.asp
[*] Crawl of complete
[*] Auxiliary module execution completed

Load the WMAP plugin
msf > load wmap
[*] [WMAP 1.0] ===  et [  ] 2011
[*] Successfully loaded plugin: wmap
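The session so far (database setup, crawling, loading WMAP) can also be captured in a Metasploit resource file and replayed non-interactively; the file name and target below are placeholders:

```shell
# Write the console commands to a resource file; msfconsole -r replays
# each line as if typed at the msf > prompt.
cat > wmap.rc <<'EOF'
db_driver sqlite3
db_connect wmap_test
use scanner/http/crawler
set RHOST XXX.XXX.XXX.XXX
set RPORT 443
run
load wmap
wmap_sites -l
EOF
```

Run it with `./msfconsole -r wmap.rc`.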

Check the crawler results
msf > wmap_sites -l
Available sites

     Id  Host             Vhost  Port  # Pages  # Forms
     --  ----             -----  ----  -------  -------
     0   XXX.XXX.XXX.XXX         443   4        3

Note: If a test module requires a specific parameter to be set or we want to modify the value of a predefined variable,
this can be done with the 'setg' command.
msf > setg VHOST
msf > setg DOMAIN
msf > setg EXT .asp

Select the target

msf > wmap_targets -t XXX.XXX.XXX.XXX:443

To view the targets:
msf > wmap_targets -l
Defined targets

     Id  Vhost  Host             Port  SSL   Path
     --  -----  ----             ----  ---   ----
     0          XXX.XXX.XXX.XXX  443   true

Running WMAP modules

Now that a target has been selected, you can obtain a list of the available WMAP modules with the wmap_run -t command:

msf > wmap_run -t
[*] Testing target:
[*]  Site: (XXX.XXX.XXX.XXX)
[*]  Port: 443 SSL: true
[*] Testing started. Fri Mar 25 14:12:23 +0100 2011
=[ SSL testing ]=
[*] Loaded auxiliary/scanner/http/ssl ...
[*] Loaded auxiliary/scanner/http/cert ...

=[ Web Server testing ]=
[*] Loaded auxiliary/scanner/http/verb_auth_bypass ...
[*] Loaded auxiliary/scanner/http/robots_txt ...
[*] Loaded auxiliary/admin/http/tomcat_administration ...
[*] Loaded auxiliary/scanner/http/webdav_internal_ip ...
[*] Loaded auxiliary/scanner/http/webdav_website_content ...
[*] Loaded auxiliary/scanner/http/http_version ...
[*] Loaded auxiliary/scanner/http/frontpage_login ...
[*] Loaded auxiliary/admin/http/tomcat_utf8_traversal ...
[*] Loaded auxiliary/scanner/http/webdav_scanner ...
[*] Loaded auxiliary/scanner/http/web_vulndb ...
[*] Loaded auxiliary/scanner/http/vhost_scanner ...
[*] Loaded auxiliary/scanner/http/options ...
[*] Loaded auxiliary/scanner/http/open_proxy ...
[*] Loaded auxiliary/scanner/http/svn_scanner ...

=[ File/Dir testing ]=
[*] Loaded auxiliary/scanner/http/ms09_020_webdav_unicode_bypass ...
[*] Loaded auxiliary/scanner/http/files_dir ...
[*] Loaded auxiliary/scanner/http/replace_ext ...
[*] Loaded auxiliary/scanner/http/dir_webdav_unicode_bypass ...
[*] Loaded auxiliary/scanner/http/copy_of_file ...
[*] Loaded auxiliary/scanner/http/file_same_name_dir ...
[*] Loaded auxiliary/scanner/http/dir_listing ...
[*] Loaded auxiliary/scanner/http/brute_dirs ...
[*] Loaded auxiliary/scanner/http/writable ...
[*] Loaded auxiliary/scanner/http/prev_dir_same_name_file ...
[*] Loaded auxiliary/scanner/http/dir_scanner ...
[*] Loaded auxiliary/scanner/http/backup_file ...
[*] Loaded auxiliary/scanner/http/trace_axd ...

=[ Unique Query testing ]=
[*] Loaded auxiliary/scanner/http/error_sql_injection ...
[*] Loaded auxiliary/scanner/http/blind_sql_query ...

=[ Query testing ]=

=[ General testing ]=
[*] Analysis completed in 52.9915919303894 seconds.
[*] Done.

For help, run wmap_run -h.
If you would like to limit the WMAP test to a specific set of modules, you can use a profile file. Profiles are specified as an additional argument to the wmap_run -e command:

msf > wmap_run -e path/to/profile/file

The profile file contains the list of modules to execute. See data/wmap/wmap_sample.profile for a sample.

To launch the modules, execute wmap_run with the -e parameter:
msf > wmap_run -e
[*] Using ALL wmap enabled modules.
[*] Testing target:
[*]  Site: (XXX.XXX.XXX.XXX)
[*]  Port: 443 SSL: true
[*] Testing started. Fri Mar 25 14:14:33 +0100 2011

Currently, the results of the WMAP scan are stored in the database.
The database can be used to build custom reporting tools, or queried directly from the console:

msf > db_notes
[*] Time: Fri Mar 25 13:15:21 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=http.vhost data={:name=>""}
[*] Time: Fri Mar 25 13:15:21 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=ssl.certificate data={:cn=>"", :subject=>[["serialNumber", "xxxxxxxxxxxxxxxxxxxxxxxxxx/xxxxx", xx], ["C", "US", 19], ["O", "", 19], ["OU", "TX", 19], ["OU", "See (c)11", 19], ["OU", "Domain Control Validated - QuickSSL(R) Premium", 19], ["CN", "", 19]], :algorithm=>"sha1WithRSAEncryption"}
[*] Time: Fri Mar 25 13:15:38 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=HTTP_OPTIONS data="OPTIONS, TRACE, GET, HEAD, POST"
[*] Time: Fri Mar 25 13:23:19 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=FILE data="/intro.htm Code: 200"
[*] Time: Fri Mar 25 13:33:15 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=FILE data="/css Code: 301"
[*] Time: Fri Mar 25 13:33:24 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=FILE data="/images Code: 301"
[*] Time: Fri Mar 25 13:33:37 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=FILE data="/script Code: 301"
[*] Time: Fri Mar 25 13:34:23 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=FILE data="/script Code: 404"
[*] Time: Fri Mar 25 13:44:58 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=DIRECTORY data="/css/ Code: 403"
[*] Time: Fri Mar 25 13:45:29 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=DIRECTORY data="/images/ Code: 403"
[*] Time: Fri Mar 25 13:46:00 UTC 2011 Note: host=XXX.XXX.XXX.XXX service=https type=DIRECTORY data="/script/ Code: 403"

msf > db_vulns
[*] Time: Fri Mar 25 13:15:40 UTC 2011 Vuln: host=XXX.XXX.XXX.XXX port=443 proto=tcp name=HTTP-TRACE-ENABLED refs=BAhbByIIQ1ZFIg4yMDA1LTMzOTg=

msf >
The vulnerability reference is Base64-encoded, so we need to decode it. We can use openssl for this.
msf > echo "BAhbByIIQ1ZFIg4yMDA1LTMzOTg=" | openssl base64 -d
[*] exec: echo "BAhbByIIQ1ZFIg4yMDA1LTMzOTg=" | openssl base64 -d

msf >
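The decoded bytes are a Ruby Marshal dump, so the raw output mixes control characters with the readable fields; filtering to printable characters makes the reference legible:

```shell
# Decode the reference and keep printable characters only; the readable
# fields spell out the CVE identifier.
echo "BAhbByIIQ1ZFIg4yMDA1LTMzOTg=" | openssl base64 -d | tr -cd '[:print:]'
# → ["CVE"2005-3398  (i.e. CVE-2005-3398)
```

CVE-2005-3398 is consistent with the HTTP-TRACE-ENABLED finding reported by db_vulns above.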

We can now use this information to gather further information on the reported vulnerability.
As pentesters, we would want to investigate each finding further and identify if there are potential methods for attack.

To get the CVE details, we can use our friend Google.