Archive | November, 2011

Rec Studio 4 – Reverse Engineering Compiler & Decompiler


REC Studio is an interactive decompiler. It reads a Windows, Linux, Mac OS X or raw executable file, and attempts to produce a C-like representation of the code and data used to build the executable file. It has been designed to read files produced for many different targets, and it has been compiled on several host systems.

REC Studio 4 is a complete rewrite of the original REC decompiler. It uses more powerful analysis techniques such as partial Static Single Assignment (SSA), can load Mac OS X files and supports both 32- and 64-bit binaries.
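SSA renaming is easy to illustrate on straight-line code. The sketch below is a hypothetical toy, not REC's actual implementation: each assignment creates a fresh version of its destination variable, so every variable ends up defined exactly once (no phi nodes are needed here because there is no control flow).

```python
def to_ssa(code):
    """Rename variables in straight-line three-address code so each
    variable is assigned exactly once (basic SSA renaming)."""
    version = {}          # variable -> current SSA version number
    ssa = []
    for dest, op, args in code:
        # Uses refer to the latest version of each source variable;
        # bare numeric literals are passed through unchanged.
        new_args = tuple(f"{a}{version.get(a, 0)}" if a.isalpha() else a
                         for a in args)
        # Each assignment creates a fresh version of the destination.
        version[dest] = version.get(dest, 0) + 1
        ssa.append((f"{dest}{version[dest]}", op, new_args))
    return ssa

prog = [("x", "+", ("a", "b")),   # x = a + b
        ("x", "*", ("x", "2")),   # x = x * 2  (x is reassigned)
        ("y", "-", ("x", "a"))]   # y = x - a
```

After renaming, the second assignment defines `x2` and the final subtraction reads `x2` and `a0`, which makes data-flow analysis (the point of using SSA in a decompiler) much simpler.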

Although still under development, it has reached a stage that makes it more useful than the old Rec Studio 2.


  • Multihost: Rec Studio runs on Windows XP/Vista/7, Ubuntu Linux, Mac OS X.
  • Symbolic information support using Dwarf 2 and partial recognition of Microsoft’s PDB format.
  • C++ is partially recognized: mangled names generated by gcc are demangled, and inheritance described in DWARF 2 is honored. However, C++ is a very broad and difficult language, so some features, such as templates, will likely never be supported.
  • Types and function prototype definitions can be specified in text files. Definitions for some standard POSIX and Windows APIs are already provided in the Rec Studio package.
  • Interactivity is supported, though limited to the definition of sections, labels and function entry points. It will need to be improved to support in-program definition of types and function parameters.

Although REC can read Win32 executable (aka PE) files produced by Visual C++ or Visual Basic 5, there are limitations on the output produced. REC will try to use whatever information is present in the .EXE symbol table. If the .EXE file was compiled without debugging information, if a program database file (.PDB) or CodeView (C7) format was used, or if the compiler's optimization option was enabled, the output produced will not be very good. Moreover, Visual Basic 5 executable files are a mix of subroutine code and form data. It is almost impossible for REC to determine which is which. The only option is to use a .cmd file and manually specify which areas are code and which are data.

You can download Rec Studio 4 here:

Windows –
Ubuntu – RecStudioLinux.tgz
Mac – RecStudioMac.tgz

Or read more here.

Posted in: Exploits/Vulnerabilities, Forensics, Programming


13 Out Of 15 Popular CAPTCHA Schemes Vulnerable To Automated Attacks


This is not a real shock to me if I'm perfectly honest; I only use reCAPTCHA whenever I need a CAPTCHA implementation for anything.

And even then it's not totally safe, as apparently you can farm out your CAPTCHA cracking (those that fail the automated attempts) to India for a few dollars. It does help cut down on sign-ups and bot spam – but it's certainly not foolproof.

The report just reinforces my stance, proving that 13 out of 15 popular CAPTCHA schemes could be cracked with automated software.

Security researchers have discovered the vast majority of text-based anti-spam tests are easily defeated.

Computer scientists from Stanford University discovered 13 of 15 CAPTCHA schemes from popular websites were vulnerable to automated attacks. The CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) has been used for several years to prevent automated sign-ups to webmail accounts or online forums in order to block spam bots. Surfers are typically asked during a registration process to identify distorted letters as depicted in an image. A variety of other approaches – including pictures of cats, audio clips and calculus puzzles – have been applied to the problem over the years.

Cybercrooks have responded to the challenge posed by CAPTCHAs by devising techniques that typically involve semi-automatically signing up for new accounts, while relying on the human cogs in 21st century sweatshops – typically located in India – to solve the CAPTCHA puzzles themselves.

The Stanford team, by contrast, looked at whether it was possible to fully automate the process of breaking CAPTCHAs. Their techniques included removing deliberately introduced image background noise and breaking text strings into single characters for easier recognition. The team built an automated tool, called Decaptcha, that applied these various tricks. The approach was partially inspired by techniques used to orientate robots in unknown environments.
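The character-splitting step described above can be sketched in a few lines. This is a hypothetical simplification (the function name and bitmap format are my own, not Decaptcha's): it splits a clean monochrome bitmap on empty columns, while the real tool also has to denoise, de-rotate and then classify each resulting glyph.

```python
def segment(bitmap):
    """Split a monochrome CAPTCHA bitmap (a list of equal-length
    strings, '#' = ink, '.' = background) into per-character
    sub-images by cutting at columns that contain no ink."""
    width = len(bitmap[0])
    # For each column, record whether any row has an ink pixel.
    ink = [any(row[c] == '#' for row in bitmap) for c in range(width)]
    glyphs, start = [], None
    for c, has_ink in enumerate(ink):
        if has_ink and start is None:
            start = c                      # a glyph begins here
        elif not has_ink and start is not None:
            glyphs.append([row[start:c] for row in bitmap])
            start = None                   # empty column ends the glyph
    if start is not None:                  # glyph runs to the right edge
        glyphs.append([row[start:] for row in bitmap])
    return glyphs

captcha = ["#..##",
           "#..#.",
           "#..##"]
# segment(captcha) yields two glyphs: the left bar and the right block
```

Once a CAPTCHA is cut into single characters like this, each glyph can be fed to an ordinary character recognizer, which is far easier than recognizing the whole distorted string at once.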

It’s interesting to see an academic take on this subject though as it’s usually the realm of blackhats and hobbyists. I’m sure with a fair bit of science they did an excellent job at removing the ‘noise’ that most CAPTCHA systems tend to add to the image to try and foil automatic solving.

I’m also glad to see reCAPTCHA once again stood up well to automated cracking; you’d have to rely on the sweatshops to get past that.

The worst seems to be Visa’s – which is surprising, and also sad, given that it’s dealing with banking.

Decaptcha was turned against the challenge response CAPTCHAs used by 15 high-profile websites, enjoying excellent bowling figures against the majority.

For example, Visa’s payment gateway CAPTCHA was defeated 66 per cent of the time. eBay’s CAPTCHA was sidestepped 43 per cent of the time. Lower, but still workable, bypass rates were achieved against Wikipedia, Digg and CNN.

Google and reCAPTCHA were the only two CAPTCHA systems that consistently thwarted Decaptcha during the tests. and Digg have both switched to reCAPTCHA since these tests were run, Computerworld adds.

In a research paper (PDF), the Stanford team suggest several approaches towards making CAPTCHAs harder to beat, including making the length of a text string changeable and randomising character font and size. Lines in the background of CAPTCHAs might also prove effective. In addition, the Stanford team highlighted features that are ineffective against automated attacks but may counter the activities of humans.
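As a rough sketch of the first two suggestions (variable string length, randomised font and size), a challenge generator might look like the following. The font names and size range here are illustrative placeholders of my own, not values taken from the paper.

```python
import random
import string

def make_challenge(rng, min_len=5, max_len=9):
    """Generate CAPTCHA text of unpredictable length, pairing each
    character with a randomly chosen font and point size, so an
    automated solver cannot assume a fixed layout."""
    fonts = ["serif", "sans", "mono"]      # placeholder font pool
    n = rng.randint(min_len, max_len)      # length varies per challenge
    return [(rng.choice(string.ascii_uppercase),  # the character
             rng.choice(fonts),                   # its font
             rng.randint(18, 30))                 # its point size
            for _ in range(n)]
```

A renderer would then draw each `(char, font, size)` triple onto the image; varying all three per glyph defeats segmenters that rely on uniform character width.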

The researchers, Elie Bursztein, Matthieu Martin and John C. Mitchell, who previously developed techniques for breaking audio CAPTCHAs, presented their latest research at the recent ACM Conference On Computer and Communication Security in Chicago.

Fortunately, both and Digg have switched to reCAPTCHA since this report came out, making them safer; it’s probably a case of responsible disclosure by the Stanford scientists.

It’s definitely worth a read if you have anything to do with CAPTCHA implementation and especially relevant if you are thinking of developing your own rather than just using something like reCAPTCHA.

You can grab the full report here:


Source: The Register

Posted in: Exploits/Vulnerabilities, Web Hacking


DirBuster – Brute Force Directories & Files Names


DirBuster is another great tool from the OWASP chaps; it’s basically a multi-threaded Java application designed to brute force directory and file names on web/application servers. It is often the case now that what looks like a web server in a state of default installation is actually not, and has pages and applications hidden within. DirBuster attempts to find these.

However, tools of this nature are often only as good as the directory and file list they come with. A different approach was taken to generating this list: it was built from scratch, by crawling the Internet and collecting the directories and files that are actually used by developers! DirBuster comes with a total of 9 different lists (further information can be found below), which makes DirBuster extremely effective at finding those hidden files and directories. And if that was not enough, DirBuster also has the option to perform a pure brute force, which leaves hidden directories and files nowhere to hide! If you have the time ;)
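The two modes described above (list-based probing and pure brute force) can be sketched as candidate generators. The function names and extension set are illustrative guesses of my own, and the actual HTTP requesting of each path is left out of the sketch.

```python
from itertools import product
import string

def wordlist_candidates(words, extensions=("", ".php", ".bak")):
    """List-based mode: expand a crawled wordlist into candidate
    paths by trying each name with a set of common extensions."""
    return [f"/{w}{ext}" for w in words for ext in extensions]

def brute_force_candidates(length, alphabet=string.ascii_lowercase):
    """Pure brute-force mode: every possible name of a given length.
    The candidate count grows as len(alphabet)**length, which is why
    this mode only pays off 'if you have the time'."""
    return ["/" + "".join(chars) for chars in product(alphabet, repeat=length)]
```

A scanner would then request each candidate against the target and report anything that does not return a 404, since a 200 (or even a 401/403) reveals content the site never linked to.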

What DirBuster can do for you

– Attempt to find hidden pages/directories and directories within a web application, thus giving another attack vector (for example, finding an unlinked administration page).

What DirBuster will not do for you

– Exploit anything it finds. This is not the purpose of DirBuster. DirBuster’s sole job is to find other possible attack vectors.

How does DirBuster help in the building of secure applications?

– By finding content on the web server or within the application that is not required.
– By helping developers understand that simply not linking to a page does not mean it cannot be accessed.

You can download DirBuster here:


Or read more here.

Posted in: Hacking Tools, Web Hacking
