10 October 2011 | 16,969 views

New Research Shows Facebook’s URL Scanner Is Vulnerable To Cloaking


Oh look, Facebook security (or insecurity) is in the news again – not that this technique is anything revolutionary or ground-breaking.

It’s basically an HTTP request detection trick played on the Facebook URL scanner (the thing that generates the preview/thumbnail etc. for links posted to Facebook). By spotting the scanner’s requests – via telltale headers like the User-Agent, or the source IP – you can feed it something benign, but when a normal user comes along, feed them malware instead.

So be careful what you click on Facebook, Google+ or anything else that shows you a preview but doesn’t really show you the URL or what’s on the page.

Members of a hacking think-tank called Blackhat Academy claim that Facebook’s URL scanning systems can be tricked into thinking malicious pages are clean by using simple content cloaking techniques.

Such attacks involve Web pages filtering out requests that come from specific clients and feeding them content that is different from what is displayed to regular users.

Attackers have been using this method to poison search results on Google for years now by serving keyword-filled pages to its indexing robot, but redirecting visitors to malware when they click on the links. However, it turns out that Facebook is also vulnerable to this type of content forging. “Hatter,” one of the Blackhat Academy members, provided a live demonstration, which involved posting the URL to a JPEG file on a wall.

Facebook crawled the URL and added a thumbnail image to the wall post; however, clicking the corresponding link actually redirected users to YouTube. This happened because the destination page identified Facebook’s original request and served it a JPEG file.
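The decision behind the demo can be sketched in a few lines of Python. The “facebookexternalhit” User-Agent substring and the response shapes below are illustrative assumptions, not a statement of how the demo page was actually written:

```python
# Sketch of the cloaking decision in the Blackhat Academy demo.
# Assumption (illustrative): Facebook's link scanner identifies
# itself with a "facebookexternalhit" User-Agent string.

def respond(user_agent: str) -> dict:
    """Pick a response based on who appears to be asking."""
    if "facebookexternalhit" in user_agent.lower():
        # The scanner is served a harmless JPEG, so the wall post
        # gets an innocuous-looking thumbnail.
        return {"status": 200, "content_type": "image/jpeg"}
    # Ordinary visitors are sent elsewhere (YouTube in the demo).
    return {"status": 302, "location": "https://www.youtube.com/"}
```

A real page would make this decision in the web server or application layer; the point is simply that the scanner’s request and a browser’s request never have to see the same content.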

“While most major sites that allow link submission are vulnerable to this method, sites including Websense, Google+ and Facebook make the requests easily identifiable,” the Blackhat Academy hackers said.

This kind of technique is VERY popular in the Blackhat SEO world, or at least it was back in the day – you could feed pages to the search engines that weren’t really human readable, but they were perfect in terms of link density, keywords and so on for Google and other search engines.

When humans visited, they’d get the normal page – when search bots visited, they’d get a specially tailored version to hike the page up in the rankings. I’m not sure if it still goes on (Google is a hell of a lot smarter now) – but I’d be surprised if it’s totally gone.

Websense, of course, are claiming that it doesn’t really affect them due to all the l33t techniques they use to filter URLs… cool story bro.

“These sites send an initial request to the link in order to store a mirror thumbnail of the image, or a snapshot of the website being linked to. In doing so, many use a custom user agent, or have IP addresses that resolve to a consistent domain name,” they explained.
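The fingerprinting that quote describes can be sketched as below. The User-Agent substring and the crawler domain suffixes are assumptions for illustration, not a vetted list:

```python
import socket

# Sketch of identifying a link scanner's initial request, per the
# quote above: a custom User-Agent, or a source IP whose reverse DNS
# resolves under a consistent crawler domain. The UA substring and
# domain suffixes here are illustrative assumptions.

def looks_like_scanner(ip: str, user_agent: str) -> bool:
    """Return True if the request looks like a link-preview crawler."""
    if "facebookexternalhit" in user_agent.lower():
        return True
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
    except OSError:
        return False  # no reverse record: treat as a normal visitor
    # Hypothetical crawler domains for the sake of the sketch.
    return hostname.endswith((".tfbnw.net", ".fbsv.net"))
```

Once a page can answer that question per request, serving the scanner clean content and everyone else something different is trivial, which is exactly the weakness Blackhat Academy is pointing at.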

Earlier this week, Facebook signed a partnership with Websense to use the security vendor’s cloud-based, real-time Web scanner for malicious URL detection. Blackhat Academy has now provided proof-of-concept code, which, according to its advisory, can be used to bypass it.

Websense doesn’t believe that to be the case. “This is nothing new. We use numerous methodologies and systems to ensure that our analysis of content (in real time) is not manipulated by malware authors, including using IP addresses not attributable to Websense so that malware authors are unaware that it is Websense analyzing the content,” the company said.

“Also, the Websense ThreatSeeker Network is fed via an opt-in feedback loop from tens of thousands of customers distributed globally. These IPs are also not attributable to Websense.com. It is because of technologies like this that Facebook chose Websense to provide protection for their growing user base of more than 750 million users,” it added.

That could well be true, but it’s worth keeping in mind that Websense primarily sells security solutions to businesses, and Facebook is blocked on many corporate networks. A feedback loop built from its customers’ appliances may therefore see relatively little of the URLs actually posted to the social network.

I know Facebook have signed the agreement, but have they started using Websense filtering yet? We did write something about their collaboration last year – Websense Offers Facebook Users Free ‘Firewall’ Service.

Well, if it keeps Facebook users safe from malware and stops us having to fix more computers for our friends and relatives, it’s good in my book.

We’ll have to wait and see, though, once it’s fully implemented, whether it stops the next round of Facebook malware from sprouting and running riot.

Source: Network World


