Archive for the Hacking Category

Shortly before entering the Inland Regional Center in San Bernardino, California and opening fire, killing 14 people and injuring another 20, the shooters — Syed Rizwan Farook and Tashfeen Malik — discarded their cell phones and their laptop’s hard drive. While the hard drive has not been located, the cell phones turned up in a dumpster near the terrorists’ rented home.

Four hours after the attack, Farook and Malik were killed in a gun battle with FBI agents. Unfortunately, they were shot before anybody got a chance to ask Farook what the four-digit lock code on his iPhone was. Oops.

An iPhone, when configured to do so, will back itself up to Apple’s iCloud when connected to an approved WiFi hotspot. Farook’s iPhone was configured to do this, but hadn’t been backed up in six weeks. To access the data on the phone, all the FBI needed to do was take the phone to a pre-approved WiFi network (say, Farook’s house or work) and turn the phone on. The phone would have backed itself up to iCloud, and the FBI would have been able to file a subpoena to obtain the (unencrypted) data from Apple.

But that’s not what they did. Instead, an FBI agent had the password to the associated iCloud account reset. The problem is that once the password changed, the phone could no longer sync with iCloud until someone unlocked it and entered the new password on it. In other words, a random FBI agent who knew nothing about how iCloud works (he could have asked any 13-year-old) locked the FBI out of the phone’s backups with this one single (dumb) action.

The FBI’s backup plan was to have Apple unlock the terrorist’s phone. First, they politely asked if Apple would break into the phone for them. Apple politely declined. Then the FBI took Apple to court and, when Apple still refused to cooperate, the Department of Justice stepped in as well, citing the All Writs Act (part of the Judiciary Act of 1789). Apple continued to drag their feet on the request.

And, for clarification, what the FBI was asking Apple to do was create a custom version of iOS with a backdoor in it that would allow them to bypass the security code. Because, nothing bad could possibly come from developing that. The government promised that it would only be used one time in a controlled environment, because of course they would promise that.

This story has freedom of speech, citizens’ rights, the right to encryption (and privacy from the government), the FBI vs. Apple, terrorists, murder… all they had to do was throw in a Star Wars reference and a video game and it would have been perfect!

From day one, I told my wife “the FBI does not need Apple to get into that phone. They will get in, regardless. This is a PR stunt.” My wife thinks I’m crazy (and not just because of this theory). Any time the FBI makes a public release, it’s for a reason. The stuff they don’t want you to know about, you don’t know about. The stuff they do want you to know about makes the news.

Think of it this way: if Apple were to cave, it’s a lose/lose. Apple loses because it makes them look like they are catering to the government at the expense of their customers’ privacy. And the FBI loses twice: first, they look weak by not being able to break into a single phone, and second, they look like bullies. But if Apple were to stand up to the FBI and refuse to unlock the phone and the FBI were eventually able to unlock it on their own, that would be a win/win! Apple becomes the valiant defender of encryption and customer rights, while the FBI ends up looking like uber-hackers!

And, of course, that’s exactly what happened. On Monday, the FBI withdrew their case against Apple and said “thanks, bro, but we got in anyway.”

Above is a video of the XPIN CLIP in action attacking an iPhone running iOS 7.x. What the device on the left is doing is sequentially sending passcodes to the phone. If you want to jump to the 3:30 mark you’ll see it send 1230, 1231, 1232, and 1233 before unlocking the phone with the correct code, 1234. Apple fixed this hole in iOS 8. A few weeks later, someone released a new device that worked against iPhones running iOS 8. Apple fixed that hole in iOS 9. It wouldn’t take a complete leap of faith to say that there’s a new device out there that works on the latest iPhone operating system.
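For the curious, the logic inside a box like this is about as dumb as it gets: try every code in order until one works. Here’s a minimal sketch of that loop in Python; the try_passcode() function is a stand-in for whatever hardware trickery the box uses to actually feed a guess to the phone, which is the part I obviously can’t reproduce here.

def brute_force_pin(try_passcode):
    """Try every four-digit passcode, 0000 through 9999, until one unlocks the phone.

    try_passcode() is a hypothetical callback standing in for the hardware that
    actually submits a guess to the phone and reports whether it unlocked.
    """
    for guess in range(10000):
        code = f"{guess:04d}"      # zero-pad: 0 -> "0000", 1234 -> "1234"
        if try_passcode(code):
            return code            # found it
    return None                    # exhausted all 10,000 combinations

At 10,000 possible codes, even a slow attempt rate gets there eventually, which is exactly why Apple added the 10-guess wipe and the escalating delays between attempts.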

But the terrorist’s phone had the security feature enabled that would wipe his phone after 10 incorrect guesses. Welp…

This is the IP Box unlocking an iPhone running iOS 8. The IP Box utilized an exploit that prevented the iPhone from recognizing incorrect guesses by pressing two buttons at the same time. Rumor has it that the newer versions of this box (available for around $200) can cut the power to the phone immediately after each attempt to prevent the phone from logging the incorrect guesses. It takes longer, extending the maximum amount of time from hours to days (but not weeks), but if you’re just dealing with one phone, that’s not too bad.
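To put “hours to days” in perspective, here’s a quick back-of-the-envelope estimate in Python. The 40-seconds-per-attempt figure is my own assumption (enough time to submit a guess, cut the power, and let the phone reboot), not a published spec for the IP Box:

codes = 10_000                     # every possible four-digit passcode
seconds_per_attempt = 40           # assumed: submit guess + cut power + reboot
worst_case_seconds = codes * seconds_per_attempt
print(worst_case_seconds / 3600)   # ~111 hours
print(worst_case_seconds / 86400)  # ~4.6 days

Days, not weeks. Slow, but tolerable if you only have the one phone to crack.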

For now, this story is over (although you can bet Apple already has people trying to figure out how the FBI got into iOS 9, and will be patching that hole in an inevitable, soon-to-be-released update). Apple politely asked the FBI how they did it; the FBI politely refused to offer up that information. In the end, Apple won by not backing down, and the FBI won by gaining access to the terrorists’ selfies. The terrorists lost, but they were already dead so having their phone compromised is really just a parting gift.

The rest of us are stuck in the middle, hoping that the private information on our phones, computers, and stored in the cloud remains private.


If you think you don’t need to read this post, you definitely need to read this post.

Heartbleed is a security vulnerability that was discovered this week. It probably affects you. First, the five W’s:

Who: Anyone who uses the web and uses https links. That’s probably you.
What: Heartbleed is a vulnerability that allows people to see the information you send to some websites that use OpenSSL. It’s a lot of them.
Where: Gmail, Yahoo, Tumblr, Flickr, Facebook…
When: The problem has been around for two years now, but nobody noticed it until this week.
Why: Honest human error.

You’ve probably noticed the letters “HTTP” preceding most web links. HTTP stands for “hypertext transfer protocol,” and by putting that in front of a web link you’re telling your web browser “Hey, what comes next is going to be a web page.” It’s kind of like saying, “the following message will be in English.”

Sometimes, you’ll see HTTPS instead. The S stands for “secure sockets layer” (or SSL for short), but you can think of that S as simply meaning “secure”. When you use HTTP, the things you read and send across the internet are sent in plain text. That means anyone with the means to do so who is looking and listening for your message can read what you are sending and receiving. With HTTPS, what you send and receive to and from websites is secure and encrypted. Even if someone were to intercept your message, if you are using HTTPS, the information would look scrambled and no one would be able to read it. This is why websites like Gmail and Facebook and your bank’s website default to HTTPS — because it’s secure.
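If you want to see what that “S” buys you in code, here’s a small Python sketch. Both halves ask a web server for the same page; the only difference is that the second wraps the connection in SSL/TLS before anything is sent, so anyone listening in sees scrambled bytes instead of your request and the response. The hostname is just a placeholder.

import socket
import ssl

HOST = "example.com"   # placeholder hostname
REQUEST = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n".encode()

# Plain HTTP (port 80): the request and response cross the wire as readable text.
with socket.create_connection((HOST, 80)) as plain:
    plain.sendall(REQUEST)
    print(plain.recv(200))          # anyone in the middle could read this too

# HTTPS (port 443): the same request, wrapped in an encrypted SSL/TLS tunnel.
context = ssl.create_default_context()
with socket.create_connection((HOST, 443)) as raw:
    with context.wrap_socket(raw, server_hostname=HOST) as secure:
        secure.sendall(REQUEST)
        print(secure.recv(200))     # decrypted for us; gibberish to an eavesdropper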

Or, so we thought. Turns out, back in early 2012, someone made a mistake while updating OpenSSL. A big one. Well-known security expert Bruce Schneier said on his website this week, “on a scale of 1 to 10, this is an 11.” This bug, which again was introduced in 2012, allows/allowed hackers to read information in certain HTTPS transfers. One frustrating thing about this bug is that there’s no way for server owners to know if people were exploiting it against them or not; all they can tell is whether they were vulnerable. And it turns out, a lot of websites were vulnerable.

The good news is Heartbleed only lets attackers view a small portion of memory at a time, so there’s a chance nobody ever saw your password. The bad news is, this vulnerability has been around for two years now, so there’s no telling if you were affected or not.
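The mistake itself is almost embarrassingly simple. OpenSSL’s “heartbeat” feature lets a client say “here are N bytes, echo them back to me,” and the buggy code trusted the client’s claimed length instead of checking how many bytes actually arrived. Here’s a toy Python simulation of that mistake (the real bug lives in OpenSSL’s C code; this just shows the shape of it):

# Toy simulation of the Heartbleed bug: trusting a client-supplied length.
SERVER_MEMORY = bytearray(b"...session keys...passwords...cookies..." * 50)

def heartbeat_response(payload, claimed_length):
    # Vulnerable behavior: the payload sits next to other data in memory, and we
    # echo back 'claimed_length' bytes without checking the payload's real size.
    buffer = bytearray(payload) + SERVER_MEMORY
    return bytes(buffer[:claimed_length])

print(heartbeat_response(b"hat", 3))     # honest request: b'hat'
print(heartbeat_response(b"hat", 500))   # Heartbleed-style: b'hat...session keys...passwords...'

The real bug let an attacker pull up to 64 KB of the server’s memory per heartbeat, over and over again, which is how things like passwords and even private keys could leak.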

Several sites including this link at Mashable.com are compiling lists of websites that were affected and have been patched. You’ll want to change your password on those sites. Some of the ones on that list currently include: Facebook, Instagram, Pinterest, Tumblr, Flickr, Google/Gmail, Yahoo/Yahoo Mail/AIM, YouTube, Etsy, GoDaddy, Netflix, Soundcloud, TurboTax, USAA, Box, DropBox, Github, and IFTTT.

Oh, and Minecraft.

This is a good time to remind you that if you use the same password on any other site that you also use on those sites, you should change that password too. Also, stop doing that.

So what about your bank or some other SSL page you want to test? Several “Heartbleed Testers” have been stood up online. Here’s one. Simply click the link and cut/paste the URL to your bank (or any other HTTPS web link) and the website will let you know if they are currently using a safe version of OpenSSL. Of course it doesn’t tell you if they had the bad version last week…

I spent a couple of hours last night changing my passwords on a bevy of services including Facebook, Twitter, Gmail, and more. You should too. It’s a pain in the butt, especially when you have multiple devices (phones, tablets, laptops) that will all need the new passwords, but you’ll thank me in the morning.

A lot of things just happened when you clicked on this article. Your computer connected to my computer, and each of these words I wrote zipped across the internet to their destination. Since this article contains words like encryption, NSA, and secret codes, it probably flagged something for the NSA along the way — you for reading about it, and me for writing about it. In some giant, government data warehouse, there’s now a record that you were here. We’re probably both on a watch list now. Welcome to the machine, and all that.

About five years ago I wrote a silly little program called eCoder Ring. eCoder Ring is a small program that allows you to encrypt and decrypt secret codes. It does this by using any text file, web page, or graphic file as a key for a one-time pad encryption. Here’s what Wikipedia has to say about one-time pad encryption:

In cryptography, the one-time pad (OTP) is a type of encryption which is impossible to crack if used correctly. Each bit or character from the plaintext is encrypted by a modular addition with a bit or character from a secret random key (or pad) of the same length as the plaintext, resulting in a ciphertext. If the key is truly random, as large as or greater than the plaintext, never reused in whole or part, and kept secret, the ciphertext will be impossible to decrypt or break without knowing the key. It has also been proven that any cipher with the perfect secrecy property must use keys with effectively the same requirements as OTP keys. However, practical problems have prevented one-time pads from being widely used.
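The “modular addition” the quote describes fits in a few lines of Python. Here’s the textbook construction over the 26-letter alphabet (this is the classic one-time pad, not eCoder Ring’s exact scheme): encryption adds each key letter to each message letter mod 26, and decryption subtracts it back out.

def one_time_pad(text, key, decrypt=False):
    """Classic one-time pad over A-Z: shift each letter by the matching key letter, mod 26."""
    assert len(key) >= len(text), "the pad must be at least as long as the message"
    sign = -1 if decrypt else 1
    out = []
    for m, k in zip(text.upper(), key.upper()):
        out.append(chr((ord(m) - 65 + sign * (ord(k) - 65)) % 26 + 65))
    return "".join(out)

ciphertext = one_time_pad("ATTACKATDAWN", "XMCKLQPWZREB")       # key: random, secret, never reused
print(ciphertext)                                               # looks like noise without the key
print(one_time_pad(ciphertext, "XMCKLQPWZREB", decrypt=True))   # ATTACKATDAWN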

The key to breaking most codes lies in discovering patterns, and in a properly implemented one-time pad there are none. Not to delve too far into details, but the point of eCoder Ring is that it plucks letters out of a keyfile and uses the numerical position of those letters to represent the letters of your message. eCoder Ring lets you use things like digital pictures (which it converts to ASCII numbers and characters) as keyfiles. It also allows you to skew the code by adding variables to start your code further down in the keyfile, or skip numbers, and do all sorts of other randomizing. Even if you had eCoder Ring and the keyfile used to generate a message, it would be practically impossible to crack a code generated by it without the proper variables inserted into the program.
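To make that a little more concrete, here’s a rough sketch of the idea in Python. This is my own simplified approximation of the approach described above, not eCoder Ring’s actual code, and the start and skip parameters play the role of the skew variables:

import random

def encode(message, keyfile_text, start=0, skip=1):
    """Encode each letter as a position where that letter appears in the keyfile.

    A simplified approximation of the book-cipher idea described above, not the
    real eCoder Ring implementation. Without the keyfile and the start/skip
    values, the resulting numbers are meaningless.
    """
    positions = []
    for letter in message.lower():
        # every position (at or after 'start', stepping by 'skip') holding this letter
        candidates = [i for i in range(start, len(keyfile_text), skip)
                      if keyfile_text[i].lower() == letter]
        positions.append(random.choice(candidates))   # random pick, so repeats don't form patterns
    return positions

def decode(positions, keyfile_text):
    return "".join(keyfile_text[i] for i in positions)

key = open("keyfile.txt").read()          # any text file or web page dump; the path is an example
code = encode("attackatdawn", key, start=100, skip=3)
print(code)                               # a list of numbers, useless without the keyfile
print(decode(code, key))                  # attackatdawn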

It is my belief now as it was when I wrote it that the codes generated with eCoder Ring are impervious to brute force attacks. To prove my point, when I released eCoder Ring I included a code and offered a reward for cracking it. At first I was offering a hundred bucks; later I upped it to two hundred, and I think I may have raised it to five hundred at one point. The reward for cracking the code is moot because without the keyfile or the skew variables, the code is unbreakable. In theory I feel confident about offering a million dollars, but I wouldn’t do that for two reasons, the second of which exposes the weakness of eCoder Ring. The first reason is quite simply that I don’t have a million dollars. The second reason, the scarier reason, and the weakness that plagues all implementations of one-time pads is that both the sender and the receiver have to know what the keyfile is. I know what the keyfile is for the message I encoded. For a hundred dollars I am hoping someone does not kick in my front door, hold a gun to my head and demand access to the keyfile. For a million dollars, someone might. When I wrote that original readme file five years ago that contained the code, I specifically made it clear that the keyfile does not exist on any computer I have control over (not my laptop or my desktop and not my server) and no one else knows what the keyfile is, so bribing my kid with candy or PlayStation games won’t work.

But yes, as I joked in the program’s readme file, any codes generated with eCoder Ring will stand thousands of years of brute force attacks, but will fail in seconds when someone shows up to your house and begins to peel your children’s fingernails off as you watch. As a human being who knows the keyfile, you are eCoder Ring’s weakest link. If the keyfile is stored improperly or transferred improperly, the code can be compromised. When some mug shows up and decides to squeeze the cider out of your Adam’s apple for the keyfile, look out.

So why am I writing about eCoder Ring again after all these years?

From 2007 (when I released it) to 2012, eCoder Ring was downloaded approximately 2,000 times.

In the past two months, eCoder Ring has been downloaded an additional 3,000 times.

In the last two months we have learned that the NSA either gathers or simply pilfers through pretty much everything we do on the Internet. They store records of what websites you visit. They keep track of who you e-mail, and how many times you do so. Most signs point to the fact that the NSA has direct connections to some of the largest content providers in the world and pulls data pre-encryption, making the phrases “HTTPS” and “SSL” mean almost nothing. The latest NSA-related leak tells us the NSA pays 35,000 people to break codes and crypto. I hope one of those 35,000 guys runs across a code generated with eCoder Ring someday. That would make me chuckle. There are also rumors that the NSA can effectively either crack or circumvent some/most/all encryption methods being employed today.

Based on the increase in downloads, do I think eCoder Ring is the answer?

No, obviously. It’s too cumbersome to be used on any mass scale and too difficult to properly implement. (What I had always imagined implementing, but which is beyond my skills, is an API or something that could be used in chat programs, so instead of sending clear text back and forth across the internet, people could send random-looking encoded text.) What these recent downloads tell me, based on current events, is that normal people are interested in security. Normal people are interested in learning about codes, and keeping their messages away from prying eyes. Normal people are hitting search engines and looking for ways to regain their privacy. eCoder Ring probably isn’t the answer, but maybe it’ll inspire someone else to create the answer.

Link: eCoder Ring

A few months ago I spun up a new website, SpriteCastle.com. There’s no real content there yet — it’s more of a proof of concept site at this point. Last night after finishing up the latest episode of You Don’t Know Flack I decided to do some tweaking to Sprite Castle. When I opened the site in Google Chrome, I got a big red warning telling me the site contained malware.

Crap. I know WordPress has been under attack lately, so my first assumption was that the site had been compromised. Bypassing Chrome’s warning, I opened the site and searched for any sign of malware. I couldn’t find any. I then clicked “View Source Code” and quickly found the problem — links to a “posh laptop bag” website. While viewing the page itself I couldn’t see the link, but while viewing the code there it was, plain as day. A quick Google search shows that I’m not the only person running WordPress with the issue.

After a few minutes of research I tracked the problem back to the free WordPress theme I had downloaded. The theme was injecting links to sites hosting malware in the theme’s footer, and the links were encrypted (technically, obfuscated) making them difficult to find while sifting through the code.

There are lots of websites out there like this one that will help you remove encrypted footer links. Even with those removed, I was still seeing links in my source to malware sites. By using Windows’ FINDSTR command (similar to GREP) I was able to find more encrypted sections (hint: search your PHP files for “EVAL”). Each time I tried dinking with the code, the website would stop loading. Someone spent a lot of time putting those encrypted links into this particular theme.
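If you find yourself in the same boat, a few lines of Python will do roughly what I was doing with FINDSTR: walk the theme’s directory and flag any PHP file containing the usual obfuscation tells. The list of suspicious strings below is just my starting point, and the theme path is an example, so adjust both to your own setup.

import os

# Strings that commonly show up in obfuscated or injected theme code.
SUSPICIOUS = ["eval(", "base64_decode(", "gzinflate(", "str_rot13("]

def scan_theme(path):
    for root, dirs, files in os.walk(path):
        for name in files:
            if not name.endswith(".php"):
                continue
            full = os.path.join(root, name)
            text = open(full, encoding="utf-8", errors="ignore").read()
            for needle in SUSPICIOUS:
                if needle.lower() in text.lower():
                    print(full, "contains", needle, "-- worth a closer look")

scan_theme("wp-content/themes/suspect-theme")   # example path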

So, I spent a lot of time getting rid of them.

The simplest branching point in any programming language is the IF…THEN clause, which does exactly what it sounds like:

IF (this) THEN (do this)

One baby step beyond that is IF…THEN…ELSE logic. Even if you are not a programmer you can see that this is used in every single program.

IF PASSWORD IS CORRECT
– ALLOW USER TO LOG IN TO E-MAIL
ELSE
– PRINT “Denied!”
END IF

Simple.
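For anyone who would rather see it in a real language, here is that same check in Python (the language choice is purely for illustration):

def log_in(password, correct_password):
    if password == correct_password:
        print("Allowing user to log in to e-mail")
    else:
        print("Denied!")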

This was also, in its simplest form, the basis for most early forms of copy protection. Consider the old paper-based protection schemes that required gamers to enter a code to play a game.

HAVE USER ENTER CODE
IF CODE IS CORRECT
– RUN GAME
ELSE
– DO NOT RUN GAME
END IF

Once you understand this logic you can see that with a minor change, programs could be re-programmed to always load. Or, “cracked.”

HAVE USER ENTER CODE
IF CODE IS CORRECT
– RUN GAME
ELSE
– RUN GAME
END IF

Again, simple. No matter what the user enters at the prompt, the game loads. There are other ways to do it, of course. Another simple way would be to tell the program that no matter what the user enters, it’s correct.

HAVE USER ENTER CODE
CODE IS CORRECT
IF CODE IS CORRECT
– RUN GAME
ELSE
– DO NOT RUN GAME
END IF

In this instance, no matter what the player enters, we tell the program that the code was correct and execution continues down that path.
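Both tricks translate directly into Python. In a real crack the patching happens in the game’s compiled machine code rather than in friendly source code, but the logic is identical; here’s a sketch of the copy-protection check and both ways of defeating it (the manual code is made up):

def code_is_correct(code):
    return code == "8675309"          # hypothetical code from the game's manual

# The original check, straight from the pseudocode above.
def start_game(code):
    if code_is_correct(code):
        print("RUN GAME")
    else:
        print("DO NOT RUN GAME")

# Crack #1: patch the failure branch so both paths run the game.
def start_game_cracked(code):
    if code_is_correct(code):
        print("RUN GAME")
    else:
        print("RUN GAME")             # wrong code? run the game anyway

# Crack #2: patch the check itself so every code is "correct."
def code_is_correct_cracked(code):
    return True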

This is essentially how I removed the malware from the theme. The theme checks to see if a particular file exists on the computer. If it does, it reads a serial number from the file. If the serial number checks out, the malware links are removed from the footer.

CHECK TO SEE IF LICENSE FILE EXISTS
TELL PROGRAM FILE EXISTS
IF FILE EXISTS
– DO NOT INJECT MALWARE LINKS
ELSE
– INJECT MALWARE LINKS
END IF

A quick check of the theme’s output showed that the technique worked and the malware links had been removed. With that part fixed I began systematically removing all the malware-seeking code. It took a couple of hours, but I think the entire theme is now clean.

Unfortunately, once Google detects malware on a site it removes the URL from its search engine (SpriteCastle.com no longer shows up in Google searches) and Google Chrome still flags the site as one that hosts malware, even though the links have been removed. To get re-added, a request has to be submitted to Google and a scan of the site has to be performed. That ball’s already started rolling, so hopefully in the next day or two I’ll be back in business.

Another week, another episode.

Episode 119 of You Don’t Know Flack is about Hohocon — specifically Hohocon ’94, the last Hohocon and the only one I attended. Hohocon was a hacker conference that ran for 5 years in a row, from 1990 to 1994. It was put on by dFx, the Cult of the Dead Cow, and Phrack Magazine.

This was a tough episode to complete. During the time slot I set aside to record, my sister inconveniently and inconsiderately had a baby. Don’t you hate it when other people schedule things when you already have plans? Sheesh! All kidding aside, I spent a few hours at the hospital yesterday and a few hours watching the NFL playoffs yesterday, just enough to set me back half a day. On top of that I spent 90 minutes recording and another 3 hours editing my own babble.

Listen to me ramble. I sound like Jodie Foster’s award speech from last night, except I’m not coming out in this post. Unless it’ll increase my number of subscribers.

Link: YDKF Episode 119: Hohocon ’94
Facebook: You Don’t Know Flack

Last week at the 27th annual Chaos Communication Congress (CCC), a group calling themselves “fail0verflow” demonstrated the single most important PlayStation 3 hack to date. A few months from now, when everybody who wants one has a modified PS3, you’ll be able to point your finger back to fail0verflow’s CCC presentation and say, “that is where it all began.”

Just like the original Xbox, the PlayStation 3’s defenses didn’t fall to pirates, but to Linux experts. The quickest way to have your security precautions ripped out of your device, run up the flagpole, and laughed at is to prevent people from running Linux on it. In fact, the general consensus all along was that since the PlayStation 3 allowed users to install Linux on an unmodified console, Linux hackers had no incentive to tinker with the console’s security measures. As a result, the PS3 remained “unbroken” for over four years, the longest of any modern console. However, in the spring of 2010, Sony removed the OtherOS feature from PlayStation 3s through a mandatory (if you want to play online and/or new games) BIOS upgrade. While this made a lot of PlayStation 3 owners mad, it apparently made fail0verflow really mad.

The reason your PS3 (or any game console) won’t play a copied disc is because games must be digitally signed. As with most public-key cryptography, this digital handshake requires a private key and a public key: Sony signs games with its closely guarded private key, and the PlayStation 3 uses the corresponding public key to check that signature and, based on its findings, determine whether or not to execute the code. This is why games you buy off the shelf will run on your PS3, but a copy of that same game will not.
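Conceptually, the check the console performs looks something like the sketch below. I’m using Python’s cryptography library and Ed25519 keys purely as an illustration; Sony’s actual scheme is its own (ECDSA-based) implementation, and the “game binary” here is obviously a stand-in. The decision logic is the point: if the signature doesn’t verify against the key the console trusts, the code doesn’t run.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Sony's side: the private key never leaves the publisher.
sony_private_key = Ed25519PrivateKey.generate()
console_public_key = sony_private_key.public_key()      # the console trusts this key

game_binary = b"...executable code for some game..."    # stand-in for a real game
signature = sony_private_key.sign(game_binary)           # shipped alongside the game

# The console's side: verify before executing.
def boot(binary, sig):
    try:
        console_public_key.verify(sig, binary)   # raises InvalidSignature on any mismatch
        print("Signature checks out: running it")
    except InvalidSignature:
        print("Bad or missing signature: refusing to run")

boot(game_binary, signature)                      # runs
boot(b"homebrew or pirated code", signature)      # refused... unless you can sign it yourself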

(Old mod chips for the original PlayStation used to trick consoles by returning the right answer, regardless of what the question was. The PS1 was looking for region codes instead of digital signatures, but the concept was the same. When a backup copy was inserted into the original PlayStation, the console would ask, “should I play this game?” The console checked for the region code and, when it could not be found, would reply with “no.” That response was sent back through the modchip, which slyly changed it to “yes!”)

While digging through the PlayStation 3, fail0verflow didn’t just find a private key — they found the private key. The master root encryption key. Using this key, hackers can generate valid signatures for their own code. With properly signed code, hackers can boot anything they want on the PS3. There are two important things to note here. One is that this key is tied to the PlayStation 3’s hardware; it does not appear that a BIOS upgrade can change the master key. And two, changing the key could cause all existing PlayStation 3 games to stop working — so that’s not very likely. fail0verflow went looking for this key in the name of Linux. Other folks may not be so kind.

You know how there’s that one guy that takes things to another level? In the hacking world, that guy is GeoHot. GeoHot perfected the iPhone jailbreak; if your iPhone is jailbroken, you owe it to GeoHot. The PlayStation 3 has been a thorn in GeoHot’s side for quite some time now. He’s picked at it, poked at it, and even released a couple of hacks that were eventually closed up by Sony. fail0verflow announced that within the next month, they plan on releasing some tools that will allow the homebrew and hacking communities to start looking at the PS3. GeoHot said to hell with that, and posted the master key on his website.


Right now, this kid’s house is probably surrounded by lawyers. Or assassins. Or both.

Now, I don’t know what to do with that number, and chances are you don’t either, but you can bet your booty there are people who do, people who have been waiting four long years for those numbers. The PS3’s homebrew and hacking scenes are about to light up. I can’t wait to see what happens next.