Email Attack on Vendor Set Up Breach at Target | Krebs on Security

A very good (and damning) investigation of the Target breach. Key items are:

The company’s primary method of detecting malicious software on its internal systems was the free version of Malwarebytes Anti-Malware.

“Target would have paid very little attention to vendors like Fazio, and I would be surprised if there was ever even a basic security assessment done of those types of vendors by Target.”

Krebs then goes on to explain how, by downloading publicly available documents from one of Target’s web sites, you can get a fair idea of how the Target network is set up.

Email Attack on Vendor Set Up Breach at Target — Krebs on Security.

Protecting customer data from government snooping | The Official Microsoft Blog

Interesting to see that Microsoft is starting a push that, as they say, will take them until “the end of 2014” to secure all of their systems with encryption. It is rather curious, though, that they are going to “leave the choice to developers” when it comes to Windows Azure. It really should be a case of “one in, all in”.

Protecting customer data from government snooping – The Official Microsoft Blog – Site Home – TechNet Blogs.

Hacked Via RDP: Really Dumb Passwords | Krebs on Security

I have been reading this article about passwords and how easy it is to hack a server when you do not have secure passwords – Hacked Via RDP: Really Dumb Passwords — Krebs on Security.

One of the things that stands out is that the default Local Security Policy on Windows is wrong: the “Password must meet complexity requirements” setting should always be set to “Yes”. The definition is quite clear cut:

This security setting determines whether passwords must meet complexity requirements.

If this policy is enabled, passwords must meet the following minimum requirements:

Not contain the user’s account name or parts of the user’s full name that exceed two consecutive characters
Be at least six characters in length
Contain characters from three of the following four categories:
English uppercase characters (A through Z)
English lowercase characters (a through z)
Base 10 digits (0 through 9)
Non-alphabetic characters (for example, !, $, #, %)
Complexity requirements are enforced when passwords are changed or created.

Default:

Enabled on domain controllers.
Disabled on stand-alone servers.

Note: By default, member computers follow the configuration of their domain controllers.

(Emphasis on the “user account name” part is mine.) Where I believe the defaults are incorrect is in the “Disabled on stand-alone servers” case. If a computer can be reached remotely via RDP then it MUST have “Password must meet complexity requirements” set to “Yes”. This should be a requirement of enabling the Remote Desktop Protocol.

This would negate ALL the attacks in the Krebs article.
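
To make the policy concrete, here is a minimal sketch in Python of what the quoted complexity rules check. The function name and the way the full name is split into parts are my own approximations of the policy text, not Microsoft’s actual implementation:

import re

def meets_complexity(password, account_name, full_name):
    # At least six characters in length.
    if len(password) < 6:
        return False

    # Must not contain the account name, or any part of the full name
    # longer than two consecutive characters (the full name is split on
    # common delimiters, roughly as the policy text describes).
    lowered = password.lower()
    if account_name and account_name.lower() in lowered:
        return False
    for part in re.split(r"[,.\-_#\s]+", full_name.lower()):
        if len(part) > 2 and part in lowered:
            return False

    # Characters from at least three of the four categories. The policy
    # wording says "non-alphabetic"; non-alphanumeric is used here so that
    # digits are not counted twice.
    categories = [
        any(c.isupper() for c in password),      # English uppercase
        any(c.islower() for c in password),      # English lowercase
        any(c.isdigit() for c in password),      # base 10 digits
        any(not c.isalnum() for c in password),  # special characters
    ]
    return sum(categories) >= 3

print(meets_complexity("Summer#2014", "jsmith", "John Smith"))  # True
print(meets_complexity("password", "jsmith", "John Smith"))     # False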

Also, we have found that MOST companies that store passwords in an encrypted manner use an unsalted MD5 hash. While this is a one-way hash, it is very easily defeated by rainbow tables. For example, let’s take a simple password and get its MD5 hash:

MD5(‘password’) = 5f4dcc3b5aa765d61d8327deb882cf99

Now, let’s use MD5 Online to “crack” this password by doing a simple reverse lookup:

Found : password (hash = 5f4dcc3b5aa765d61d8327deb882cf99)

Ok, so unsalted MD5 hashes are not a good idea with common words. What we suggest is that YOU take responsibility for adding your own “salt”, e.g.:

MD5(‘$$password$$’) = 213a95dfa43321c74cf0b5c843afbe6e

Using MD5 Online again we find:

No result found in our database.

Obviously this is only as good as the “salt” that you choose, but make sure you have a number of different special characters in your password and DON’T rely on letter substitution:

MD5(‘p@55w0rd’) = 39f13d60b3f6fbe0ba1636b0a9283c50

MD5 Online can easily find this password – even though it’s not a real word!
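
For reference, the three hashes above can be reproduced with a few lines of Python (md5_hex is just a convenience wrapper I have added):

import hashlib

def md5_hex(text):
    # Hex MD5 digest of a UTF-8 string.
    return hashlib.md5(text.encode("utf-8")).hexdigest()

print(md5_hex("password"))      # 5f4dcc3b5aa765d61d8327deb882cf99 - trivially reversed
print(md5_hex("$$password$$"))  # the home-made "salt" keeps it out of common lookup tables, for now
print(md5_hex("p@55w0rd"))      # letter substitution - still in the lookup tables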

Rick.

Important Security Update for D-Link Routers | Krebs on Security

D-Link has released an important security update for some of its older Internet routers. The patch closes a backdoor in the devices that could let attackers seize remote control over vulnerable routers. On Nov. 28, D-Link released a series of updates to fix the problem. Updates are available for the following models: DI-524, DI-524UP, DIR-100, DIR-120

Heffner says based on his research, several other versions of D-Link routers may be vulnerable, including the DIR-624S, DI-604S, DI-604UP, DI-604+ and the TM-G5240. However, no updates were released by D-Link for these models.

via Important Security Update for D-Link Routers — Krebs on Security.

Anatomy of a password disaster – Adobe’s giant-sized cryptographic blunder | Naked Security

A very good read on how bad the Adobe breach was. It also answers the question of how Facebook was able to determine whether the passwords used on Adobe were the same as those being used on Facebook (even though Facebook does not store the password, or even an encrypted version of it). It is also truly scary how easily you can determine passwords given a large enough sample size.
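
In rough terms the trick is simple: take the plaintexts recovered from the Adobe dump, run each one through your own site’s salted hashing scheme for any user who shares that email address, and compare the results. A sketch of the idea in Python (the hashing scheme and data structures here are placeholders of mine, not Facebook’s actual code):

import hashlib
import hmac

def site_hash(password, salt):
    # Stand-in for whatever salted, slow hash the site really uses.
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100000)

def find_reused_passwords(breached_plaintexts, our_users):
    # breached_plaintexts: email -> password recovered from the breach.
    # our_users: email -> (salt, stored_hash) on our own site.
    reused = []
    for email, breached_password in breached_plaintexts.items():
        record = our_users.get(email)
        if record is None:
            continue
        salt, stored_hash = record
        if hmac.compare_digest(site_hash(breached_password, salt), stored_hash):
            reused.append(email)  # prompt this user to change their password
    return reused

salt = b"example-salt"
users = {"alice@example.com": (salt, site_hash("password", salt))}
print(find_reused_passwords({"alice@example.com": "password"}, users))  # ['alice@example.com']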

Anatomy of a password disaster – Adobe’s giant-sized cryptographic blunder | Naked Security.

Facebook engineer describes how they have used the Adobe breach to tighten general security | Krebs on Security

Google crawler tricked into performing SQL injection attacks using decade-old technique | Ars Technica

Google crawler tricked into performing SQL injection attacks using decade-old technique | Ars Technica.

It is interesting to note that you could be the unfortunate recipient of a DoS attack from Google/Bing/Yahoo simply because someone creates a page that will overload your database and bring your server to its knees. The real lesson here is that you should NEVER execute any SQL based on user-supplied data without vetting it first (there is a sketch of the safer approach at the end of this post). The secondary lesson is that you should (if you can) limit requests to your server from search engines via your robots.txt file (from here):

If your web application does have issues with handling occasional requests (for example, one request per second), you can slow down Bing and Yahoo with the following entry in robots.txt:

Crawl-delay: 120

This will ask crawlers to wait at least 120 seconds between requests. For Google, you can define the delay in the webmaster tools.
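
For completeness, Crawl-delay normally sits inside a User-agent group in robots.txt, so the entry would look something like this (bingbot and Slurp being the commonly documented names for the Bing and Yahoo crawlers):

User-agent: bingbot
Crawl-delay: 120

User-agent: Slurp
Crawl-delay: 120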
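
And as a concrete illustration of the SQL lesson above, pass user input to the database as a bound parameter rather than concatenating it into the SQL string. A minimal sketch in Python, using sqlite3 purely as an example driver:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO products (name) VALUES ('widget')")

user_supplied = "widget' OR '1'='1"  # hostile input arriving from a crafted URL

# BAD: string concatenation lets the input rewrite the query.
# conn.execute("SELECT * FROM products WHERE name = '" + user_supplied + "'")

# GOOD: a bound parameter is treated as data, never as SQL.
rows = conn.execute("SELECT * FROM products WHERE name = ?", (user_supplied,)).fetchall()
print(rows)  # [] - the injection attempt matches nothing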