Homomorphic Encryption – Saviour of the cloud? Ready for prime time?

Homomorphic encryption has been around for a while (in fact it has been debated for around 30 years), but most homomorphic cryptosystems are only partially homomorphic, which limits their usefulness in enabling real-world distributed systems, including those based in the cloud.

I’ll start by briefly describing what the term homomorphic means when used to describe a cryptosystem: a cryptosystem is homomorphic if a mathematical operation can be performed on the encrypted data to produce an encrypted output that, when decrypted, gives the same result as if the operation had been performed on the plaintext.

I’m sure you can see how this removes one of the main barriers to the adoption of cloud computing.  An efficient, proven and thoroughly tested homomorphic encryption system would potentially revolutionise the view of cloud computing security.  Currently it is easy to send data to and from the cloud in a secure, encrypted manner; however, if any computation is to be carried out on this data it has to be unencrypted at some point.  When the data is unencrypted in the cloud, the risk that employees of the cloud provider, and potentially other customers, could access the data becomes a real concern.  This risk is one of the key road blocks to companies moving their data to the cloud.

Additionally, some legal / regulatory rules prevent certain unencrypted data types, such as personally identifiable information (PII), from leaving countries / regions such as the EU.  A system that enabled data to remain encrypted would potentially get around these regulatory issues and allow data to be housed in the cloud.  (Many cloud providers have data centres located in various global locations and can’t guarantee where data will reside.  In fact this is one of the benefits of the cloud: the high level of redundancy and resilience provided by multiple data centres in geographically diverse locations.)

Some existing algorithms are partially homomorphic, meaning they are homomorphic with regard to one operation, or perhaps a couple.  For example, the RSA algorithm is homomorphic with regard to multiplication: multiplying two RSA ciphertexts produces a ciphertext that decrypts to the product of the two plaintexts.
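To make this concrete, here is a minimal Python sketch of RSA’s multiplicative homomorphism.  It uses tiny, insecure ‘textbook’ RSA parameters (toy primes, no padding) purely for illustration; the key values below are invented for the example and nothing like a real deployment.

# Toy RSA key: p = 61, q = 53 (insecure, for illustration only)
n = 61 * 53          # modulus, 3233
e = 17               # public exponent
d = 2753             # private exponent: 17 * 2753 = 1 (mod lcm(60, 52))

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 42, 7

# Multiply the two ciphertexts without ever seeing the plaintexts...
product_of_ciphertexts = (encrypt(a) * encrypt(b)) % n

# ...and decrypting the result gives the product of the plaintexts.
assert decrypt(product_of_ciphertexts) == (a * b) % n
print(decrypt(product_of_ciphertexts))  # 294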

IBM has published some research in this area: in 2009 its researchers proposed a fully homomorphic system, which is linked to from here;

http://domino.research.ibm.com/comm/research_projects.nsf/pages/security.homoenc.html

Currently, fully homomorphic systems are too new and not yet practical enough to be implemented for production systems.  For any cryptographic algorithm to be recommended, it requires considerably more time to be peer reviewed and tested by security and encryption researchers, to allow a reasonable level of assurance that there are no attacks that could be used to decrypt the data.  In terms of practicality, in currently proposed homomorphic encryption systems the complexity grows enormously as the number of operations you need to perform on the encrypted data increases.  This leads to a massive increase in the computational power required to run the system; this is a non-trivial increase that will not be solved by Moore’s law anytime in the near future.

So homomorphic encryption has now been proven to be possible, which is a huge step forwards, and the work done by people like Craig Gentry and the teams at IBM and MIT must be hugely applauded.

Microsoft researchers published a paper in May of this year (2011) titled ‘Can Homomorphic Encryption be Practical?’, which can be found here;

http://research.microsoft.com/apps/pubs/default.aspx?id=148825

This provides an overview of a proposed partially homomorphic implementation, along with thoughts on how it could be made fully homomorphic and how its efficiency could be improved.  The page also contains some useful links on cloud and lattice-based cryptography.

However, the reality is that we need several more years for a broader range of cryptographers to examine these cryptosystems and be assured they are secure, and for further work to go into making them much more efficient.

These are definitely interesting times, and over the next few years I would hope to see homomorphic cryptosystems removing some of today’s key barriers to the adoption of cloud computing services!

K


USAF Predator control systems compromised by malware

Following on from the very high profile targeted attacks, such as the Stuxnet worm that was used to target Siemens supervisory control and data acquisition (SCADA) systems like those used in Iranian nuclear facilities;

http://www.google.co.uk/search?aq=f&gcx=c&sourceid=chrome&ie=UTF-8&q=stuxnet

and the RSA security breach that impacted many businesses earlier this year;

http://blogs.rsa.com/rivner/anatomy-of-an-attack/

it has emerged that some USAF (United States Air Force) computer systems have been infected by malware.

While the reports state that it is likely to be just a keylogger, and not something that is co-opting control of armed military drones, this should be seen as yet another wake-up call: any network-attached systems, or any systems that allow storage devices (e.g. USB drives) to be connected, are vulnerable to attack by malware.  I am sure that, having read the previous section, you realise this means pretty much every computer system.

Details can be found here;

http://nakedsecurity.sophos.com/2011/10/10/malware-compromises-usaf-predator-drone-computer-systems/

One particularly worrying comment in the story is that they are not sure whether the malware has been properly wiped from the systems, as it keeps coming back.  Best practice is always to do a clean rebuild of any infected machine, especially for something as critical as this!

In short, if high profile security vendors and supposedly secure military computers can be successfully attacked and their gaps exploited, this should be a wake-up call to anyone who does not yet take the security of their systems and data seriously.

Oh, and if in any doubt – reinstall, don’t keep trying to clean the malware from the system!

K

The Internet of Things

Intel has created a graphical representation of all the different devices connected to the internet, along with some key milestones in the internet’s development.

These include;

– the first connected computer in 1960

– the birth of what we now know as the internet / world wide web from 1989 through the ’90s

– the Sega Dreamcast, the first internet-connected games console

– through to current devices such as the iPad.

The graphic also predicts the future, estimating 4 billion people connected with 31 billion devices by 2020!

The graphic can be found in various sizes here;

http://newsroom.intel.com/docs/DOC-2297

K

Bad Science 15 minute overview

Following on from my book review of ‘Bad Science’ by Ben Goldacre, his 15-minute overview has been posted on TED;

http://www.ted.com/talks/ben_goldacre_battling_bad_science.html

If you are curious about the content of the book, or just what he means by Bad Science, then I highly recommend checking this talk out.

K

Choosing the right project(s)

Choosing the right projects to focus limited resources on is clearly key to the success of any business.

When projects / programs are prioritised in your business (or in most businesses), is this always done using the best and most objective methods available?  How are they chosen in your organisation?  How are the chosen projects and programs then prioritised against each other?

Most organisations will no doubt claim to have a very organised and agreed approach to this process, based around business priorities and the clear business benefits of each project being considered.  If you look more closely, though, the reality is often very different, with the decision driven by questions like these;

– Which project is sponsored by the most senior individual in the organisation?

– Which project is being pushed by the most aggressive sponsors / individuals?

– Which project has the best sales pitch (e.g. the best presentation)?

– Which project is being pushed by the sponsors / individuals with the best political connections in the organisation?

– Which project will provide the greatest return on investment (ROI)?

While I am sure you are thinking that ROI sounds like a reasonable basis for choosing projects, and used 100% impartially it can be, it is easy to manipulate ROI figures, and most ROI statements such as “will save xx millions” have little supporting, reproducible evidence.  Also, in reality, how many organisations thoroughly calculate the ROI of a project after it is completed and hold those who made the statements accountable for their accuracy?
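As a quick illustration of how easily the headline ROI figure can be moved by the assumptions behind it (all numbers below are invented for the example), consider the same hypothetical project with an optimistic and a conservative benefits estimate:

# ROI = (gain - cost) / cost, expressed as a percentage
def roi(gain, cost):
    return 100.0 * (gain - cost) / cost

cost = 200_000  # assumed project cost

# Optimistic benefits estimate: an easy sell to the board
print(roi(gain=500_000, cost=cost))  # 150.0 (%)

# Conservative benefits estimate: the same project barely breaks even
print(roi(gain=220_000, cost=cost))  # 10.0 (%)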

In addition to the above thoughts on how projects are chosen, it is also clear that the more projects an organisation has to choose from, the less time it is likely to be able to put into correctly choosing the best ones.

One logical approach to choosing and prioritising the best projects for your organisation is value graph analysis.  Interestingly, this process has come up twice recently: in the book ‘Simple Architectures for Complex Enterprises’ and on the recent ISEB Enterprise and Solution Architecture course I attended.

The idea of value graph analysis is that it allows you to impartially take into account factors such as the risks of doing (or not doing) the project, the cost of doing it, its potential returns, and the time and resources required to complete it.

While the factors included in a graph can be tailored, both sources that highlighted this approach suggested the same set of default / typical factors;

– Market Drivers – what market reasons support the project?

– Cost – what is the project cost?

– Organisational Risk – what are the risk factors the project addresses?

– Financial Value – what are the financial benefits of doing the project?

– Organisational Preparedness – how ready is the organisation to complete the project?

– Team Readiness – how ready is the proposed project team to complete the project?

– Status Quo – what are the outcomes / impacts of not doing the project?

The output of assessing all the above factors is the Value Graph, an example of which is shown below as a spider graph;

[Image: example value graph, shown as a spider diagram]

Values closer to the edge of the graph are considered positive.  Aside from ensuring a wide range of key inputs are included in the prioritisation process, a key advantage of value graphs, especially in the spider graph representation, is that they enable easy comparison of projects: priorities can be defined by comparing the relevant graphs side by side.
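As a rough sketch of how such a comparison might be scripted (the projects, scores and 0–10 scale below are all invented for the example; only the factor names come from the list above):

# Factor names follow the list above; scores run 0-10, where higher
# (closer to the edge of the spider graph) is more positive, so a
# low-cost project scores high on "Cost".
FACTORS = [
    "Market Drivers", "Cost", "Organisational Risk", "Financial Value",
    "Organisational Preparedness", "Team Readiness", "Status Quo",
]

projects = {
    "CRM replacement": [7, 4, 6, 8, 5, 6, 7],
    "Data centre move": [5, 3, 8, 5, 7, 8, 9],
}

# A crude way to rank the spider graphs is by total score; a fuller
# version might weight the factors or compare the polygon areas.
for name, scores in sorted(projects.items(), key=lambda kv: sum(kv[1]), reverse=True):
    print(name, sum(scores), dict(zip(FACTORS, scores)))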

I recommend checking these out; creating value graphs for your projects will enable clear and logical prioritisation and will definitely benefit your organisation in the long term!

K