Recently came across an excellent article on the complexity of cloud computing here;
If you just use or consume cloud computing, the concept seems simple enough, and on the surface it is. However, if you are implementing a cloud-type service, whether a huge public cloud or a smaller private cloud, the work involved is considerably more complex.
The cloud concept is to deliver IT services as a utility, much like power or water. From a consumer's viewpoint this makes consuming the services a simple idea. Providing those services in a reliable, location-independent, scalable manner is far from simple. Many larger businesses are either implementing or at least considering a private cloud; if you are in this camp, or just interested in the complexities of implementing cloud computing, then this article makes a great read!
If you ever doubted either the inventiveness of criminals or the need to take sensible security precautions, this story should be a wake-up call;
Hackers have developed ‘Man in the Browser’ attacks that can potentially circumvent even the relatively new two-factor chip-and-PIN security many banks now implement. These attacks also have the potential to at least temporarily evade protections such as AV software and blacklists, as they redirect to new sites not yet known to security firms.
In short, stay vigilant, keep your computer(s) protected and up to date, and always use security software such as anti-virus. And, as Bruce Schneier documented several years ago, we need to look at authenticating each transaction.
Checked today and I have passed the final module – Secure Systems Programming. I actually did considerably better than I expected, and as I had virtually no C/C++ experience prior to this module, I’m very pleased!
Nice to make some progress on the first item from my post about plans for the year as well..
Just the project to go, so I’ll definitely complete the Masters this year. I will post some updates on the project later in the year as it gets started and progresses, from around April.
I came across this excellent post via Bruce Schneier’s blog;
The post highlights that while Verisign has publicly claimed to have dealt with the recent breach of their systems, and that the Domain Name System (DNS) has not been compromised, they remain very light on details of what actually happened, how the DNS was protected, and how they know it has in fact not been compromised.
The point of the post is that for us to truly trust them, and the systems they own and run, again, they must be open and transparent.
This is an excellent point and one well worth remembering. While it may appear that the most secretive systems or organisations are the most secure, in reality we can likely place the most trust in those that are most open, where we can clearly see and verify the security of their systems and processes.
Read the post and Verisign’s statement, and make up your own mind as to whether you would be more ready to trust them if they were more open and transparent.
Be secure, open and trustworthy..
The below is of course hypothetical, and any similarity to real people or situations is purely coincidental..
Hi, can you design solutions for our environment?
Sure, could you provide me with details of the environment, ideally some sort of architecture document covering what it comprises and how it’s configured?
No we don’t have that.
OK, could you provide me with access to the environment so I can understand it, how it’s configured, any capacity constraints, etc.?
Erm OK.. I’m off to buy a magic 8 ball then.
Dilbert couldn’t do better
Back to more serious posts next!
Need to test the potential scalability or performance of your VMware virtual environment? Then this tool from VMware may fit the bill;
VMmark is a free tool from VMware that enables you to assess the performance of a physical host when running a variety of workloads. These workloads are grouped into units called ’tiles’, as demonstrated by this diagram;
This method can be used to test the scalability of a single workload, multiples of a single workload type, or a variety of workloads, whichever use case you need to understand prior to deployment or a change in requirements.
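As a rough illustration of the tile idea (with hypothetical numbers, not VMmark's actual scoring methodology), you might summarise how aggregate throughput scales as tiles are added to a host:

```python
# Hypothetical aggregate throughput scores as tiles are added to one host.
# (Illustrative numbers only; VMmark defines its own workloads and scoring.)
scores_by_tile_count = {1: 100.0, 2: 196.0, 3: 285.0, 4: 352.0}

def scaling_efficiency(scores):
    """Efficiency of each tile count relative to linear scaling from 1 tile."""
    base = scores[1]  # single-tile score is the linear-scaling baseline
    return {n: total / (base * n) for n, total in scores.items()}

eff = scaling_efficiency(scores_by_tile_count)
for n, e in sorted(eff.items()):
    print(f"{n} tile(s): {e:.0%} of linear scaling")
```

The falling efficiency as tiles are added is exactly the kind of result that tells you where a host saturates before you commit to a deployment.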
If you are looking to specify new hardware or understand the hardware requirements of upcoming projects, there are VMmark results for many server types and configurations already uploaded here;
This is a great reference for understanding real-world server performance from actual users and companies other than VMware.
I stumbled across an excellent VMware blog called vReference that I wanted to share;
The blog is written by Forbes Guthrie, who has in-depth VMware knowledge and has even co-written books on the topic. It covers many VMware / vSphere related topics, from SAN booting to Windows clustering.
Of particular note are the reference cards he creates, which are incredibly useful and cover a surprising amount of detail, from maximum guest sizes to maximum numbers of hosts in a cluster, through many useful command-line installation options, storage management, and using vCenter..
The vSphere 5 reference card can be found here;
The vSphere 4.1 reference card can be found here;
He even still has the 4.0 card for anyone yet to upgrade from that version. If you are still on 4.0, I very much recommend moving to a more current version in the near future, as you’ll gain benefits in everything from performance to BCP/DR to scalability and management! The 4.0 card can be found here;