Amazon Cloud is now FISMA certified: Joins Google and Microsoft

The Amazon Cloud is now FISMA certified. FISMA is an acronym for the Federal Information Security Management Act; it sets security requirements for federal IT systems and is a required certification for US federal government projects.

This is the third certification Amazon has announced recently, coming on top of ISO 27001 certification for VPC and SAS 70 Type II certification.

The accreditation covers EC2 (Amazon Elastic Compute Cloud), S3 (Simple Storage Service), VPC (Virtual Private Cloud), and includes Amazon’s underlying infrastructure.

AWS’ accreditation covers FISMA’s low and moderate levels. Accreditation at this level requires a set of security configurations and controls, including documentation of the management, operational and technical processes used to secure the physical and virtual infrastructure, as well as third-party audits.

Other vendors that recently announced FISMA certification for cloud services were Google, with Google Apps for Government, and Microsoft, with its Business Productivity Online Suite (although there was a spat between Microsoft and Google regarding these claims).

Expect to see further certifications: they are a prerequisite for expansion into lucrative government and private-sector contracts, and organisations feel more comfortable choosing cloud resources as commoditisation marches on.

Amazon – what is coming soon, and what is not!

We had a meeting with Amazon in the UK recently, covered some of the pressing issues we wanted to raise, and also learnt a little of what Amazon have lined up.

First, what is not going to happen anytime soon:

– From what we heard, Amazon are not going to resolve the issue of billing in local currency with locally issued invoices any time soon. See our prior post on this topic. We did learn, however, that large organisations can request an invoice.

– Right now, if you want to sell your own AMI image on a SaaS basis using Amazon infrastructure, you have to be a US organisation. Again, Amazon don’t seem to have plans to change this in the immediate timeframe, which leaves out any organisation outside the US that wants to sell its product offering as SaaS on Amazon’s web services infrastructure, unless it integrates its own commerce infrastructure rather than using DevPay. This can be both a blessing (you can charge a margin on Amazon’s infrastructure pieces such as SQS) and a curse (it can leave you exposed, as you will be a month behind in billing your clients). Even though Amazon are entrenched right now as the public cloud infrastructure of choice, it wouldn’t be the first time we have seen an 800-pound gorilla displaced from its prime market position. If I were Amazon, I’d fix this, and soon. Microsoft and Rackspace are looking more attractive all the time.

– Amazon’s ingestion services again require you to be a US organisation with a US return address. Are you detecting a common theme here…

And what we can expect to see soon:

– VPC (Virtual Private Cloud) access is in private beta now. This is a mechanism for securely connecting public and private clouds within the EC2 infrastructure.

– High-memory instances, analogous to high-CPU instances, are in the pipeline.

– Shared EBS is in the pipeline.

– Functionality for multiple users associated with a single account is in the pipeline, and it will provide simple privileges too. This has long been a bone of contention for organisations using AWS, so it will be welcomed.

– Amazon are planning to run a lot more EC2 workshops through local partners.

Other things of note that we learnt were:

– Large instances currently have their own dedicated physical blade/box.

– As AWS has grown, a large number of machines have become available, and organisations can easily request hundreds of machines. Even extreme cases are catered for, e.g. requests for 50,000 machines.

– As a matter of policy, new functionality will be rolled out simultaneously in the EU and US unless there is a good reason not to.

All in all, some exciting stuff, and there were other things in the pipeline they could not share. The public cloud market is starting to gain more players, though, and I think Amazon need to get some of these infrastructure pieces in place sooner rather than later.

Amazon Elastic MapReduce now available in Europe

From the Amazon Web Services Blog:

Earlier this year I wrote about Amazon Elastic MapReduce and the ways in which it can be used to process large data sets on a cluster of processors. Since the announcement, our customers have wholeheartedly embraced the service and have been doing some very impressive work with it (more on this in a moment).

Today I am pleased to announce that Amazon Elastic MapReduce job flows can now be run in our European region. You can launch jobs in Europe by simply choosing the new region from the menu. The jobs will run on EC2 instances in Europe, and usage will be billed at that region’s rates.

Because the input and output locations for Elastic MapReduce jobs are specified in terms of URLs to S3 buckets, you can process data from US-hosted buckets in Europe, storing the results in Europe or in the US. Since this is an internet data transfer, the usual EC2 and S3 bandwidth charges will apply.
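To make the region mechanics concrete, here is a minimal sketch of assembling such a job flow with the modern boto3 SDK (which postdates this post; at the time, the console menu and the Elastic MapReduce APIs were the interface). The bucket names, instance types and release label below are hypothetical placeholders, not values from the announcement.

```python
def build_job_flow(input_url, output_url):
    """Build a minimal EMR job-flow configuration.

    The input can live in a US-hosted S3 bucket while the cluster runs
    in Europe; that cross-region transfer is billed at the usual S3/EC2
    bandwidth rates, as the announcement notes.
    """
    return {
        "Name": "eu-processing-job",          # hypothetical job name
        "ReleaseLabel": "emr-6.10.0",         # placeholder EMR release
        "Instances": {
            "InstanceCount": 3,
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
        },
        "Steps": [{
            "Name": "process",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["hadoop-streaming",
                         "-input", input_url,
                         "-output", output_url],
            },
        }],
    }

# US-hosted input, EU-hosted output (both bucket names invented):
config = build_job_flow("s3://us-hosted-bucket/input/",
                        "s3://eu-hosted-bucket/output/")

# Launching targets the EU region explicitly (requires AWS credentials):
#   import boto3
#   emr = boto3.client("emr", region_name="eu-west-1")
#   emr.run_job_flow(**config)
```

The only region-specific part is the client's `region_name`; the S3 URLs in the step arguments are region-agnostic, which is exactly why US-hosted input buckets work from a European cluster.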

Our customers are doing some interesting things with Elastic MapReduce.

At the recent Hadoop Summit, online shopping site ExtraBux described their multi-stage processing pipeline. The pipeline is fed with data supplied by their merchant partners. This data is preprocessed on some EC2 instances and then stored on a collection of Elastic Block Store volumes. The first MapReduce step processes this data into a common format and stores it in HDFS for further processing. Additional processing steps transform the data and product images into their final form for presentation to online shoppers. You can learn more about this work in Jinesh Varia’s Hadoop Summit presentation.
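A first “normalise to a common format” map step of the kind ExtraBux describes could be sketched as a Hadoop Streaming mapper in Python. The pipe-separated input layout and the field names here are invented for illustration; ExtraBux’s actual schema was not published.

```python
import sys

def normalise(line):
    """Map one raw merchant record to a common tab-separated format.

    Assumed (hypothetical) input layout: merchant|sku|price, e.g.
    "acme|ab-1|19.99". Output: sku<TAB>merchant<TAB>price_in_cents,
    with the SKU first so a later reduce step can group by it.
    """
    merchant, sku, price = line.rstrip("\n").split("|")
    cents = int(round(float(price) * 100))   # unify price units as cents
    return "%s\t%s\t%d" % (sku.strip().upper(), merchant.strip(), cents)

def run(stdin=sys.stdin, stdout=sys.stdout):
    # Hadoop Streaming feeds raw records on stdin and reads
    # key<TAB>value lines from stdout.
    for line in stdin:
        if line.strip():
            stdout.write(normalise(line) + "\n")

if __name__ == "__main__":
    run()
```

Run under Hadoop Streaming with something like `-mapper normalise.py`, this turns each merchant’s feed into uniform records keyed by SKU, ready for the downstream transformation steps the presentation describes.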

Online dating site eHarmony is also making good use of Elastic MapReduce, processing tens of gigabytes of data representing hundreds of millions of users, each with several hundred attributes to be matched. According to a published article, they are doing this work for $1,200 per month, a saving of $3,800 per month (76%) compared with the $5,000 per month they estimated it would cost to do the work internally.

We’ve added some articles to our Resource Center to help you to use Elastic MapReduce in your own applications. Here’s what we have so far:



You should also check out AWS Evangelist Jinesh Varia in this video from the Hadoop Summit:

— Jeff;

PS – If you have a lot of data that you would like to process on Elastic MapReduce, don’t forget to check out the new AWS Import/Export service. You can send your physical media to us and we’ll take care of loading it into Amazon S3 for you.