Last week, security firm Rapid7 released the results of a scan of Amazon’s S3 (Simple Storage Service), and the findings have caused shockwaves around the Cloudsphere.
It turns out that 1,951 of the storage “buckets” Rapid7 scanned, roughly 16%, were exposed to the public, such that any user stumbling across them would be able to see their contents. Rapid7’s review of 40,000 of the more than 126 billion files in those buckets revealed items such as:
- Personal photos from a medium-sized social media service
- Sales records and account information for a large car dealership
- Affiliate tracking data and account information for an ad company’s clients
- Employee personal information and member lists across various spreadsheets
- Unprotected database backups containing site data and encrypted passwords
- Video game source code and development tools for a mobile gaming firm
- PHP source code including usernames and passwords
- Sales “battlecards” for a large software vendor
Amazon’s original response was that their interface made it obvious that the buckets were public, and that users were not inadvertently exposing their data. But after Amazon notified the affected users that their buckets were open, a rescan of the 1,951 buckets revealed that 27% had been switched to private, suggesting many of those exposures were in fact inadvertent.
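For teams that want to check their own exposure, the kind of ACL check behind Rapid7’s scan can be sketched with Python and the boto3 AWS SDK. This is a minimal sketch, not Rapid7’s tooling: it assumes boto3 is installed and AWS credentials are already configured, and the `audit_buckets` helper name is my own.

```python
# Grantee URIs that make an S3 ACL world-readable.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}


def is_public(grants):
    """Return True if any ACL grant targets an all-users group."""
    return any(
        g.get("Grantee", {}).get("Type") == "Group"
        and g.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
        for g in grants
    )


def audit_buckets():
    """Print any bucket in the account whose ACL exposes it publicly.

    Requires boto3 (pip install boto3) and configured AWS credentials.
    """
    import boto3  # imported here so the pure check above needs no SDK

    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        if is_public(acl["Grants"]):
            print("PUBLIC:", bucket["Name"])
            # To lock it down, an admin could then call:
            # s3.put_bucket_acl(Bucket=bucket["Name"], ACL="private")
```

Keeping `is_public` free of SDK calls makes the exposure rule easy to test and reuse in a scheduled audit job.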
If you’re an IT executive reading this, chances are you’ve authorized some use of the Cloud for backup, auxiliary storage, development, sandbox space, or even for production applications. And likely your users have participated in some rogue cloud activity into which you have no visibility.
What have we learned? Amazon’s security still provides very good protection, but like many security issues, human behavior is the weak link. Their user interface is among the best on the planet, and yet a large number of people inadvertently exposed confidential data.
IT executives can use this example to look at addressing user demand from two perspectives:
- Public vs. private cloud – which workloads should go in a private cloud within the firewall, and which should be in the public cloud?
- Heavy control vs. hands-off management – should IT create extra checkpoints and protection mechanisms to maximize protection of data, or should it minimize the interference to create the optimal consumer experience?
For both perspectives, there’s a happy medium, though if you don’t have a viable private cloud in place, the first question is academic.
The second question is sensitive, in that users are often going to the cloud precisely to avoid IT processes they consider burdensome and unresponsive. Having IT manage the public cloud engagement and perform appropriate security testing on a regular basis makes sense, and will reduce risk for the sanctioned cloud usage.
But clearly, public cloud and hands-off management is a deadly combination for IT.
One way to address this growing risk is to quickly deploy a workable (not necessarily perfect) private cloud, and once your capability is solid, create an “amnesty” program to bring workloads back from the public cloud. With the proper incentives, you’ll get both the workloads you knew about and some that you never knew existed. But you’ll be in a position to protect your company’s data while providing users the IT experience they need to compete.