A report filed over the weekend detailed research that found more than 2 billion files exposed online across a variety of sources, including cloud servers, network-based storage, and company-owned data repositories. The exposed data includes a massive trove of credit card information, medical records, private photographs, and intellectual property such as patent filings.
While not all of the exposed files would be considered confidential or sensitive, all of them were private. They included millions of X-rays and medical scans, credit card details, payroll files, and patent documents, all stored alongside less sensitive information on cloud and network storage servers that were found to be misconfigured. Roughly 17 million of the files had already been encrypted by ransomware, a sign that, left undiscovered, the infection would very likely have spread to other files.
The report from Digital Shadows said that 2.3 billion files were publicly available across “…SMB-enabled file shares, misconfigured network-attached storage (NAS) devices, File Transfer Protocol (FTP) and rsync servers, and Amazon S3 buckets.” In what may be one of the largest exposures ever discovered in the public cloud, the finding also points to the breadth of data sources that can be breached with relative ease.
The main issue is that most of the exposed resources were misconfigured servers in the public cloud, the result of a lack of configuration management and poor visibility into whether servers were adhering to CIS Benchmarks and internal configuration requirements.
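Checking adherence to a configuration baseline can be sketched as a simple comparison of a resource’s settings against the values a benchmark expects. The sketch below is illustrative, not the report’s methodology: the setting names mirror AWS S3’s Block Public Access flags, and the `audit_bucket` helper is a hypothetical example, not a real library call.

```python
# Minimal sketch: audit a storage bucket's settings against a secure
# baseline. Flag names mirror AWS S3 Block Public Access; the helper
# itself is hypothetical and operates on a plain dict of settings.

SECURE_BASELINE = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": True,
    "RestrictPublicBuckets": True,
}

def audit_bucket(name, settings):
    """Return a list of settings that deviate from the secure baseline."""
    findings = []
    for key, expected in SECURE_BASELINE.items():
        actual = settings.get(key, False)  # a missing flag means not enforced
        if actual != expected:
            findings.append(f"{name}: {key} is {actual}, expected {expected}")
    return findings

# Example: a bucket whose public-policy protection was switched off
print(audit_bucket("backups", {"BlockPublicAcls": True, "BlockPublicPolicy": False}))
```

In practice the same comparison would run continuously against settings pulled from the cloud provider’s API, rather than against a hand-built dict.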
Configurations are where so many vulnerabilities get started. Servers, buckets, virtual machines, applications, and other resources change configurations and settings to meet business needs and to integrate with other data sources. It’s part of the dynamic nature of the cloud, but it’s also a constant, continuous game of hide-and-go-seek. And frankly, mom isn’t having any of it.
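One way to keep score in that game of hide-and-seek is to snapshot each resource’s configuration periodically and diff the snapshots, so that any drift surfaces as a concrete change record. A minimal sketch, with invented setting names and snapshot data for illustration:

```python
# Minimal sketch: detect configuration drift between two snapshots of
# the same resource taken at different times. Setting names are invented.

def diff_config(old, new):
    """Return settings that changed, appeared, or disappeared between snapshots."""
    drift = {}
    for key in old.keys() | new.keys():
        before, after = old.get(key), new.get(key)
        if before != after:
            drift[key] = {"before": before, "after": after}
    return drift

snapshot_monday = {"encryption": "on", "public_read": False}
snapshot_friday = {"encryption": "on", "public_read": True, "cors": "*"}
print(diff_config(snapshot_monday, snapshot_friday))
```

A drift record like `public_read` flipping from `False` to `True` is exactly the kind of quiet change that, unnoticed, turns into the sort of exposure the report describes.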
It’s also critical that organizations have insight into their cloud accounts, workloads, and container infrastructure. Without that insight, an organization faces information gaps that prevent it from detecting misconfigurations, unenforced policies, and other issues that could easily lead to a breach.
The solution to these potential gaps in cloud security is one that monitors and logs all inter-process activity, even between processes on the same host. You need a host-based intrusion detection system designed to monitor process hierarchy, process and machine communications, changes in user privileges, internal and external data transfers, and other cloud activity. An effective system looks across all layers and analyzes activity against normalized baseline behavior, giving a continuous, real-time view even across short-lived services that may exist for only a few minutes. That process-to-process visibility is a critical factor in building strong, effective security into any cloud environment.
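The process-hierarchy monitoring described above can be illustrated with a toy allowlist check on parent-to-child process launches. This is a sketch only: the event format is invented, and a real host-based IDS would collect these events from the kernel (for example via audit or eBPF subsystems) rather than from a hard-coded list.

```python
# Toy sketch of process-hierarchy monitoring: flag parent -> child
# launches that fall outside an expected allowlist. The event schema
# and the allowlist contents are invented for illustration.

ALLOWED_CHILDREN = {
    "sshd": {"bash"},
    "bash": {"ls", "ps", "python"},
    "nginx": set(),  # a web server spawning shells is suspicious here
}

def flag_anomalies(events):
    """events: iterable of (parent_name, child_name) launch records."""
    alerts = []
    for parent, child in events:
        if child not in ALLOWED_CHILDREN.get(parent, set()):
            alerts.append(f"unexpected: {parent} -> {child}")
    return alerts

events = [("sshd", "bash"), ("bash", "ls"), ("nginx", "sh")]
print(flag_anomalies(events))
```

Real systems learn the normalized baseline rather than hard-coding it, but the core idea is the same: an unexpected parent-child pair is a signal worth surfacing.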
- Understand the potential threats that exist in your environment.
- Make security part of your cloud IT practices.
- Ask the right questions when evaluating security vendors.