SafeNet Delivers Enterprise-Grade Big Data Security; Focuses on Hadoop Data at Rest

Data protection provider SafeNet is shipping an update to its ProtectFile security solution to encrypt sensitive data at rest in Apache Hadoop clusters.

Tags: big data, data security, encryption, compliance, Hadoop, HIPAA, policy, ProtectFile, SafeNet

As organizations expand their use of Hadoop to reap the benefits of cost-effective storage and faster data processing, they are looking for ways to secure sensitive data distributed across the nodes of Hadoop clusters, according to Todd Moore, vice president of SafeNet’s encryption products unit.

“Organizations are playing a balancing act between capitalizing on Hadoop’s scalability and efficiency to uncover value in big data and protecting high-value information in their implementations,” Moore said in a statement. “With the volume of data that a company generates growing exponentially and the number of breaches on the rise, security must be a priority with Hadoop deployments.”

 

With this update, SafeNet extends ProtectFile to provide fully automated file encryption of unstructured data on network drives and file servers across a Hadoop cluster architecture, without requiring IT or data owners to re-architect anything.
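
To make that concrete, here is a minimal sketch of what automated file-level encryption of unstructured data can look like. It uses the open-source Python cryptography package’s Fernet recipe; the directory path and key handling are illustrative assumptions, not SafeNet’s ProtectFile implementation, which performs this work transparently at the file-system layer.

```python
# Illustrative only: bulk file-level encryption of unstructured data,
# in the spirit of what ProtectFile automates across data nodes.
# Requires the open-source `cryptography` package (pip install cryptography);
# this is not SafeNet's API.
from pathlib import Path

from cryptography.fernet import Fernet


def encrypt_tree(root: Path, key: bytes) -> None:
    """Rewrite every regular file under `root` as authenticated ciphertext."""
    fernet = Fernet(key)
    for path in root.rglob("*"):
        if path.is_file():
            path.write_bytes(fernet.encrypt(path.read_bytes()))


if __name__ == "__main__":
    # Hypothetical directory standing in for the local storage
    # that backs a Hadoop data node.
    key = Fernet.generate_key()  # in practice, fetched from a key manager
    encrypt_tree(Path("/data/hdfs/datanode"), key)
```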

 

Notably, the updates to SafeNet’s ProtectFile for Linux allow Hadoop adopters to secure their data without paying a performance penalty. Further, because encryption is transparent to the user, there is no disruption to business operations, performance, or the end-user experience, Moore noted.
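
The transparency point can be illustrated with a toy wrapper, assuming the same Fernet-based encryption as the sketch above: applications keep reading and writing plaintext while the bytes at rest stay ciphertext. A product like ProtectFile does this below the file API, so application code would not change at all.

```python
# Toy decrypt-on-read / encrypt-on-write wrapper. Real transparent
# encryption lives below the file API (e.g., in a file-system filter),
# so applications would not even need this wrapper.
from pathlib import Path

from cryptography.fernet import Fernet


class TransparentFile:
    """Applications see plaintext; bytes at rest stay ciphertext."""

    def __init__(self, path: Path, fernet: Fernet):
        self._path = path
        self._fernet = fernet

    def read(self) -> bytes:
        return self._fernet.decrypt(self._path.read_bytes())

    def write(self, plaintext: bytes) -> None:
        self._path.write_bytes(self._fernet.encrypt(plaintext))


if __name__ == "__main__":
    f = TransparentFile(Path("report.csv"), Fernet(Fernet.generate_key()))
    f.write(b"account,balance\n42,1000\n")            # stored encrypted on disk
    assert f.read() == b"account,balance\n42,1000\n"  # read back as plaintext
```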

 

Other updates to SafeNet ProtectFile include:

  • Rapid deployment and implementation: Automation tools provide fast roll-out and standardized deployment to multiple data nodes in a Hadoop cluster.
  • No re-architecting required: No changes are needed to an enterprise’s existing big data implementation to obtain granular file- and folder-level encryption throughout the data life cycle.
  • Hardware-based centralized key and policy management: Maintain control of encryption keys for added security, and define tight access controls to guard against unauthorized or rogue mining of high-value data in a Hadoop cluster.
  • Compliance-ready capabilities: Support and enforce compliance mandates, such as HIPAA and PCI DSS, in big data implementations. Access policies ensure that only authorized, authenticated users can view sensitive unstructured data, enabling secure collaboration (a minimal sketch of such policy-gated key release appears after this list).
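
As a rough illustration of that last point, the sketch below models a centralized key manager that releases a data set’s key only to users its policy allows. All user and data-set names are hypothetical, and a production deployment would keep keys in hardware-backed key management rather than an in-memory dictionary.

```python
# Illustrative policy-gated key release. All user and data-set names are
# hypothetical; a real deployment would hold keys in hardware-based
# centralized key management, not an in-memory dictionary.
from cryptography.fernet import Fernet


class KeyManager:
    """Releases a data set's encryption key only to users its policy allows."""

    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}
        self._policies: dict[str, set[str]] = {}  # data set -> allowed users

    def register(self, dataset: str, allowed_users: set[str]) -> None:
        self._keys[dataset] = Fernet.generate_key()
        self._policies[dataset] = set(allowed_users)

    def key_for(self, dataset: str, user: str) -> bytes:
        if user not in self._policies.get(dataset, set()):
            raise PermissionError(f"{user} may not decrypt {dataset}")
        return self._keys[dataset]


if __name__ == "__main__":
    km = KeyManager()
    km.register("patient-records", {"analyst_a"})
    km.key_for("patient-records", "analyst_a")         # authorized: key released
    try:
        km.key_for("patient-records", "contractor_x")  # unauthorized: refused
    except PermissionError as err:
        print(err)
```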

Garrett Bekker, senior security analyst at 451 Research, noted that as Hadoop becomes more widely adopted, data security features are becoming top of mind. “The ability to encrypt data at rest that is potentially distributed across thousands of nodes should provide comfort for organizations that are concerned not only about the security of their existing data, but also the proprietary and potentially highly confidential outcomes of their big data experiments,” he said in a statement.

Alongside ProtectFile for Linux, SafeNet also updated ProtectFile for Windows to enable what the company called “transparent and automated file-system-level encryption of server data” for distributed Windows environments, Moore added.



