Whether sensitive data is accessed through these ever-changing endpoints, stored in Big Data repositories, or provisioned through virtual applications and virtual desktops, RSA DLP is designed to give organizations the visibility they need to make better-informed decisions around security and governance.
Organizations need to manage and secure sensitive information across a growing number of consumerized endpoints and virtualized data centers. The enhanced RSA DLP Suite is engineered to help organizations better manage the risk to sensitive data introduced when end users connect to corporate assets with their own consumer-grade smartphones, tablets, and personal computers.
The RSA DLP Suite is now designed to monitor the flow of sensitive data and help prevent data loss whether end users connect to corporate email through Microsoft ActiveSync or through Outlook Web Access.
In addition, the new RSA DLP Suite is designed to monitor all data transfers from virtual desktops and virtual applications to the end user's physical device, whether that device is a smartphone, a tablet, or a home PC. Because the suite integrates more deeply with the infrastructure and addresses risk holistically, organizations can avoid deploying agents on multiple devices.
As data volumes, loss vectors, and types of sensitive data grow, managing DLP programs is becoming more complex. To address these management challenges, RSA has added new functionality to DLP and is also offering two new DLP process management modules: DLP Policy Workflow Manager and DLP Risk Remediation Manager.
These DLP process modules are designed to help organizations better manage both the lifecycle of data protection policies and the risk discovered through DLP scans.
To simplify discovering and managing risk for data at rest, the new RSA DLP Suite combines the power of grid-based scanning with native support for Microsoft SharePoint sites and Microsoft Exchange repositories. With this combination, organizations can now cost-effectively manage the risk introduced by the proliferation of data across various Big Data repositories.