Common Challenges When Handling PII Data - Security Boulevard

2022-09-03



Personally identifiable information (PII) is information that can be used to distinguish or trace an individual’s identity. Understandably, if compromised, such sensitive information may be used for identity theft, fraud and other harmful and unlawful purposes. With data today considered a lucrative commodity, organizations are more aware than ever of the need for adopting the most stringent security and compliance standards and regulations, such as GDPR, to reinforce trust and confidence with their users and customers. 

The act of balancing security and business needs is often challenging, as environments are becoming more complex. Data structures, relationships and usage constantly change, and simple tasks such as access tracking may now introduce technical and procedural challenges. That being said, there are key processes and mechanisms to support both business agility and enforcement of data security guardrails. 

At the foundation of securing all data are data protection measures. As a baseline, all sensitive data—either at rest or in transit—should always be encrypted (or pseudonymized) according to industry best practices, making it difficult for hackers to extract valuable consumer information even if they gain access to it. It is important to note, however, that security is built in layers, and encryption does not ensure complete protection, especially as the “hack now, decrypt later” approach becomes popular.
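Pseudonymization, mentioned above alongside encryption, can be sketched with a keyed hash. The snippet below is a minimal illustration, not a complete data protection scheme; the key value and field names are invented for the example:

```python
import hmac
import hashlib

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a PII value with a keyed, non-reversible token.

    Using HMAC rather than a plain hash matters: without the key,
    an attacker cannot precompute tokens for common values such as
    email addresses (a dictionary attack).
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# In practice the key lives in a secrets manager and is rotated;
# this literal is for illustration only.
key = b"example-key-store-in-a-vault"
token = pseudonymize("jane.doe@example.com", key)

# The same input always maps to the same token (so joins and
# deduplication still work), but the token itself reveals nothing.
assert token == pseudonymize("jane.doe@example.com", key)
assert token != pseudonymize("john.doe@example.com", key)
```

Note that pseudonymized data can still be re-identified by anyone holding the key, which is why GDPR treats it as personal data rather than anonymized data.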

Below are the top six challenges in data handling that should be addressed when designing and evaluating sensitive environments.

When creating data flow diagrams (DFDs) and flagging resources as containing PII or sensitive data, it is often easier to set a high compliance bar across the board and use a standard set of controls on any classified data without ample consideration for the possible ramifications of overclassification. However, not all PII data should be treated equally. Personal data and PII are broad categories that encapsulate numerous more specific classifications varying from public to highly classified, and when combined with business privacy models and users’ consent, the sensitivity level is not always straightforward. Using uniform controls for all types of PII data, regardless of variations in sensitivity or value, may lead to an inability to protect highly critical data appropriately.

Assigning the right classification level to data is an important task that drives a reduction of organizational spending across the board, ensures engineers and researchers have relevant access to data, encourages balanced ownership and assists with education efforts. Overclassification has potentially harmful effects, as it leads to stringent restrictions on usage and sharing, false positive alerts, or a false measurement of an alert’s severity. A recommended practice is to create a classification wizard based on common organizational use cases, using simple questions and clear instructions to guide employees on how to classify and handle specific sets and types of data without overclassifying them.
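The classification wizard described above can be as simple as a decision function over a handful of yes/no answers. The categories and level names below are illustrative placeholders, not a standard taxonomy:

```python
def classify(record_types: set, has_consent_restrictions: bool) -> str:
    """Toy classification wizard: map answers about a dataset to a
    handling level. Categories and levels here are invented for the
    sketch; a real wizard would encode the organization's own taxonomy.
    """
    HIGH = {"government_id", "health", "financial"}
    MEDIUM = {"email", "phone", "full_name"}

    # Consent restrictions or high-sensitivity fields force the top tier.
    if record_types & HIGH or has_consent_restrictions:
        return "restricted"
    # Ordinary direct identifiers land in the middle tier.
    if record_types & MEDIUM:
        return "confidential"
    # Everything else avoids the overclassification trap.
    return "internal"

assert classify({"health"}, False) == "restricted"
assert classify({"email"}, False) == "confidential"
assert classify({"page_views"}, False) == "internal"
```

The point of the explicit "internal" fallback is exactly the overclassification concern above: data that matches no sensitive category should not default to the strictest tier.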

Data, specifically sensitive data, is a high-value asset and often a mandatory organizational building block that drives research, investigation, product development and other critical business needs. By nature (and upon customer consent) it is shared internally between teams, exported to third-party approved entities and sometimes even published publicly. These rapidly and constantly evolving environments cause organizations to quickly lose control of where their data is stored. When it comes to PII, there are real-world financial consequences to loss of data, and therefore maintaining a continuously updated PII inventory and controlling access to shared data is crucial—but has become increasingly difficult.

Useful practices to ensure that organizational data privileges are not abused or overly provisioned include maintaining context and similar access controls between replicas/backups and the original data, as well as creating clear ownership models and empowering owners with tools to periodically evaluate sharing use cases and data security levels.
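One concrete way to keep replicas and backups aligned with the original, as suggested above, is a periodic check that no principal holds more access on a copy than on the source. The ACL shape (principal-to-role mapping) is a simplification invented for this sketch:

```python
def acl_drift(original_acl: dict, replica_acl: dict) -> list:
    """Report principals whose access on a replica exceeds what the
    original datastore grants. The role ladder and the ACL shape
    (principal -> role) are hypothetical simplifications.
    """
    rank = {"none": 0, "read": 1, "write": 2, "admin": 3}
    findings = []
    for principal, role in replica_acl.items():
        allowed = original_acl.get(principal, "none")
        if rank[role] > rank[allowed]:
            findings.append(f"{principal}: {role} on replica, {allowed} on original")
    return findings

# A replica that mirrors the original raises nothing.
assert acl_drift({"alice": "read"}, {"alice": "read"}) == []
# A principal absent from the original is flagged immediately.
assert acl_drift({"alice": "read"}, {"bob": "write"}) == [
    "bob: write on replica, none on original"
]
```

Run on a schedule by the data owner, a check like this turns the "periodically evaluate sharing use cases" advice into an actionable report.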

Data protection does not stop at access control or encryption; it also entails the responsibility to purge sensitive records. This requirement is regulated, as detailed in GDPR Article 5, for example, which specifies that data should be retained for only as long as required to achieve the purpose for which it was collected and processed. Enforcing this directive and tracking the life cycle of data between production environments, backups, research and other areas are challenging tasks.

Whenever possible, organizations should harness the power of the cloud (whether SaaS, PaaS or DBaaS) and configure auto-purge for any sensitive object and DB record, or invest in application-level monitoring and automation to ensure that data is kept only as long as required—but no longer. It is also imperative that users work with security or privacy officers to ensure that they are making use of the proper data retention policies.
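Where a managed TTL feature is not available, the application-level purge described above amounts to partitioning records by a retention cutoff. This is a minimal sketch; the record shape and 90-day window are assumptions for the example:

```python
from datetime import datetime, timedelta, timezone

def purge_expired(records, retention_days, now=None):
    """Split records into (kept, purged) by a retention window.
    Record shape is a simplification: {"id": ..., "created_at": datetime}.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    kept = [r for r in records if r["created_at"] >= cutoff]
    purged = [r for r in records if r["created_at"] < cutoff]
    return kept, purged

now = datetime(2022, 9, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "created_at": datetime(2022, 8, 30, tzinfo=timezone.utc)},
    {"id": 2, "created_at": datetime(2022, 5, 1, tzinfo=timezone.utc)},
]
kept, purged = purge_expired(records, retention_days=90, now=now)
assert [r["id"] for r in kept] == [1]
assert [r["id"] for r in purged] == [2]
```

In production the same cutoff logic would drive a scheduled deletion job, and the purge itself must also reach backups and replicas, which is the hard part the paragraph above alludes to.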

One of the hardest tasks for security/compliance owners is to determine, assess and refresh the appropriate permission levels and enforce dynamic authorization to sensitive information. When analyzing incidents involving identity theft and malicious insiders, we learn about the often disastrous implications and growing attack surface resulting from the number of users with access to resources they no longer need in production environments.

A recommended practice to mitigate this is to ensure that the data owner alone approves any access request to sensitive data, based on the principle of least privilege. This is certainly an arduous challenge, but identifying the more sensitive assets and classifying them correctly will help security teams focus and prioritize their efforts accordingly. Additionally, the owner should enforce an automatic expiration of no longer than 24 hours, which educates teams on production access policies, keeps granted permissions under ongoing evaluation and creates an access trail record.

A critical element of data usage is its appropriate and timely recording and logging. Proper hygiene of the data audit trail is a vital component of an organization’s cloud data security posture and enables the traceability of data movement, which plays a key role in incidents and investigations.

By leveraging audit solutions, local or centralized, and combining them with CDSPM solutions, user access and activity involving sensitive data can be easily monitored and analyzed to generate insights around data ownership and consumption, as well as to highlight risky behavior or data exposure. 
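The kind of insight described above can start from a very simple aggregation over the audit trail: count sensitive-data reads per user and flag outliers. The event shape and threshold below are assumptions for the sketch, not the schema of any particular audit solution:

```python
def flag_risky_access(events, sensitive_resources, threshold):
    """Count reads of sensitive resources per user in an audit trail
    and return the users at or above a threshold. Event shape
    ({"user", "resource", "action"}) is a hypothetical simplification.
    """
    counts = {}
    for e in events:
        if e["resource"] in sensitive_resources and e["action"] == "read":
            counts[e["user"]] = counts.get(e["user"], 0) + 1
    return {user for user, n in counts.items() if n >= threshold}

events = [
    {"user": "alice", "resource": "pii/users", "action": "read"},
    {"user": "alice", "resource": "pii/users", "action": "read"},
    {"user": "bob", "resource": "public/docs", "action": "read"},
]
assert flag_risky_access(events, {"pii/users"}, threshold=2) == {"alice"}
```

Real deployments would add time windows and baselines per user, but even this shape shows why a clean, complete audit trail is a precondition: the analysis is only as good as the events that were recorded.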

Ephemeral datastores containing PII are commonly used to address urgent needs and act as a quick or temporary fix. While short-lived, their transitory nature is misleading. During their brief existence, they may have been accessed without proper authorization or oversight, and the PII data within them may have been exported to another location without supervision. While regulations such as GDPR do define requirements for short-lived datastores, enforcing their implementation is challenging. The activity of ephemeral datastores is often not tracked, they lack a clear owner, and the appropriate security guidelines may not be in place and are increasingly difficult to regulate. Making things worse, time-to-discover and time-to-investigate risks in these datastores are often much longer than the datastore’s actual lifetime. Automation is a key component of managing this challenge, with automated datastore discovery and pattern identification providing quick, actionable insights for continuous oversight and incident mitigation. 
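The automated discovery mentioned above can be sketched as a sweep over an inventory of datastore records, flagging stores with no owner or that have outlived their declared lifetime. The inventory record shape is hypothetical, standing in for whatever a cloud asset inventory or DSPM tool would return:

```python
def audit_ephemeral_stores(stores, now_epoch):
    """Flag ephemeral datastores that are missing an owner or have
    outlived their declared TTL. The store shape is a hypothetical
    inventory record, not a real cloud API response.
    """
    findings = []
    for s in stores:
        if not s.get("owner"):
            findings.append((s["name"], "no owner"))
        if now_epoch > s["created_epoch"] + s["ttl_seconds"]:
            findings.append((s["name"], "past declared lifetime"))
    return findings

stores = [
    {"name": "tmp-export-1", "owner": None, "created_epoch": 0, "ttl_seconds": 3600},
    {"name": "scratch-db", "owner": "data-team", "created_epoch": 0, "ttl_seconds": 7200},
]
assert audit_ephemeral_stores(stores, now_epoch=5000) == [
    ("tmp-export-1", "no owner"),
    ("tmp-export-1", "past declared lifetime"),
]
```

Because these stores may vanish before a quarterly review ever sees them, a sweep like this needs to run continuously, which is exactly why the paragraph above names automation as the key component.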

Gad Rosenthal is a product manager at Eureka Security, a Cloud Data Security Posture Management platform that enables security teams to successfully navigate the ongoing and often chaotic expansion and growth of cloud data. Prior to joining Eureka, Gad led cybersecurity and compliance initiatives at the Israeli Cyber Command, the Israeli Cyber Education Center, Siemplify (now part of Google) and Microsoft (M365 Defender Suite) as a PM, an auditor and a security architect. In these roles, he shaped how organizations experience and handle the cybersecurity and compliance landscapes. Passionate about technology, cyber, compliance and people, Gad is driven to identify and develop exciting new technologies and strategies to help fellow security leaders.
