[ETR #25] GDPR, DSAR, PII & U


Extract. Transform. Read.

A newsletter from Pipeline.

Hi past, present or future data professional!

When I worked at Disney, there was one line (aside from “Have a Magical Day”) that was borderline beaten into us: “We are all custodial employees.” The line meant, of course, that you should keep the areas under your purview neat and presentable (“show ready” in Disney-speak).

Using the same logic, I’d like to emphasize that while the various data roles (data analyst, data scientist, data engineer, etc.) have their distinct responsibilities, we are all one thing:

Guardians of data security.

Ok, maybe that’s a bit dramatic. But to be even more dramatic, you have about 1.2 billion reasons to care about data privacy. That’s the size of the fine, in euros, Meta (the artist formerly known as Facebook) received for violating perhaps the world’s most comprehensive data privacy framework, the EU’s General Data Protection Regulation (GDPR).

And if you think that’s an isolated incident, there are entire listicles devoted to fines issued under the GDPR alone. Sure, “20 biggest GDPR fines” doesn’t have the same ring as “30 Under 30,” but it’s a stark compilation that should be taken seriously: it can happen to you (or your org).

As someone who has filed data privacy claims myself, I was shocked to find that one (against a realtor illegally using my data for in-person solicitation) was taken dead seriously, while another (against a hospital that sent my wife’s health data to the wrong address) was met with a shrug.

Be the former: the organization that takes these claims seriously. That begins with understanding your individual responsibility as someone who works with sensitive data AND understanding (or spearheading) any effort within your org to standardize how sensitive data is stored and encrypted.

At an individual level

  • (Tactfully) Question requests that might unnecessarily require sensitive user data; do you really need a credit card number and a Social Security number?
  • Leverage cloud-based tools to encrypt data in transit and at rest; I’m partial to GCP’s Sensitive Data Protection suite (see the sketch after this list)
  • Work with your security team to restrict access to your data warehouse and any larger repositories that might contain sensitive data
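
If you’re curious what that second bullet looks like in practice, here’s a minimal sketch of scanning a string for PII with GCP’s Sensitive Data Protection (the google-cloud-dlp client). The project ID, info types and sample text are placeholders I’ve made up; swap in your own environment and the info types that matter to you.

```python
# pip install google-cloud-dlp
from google.cloud import dlp_v2

# Hypothetical project ID; substitute your own.
PROJECT_ID = "my-gcp-project"


def find_pii(text: str) -> None:
    """Scan a string for a few common PII info types and print any findings."""
    client = dlp_v2.DlpServiceClient()
    parent = f"projects/{PROJECT_ID}/locations/global"

    inspect_config = {
        # Example info types; the service supports many more.
        "info_types": [
            {"name": "EMAIL_ADDRESS"},
            {"name": "CREDIT_CARD_NUMBER"},
            {"name": "US_SOCIAL_SECURITY_NUMBER"},
        ],
        "include_quote": True,  # return the matched text itself
    }

    response = client.inspect_content(
        request={
            "parent": parent,
            "inspect_config": inspect_config,
            "item": {"value": text},
        }
    )

    for finding in response.result.findings:
        # Likelihood indicates how confident the detector is in the match.
        print(f"{finding.info_type.name}: '{finding.quote}' ({finding.likelihood.name})")


if __name__ == "__main__":
    find_pii("Contact me at jane@example.com; card ending 4111 1111 1111 1111.")
```

The same client also exposes deidentify_content, if you’d rather mask or tokenize findings before the data ever lands in your warehouse.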

At an organizational level

  • Hire or designate someone to be “in charge” of privacy; for best results this probably shouldn’t be someone who’s already stretched thin, like a director of data science
  • Define and adhere to a clear and consistent deletion policy (after x months we delete records); see the sketch after this list
  • Publicize your data privacy protection efforts and let your users know how to request a deletion
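
To make the deletion-policy bullet concrete, here’s a rough sketch assuming your records live in a date-partitioned BigQuery table. The table name, column names and 13-month window are placeholders, not a recommendation: partition expiration handles the blanket retention rule, and a parameterized DELETE handles an individual deletion (DSAR) request.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table; substitute your own project/dataset/table.
TABLE_ID = "my-gcp-project.user_data.events"
RETENTION_MS = 13 * 30 * 24 * 60 * 60 * 1000  # roughly 13 months

# 1) Blanket retention: expire old partitions automatically.
#    Assumes the table is already day-partitioned on `created_at`;
#    this just sets/updates the partition expiration window.
table = client.get_table(TABLE_ID)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="created_at",
    expiration_ms=RETENTION_MS,
)
client.update_table(table, ["time_partitioning"])


# 2) Individual deletion (DSAR) request: remove one user's records.
def delete_user(user_id: str) -> None:
    query = f"DELETE FROM `{TABLE_ID}` WHERE user_id = @user_id"
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("user_id", "STRING", user_id)]
    )
    client.query(query, job_config=job_config).result()  # wait for completion
```

Whatever window you pick, write it down and publish it; the exact number matters less than the fact that it’s consistent and enforced automatically.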

Aside from running an ethical operation and remaining transparent for users, why put this much effort into data protection?

To paraphrase Marshawn Lynch: I’m just doing this so I don’t get fined.

You won’t get fined if you don’t read these, but here are this week’s links.

Thanks for ingesting,

-Zach Quinn

Pipeline To DE

Top data engineering writer on Medium & Senior Data Engineer in media; I use my skills as a former journalist to demystify data science/programming concepts so beginners to professionals can target, land and excel in data-driven roles.
