[ETR #21] Data Pipeline Non-Negotiables


Extract. Transform. Read.

A newsletter from Pipeline

Hi past, present or future data professional!

Despite falling into the realm of engineering, data infrastructure construction is a bit like basic art. At times building a data pipeline is as simple as filling in one of those color-by-numbers books. Other times, the process of extracting and ingesting data can be as abstract and disconnected as paint flicked onto a canvas, Jackson Pollock style.

No matter the complexity of your build, there are always certain brushes, a.k.a. non-negotiables, you should paint with to create intuitive and robust pipelines. I consider the following recommendations to be non-negotiable because they serve the most basic goal of a data pipeline: Providing reliable, prompt and accurate data to data consumers.

A non-negotiable you must include in not only data pipelines, but programmatic scripts at large, is a clear, consistent and accessible form of logging. Good logs will concisely reflect what is going on within a script, revealing insights about each function or step as it is executed. Learn more about the importance of logging and best practices here.
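As a minimal sketch of what "clear, consistent and accessible" logging can look like, here's a Python example using the standard `logging` module; the logger name, format string, and `extract_step` function are illustrative, not from a real pipeline:

```python
import logging

# One consistent configuration for the whole script: timestamp,
# severity, logger name, then the message itself.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger("pipeline")

def extract_step(source: str) -> list:
    """Hypothetical extract step that logs what it does at each point."""
    logger.info("Starting extract from %s", source)
    rows = [{"id": 1}, {"id": 2}]  # placeholder for a real extraction
    logger.info("Extracted %d rows from %s", len(rows), source)
    return rows

rows = extract_step("orders_api")
```

Logging a count of rows in and rows out at each step is an easy habit that pays off the first time a pipeline silently drops data.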

Going hand-in-hand with logging is capturing and referencing API status codes. While APIs differ in the response messages they return, standard HTTP status codes like 200 are universal. Checking them lets you confirm that data was actually returned and distinguish an unsuccessful request from a successful one.
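One way to sketch this, without tying the example to any particular API client, is a small function that classifies a response by its status code before the pipeline trusts the payload; the categories and the retry-on-429/5xx policy here are illustrative assumptions:

```python
from http import HTTPStatus

def classify_response(status_code: int, body: str) -> str:
    """Decide what a pipeline should do with a response, by status code."""
    if status_code == HTTPStatus.OK:  # 200: request succeeded
        # A 200 with an empty body still deserves a distinct label,
        # since "success but no data" is a common silent failure.
        return "success" if body else "success-empty"
    if status_code == HTTPStatus.TOO_MANY_REQUESTS:  # 429: rate limited
        return "retry"
    if 500 <= status_code < 600:  # server-side failure, often transient
        return "retry"
    return "fail"  # 4xx client errors: log the code and investigate
```

Recording the returned code alongside each request in your logs makes a failed run diagnosable after the fact instead of a mystery.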

Once you have the data, I’d suggest, as a non-negotiable, that you keep it in a consistent format. It might be nice being able to iterate through columns in a data frame, convert it to JSON, and then convert to a final data frame, but the resources required to execute the transformations and the redundancy of the operations make this inefficient. If you have to do significant work to unnest data, for instance, it may be better and more efficient to keep your data in JSON form until one final conversion.
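As a sketch of what "keep it in JSON form" can mean in practice, instead of round-tripping between a data frame and JSON, you can keep nested records as plain dicts and flatten them once at the end; this helper and its sample record are illustrative:

```python
def flatten(record: dict, parent: str = "", sep: str = "_") -> dict:
    """Flatten one nested JSON record into a single-level dict.

    Keeping data as JSON until a single final flatten avoids the
    repeated data frame <-> JSON conversions described above.
    """
    out = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, name, sep))  # recurse into nesting
        else:
            out[name] = value
    return out

raw = {"id": 7, "customer": {"name": "Ada", "region": "EU"}}
flat = flatten(raw)
# flat == {"id": 7, "customer_name": "Ada", "customer_region": "EU"}
```

One pass over the raw records, one conversion to a tabular format at the end, and no redundant transformations in between.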

Finally, one of the worst things a pipeline can do (after breaking) is generate duplicate data. Nearly every one of my work builds includes what I call a “refresh” query that deletes the current date’s data as the pipe runs. This means that if the pipeline has to run again, it will produce the exact same output rather than a second copy. The word for behaving like this is “idempotent.” In an org running hundreds of pipelines, you don’t want to create the one pipe with an uncontrollable output.
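A minimal sketch of this "refresh" pattern, using SQLite so it's self-contained; the table and column names are illustrative, and in a warehouse you'd typically target a date partition instead:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (load_date TEXT, amount REAL)")

def load_daily_sales(conn, load_date: str, amounts: list) -> None:
    """Idempotent daily load: delete the date's rows, then insert."""
    # The "refresh" delete: clear the current date's data first so a
    # rerun replaces, rather than duplicates, the day's output.
    conn.execute("DELETE FROM daily_sales WHERE load_date = ?", (load_date,))
    conn.executemany(
        "INSERT INTO daily_sales (load_date, amount) VALUES (?, ?)",
        [(load_date, a) for a in amounts],
    )
    conn.commit()

today = date.today().isoformat()
load_daily_sales(conn, today, [10.0, 20.0])
load_daily_sales(conn, today, [10.0, 20.0])  # rerun: same output, no duplicates
count = conn.execute("SELECT COUNT(*) FROM daily_sales").fetchone()[0]
# count == 2, not 4
```

Running the load twice leaves exactly one copy of the day's data, which is the whole point: reruns become safe instead of scary.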

To review, non-negotiables include:

  • Logging statements
  • A record of API status codes/output
  • Consistent data format
  • “Refresh” delete statements to make the pipeline idempotent

What did I miss? Reply to this email and let me know.

Thanks for ingesting,

-Zach Quinn

Extract. Transform. Read.

Reaching 20k+ readers on Medium and nearly 3k learners by email, I draw on my 4 years of experience as a Senior Data Engineer to demystify data science, cloud and programming concepts while sharing job hunt strategies so you can land and excel in data-driven roles. Subscribe for 500 words of actionable advice every Thursday.
