[ETR #28] 2024 In 12 Errors You Must Avoid


Extract. Transform. Read.

A newsletter from Pipeline

Hi past, present or future data professional!

If you’ve ever seen the legendary American sitcom Seinfeld, you might be familiar with the fictional holiday the characters create, Festivus: “A Festivus for the rest of us.” As a rejection of conventional winter holidays like Christmas and Hanukkah, a core part of Festivus is the “airing of grievances.” While I have yet to attempt this in real life, I’ve spent the past two years airing my grievances with aspects of data engineering, with the intention of exposing you, the aspiring or early-career engineer, to niche errors that require on-the-fly problem solving.

Since it’s deep into the holiday season for many of you, I won’t take too much time listing all 12 errors; instead, here are the three you’re most likely to encounter when first using technologies like Python, Airflow & SQL.

Erroneous Datetime Conversion

  • The problem: The vendor API & UI were in different time zones
  • The solution: Convert the timestamp to a datetime value and use the UTC offset to subtract the correct number of hours, generating the correct EST start time for the fetch, like I explain here (a quick sketch follows below)
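
Here’s a minimal sketch of that conversion in Python, assuming the vendor returns Unix epoch seconds in UTC; the timestamp value and variable names are hypothetical, not from my actual pipeline:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo  # standard library since Python 3.9

    # Hypothetical vendor timestamp: Unix epoch seconds, implicitly UTC.
    vendor_ts = 1703509200

    # Parse into an aware UTC datetime first, then let the tz database
    # apply the correct Eastern offset (-05:00 in winter, -04:00 in summer).
    utc_dt = datetime.fromtimestamp(vendor_ts, tz=timezone.utc)
    est_dt = utc_dt.astimezone(ZoneInfo("America/New_York"))

    print(utc_dt.isoformat())  # 2023-12-25T13:00:00+00:00
    print(est_dt.isoformat())  # 2023-12-25T08:00:00-05:00

Letting the tz database supply the offset (instead of hard-coding -5) also protects the fetch window when daylight saving flips the clock.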

Creating Excessive Docker Images (And Killing Memory)

  • The problem: Creating a new Docker image without accounting for impacts on system memory
  • The solution: 1) Recognize that the Docker image doesn’t need to be rebuilt for small changes to a Python script. 2) Run docker system prune to remove unused containers, networks and images and free up space (both steps are sketched below)
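
As a rough sketch of both steps, here’s one way to drive Docker from Python; the image name (my-pipeline) and script path (etl.py) are placeholders for illustration, and you can of course run the same two docker commands straight from your terminal:

    import os
    import subprocess

    # Assumed names: "my-pipeline" is an already-built image and etl.py
    # is the pipeline script; both are placeholders for illustration.
    script = os.path.abspath("etl.py")

    # 1) Bind-mount the script so small edits run without rebuilding the image.
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{script}:/app/etl.py",
         "my-pipeline", "python", "/app/etl.py"],
        check=True,
    )

    # 2) Reclaim space from stopped containers, unused networks and dangling
    #    images; -f skips the interactive confirmation prompt.
    subprocess.run(["docker", "system", "prune", "-f"], check=True)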

SQL: Using CREATE OR REPLACE TABLE Instead of INSERT

  • The problem: Running a CREATE OR REPLACE TABLE statement will recreate a table BUT it will also wipe useful metadata and data protection policies
  • The solution: Create a date range that encompasses the entirety of your table’s data and then run a delete statement. Personally, I use MIN() and MAX() in a subquery (see the sketch below)
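
Here’s a minimal sketch of that pattern using Python’s built-in sqlite3 driver so it runs anywhere; the database file, table names (sales, sales_staging) and date column are all hypothetical, and the same DELETE-then-INSERT pattern carries over to warehouses like BigQuery or Snowflake:

    import sqlite3

    # Hypothetical local database standing in for your warehouse.
    conn = sqlite3.connect("warehouse.db")

    # Clear the full date range the table covers, keeping the table object
    # (and the metadata/policies attached to it) intact...
    conn.execute("""
        DELETE FROM sales
        WHERE sale_date BETWEEN (SELECT MIN(sale_date) FROM sales)
                            AND (SELECT MAX(sale_date) FROM sales)
    """)

    # ...then reload with a plain INSERT instead of recreating the table.
    conn.execute("INSERT INTO sales SELECT * FROM sales_staging")
    conn.commit()

Because the table is never dropped, descriptions, policy tags and access grants survive the refresh.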

While understanding the possible errors you could encounter as a data engineer working with multiple technologies is helpful, I believe it’s just as important to cultivate a healthy mental approach to programming.

Programming is one of the coolest, most frustrating ways you can spend your time. The sooner you realize the absurdity of what we do, the sooner you’ll free yourself to make and learn from mistakes like the ones above and those I highlight in the full story.

Here’s to overcoming more bugs, blockers and annoyances in ‘25.

Happy holidays and thanks for ingesting,

-Zach Quinn

Pipeline To DE

Top data engineering writer on Medium & Senior Data Engineer in media; I use my skills as a former journalist to demystify data science/programming concepts so everyone from beginners to professionals can target, land and excel in data-driven roles.
