Extract. Transform. Read.
A newsletter from Pipeline

Hi past, present or future data professional! To clarify the focus of this edition of the newsletter: you shouldn't bother learning certain data engineering skills for one of two reasons.
You won't need them

Generally, these are peripheral skills that you *technically* need but will hardly ever use. One of the most obvious, for most data engineering teams, is any visualization tool. This might mean out-of-the-box BI tools like Looker, or scripting-based visualizations like the kind you'd generate using Matplotlib or ggplot (shout out to any R users in the house).

Speaking of R, remember those statistical languages and environments (R, MATLAB, etc.) you learned as part of your data science degree? Yeah, you'll almost never use them to build production pipelines. In some circumstances you may use these tools to validate data or build analytic models, but ML modeling is typically outside the scope of a data engineering role. And unless your company isn't yet in the cloud (there are some *ahem* late adopters out there), you likely won't use on-premise databases like Postgres or vendor-specific SQL dialects like T-SQL.

You'll learn them on the job

I'll caveat this category with the assumption that you're fortunate enough to land in an org that provides proper training and mentorship for new engineers. Even with the cynicism that comes with being a senior engineer, I believe most team members want to help each other; data engineering is a team sport, after all.

One of the regular exercises a team performs is committing, reviewing and merging changes into a production code base. Like the majority of companies, yours will probably do this through GitHub. I disagree with those who say you need to learn git before working professionally; I only knew the UI and picked up the CLI commands quickly. It's not technically complex.

Another skill you don't really need to worry about in advance is a team's codeless pipeline tool (assuming they use one). Some job listings mention Fivetran (a big codeless provider), but there is usually plenty of documentation and third-party support to help you acclimate to the platform and troubleshoot issues.
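The commit/review/merge cycle mentioned above really is a handful of CLI commands. Here's a minimal, local-only sketch of it; the file names, branch name, and identity are hypothetical placeholders (on GitHub, the final merge would happen through a pull request instead):

```shell
# A minimal, local-only sketch of the branch/commit/merge cycle.
mkdir demo-repo && cd demo-repo
git init
git config user.email "dev@example.com"   # placeholder identity for the demo
git config user.name "Demo Dev"

# First commit on the default branch
echo "rows = 100" > pipeline.py
git add pipeline.py
git commit -m "Initial pipeline script"

# Do the actual work on a feature branch, the unit of review
git checkout -b feature/row-count-fix
echo "rows = 200" > pipeline.py
git commit -am "Fix row count"

# Back to the default branch; on GitHub this merge is the PR merge
git checkout -
git merge feature/row-count-fix
git log --oneline
```

That's essentially the whole vocabulary you need on day one; everything else (rebase, cherry-pick, bisect) can wait until a teammate shows you why you'd want it.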
Finally, something useful you'll pick up on the job is how to properly validate data. It's important to have a baseline understanding of "what looks right" when completing school or portfolio projects, but there's no way to know what your team/org expects until you're completing a deliverable. If you want to get a sense of how to "smell check" data in SQL, you can refer to one of my previous articles.

When acquiring skills or upskilling, one of the most valuable things you can learn is where to focus your time and attention.

Thanks for ingesting,

-Zach Quinn
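To make the "smell check" idea concrete, here is a minimal sketch in plain Python of the kind of baseline checks teams expect before a deliverable ships: empty-table, null, and duplicate-key checks. The records, column names, and thresholds below are hypothetical sample data, not anything from a real pipeline:

```python
# A minimal data "smell check": empty-table, null, and duplicate-key checks.
# The records and column names below are hypothetical sample data.
records = [
    {"order_id": 1, "amount": 19.99, "region": "US"},
    {"order_id": 2, "amount": None,  "region": "EU"},
    {"order_id": 2, "amount": 5.00,  "region": "EU"},  # duplicate key
]

def smell_check(rows, key, required_cols):
    """Return a list of human-readable issues found in `rows`."""
    issues = []
    if not rows:
        issues.append("table is empty")
        return issues
    # Null check: flag required columns with missing values
    for col in required_cols:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            issues.append(f"{col}: {nulls} null value(s)")
    # Uniqueness check: the key column should not repeat
    keys = [r[key] for r in rows]
    if len(keys) != len(set(keys)):
        issues.append(f"{key}: duplicate values found")
    return issues

print(smell_check(records, "order_id", ["order_id", "amount", "region"]))
```

In practice the same checks usually live in SQL (`COUNT(*)`, `COUNT(col)`, `GROUP BY key HAVING COUNT(*) > 1`), but the logic is identical; what differs team to team is which checks are mandatory and what thresholds count as a failure.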
Reaching 20k+ readers on Medium and 3k learners by email, I draw on my 4 years of experience as a Senior Data Engineer to demystify data science, cloud and programming concepts while sharing job hunt strategies so you can land and excel in data-driven roles. Subscribe for 500 words of actionable advice every Thursday.