Extract. Transform. Read.
A newsletter from Pipeline: Your Data Engineering Resource

Hi past, present or future data professional!

Data engineering can be dangerous. Ok, not physically, but by building and maintaining data infrastructure, data engineers are given a surprising amount of access and responsibility. Every commit, table alteration and deletion must be made with care. It took me 2 years, but I finally learned a shortcut that makes developing SQL staging tables less risky and more efficient.

Even a seemingly minor mistake, like joining on the wrong key, can result in losing days or months of valuable data, which can translate into hundreds of thousands or even millions of dollars in lost revenue visibility. Beyond code mistakes, neglecting logistical factors like vendor contracts and API usage can not only result in downtime; in a worst-case scenario, it can lead to an all-out blackout.

If the stakes sound ominous, I’d suggest examining the root of your hesitation so you can work more confidently and efficiently. That root may even be the code itself. There is a happy medium between freely building data pipelines and using the appropriate guard rails. As long as you take your time and don’t commit code directly to the main branch, you can do data engineering safely and avoid bursting your pipelines.
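To make “the appropriate guard rails” concrete, here is one common pattern for de-risking staging-table work: build the new table in a scratch schema, sanity-check the join, then promote it in a single transaction. This is a minimal sketch in Postgres-style SQL, not the specific shortcut teased above, and the schema and table names (dev, prod, raw.orders, raw.customers) are hypothetical.

-- 1. Build the new version of the table in a dev schema, never in place.
CREATE TABLE dev.stg_orders AS
SELECT
    o.order_id,
    o.customer_id,
    c.region,
    o.order_total
FROM raw.orders AS o
-- An inner join on the wrong key would silently drop rows,
-- so validate the join before promoting anything.
JOIN raw.customers AS c
    ON o.customer_id = c.customer_id;

-- 2. Sanity-check row counts before touching production.
SELECT
    (SELECT COUNT(*) FROM raw.orders)     AS source_rows,
    (SELECT COUNT(*) FROM dev.stg_orders) AS staged_rows;

-- 3. Only once the counts line up, swap the table in one transaction.
BEGIN;
ALTER TABLE prod.stg_orders RENAME TO stg_orders_old;
ALTER TABLE dev.stg_orders SET SCHEMA prod;
COMMIT;

If the counts diverge, the production table hasn’t been touched yet, which is the whole point of the guard rail.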
P.S. Want to learn how to go from code to automated pipeline? Take advantage of my 100% free email course: Deploy Google Cloud Functions In 5 Days.

Thanks for ingesting,
-Zach
Reaching 20k+ readers on Medium and nearly 3k learners by email, I draw on my 4 years of experience as a Senior Data Engineer to demystify data science, cloud and programming concepts while sharing job hunt strategies so you can land and excel in data-driven roles. Subscribe for 500 words of actionable advice every Thursday.