Extract. Transform. Read.
A newsletter from Pipeline

Hi past, present or future data professional!

It's never good when you wake up to this from a coworker: 💀

The skull wasn't because the sender feared any dramatic fate of their own. Instead, they were prepared to administer near-fatal justice to the junior engineer who had made several unnecessary overnight commits straight to our org's main branch.

The thing is, for a first-time violation, I can understand why testing is an afterthought for new engineers. Schools and courses emphasize local output over production, so testing feels like an extra step. To properly test code, you need to configure a clean, production-adjacent environment. If you're new to this concept, here are 2 of my favorites, along with an unusual choice.

The safe choice: Virtual Environment

I use two virtual environment tools that can be configured interchangeably: Pyenv and Venv. Pyenv is easy to configure and use from a terminal in a "professional" IDE like VS Code, and it's ideal because it lets me create an environment from a blank slate each time. Venv is another option. Instead of using Venv in VS Code, here is how I set up a virtual Python environment inside a virtual machine (VM). Read more to learn how to set up a quick, durable sandbox in a Compute Engine VM. (A short code sketch is included at the end of this issue.)

The portable option: Docker

I'll confess: I wasn't always a fan of Docker. I didn't really "get" containerization, and I could already set up a virtual environment using the processes described above. But I learned that Docker's true power is its portability. Not only can I create a clean slate (an image), I can push it to a registry to create testing configs before I test script changes in production. Powerful stuff. The one issue I had was authenticating with GCP; I describe my solution here. (A build-and-push sketch follows below as well.)

The unusual pick: Jupyter Notebook

Jupyter Notebook gets a bad rap in the data engineering community. Because it's seen as a tool for data analysts and data scientists, it doesn't seem to make sense for data engineers to develop and test in an environment best known for its nicely rendered outputs. But buying into that argument would cause you to miss out on some useful features and, frankly, a nice UX.

And so you don't have to search for resources, here are this week's links.
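To make the virtual-environment idea concrete, here's a minimal sketch of a throwaway, production-adjacent sandbox built with Python's standard-library venv module (the same thing `python -m venv` does under the hood). The sandbox path, the pandas dependency and the tests/ folder are placeholders I've chosen for illustration, so swap in whatever your pipeline actually needs.

```python
# Minimal sketch: build a clean sandbox with the stdlib venv module,
# then install and test inside it instead of your system Python.
import subprocess
import sys
import venv
from pathlib import Path

env_dir = Path(".sandbox-venv")        # placeholder location for the sandbox
venv.create(env_dir, with_pip=True)    # blank-slate environment, nothing inherited

# Point at the sandbox's own interpreter, not the system one.
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
env_python = env_dir / bin_dir / "python"

# Install only what the code under test needs, then run the test suite
# inside the sandbox before anything touches main.
subprocess.run([str(env_python), "-m", "pip", "install", "pandas", "pytest"], check=True)
subprocess.run([str(env_python), "-m", "pytest", "tests/"], check=True)
```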
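And here's a rough sketch of the Docker "clean slate, then push it somewhere reusable" flow, using the Docker SDK for Python (pip install docker). It assumes a Dockerfile in the current directory and that Docker is already authenticated to your registry (the GCP piece I describe in the linked post); the Artifact Registry path below is a made-up example, not a real project.

```python
# Minimal sketch: build an image from ./Dockerfile and push it to a registry
# so the same testing config can be pulled anywhere (teammates, CI, a VM).
import docker

client = docker.from_env()  # talks to the local Docker daemon

# Build the reusable "clean slate" image.
image, build_logs = client.images.build(
    path=".",
    tag="us-central1-docker.pkg.dev/example-project/example-repo/pipeline-test:latest",
)

# Push the tagged image so the exact same environment can be pulled
# before testing script changes against production.
for line in client.images.push(
    "us-central1-docker.pkg.dev/example-project/example-repo/pipeline-test",
    tag="latest",
    stream=True,
    decode=True,
):
    print(line)
```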
What's your preferred testing ground? Let me know: zach@pipelinetode.com.

Thanks for ingesting,
-Zach Quinn
Top data engineering writer on Medium & Senior Data Engineer in media; I use my skills as a former journalist to demystify data science/programming concepts so beginners to professionals can target, land and excel in data-driven roles.