[ETR #78] FDE


Hi fellow data professional!

Once thought to be a purely back-office role, data engineering is undergoing a radical transformation and gaining a new responsibility: front-end deployment. The folks already deploying applications in this capacity are known as forward deployed software engineers, or forward deployed engineers (FDEs).

Before you worry about needing to learn JavaScript or other web programming paradigms, know that I’m referring to the preparation, deployment and stakeholder/customer support of AI products.

That last bit about stakeholder support defines the FDE role, which "focus[es] on enabling many capabilities for a single customer or business unit environment," according to Ariel Jalali, CEO of Paragon.

And, increasingly, the capabilities customers want enabled and accelerated revolve around everyone's favorite buzz acronym: AI.

An October 2025 study found that, increasingly, “data engineers are stepping out from behind the scenes to help AI strategy and influence business decisions.”

And before you think this is simply good PR speak for data engineers, I’ve observed this firsthand.

This year, my team gained a new engineer exclusively dedicated to creating data-oriented products and infrastructure to service AI initiatives. Their contributions are by far the most visible out of anyone on our team since leadership is salivating over anything AI-related.

The type of data engineering required to build and scale quality AI integrations is a bit different from moving source data through a batch pipeline.

To yield the most accurate output, AI demands two qualities: recency and volume.

This means data engineers creating the infrastructure for AI deployment will be feeding real-time and unstructured data to their integrations.
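As a minimal sketch of what that feeding step can look like (all field names and defaults here are hypothetical, not from any particular platform), normalizing a raw, semi-structured event into a record an AI integration can consume might be as simple as:

```python
import json
from datetime import datetime, timezone

def normalize_event(raw: str) -> dict:
    """Flatten a semi-structured JSON event into a model-ready record.

    The schema is illustrative; a real pipeline would validate
    against the actual upstream event contract.
    """
    event = json.loads(raw)
    return {
        "id": event.get("id", "unknown"),
        "text": (event.get("payload", {}).get("text") or "").strip(),
        "source": event.get("source", "unknown"),
        # Stamp ingestion time so downstream consumers can filter for recency.
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: a raw event as it might arrive off a stream.
record = normalize_event(
    '{"id": "evt-1", "source": "chat", "payload": {"text": " Hello "}}'
)
```

The point isn't the parsing itself; it's that every record carries the recency metadata the model-facing layer needs.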

Before you assume you need to brush up on prompt engineering, know that the most competitive AI skillset you can develop is an understanding of the "big picture" of deployment:

  • Master CI/CD for the pods that contain ETL scripts.
  • Understand the constraints of upstream APIs and how they can be leveraged to extract business insights.
  • Build comfort with agentic workflows.
  • Learn how best to shape and transform the data you feed to underlying models. CSVs and data frames aren't going to cut it here.
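On that last point, one common shape for model-bound text is chunked JSONL rather than a flat CSV. Here's a hedged sketch (chunk size and field names are illustrative, not a standard) of splitting a document into fixed-size chunks and serializing each as a JSON line for an embedding or retrieval step:

```python
import json

def chunk_to_jsonl(doc_id: str, text: str, max_chars: int = 200) -> list[str]:
    """Split a document into fixed-size chunks and serialize each as a
    JSON line. Chunk size and schema are illustrative; real pipelines
    tune these to the downstream model's context limits.
    """
    chunks = [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
    return [
        json.dumps({"doc_id": doc_id, "chunk": n, "text": chunk})
        for n, chunk in enumerate(chunks)
    ]

# A 450-character document yields chunks of 200, 200, and 50 characters.
lines = chunk_to_jsonl("doc-1", "a" * 450)
```

Each line is independently parseable, which is what makes JSONL friendlier than a data frame when records stream to a model one chunk at a time.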

Most importantly, realize that the best data engineers now are leveraging AI knowledge to work across the data spectrum, as one engineer acknowledges: “I’ve found that all of us have taken on the role of engineer at some point. All of us have taken on the role of data scientists. All of us have taken on the role of data analysts.”

So as you master the technical side of the job, understand that the future of data engineering job responsibilities is as dynamic as the data you will mine.

Thanks for ingesting,

-Zach Quinn

Medium | LinkedIn | Ebooks

Extract. Transform. Read.

Reaching 20k+ readers on Medium and 3k learners by email, I draw on my 4 years of experience as a Senior Data Engineer to demystify data science, cloud and programming concepts while sharing job hunt strategies so you can land and excel in data-driven roles. Subscribe for 500 words of actionable advice every Thursday.
