Hi fellow data professional! This week I got back into something I haven't attempted since my college intern days: meal prepping. Prep is a priority right now because I'm watching my son (and our pets) solo while my wife is away for work. And, I hate to say it, but I somewhat agree with Sam Altman's controversial quote about not understanding how people parented before widespread AI adoption; when used properly, AI-generated "parental assets" like meal plans, budgets and workout routines can buy back serious time that converts into precious father-son bonding opportunities.

If you're job searching while parenting, or even while working a demanding job, I'm sure you've thought about or implemented automations to carve out precious extra minutes. With energy costs in the headlines and AI subscription prices steadily increasing, working devs and learners are grappling with a choice I didn't have to make while learning data science. What is more valuable: (AI usage) tokens or your time?

We are entering an era of "Peak Token Volatility." With AI expected to drive an 8% spike in national electricity costs by 2027, the industry is moving away from unlimited "all-you-can-prompt" plans toward metered, high-efficiency system designs. I ask you this question not just out of intellectual curiosity but because I believe it may soon be integrated into interview processes, especially those dealing with system design.

Although AI wraps resource allocation in a shiny new layer, weighing tool cost against finite dev time isn't a new discussion. One of the pillars of data engineering (and software engineering in general) is the deployment of automated systems that enable the seamless delivery of data. Expect interview questions that probe your ability to free up time blocks by utilizing the appropriate AI stack within an org's budget constraints. Scenarios like—
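One way to make the "tokens vs. time" question concrete is a back-of-the-envelope break-even calculation. This is a minimal sketch with made-up numbers (the $20/month plan and $50/hour time value are illustrative assumptions, not real pricing):

```python
def breakeven_hours(monthly_cost: float, hourly_value: float) -> float:
    """Hours an AI tool must save per month to pay for itself."""
    return monthly_cost / hourly_value

# Assumed inputs: a $20/month subscription, time valued at $50/hour.
hours_needed = breakeven_hours(monthly_cost=20.0, hourly_value=50.0)
print(f"Break-even: {hours_needed:.1f} hours saved per month")  # 0.4
```

If the tool reliably saves more than that, the subscription wins; if usage is metered per token, the same math applies per task rather than per month.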
Notice none of these questions are "What prompt would you use for X?" or "Why this agent over that agent?" You may even see future take-home assignments with constraints like "Use an LLM or agentic workflow to complete this in 2 hours max." To this working engineer, there's some merit to concerns about constraints like these, but a manager wants you to demonstrate that you're a resource-minded dev.

Even while job searching you may hit a wall where you'd literally rather pay for an AI subscription to streamline applications than "grind out" responses to 5-10 per day. Maybe you're confident enough in your existing coding skills to crank out a Python-based automation with minimal generated code. Or you may be willing to hand all your work to Claude. Pay attention to your personal threshold, because resource consumption issues are only projected to compound, and soon the choice could be tougher. But even with rising costs, no ROI will be as satisfying and productive as "buying back" the most finite resource of all: time. So I ask you...
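The "minimal generated code" automation mentioned above could be as simple as templating the repetitive parts of an application. A hedged sketch, where the company, role, and wording are all hypothetical placeholders:

```python
from string import Template

# Reusable cover-letter skeleton; fill per role instead of rewriting
# each response by hand. All field values below are made-up examples.
COVER_TEMPLATE = Template(
    "Dear $company hiring team,\n"
    "I'm excited to apply for the $role position. "
    "My $years years of data engineering experience align with the role."
)

def draft_letter(company: str, role: str, years: int) -> str:
    """Substitute role-specific details into the shared template."""
    return COVER_TEMPLATE.substitute(company=company, role=role, years=years)

print(draft_letter("Acme Analytics", "Senior Data Engineer", 4))
```

Even a throwaway script like this shifts your effort from repetition to personalization, which is the same cost/time tradeoff at a smaller scale.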
Thanks for ingesting,
-Zach Quinn
Medium | LinkedIn | Ebooks
Reaching 20k+ readers on Medium and over 3k learners by email, I draw on my 4 years of experience as a Senior Data Engineer to demystify data science, cloud and programming concepts while sharing job hunt strategies so you can land and excel in data-driven roles. Subscribe for 500 words of actionable advice every Thursday.