Extract. Transform. Read.
A newsletter from Pipeline

Hi past, present or future data professional!

Winter in the western hemisphere is grim. Even in sunny Florida, where I write from, we've experienced weeks of gray skies and plunging temperatures. In the corporate world, winter (Q1) presents another grim reality: layoffs.

Unfortunately, no position, no matter how "critical to the organization," is layoff-proof. Even your CEO can be let go; hence the "golden parachute" many executives build into their contracts in the event they are unable to fulfill lofty annual goals. In addition to the conventional "clean out your desk" immediate termination, there are other procedures that many consider to be "soft" layoffs:
It's tempting to shrug and say, "What can you do?" To me, the answer is contingency. 40% of workers will experience a layoff at some point in their careers. Understand that tech roles, and high-paying data engineering roles in particular (especially those that don't directly generate revenue), can be targets in tough times. Once you accept that, you can create an escape plan that involves, broadly:
Most importantly, it's not a bad idea to conceptualize an "action plan" that can be triggered at the first hint of downsizing. I literally keep a folder on my personal computer called "in_case_of_layoff." For the contents of that folder and how to craft your own action plan, read "Break Glass in Case of Layoff."

While there is a lot of optimism surrounding the technical job market's bounce-back this year, there is also some turbulence ahead. Just like you would on an airplane, take a minute to review your emergency plan and hang in there until you reach your next cruising altitude.

Thanks for ingesting,

-Zach Quinn
Top data engineering writer on Medium & Senior Data Engineer in media; I use my skills as a former journalist to demystify data science/programming concepts so everyone from beginners to professionals can target, land and excel in data-driven roles.