**About the Job**
NielsenIQ is seeking a Senior Data Infrastructure Engineer to manage and optimize our Databricks and Airflow infrastructure. For Databricks, this role involves cluster lifecycle management, job scheduling, Unity Catalog administration, user permissions, and cost monitoring. For Airflow, you will manage clusters deployed on Kubernetes via Helm charts, handling upgrades and configuration management as well as integration with ArgoCD and Vault.
You will be responsible for writing and maintaining Terraform modules to provision and manage scalable, reliable, and secure cloud infrastructure. The role also includes developing and maintaining data pipelines and workflows using Python, PySpark, and dbt to support data processing and analytics. You will leverage GitHub Copilot with custom skills and agents to enhance team productivity, automate development tasks, and accelerate code reviews and documentation.
Key responsibilities include communicating with stakeholders to understand data needs and translate them into infrastructure and pipeline solutions; collaborating with engineering teams to ensure data quality; reviewing and approving infrastructure changes; and participating in an on-call rotation to resolve incidents. Maintaining up-to-date technical documentation and runbooks is also essential.
**Qualifications:**
* 5+ years of professional software engineering experience.
* 5+ years of experience with AWS and Kubernetes.
* 3+ years managing Databricks infrastructure (cluster management, job scheduling, user management, cost monitoring, performance optimization).
* 3+ years of Python development for data pipelines and workflows (PySpark, dbt).
* 2+ years managing Airflow infrastructure.
* 2+ years writing and maintaining Terraform modules.
* Proficiency with agent-based tools for software engineering tasks.
* Excellent written and oral communication skills.
* Ability to work collaboratively with cross-functional teams.
* Analytical thinking and attention to detail.
**Nice to Have:**
* Experience with ArgoCD.
* Experience with HashiCorp Vault.
* Experience with uv.
* Knowledge of data quality frameworks.
* Experience with Azure.