Overall Purpose:
As a Senior Data Engineer, you will be an integral member of the Data Engineering team, responsible for building, maintaining, and enhancing the enterprise data platform. This platform serves as the foundation for AI/Data Science solutions, digital applications, and a variety of advanced analytics use cases.
Key Responsibilities:
- Design, develop, and maintain scalable, high-performance data products and pipelines.
- Build and optimize robust data infrastructure to support diverse business needs.
- Partner with stakeholders to gather data requirements and translate them into effective technical solutions.
- Implement and manage data quality monitoring frameworks to ensure accuracy, integrity, and reliability.
- Ensure adherence to data governance best practices, including lineage, documentation, and security compliance.
- Develop and maintain ETL/ELT processes using Azure Synapse, ADF, Spark, Kafka, and similar technologies.
- Collaborate with Data Analysts and Data Architects to enable advanced analytics and machine learning initiatives.
- Evaluate emerging tools, frameworks, and practices to continuously improve the data engineering ecosystem.
Qualifications & Experience:
- Bachelor’s degree in Computer Science, Computer/Electrical Engineering, or a related technical field.
- 4–8 years of total IT experience with a strong focus on data engineering.
- 4+ years of hands-on experience with Azure services (e.g., IAM, Synapse, Data Lake, SQL Server, ADF).
- 2+ years of experience with containerization (Docker) and deploying on Kubernetes.
- Solid experience supporting development teams with Kubernetes best practices and performance tuning.
- 2+ years working with CI/CD tools such as Jenkins or GitHub Actions.
- Strong proficiency in Python, PySpark, and SQL.
- 4+ years of scripting and automation experience (e.g., Bash, Python, Go).
- 2+ years working with infrastructure-as-code tools (e.g., Terraform, Ansible, CloudFormation).
- Solid understanding of agile methodologies and the software development lifecycle.
- Demonstrated ability to design and implement enterprise-grade data solutions, pipelines, and workflows on Azure.
Technical Skills:
- Strong troubleshooting skills for data quality issues, pipeline failures, and Azure performance problems.
- Deep knowledge of data warehousing concepts and Azure-based big data services.
- Proficient in designing scalable data architectures and data integration patterns.
- Strong understanding of DevOps principles and cloud infrastructure automation.
Language & Tools:
- Fluent in English (both verbal and written).
- Comfortable using modern development and data engineering tools.
Key Competencies:
- Communication: Strong interpersonal and communication skills, with the ability to convey complex technical concepts clearly.
- Analytical Thinking: Ability to assess problems and deliver practical, long-term solutions.
- Problem Solving: Demonstrated ability to tackle complex issues with solutions that remain sustainable and maintainable over time.