Job Description
A minimum of 4 years of professional experience in data engineering or backend development, with a strong focus on Python and data processing
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Science, or a related field
- Strong understanding of data structures, algorithms, and software engineering principles
- Certification in AWS/GCP or Data Engineering is a plus
Communication
- Strong verbal and written communication skills
- Ability to translate business requirements into technical solutions
- Collaborative mindset to work across multiple teams
Skills
- Python, Django, Flask, SQL, NoSQL, Big Data, AWS, Spark, data pipelines, ETL, Docker, Airflow
Roles & Responsibilities
- Develop, maintain, and optimize scalable and reliable data pipelines
- Collaborate with cross-functional teams to define data needs and infrastructure
- Integrate data from various sources and transform it for business intelligence and analytics
- Monitor and troubleshoot performance issues in ETL processes
- Implement data quality checks, logging, and alerts
- Design and build APIs or microservices for internal data use
- Maintain data security and compliance with industry standards
- Participate in code reviews, testing, and documentation