Hiring - Python Developer with Airflow Experience - Dallas, TX

Posted: Wed, Oct 9, 2024

Job Title: Python Developer with Airflow Experience

Location: Dallas, TX

Job Type: W2 Contract 

Client: Capgemini / Goldman

 

Job Description:

We are seeking an experienced Python Developer with strong Apache Airflow skills to join our team in Dallas, TX. The ideal candidate has a deep understanding of Python, data pipelines, and workflow orchestration. You will work closely with data engineers, data scientists, and other cross-functional teams to build and optimize scalable data workflows.

 

Responsibilities:

Design, implement, and maintain scalable data pipelines using Python and Airflow (a minimal sketch of this kind of pipeline follows this list).

Develop, test, and deploy automation scripts for scheduling and monitoring data workflows.

Optimize existing data workflows for performance and scalability.

Collaborate with data engineers and analysts to integrate various data sources and APIs.

Monitor, troubleshoot, and enhance Airflow jobs to ensure the smooth execution of data tasks.

Write clean, efficient, and well-documented code following best practices in software development.

Ensure code quality and testing through unit tests, code reviews, and debugging.

Support data-driven decision-making by improving the reliability and efficiency of data pipelines.

Work in an Agile environment and participate in regular sprints.
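
To illustrate the pipeline work described above, here is a minimal sketch of an extract-transform-load DAG using the Airflow 2.x TaskFlow API. The DAG id, schedule, and task bodies are illustrative assumptions, not specifics of this engagement:

    # Minimal ETL sketch using the Airflow 2.x TaskFlow API.
    # DAG id, schedule, and task logic are hypothetical placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(
        dag_id="example_daily_pipeline",  # hypothetical DAG name
        schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def example_daily_pipeline():
        @task
        def extract():
            # Stand-in for pulling records from an upstream API or database.
            return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

        @task
        def transform(records):
            # Stand-in for cleaning and enriching the extracted records.
            return [{**r, "value": r["value"] * 2} for r in records]

        @task
        def load(records):
            # Stand-in for writing results to a warehouse table.
            print(f"loading {len(records)} records")

        load(transform(extract()))


    example_daily_pipeline()

Day to day, the role involves DAGs like this at production scale: wiring real sources and sinks into the extract and load steps, tuning schedules and retries, and monitoring runs in the Airflow UI.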

 

Qualifications:

6+ years of Python development experience in a production environment.

3+ years of experience with Apache Airflow, including building and maintaining DAGs.

Strong understanding of ETL processes and experience with data engineering workflows.

Experience with SQL and working with relational databases (e.g., PostgreSQL, MySQL).

Knowledge of cloud services like AWS, Azure, or GCP for data pipeline orchestration.

Familiarity with Docker, Kubernetes, or other containerization tools is a plus.

Excellent problem-solving skills and the ability to work independently or in a team.

Familiarity with CI/CD pipelines and version control (Git, Jenkins, etc.); a sketch of the kind of DAG test run in CI follows this list.

Experience with task automation and scheduling tools beyond Airflow is a bonus.
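
As an example of the testing and CI/CD expectations above, here is a minimal sketch of a DAG integrity test, assuming pytest and the DAG id from the earlier sketch; both are illustrative assumptions:

    # DAG integrity tests of the kind typically run in a CI pipeline.
    # Assumes pytest; "example_daily_pipeline" refers to the sketch above.
    from airflow.models import DagBag


    def test_dags_import_cleanly():
        # DagBag parses every DAG file in the project, so import errors
        # surface in CI rather than in the scheduler at runtime.
        dag_bag = DagBag(include_examples=False)
        assert not dag_bag.import_errors, dag_bag.import_errors


    def test_example_pipeline_shape():
        # Guards against accidental task additions or removals.
        dag = DagBag(include_examples=False).get_dag("example_daily_pipeline")
        assert dag is not None
        assert len(dag.tasks) == 3

Tests like these typically run on every commit so that a broken DAG never reaches the production scheduler.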

 

Nice-to-Have:

Experience with big data technologies like Spark or Hadoop.

Familiarity with APIs and web scraping techniques.

Knowledge of data security and compliance best practices.

Prior experience in financial services, healthcare, or e-commerce domains.

Posted by:
Rahul Yadav
Email: rahul@tigerbells.com
