
Urgent Job Opening For DevOps | Arch Support | ETL | Data Pipelines Developers


Job Description:

We are seeking a talented and ambitious DevOps and Architecture Support Engineer with 2-5 years of experience, specializing in ETL (Extract, Transform, Load) and data pipelines. In this role, you will help optimize our data workflows, support ETL processes, and collaborate with cross-functional teams to enhance the overall system architecture.

Responsibilities:
DevOps and Infrastructure:
  • Assist in the implementation and maintenance of CI/CD pipelines for ETL processes.
  • Support development teams in automating and streamlining deployment processes.
  • Participate in the management and optimization of cloud infrastructure.
ETL Development and Maintenance:
  • Contribute to the design, development, and maintenance of ETL processes for data extraction, transformation, and loading.
  • Collaborate on troubleshooting and optimizing existing ETL processes for performance and efficiency.
Data Pipeline Management:
  • Support the building and management of data pipelines for efficient and reliable data flow between systems.
  • Collaborate with data engineers on the implementation of scalable and maintainable data architectures.
Monitoring and Support:
  • Assist in the implementation of monitoring solutions to ensure the health and performance of data pipelines and ETL processes.
  • Provide support for production issues, conduct root cause analysis, and contribute to corrective actions.
Collaboration:
  • Work closely with data scientists, analysts, and other stakeholders to understand data requirements and contribute to data availability and quality.
  • Collaborate with cross-functional teams to integrate data solutions into the overall system architecture.
Documentation:
  • Contribute to the maintenance of comprehensive documentation for ETL processes, data pipelines, and infrastructure configurations.
Security:
  • Assist in the implementation and enforcement of security best practices for data handling and storage.
Qualifications:
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 2-5 years of hands-on experience in DevOps roles, including exposure to CI/CD implementation and cloud infrastructure management.
  • Familiarity with designing, developing, and maintaining ETL processes.
  • Exposure to data pipeline orchestration tools (e.g., Apache NiFi, Apache Airflow).
  • Proficiency in scripting languages (e.g., Python, Bash) for automation tasks.
  • Basic understanding of database systems (SQL and NoSQL) and data warehousing concepts.
  • Familiarity with version control systems (e.g., Git) and infrastructure as code (IaC) principles.
  • Strong troubleshooting and problem-solving skills.
  • Good communication skills and the ability to work collaboratively in a team.
  • Awareness of security protocols and best practices in data handling.
Job Experience:
  • Experience: 2-5 years
  • Work Location: Faridabad (In-house and Remote)
Salary:
  • Negotiable; open for the best candidates.
Education:
  • UG: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • PG: Any Postgraduate
How to Apply:

Please send your resume and a cover letter highlighting your relevant experience to hr@connectinfosoft.com

Tags: Urgent Job Opening For DevOps Developers, Job Opening For Arch Support Developers, Job Opening For ETL Developers, Job Opening For Data Pipelines Developers, Connect Infosoft, Job Opening for DevOps Consultant, Immediate Joiner, Developer Faridabad jobs, Remote Jobs, Developer Jobs Delhi NCR