Careers > ETL Developer

ABOUT US

We are at the forefront of innovation, developing cutting-edge AI-based decision intelligence products designed to transform data into actionable insights. We are driven by a passion for technology and a commitment to delivering solutions that empower organizations to navigate complex decision-making environments with precision and foresight.


JOB DESCRIPTION

Job Title: ETL Developer

Location: Trivandrum, Kerala

Type: On-site, full-time

Pay: Competitive salary based on experience

We are seeking a skilled ETL Developer with a strong background in database management to join our team. This role will work closely with the Database Architect to build, optimize, and maintain ETL pipelines and data systems that support our machine learning and AI initiatives. The ideal candidate has hands-on experience in developing robust ETL processes and a keen understanding of database operations, enabling seamless data flow across our data infrastructure.

JOB RESPONSIBILITIES:

  • ETL Pipeline Development: Design, implement, and maintain ETL workflows to efficiently ingest, transform, and load data from various sources into data warehouses, lakes, and other storage solutions.

  • Data Transformation: Ensure raw data is transformed into structured formats ready for machine learning processes, including feature engineering, time-series data preparation, and data cleansing.

  • Database Support & Optimization: Work with relational and NoSQL databases to ensure data consistency, optimize performance, and support complex querying and reporting needs.

  • Collaboration with Data Architect: Partner closely with the Database Architect to ensure ETL processes align with the data architecture and meet performance standards for ML/AI applications.

  • Data Quality & Validation: Implement data validation, quality checks, and monitoring mechanisms within ETL pipelines to ensure data accuracy and integrity across all processes.

  • Documentation: Create and maintain documentation for ETL workflows, database structures, and data transformation processes to support team collaboration and long-term maintenance.

  • Data Monitoring & Troubleshooting: Monitor ETL jobs and data pipelines, quickly diagnosing and resolving issues to maintain efficient data flow and minimize downtime.

  • Data Integration & Governance: Ensure data integration processes comply with company standards for data governance, security, and privacy.
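As an illustration of the data-validation work described above, here is a minimal Python sketch of a row-level quality check inside an ETL step. All record and field names are hypothetical examples, not a specification of the role or our stack:

```python
# Hypothetical sketch: split incoming records into valid rows and rejects,
# so downstream loads only receive complete data and bad rows are logged.

def validate_rows(rows, required_fields=("id", "timestamp", "value")):
    """Return (valid, rejects); each reject records why it was dropped."""
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append({"row": row, "reason": f"missing: {missing}"})
        else:
            valid.append(row)
    return valid, rejects

# Example: one complete record, one missing a timestamp.
good = {"id": 1, "timestamp": "2024-01-01T00:00:00Z", "value": 42}
bad = {"id": 2, "timestamp": None, "value": 7}
valid, rejects = validate_rows([good, bad])
print(len(valid), len(rejects))  # 1 1
```

In a production pipeline the same pattern would typically run inside an orchestrated task (e.g., an Airflow operator), with rejects routed to a quarantine table for review.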


    QUALIFICATIONS

    • Experience: Minimum of 3 years of experience as an ETL Developer, with hands-on experience in data engineering or database development in data-intensive environments.

    • ETL Tools Expertise: Proficiency in ETL tools such as Apache NiFi, Talend, Informatica, or Airflow, with experience building and optimizing complex ETL workflows.

    • Database Knowledge: Strong experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra) and knowledge of SQL scripting and database optimization.

    • Big Data & Data Lakes: Familiarity with big data tools and architectures (e.g., Hadoop, Spark) and cloud-based data lakes and warehouses (e.g., AWS S3, Azure Data Lake Storage, Google BigQuery).

    • Programming Skills: Proficiency in a relevant programming language (e.g., Python, SQL) for data manipulation and transformation within ETL processes.

    • Data Quality Assurance: Experience in implementing data validation and quality checks to ensure data accuracy and consistency.

    • Cloud Experience: Familiarity with cloud platforms (AWS, GCP, Azure) and managing ETL workflows in cloud environments is preferred.

    • Problem-Solving Skills: Strong analytical skills, with a proactive approach to troubleshooting data issues and optimizing ETL performance.


    ADDITIONAL QUALIFICATIONS

    • Certifications: Relevant certifications in ETL development, data engineering, or cloud data services (e.g., AWS Certified Big Data - Specialty, Google Professional Data Engineer).

    • Experience with Graph Databases: Exposure to graph query languages or graph databases (e.g., Neo4j, Amazon Neptune) is a plus, as this role may work with highly connected data structures.

    • Understanding of Machine Learning Pipelines: Familiarity with the data processing requirements of ML pipelines, including experience with feature stores or real-time data streaming, is a bonus.

      Fill out this form to apply:
      APPLY HERE


Are you ready to transform tomorrow?



© 2024 Numenor. All rights reserved.
