Data Integration Engineering Technical Lead Jobs

IKIGAI ENABLERS

Job Brief

We have a vacancy for a Data Integration Engineering Technical Lead at our company, IKIGAI ENABLERS. The position is based in Singapore. Please review the job details below.

Position Title: Data Integration Engineering Technical Lead
Company: IKIGAI ENABLERS
Work Type: Full Time
City of work: Singapore
Salary: Not disclosed
URL Expiry: 2022-08-23
Posted on: https://sg.jobsoffices.com

Job Detail

Job Description:

  • Understand business requirements, conduct analysis, and take responsibility for designing, building, and maintaining data engineering solutions.
  • Act as Technical Lead for extraction, ingestion, and data-processing development, test support, and deployment.
  • Own the overall data integration technical design, code validation, build, test support, and deployment planning and execution.
  • Design and build reliable data integration, ingestion, and extraction pipelines and processes to extract and load both structured and unstructured data from existing and new source systems and external pipelines. Design and develop scripts for batch scheduling, monitoring, and automation. Design and build a source-versus-target reconciliation framework.
  • Ensure the completeness and timely delivery of technical deliverables.
  • Develop detailed design documents from functional requirements, adhering to the established development and naming standards.
  • Ensure handover and training for BAU (business-as-usual) staff.
  • Estimate, plan, organize, and assign tasks to team members.
  • 3+ years’ experience working as a Technical Lead on data integration, data ingestion, or data engineering projects.
  • Strong experience with relational SQL databases such as Oracle, Postgres, SQL Server, and Snowflake; NoSQL database experience is desirable.
  • Experience designing and building data integration pipelines (using a variety of ETL tools and database programming) and with workflow management tools such as Airflow (see the sketch after this list). Experience implementing security measures such as access control, and implementing data quality (DQ) checks in pipeline processes.
  • Experience with cloud services such as compute, storage, cloud data warehouses (e.g., Snowflake, Redshift), relational data services, big data processing, and data sharing.
  • Data engineering experience with languages such as Python (including PySpark), Java, Scala, and Unix shell scripting.
  • Experience working in a DevOps setup and familiarity with DevOps platforms and tools.
  • Excellent communication skills and the ability to engage different types of stakeholders.
  • Experience with projects executed under an Agile development methodology.
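
For illustration only (not part of the original posting): below is a minimal sketch of the kind of pipeline this role covers, an extract-and-load task followed by a source-versus-target reconciliation check, orchestrated with Airflow. It assumes Airflow 2.x, and all identifiers (the DAG ID, task IDs, helper functions, and placeholder counts) are hypothetical.

    # Minimal Airflow 2.x sketch: load data, then reconcile source vs. target.
    # Helper bodies are placeholders; real tasks would query the source system
    # and the target warehouse.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_and_load():
        # Placeholder: extract rows from the source and load them into the
        # target warehouse (e.g., JDBC reads followed by a COPY/INSERT load).
        pass


    def reconcile_counts():
        # Placeholder counts; in practice these would come from
        # SELECT COUNT(*) queries against the source and target tables.
        source_count = 100
        target_count = 100
        # Fail the task on mismatch so the run is flagged for investigation.
        if source_count != target_count:
            raise ValueError(
                f"Reconciliation failed: source={source_count}, "
                f"target={target_count}"
            )


    with DAG(
        dag_id="daily_ingestion_with_reconciliation",
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load = PythonOperator(
            task_id="extract_and_load", python_callable=extract_and_load
        )
        reconcile = PythonOperator(
            task_id="reconcile_counts", python_callable=reconcile_counts
        )

        # Reconciliation runs only after the load completes.
        load >> reconcile

In practice, a reconciliation framework like the one described above would compare per-batch row counts, checksums, or aggregates rather than fixed numbers.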

Tech Skills:

  • AWS S3, AWS Glue, RDS, Athena, Kinesis, EMR, Spark, Python, PySpark, ANSI SQL, workflow orchestration (preferably Airflow; experience with an additional orchestration tool or scheduler is desirable), DevOps tools, UNIX, PowerShell.
  • RDBMS experience; cloud data warehouse experience (preferably Snowflake).
Desirable:

  • Experience with Azure.
  • Experience with a new-generation pipeline tool (e.g., SnapLogic).
  • Experience with Hadoop-related tools and technologies such as Hive, HBase, Sqoop, Impala, Kafka, and Kerberos.

Required Skills:

Business