Data Engineer Jobs


Job Brief

We have a vacancy for a Data Engineer at our company, Prudential. This position is based in Singapore. Please review the job details below.

Position Title: Data Engineer
Company: Prudential
Work Type: Part Time
City of work: Singapore
Salary: Not available
URL Expiry: 2022-08-15
Posted on:

Job Detail

Prudential’s purpose is to help people get the most out of life. We will deliver our purpose by creating a culture in which diversity is celebrated and inclusion assured, for our colleagues, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and in exchange, we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow and Succeed.

This role will play a critical part in supporting the delivery of Prudential Assurance Company Singapore's strategic objectives captured as part of the Analytics Roadmap. You will collaborate closely with the Data and Analytics Centre of Excellence (CoE), technology and business teams. The role is responsible for driving Prudential Singapore’s business growth by building world-class systems, processes and engineering standards that enable advanced analytics and machine learning at scale. You will build data solutions, processes, and standards by developing use cases defined in the roadmap.

  • Build pipelines to ingest a wide variety of data from multiple sources within the organization and from external sources (e.g., government, vendors, and social media).
  • Optimize existing pipelines.
  • Test data structures to ensure that they are fit for use by data analysts and scientists.
  • Prepare and maintain environments for secure prototyping, development, testing and data manipulation for data scientists.
  • Design and implement effective data storage solutions and models.
  • Support the deployment of data models and AI/ML solutions.
  • Assess database implementation procedures to ensure they comply with internal and external regulations.
  • Prepare accurate database design and architecture reports for management and executive teams.
  • Oversee the migration of data from legacy systems to new solutions on Cloud infrastructure.
  • Monitor the system performance by performing regular tests, troubleshooting, and integrating new features.
  • Automate low-value tasks.
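To make the ingestion responsibilities above concrete, here is a minimal illustrative sketch of a batch ingestion step in Python. It is not Prudential's actual stack or data; the table name, columns, and sample records are hypothetical, and SQLite stands in for whatever storage layer the real pipelines would target.

```python
import csv
import io
import sqlite3

# Hypothetical sample extract; in practice this would arrive from a
# vendor feed, government source, or internal system.
RAW_CSV = """policy_id,holder,premium
P001,Tan Wei,1200.50
P002,Lim Mei,980.00
"""

def ingest(raw: str, conn: sqlite3.Connection) -> int:
    """Parse a CSV extract and load its rows into a staging table.

    Returns the number of rows ingested.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_policies "
        "(policy_id TEXT PRIMARY KEY, holder TEXT, premium REAL)"
    )
    rows = [
        (r["policy_id"], r["holder"], float(r["premium"]))
        for r in csv.DictReader(io.StringIO(raw))
    ]
    # Idempotent load: re-running the batch replaces rather than duplicates.
    conn.executemany(
        "INSERT OR REPLACE INTO staging_policies VALUES (?, ?, ?)", rows
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
count = ingest(RAW_CSV, conn)
print(count)  # → 2
```

The `INSERT OR REPLACE` keyed on `policy_id` is one common way to keep a batch load idempotent, so a rerun after a failure does not create duplicates.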
Who we are looking for
Technical skills and work experience
  • Experience with at least one major Cloud Infrastructure provider (Azure/AWS/GCP)
  • Experience building data pipelines using batch processing with Apache Spark (Spark SQL, DataSet / Dataframe API) or Hive query language (HQL)
  • Knowledge of Big Data ETL processing tools
  • Experience in Data Modelling, Data mapping for Data Warehouse and Data Marts solutions
  • Experience with Hive and Hadoop file formats (Avro / Parquet / ORC)
  • Basic knowledge of scripting (shell / bash)
  • Experience of working with multiple data sources including relational databases (SQL Server / Oracle / DB2 / Netezza), NoSQL / document databases, flat files
  • Understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo, and Azure DevOps.
  • DevOps practices using Git version control
  • An interest in staying up to date with industry standards and technological advancements that will improve the quality of your outputs.
  • Ability to debug, fine tune and optimize large scale data processing jobs
  • Highly capable in
    • Python or Scala
    • SQL
    • Databricks
  • Knowledge in
    • Azure Data Factory
    • Azure DevOps
    • BitBucket or GitHub
    • Machine learning
    • MLFlow
    • Machine learning (ML) frameworks (e.g., scikit-learn, TFX, PyTorch).
  • Knowledge of life insurance industry preferred.
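As a small illustration of the "basic scripting" and task-automation items above, here is a hedged bash sketch. The directory names and file convention are hypothetical; it shows the kind of low-value check — quarantining empty daily extracts before a pipeline loads them — that such scripting typically automates.

```shell
#!/usr/bin/env bash
# Sketch: move zero-row CSV extracts aside so downstream jobs
# never load empty files. Paths and names are illustrative only.
set -euo pipefail

mkdir -p extracts quarantine
printf 'id,value\n1,a\n' > extracts/good.csv   # header + one data row
printf 'id,value\n'      > extracts/empty.csv  # header only, no data

for f in extracts/*.csv; do
  # A file containing only the header line has no data rows.
  if [ "$(wc -l < "$f")" -le 1 ]; then
    mv "$f" quarantine/
    echo "quarantined: $f"
  fi
done
```

`set -euo pipefail` makes the script stop on the first error or unset variable, which is a common safeguard for unattended automation.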
Competencies & Personal Traits
  • Flexibility, creativity, and the capacity to receive and utilize constructive feedback.
  • Curiosity and outstanding interpersonal skills.
  • Ability to work well as part of a team.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Master’s degree or equivalent work experience in Computer Science or related discipline.
  • Certificates in Data Engineering from reputable MOOCs or cloud vendors are welcome.
Language: Fluent written and spoken English

Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with special requirements.