Role Overview
The Data Engineer will be responsible for designing, developing, and maintaining end-to-end data pipelines that support data flow and integration from many sources into our target data platform in an accurate and timely manner. They will collaborate closely with other data team members (Data Architect, Data Modeler, etc.) to create highly effective and efficient ETL processes and deliver data outputs that support business objectives. The candidate must be familiar with data warehousing concepts and must have delivered at least one end-to-end data processing pipeline.

Roles & Responsibilities
- Collaborate and coordinate with multiple stakeholders.
- Develop pipelines to ingest data from different sources (databases, delimited files, etc.), perform complex transformations, and load data into the data platform (on-premise and cloud).
- Develop efficient and effective queries, and perform tuning of existing code or queries.
- Coordinate or participate in all aspects of the development cycle, from design and development to release planning and implementation of data systems.
- Apply advanced SQL skills for data analysis.
- Handle end-to-end ETL or ELT pipelines independently.
- Ensure work is delivered on time and with quality.
Requirements
- 4+ years of extensive practical experience developing data warehouse systems or ETL processes.
- Extensive experience with on-premise data warehouses using SQL Server, Oracle, and SSIS/Informatica or similar technologies.
- Experience with cloud data technologies; AWS in particular is preferred.
- Strong understanding of ETL and data warehouse concepts, processes, and best practices.
- Good understanding of big data and cloud technologies.
- A combination of solid business knowledge and technical expertise, with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal and written communication skills, with the ability to work independently as well as in a team environment and to provide structure in ambiguous situations.

Good to have:
- Understanding of the insurance domain.
- Basic understanding of data visualization tools such as SAS VA and Tableau.
- Good understanding of Master Data Management, Data Quality, and Data Governance.
- Good understanding of implementing and architecting data solutions using AWS components.

📍 Location: Onsite – Kuningan
🏢 Placement: At one of our clients, an insurance company
📅 Contract Duration: 4 months
⏱️ Work Hours: 09:00 – 18:00 (WIB)
Start Date: ASAP