Design and implement robust and secure data pipelines into
a Snowflake data warehouse from on-premises and cloud data sources.
Work with business analysts and users to translate functional
specifications into technical requirements and designs.
Support maintenance, bug fixing and performance analysis
along the data pipelines.
Contribute to knowledge building and sharing by researching
best practices, documenting solutions and continuously iterating on new ways to
solve problems.
Requirements:
Bachelor's degree in Computer Science, Engineering,
Mathematics or Statistics from a recognized university or institute.
2+ years of experience in the data engineering field.
Experience in Data Warehousing / Data Lake and BI landscapes.
Experience with tools and technologies including the following:
- Amazon Redshift.
- AWS Glue.
- PySpark.
- Microsoft SQL Server.
- Power BI.
Experience with logistics systems would be an advantage.