Snowflake Data Engineer
Data Engineer with strong knowledge of the Snowflake cloud platform and its architecture.
Experience with various data ingestion methods (Snowpipe and others), time travel, data sharing, and other Snowflake capabilities.
Strong knowledge of data warehousing, data infrastructure, data platforms, ETL implementation, and data modelling and design.
Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success.
Excellent interpersonal skills.
Eager to contribute in a team-oriented environment.
Strong prioritization and multi-tasking skills with a track record of meeting deadlines.
Ability to be creative and analytical in a problem-solving environment.
Effective verbal and written communication skills.
Adaptable to new environments, people, technologies, and processes
Ability to manage ambiguity and solve undefined problems
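The Snowflake capabilities named above (Snowpipe ingestion, time travel, data sharing) can be sketched in SQL; all object names here are illustrative, not part of any real deployment:

```sql
-- Snowpipe: auto-ingest new files landing in an external stage
CREATE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders
  FROM @raw.orders_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Time travel: query the table as it looked one hour ago
SELECT * FROM raw.orders AT (OFFSET => -3600);

-- Data sharing: expose a table to a consumer account via a share
CREATE SHARE orders_share;
GRANT USAGE ON DATABASE raw_db TO SHARE orders_share;
GRANT SELECT ON TABLE raw.orders TO SHARE orders_share;
```

A production setup would also configure cloud-storage event notifications for the pipe and grant schema-level usage on the share; the snippet only shows the shape of each feature.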
Build and maintain robust and scalable ETL/ELT pipelines to integrate data from multiple systems across the enterprise
Maintain and evolve our data catalog to enable ML/AI use-cases
Build complex data models to support business analysis using SQL/PL-SQL queries on SAP HANA and Snowflake
Collaborate across the company’s multiple data teams to meet analytics/ML/AI deliverables
Maintain databases and develop/support monthly/quarterly metrics reports
Improve performance, efficiency, and accuracy of all queries
Train analysts and data scientists on available data sources
Provide business partners with actionable insights into KPIs in the form of regular reporting and ad-hoc analysis
Identify and drive process and reporting enhancements, develop new tools and collaborate across teams to help scale the process with ServiceNow’s growth
Good knowledge of SQL, with the ability to write complex queries
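As a sketch of the kind of "complex SQL" the responsibilities above call for (monthly/quarterly metrics reporting), here is an illustrative query using a CTE and window functions; the table and column names are hypothetical:

```sql
-- Monthly revenue per region, with month-over-month change and ranking
WITH monthly AS (
  SELECT DATE_TRUNC('month', order_date) AS month,
         region,
         SUM(amount) AS revenue
  FROM analytics.orders
  GROUP BY 1, 2
)
SELECT month,
       region,
       revenue,
       revenue - LAG(revenue) OVER (PARTITION BY region ORDER BY month) AS mom_change,
       RANK() OVER (PARTITION BY month ORDER BY revenue DESC)           AS region_rank
FROM monthly
ORDER BY month, region_rank;
```

The CTE keeps the aggregation readable, and the window functions avoid self-joins, which also tends to help the query-performance goals listed above.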
To be successful in this role you have:
5+ years of documented experience writing advanced SQL and PL/SQL in data warehouse technologies (Snowflake, SAP HANA, or any modern database)
Strong understanding of Data Warehousing concepts for big data applications and services, preferably using Snowflake
3+ years of proven experience in building large scale ETL/ELT pipelines using your own code (Python, Java, Scala, Spark) and/or dedicated tools (Matillion, Perspectium, Azure Data Factory, etc.)
Experience working with ML/AI teams is a big plus
Knowledge of ServiceNow platform and its data model is a big plus
Expertise in database design and development, writing optimized queries, and handling fact and dimension data effectively
Effective problem-solving and analytical skills; ability to manage multiple projects and reports simultaneously across different stakeholders
Structured thinking with ability to easily break down ambiguous problems and propose impactful data modeling designs
Passion for analyzing large and complex data sets and converting them into information that drives business decisions
Attention to detail, organization, and effective verbal/written communication skills
Must be able to work in a fast-paced environment and adapt to changing requirements
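The fact and dimension handling mentioned in the requirements can be sketched as a minimal star schema in Snowflake DDL; all names and types are illustrative assumptions:

```sql
-- Dimension table: one row per customer, with a surrogate key
CREATE TABLE dim_customer (
  customer_key INT IDENTITY PRIMARY KEY,
  customer_id  VARCHAR NOT NULL,  -- natural key from the source system
  name         VARCHAR,
  region       VARCHAR
);

-- Fact table: one row per order, referencing the dimension
CREATE TABLE fact_orders (
  order_key    INT IDENTITY PRIMARY KEY,
  customer_key INT REFERENCES dim_customer (customer_key),
  order_date   DATE,
  amount       NUMBER(12, 2)
);
```

Surrogate keys on the dimension decouple reporting from source-system identifiers; note that Snowflake records but does not enforce referential constraints, so the ETL/ELT pipeline remains responsible for integrity.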