Data Warehouse / Data Engineer (Remote)

Job Description

SUMMARY OF ROLE & RESPONSIBILITIES:

The role will develop the catalog of publishable data (consisting of both currently published and unpublished datasets), create a plan and schedule for publishing this data on the State's open data portal, and coordinate the initial and ongoing publishing of that data per the plan. In carrying out this work, the Data Engineer recommends changes to data pipelines and data storage to facilitate efficient publishing of the data.
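To make the cataloging and scheduling responsibility concrete, below is a minimal sketch of how a publishing catalog entry and a "what is due" check might look in Python. The CatalogEntry class and its fields (dataset_id, publish_frequency_days, last_published) are illustrative assumptions, not part of the posting.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class CatalogEntry:
    """One row in the open data publishing catalog (illustrative fields)."""
    dataset_id: str
    name: str
    source_system: str
    published: bool                      # already on the open data portal?
    publish_frequency_days: int          # how often the dataset should refresh
    last_published: Optional[date] = None

def due_for_publishing(catalog: list[CatalogEntry], today: date) -> list[CatalogEntry]:
    """Return entries that were never published or are past their refresh window."""
    due = []
    for entry in catalog:
        if entry.last_published is None:
            due.append(entry)            # initial publication still pending
        elif today - entry.last_published >= timedelta(days=entry.publish_frequency_days):
            due.append(entry)            # refresh window has elapsed
    return due

if __name__ == "__main__":
    catalog = [
        CatalogEntry("permits-2024", "Building Permits", "permits_db", False, 30),
        CatalogEntry("crashes", "Traffic Crashes", "dot_warehouse", True, 7, date(2024, 1, 1)),
    ]
    for entry in due_for_publishing(catalog, date.today()):
        print(f"{entry.dataset_id}: schedule for publishing")
```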

REQUIRED EXPERIENCE:

  • Develop data pipelines to move data between source and target systems. These pipelines can be ETL/ELT jobs that build data warehouse and reporting solutions (see the sketch after this list).
  • Responsible for preparing the corporate Open Data catalog.
  • Coordinate the initial preparation and ongoing processing and uploading of an estimated two hundred datasets.
  • Interface with Data Analysts to extract, transform, and load (ETL) data from a wide variety of data sources to ensure the timely and seamless delivery of data.
  • Triage, troubleshoot, and help remediate issues.
  • Contribute to documentation of processes and projects.
  • Ensure data consistency between production and analytical databases.
  • Develop scripts and queries to import, manipulate, clean, and transform data from different sources.
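As referenced above, here is a minimal sketch of one such pipeline, assuming a pandas/SQLAlchemy stack; the connection strings, table names, and columns are hypothetical, and in practice these pipelines would more likely run in SSIS, Azure Data Factory, or Databricks as listed in the requirements.

```python
# Minimal ETL sketch: extract from a source system, clean, load to the
# analytical database, and export a CSV ready for the open data portal.
# Connection strings, table names, and column names are illustrative.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "mssql+pyodbc://user:pass@source-server/prod_db?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_URL = "mssql+pyodbc://user:pass@dw-server/analytics_db?driver=ODBC+Driver+17+for+SQL+Server"

def run_pipeline() -> None:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: pull the rows to be published.
    df = pd.read_sql("SELECT permit_id, issue_date, permit_type, valuation FROM dbo.permits", source)

    # Transform: basic cleaning before the data leaves the warehouse.
    df = df.dropna(subset=["permit_id"])
    df["permit_type"] = df["permit_type"].str.strip().str.title()
    df["issue_date"] = pd.to_datetime(df["issue_date"]).dt.date

    # Load: refresh the reporting table in the analytical database.
    df.to_sql("open_data_permits", target, schema="dbo", if_exists="replace", index=False)

    # Publish: write the flat file that gets uploaded to the open data portal.
    df.to_csv("permits_open_data.csv", index=False)

if __name__ == "__main__":
    run_pipeline()
```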

Requirements:

  • A deep understanding of relational database technologies, dimensional modeling, Operational Data Stores, Enterprise Data Warehouses, and Data Marts.
  • Significant experience with data modeling (conceptual, logical, and physical model design), architecture, and performance optimization.
  • Expert-level understanding of ETL fundamentals and building efficient data pipelines using Extract, Transform, Load (ETL) tools such as SSIS (a simple incremental-load pattern is sketched after this list).
  • Experience with cloud data movement and compute technologies including Azure Data Factory, Logic Apps, and Databricks, and with storage technologies including Azure SQL Managed Instance, Azure Data Lake, and Azure Synapse Analytics.
  • Experience with major database platforms including Oracle and SQL Server as well as cloud databases. NoSQL experience is a plus.
  • Advanced data processing and programming skills using SQL-based technologies as well as other languages such as Python.
  • Experience with modern DevOps practices (including Git and CI/CD).
  • Experience designing, enhancing, and/or modifying data warehouses.
  • Functional knowledge of Microsoft Power BI.
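The "efficient data pipelines" and production/analytics consistency points above usually come down to incremental (delta) loads rather than full reloads. Below is a rough sketch of the watermark pattern using pyodbc; the connection strings, tables, and watermark table are hypothetical, and in practice this logic would typically live in SSIS or an Azure Data Factory pipeline.

```python
# Watermark-based incremental load sketch: copy only rows added since the
# last run from the production table to the analytical table.
# Connection strings, tables, and columns are illustrative assumptions.
import pyodbc

PROD = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=prod-server;DATABASE=prod_db;Trusted_Connection=yes"
DW = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw-server;DATABASE=analytics_db;Trusted_Connection=yes"

def incremental_load() -> None:
    with pyodbc.connect(PROD) as src, pyodbc.connect(DW) as tgt:
        tcur = tgt.cursor()

        # 1. Read the high-water mark recorded by the previous run.
        tcur.execute("SELECT last_created_at FROM etl.watermark WHERE table_name = 'permits'")
        watermark = tcur.fetchone()[0]

        # 2. Extract only rows created since that point.
        scur = src.cursor()
        scur.execute(
            "SELECT permit_id, permit_type, valuation, created_at "
            "FROM dbo.permits WHERE created_at > ?",
            watermark,
        )
        rows = [tuple(r) for r in scur.fetchall()]
        if not rows:
            return  # nothing new; production and analytics are already consistent

        # 3. Load the delta and advance the watermark in one transaction.
        tcur.executemany(
            "INSERT INTO dbo.permits_analytics (permit_id, permit_type, valuation, created_at) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )
        new_watermark = max(r[3] for r in rows)
        tcur.execute(
            "UPDATE etl.watermark SET last_created_at = ? WHERE table_name = 'permits'",
            new_watermark,
        )
        tgt.commit()

if __name__ == "__main__":
    incremental_load()
```

Note that this sketch only appends new rows; handling updated rows would require a MERGE/upsert against the analytical table.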

How To Apply

Please include your LinkedIn profile link in your application.
