Design, develop, document and implement end-to-end data pipelines and data integration processes, both batch and real-time.
Monitor, recommend, develop and implement ways to improve data quality, including reliability, efficiency and cleanliness, and optimize and fine-tune ETL/ELT processes.
Recommend, execute and deliver best practices in data management and data lifecycle processes.
Prepare test data, and assist in creating and executing test plans, test cases and test scripts.
Requirements
At least 3 years of solid, hands-on experience in real-time event/data streaming.
At least 3 years of solid, hands-on ETL development experience transforming complex data structures across multiple data sources.
Strong ETL/ELT programming skills in Python and T-SQL.
Experience with Azure Databricks for ETL/ELT development and big data analytics programming in Python.
Strong experience with various ETL/ELT frameworks, data warehousing concepts, data management frameworks and data lifecycle processes.
Solid understanding of Azure data management solutions, including Azure Data Factory, Azure Databricks, Azure Blob Storage (Gen2) and Azure Synapse.
Work benefits
Very competitive benefits; talk to your JP Associates Recruiter for more info.