Educational requirements: Bachelor
English requirements: Competent English
Skilled employment experience required: 5-8 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
What you’ll be doing
• Translate business requirements into technical solutions, leveraging strong business acumen
• Assist in implementing the Big Data Solutions strategy laid out by the solution architect
• At least 3 years' experience defining and implementing solution designs, development standards, data quality measurements, end-to-end development processes, and similar practices for data platforms and data integration platforms
• Assist in developing data pipelines and ETL/ELT jobs
• Assist in updates and upgrades to the existing ETL and BI system to ensure consistent system performance
• Develop ETL jobs per the business requirements and the data warehouse design, building them with Azure services such as Azure Data Factory
• Strong experience using a variety of data modelling/data analysis methods, time series forecasting, and data tools; building and implementing models; and creating/running simulations to test business hypotheses
• Maintain existing ETL pipelines and respond to and resolve issues in a timely fashion; the role revolves around day-to-day BAU activities and enhancement of the existing data platform
• Undertake/support the monitoring of BAU processes as directed, including root cause analysis, advising on remediation, and fixing issues
• Support operational and BAU change requests and issues, completing assigned change request tasks along with technical documentation
• Be a strategic partner to the business in resolving queries regarding reports, the data warehouse, and Big Data
• Strong experience working with structured, semi-structured, and unstructured data sources and data types
• Assist in monitoring daily BI job runs and ensure they complete smoothly every day
• Provide adequate process and technical documentation for any ETL/reporting development work done
• Microsoft Certification DP-203 is a must
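Much of the role above revolves around building and maintaining ETL pipelines. As a rough sketch of the extract-transform-load pattern the posting refers to — all table names, column names, and data here are invented for illustration, and a real pipeline would run on Azure Data Factory or Databricks rather than in-process SQLite:

```python
import sqlite3

def run_etl(raw_rows):
    """Minimal ETL sketch: clean raw rows and load them into a table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform: drop rows with missing amounts, normalise region names.
    cleaned = [
        (r["region"].strip().upper(), float(r["amount"]))
        for r in raw_rows
        if r.get("amount") not in (None, "")
    ]
    # Load: bulk-insert the cleaned rows.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    return conn

conn = run_etl([
    {"region": " east ", "amount": "100.0"},
    {"region": "west", "amount": ""},      # rejected by the transform
    {"region": "east", "amount": "50.5"},
])
totals = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall()
```

The same extract/transform/load split maps directly onto the BAU monitoring duties above: bad rows rejected at the transform step are exactly the "bad data in tables" issues the role is expected to find and resolve.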
Experience
• Hands-on experience implementing the Azure modern data warehouse/Big Data Solutions stack, with at least one full-cycle implementation
• Hands-on experience implementing Big data solutions as a designer or developer, with a strong background in data engineering
• Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows.
• Strong experience and background in data warehousing solutions and patterns, with a strong understanding of data modelling techniques such as dimensional modelling
• Hands-on experience with Azure data stack - ADLS, SQL DW/Synapse Analytics, SQLDB, Azure Data Factory, DevOps & CI/CD tools, PowerShell scripting and ML Ops.
• Hands-on experience with Azure Databricks and Stream Analytics
• Minimum 4 years' experience in writing complex SQL queries.
• Experience handling live production data and managing stakeholders
• Experience in database tuning and performance optimisation of ETL jobs
• Experience identifying and resolving issues caused by bad data in tables/ETL jobs
• Able to work independently, coordinating work across multiple users/departments
• Knowledge of Python, Spark, Scala, and popular big data frameworks is a must
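The experience list above calls for writing complex SQL queries. A small example of the kind of query meant — a window function that picks each customer's most recent order. The sample data and table are invented for the example; the SQL is ANSI-style, run here against SQLite purely for illustration (T-SQL on Synapse uses the same `ROW_NUMBER() OVER (...)` syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("alice", "2024-01-01", 10.0),
    ("alice", "2024-02-01", 30.0),
    ("bob",   "2024-01-15", 20.0),
])
# Rank each customer's orders newest-first, then keep only rank 1.
latest = conn.execute("""
    SELECT customer, order_date, amount FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer
                   ORDER BY order_date DESC
               ) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer
""").fetchall()
```

Queries of this shape (partitioned ranking, deduplication, latest-record selection) are a staple of the database tuning and bad-data investigation work described above.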
Skills Required:
• Azure data stack – SQLDW/Synapse Analytics, Data Factory & ARM Templates, T-SQL
• Azure Databricks, Stream Analytics, Azure Functions
• PowerShell and Azure DevOps
• Python/R/Scala
• SAP BusinessObjects Information Design Tool & Web Intelligence
• SSIS, Microsoft SQL Server
Education:
Master of Data Science, Computer Science, or an equivalent qualification is preferred.