Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa holder, Permanent resident, Citizen
Remote work: not accepted
Let’s talk about the role and responsibilities:
• Build robust, efficient and reliable data pipelines that ingest and process data from diverse sources
• Design and develop real-time streaming and batch processing pipeline solutions (see the sketch after this list)
• Design, develop and implement data pipelines for data migration and collection, data analytics and other data movement solutions
• Work with stakeholders and data analyst teams to resolve data-related technical issues and support their data infrastructure needs
• Collaborate with Architects to define the architecture and select technologies
• Share knowledge and best practices with peers to promote better technical practices across the organisation
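To make the streaming responsibility concrete, here is a minimal sketch of the kind of real-time ingestion pipeline this role involves, written with PySpark Structured Streaming (Python and Spark both appear in the skills list below). The broker address, topic name, event schema and data lake paths are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, to_date
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

# Schema of the incoming JSON events (hypothetical source system)
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("amount", DoubleType()),
])

# Read a stream of events from a Kafka topic (broker and topic are assumptions)
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the Kafka payload, drop malformed records, derive a partition column
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .where(col("event_id").isNotNull())
    .withColumn("event_date", to_date(col("event_time")))
)

# Land the cleaned stream in the data lake as date-partitioned Parquet;
# the checkpoint lets the job recover its progress after a failure
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://datalake/events/")
    .option("checkpointLocation", "s3a://datalake/_checkpoints/events/")
    .partitionBy("event_date")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()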
Let’s talk about your capability and experience:
• A minimum of 3 years' relevant experience, at least two of them as a Big Data Engineer, preferably building data lake solutions that ingest and process data from various source systems (a batch-ingestion sketch follows this list)
• Understanding of SDLC processes, with experience working on a large-scale program and in data warehousing and ETL development
• Experience working in an Agile environment
• Experience applying DevOps, Continuous Integration and Continuous Delivery principles to build automated pipelines for deployment and production assurance on the data platform
• Experience building frameworks for an enterprise data lake is highly desirable
• Experience with one or more cloud platforms: AWS, Azure, GCP and/or Snowflake
• Experience with one or more of: Java, Scala, Python, Bash, Informatica, integration/ETL tooling, Hadoop, Spark, Hive, Teradata, Cloudera
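As a companion to the streaming example above, the sketch below shows the batch side of the data lake and ETL work this list describes: deduplicating a daily extract and loading it into a date-partitioned lake table. It is again PySpark; the paths, column names and the daily-extract layout are illustrative assumptions.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("daily-batch-ingest").getOrCreate()

# With dynamic partition overwrite, only the partitions this job writes
# are replaced; earlier partitions in the lake are left untouched
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

# Hypothetical daily extract dropped by an upstream source system
src = spark.read.option("header", True).csv("s3a://landing/orders/2024-01-01/")

clean = (
    src.where(col("order_id").isNotNull())    # reject malformed rows
    .dropDuplicates(["order_id"])             # dedupe on the business key
    .withColumn("order_date", to_date(col("order_ts")))
)

# Land the batch in the data lake as date-partitioned Parquet
(
    clean.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://datalake/orders/")
)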