Educational requirements: Bachelor's degree
English requirements: Competent English
Skilled employment experience required: 3-5 years
Required residence status: Temporary visa, Permanent resident, Citizen
Remote work: not accepted
Mandatory Skills:
- At least 7 years of overall experience in the Big Data domain, with extensive knowledge of the banking, insurance, and mortgage domains.
- At least 5 years of experience with Big Data technologies for batch implementations using Python, Spark, Hive, Scala, HQL, Hadoop, Phoenix, HBase, Bash, and PowerShell.
- At least 3 years of experience with Big Data technologies for real-time implementations using Apache Spark Streaming, Kafka, and NiFi (see the sketch after this list).
- Strong experience with the DDEP Analytics platform and the metadata-driven framework Apache Atlas.
- Experience with code management tools such as Bitbucket, GitHub, and SVN.
- At least 3 years of experience with NoSQL databases such as HBase, Cassandra, and Phoenix.
- Proficiency in Azure cloud services: ADLS Gen2, HDInsight, Azure SQL, Cosmos DB, Blob Storage, and Databricks.
- Hands-on experience with DevOps deployment tools: Jenkins, JIRA, Maven, sbt, Azure DevOps, and Docker.
- Understanding of Azure Machine Learning services and problem-solving skills using reinforcement, supervised, and unsupervised learning models.
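For the real-time requirement above, a minimal sketch of a Spark Structured Streaming job consuming from Kafka in Scala, assuming Spark 3.x with the spark-sql-kafka connector on the classpath; the broker address, topic name, and checkpoint path are hypothetical placeholders, not details from this posting:

import org.apache.spark.sql.SparkSession

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-sketch")
      .getOrCreate()

    // Read a stream of records from a Kafka topic (broker and topic are placeholders).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers key/value as binary; cast the payload to a string.
    val events = raw.selectExpr("CAST(value AS STRING) AS payload")

    // Write to the console sink; a production job would target HBase, ADLS, etc.
    val query = events.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .start()

    query.awaitTermination()
  }
}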