QUALIFICATIONS / EXPERIENCE
• Bachelor's degree in IT, Computer Science or Engineering.
• At least 1-3 years of experience with Big Data technologies such as Azure, AWS or Hadoop; candidates with Azure Big Data experience are strongly preferred.
• At least 1-3 years of solid hands-on ETL development experience transforming complex data structures across multiple data source environments.
• Minimum of 1-3 years of ETL programming in at least one of Python, Scala, Java or R.
KNOWLEDGE & TECHNICAL SKILLS
• Hands-on experience with Azure Big Data solutions such as Data Factory, Databricks, Data Lake Storage Gen2, Synapse and Power BI.
• Experience with various ETL/ELT frameworks, data warehousing concepts, data management frameworks and data lifecycle processes.
• Strong command of ETL programming languages such as Python, Scala, Java, R, Shell and PL/SQL; experience working in Azure Databricks is preferred.
• Experienced in handling and processing different types of data (structured, semi-structured and unstructured).
• Strong knowledge of various database technologies (RDBMS, NoSQL and columnar).
• Good understanding of data analytics and data visualization is preferred; Power BI experience is highly preferred.
• Good understanding of Master Data Management (MDM) and data governance tools, preferably Informatica technologies.
• Experience working in the insurance industry will be an added advantage.
• Ability to communicate and present technical information in a clear and unambiguous manner.
• Strong ability to work independently and cooperate with diverse teams in a multi-stakeholder environment.
• Strong sense of work ownership, a high affinity for all things data and a desire for continuous improvement.
COMPETENCIES
• Understanding existing and emerging technologies
• Understanding business practices, approaches, organization, politics, and culture
• Assessing current technology gaps in order to develop, coordinate and implement changes that meet new requirements within critical deadlines