Role: Hadoop Developer
Location: Sentral KL, Malaysia
Experience: 3–12 years
To support design, development and maintenance of Hadoop solutions for the Enterprise.
To be part of initiatives that brings data into the data lake and delivers insights.
To perform production support and maintenance of existing datasets in Hadoop.
The Job
Design and develop ETL/ELT jobs based on requirements, from source systems to the Data Lake platform.
Ensure that all development standards are being followed and work closely with the Data Architect.
Serve as a hands-on senior developer/lead for a team of ETL developers.
Oversee code reviews for application programs. Mentor developers in technical matters regarding the implementation of enterprise data standards, guidelines and industry best practices.
Monitor and manage production jobs to verify execution and measure performance, ensuring ongoing data quality, managing system scalability and performance, and identifying improvement opportunities for key ETL processes.
Work effectively with all technical personnel (Development Team, business analysts, security, risk and compliance, data center, project managers, data architects and testers), and clearly translate business priorities and objectives into technical solutions.
Experience in data-related work with Data Warehouses/Data Marts, including at least 4 years as a senior ETL developer/lead, with emphasis on Hadoop implementation.
Experience leading a small team in design, development, testing and implementation of ETL solutions using enterprise ETL tools.
Strong interpersonal skills; ability to work on cross-functional teams. Strong verbal and written communication skills with an ability to express complex technical concepts in business terms and complex business concepts in technical terms. Ability to lead teams to consensus decisions on complex business and technical data challenges.
Deep knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/BI.
Demonstrated problem-solving skills. Ability to learn effectively and meet deadlines. Demonstrated skill leading technical teams, including organizing workflow and scheduling assignments.
Strong scripting skills in a Linux environment.
Hands-on development experience with Sqoop, Hive, Spark, and Python/Scala is a must.
Experience with the design, management, and implementation of Backup, Disaster Recovery, and/or High Availability solutions.
Experience with Insurance/Financial data marts is preferred.