Key Responsibilities
- Lead the architecture design and implementation of big data platform components, including storage, compute, scheduling, security and governance.
- Work with Product, R&D, BI and Business teams to translate business requirements into practical data platform use cases and capabilities.
- Optimise platform performance and improve the overall efficiency of compute and storage usage.
- Ensure high availability, observability, security and regulatory compliance of the big data platform.
- Provide technical guidance and mentorship to data engineers and ETL developers.
- Drive technical knowledge sharing, documentation standards and best practice adoption across the team.
Requirements
- Bachelor’s Degree in Computer Science, Software Engineering, Data Science or a related field.
- Minimum 5 years’ experience in big data platform development, with at least 3 years in architecture design.
- Proficient in English (spoken and written) and able to use it as a working language.
- Strong hands-on experience with Hadoop ecosystem (HDFS, Hive, YARN, MapReduce), Spark, Flink and Kafka.
- Solid knowledge of data warehouse modelling and the ability to abstract complex business requirements into architecture designs.
- Good understanding of distributed systems principles and experience in performance tuning, high availability and elastic scaling.
- Familiar with data lake technologies; experience implementing cloud-native big data platforms is an added advantage.
- Excellent system analysis, communication and stakeholder management skills.
- Experience in large-scale internet/tech or fintech companies is preferred.
Job Types: Full-time, Permanent
Pay: RM15,000.00 - RM25,000.00 per month
Work Location: In person