Lead Engineer - Big Data
Job title: Lead Engineer - Big Data
Contract type: Contract
Location: Sydney
Salary: $100 - $110 per hour, inclusive of super
Start date: 13/11/2024
Duration: 6 months
Job reference: V-47574
Contact name: Farbar Siddiq
Contact email: farbars@whizdom.com.au
About our client:
Our client is a well-known and prestigious multinational consultancy working with a Big 4 banking end client.
Employees enjoy access to cutting-edge resources, continuous learning and development programs, and a collaborative environment that fosters creativity and career growth. Our client’s commitment to diversity and inclusion ensures a welcoming workplace for all. Additionally, the company’s focus on sustainability and social responsibility allows employees to contribute to meaningful global initiatives.
About the role:
We are looking for a skilled Data Solutions Architect to design, develop, and support advanced data solutions across industries, leading technical aspects of data projects and guiding data engineers and testers. This role involves creating detailed solution designs, developing end-to-end data pipelines, and working on batch and near real-time data flows. The ideal candidate has 5+ years in solution design or technical leadership, with expertise in Big Data, cloud platforms (Azure, AWS, GCP), NoSQL databases (HBase), and streaming technologies (Kafka). Strong knowledge of data modelling, data mesh architecture, and performance-sensitive services is essential. Exceptional interpersonal and presentation skills are required, along with a solid foundation in Agile, Kanban, or Waterfall methodologies and a degree in Computer Science or a related field.
Key responsibilities will include:
• Design, develop, test, and support future-ready data solutions for customers across industry verticals
• Produce detailed solution designs and support data engineers and testers as the technical lead on data projects
• Develop, test, and support end-to-end batch and near real-time data flows/pipelines
• Demonstrate understanding of data architectures, modern data platforms, big data, data modelling, ML/AI, analytics, cloud platforms, data governance, information management, and associated technologies
• Develop and demonstrate proofs of concept and working demos
• Support and collaborate with other internal/external consultants in consulting, workshops, and delivery engagements
• Mentor junior IBM consultants in the practice and delivery engagements
The successful candidate:
- Strong leadership skills, with impact on both project teams and broader organisational goals.
- Skilled at navigating large-organisation governance and processes with ease.
- Exceptional interpersonal skills; confident presenting solution designs to large technical audiences.
- 5+ years in solution design, technical leadership, or development in IT consulting, ideally with Banking/FSS experience.
- 3+ years of hands-on experience with Big Data/Data Lake services in cloud and on-premises environments.
- Expert in industry best practices, design patterns, and pioneering technical solutions across platforms.
- Advanced proficiency with NoSQL databases (HBase) and real-time data pipelines (Kafka).
- Deep understanding of data modelling (ER, 3NF, dimensional modelling) and data mesh architecture.
- Proven background in ETL, data warehousing, and BI solutions.
- Skilled in Scala, Python, SQL, shell scripting, and related languages.
- Experience building high-availability, fault-tolerant services that meet performance SLAs.
- Strong working knowledge of cloud platforms (Azure, AWS, GCP, IBM) and modern data tools.
- Proficient in containerization, virtualization, infrastructure, and networking.
- Versatile across Agile, Kanban, and Waterfall methodologies.
- Degree in Computer Science, IT, or a related field, with industry-recognised certifications or badges.
This contract is available for an initial 6-month term, with the possibility of three further 6-month extensions.
Located in Sydney, this role offers a hybrid working arrangement.
How to Apply
Please upload your resume to apply. We will be in touch with further instructions for suitably skilled candidates. Please note that you will be required to address selection criteria to finalise your application for this role.
Call Farbar Siddiq on 0489 922 211 or email farbars@whizdom.com.au for any further information. Applications close 19/11/2024 at 5pm.
Candidates will need to be willing to undergo pre-employment screening checks, which may include ID and work rights checks, security clearance verification, and any other client-requested checks.