Data Modeller

About our client: Our client is a well-known and prestigious multinational consultancy that is working with a Big 4 banking end client.
Employees enjoy access to cutting-edge resources, continuous learning and development programs, and a collaborative environment that fosters creativity and career growth.
Our client's commitment to diversity and inclusion ensures a welcoming workplace for all.
Additionally, the company's focus on sustainability and social responsibility allows employees to contribute to meaningful global initiatives.
About the role: We are looking for an experienced Data Engineer to design and implement data models across diverse platforms and business domains.
This role involves creating future-ready data solutions, developing real-time and batch data pipelines, and leading data model designs using techniques such as 3NF, Dimensional Modelling, and Data Vault.
The ideal candidate has 10+ years in IT consulting with expertise in Big Data, data lakes, and cloud platforms like Azure, AWS, or GCP, along with hands-on skills in Python, SQL, and containerization.
Strong leadership skills, familiarity with Agile methodologies, and the ability to mentor junior team members are essential.
Industry certifications are a plus.
Key responsibilities will include:
Build, test, and support future-ready data solutions across diverse industries.
Develop robust end-to-end batch and real-time data pipelines.
Showcase expertise in data architecture, modern platforms, Big Data, ML/AI, analytics, cloud, and data governance.
Design data models aligned with business requirements for specific use cases and domains.
Apply advanced data modelling techniques like 3NF, Dimensional Modelling, ER Models, and Data Vault.
Demonstrate knowledge of Data Asset concepts and Data Mesh Architecture.
Create proofs of concept and working demos to showcase solutions.
Lead or collaborate with consultants in workshops and delivery engagements.
Mentor junior consultants, guiding best practices in data engineering and delivery.
The successful candidate:
10+ years in IT consulting as a solution designer, developer, tester, or support professional.
3+ years of hands-on experience with Big Data/Data Lake services in the cloud and on-premises.
Skilled in designing and implementing data models in Big Data/Data Lake or Data Warehouses using techniques like Dimensional Modelling with Star or Snowflake schemas.
Experience with data model products from vendors like IBM and Teradata.
Proficient in data modelling tools, including Erwin and Sparx Systems Enterprise Architect.
Skilled in applying industry best practices, design patterns, and innovative technical solutions across data and application platforms.
Expertise in building near real-time data pipelines and distributed streaming applications.
Strong background in ETL, data warehousing, and BI solutions.
Proficient in Python, Scala, SQL, shell scripting, and other programming languages.
Experience with performance-sensitive, highly available, fault-tolerant services with automatic failover.
Hands-on experience with cloud data services (such as Azure Synapse and Cosmos DB) across Azure, AWS, GCP, and IBM platforms.
Solid understanding of containerization, virtualization, infrastructure, and networking.
Diverse experience with Agile, Kanban, and Waterfall methodologies.
Skilled in stakeholder management and team leadership.
Degree in Computer Science, Information Technology, or related fields.
What's on offer?
This contract is available for an initial 6-month term, with the option of three further 6-month extensions.
Located in Sydney, this role offers a hybrid working arrangement.
How to Apply: Please upload your resume to apply.
We will be in touch with further instructions for suitably skilled candidates.
Please note that you will be required to address selection criteria to complete your application for this role.
Call Farbar Siddiq on 0489 922 211 or email ****** for any further information.
Applications close 19/11/2024 at 5pm. Candidates will need to be willing to undergo pre-employment screening checks, which may include ID and work rights, security clearance verification, and any other client-requested checks.