Senior Data Engineer (GCP)

Details of the offer

Senior Data Engineer (GCP)
Sydney | $185,000 + Bonus

A Senior GCP Data Engineer is required to join a Data Platforms team within the Digital Bank and contribute to building the best technology-driven business in financial services.

Key Responsibilities

Play a pivotal role in designing, building, and enhancing enterprise-scale data solutions.
Work in a DevOps environment, with end-to-end accountability for designing, developing, deploying, and supporting your data assets.
Develop high-quality, low-maintenance software solutions, coordinating requirements, schedules, and activities.
Participate in team meetings, contribute to peer code reviews, and assist in testing.

Key Requirements

5+ years of enterprise engineering experience in a mature cloud environment.
Proven experience in building and deploying data lakes and data warehouses on GCP using services such as BigQuery, Dataflow, Pub/Sub, Cloud Functions, and GCS.
Expertise in developing code-based ETL/ELT data pipelines with performance-optimized data modeling.
Experience ingesting data from various sources, including databases, APIs, streaming data feeds, and other cloud platforms.
Familiarity with event-driven processing and big data processing, including PII handling, data wrangling, formatting, and preparation.
Strong background in analytics, with knowledge of BigQuery, data denormalization, large views, and reporting tools.
Machine learning (ML) and artificial intelligence (AI) experience is a plus, particularly with Natural Language Processing (NLP) and APIs for document processing and language translation.

If you feel you possess the relevant skills for the above, apply now with your most up-to-date CV.

