Sorry, this offer is no longer available,
but you can run a new search or explore similar offers:

Data Quality Specialist Lead

TAFE NSW - Life-Changing Careers. Data Quality Specialist Lead. Location negotiable (subject to campus availability). 2x temporary full time until November 2025 B...


From TAFE NSW - New South Wales

Published 14 days ago

Business Analyst - Non-Financial Risk

Business/Systems Analysts (Information & Communication Technology). Are you passionate about driving meaningful change and delivering impactful solutions? A l...


From Morgan McKinley - New South Wales

Published 14 days ago

Applications Specialist

At Varian, a Siemens Healthineers Company, we bring together the world's best talent to realize our vision of a world without fear of cancer. Together, we wo...


From VMS Australasia Pty Ltd. - New South Wales

Published 14 days ago

Security Operations Engineer

Salary: $900 to $1,000 per day including super. Location: Sydney CBD office. Work Arrangement: Hybrid, WFH 2 days a week. Contract Duration: 6 to 12 month cont...


From Energy Jobline - New South Wales

Published 14 days ago

Data Engineer

Details of the offer

Hybrid and flexible working arrangements

About RDC

Rich Data Co (RDC) - Delivering the Future of Credit, Today! We believe credit should be accessible, fair, inclusive, and sustainable. We are passionate about AI and about developing new techniques that leverage traditional and non-traditional data to reach the right decision in a clear and explainable way. Leading global financial institutions are using RDC's AI decisioning platform to offer credit in a way that aligns with customers' needs and expectations. RDC uses explainable AI to give banks deeper insight into borrower behaviour, enabling more accurate and efficient lending decisions for businesses.

Join the Future of Credit!

- Work at a 5-Star Employer of Choice 2023 - RDC was named one of HRD Australia's "best companies to work for in Australia".
- Join a fast-growing global AI company - grow your skills and capabilities and gain AI and global experience.
- High-performance team - work alongside some of the best product teams, data scientists and credit experts in the industry.
- Vibrant team culture - join an innovative and agile team that celebrates wins and solves problems together.
- Work-life balance - highly flexible working arrangements; work in the way that's right for you!
- Financial inclusion - be part of a company striving for global financial inclusion and driving the future of credit.

About the Role

- Design, develop, and maintain ETL pipelines using Apache Airflow to ensure efficient and scalable data processing (see the sketch after this posting).
- Use strong Python programming skills to write clean, maintainable code for data transformation and automation.
- Manage data storage, retrieval, and optimization in both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
- Collaborate with cross-functional teams to gather requirements, integrating data from various sources to create comprehensive reports and dashboards.
- Ensure scalability, performance, and security in data pipelines and visualizations while working in cloud-based environments.

Here's what a typical day might look like:

- Team stand-up: Start your day in an agile stand-up, collaborating with product, sales, and customer support teams to discuss ongoing data engineering and reporting projects.
- Data integration: Work closely with internal teams to ensure seamless data integration from various sources, optimizing pipelines so that data flows correctly into your dashboards and reports.
- Collaboration & refinement: Collaborate with stakeholders to gather data requirements and refine report outputs based on feedback. Participate in agile ceremonies, helping clarify user stories and project timelines.
- Learning & continuous improvement: Stay up to date on the latest trends in data engineering and dashboard development, focusing on optimizing performance and enhancing the user experience.

Your Skills & Experience

You must have:

- 2+ years of proven experience with Apache Airflow for building and managing data pipelines.
- Strong proficiency in Python, with experience writing efficient and scalable data processing code.
- Experience with both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
- Experience with data pipelines or ETL processes, ensuring data is available for dashboards and reports.
- Strong SQL skills, with the ability to write complex queries and optimize them for performance.
- Experience with data visualization tools such as Power BI, Tableau, or similar platforms.
- Excellent communication skills to collaborate with cross-functional teams and translate business requirements into technical designs.
- Strong problem-solving skills, with the ability to troubleshoot technical issues and optimize data pipelines and dashboards.

You ideally have:

- Experience working in cloud-based environments such as AWS, Azure, or GCP.
- Familiarity with API integration for real-time data fetching and interaction.
- Experience with agile development methodologies, working within fast-paced, iterative environments.
- Knowledge of DevOps practices and CI/CD pipelines.

What's Next?

If this sounds like you, get in touch or apply now. Our Head of Talent and Capability would love to speak with you! Get ready for the #futureofcredit

Please note: candidates must have Australian working rights.

Posted Date: 02 Oct 2024
Location: Sydney, NSW, Australia
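To make the Airflow responsibilities above concrete, here is a minimal sketch of a daily ETL DAG of the kind the role describes. It is illustrative only and not from the posting: it assumes Airflow 2.4+ with the TaskFlow API, and the DAG id, sample rows, and transformation logic are all invented for illustration.

from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_reporting_etl",  # hypothetical name, not from the posting
    schedule="@daily",
    start_date=datetime(2024, 10, 1),
    catchup=False,
    tags=["example", "etl"],
)
def example_reporting_etl():
    @task
    def extract() -> list[dict]:
        # A real task would query a source system (e.g. PostgreSQL via a
        # hook); stubbed with in-memory rows for illustration.
        return [{"loan_id": 1, "amount": 1000}, {"loan_id": 2, "amount": 2500}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Example transformation: flag large loans for a reporting dashboard.
        return [{**row, "is_large": row["amount"] >= 2000} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real task would write to a reporting database or warehouse.
        print(f"Would load {len(rows)} rows into the reporting table")

    # TaskFlow wiring: extract -> transform -> load
    load(transform(extract()))


example_reporting_etl()

In a real pipeline, the extract and load steps would use Airflow connections and hooks rather than in-memory stubs, and the transform step would carry the actual reporting logic.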


Nominal Salary: To be agreed

Source: Whatjobs_Ppc

