About the role… 'Powering EnergyAustralia to make better data driven decisions towards a greener energy future'
Reporting to the Data Engineer Leader – Trading & Ops, the Senior Data Engineer will work closely with the Data Product team to support, design, build, and deploy data solutions that enable effective and efficient value realisation from EnergyAustralia's data. As a senior member of the Data Engineering team, this role helps lead the discipline, with a core focus on data ingestion and integration work, primarily within the AWS analytics ecosystem. Additional responsibilities include:
- Leverage the function's patterns for appropriate ETL/data import and interfacing mechanisms such as consumption layer databases, APIs, and caches.
- Support and develop the data products and platform.
- Support the formulation of data solution designs and architecture, leveraging these to support the data platform and products and to deliver new products. Some out-of-hours work will be required to support business-critical data products.
- Review business requirements, create data mappings between systems, and carry out data profiling, query design, and data flow design.
- Mentor junior Developers in the department.
- Apply DevSecOps principles, utilising CI/CD workflows for accelerated development of infrastructure components and data ingestion pipelines.

Be Impactful when you are applying… You will have strong Engineering experience and exposure to Agile or DevOps delivery environments in both general and technology-specific applications. You will also have some of the following competencies:
- Tertiary qualification in computer science, information technology, business systems, or another relevant field, or equivalent significant practical experience.
- Extensive experience building Data Engineering pipelines, as well as exposure to Cloud Data Platforms.
- Expert-level hands-on experience (and relevant certifications) with Databricks and AWS cloud data environments, including tools such as S3, EMR/EC2, Lambda, and Redshift.
- Expert-level implementation experience with streaming technologies (e.g. Kafka/Kinesis and Spark Streaming) using Python or Scala.
- Extensive design and build experience with Apache Airflow or similar cloud orchestration services.
- Expert big data and query language skills, such as SQL, Java, Python, R, and PySpark.
- Experience establishing, curating, and maintaining logical and physical data models.

How to Apply: If you're ready to 'light the way' towards your next career move, click the 'Apply' button to submit a confidential application. For any questions, please reach out to Jock Clydesdale, Talent Acquisition Partner @
Why Us: At EnergyAustralia, we are committed to providing an inclusive culture so our employees can bring their whole selves to work and have a sense of belonging. As an employee, you can enjoy such benefits as:
- Annual Performance Bonus
- Discounts on gas and electricity for employees
- Hybrid working environment that promotes flexibility
- Energise Program: team-centric flexible working that enables all individuals to agree and succeed together
- Excellent company culture in a down-to-earth and friendly organisation - be authentic and bring your whole self to work!
- State-of-the-art Melbourne office with stunning views, only a 3-4 minute walk from Southern Cross station
- In-house café and onsite Tech Bar

From our PRISM network that creates a positive culture for LGBTQ+ employees to our Reconciliation Action Plan with commitments to strengthen relationships with Aboriginal and Torres Strait Islander people and organisations, it's a workplace where everyone's welcome.