Are you passionate about leveraging cutting-edge AWS technologies to transform data into actionable insights?
IQX Business Solutions is seeking an experienced AWS Data Engineer / Cloud Architect to design and implement innovative data pipelines and analytics solutions for StratexOnline, our SaaS platform.
This is a unique opportunity to work on multi-environment, multi-region AWS solutions that drive real business outcomes, integrating modern data engineering principles with robust cloud architecture.
If you're ready to make a big impact in a supportive, fast-paced, innovative environment, we want to hear from you!
In this role, you will:
Architect Solutions: Leverage AWS services like DynamoDB, Glue, Athena, S3, and Redshift to build scalable, secure, and cost-efficient data analytics pipelines.
Build ETL Pipelines: Design and implement ETL processes that transform data into analytics-ready formats (e.g., Parquet or CSV) and optimize query performance.
Automate Workflows: Use Infrastructure as Code tools (e.g., CloudFormation, CDK, Terraform) and CI/CD pipelines to streamline deployments and updates.
Ensure Security & Efficiency: Configure IAM policies, enforce data isolation, and optimize costs following AWS best practices.
Collaborate & Document: Work closely with stakeholders, document workflows, and ensure solutions are maintainable for the long term.
About IQX Business Solutions:
At IQX Business Solutions, we empower organizations to optimize their strategic execution through StratexOnline, our innovative SaaS solution.
StratexOnline enables businesses to manage strategic initiatives, align priorities, and deliver measurable results with ease.
By leveraging cutting-edge cloud technologies and a user-friendly platform, we transform strategies into actionable outcomes, ensuring success in today's fast-paced digital world.
Must-Haves:
2–3 years of hands-on experience with AWS services such as DynamoDB, Glue, S3, Athena, and Redshift.
Strong knowledge of Python (e.g., for Glue scripts and automation with boto3) and SQL for querying and data transformation.
Proven experience designing ETL pipelines for analytics workloads, including working with semi-structured data (e.g., JSON).
Familiarity with Infrastructure as Code tools (e.g., AWS CloudFormation, CDK, or Terraform) for automating deployments.
Expertise in optimizing analytics workflows for multi-environment setups with a focus on cost efficiency and security.
Nice-to-Haves:
AWS certifications, such as AWS Certified Solutions Architect or AWS Certified Data Analytics – Specialty.
Experience optimizing Redshift queries and integrating DynamoDB data with Redshift.
Proficiency with PySpark or Pandas for data processing and transformation.
Familiarity with big data tools or frameworks, such as Apache Spark.
Background in configuring and managing CI/CD pipelines for AWS deployments.
Join our supportive and innovative team in North Sydney, where great culture meets cutting-edge AWS solutions!