Come and join a thriving company and become part of a diverse global collective of free-thinkers, entrepreneurs and industry experts who are all driven to use technology to reimagine what's possible. Capgemini. Get the future you want.
Let's talk about the role and responsibilities:
The Solution Designer will report to the lead architect on the program and be a part of the Technical/Architecture Design Authority. They will take direction from the architecture team and the project delivery lead as required, and will work closely with the project delivery lead and the project team, providing support on technical design. The Solution Designer role is a hands-on technical "build" and solutions role, and the designer will be required to guide others on how technical designs can be implemented. The responsibilities include designing the essential core building blocks that make up a technical solution, allowing a modular and flexible design that meets the business requirements.
Core Responsibilities include:
- Designing and architecting scalable and efficient data warehouse and data lakehouse solutions using AWS as the cloud platform and Databricks as the data processing platform.
- Providing design and technical guidance to feature teams performing big data ingestion, curation and conformance.
- Acting as the focal point for design and technical guidance for data engineers, including pattern design, development and technical direction.
- Collaborating with various stakeholders to understand business goals and support delivery outcomes.
- Ensuring designs are consistent and meet the standards and direction of the design teams and the architectural design authority.
- Working with cross-program and platform teams to drive design decisions and develop solutions.

Let's talk about your qualifications and experience:
Required Skills/Experience:
- 10+ years of commercial experience.
- Experience in big data technologies and AWS/multi-cloud.
- Good experience in data engineering.
- Databricks experience is preferred but not mandatory.
- Experience with data architecture, with hands-on experience working with data platforms.
- Experience designing event-driven applications and near real-time and real-time data solutions using Spark Structured Streaming and Kafka (various connectors and sinks).
- Experience building monitoring/alerting frameworks for data pipelines.
- Experience building data pipelines for data lakes and multi-hop architectures.
- Experience with orchestration tools such as Control-M, Airflow, Jenkins, etc.
- Experience with API design/development, with the ability to develop/build API solutions when required.
- Experience with Java/Python/Spark development, with the ability to build or review code to ensure it meets design direction and coding standards.
- Strong interpersonal/communication skills, with the ability to perform hands-on build work and to assist and lead technical build staff.
- Strong technical team leadership, mentorship and collaboration based on a hands-on approach to design and technical work.
- Ability to develop technical relationships with end users.

Advantageous Skills/Experience:
- Understanding of the underlying infrastructure for big data solutions would be advantageous.
- Experience working with distributed data processing frameworks such as Spark and Databricks, and with distributed cloud storage solutions such as S3 and ADLS.
- Understanding of, or experience with, data modelling concepts such as normalization, denormalization, Data Vault and others.
- Experience designing, building and operating cloud services in an IT, systems integrator, or service provider environment would be advantageous.