Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field, or equivalent proven experience as a Data Engineer, Software Developer, or similar role … Demonstrated hands-on experience with AWS data services (Kinesis, Glue, AppFlow, Lambda, S3), Snowflake, and dbt (dbt Labs) for building and modeling data pipelines
Design, development, and operation of data pipelines and data models using Snowflake and dbt (Data Build Tool) - Building and maintaining ETL/ELT processes and implementing best practices for data quality and governance
Development and implementation of stable, scalable data processes for capturing and preparing telemetry data from ESL and EdgeSense systems - Building and evolving data models and analysis workflows for reporting and data evaluation
Areas that play to your strengths - All the responsibilities we'll trust you with … Assist business analysts across all departments and markets with developing impactful analytics products, guiding them from ideation through proof of concept and value to scalable assets
Areas that play to your strengths - All the responsibilities we'll trust you with: DATA ARCHITECTURE - As a Data Architect - Supply Chain you ensure consistency, integrity, and strategic reuse of data assets within a specific business domain
Areas that play to your strengths - All the responsibilities we'll trust you with: PROJECT MANAGEMENT - Lead the implementations and rollouts of services across Red Bull and its subsidiaries. Coordinate scope, budget, time, quality and external implementation teams/partners
Collaborate with cross-functional teams to translate business requirements into scalable, easy-to-maintain data structures … Familiarity with modern data platforms and tools (e.g., Snowflake, dbt, GCP, Airflow) is a plus, but not a requirement
Automate Infrastructure: Use Terraform to provision, configure, and maintain Snowflake environments and supporting cloud infrastructure. Craft Efficient Data Pipelines: Design, implement, and optimize ELT/ETL pipelines using SQL and Python
Develop and maintain regional business performance dashboards and KPIs and provide business insights to local leadership team • Analyze monthly variances to plan/forecast and clearly identify their root cause, concisely explain the ramifications, and creatively recommend areas to strengthen business performance