Administer, monitor and optimize our Big Data environments based on Apache Hadoop (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB and Iceberg Tables - IaC deployment via Terraform - Plan and execute updates and upgrades
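The administration and monitoring duties described in the posting above typically involve scripted health checks against the managed services. As a minimal illustrative sketch (not part of the posting), the following Python snippet uses boto3 to query the status of a DynamoDB table; the table name and region are hypothetical placeholders, and configured AWS credentials are assumed.

```python
# Minimal health-check sketch for one of the managed services mentioned above
# (DynamoDB). Assumes AWS credentials are already configured; "events-table"
# and "eu-central-1" are hypothetical placeholders.
import boto3


def check_dynamodb_table(table_name: str, region: str = "eu-central-1") -> str:
    """Return the current status of a DynamoDB table (e.g. ACTIVE, UPDATING)."""
    client = boto3.client("dynamodb", region_name=region)
    table = client.describe_table(TableName=table_name)["Table"]
    print(f"{table_name}: status={table['TableStatus']}, items={table.get('ItemCount', 'n/a')}")
    return table["TableStatus"]


if __name__ == "__main__":
    check_dynamodb_table("events-table")
```

A comparable check for Kafka, Flink or NiFi would query their respective admin or REST APIs in the same spirit.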
Good knowledge of methods, processes and tools of model-based software and systems engineering, of research fundamentals, machine learning, (Big) Data analytics, and development methods for LLMs and other AI technologies
What we offer you - An autonomous vehicle is a complex cyber-physical system that requires state-of-the-art ML/AI software components and sophisticated hardware equipment to work together. To test autonomous driving systems under different driving conditions, AVL is developing various machine …
Administration, monitoring, and optimization of the Big Data environment based on Apache Hadoop in the AWS Cloud - Management and maintenance of services like Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables - Deployment using Infrastructure as Code (Terraform)
Tasks - Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB and Iceberg Tables - IaC deployment via Terraform
Testing and analysing full self-driving stacks under different hardware and software configurations with the integrated methods. Technologies and skills - Python - C++ - Our expectations of you: Qualifications - Ongoing studies in the fields of Computer Science, Telematics, Physics or Electrical Engineering
In this role, you will be responsible for monitoring and operating our IT infrastructure (servers, databases, container technologies based on OpenShift/Kubernetes) in order to ensure the stability of our predictive analytics applications
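For the OpenShift/Kubernetes-based infrastructure monitoring described in the posting above, a comparable scripted check could use the official Kubernetes Python client. This is only a sketch: the namespace name is a hypothetical placeholder and kubeconfig access is assumed.

```python
# Sketch of a pod health check against a Kubernetes/OpenShift cluster.
# Assumes a reachable kubeconfig; the namespace "predictive-analytics" is hypothetical.
from kubernetes import client, config


def report_unhealthy_pods(namespace: str = "predictive-analytics") -> list[str]:
    """List pods in the namespace that are not in the Running or Succeeded phase."""
    config.load_kube_config()  # use config.load_incluster_config() when running inside the cluster
    v1 = client.CoreV1Api()
    unhealthy = []
    for pod in v1.list_namespaced_pod(namespace).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            unhealthy.append(f"{pod.metadata.name}: {phase}")
    return unhealthy


if __name__ == "__main__":
    for line in report_unhealthy_pods():
        print(line)
```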
For Wien, Linz, Wels, Salzburg, Graz, St. Pölten - Tasks: Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB and Iceberg Tables
What our client offers: Very good working atmosphere - Centrally located workplace with very good public transport connections in a lively inner-city location … Consulting related to e-commerce and Big Data in IT, IP, media and telecommunications law (project contracts, outsourcing, general terms and conditions, etc