Key information
Big Data Engineer - Apache Kafka and Big Data Technologies
Position: Not specified
Start: As soon as possible
End: 31 Dec. 2025
Location: Ludwigshafen am Rhein, Germany
Collaboration type: Project only
Hourly rate: 95 Lei
Last updated: 1 Nov. 2024
Project description and requirements
Responsibilities:
Designing and implementing real-time data processing workflows using Apache Kafka.
Developing and managing data pipelines with Kafka Streams, Kafka Connect, and KSQL to facilitate data flow between systems (see the sketch after this list).
Writing and maintaining code in Java, Python, and Scala for data integration and transformation tasks.
Ensuring efficient data processing and integration with big data technologies for large-scale data handling.
Troubleshooting data pipeline issues, optimizing performance, and resolving any data flow interruptions or bottlenecks.
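To make the Kafka Streams side of these responsibilities concrete, here is a minimal Java sketch of a stream-processing topology that reads raw events, filters and normalises them, and writes the result to an output topic. The application id, broker address, and topic names (orders-raw, orders-clean) are hypothetical placeholders and not part of the original listing.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderEventPipeline {

    public static void main(String[] args) {
        // Application id, broker address, and topic names are hypothetical placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-event-pipeline");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop empty records, normalise the payload,
        // and publish the cleaned stream for downstream consumers.
        KStream<String, String> rawEvents = builder.stream("orders-raw");
        rawEvents
            .filter((key, value) -> value != null && !value.isBlank())
            .mapValues(value -> value.trim().toLowerCase())
            .to("orders-clean");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the Streams application cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In a pipeline of this kind, Kafka Connect would typically handle moving data between Kafka and external systems, while KSQL can express similar filtering and transformation logic declaratively.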
Skills:
Proficiency in Apache Kafka, including Kafka Streams, Kafka Connect, and KSQL.
Strong programming skills in Java, Python, and Scala.
Experience in real-time data processing and big data technologies.
Problem-solving and troubleshooting skills, particularly with data pipeline performance.
Ability to optimize and fine-tune data flow processes for efficiency and reliability.
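As an illustration of the tuning and reliability skills listed above, below is a hedged Java sketch of a producer configured for throughput (batching, compression) as well as reliability (acks=all, idempotence). The broker address, topic name, and record contents are hypothetical examples.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TunedProducerExample {

    public static void main(String[] args) {
        // Broker address and topic name are hypothetical placeholders.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Reliability: wait for all in-sync replicas and enable idempotence
        // so that retries cannot introduce duplicate records.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        // Throughput: batch records for up to 10 ms and compress each batch.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "10");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, Integer.toString(64 * 1024));
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders-clean", "order-1", "{\"status\":\"shipped\"}"));
        }
    }
}
```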
Tools:
Apache Kafka (Kafka Streams, Kafka Connect, KSQL).
Programming languages: Java, Python, Scala.
Big data technologies and processing frameworks (e.g., Hadoop, Spark).
Performance monitoring and debugging tools for real-time data processing.
Collaboration tools (e.g., Microsoft Teams, Slack).
—
Start: 11/2024
Duration: 12 Months+
Location: REMOTE