Your tasks:
In this role you will work with state-of-the-art cloud technologies to build a big data processing pipeline that handles terabytes of raw data and millions of events every day. You will be responsible for keeping the platform highly available, scalable, secure, and cost-efficient.

Your profile:
- Bachelor's degree in Computer Science or a related field, or equivalent practical experience.
- Experience with one or more general-purpose programming languages, including but not limited to C/C++, Go, Python, or Java.
- Good understanding of cloud architectures, services, and container technology.
- Experience with common cloud platforms (AWS, Azure, GCP).
- Experience with big data processing systems covering ingestion, storage, and indexing.
- Solid communication skills and fluency in English.
- High level of drive, independence, and strong analytical skills.

Preferred qualifications:
- Master's or PhD degree in Computer Science, or further education or experience in engineering, computer science, or a related field.
- Experience in IT security, especially malware analysis.
- Experience with message-passing architectures (e.g., Kafka, RabbitMQ, Redis).
- Experience in architecting, building, and maintaining hybrid cloud environments.
- Experience with agile development, testing, and continuous integration.
Job Status: Active
Job Type: Full-time