
DevOps Engineer - Data Platform

Software Engineering
Bengaluru, Karnataka, India
Posted on Wednesday, October 11, 2023

FairMoney is a credit-led mobile bank for emerging markets. The company was launched in 2017, operates in Nigeria, and has raised close to €50m from global investors like Tiger Global, DST & Flourish Ventures. The company has offices in France, Nigeria, Germany, Latvia, the UK, Türkiye, and India.

About the Role

You will join our Technology Division as a DevOps Engineer - Data Platform. The division is a dynamic team of over 75 professionals spread across Lagos, Bengaluru, Paris, Türkiye, and beyond. Your primary responsibility will be contributing to the development of a cutting-edge data platform built on real-time data processing technologies.

In collaboration with our Data Tech team, you will play a vital role in the integration of open-source technologies and their deployment on our Kubernetes infrastructure. This initiative aims to create a versatile data platform while reducing reliance on managed services.

As we establish Engineering Centers of Excellence in various regions, we are actively seeking talented and driven engineers. This is a unique opportunity to join the core engineering team of a rapidly growing fintech company poised for substantial expansion in the coming years.

Watch how FairMoney is building Africa’s money story here.

Roles and Responsibilities

  • Design, deploy, maintain, and improve our CI/CD Systems.
  • Design, deploy, and maintain our cloud-native IT Infrastructure.
  • Design and implement Infrastructure as Code and dynamic environment setups.
  • Perform root cause analysis for production errors.
  • Configure, monitor, and manage access to our systems.
  • Design procedures for system troubleshooting and maintenance.

Our Technical Stack

  • Cloud providers: AWS, GCP.
  • Backend: Ruby, Golang.
  • Mobile applications: Android (Kotlin).
  • Configuration management: Terraform, Kubernetes.
  • CI/CD: GitHub Actions, ArgoCD.

Data Stack

  • Languages: Python / Scala.
  • Techs: GCP BigQuery / DBT / Apache Spark / Apache Flink.
  • Streaming: Kinesis / Kafka.
Requirements

  • Bachelor’s/Master’s degree in Computer Science, Mathematics, or a related technical field.
  • 5+ years of hands-on experience in a similar role.
  • 3+ years of proven experience with complex AWS projects.
  • 3+ years of proven experience with Infrastructure as Code with Terraform.
  • 2+ years of proven experience with Kubernetes in production.
  • Experience with DevOps and CI/CD tools (GitHub Actions, ArgoCD).
  • Good knowledge of at least one programming language (Python or Ruby is a plus).

Nice to have:

  • Experience working with Kafka, ideally at large scale.
  • Experience with Apache Flink / Spark deployments, ideally on Kubernetes.
  • Open-source project contributions (README-only changes do not count).
  • A solid understanding of network routing protocols.
  • DBA experience.
  • Proficiency in at least two programming languages.
  • Exposure to, or interest in, machine learning and distributed data processing frameworks.
Benefits

  • Learning & Development
  • Family Leave (Maternity, Paternity)
  • Paid Time Off (Vacation, Sick & Public Holidays)

Recruitment Process

  • A 30-minute screening interview with a member of the Talent Acquisition team.
  • A 45-60 minute technical round with the Senior Data Engineering Manager.
  • A 45-60 minute technical round with the DevOps Lead.
  • A 30-45 minute final discussion with the Head of Data Engineering.