DataOps / DevOps Specialist
If you’re passionate about modern data platforms, automation, and cloud innovation, this is your chance to make an impact at one of the Nordics’ leading financial institutions.
Job Title: DataOps / DevOps Specialist
Location: Hybrid (Stockholm, Sweden – mix of office and remote work)
Start Date: ASAP
Utilization: 100%
Language: English (Swedish is a plus)
About the Client:
We are a progressive tech company powered by innovation and collaboration. From COBOL to AI, we offer one of the most diverse tech ecosystems in the Nordics. Together, we’re building sustainable technology for the future — an inclusive, forward-thinking environment where every idea matters.
Are you ready to take on a new challenge and help shape our next-generation data platform?
About the Assignment:
Our client is modernizing its data platform to ensure a future-proof, compliant, and scalable foundation for its data-driven operations. The current on-premise data lake (based on Cloudera-supported Apache technologies such as Spark, NiFi, and Airflow) will be migrated to a modern, cloud-native architecture on Google Cloud Platform (GCP).
We are looking for a DataOps / DevOps Specialist who will play a key role in supporting this migration. You will be part of the DataOps DCA Tribe, contributing hands-on to architecture design, migration activities, automation, and continuous delivery pipelines — helping build a secure, efficient, and sustainable platform for the future.
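To give a concrete flavour of the work (illustrative only, not part of the formal assignment description): one common migration pattern is to keep the existing PySpark logic unchanged and move only the orchestration and compute, so that an Airflow DAG submits the job to a Dataproc cluster instead of the on-premise Cloudera cluster. The sketch below assumes Airflow 2.x with the apache-airflow-providers-google package installed; the project, region, cluster, and bucket names are placeholders, not client systems.

```python
"""Illustrative sketch: an Airflow DAG that submits an existing PySpark job
to a GCP Dataproc cluster. All identifiers are placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PROJECT_ID = "example-project"      # placeholder GCP project
REGION = "europe-north1"            # placeholder region
CLUSTER_NAME = "example-cluster"    # placeholder Dataproc cluster

# Job spec in the format expected by the Dataproc API: it points at the
# same PySpark entry point previously run on the on-premise cluster.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/etl_job.py"},
}

with DAG(
    dag_id="spark_etl_on_dataproc",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    submit_spark_job = DataprocSubmitJobOperator(
        task_id="submit_spark_job",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
```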
Key Responsibilities:
- Support the migration of the on-premise data lake to Google Cloud Platform (GCP).
- Design, build, and maintain data pipelines and DevOps automation workflows.
- Implement containerized environments using Docker and Kubernetes.
- Apply Infrastructure as Code (IaC) principles using Terraform.
- Optimize data processing using Apache Spark, NiFi, and other Apache ecosystem tools (a minimal Spark example is sketched after this list).
- Collaborate with teams on CI/CD pipelines, monitoring, and observability improvements.
- Ensure compliance with regulatory and data security standards.
- Contribute to platform performance optimization and documentation.
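For illustration only (the concrete pipelines are defined within the tribe), a typical Spark batch step in such a pipeline might look like the minimal sketch below. All paths and column names are placeholder assumptions; the same code runs against HDFS paths on-premise or gs:// paths on Dataproc.

```python
"""Illustrative sketch: a minimal PySpark batch aggregation step.
Paths and column names are placeholders."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_daily_aggregation").getOrCreate()

# Read raw events (Parquet); swap the gs:// prefix for an HDFS path on-premise.
events = spark.read.parquet("gs://example-bucket/raw/events/")

# Aggregate per customer and day; column names are assumptions for the example.
daily_totals = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("customer_id", "event_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("event_count"),
    )
)

# Write partitioned output for downstream consumers.
daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/curated/daily_totals/"
)

spark.stop()
```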
Required Qualifications:
- 4–8 years of experience in DataOps or DevOps roles.
- Strong hands-on experience with:
  - Apache Spark
  - Docker and Kubernetes
  - Infrastructure as Code (Terraform)
  - Apache NiFi
  - Python and SQL
  - ETL processes and database management
  - Cloudera ecosystem
  - Scala
Meritorious Skills:
Cloud & Modern Stack:
- Google Cloud Platform (GCP), especially Dataproc
- Observability and monitoring tools (e.g., Prometheus, Kibana, Elasticsearch)
- CI/CD pipelines
- YAML, Linux, Windows
- Event streaming services on GCP (e.g., Pub/Sub)
- Command-line interface (CLI) proficiency
On-Premise Experience:
- Apache Airflow
- Data lake and legacy system management
Recruitment Partner: Sperton
This position is exclusively managed by Sperton, a global talent partner connecting high-performing professionals with leading organizations worldwide.
- Department: Information Technology
- Locations: Sweden
- Remote status: Hybrid
- Job location: Stockholm, Sweden
Workplace & Culture at Sperton
At Sperton, we believe that great results come from great people.
Our culture is built on trust, collaboration, and a shared passion for delivering quality in everything we do.
We are a Norwegian-owned international company with colleagues across Europe, Asia, and the USA, working together seamlessly across time zones and cultures. Our teams are diverse, yet united by the same goal — to connect people and companies in meaningful ways.
We value openness, initiative, and continuous learning. Everyone at Sperton is encouraged to take ownership, share ideas, and challenge existing ways of working to make our solutions even better.
Even though we operate globally, our approach is personal. We take pride in creating a supportive and inclusive environment where people feel heard, respected, and motivated to grow — both professionally and personally.