Paris, IDF, France
Full-time
Just Hack it!
With 73 million tracks and a presence in 180 countries, Deezer is the most personal music streaming service in the world.
Behind the code and the pixels is our team of 600 music lovers, and we’re building something incredible together. Want in? If you’re looking for an adventure, not just a job, and you fancy seeing ideas come to life in a heartbeat, you’re in the right place.
We dare to challenge the status quo and believe innovation is part of our DNA.
Our scope covers everything related to content. It starts with ingesting new deliveries (music releases, but also podcasts and lyrics, among others) into our catalog; continues with cleaning and curating that catalog through internal tools and automation; goes on with serving this content back to our end users through our different access points (mobile and desktop apps, website, APIs); and concludes with reporting highly detailed usage data to content providers and rights holders for analysis and royalties distribution.
In short: we connect Deezer users and content creators through robust ingestion, engaging display and smart data feedback.
What you will do:
Working from several hundred million listening events per day, autonomously design and develop reliable data pipelines and be accountable for them
Play a part in royalties calculation, reporting efforts, catalog metadata reconciliation and feeding data insights to Deezer For Creators
Participate in the data community by collaborating with other data engineers, analysts and scientists to define simple & scalable architectures
Contribute to the team’s success through close collaboration with developers, product managers, business stakeholders and internal users
What we are looking for:
3+ years' experience in software engineering using Big Data technologies
Strong engineering skills (code design and quality, tests, reviews, logging, monitoring, continuous integration, DevOps)
Experience in building and maintaining data pipelines in production
Ability to understand complex architectures and explain them
Rigorous and enthusiastic team player, passionate about problem solving
Proficient in Java, Scala, or Python
Bonus points:
Previous experience with our stack
Experience with streaming pipelines
Experience with building HTTP APIs
Our stack:
ETL: Scala + Spark
Analysis: BigQuery / DataStudio
Database: Cassandra, ClickHouse, MySQL
Infra: GCP + on-premises Kubernetes
You can work up to 10 days per month remotely (= 50% of the time)
LIFE @ DEEZER PARIS
If you feel like this is the right opportunity for you, press play!
We are an equal opportunity employer.