Roles:
Data
Must-have skills:
Python, SQL, NoSQL
One of skills:
AWS, Azure, GCP
Nice-to-have skills:
Scala, Ruby, Go
Considering candidates from:
Europe
Work arrangement: Onsite
Industry: Insurance
Language: English
Level: Middle or senior
Required experience: 2+ years
Relocation: Paid
Visa support: Not provided
Size: 501 - 1000 employees
Company
CoverWallet, an Aon company, is the leading digital insurance platform for small and medium-sized businesses. They are dedicated to making insurance simple, fast and convenient so that businesses around the world can get the protection they need and get back to what matters most - growing and managing their business. Powered by deep analytics, thoughtful design, and state-of-the-art technology, CoverWallet is reinventing the $200 billion commercial insurance market for small and medium-sized businesses.
Description
They are currently looking for a Data Engineer who will work with business stakeholders and the Engineering team (mostly Data Scientists, Data Analysts, Backend Engineers, and SREs). The role has a real impact on the overall design of the data architecture that supports CoverWallet's growth, and calls for exceptional skills in designing, developing, testing, maintaining, and optimizing data management systems.
This will be a critical position, and the person who leads this effort will be fundamental in the success of CoverWallet’s growth strategy.
The position is open in Madrid and Valencia.
Tasks:
- Work in a multi-cloud environment (GCP/AWS) with cutting-edge technologies such as Apache Airflow, Redshift, Kubernetes, MongoDB, and AWS Lambda, and use Python as your main programming language (but we are open to others).
- Develop, maintain, and optimize high-quality, reliable, and robust data pipelines on Kubernetes and Airflow that turn data into valuable information.
- Design and deploy high-load microservices with strong quality requirements and high business value for the development of Machine Learning models and data tools.
- Continuously improve your technical knowledge and expertise by implementing solutions based on open-source projects (e.g. dbt).
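To illustrate the kind of pipeline step described in the tasks above, here is a minimal extract-transform-load sketch in plain Python. It uses the standard-library sqlite3 in place of a real warehouse like Redshift, and the table and column names (raw_quotes, premium_summary) are hypothetical, not from CoverWallet's actual schema:

```python
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Toy ETL step: extract raw quote rows, normalize premiums, load a summary table."""
    cur = conn.cursor()

    # Extract: raw quotes with premiums stored in cents (hypothetical schema).
    rows = cur.execute("SELECT business_id, premium_cents FROM raw_quotes").fetchall()

    # Transform: convert cents to euros and aggregate per business.
    totals: dict[str, float] = {}
    for business_id, premium_cents in rows:
        totals[business_id] = totals.get(business_id, 0.0) + premium_cents / 100.0

    # Load: write the aggregated result to a reporting table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS premium_summary "
        "(business_id TEXT PRIMARY KEY, total_premium_eur REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO premium_summary VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()
    return len(totals)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_quotes (business_id TEXT, premium_cents INTEGER)")
    conn.executemany(
        "INSERT INTO raw_quotes VALUES (?, ?)",
        [("acme", 12500), ("acme", 7500), ("globex", 30000)],
    )
    print(run_etl(conn))  # number of businesses summarized
```

In production, a step like this would typically run as a task inside an Airflow DAG on Kubernetes, with the extract and load halves pointed at the actual warehouse rather than an in-memory database.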
Must-have:
- Bachelor’s Degree in Computer Science or a similar field.
- 3+ years working in data engineering.
- 2+ years of experience with AWS/GCP stack or another cloud computing platform is a must.
- Solid background in DWH solutions such as Amazon Redshift.
- Expertise in SQL/NoSQL databases (PostgreSQL, MongoDB, Redis, etc.).
- Coding proficiency in Python is a must.
- Experience with CI/CD pipelines (Jenkins, Bitbucket, Github, Gitlab, etc.).
- Experience working with bash scripting, Docker, Kubernetes, and Linux environments on a daily basis.
- Strong written and verbal communication, presentation, and technical writing skills.
- A team player who is passionate about technology and self-motivated.
- Strong communication skills in English.
Nice-to-have:
- At least one programming language (Scala, Go, or Ruby) in addition to Python is highly desirable.
- Experience with ETL and Airflow.
- Experience working in Agile environments, preferably with Jira and Asana, and the ability to work in a fast-paced environment is a huge plus.
Benefits:
- The opportunity to disrupt one of the biggest industries in one of the most developed digital markets in the world.
- Competitive and flexible compensation (restaurant tickets, transport card, daycare checks, and external training).
- Annual variable bonus.
- Company-paid Life and Accident Insurance, and Medical insurance as benefits.
- Additional budget for individual education and online training courses (€1,500 per year).
- Top vacation days per year.
- Flexible working hours.
- Team building activities: hackathons, meetups and tech talks in the office, code katas.
- A fun, multicultural, and fast-paced environment.
- Drinks, coffee, and fruits.
Interview Process:
- Intro call with Toughbyte
- 15-minute cultural fit interview
- 2 technical interviews
- Final interview