Senior Data Engineer [AWS, Python, Snowflake] at DataArt


With more than 25 years of experience since our founding in 1997, teams of highly trained engineers around the world, deep industry sector knowledge, and ongoing technology research, we help clients create custom software that improves their operations and opens new markets.

The company’s main product combines an innovative analysis system with the results of statistical processing of data from tens of millions of users. This has made it possible to offer a convenient consumer service for mass use, both as a web application and as a mobile service.


Responsibilities

  • Build and support large-scale batch and real-time data pipelines with data processing frameworks like Spark and AWS managed services (a minimal sketch follows this list)
  • Use best practices in continuous integration and delivery
  • Help drive optimization, testing and tooling to improve data quality and our ability to use data to make product decisions
  • Collaborate with other software engineers, ML engineers and stakeholders, taking learning and leadership opportunities that will arise every single day
  • Collaborate with the analytics team to support their BI tools and initiatives to deliver the reliability, speed, and scalability of a data platform they’ll love working with
  • Work in multi-functional agile teams to continuously experiment, iterate and deliver on new product and infrastructure objectives
  • Optimize the existing data infrastructure to create a single version of the truth and standardize data into coherent formats for self-service
  • Keep abreast of new data storage, delivery, analysis, visualization, reporting techniques and software to develop more powerful data infrastructure
  • Be a trusted technical advisor to customers and solve complex data challenges
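
For illustration only, here is a minimal sketch of the kind of batch pipeline described in the first bullet above, written in PySpark. The S3 bucket names, paths, and column names are hypothetical assumptions, not details of the actual project.

    # Minimal PySpark batch-pipeline sketch (illustrative only).
    # Bucket names, paths, and column names are hypothetical assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-events-batch").getOrCreate()

    # Read raw JSON events landed in S3 (hypothetical location).
    raw = spark.read.json("s3://example-raw-bucket/events/")

    # Standardize the data into a coherent, query-friendly format.
    cleaned = (
        raw
        .withColumn("event_time", F.to_timestamp("event_time"))
        .withColumn("event_date", F.to_date("event_time"))
        .dropDuplicates(["event_id"])
        .filter(F.col("user_id").isNotNull())
    )

    # Write partitioned Parquet for downstream BI and self-service use.
    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/")
    )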

Requirements

  • Hands-on experience with data processing software and algorithms
  • General knowledge and experience with AWS: CloudFormation or CDK, Glue, Lambda
  • Hands-on experience with Spark
  • Experience in SQL
  • Experience with data projects
  • Experience with relational databases
  • Working experience with Python
  • Skills working with Airflow or a similar orchestration tool (see the sketch after this list)
  • Ability to work well within a team context, from collaborating with teammates and providing code reviews to mentoring and leading
  • Familiarity with and understanding of Agile software development approaches such as Kanban and Scrum
  • Upper-Intermediate English level
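
As a small illustration of the orchestration skills listed above, here is a minimal DAG sketch for a recent Airflow 2.x release. The DAG id, schedule, and placeholder callables are hypothetical assumptions, not part of the role description.

    # Minimal Airflow DAG sketch (illustrative only).
    # The DAG id, schedule, and placeholder callables are hypothetical assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull raw data from a source system.
        pass

    def transform():
        # Placeholder: clean and standardize the extracted data.
        pass

    def load():
        # Placeholder: load the result into the warehouse (e.g. Snowflake).
        pass

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load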

Optional

  • Experience with Snowflake

Conditions

  • Contractor scheme
  • Paid days off
  • Paid training
  • Remote/flex work
  • Paid certifications (AWS, GCP, Microsoft, among others)
  • English classes


  • Digital library: access to digital books or subscriptions.
  • Education stipend: DataArt covers some educational expenses related to the position.
  • Conference stipend: DataArt covers tickets and/or some expenses for conferences related to the position.

Remote work policy

Locally remote only

Position is 100% remote, but candidates must reside in Uruguay, Colombia or Brazil.
