Lead Data Engineer at Bertoni Solutions

  Remote (South America) | Expert | Full time | Data Science / Analytics

Requires applying in English

We are Bertoni Solutions, a multinational team uniting Latin American talent and Swiss precision. Since 2016, we have helped clients translate technology into success by combining the creativity and passion of Latin American professionals with the rigor and organizational mindset of Swiss culture. Our development team is distributed across Latin America, including Lima, and we deliver consulting, software, and digital transformation services to organizations worldwide, always working 100% remotely. We focus on innovative, impactful projects that leverage the latest technology trends and digital transformation initiatives, offering a truly multicultural and global collaboration environment.


Key Responsibilities

  • Design and develop scalable data pipelines using PySpark to support analytics and reporting needs.
  • Write efficient SQL and Python code to transform, cleanse, and optimize large datasets.
  • Collaborate with machine learning engineers, product managers, and developers to understand data requirements and deliver effective solutions.
  • Implement and maintain robust ETL processes to integrate structured and semi-structured data from multiple sources.
  • Ensure data quality, integrity, and reliability across all pipelines and systems.
  • Participate in code reviews, troubleshooting, and performance tuning activities.
  • Work independently and proactively identify and resolve data-related issues.
  • Contribute to cloud-based data solutions, specifically on Azure platforms such as Azure Data Factory, Synapse, ADLS, Databricks, and Fabric.
  • Support cloud migration initiatives and adopt DevOps best practices.
  • Provide guidance on best practices and mentor junior team members when needed.

Job Description and Requirements

We are looking for a highly skilled Lead Data Engineer with extensive experience in PySpark, SQL, Python, Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. The candidate must have a solid understanding of ETL processes, data warehousing principles, and end-to-end data engineering workflows.

Required Skills and Experience:

  • 8+ years of experience working in cross-functional teams that include machine learning engineers, developers, product managers, and analytics teams.
  • 3+ years of hands-on experience developing and managing data pipelines using PySpark.
  • 3 to 5 years of experience with Azure-native services such as Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure SQL Database, and Microsoft Fabric.
  • Strong programming skills in Python and SQL.
  • Proven expertise in ETL processes, data modeling, and end-to-end data warehousing solutions.
  • Self-driven, resourceful, and comfortable working in fast-paced, dynamic environments.
  • Excellent communication skills and advanced fluency in written and spoken English (B2, C1, or C2 only).
  • Demonstrated leadership experience in current or previous projects.
  • Must be located in Central or South America due to nearshore position requirements.

Soft Skills: Effective communication, proactive problem-solving, collaboration, mentoring capacity, and adaptability to changing priorities and technologies are crucial to succeed in this role.

Desirable Skills

  • Databricks certification.
  • Knowledge of DevOps practices, including CI/CD pipelines and cloud migration best practices.
  • Familiarity with Azure Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
  • Basic understanding of SAP HANA.
  • Intermediate experience with Power BI.

Why Join Us?

  • 100% Remote Work.
  • Multicultural and Diverse Environment.
  • Continuous Professional Development.
  • Challenging and Innovative Projects.

More Details:

  • Contract type: Independent contractor (This contract does not include PTO, tax deductions, or insurance. It only covers the monthly payment based on hours worked).
  • Location: The client is based in the United States; however, the position is 100% remote for nearshore candidates located in Central or South America.
  • Contract/project duration: Initially 6 months, with extension possibility based on performance.
  • Time zone and working hours: Full-time, Monday to Friday (8 hours per day, 40 hours per week), from 8:00 AM to 5:00 PM PST (U.S. time zone).
  • Equipment: Contractors are required to use their own laptop/PC.
  • Start date expectation: As soon as possible.
  • Payment methods: International bank transfer, PayPal, Wise, Payoneer, etc.

GETONBRD Job ID: 54948

Remote work policy

Locally remote only

Position is 100% remote, but candidates must reside in South America.


About Bertoni Solutions

At Bertoni Solutions we work 100% remotely, blending Latin American talent with Swiss precision.
