Data Engineer (Python) at Ensitech


We are a company dedicated to custom software development and cloud technologies. We have more than 18 years in the market and have recently entered the American market by placing highly specialized personnel with clients in the United States, working remotely.

Our current engagement is 100% remote: dynamic projects where developers of different levels collaborate on updating, maintaining, and configuring systems on various financial servers.

We hope you will join the ENSITEAM!

Job responsibilities

  • As a data engineer, you will be in charge of designing, developing, and maintaining systems that process large volumes of data, keeping it available to other data engineering specialists and analysts.
  • Utilize agile project management principles, with a minimum of 3 years of agile project experience, using tools such as JIRA.
  • Work with technologies including DBT, Talend, and Apache Airflow to develop robust data pipelines and workflows.
  • Focus heavily on SQL database design, utilizing your expertise to design efficient and scalable data models.
  • Collaborate with cross-functional teams to implement and enhance data solutions on platforms such as Snowflake, Oracle, SQL Server, PostgreSQL, and MySQL.

Job requirements

  • Key required skills: PySpark, Python, Apache Airflow, Databricks, Azure Synapse, ADF, Git, ADLS, SQL.
  • Minimum of a bachelor’s degree in Computer Science, Engineering, or a similar field.
  • 5–7+ years of project experience, preferably as a Data Engineer/Developer; a minimum of 3 years of agile project experience is a must (preferred tool: JIRA).
  • Must have exposure to technologies such as DBT, Talend, and Apache Airflow.
  • Strong focus on SQL, including experience with SQL database design.
  • Experience with data platforms: Snowflake, Oracle, SQL Server, PostgreSQL, and MySQL.
  • Lead R&D efforts to find solutions for data engineering requirements not addressed by existing technology standards.
  • Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
  • Develop metrics that illuminate the flow of data across the organization.
  • Experience in data modeling and relational database design.
  • Experience in AWS and Azure data platforms.
  • Preferred: strong programming/scripting skills (Python, PowerShell, etc.).

Benefits

  • 100% remote opportunity.
  • Competitive salary and benefits.
  • Custom benefit package aligned to your experience and your needs.
  • Working with foreign clients.

Remote work policy

Fully remote

Candidates can reside anywhere in the world.
