Kafka DevOps (AWS) at Sngular

Closed job - No longer receiving applicants

Sngular is a rapidly growing information technology solutions provider and a trusted technology partner to global companies who are leading digital transformation in industries as diverse as financial services, health, retail, and telecommunications. Founded in Spain more than 20 years ago, with offices in the US, Mexico, Chile and Singapore, we help companies harness and leverage today’s most cutting-edge digital technologies to create value and grow. As we like to say, if a customer needs it, “It Can Be Done.” For more information on our US footprint, range of technologies, services, and experience visit www.sngular.com.

Job Responsibilities

  • Knowledgeable in administering Kafka clusters.
  • Assist with deployments of KSQL and UDF/UDAF artifacts.
  • Fine-tune Kafka and resolve performance issues.
  • Support testing activities.
  • Hands-on experience maintaining and managing Linux servers.
  • Experience with AWS platform services.
  • Experience with IaC tooling (CloudFormation, Terraform, Ansible).

Job Requirements

Technical Skills Required:


  • Knowledgeable in administering Kafka clusters.
  • Hands-on production experience and a deep understanding of Kafka architecture and internals, along with the interplay of its components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams
  • Strong fundamentals in Kafka administration, configuration, and troubleshooting
  • Fine-tune Kafka and resolve performance issues.
  • Knowledge of Kafka clustering, and its fault-tolerance model supporting HA and DR
  • Best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to effectively use topics, partitions, and consumer groups to provide optimal routing and support QoS
  • Experience with Kafka Streams / KSQL architecture and associated clustering model
  • Strong knowledge of the Kafka Connect framework, with experience using several connector types: HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce
  • Knowledge of connectors available from Confluent and the community
  • Familiarity with the Schema Registry
  • Experience monitoring Kafka infrastructure and related components (connectors, KStreams, and other producer/consumer apps)
  • Familiarity with Confluent Control Center


  • Assist with deployments of KSQL and UDF/UDAF artifacts through pipelines.
  • Support testing activities.
  • Hands-on experience maintaining and managing Linux servers.
  • Experience with AWS platform services.
  • Experience with IaC tooling (CloudFormation, Terraform, Ansible).

Developer (nice to have):

  • Practical experience with how to scale Kafka, KStreams, and Connector infrastructures, with the motivation to build efficient platforms
  • Hands-on experience as a developer using the Kafka API to build producer and consumer applications, along with expertise implementing KStreams components; has developed KStreams pipelines and deployed KStreams clusters
  • Experience developing KSQL queries and knowledge of best practices for choosing KSQL vs. Kafka Streams
  • Strong understanding of relational and NoSQL databases (Postgres, Mongo, Redis, etc), SQL, and database/schema design
  • Hands-on experience in designing, writing and operationalizing new Kafka Connectors using the framework
  • Development languages: solid programming proficiency with Java, Scala, Node.js, Python, or .NET, and development best practices


Strong communication and client-facing skills, the ability to work across all levels of the business, and banking/financial services experience.

English: fluent.


We will also consider highly experienced Kafka developers, especially those who have worked with Confluent Cloud and are comfortable in Linux environments, i.e., who work well from the command line and can write scripts.


  • An excellent work environment based on commitment and peer support.
  • Competitive salary with statutory benefits (100% on payroll), adapted to the candidate's skills and experience
  • Social benefits (major medical expense insurance, minor medical expense insurance, an excellent vacation policy, grocery vouchers, a savings fund, and more)
  • A commitment to continuous training.

Remote work policy

Locally remote only

Position is 100% remote, but candidates must reside in Mexico or Chile.
