Haystack News

Data Engineer at Haystack News

Closed job - No longer receiving applicants

Haystack TV is the leading local & world news service on Connected TVs, reaching millions of users! This is a unique opportunity to work at Haystack TV – named "Best TV Experience" by Google together with Netflix. We are one of the fastest-growing TV startups in the world and are already preloaded on 37% of all TVs shipped in the US! Be part of a Silicon Valley startup and work directly with the founding team. Jumpstart your career by working with Stanford & Carnegie Mellon alumni and faculty who have already been part of other successful startups in Silicon Valley.

You should join us if you're hungry to learn how Silicon Valley startups thrive, you like to ship quickly and often, love to solve challenging problems, and like working in small teams.

Job description

(IMPORTANT: Your job application must be in English.)
We are looking for a creative and passionate Data Engineer to help build our Personalization and Analytics team. You will bring experience and expertise in data warehouse architecture, analytics tools, and streaming data solutions across a variety of platforms. You should have a passion for working with Big Data solutions at scale and the ability to work with large, complex datasets leveraging cloud-based solutions.
In this role you will own the data engineering that supports multiple teams and help execute the strategic vision and roadmap for the team.
Key Responsibilities Include

  • Lead the effort to build world-class data solutions leveraging the cloud ecosystem.
  • Work closely with Data Scientists and Software Engineers to ensure solutions deliver the data needed for modeling and applications in an automated and timely fashion.
  • Create and manage business intelligence dashboards to monitor the health of the business.

Minimum requirements

  • Bachelor's degree in Computer Science, Mathematics, Statistics, or a similar analytical field (Master's or PhD a big plus!)
  • 2+ years of experience building and operating highly available, distributed systems for the extraction, ingestion, processing, and analysis of large data sets
  • Expertise with data analytics tools including SQL, R, and Spark
  • Expert knowledge of managing and maintaining relational databases (e.g. Postgres)
  • Experience with at least one NoSQL database (e.g. MongoDB)
  • Proficiency in at least one programming language, preferably Python

Desirable skills

SQL, R, Spark, PostgreSQL, MongoDB
