Job Title : Data Engineer
Location : Bogota, Colombia (Remote)
Duration : + Months
We’re looking for a Data Engineer to join our dynamic and rapidly scaling team. We’re making this investment to help us optimize our digital channels and technology innovations, with the end goal of creating competitive advantages for our restaurants around the globe. We’re looking for a strong engineer who brings fresh ideas from past experience and is eager to tackle new challenges in our company.
We’re in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (Python and SQL). The data engineer will build a variety of data pipelines and models to support advanced AI / ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.
Daily Responsibilities :
- Partner with clients to build data pipelines to enable best-in-class restaurant technology solutions.
- Play a key role in our Data Operations team - developing data solutions responsible for driving Client Growth.
- Design and develop data pipelines – streaming and batch – to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
- Contribute to standardizing and developing a framework that extends these pipelines across brands and markets.
- Develop on the Client data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-of-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points.
Skills and Qualifications :
- Vast background in all things data-related
- AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
- Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus
- High level of proficiency with SQL (Snowflake a big plus)
- Proficiency with Python for transforming data and automating tasks
- Experience with Kafka, Pulsar, or other streaming technologies
- Experience orchestrating complex task flows across a variety of technologies
- Bachelor’s degree from an accredited institution or relevant experience