DataOps Engineer

3 weeks ago

Vitória, Brazil · Trustly · Full-time
About the team

Trustly's DataOps team is responsible for delivering the data generated by our application, along with data from APIs and other tools, to the areas that need it. Everything is designed to be secure, structured, scalable, and generic, because we work with multiple environments in different regions and need to keep them consistent. We work on both the batch layer (using Airflow) and the streaming layer (Kafka), and we help other areas automate processes so that data is delivered more quickly and reliably. We also care about data quality, building an observability layer over our data so we can respond to inconsistencies and process failures preventively and immediately. In addition, we deliver products and services that make our data easier to use and query, maintaining tools such as Debezium and QuickSight. Last but not least, we work with the Data Science area to provide the support and infrastructure needed to put models into production. To achieve our goals, we follow good coding and development practices, apply end-to-end encryption in all our processes, use infrastructure as code, and more.
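To give a concrete flavor of the batch layer described above, here is a minimal sketch of an Airflow DAG (Airflow 2.x TaskFlow API; the DAG name, task names, and S3 path are hypothetical illustrations, not Trustly's actual pipelines) that lands a daily batch and gates it behind a data-quality check:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_pipeline():
    """Hypothetical daily batch: land data, then validate it before use."""

    @task
    def extract_orders() -> str:
        # Stand-in for a real extract: pull from an API or a CDC feed
        # (e.g., Debezium) and write the batch to the data lake.
        return "s3://example-lake/orders/latest/"  # hypothetical path

    @task
    def quality_check(path: str) -> None:
        # Stand-in data-quality gate: fail the run on inconsistencies
        # so downstream consumers never read a bad batch.
        row_count = 1_000  # would be a real count query in practice
        if row_count == 0:
            raise ValueError(f"No rows landed at {path}")

    quality_check(extract_orders())


daily_orders_pipeline()
```

Failing the DAG run at the quality gate, rather than letting bad data flow downstream, is one common way to make the observability layer "preventive and immediate" as described above.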

What you'll do:

- Work closely with the data team to build, scale, and optimize Trustly's big data environment, including the data lake setup, the AWS infrastructure stack, the BI data warehouse, and automation/visualization tools.
- Implement ETL processes and QA checks to ensure data in the data lake is accurate and up to date (see the sketch after this list).
- Implement data pipelines to automate the timely delivery of customer reporting and dashboards.
- Partner with DevOps to ensure our environment and tools comply with security protocols.
- Partner with data scientists to productionize machine learning models.
- Document data flows, the architectural setup, and the data model.
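As one possible shape for those QA checks, the sketch below (plain pandas; the column names and thresholds are assumptions for illustration) validates a batch for emptiness, duplicate keys, and staleness before it is published:

```python
import pandas as pd


def validate_batch(df: pd.DataFrame, min_rows: int = 1) -> None:
    """Raise if the batch fails basic accuracy/freshness checks.

    The rules and thresholds here are hypothetical; a real pipeline
    would externalize them as per-dataset configuration.
    """
    if len(df) < min_rows:
        raise ValueError(f"Batch too small: {len(df)} rows")
    if df["order_id"].duplicated().any():  # 'order_id' is an assumed key
        raise ValueError("Duplicate keys in batch")
    lag = pd.Timestamp.now(tz="UTC") - df["updated_at"].max()
    if lag > pd.Timedelta(hours=24):
        raise ValueError(f"Stale batch: newest record is {lag} old")


# Toy usage with an in-memory frame:
now = pd.Timestamp.now(tz="UTC")
validate_batch(pd.DataFrame({"order_id": [1, 2, 3], "updated_at": [now] * 3}))
```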

Who you are:

- Bachelor's or Master's degree in IT, Math, CS, Engineering, or another technical discipline.
- A successful history of building big data pipelines and data sets.
- Experience with AWS cloud services (DMS, EC2, EMR, RDS) and big data tools (Redshift).
- Desirable: experience with Spark and Delta Lake.
- Experience with relational databases (preferably Postgres), strong SQL coding, data modeling, and data warehouses.
- Experience with infrastructure as code (e.g., Terraform).
- Desirable: experience with Kubernetes and Docker.
- Experience with CI/CD.
- Experience with automation and workflow management tools (Airflow, Kubeflow).
- Intermediate Python programming skills.

Our perks and benefits:

- Bradesco health and dental plan for you and your dependents, with no co-payment;
- Life insurance with differentiated coverage;
- Meal voucher and supermarket voucher;
- Home office allowance;
- Wellhub: a platform that gives access to spaces for physical activities and to online classes;
- Trustly Club: discounts at educational institutions and partner stores;
- Monthly happy hours with an iFood coupon;
- English program: online group classes with a private teacher;
- Extended maternity and paternity leave;
- Birthday off;
- Flexible hours / home office: our culture is remote-first, so you can work from any city in Brazil;
- Welcome kit: we work with Apple equipment (MacBook Pro, iPhone) and send many more treats. Spoiler alert: the equipment can be purchased by you according to internal criteria;
- Annual premium: as a member of our team, you are eligible for an annual bonus, at the company's discretion, based on the achievement of our KPIs and your individual performance;
- Referral program: if you refer a candidate and we hire them, you will receive a reward.