
Sr. Data Architect
17 hours ago
We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.
Key Responsibilities:
- Build and optimize ETL pipelines with AWS Glue.
- Work with AWS S3, Glue, and SageMaker for data and AI workflows.
- Develop solutions in Python and SQL.
- Integrate data from Salesforce and APIs.
- Ensure data governance, documentation, and best practices.
Tech Stack:
- AWS (S3, Glue, SageMaker)
- Python, SQL
Requirements:
- Proven experience in data engineering with AWS.
- Experience with ETL, data modeling, and pipeline optimization.
- Advanced English (international collaboration).
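The role above centers on building ETL in Python; a minimal, self-contained sketch of an extract-transform-load step (the field names and sample data are hypothetical, and an in-memory list stands in for S3 or a warehouse):

```python
# Minimal ETL sketch: extract rows, apply a transformation, load to a target.
# All names and sample values are illustrative assumptions, not a real schema.
import csv
import io

RAW_CSV = "order_id,amount,currency\n1, 19.90 ,BRL\n2,5.00,usd\n"

def extract(raw: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Clean each row: strip whitespace, cast amounts, uppercase currency."""
    return [
        {
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"].strip()),
            "currency": r["currency"].strip().upper(),
        }
        for r in rows
    ]

def load(rows: list[dict], target: list) -> None:
    """Append cleaned rows to an in-memory target (stand-in for a warehouse)."""
    target.extend(rows)

warehouse: list[dict] = []
load(transform(extract(RAW_CSV)), warehouse)
```

In a Glue job the same extract/transform/load shape applies, with S3 sources and Glue DynamicFrames or Spark DataFrames in place of the in-memory structures.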
Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by the most rigorous global data protection laws, such as the GDPR, LGPD, CCPA, and CPRA. Candidate data shared with Avenue Code will be kept confidential and will not be transmitted to uninvolved third parties, nor used for purposes other than applications for open positions. As a consultancy, Avenue Code may share your information with its clients and with other companies of the CompassUol Group to which Avenue Code's consultants are allocated to perform its services.
The company and our mission:
Zartis is a digital solutions provider working across technology strategy, software engineering and product development.
We partner with firms across financial services, MedTech, media, logistics technology, renewable energy, EdTech, e-commerce, and more. Our engineering hubs in EMEA and LATAM are full of talented professionals delivering business success and digital improvement across application development, software architecture, CI/CD, business intelligence, QA automation, and new technology integrations.
We are looking for a Data Engineer to work on a project in the Technology industry.
The project:
Our teammates are talented people that come from a variety of backgrounds. We're committed to building an inclusive culture based on trust and innovation.
You will be part of a distributed team developing new technologies to solve real business problems. Our client empowers organizations to make smarter, faster decisions through the seamless integration of strategy, technology, and analytics. They have helped leading brands harness their marketing, advertising, and customer experience data to unlock insights, enhance performance, and drive digital transformation.
We are looking for someone with good communication skills, ideally with experience making decisions, being proactive, used to building software from scratch, and with good attention to detail.
What you will do:
- Designing performant data pipelines for the ingestion and transformation of complex datasets into usable data products.
- Building scalable infrastructure to support hourly, daily, and weekly update cycles.
- Implementing automated QA checks and monitoring systems to catch data anomalies before they reach clients.
- Re-architecting system components to improve performance or reduce costs.
- Supporting team members through code reviews and collaborative development.
- Building enterprise-grade batch and real-time data processing pipelines on AWS, with a focus on serverless architectures.
- Designing and implementing automated ELT processes to integrate disparate datasets.
- Collaborating across multiple teams to ingest, extract, and process data using Python, R, Zsh, SQL, REST, and GraphQL APIs.
- Transforming clickstream and CRM data into meaningful metrics and segments for visualization.
- Creating automated acceptance, QA, and reliability checks to ensure business logic and data integrity.
- Designing appropriately normalized schemas and making informed decisions between SQL and NoSQL solutions.
- Optimizing infrastructure and schema design for performance, scalability, and cost efficiency.
- Defining and maintaining CI/CD and deployment pipelines for data infrastructure.
- Containerizing and deploying solutions using Docker and AWS ECS.
- Proactively identifying and resolving data discrepancies, and implementing safeguards to prevent recurrence.
- Contributing to documentation, onboarding materials, and cross-team enablement efforts.
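One of the responsibilities above is automated QA checks that catch data anomalies before they reach clients; a minimal sketch of such a check in Python (the field names, sample batch, and the 5% null-rate threshold are illustrative assumptions):

```python
# Sketch of an automated QA check that flags anomalies in a batch of rows
# before it is published. Thresholds and field names are assumptions.

def qa_report(rows, required_fields, max_null_rate=0.05):
    """Return a list of human-readable issues found in a batch of rows."""
    issues = []
    if not rows:
        issues.append("empty batch")
        return issues
    for field in required_fields:
        # Count missing values (None or empty string) for this field.
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            issues.append(f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return issues

batch = [
    {"user_id": "u1", "event": "click"},
    {"user_id": "u2", "event": ""},
    {"user_id": None, "event": "view"},
]
problems = qa_report(batch, required_fields=["user_id", "event"])
```

In production, checks like this would run inside the pipeline (or via a framework such as Great Expectations) and block publication when `problems` is non-empty.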
What you will bring:
- Bachelor's degree in Computer Science, Software Engineering, or a related field; additional training in statistics, mathematics, or machine learning is a strong plus.
- 5+ years of experience building scalable and reliable data pipelines and data products in a cloud environment (AWS preferred).
- Deep understanding of ELT processes and data modeling best practices.
- Strong programming skills in Python or a similar scripting language.
- Advanced SQL skills, with intermediate to advanced experience in relational database design.
- Familiarity with joining and analyzing large behavioral datasets, such as Adobe and GA4 clickstream data.
- Excellent problem-solving abilities and strong attention to data accuracy and detail.
- Proven ability to manage and prioritize multiple initiatives with minimal supervision.
Nice to have:
- Experience working with data transformation tools such as dbt (Data Build Tool) or similar technologies.
- Familiarity with Docker containerization and orchestration.
- Experience in API design or integration for data pipelines.
- Development experience in a Linux or Mac environment.
- Exposure to data QA frameworks or observability tools (e.g., Great Expectations, Monte Carlo, etc.).
What we offer:
- 100% Remote Work
- WFH allowance: Monthly payment as financial support for remote working.
- Career Growth: We have established a career development program, accessible to all employees, with 360º feedback to help guide your career progression.
- Training: For tech training at Zartis, you have time allocated during the week at your disposal. You can choose from a variety of options, such as online courses (from Pluralsight and Educative.io, for example), English classes, books, conferences, and events.
- Mentoring Program: You can become a mentor at Zartis, receive mentorship, or both.
- Zartis Wellbeing Hub (Kara Connect): A platform providing sessions with a range of specialists, including mental health professionals, nutritionists, physiotherapists, and fitness coaches, as well as webinars with these professionals.
- Multicultural working environment: We organize tech events, webinars, parties, and online team-building games and contests.
About The Role
We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting, and decision-making across two distinct brands.
What You'll Do
- Build highly consumable and cost-efficient data products by synthesizing data from diverse source systems.
- Ingest raw data using Fivetran and Python, staging and enriching it in BigQuery to provide consistent, trusted dimensions and metrics for downstream workflows.
- Design, maintain, and improve workflows that ensure reliable and consistent data creation, proactively addressing data quality issues and optimizing for performance and cost.
- Develop LookML Views and Models to democratize access to data products and enable self-service analytics in Looker.
- Deliver ad hoc SQL reports and support business users with timely insights.
- (Secondary) Implement simple machine learning features into data products using tools like BQML.
- Build and maintain Looker dashboards and reports to surface key metrics and trends.
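A core task above is staging and enriching raw data to provide consistent, trusted dimensions across brands; the idea can be sketched in plain Python (the channel mapping is hypothetical, and in practice this logic would live in BigQuery/dbt models rather than application code):

```python
# Sketch of conforming raw source values into a trusted dimension so that
# both brands report against the same labels. Mapping values are assumptions.

CHANNEL_DIM = {
    "google / cpc": "Paid Search",
    "adwords": "Paid Search",
    "(direct)": "Direct",
    "newsletter": "Email",
}

def conform_channel(raw_value: str) -> str:
    """Map a raw source channel to its trusted dimension value."""
    return CHANNEL_DIM.get(raw_value.strip().lower(), "Other")

rows = [{"channel": "AdWords"}, {"channel": "(direct)"}, {"channel": "tiktok"}]
conformed = [conform_channel(r["channel"]) for r in rows]
```

Keeping the mapping in one place (a seed table or staging model) is what lets downstream LookML views and dashboards trust the dimension without re-deriving it.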
What We're Looking For
- Proven experience building and managing data products in modern cloud environments (GCP preferred).
- Strong proficiency in Python for data ingestion and workflow development.
- Hands-on expertise with BigQuery, dbt, Airflow and Looker.
- Solid understanding of data modeling, pipeline design and data quality best practices.
- Excellent communication skills and a track record of effective collaboration across technical and non-technical teams.
Why Join Kake?
Kake is a remote-first company with a global community — fully believing that it's not where your table is, but what you bring to the table. We provide top-tier engineering teams to support some of the world's most innovative companies, and we've built a culture where great people stay, grow, and thrive. We're proud to be more than just a stop along the way in your career — we're the destination.
The icing on the Kake:
- Competitive Pay in USD – Work globally, get paid globally.
- Fully Remote – Simply put, we trust you.
- Better Me Fund – We invest in your personal growth and passions.
- Compassion is Badass – Join a community that invests in social good.
Databricks Data Architect
Today
Job Title: Solution Architect – Databricks
Location: Remote Work
Duration: 12 months
About the Role
Celebal Technologies is seeking an experienced Data Architect with deep expertise in Azure Databricks, Unity Catalog, PySpark, and Delta Live Tables. In this role, you will lead the design, architecture, and delivery of advanced data solutions, collaborating closely with cross-functional teams to ensure scalable, secure, and high-performance implementations.
Key Responsibilities
- Lead solution architecture and design for Azure Databricks projects, ensuring best practices and performance optimization.
- Implement and manage Unity Catalog for data governance and security.
- Develop and optimize data pipelines using PySpark and Delta Live Tables.
- Collaborate with stakeholders to translate business requirements into technical solutions.
- Provide technical leadership and guidance to development teams.
- Ensure architecture compliance with enterprise standards and security policies.
- Support pre-sales activities, including solution design, effort estimation, and client presentations.
- Travel to client locations as needed for project delivery and stakeholder engagement.
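The pipelines described above are built with PySpark and Delta Live Tables; as a Spark-independent illustration, here is a plain-Python sketch of the kind of bronze-to-silver cleaning logic a DLT table with expectations would express (the schema and validation rules are assumptions, not a client specification):

```python
# Plain-Python sketch of a bronze -> silver cleaning step, analogous to a
# Delta Live Tables query with "expect or drop" expectations. The schema
# (id, amount) and the rules are illustrative assumptions.

def to_silver(bronze_rows):
    """Drop malformed records and normalize types, like a DLT expectation."""
    silver = []
    for row in bronze_rows:
        if not row.get("id"):
            continue                       # expectation: id must be present
        try:
            amount = float(row["amount"])  # expectation: amount is numeric
        except (KeyError, TypeError, ValueError):
            continue
        silver.append({"id": str(row["id"]), "amount": round(amount, 2)})
    return silver

bronze = [
    {"id": 1, "amount": "10.559"},
    {"id": None, "amount": "5"},   # dropped: missing id
    {"id": 2},                     # dropped: missing amount
]
silver = to_silver(bronze)
```

In Databricks the same rules would be declared with DLT expectations so that dropped-record counts surface in pipeline monitoring rather than disappearing silently.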
Required Skills & Qualifications
- Proven experience as a Solution Architect or Senior Data Engineer working with Azure Databricks.
- Hands-on expertise in Unity Catalog setup and governance.
- Strong proficiency in PySpark for data processing and transformation.
- Experience with Delta Live Tables for building automated and reliable data pipelines.
- In-depth understanding of Azure cloud services, data architecture, and big data technologies.
- Excellent problem-solving, communication, and stakeholder management skills.
Preferred Qualifications
- Experience with other Azure data services (e.g., Azure Data Factory, Azure Synapse).
- Familiarity with enterprise data governance frameworks.
- Prior consulting or client-facing experience in large-scale data projects.