Lead Data Engineer

4 days ago


Guarapari, Brazil · Fusemachines · Full-time

About Fusemachines

Fusemachines is a leading AI strategy, talent, and education services provider. Founded by Sameer Maskey, Ph.D., Adjunct Associate Professor at Columbia University, Fusemachines has a core mission of democratizing AI. With a presence in four countries (Nepal, United States, Canada, and the Dominican Republic) and more than 450 employees, Fusemachines seeks to bring its global expertise in AI to transform companies around the world.

Location

Remote (Full-time)

About the Role

This is a remote full-time position, responsible for designing, building, testing, optimizing, and maintaining the infrastructure and code required for data integration, storage, processing, pipelines, and analytics (BI, visualization, and advanced analytics) from ingestion to consumption, implementing data flow controls, and ensuring high data quality and accessibility for analytics and business intelligence purposes. This role requires a strong foundation in programming and a keen understanding of how to integrate and manage data effectively across various storage systems and technologies.

We're looking for someone who can quickly ramp up, contribute right away, and lead the work in Data & Analytics, helping from backlog definition to architecture decisions and providing technical leadership to the rest of the team with minimal oversight. We are looking for a skilled Sr. Data Engineer / Technical Lead with a strong background in Python, SQL, PySpark, Redshift, and AWS cloud-based large-scale data solutions, and a passion for data quality, performance, and cost optimization. The ideal candidate will work in an Agile environment and would ideally also have GCP experience, to contribute to the migration from AWS to GCP. This role is perfect for an individual passionate about leading, leveraging data to drive insights, improving decision-making, and supporting the strategic goals of the organization through innovative data engineering solutions.

Qualifications / Skill Set

• Full-time Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
• 5+ years of real-world data engineering development experience in AWS and GCP (certifications preferred)
• Strong expertise in Python, SQL, PySpark, and AWS in an Agile environment, with a proven track record of building and optimizing data pipelines, architectures, and datasets, and proven experience in data storage, modeling, management, lakes, warehousing, processing/transformation, integration, cleansing, validation, and analytics
• A senior engineer who can understand requirements and design end-to-end solutions with minimal oversight
• Strong programming skills in one or more languages such as Python or Scala, and proficiency in writing efficient and optimized code for data integration, storage, processing, and manipulation
• Strong knowledge of SDLC tools and technologies, including project management software (Jira or similar), source code management (GitHub or similar), CI/CD systems (GitHub Actions, AWS CodeBuild, or similar), and binary repository managers (AWS CodeArtifact or similar)
• Good understanding of data modeling and database design principles; able to design and implement efficient database schemas that meet the requirements of the data architecture and support data solutions
• Strong SQL skills and experience working with complex data sets, enterprise data warehouses, and advanced SQL queries
• Proficient with relational databases (RDS, MySQL, Postgres, or similar) and NoSQL databases (Cassandra, MongoDB, Neo4j, etc.)
• Skilled in data integration from different sources such as APIs, databases, flat files, and event streaming
• Strong experience implementing efficient ELT/ETL data pipelines, batch and real-time, in AWS and with open-source solutions, able to develop custom integration solutions as needed, including integration from sources such as APIs (PoS integrations a plus), ERPs (Oracle and Allegra a plus), databases, flat files, Apache Parquet, and event streaming, along with cleansing, transformation, and validation of the data
• Strong experience with scalable and distributed data technologies such as Spark/PySpark, DBT, and Kafka for handling large volumes of data
• Experience with stream-processing systems (Storm, Spark Streaming, etc.) is a plus
• Strong experience designing and implementing data warehousing solutions in AWS with Redshift, including efficient ELT/ETL processes that extract data from source systems, transform it (DBT), and load it into the data warehouse
• Strong experience in orchestration using Apache Airflow
• Expert in cloud computing on AWS, with deep knowledge of services such as Lambda, Kinesis, S3, Lake Formation, EC2, EMR, ECS/ECR, IAM, and CloudWatch
• Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure data is accurate, complete, and consistent
• Good understanding of BI solutions, including Looker and LookML (Looker Modeling Language)
• Strong knowledge and hands-on experience with DevOps principles, tools, and technologies (GitHub and AWS DevOps), including continuous integration and delivery (CI/CD), infrastructure as code (Terraform), configuration management, automated testing, performance tuning, and cost management and optimization
• Strong problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues
• Strong leadership skills, with a willingness to lead, create ideas, and be assertive
• Strong project management and organizational skills
• Excellent communication skills to collaborate with cross-functional teams, including business users, data architects, DevOps/DataOps/MLOps engineers, data analysts, data scientists, developers, and operations teams; able to convey complex technical concepts and insights to non-technical stakeholders effectively
• Ability to document processes, procedures, and deployment configurations

Responsibilities

• Design, implement, deploy, test, and maintain highly scalable and efficient data architectures, defining and maintaining standards and best practices for data management independently with minimal guidance
• Ensure the scalability, reliability, quality, and performance of data systems
• Mentor and guide junior and mid-level data engineers
• Collaborate with Product, Engineering, Data Scientists, and Analysts to understand data requirements and develop data solutions, including reusable components
• Evaluate and implement new technologies and tools to improve data integration, processing, and analysis
• Design architecture, observability, and testing strategies, and build reliable infrastructure and data pipelines
• Take ownership of the storage layer and data management tasks, including schema design, indexing, and performance tuning
• Swiftly address and resolve complex data engineering issues and incidents, and remove bottlenecks in SQL queries and database operations
• Conduct discovery on the existing data infrastructure and the proposed architecture
• Evaluate and implement cutting-edge technologies and methodologies, continuing to learn and expand skills in data engineering and cloud platforms to improve and modernize existing data systems
• Evaluate, design, and implement data governance solutions: cataloging, lineage, quality, and governance frameworks suitable for a modern analytics solution, following industry-standard best practices and patterns
• Define and document data engineering architectures, processes, and data flows
• Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive)
• Be an active member of our Agile team, participating in all ceremonies and continuous improvement activities

Equal Opportunity Employer

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.


  • Data Engineer

    2 days ago


    Guarapari, Brazil · Monks · Full-time

    Join to apply for the Data Engineer role at Monks. As Monks expands our Global Enterprise Analytics capabilities, we are looking for someone to help scale our capabilities. This role will involve developing, implementing, and maintaining data pipelines, systems, and models to enable data-driven decision making. The role includes supporting junior members in...


  • Guarapari, Brazil · Bebeefullstack · Full-time

    Experienced Data Engineer. About the Role: We are seeking a talented Senior Full-Stack Engineer to lead our data engineering initiatives. As a key member of our team, you will design and implement end-to-end data pipelines using Python, focusing on delivering high-quality business intelligence solutions. Key responsibilities: develop and maintain robust ETL...


  • Guarapari, Brazil · Bebeedataengineer · Full-time

    Key Data Engineer Role. We are looking for a skilled Data Engineer to join our team and participate in various projects. The ideal candidate will have experience with data analysis, querying, and reporting. The successful candidate will be responsible for collaborating with stakeholders, cleansing and massaging data, creating datasets and reports, and deploying...


  • Guarapari, Brazil · Welocalize · Full-time

    18 hours ago. Welo Data works with technology companies to provide datasets that are high-quality, ethically sourced, relevant, diverse, and scalable to supercharge their AI models. As a Welocalize brand, WeloData leverages over 25 years of experience in partnering with the world's most innovative companies and brings together a...


  • Guarapari, Brazil · Kake · Full-time

    Senior Integrations Engineer. Summary: We're looking for a Senior Integrations Engineer to join one of our partners focused on delivering intelligent, data-driven e-commerce solutions. You'll be responsible for building and maintaining backend services and data integrations using NestJS and Python, ensuring smooth data flows between systems and scalability for...


  • Guarapari, Brazil · Bebeecareer · Full-time

    Unlock Your Data Engineering Potential. We're seeking a highly skilled Data Engineer to join our team and contribute to the success of our organization. Key responsibilities: Snowflake, DBT, and SQL expertise; fluent English skills; Agile methodologies; operational monitoring: proactively monitor data jobs and pipelines for smooth execution and timely delivery of...


  • Guarapari, Brazil · Oleve · Full-time

    About Oleve: Oleve builds and manages a suite of mobile and web products that power experiences for millions of users worldwide. We combine thoughtful design, strong engineering, and data-driven iteration to create software that is fast, beautiful, and reliable. Our tech stack spans Next.js, React Native, and Python, allowing us to deliver consistent...


  • Guarapari, Brazil · Bebeedata · Full-time

    Seeking a seasoned data professional to lead our business intelligence efforts. We are looking for an expert in data analysis and visualization to join our team. The ideal candidate will have a proven track record of delivering high-quality insights and driving informed decision-making. This is a unique opportunity for a talented individual to make a...

  • Data Architect

    4 days ago


    Guarapari, Brazil · Allata · Full-time

    Join to apply for the Data Architect (Azure) role at Allata. Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision,...


  • Guarapari, Brazil · Jpmorganchase · Full-time

    Join to apply for the HP Non-stop Security Engineer - Infrastructure Engineer III role at JPMorganChase. Overview: As an HP Non-Stop Security Engineer at JPMorgan Chase within the Technology Solutions department, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and...