SnapLogic Developer

2 months ago


Belo Horizonte, Minas Gerais, Brazil · YASH Technologies · Full-time
Job Summary

The primary objective of this role is to support specialized utility programs that use software packages and data warehouses for the Direct Procurement functional area. The analyst enhances and supports software for client use, with the aim of optimizing operational efficiency for procurement functions.

Key Responsibilities
  1. Developing SnapLogic pipelines hands-on, with strong debugging skills.
  2. Applying proficiency in the SnapLogic Expression Language.
  3. Working with various data formats and protocols, such as JSON, XML, REST, SOAP, and JDBC.
  4. Developing interfaces, writing documentation, troubleshooting, and fixing bugs.
  5. Applying a solid understanding of data integration concepts, ETL (Extract, Transform, Load) processes, and data quality management.
  6. Designing and developing data integration pipelines on the SnapLogic platform to connect systems, applications, and data sources.
  7. Building and configuring SnapLogic components such as Snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping.
  8. Conducting unit, integration, and performance testing of integration solutions to ensure quality and reliability.
  9. Maintaining documentation of data integration workflows, including design specifications, configuration details, and operational procedures.
Requirements
  1. Strong verbal and written communication, problem-solving, customer-service, and interpersonal skills.
  2. Excellent troubleshooting skills.
  3. SnapLogic integration and pipeline development experience.
  4. Well-versed in the software development life cycle (SDLC) and IT controls.
  5. Able to deliver independently and collaborate effectively.
  6. Commitment to staying current with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.
  7. Ability to monitor and optimize data integration workflows to ensure efficient data transfer and processing.
Education and Experience
  1. Bachelor's degree in Computer Science or equivalent training required.
  2. 4-8 years of related experience required.