
Data Engineer

Key Skills: ETL, Java, Microsoft Azure Synapse Analytics, Python, Snowflake on Microsoft Azure

· We are seeking hands-on Data Engineer consultants to build out the next-generation data warehouse/mesh for the organization, solving data availability and access issues across the organization and enabling a graph of connectivity between hundreds of data sets. We need people who are enthusiastic about enabling internal and external clients by streamlining and facilitating easy access to their critical data, which is well-defined and has established, transparent levels of quality. This engineer will leverage our data platforms to achieve this while providing critical input to extend data platform capabilities. Familiarity with ETL and cloud-platform data pipeline solutions is critical, as is REST API authoring for data access.
·   Member of the Business Data Engineering team, working to deliver data ingest/enrichment pipelines and access APIs using common cloud technologies. Work with consumers to understand their data requirements and deliver data contracts with well-defined SLIs to track SLAs.
· Harness modern application best practices: code quality, API test coverage, Agile development, DevOps, observability, and support.
·   Maintain programming standards and ensure the usage of the pattern/template for API Proxy.
·   Conduct code reviews and maintain automated test coverage
·   Standardize the CI/CD setup for API management tools and automated deployment.
· Utilize problem-solving skills to help your peers research and select tools, products, and frameworks that are vital to supporting business initiatives
Mandatory Skills Description:
Qualifications & Experience
·   5+ years of proven industry experience; bachelor's degree in IT or related fields
·  Hands-on development expertise in Java, Python, GraphQL, SQL, JUnit, Spring Boot, OpenAPI, Spark, Flink, Kafka
·  Experience working in cloud data platforms such as Azure, Snowflake, Yellowbrick, SingleStore, Google BigQuery
·  Understanding of databases, API frameworks, and governance frameworks, with expertise in hosting and managing platforms such as Hadoop, Spark, Flink, Kafka, and Spring Boot; BI tools such as Tableau and Alteryx; and governance tools such as Collibra, Soda, and Amazon Deequ
·  Strong understanding of the Twelve-Factor App methodology
·  Solid understanding of API and integration design principles and patterns, with experience in web technologies.
·  Design object-oriented, modularized, clean, and maintainable code and create policies in Java, JavaScript, Node.js, Python, etc.
·   Experience with test-driven development and API testing automation.
· Demonstrated track record of full project lifecycle and development, as well as post-implementation support activities
·  Hands-on experience designing and developing high-volume REST APIs using standard API protocols and data formats.
Nice-to-Have Skills:
Additional Qualifications
·  Financial experience: public and alternatives asset management
·  Familiarity with NoSQL/NewSQL databases
·   Working with Azure API and DB Platforms
·   Strong documentation capability and adherence to testing and release management standards
· Design, development, modification, and testing of databases that support Data Warehousing and BI business teams
·   Familiarity with SDLC methodologies, and defect tracking (JIRA, Azure DevOps, ServiceNow, etc.)

Soft Skills:
·   Candidate must have an analytical and logical thought process for developing project solutions
·   Strong interpersonal and communication skills; works well in a team environment
·   Ability to deliver under competing priorities and pressures.
·   Excellent organizational skills in code structuring and partitioning, commenting, and documentation, for team alignment and ease of modification