At LeverX, we have had the privilege of delivering over 950 projects. With 20+ years in the market, our team of 1,800 is strong, reliable, and always evolving: learning, growing, and striving for excellence.
We are looking for an Azure Data Engineer to join us. Let’s see if we are a good fit for each other!
WHAT WE OFFER:
- Projects in different domains: Healthcare, manufacturing, e-commerce, fintech, etc.
- Projects for every taste: Startup products, enterprise solutions, research & development projects, and projects at the crossroads of SAP and the latest web technologies.
- Global clients based in Europe and the US, including Fortune 500 companies.
- Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one.
- Healthy work atmosphere: On average, our employees stay in the company for 4+ years.
- Market-based compensation and regular performance reviews.
- Internal expert communities and courses.
- Perks to support your growth and well-being.
REQUIRED SKILLS:
- 4+ years of experience as a Data Engineer with a focus on Azure services such as Azure Data Factory, Azure SQL Database, Azure Synapse, or similar technologies.
- Strong SQL skills, including complex query development, optimization, and troubleshooting.
- Proficiency in at least one programming language, such as Python, C#, or Scala.
- Experience with DevOps practices, including version control and CI/CD pipelines.
- Familiarity with cloud-based data storage and processing (Azure Data Lake, Azure Databricks, etc.).
- English at B2 level or higher.
NICE-TO-HAVE SKILLS:
- Experience with dbt, the Snowflake data platform, Microsoft Fabric, Azure Cosmos DB, or Elasticsearch.
- Microsoft certifications such as Fabric Data Engineer Associate or Azure Data Engineer Associate.
RESPONSIBILITIES:
- Design, develop, and maintain efficient data pipelines and workflows.
- Develop SQL-based solutions for data extraction, transformation, and loading (ETL) processes.
- Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and provide optimized solutions.
- Manage and optimize data storage solutions, including data lakes and warehouses.
- Troubleshoot data-related issues and ensure the accuracy and integrity of data across all systems.