
Senior Data Engineer (Snowflake)

Written by Liza | Jan 6, 2026 12:23:29 PM

At LeverX, we have had the privilege of delivering over 1,500 projects for clients across a wide range of industries. With 20+ years in the market, our team of 2,200+ professionals is strong, reliable, and always evolving: learning, growing, and striving for excellence.

We are looking for a Senior Data Engineer to join us. Let’s see if we are a good fit for each other!


what we offer:

  • Projects in different domains: healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: startup products, enterprise solutions, research & development initiatives, and projects at the crossroads of SAP and the latest web technologies.
  • Global clients based in Europe and the US, including Fortune 500 companies.
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one.
  • Healthy work atmosphere: On average, our employees stay with the company for 4+ years.
  • Market-based compensation and regular performance reviews.
  • Internal expert communities and courses.
  • Perks to support your growth and well-being.

required skills:

  • 5+ years of hands-on experience in Data Engineering.

  • Strong expertise in Snowflake, including architecture, advanced SQL, Snowpark (Python), and data governance (RBAC, masking policies), as illustrated in the short sketch after this list.

  • Solid experience with dbt (models, tests, snapshots, incremental strategies, CI/CD).

  • Experience with data orchestration tools (Airflow, Dagster, Prefect) or Snowflake Tasks and Streams.

  • Proven ability to design and maintain scalable Data Warehouse or Data Lakehouse architectures.

  • Experience optimizing costs by managing Snowflake credits, warehouse configurations, and storage usage.

  • Strong Python skills for data transformation, automation, and scripting.

  • English B2+.
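
To give a flavour of the Snowflake and Snowpark work these requirements refer to, here is a minimal sketch in Python (snowflake-snowpark-python): a small transformation followed by a dynamic masking policy. Every connection parameter, table, column, and role name is a hypothetical placeholder, not a detail of any real project.

```python
# Minimal Snowpark sketch: transform a table, then attach a masking policy.
# All names below (account, tables, columns, roles, warehouse) are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, upper

# Hypothetical connection config; real credentials would come from a secrets store.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "role": "DATA_ENGINEER",
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS",
    "schema": "STAGING",
}).create()

# A toy transformation: keep active customers and normalize their country code.
customers = session.table("RAW.CRM.CUSTOMERS")
active = (
    customers
    .filter(col("STATUS") == "active")
    .with_column("COUNTRY_CODE", upper(col("COUNTRY_CODE")))
)
active.write.mode("overwrite").save_as_table("ANALYTICS.STAGING.ACTIVE_CUSTOMERS")

# Governance: a dynamic masking policy that reveals e-mail addresses only to a
# (hypothetical) PII-approved role, applied to the freshly written table.
session.sql("""
    CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.STAGING.EMAIL_MASK
    AS (val STRING) RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
             ELSE '***MASKED***' END
""").collect()
session.sql("""
    ALTER TABLE ANALYTICS.STAGING.ACTIVE_CUSTOMERS
    MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS.STAGING.EMAIL_MASK
""").collect()
```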

nice-to-have skills:

  • SnowPro Core or SnowPro Advanced Architect certifications.
  • Experience building production-grade dashboards in BI tools (Tableau, Looker, Superset, Metabase).
  • Experience with cloud platforms (AWS/Azure/GCP), including IAM, S3/Blob Storage, and networking.


responsibilities:

  • Design and build scalable data pipelines in Snowflake using SQL, Snowpark (Python), and Stored Procedures.

  • Own Snowflake security and governance, including RBAC, dynamic data masking, row access policies, and object tagging.

  • Develop and optimize data ingestion using Snowpipe, COPY INTO, and external stages (a minimal sketch follows this list).

  • Optimize performance through virtual warehouse sizing, query profiling, clustering, and search optimization.

  • Implement data transformations and semantic layers using dbt, applying medallion or Data Vault modeling patterns.

  • Build reliable orchestration and CI/CD for data pipelines using Snowflake Tasks & Streams, external orchestrators, and infrastructure as code.

  • Collaborate with and mentor engineers, partner with analytics and ML teams, and drive continuous improvements across the data platform.
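
As a rough illustration of the ingestion and orchestration items above (external stages, COPY INTO, Tasks and Streams), the sketch below creates the corresponding objects through Snowpark. Every object name, bucket URL, schedule, warehouse, and column is an assumption made for the example only.

```python
# Ingestion + orchestration sketch: external stage, one-off COPY INTO, then a
# stream + scheduled task. All object names, URLs, and schedules are hypothetical.
from snowflake.snowpark import Session


def build_orders_ingestion(session: Session) -> None:
    """Create a stage, load raw files, and merge new rows on a schedule."""
    # External stage over an assumed S3 bucket with a pre-created storage integration.
    session.sql("""
        CREATE STAGE IF NOT EXISTS RAW.ORDERS.LANDING_STAGE
          URL = 's3://example-bucket/orders/'
          STORAGE_INTEGRATION = S3_INT
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """).collect()

    # One-off bulk load; in production, Snowpipe would run the same COPY
    # automatically whenever new files land on the stage.
    session.sql("""
        COPY INTO RAW.ORDERS.ORDERS_RAW
        FROM @RAW.ORDERS.LANDING_STAGE
    """).collect()

    # Change tracking on the raw table...
    session.sql("""
        CREATE STREAM IF NOT EXISTS RAW.ORDERS.ORDERS_RAW_STREAM
        ON TABLE RAW.ORDERS.ORDERS_RAW
    """).collect()

    # ...and a task that picks up newly inserted rows every 15 minutes.
    # The hypothetical column list keeps the stream's metadata columns out.
    session.sql("""
        CREATE TASK IF NOT EXISTS RAW.ORDERS.LOAD_ORDERS_TASK
          WAREHOUSE = TRANSFORM_WH
          SCHEDULE = '15 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS.ORDERS_RAW_STREAM')
        AS
          INSERT INTO ANALYTICS.STAGING.ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT, ORDER_TS)
          SELECT ORDER_ID, CUSTOMER_ID, AMOUNT, ORDER_TS
          FROM RAW.ORDERS.ORDERS_RAW_STREAM
          WHERE METADATA$ACTION = 'INSERT'
    """).collect()

    # Tasks are created suspended; resume to start the schedule.
    session.sql("ALTER TASK RAW.ORDERS.LOAD_ORDERS_TASK RESUME").collect()
```

In a real setup, this DDL would typically live in version-controlled infrastructure as code, and the scheduled task could just as well be replaced by an external orchestrator such as Airflow, Dagster, or Prefect.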