
Data Engineer

  • Hybrid
  • Bucharest, Romania

This role is foundational. We are looking for a senior individual who can serve as a key technical reference point and help develop the Data Engineering department from the ground up.

Job description

Take architectural ownership of delivering an international project while playing a foundational role in the strategic development and scaling of our emerging Data Engineering discipline.

Challenge

We are seeking a highly experienced Senior Data Engineer to join us. You will leverage your technical expertise to set best practices, establish technical standards, and lead an international project.


What’s in It for You?


  • Competitive salary that reflects your skills and impact.

  • Invest in your growth: We offer training and development opportunities.

  • Work on complex, impactful projects with international teams.

  • Regular, constructive feedback to help you level up.

  • A collaborative and inclusive culture built on trust and open communication.

  • Join a young, friendly team in our unique campus environment.

  • Comprehensive benefits, including a private medical subscription and more.

  • Access to online learning platforms. 


How Will You Make an Impact?

  • Department Development: Actively contribute to defining the technical vision, architecture, tool selection, and implementation standards for the new Data Engineering discipline.

  • Architectural Leadership: Lead the architectural design, construction, and maintenance of robust, scalable enterprise-level data management systems and pipelines (ETL/ELT) using PySpark and related technologies.

  • Data Strategy & Modeling: Drive the strategic implementation of dimensional models (Star/Snowflake schema, Data Vault) and manage data storage solutions in our Lakehouse environment (e.g., Microsoft Fabric, Azure Data Lake) to optimize data architecture and performance.

  • Advanced Implementation: Apply expert-level proficiency in Python and PySpark to implement complex data transformation and integration solutions.

  • Performance Optimization: Proactively identify and resolve complex data retrieval and pipeline performance bottlenecks by implementing advanced techniques such as partitioning, clustering, indexing, and sophisticated query tuning.

  • Governance & Quality: Design and enforce comprehensive data quality checks, monitoring, alerting, and robust data governance and security frameworks (RBAC).

  • Orchestration: Strategically utilize orchestration tools to automate, monitor, and manage enterprise-level data flows.

What You Need to Succeed

We’re looking for someone who’s reliable, passionate about data engineering, and who loves working with people. Here’s what will help you crush it in this role:

Job requirements

  • Minimum 5-7 years of hands-on experience in Data Engineering, including significant time spent in a senior or lead capacity.

  • Architect-level knowledge of SQL for complex data extraction, manipulation, and performance tuning.

  • Expert-level proficiency in Python and PySpark for architecting data manipulation and pipeline development.

  • Deep expertise in data warehousing concepts and advanced data modeling (dimensional/denormalized).

  • A proven track record of successfully delivering data projects.

  • Fluency in English (for working with international customers).

  • Fluency in Romanian (for working with the local team).

  • Proven ability to work effectively as a team player with excellent communication skills.

  • Experience in mentoring peers.

  • Strong self-organization, accountability, and professional maturity.

  • Openness to giving and receiving professional feedback.

Work Ethic

  • Continuous learner, staying up-to-date with industry trends.

Bonus Skills

  • Experience with Microsoft Fabric or Azure Synapse.

  • Deep experience with Change Data Capture (CDC) implementation and advanced incremental loading strategies.

  • Familiarity with containerization and DevOps practices (CI/CD) for deploying data services.
