- Booking.com
- Bucharest, Romania
- Full-Time
- 2 days ago
Booking Holdings Romania – Senior Data Engineer II: our view in 3 lines...
- The Role: A technical leader role for an experienced data engineer to design and operationalize cloud data pipelines and data products for Booking Holdings' brands.
- The Person: An experienced engineer who can lead the design and build of scalable cloud data pipelines and data products, maintain pipeline reliability and data quality, define data architecture and governance, and drive technical solutions across teams.
- Requirements: Proven knowledge of Python; experience with Spark, CDC, Kafka, Cassandra, Airflow, Snowflake, dbt, Terraform, Data Warehousing, and ETL/ELT pipelines.
Job Description
Booking Holdings Romania is a Center of Excellence based in Bucharest, Romania and was created to support the increasing business demands of the Booking Holdings Brands. The Center of Excellence provides access to specialized and highly skilled talent, leading industry best practices, and collaboration opportunities across all of our Brands.
As part of our Booking Holdings Romania team, you will have the opportunity to be part of the world’s leading provider of online travel, with a mission of making it easier for everyone to experience the world through five primary consumer-facing brands: Booking.com, Priceline, Agoda, KAYAK and OpenTable.
Role description
As a Senior Data Engineer II, you will act as a technical leader who drives data engineering strategy and delivery across the team. You will lead solution envisioning, technical design, and hands-on implementation. Through constant cross-functional interaction, you will influence and guide the business and technology strategies as they relate to data. You ask the right questions of the right people to align data strategy with commercial strategy, demonstrating both technical expertise and business knowledge.
In this role, you will innovate and operationalize data pipelines in a modern cloud environment (e.g., AWS, Snowflake), automate workflows (Airflow, Dagster), manage infrastructure as code (Terraform), and establish robust, auditable CI/CD practices. You’ll partner closely with Data Engineering, FP&A Reporting and analytics teams to deliver timely, reliable, and secure data solutions critical for regulatory (SOX) and business reporting.
This role provides a hybrid way of working with an onsite presence of 2 days/week.
Key Job Responsibilities and Duties
- Producing curated, reusable analytical data products that enable self-serve analytics for internal customers across departments.
- Modeling data following best practices and Data Warehousing methodologies such as Data Vault and Kimball dimensional modeling.
- Transforming large, complex data sets into pragmatic, actionable insights and providing them in a consumable format for historical or predictive analysis.
- Maintaining and tuning data pipeline health: troubleshooting issues, implementing data quality controls, monitoring performance, and proactively addressing issues and risks.
- Leading the technical resolution of problems and communicating them to both technical and non-technical audiences.
- Supporting product teams in defining the data architecture for their domains, from conceptual to physical modeling in the Data Warehouse.
- Driving a culture of data quality, data governance, and their best practices across the business unit.
- Driving the implementation of reliable, well-trusted metrics defined by the business, connecting disparate datasets into unified data products in the Lakehouse and/or Data Warehouse.
- Performing Data Governance responsibilities such as technical stewardship, data classification, compliance management, data quality monitoring, and security considerations.
- Working independently, self-steering initiatives, and defining and breaking down work for more junior members of the team.
- Mapping data flows between systems and workflows across the company to improve efficiency and resilience.
- Developing scalable, real-time, event-based streaming data pipelines to support internal and customer-facing use cases.
- Ensuring ongoing reliability and performance of data pipelines through proactive monitoring, end-to-end testing standards, and incident handling.
- Writing maintainable, reusable code by applying standard libraries and design patterns, and refactoring for simplicity and clarity.
- Developing scalable, extensible physical data models aligned with operational workflows and infrastructure constraints.
- Owning data applications end to end, defining and tracking SLIs and SLOs to ensure reliability and quality.
Role Qualifications and Requirements
- 7+ years of professional experience as a Software Developer and/or Data Engineer
- Education in Computer Science or a related field
- 2+ years of experience with data streaming; Flink knowledge is considered a plus
- Proven knowledge of Python is required; Java/Scala is a plus
- You have built production data pipelines in the cloud, setting up data-lake and serverless solutions, and have hands-on experience with schema design and data modeling
- You have experience designing systems end to end and technical awareness of concepts such as load balancing, databases, caching, and NoSQL
- You have knowledge of Spark, CDC, Kafka, Cassandra, Airflow, Snowflake, dbt, or equivalent tools
- Experience with Data Warehousing, ETL/ELT pipelines, and modeling techniques
- Excellent verbal and written communication skills
- Proven experience driving technical change and impact across multiple teams
- Excellent English communication skills and the ability to influence
Benefits & Perks
- Contributing to a high-scale, complex, world-renowned product and seeing the real-time impact of your work on millions of travelers worldwide
- Working in a fast-paced, performance-driven culture
- Advancing your technical, behavioral and interpersonal competence via on-the-job opportunities, experimental projects, hackathons, conferences and active community participation
- Competitive compensation and benefits package
- Vast amounts of data to validate your ideas and the opportunity to experiment with real users
Booking Holdings is proud to be an equal opportunity workplace and is an affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. We strive to move well beyond traditional equal opportunity and work to create an environment that allows everyone to thrive.
Pre-Employment Screening
If your application is successful, your personal data may be used for a pre-employment screening check by a third party as permitted by applicable law. Depending on the vacancy and applicable law, a pre-employment screening may include employment history, education and other information (such as media information) that may be necessary for determining your qualifications and suitability for the position.
