Waymo is an autonomous driving technology company with the mission to be the most trusted driver. Since its start as the Google Self-Driving Car Project in 2009, Waymo has focused on building the Waymo Driver—The World's Most Experienced Driver™—to improve access to mobility while saving thousands of lives now lost to traffic crashes. The Waymo Driver powers Waymo One, a fully autonomous ride-hailing service, and can also be applied to a range of vehicle platforms and product use cases. The Waymo Driver has provided over one million rider-only trips, enabled by its experience autonomously driving tens of millions of miles on public roads and tens of billions of miles in simulation across 13+ U.S. states.
Want to build the next generation of data infrastructure and data models for driverless cars? Join the Waymo Commercialization team and help design the future of transportation! We are looking for a passionate Data Engineer who wants to help build the business-critical data systems that power Waymo's ride-share fleet optimization.
You will report to our Engineering Director and Site Lead in Warsaw.
You Will
- Translate requirements into conceptual, logical, and physical data models.
- Define core data concepts for Waymo commercialization and implement models in the database.
- Design and build data warehouse and pipeline solutions. Ingest and transform raw data into forms that can be easily used to generate reports and insights for Waymo's commercialization tracking and optimization.
- Ensure proper handling of PII, implement a data quality framework, and maintain documentation.
- Be highly collaborative: work closely with data producers and data consumers across DS, PM, and eng roles at Waymo to understand data needs, provide consultation, and align on data solutions.
You Have
- 5+ years of professional experience in data engineering or a related field
- Proven track record of building complex data engineering projects from conception to deployment
- Experience leading projects and driving collaboration with cross-functional teams
- Experience designing and implementing data warehousing solutions such as Google BigQuery and Snowflake, and optimizing them for complex analytical queries and reporting
- Deep knowledge of relational and NoSQL databases
- Expertise in designing and implementing scalable data models, data quality frameworks, and data governance practices
We Prefer
- Familiarity with distributed processing frameworks and tools such as Spark, Hadoop, or Kafka for handling massive datasets in real-time or batch analytics
- Deep understanding of data privacy, security, quality, correctness, and efficiency tradeoffs
- Proven track record of driving collaboration effectively across organizational boundaries, building multi-dimensional stakeholder relationships, and exchanging ideas across teams to achieve broad organizational goals