Data Engineer

Overview
Skills
  • Python
  • Kotlin
  • Kafka
  • Redis
  • PostgreSQL
  • MongoDB
  • Jenkins
  • GitHub Actions
  • AWS
  • Kubernetes
  • Docker
  • Helm
  • Terraform
  • Grafana
  • Airflow
  • EKS
  • ELT
  • ETL
  • EventBridge
  • Lambda
  • S3
  • Sagemaker
  • Athena
  • ECS
  • GitLab CI
  • Kinesis
  • Prometheus
  • Argo
  • Datadog
Description

Rekor Systems (NASDAQ: REKR) is the global authority in Roadside Intelligence. We fuse computer vision, connected vehicles, and third-party mobility data into real-time insights that make transportation safer, smarter, and more efficient. Our customers include public safety agencies, urban planners, toll operators, and traffic management centers around the world.

Data Engineer

Rekor processes tens of terabytes of data and billions of events daily, and we’re looking for an experienced Data Engineer to join our Global Data & Analytics Organization. As a data engineer, you’ll design and maintain streaming and batch data pipelines, ensuring clean, reliable, high-quality data at scale while collaborating with Product, Engineering, and Data Science teams. The ideal candidate thrives in fast-changing environments, takes initiative, and values collaboration, candor, and continuous improvement.

Key Responsibilities

  • Champion data culture and practices, enabling the team to build better, more reliable, automated, and secure data products faster.
  • Design and implement streaming and batch data pipelines.
  • Implement data quality and integrity monitoring to proactively identify and fix data issues.
  • Design and implement solutions to support data lakes and data warehousing.
  • Work across diverse teams in Product Management, Software, Hardware, DevOps, MLOps, Data Science and Data Governance to ensure that the data infrastructure meets solution requirements.
  • Collaboratively determine how best to manage data under business, technical, compliance, privacy and ethical constraints.
  • Work with other data SMEs and stakeholders to ensure Rekor’s data assets are FAIR (Findable, Accessible, Interoperable, Reusable).
  • Contribute to an open, creative and collaborative culture where everyone feels accountable for shaping the future of the team and of Rekor.
  • Contribute to the adoption and integration of Generative AI tools and practices to enhance developer productivity and code quality.

Requirements

  • BS/MS in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
  • 5+ years of experience in data engineering, data infrastructure, or backend development.
  • Proven experience designing, building, and maintaining cloud-based data pipelines and internal data tools.
  • Must-have: Strong hands-on experience with AWS services (e.g., S3, ECS/EKS, Lambda, EventBridge, Sagemaker, Athena).
  • General understanding of running workloads in a Kubernetes environment (deployment, scaling, monitoring).
  • Must-have: Proficiency in Python (data workflows and automation).
  • Experience with databases such as PostgreSQL, Redis, or MongoDB.
  • Experience with messaging and streaming systems (Kafka, Kinesis, or similar).
  • Solid understanding of data engineering concepts — data modeling, ETL/ELT, data quality, and observability.
  • Familiarity with system design and software engineering best practices.

Preferred Qualifications

  • Experience with Infrastructure as Code (Terraform).
  • Exposure to CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI).
  • Familiarity with observability and monitoring tools (Prometheus, Grafana, Datadog).
  • Experience with data orchestration or workflow tools (Airflow, Argo).
  • Knowledge of container build and deployment best practices (Docker, Helm).
  • Background in system performance optimization and cost-efficient cloud design.
  • Experience with Kotlin (backend or tool development).



Job Location:

Tel-Aviv, Israel
