
Backend Team Leader (AI Data platform)

Overview
Skills
  • SQL ꞏ 4y
  • Node.js ꞏ 5y
  • Kafka ꞏ 4y
  • MySQL ꞏ 4y
  • PostgreSQL ꞏ 4y
  • Docker
  • Kubernetes
  • RabbitMQ ꞏ 4y

What does Dataloop do?

Dataloop plays a central role in the advancement of artificial intelligence by addressing a critical dependency: model accuracy and relevance hinge on high-quality training, fine-tuning, and Retrieval-Augmented Generation (RAG) data. Our platform equips enterprise AI teams with comprehensive tools for constructing robust data pipelines and orchestrating unstructured data at scale. This includes managing, preparing, and enriching data to fuel a wide range of AI solutions, spanning predictive analytics, generative models, and intelligent agents. By leveraging Dataloop's capabilities, AI teams can expedite their workflows and achieve superior AI performance faster.


About the position:

We are looking for a highly motivated and experienced engineer with a strong background in data engineering or data management to join our R&D team as a Backend Team Lead.

This role involves leading and mentoring a team of backend and full-stack engineers to design and build a robust system for managing unstructured data at scale. The goal is to deliver an exceptional user experience for data scientists using our application. As a key member of our global team, you will collaborate closely with R&D, product, AI engineering, and solution engineering teams to continuously improve our data services based on data-driven insights and direct customer feedback.


As a Team Lead, your responsibilities will include:

  • Leading a team of skilled software engineers in developing complex flows and business logic while upholding high standards for data scale, integrity, and security.
  • Taking a hands-on approach by contributing to the code and the product.
  • Defining and tracking team KPIs.
  • Leading and scaling the engineering team, promoting high standards in execution, code quality, and customer satisfaction.
  • Ensuring the data architecture aligns with our short-term and long-term objectives.
  • Leading the research and implementation of new technologies.
  • Developing comprehensive system specifications, including API designs, data flow diagrams, and integration strategies.
  • Driving the complete lifecycle of complex technical projects, from initial concept to final deployment.
  • Promoting architectural consistency and efficiency across various teams and repositories.


What’s it like to work at Dataloop?

Working at Dataloop offers a stimulating environment where you will tackle complex challenges on a versatile platform used by enterprise clients globally, demanding high scalability and continuous availability within a rapidly evolving industry. You will collaborate with skilled engineers, utilizing cutting-edge technologies such as Kubernetes, cloud orchestration, large-scale unstructured data management, and MLOps.

Dataloop fosters a culture that values creativity, commitment, and personal responsibility. Your contributions will have a tangible impact, and your insights will be actively considered. We are dedicated to your professional development, providing mentorship, opportunities for continuous learning, and a clear trajectory for career advancement.

Requirements



  • 10+ years of experience in backend engineering, with a significant focus on data management systems and practices.
  • 5+ years of in-depth experience developing robust and scalable Node.js services, demonstrating a strong understanding of asynchronous programming and performance optimization in data-intensive applications.
  • 4+ years of hands-on experience with various SQL databases (e.g., PostgreSQL, MySQL), including advanced query optimization, schema design, and data integrity management at scale.
  • 4+ years of practical experience working with distributed messaging systems such as Kafka or RabbitMQ, including design patterns for reliable data streaming and processing.
  • Proven track record of architecting and developing large-scale data solutions, encompassing comprehensive software design and development, sophisticated database architectures, robust security protocols for sensitive data, and advanced performance tuning techniques for high-throughput data pipelines.
  • Solid understanding of and practical experience with iterative and Agile methodologies such as Scrum, Kanban, or Lean-based approaches to software delivery, with a focus on incorporating data-centric considerations into the development lifecycle.
  • Exceptional debugging, analytical, and problem-solving skills, specifically applied to complex data-related challenges and system bottlenecks.
  • Strict adherence to coding standards and meticulous attention to efficiency, particularly in the context of data processing and storage.


Advantageous Skills:

  • Experience working in a BigData production environment, including familiarity with data warehousing, ETL/ELT processes, and related technologies.
  • Proven experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes (K8S), especially in deploying and managing data services.
  • Experience with Test-Driven Development (TDD) practices to ensure the reliability and quality of data-centric backend systems.

Dataloop AI