We are seeking a Data Engineer Tech Lead to join our capital markets data engineering teams, focusing on designing, building, and maintaining scalable data infrastructure on the Databricks platform. This role requires deep technical expertise in modern data stack technologies and the ability to work with complex financial data systems.
Key Responsibilities
Technical Development
- Design and implement robust, scalable data pipelines using Databricks, Apache Spark, Delta Lake, and BigQuery
- Use SQL and Python to develop, scale, and optimize advanced data pipelines
- Build and optimize ETL/ELT processes for capital markets data
- Develop real-time and batch processing solutions to support trading and risk management operations
- Implement data quality monitoring, validation, and alerting systems
Platform Engineering
- Configure and optimize Databricks workspaces, clusters, and job scheduling
- Work in a multi-cloud environment spanning Azure, GCP, and AWS
- Implement security best practices including access controls, encryption, and audit logging
- Build integrations with market data vendors, trading systems, and risk management platforms
- Establish monitoring and performance tuning for data pipeline health and efficiency
Collaboration & Mentorship
- Collaborate with stakeholders across the company and support business insight requests
- Work closely with quantitative researchers, risk analysts, and product teams to understand data requirements
- Collaborate with other data engineering teams and infrastructure groups
- Provide technical guidance to junior engineers and contribute to code reviews
- Participate in architecture discussions and technology selection decisions
Required Qualifications
- 5+ years of data engineering experience with strong expertise in Apache Spark and distributed computing
- Strong programming skills in Python, including data-handling libraries (pandas, NumPy), and experience with data modeling
- Proficient in Databricks platform and Delta Lake for data lake architecture
- Advanced SQL skills, including writing complex queries, with experience in both relational and NoSQL databases
- Experience with cloud data warehouses (Google BigQuery, Snowflake)
- Bachelor's degree in Computer Science, Engineering, Finance, or related field
- Experience integrating multiple data sources and working with a variety of database technologies
Preferred Qualifications
- Experience with Azure cloud platform and associated data services (Data Factory, Event Hubs, Storage)
- Experience with managed Kubernetes services (EKS/AKS)
- Knowledge of data streaming platforms (Kafka, Azure Event Hubs) for real-time processing
- Experience working in a multi-cloud environment