Big Ideas. Real People.
At Orca, in the right environment and with the right team,
talent has no boundaries. This team spirit, together with our drive to always aim high, has quickly earned us unicorn status and turned us into a global cloud security innovation leader. So if you're ready to join an amazing team of people who inspire each other every day, now is the time to find your place in our pod.
We're looking for driven and talented people like you to join our R&D team and our mission to change the future of cloud security.
Ready to dive in and swim with our pod?
Highlights
- High-growth: Over the past six years, we've consistently hit milestones that take other companies a decade or more to reach. During this time, we've significantly grown our employee base, expanded our customer reach, and rapidly advanced our product capabilities.
- Disruptive innovation: Our founders saw that traditional security didn't work for the cloud—so they set out to carve a new path. We're relentless pioneers who invented agentless technology and continue to be the most comprehensive and innovative cloud security company.
- Well-capitalized: With a valuation of $1.8 billion, Orca is a cybersecurity unicorn dominating the cloud security space. We're backed by an impressive group of investors, including CapitalG, ICONIQ, GGV, and SVCI, a syndicate of CISOs who invest their own money after conducting due diligence.
- Respectful and transparent culture: Our executives pride themselves on being accessible to everyone and believe in sharing knowledge with employees. Every employee has a place in shaping the future of our industry.
About the role
As a Senior Software Engineer on the Data Platform, you'll be part of one of Orca's most strategic engineering groups — tasked with building the core data ingestion and processing infrastructure that powers our entire platform. The team is responsible for handling billions of cloud signals daily, ensuring scalability, reliability, and efficiency across Orca's architecture.
You'll work on large-scale distributed systems, own critical components of the cloud security data pipeline, and drive architectural decisions that influence how data is ingested, normalized, and made available to product teams across Orca. We're currently in the midst of a major architectural transformation, evolving our ingestion and processing layers to support real-time pipelines, improved observability, and greater horizontal scalability, and we're looking for experienced engineers who are eager to make a foundational impact.
Our Stack: Python, Go, Rust, SingleStore, Postgres, Elasticsearch, Redis, Kafka, AWS
On a typical day you'll
- Write clean, concise code that is stable, extensible, and unit-tested appropriately
- Write production-ready code that meets design specifications, anticipates edge cases, and accounts for scalability
- Diagnose complex issues, then evaluate, recommend, and execute the best solution
- Implement new requirements within our Agile delivery methodology while following our established architectural principles
- Lead initiatives end to end—from design and planning to implementation and deployment—while aligning cross-functional teams and ensuring technical excellence
- Test software to ensure proper and efficient execution and adherence to business and technical requirements
- Provide input into the architecture and design of the product, collaborating with the team to solve problems the right way
- Develop expertise in AWS, Azure, and GCP products and technologies
About You
- Bachelor's degree in Computer Science or Engineering, or equivalent relevant experience
- 5+ years of professional software development experience
- Proven experience building data-intensive systems at scale
- Experience working with microservice architectures and cloud-native services
- Solid understanding of software design principles, concurrency, synchronization, memory management, data structures, and algorithms
- Hands-on experience with databases such as SingleStore, Postgres, Elasticsearch, and Redis
- Experience with Python / Go (Advantage)
- Experience with distributed data processing tools like Kafka (Advantage)