Principal Engineer / Data Platform Architect
Office Location: North America / US / Remote
Vector Flow (established 2019) is a next-generation, AI-powered, data-driven security optimization platform and applications company revolutionizing physical and logical security operations. The Vector Flow platform and applications can be deployed at a fraction of the time and cost of traditional security systems, deliver AI-driven results out of the box, and provide ongoing enhancements to security operations. Vector Flow promises performance, scalability, and configuration flexibility not found in legacy solutions on the market today.
Vector Flow’s platform and applications include Security Operations Center (SOC) alarm reduction and management; health and systems maintenance; KPI monitoring, reporting, and dashboards; identity and physical access lifecycle management; visitor management; and audit, compliance, and privacy solutions.
Vector Flow is built on a cutting-edge, data-driven architecture and licensed as a subscription, pay-as-you-use solution, making it easy, fast, and cost-effective to deploy and realize immediate business value.
The primary responsibilities and duties of the role are:
- Design and develop Vector Flow platform components, including data ingestion architecture, runtime automation engines, and data visualization
- Design and implement highly performant, scalable backend systems and algorithms
- Create data models and designs using SQL, NoSQL, and in-memory databases
- Build production quality solutions that balance complexity and performance
- Work closely with and mentor other team members to develop, test, deploy, and operate high-quality software
- Work effectively with cross-functional teams to understand requirements and engineer high-performance deliverables
Qualifications – Required
- Familiarity with databases, distributed systems, microservices, message bus/queues, and front-end UIs
- Ability to learn new technologies and work effectively in a fast-paced, dynamic environment
- Experience with databases (Oracle, MySQL, PostgreSQL, Elasticsearch, Redis) and web servers such as NGINX
- Experience running applications in containers (e.g., Docker)
- Experience with using queuing systems such as Kafka
- Experience in designing and implementing RESTful APIs
- Able to take on complex problems, learn quickly, iterate, and persevere towards bringing features from a design specification to production
Qualifications – Preferred
- Experience building APIs (development experience with GraphQL is a plus)
- Familiarity with issues of web performance, availability, scalability, reliability, and maintainability
- Knowledge of in-memory datastores such as Redis and of data platforms such as Spark or Databricks (protocols, functionality, performance tuning)
- Understanding of parallel processing concepts such as MapReduce
- Experience with large-scale data processing tools such as Spark and InfluxDB, and with fast in-memory data stores such as Redis, is a plus
- Understanding of or experience with SQL query optimization is a big plus
- Experience with the Docker and Kubernetes ecosystems is a plus
Education and/or Experience
- BS, MS, or PhD in Computer Science or a related field from a top-tier university
- 5+ years of experience with the backend web stack (Node.js, Java, Python, Docker, Kubernetes, SQL and NoSQL databases)
Compensation and Benefits
- Highly competitive, experience- and competency-based salary
- Industry-leading benefits package – health, dental, etc.
Languages
- English (primary)
- Other languages considered