Blue Signal Search
Spatial Perception Systems Engineer
Location: San Francisco, CA (Hybrid or Onsite Preferred)
Type: Full-Time | Autonomy / AR/VR / Robotics
Our client is on a mission to redefine how machines understand and interact with the physical world. As part of their cutting-edge research and development team, they are seeking a Spatial Perception Systems Engineer to develop computer vision and sensor fusion technologies powering next-gen autonomous systems and immersive environments. This role is ideal for candidates eager to push the envelope in robotics, AR/VR, and spatial computing, working hands-on with advanced hardware and real-time applications.
Key Responsibilities:
• Design and deploy computer vision pipelines including object detection, segmentation, tracking, and simultaneous localization and mapping (SLAM).
• Integrate vision and perception modules into physical systems such as autonomous platforms, AR headsets, and edge devices.
• Optimize algorithm performance on embedded platforms and hardware accelerators (e.g., NVIDIA Jetson, Apple Vision Pro, Qualcomm chipsets).
• Collaborate closely with interdisciplinary teams in robotics, machine learning, and embedded hardware to ensure seamless end-to-end system performance.
• Manage, annotate, and preprocess large-scale image and video datasets for supervised and self-supervised learning applications.
Ideal Background:
• 3+ years of experience in computer vision development for real-time or embedded environments.
• Solid foundation in deep learning (CNNs, transformers), geometric vision, and image processing techniques.
• Experience with tools and libraries such as PyTorch, TensorFlow, OpenCV, ROS, and 3D processing frameworks like Open3D or PCL.
• Familiarity with stereo imaging, camera calibration, depth estimation, and environmental variability challenges (e.g., motion blur, occlusion).
• Ability to analyze and address edge-case scenarios using robust perception techniques.
Bonus Qualifications:
• Prior hands-on experience in robotics, drones, self-driving vehicles, or AR/VR platforms.
• Deployment experience using inference optimization libraries such as ONNX Runtime, Core ML, or TensorRT.
• Understanding of SLAM, real-time 3D reconstruction, and multi-view geometry.
• Exposure to synthetic training environments using simulation platforms like Unity, Unreal Engine, or CARLA.
Compensation:
• $160,000 to $230,000 base salary, plus equity and a full benefits package.
Why You Should Apply:
• Join a forward-thinking team at the forefront of autonomy and immersive technology.
• Work on hardware-software integration that directly impacts emerging consumer and industrial applications.
• Thrive in a collaborative, problem-solving culture focused on real-world results and scalable innovation.
To apply for this job, please visit www.bluesignal.com.