Yantrix
Computer vision & robotics perception

Computer Vision for Robotics Services

Vision systems win or lose on integration. We build perception stacks where the model, the robot, and the product engineering decision all line up.

Vision-guided robotic manipulation with detection overlays

What we do

Practical support for targeted engineering work

Yantrix delivers computer-vision systems for vision-guided pick-and-place, bin picking, defect detection on conveyors, SKU recognition, and robotic manipulation. We design the data pipeline, pick and train the model (YOLO detection, SAM-2 segmentation, custom classifiers), integrate it into ROS 2 nodes or a PLC-facing service, and ship benchmarks against the target FPS and accuracy envelope.
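The "benchmarks against the target FPS and accuracy envelope" step can be sketched as a small harness that wraps any capture-to-result callable and reports latency and throughput. This is an illustrative sketch, not a fixed deliverable format: the `benchmark` function, its parameters, and the default targets are assumptions for the example.

```python
import statistics
import time

def benchmark(pipeline, frames, target_fps=15.0, target_p99_ms=100.0):
    """Run `pipeline` over `frames`, timing each call end to end.

    `pipeline` is any callable taking one frame and returning a result
    (e.g. capture -> preprocess -> inference -> post-processing).
    Returns measured numbers compared against the target envelope.
    """
    latencies_ms = []
    for frame in frames:
        t0 = time.perf_counter()
        pipeline(frame)
        latencies_ms.append((time.perf_counter() - t0) * 1000.0)

    mean_ms = statistics.mean(latencies_ms)
    # Nearest-rank approximation of the 99th-percentile latency.
    p99_ms = sorted(latencies_ms)[max(0, int(len(latencies_ms) * 0.99) - 1)]
    fps = 1000.0 / mean_ms
    return {
        "mean_ms": mean_ms,
        "p99_ms": p99_ms,
        "fps": fps,
        "meets_fps_target": fps >= target_fps,
        "meets_latency_target": p99_ms <= target_p99_ms,
    }
```

In practice the `pipeline` callable would wrap the real camera capture and model inference, so the numbers include I/O, not just the forward pass.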

What problems we solve

  • Reduce manual inspection and sorting with in-line visual quality control.
  • Enable robotic cells to handle variable, randomly oriented, or cluttered parts.
  • Replace pose fixtures and custom jigs with camera-driven perception.
  • Catch defects earlier on the line and feed that signal back into process control.

Tools we use

  • Ultralytics YOLOv8 / v11
  • SAM-2 segmentation
  • OpenCV
  • PyTorch
  • TensorRT
  • ROS 2 perception nodes
  • Intel RealSense / ZED / OAK-D cameras
  • Roboflow for dataset ops
  • NVIDIA Jetson Orin

Deliverables

  • Production vision model with documented accuracy/latency profile
  • ROS 2 perception package or gRPC / REST perception service
  • Camera + lighting recommendations for the deployment environment
  • Dataset, labelling guidelines, and retraining instructions
  • Integration with grippers, manipulators, or PLC I/O
Use cases

Industries where this service applies

We adapt the same engineering service to different product contexts depending on the part mix, line speed, validation target, or deployment environment.

  • Robotics
  • Warehouse & fulfilment
  • Factory QA
  • Packaging
  • Electronics manufacturing
  • Agritech

Related work

Case studies connected to this service

Examples of this service delivered on real engineering problems.

Applied AI · Vision-guided robotics

Vision-guided bin picking at 80 ms end-to-end

Yantrix built a production vision stack that lets a 6-DOF arm pick randomly oriented SKUs out of a cluttered bin — running entirely on an edge device.

Robotics design

Robotic arm design for precise pick-and-place motion

A design study covering joint packaging, structural stiffness, and manufacturable geometry for a compact robotic arm concept.

FAQ

Questions teams ask before they engage


What frame rate and latency can you hit on embedded hardware?

On a Jetson Orin Nano, we typically ship YOLOv11-Seg pipelines at 12 to 30 FPS end-to-end, with decision latency under 100 ms including camera capture, inference, and the robot command. Specific numbers depend on image resolution and the target class count.
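To make that 100 ms figure concrete, here is a sketch of how the budget decomposes into stages. Every number below is a hypothetical placeholder, not a measured value; real timings are profiled per deployment.

```python
# Hypothetical per-stage timings in milliseconds (placeholders, not measurements).
budget_ms = 100.0
stages_ms = {
    "camera_capture": 20.0,   # sensor exposure + frame transfer (placeholder)
    "preprocess": 5.0,        # resize / normalise (placeholder)
    "inference": 45.0,        # model forward pass on the edge device (placeholder)
    "postprocess": 10.0,      # NMS + pose estimation (placeholder)
    "robot_command": 15.0,    # message to the controller (placeholder)
}
total_ms = sum(stages_ms.values())
print(f"total {total_ms:.0f} ms of {budget_ms:.0f} ms budget "
      f"({'OK' if total_ms <= budget_ms else 'over budget'})")
```

The point of the breakdown is that inference is only one line item: capture and command transport often consume a third or more of the budget.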

Can you retrain on our own parts and SKUs?

That's the default. We usually start with a base detector, then fine-tune on a small labelled dataset of your parts. We'll set up labelling tooling and hand you a retraining pipeline so you can keep extending it.
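The fine-tuning setup usually centres on a small dataset config in the Ultralytics YOLO format; the paths and class names below are placeholders for illustration.

```yaml
# Ultralytics dataset config (data.yaml); paths and class names are placeholders.
path: datasets/your-parts   # dataset root
train: images/train         # training images, relative to `path`
val: images/val             # validation images
names:
  0: part_a
  1: part_b
```

Training then typically runs via the Ultralytics CLI, e.g. `yolo detect train data=data.yaml model=yolo11n.pt epochs=100`; the base model, epoch count, and other hyperparameters depend on the project.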

Do you handle the lighting and camera selection?

Yes. Vision fails most often because of the physical setup, not the model. We scope the camera, lens, lighting, and mount geometry as part of the engagement.

Start your project

Need computer vision for robotics support?

Send the problem, your current design stage, and any existing files. We can scope the work from there.