
This demo shows how to use BLIP for conditional and unconditional image captioning.

## Quick Start

```shell
cargo run -r --example blip
```
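For context, `main.rs` roughly follows the pattern below. This is a minimal sketch, not the exact example code: the model file names, the image path, the `Blip::new(visual, textual)` constructor, and the `caption(&images, prompt, verbose)` signature are assumptions based on the crate's general builder style, so check `main.rs` for the actual API.

```rust
// Sketch only: identifiers and paths below are assumptions; see main.rs for the real API.
use usls::{models::Blip, DataLoader, Options};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // BLIP is split into a visual encoder and a textual decoder, each loaded from ONNX.
    let options_visual = Options::default().with_model("blip-visual-base.onnx")?;
    let options_textual = Options::default().with_model("blip-textual-base.onnx")?;
    let mut model = Blip::new(options_visual, options_textual)?;

    // Load the input image (single-image batch; multi-batch captioning is still a TODO).
    let images = vec![DataLoader::try_read("./assets/bus.jpg")?];

    // Unconditional captioning: no text prompt, the model describes the image freely.
    let ys = model.caption(&images, None, true)?;
    println!("{:?}", ys);

    // Conditional captioning: the prompt seeds and steers the generated caption.
    let ys = model.caption(&images, Some("three man"), true)?;
    println!("{:?}", ys);

    Ok(())
}
```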

## BLIP ONNX Model

## Results

```shell
[Unconditional image captioning]: a group of people walking around a bus
[Conditional image captioning]: three man walking in front of a bus
Some(["three man walking in front of a bus"])
```

## TODO

- Multi-batch inference for image captioning
- VQA
- Retrieval
- TensorRT support for textual model