frigate/docker-compose.yml
Nate Meyer 3f05f74ecb
Nvidia TensorRT detector (#4718)
* Initial WIP dockerfile and scripts to add tensorrt support

* Add TensorRT detector

* WIP attempt to install TensorRT 8.5

* Updates to detector for CUDA Python library

* TensorRT CUDA library rework WIP

Does not run

* Fixes from rebase to detector factory

* Fix parsing output memory pointer

* Handle TensorRT logs with the python logger

* Use non-async interface and convert input data to float32. Detection runs without error.

* Make TensorRT a separate build from the base Frigate image.

* Add script and documentation for generating TRT Models

* Add support for TensorRT devcontainer

* Add labelmap to TRT model script and docs. Cleanup of old scripts.

* Update detect to normalize input tensor using model input type

* Add config for selecting GPU (a config sketch follows this list). Fix async inference. Update documentation.

* Update some CUDA libraries to clean up version warning

* Add CI stage to build TensorRT tag

* Add note in docs for image tag and model support
2022-12-30 10:53:17 -06:00
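
The bullets above reference a new detector type and a config option for selecting the GPU. As a rough sketch, enabling the detector in Frigate's config.yml might look like the following; the tensorrt type name, the device index key, and the yolov7-tiny model path are assumptions drawn from the commit's description of the documented TRT model workflow, not contents of this file:

detectors:
  tensorrt:
    type: tensorrt # assumption: detector type registered by this commit
    device: 0      # assumption: index of the GPU to run inference on

model:
  # assumption: a model produced by the documented TRT model generation
  # script, mounted into the container (see the commented trt-models
  # volume in the compose file below)
  path: /trt-models/yolov7-tiny-416.trt
  width: 416
  height: 416

The width and height must match the network the TRT model was generated from; 416x416 here is only illustrative.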


version: "3"
services:
devcontainer:
container_name: frigate-devcontainer
# add groups from host for render, plugdev, video
group_add:
- "109" # render
- "110" # render
- "44" # video
- "46" # plugdev
shm_size: "256mb"
build:
context: .
# Use target devcontainer-trt for TensorRT dev
target: devcontainer
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: 1
capabilities: [gpu]
devices:
- /dev/bus/usb:/dev/bus/usb
# - /dev/dri:/dev/dri # for intel hwaccel, needs to be updated for your hardware
volumes:
- .:/workspace/frigate:cached
- ./web/dist:/opt/frigate/web:cached
- /etc/localtime:/etc/localtime:ro
- ./config/config.yml:/config/config.yml:ro
- ./debug:/media/frigate
# Create the trt-models folder using the documented method of generating TRT models
# - ./debug/trt-models:/trt-models
- /dev/bus/usb:/dev/bus/usb
mqtt:
container_name: mqtt
image: eclipse-mosquitto:1.6
ports:
- "1883:1883"