I have been pondering the idea of playing with object detection and AI for a long time. I even implemented this project, which was really fun, and I learned a few things along the way. But I did not know what to do with it, until I came across Frigate.
What is Frigate?
Apart from its weird name (who cares about names anyway), this is a pretty cool piece of software. It takes a camera video stream and runs object detection on it. It can find a person, cat, dog, car, bicycle, bus, train, airplane and many, many more. Once it finds what you are looking for, it sends you a snapshot or a short video clip of the detected object.
How is this related to the smart home? Well, for one thing it has a direct integration with Home Assistant, and for another it supports MQTT. The latter means you can integrate it with any other smart home application that supports MQTT. Since I am already running OpenHab3 in my home production network, I thought I could give it a shot.
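To give you an idea of what that integration looks like: Frigate publishes an object count per camera on MQTT topics such as frigate/<camera_name>/<object_name>, plus a frigate/events topic with event details. Below is a rough sketch of how a channel for that count might look in openHAB 3's thing code tab; the broker bridge UID (mqtt:broker:mybroker) and the camera name (camera_1) are assumptions, so adapt them to your own setup.
# Sketch only: an openHAB 3 generic MQTT thing (code tab YAML) tracking
# the person count Frigate publishes for a camera named camera_1.
UID: mqtt:topic:frigate_camera_1            # hypothetical thing UID
label: Frigate camera_1
thingTypeUID: mqtt:topic
bridgeUID: mqtt:broker:mybroker             # assumed broker bridge, replace with yours
configuration: {}
channels:
  - id: person_count
    channelTypeUID: mqtt:number
    label: Person count
    configuration:
      stateTopic: frigate/camera_1/person   # Frigate's per-camera object count topic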
Requirements
Hardware
You can use a Raspberry Pi 3/4, but it is not really recommended if you have more than 2 cameras. If you are serious about it, use more powerful hardware that can handle the video streams coming from the cameras. The developer makes some suggestions here.
Camera
Any camera will do, but some cameras are better suited than others. The developer recommends Dahua, Hikvision and Amcrest, in that specific order. I personally used an old Foscam FI9821P and it worked just fine.
TPU
The developer suggests a Google Coral, and although it is not necessary, it is strongly recommended. The reason is that the Coral can do object detection faster than any modern CPU, so once Frigate detects motion, it passes that part of the image to the Coral for the object detection step. I am lucky that I already had one at home gathering dust, since I realize it is almost impossible to purchase one at the time of writing. If you are serious about it, you should get one, either the USB or the M.2/Mini PCIe version.
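If you don't have a Coral (yet), Frigate can fall back to a CPU detector. A minimal sketch of that alternative detectors section, for testing only since it is much slower:
# CPU fallback detector (much slower than a Coral, fine for a quick test)
detectors:
  cpu1:
    type: cpu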
Operating System
Any Debian-based OS will do. Don't use virtualization, although some people have had luck using Proxmox and getting the Coral to work properly. I tried on ESXi but gave up, since I could not find a way for the Debian VM to recognize the USB Coral properly. I ended up using Lubuntu from a live USB key on a physical server, just to test it.
Software
Frigate runs in Docker, so you should run it as a container. It can run as a standalone app or as a Home Assistant add-on. You do need to have an MQTT broker set up. See this article on how to install mosquitto (an MQTT broker) on Docker along with other apps.
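For reference, a minimal mosquitto docker-compose sketch could look like the one below. The volume paths are assumptions, and mosquitto 2.x also needs a small mosquitto.conf (with listener 1883 and either authentication or allow_anonymous true) before it will accept connections from Frigate.
services:
  mosquitto:
    container_name: mosquitto
    image: eclipse-mosquitto:2
    restart: unless-stopped
    ports:
      - "1883:1883"                           # standard MQTT port
    volumes:
      - ./mosquitto/config:/mosquitto/config  # must contain mosquitto.conf
      - ./mosquitto/data:/mosquitto/data
      - ./mosquitto/log:/mosquitto/log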
Installation
Before the installation we need to create the config file that Frigate needs in order to boot up.
mqtt:
  host: 192.168.XXX.XXX # <------- Your MQTT Broker IP Address
detectors:
  coral:
    type: edgetpu
    device: usb
record:
  enabled: True
  retain:
    days: 0
    mode: all
  events:
    objects:
      - person
    retain:
      default: 10
      mode: all
cameras:
  camera_1: # <------ Name the camera
    ffmpeg:
      inputs:
        - path: rtsp://username:password@192.168.XXX.XXXX:XX/videoMain # <----Update for your camera
          roles:
            - detect
            - rtmp
    rtmp:
      enabled: False # <-- RTMP should be disabled if your stream is not H264
    detect:
      width: 1280 # <---- update for your camera's resolution
      height: 720 # <---- update for your camera's resolution
mqtt: this needs to point to your MQTT broker.
detectors: this is where you configure your Google Coral (if you have one); otherwise you rely on the CPU to do the object detection (not recommended).
record: this is important, as its absence means no recordings. In this example only recordings that overlap with detected events will be stored; the rest of the recordings will be removed from cache.
cameras: this is where you configure your cameras. Typically, you want to use the high-quality stream for recording but the lower-resolution stream for object detection (see the sketch after this list). If you don't know the RTSP path for your camera, sites like this might help. I keep it simple in this example.
rtmp: if you want to re-stream your video feed as an RTMP feed so that other applications like Home Assistant can use it, make sure this is enabled and port 1935 is open. That way you don't need to open two video streams to your camera (one for Frigate and one for the other application).
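Here is a sketch of what that two-stream setup could look like. The sub-stream and main-stream RTSP paths below are placeholders, so use whatever paths your camera actually exposes.
cameras:
  camera_1:
    ffmpeg:
      inputs:
        - path: rtsp://username:password@192.168.XXX.XXX:554/sub_stream   # low-res stream (placeholder path)
          roles:
            - detect
        - path: rtsp://username:password@192.168.XXX.XXX:554/main_stream  # high-res stream (placeholder path)
          roles:
            - record
    detect:
      width: 640   # match the low-res stream's resolution
      height: 480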
Using docker compose, the installation is pretty easy. Here is an example docker compose file. Feel free to modify it to suit your needs.
version: "3.9"
services:
  frigate:
    container_name: frigate
    privileged: true # this may not be necessary for all setups
    restart: unless-stopped
    image: blakeblackshear/frigate:stable
    shm_size: "64mb" # update for your cameras (see the Frigate docs for the shm size calculation)
    devices:
      - /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
      - /dev/dri/renderD128 # for intel hwaccel, needs to be updated for your hardware
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /home/lubuntu/frigate/config.yml:/config/config.yml:ro
      - /mnt/frigate_media:/media/frigate
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - "5000:5000"
      - "1935:1935" # RTMP feeds
Note: Unfortunately you do need privileged mode (although it is not ideal for security reasons) if you are using a Google Coral.
Note: In the volumes section I added an NFS share that points to my Synology server. So every time Frigate creates snapshots or recordings, it dumps them in the NFS share (/mnt/frigate_media) and they show up on my Synology server for later viewing / storage.
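If you prefer to let Docker manage the NFS mount instead of mounting it on the host, a named volume sketch like the one below should also work; the NFS server address and the export path /volume1/frigate are assumptions, so match them to your own Synology setup.
services:
  frigate:
    # ... rest of the frigate service as above ...
    volumes:
      - frigate_media:/media/frigate          # use the named NFS volume instead of /mnt/frigate_media
volumes:
  frigate_media:
    driver: local
    driver_opts:
      type: nfs
      o: addr=192.168.XXX.XXX,nfsvers=4,rw    # assumed NFS server address
      device: ":/volume1/frigate"             # assumed export path on the Synology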
Features
Masks: You can define an area of the camera image in which no motion detection will happen and consequently no object detection will be triggered (a configuration sketch for this and the next two features follows after this list).
Zones: You can configure different zones within the camera image (e.g. garden, gate, sidewalk, road, terrace) and get notifications if an object is detected in one or more of those zones. For example, you may want to be notified when a person is detected in your garden as opposed to the sidewalk. You can apply different detection filters in different zones.
Birdseye: You get a combined view of all your cameras, focused on those where motion or object detection is happening, so you don't miss the action.
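As a rough sketch, assuming a camera named camera_1 with a 1280x720 detect resolution, the three features are configured along these lines. The coordinates below are placeholders; in practice it is easier to generate them with the mask/zone editor in Frigate's web UI.
cameras:
  camera_1:
    motion:
      mask:
        - 0,0,1280,0,1280,80,0,80                 # placeholder: ignore the top strip (e.g. a timestamp overlay)
    zones:
      garden:
        coordinates: 0,720,0,400,640,400,640,720  # placeholder polygon covering part of the image
        objects:
          - person                                # only trigger on people in this zone
birdseye:
  enabled: True
  mode: objects   # show cameras with active object detections (other modes: continuous, motion)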
Conclusion
This is the first such application I have tested and I am very excited about it. I am pretty sure there are similar applications out there (open source?), but I don't know how they compare to Frigate. Maybe something to check in the future. I believe these applications will only get better as the detection algorithms evolve.
Let me know in the comments what you think about Frigate or similar applications!