diff --git a/docs/docs/configuration/live.md b/docs/docs/configuration/live.md
index 22789181a..bc19d3caa 100644
--- a/docs/docs/configuration/live.md
+++ b/docs/docs/configuration/live.md
@@ -3,9 +3,9 @@ id: live
title: Live View
---
-Frigate intelligently displays your camera streams on the Live view dashboard. Your camera images update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any motion is detected, cameras seamlessly switch to a live stream.
+Frigate intelligently displays your camera streams on the Live view dashboard. By default, Frigate employs "smart streaming" where camera images update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any motion or active objects are detected, cameras seamlessly switch to a live stream.
-## Live View technologies
+### Live View technologies
Frigate intelligently uses three different streaming technologies to display your camera streams on the dashboard and the single camera view, switching between available modes based on network bandwidth, player errors, or required features like two-way talk. The highest quality and fluency of the Live view requires the bundled `go2rtc` to be configured as shown in the [step by step guide](/guides/configuring_go2rtc).
@@ -51,19 +51,32 @@ go2rtc:
- ffmpeg:rtsp://192.168.1.5:554/live0#video=copy
```
-### Setting Stream For Live UI
+### Setting Streams For Live UI
-There may be some cameras that you would prefer to use the sub stream for live view, but the main stream for recording. This can be done via `live -> stream_name`.
+You can configure Frigate to allow manual selection of the stream you want to view in the Live UI. For example, you may want to view your camera's substream on mobile devices, but the full resolution stream on desktop devices. Setting the `live -> streams` list will populate a dropdown in the UI's Live view that allows you to choose between the streams. This stream setting is _per device_ and is saved in your browser's local storage.
+
+Additionally, when creating and editing camera groups in the UI, you can choose the stream you want to use for your camera group's Live dashboard.
+
+:::note
+
+Frigate's default dashboard ("All Cameras") will always use the first entry you've defined in `streams:` when playing live streams from your cameras.
+
+:::
+
+Configure the `streams` option with a "friendly name" for your stream followed by the go2rtc stream name.
+
+Using Frigate's internal version of go2rtc is required to use this feature. You cannot specify paths in the `streams` configuration, only go2rtc stream names.
```yaml
go2rtc:
streams:
test_cam:
- - rtsp://192.168.1.5:554/live0 # <- stream which supports video & aac audio.
+ - rtsp://192.168.1.5:554/live_main # <- stream which supports video & aac audio.
- "ffmpeg:test_cam#audio=opus" # <- copy of the stream which transcodes audio to opus for webrtc
test_cam_sub:
- - rtsp://192.168.1.5:554/substream # <- stream which supports video & aac audio.
- - "ffmpeg:test_cam_sub#audio=opus" # <- copy of the stream which transcodes audio to opus for webrtc
+ - rtsp://192.168.1.5:554/live_sub # <- stream which supports video & aac audio.
+ test_cam_another_sub:
+ - rtsp://192.168.1.5:554/live_alt # <- stream which supports video & aac audio.
cameras:
test_cam:
@@ -80,7 +93,10 @@ cameras:
roles:
- detect
live:
- stream_name: test_cam_sub
+ streams: # <--- Multiple streams for Frigate 0.16 and later
+ Main Stream: test_cam # <--- Specify a "friendly name" followed by the go2rtc stream name
+ Sub Stream: test_cam_sub
+ Special Stream: test_cam_another_sub
```
### WebRTC extra configuration:
@@ -101,6 +117,7 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. However, it req
```
- For access through Tailscale, the Frigate system's Tailscale IP must be added as a WebRTC candidate. Tailscale IPs all start with `100.`, and are reserved within the `100.64.0.0/10` CIDR block.
+- Note that WebRTC does not support H.265.
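The Tailscale candidate rule above can be checked programmatically. This is an illustrative sketch (not part of Frigate) using Python's standard `ipaddress` module to test whether an address falls inside Tailscale's reserved `100.64.0.0/10` CGNAT block:

```python
import ipaddress


def is_tailscale_ip(ip: str) -> bool:
    """Return True if the address is inside Tailscale's 100.64.0.0/10 range."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network("100.64.0.0/10")


# Tailscale IPs start with 100. and sit inside the /10 block;
# ordinary LAN addresses do not qualify as WebRTC candidates here.
print(is_tailscale_ip("100.100.1.5"))   # a typical Tailscale address
print(is_tailscale_ip("192.168.1.5"))   # a LAN address
```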
:::tip
@@ -148,3 +165,50 @@ For devices that support two way talk, Frigate can be configured to use the feat
- For the Home Assistant Frigate card, [follow the docs](https://github.com/dermotduffy/frigate-hass-card?tab=readme-ov-file#using-2-way-audio) for the correct source.
To use the Reolink Doorbell with two way talk, you should use the [recommended Reolink configuration](/configuration/camera_specific#reolink-doorbell)
+
+### Streaming options on camera group dashboards
+
+Frigate provides a dialog in the Camera Group Edit pane with several options for streaming on a camera group's dashboard. These settings are _per device_ and are saved in your device's local storage.
+
+- Stream selection using the `live -> streams` configuration option (see _Setting Streams For Live UI_ above)
+- Streaming type:
+ - _No streaming_: Camera images will only update once per minute and no live streaming will occur.
+ - _Smart Streaming_ (default, recommended setting): Smart streaming will update your camera image once per minute when no detectable activity is occurring to conserve bandwidth and resources, since a static picture is the same as a streaming image with no motion or objects. When motion or objects are detected, the image seamlessly switches to a live stream.
+ - _Continuous Streaming_: Camera image will always be a live stream when visible on the dashboard, even if no activity is being detected. Continuous streaming may cause high bandwidth usage and performance issues. **Use with caution.**
+- _Compatibility mode_: Enable this option only if your camera's live stream displays color artifacts and a diagonal line on the right side of the image. Before enabling this, try setting your camera's `detect` width and height to a standard aspect ratio (for example: 640x352 becomes 640x360, 800x443 becomes 800x450, and 2688x1520 becomes 2688x1512). Depending on your browser and device, more than a few cameras in compatibility mode may not be supported, so only use this option if changing your config fails to resolve the color artifacts and diagonal line.
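The "snap to a standard aspect ratio" adjustment can be sketched as a small helper (hypothetical, not part of Frigate): keep the configured width and recompute the height from the nearest standard ratio.

```python
from fractions import Fraction

# Common display aspect ratios mentioned in these docs.
STANDARD_RATIOS = [Fraction(4, 3), Fraction(16, 9), Fraction(32, 9)]


def snap_detect_resolution(width: int, height: int) -> tuple[int, int]:
    """Keep width, adjust height so the pair matches the closest standard ratio."""
    current = Fraction(width, height)
    best = min(STANDARD_RATIOS, key=lambda r: abs(float(r) - float(current)))
    # Recompute height from the chosen ratio, rounded to the nearest integer.
    return width, round(width / best)


print(snap_detect_resolution(640, 352))    # → (640, 360)
print(snap_detect_resolution(2688, 1520))  # → (2688, 1512)
```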
+
+:::note
+
+The default dashboard ("All Cameras") will always use Smart Streaming and the first entry set in your `streams` configuration, if defined. Use a camera group if you want to change any of these settings from the defaults.
+
+:::
+
+## Live view FAQ
+
+1. Why don't I have audio in my Live view?
+ You must use go2rtc to hear audio in your live streams. If you have go2rtc already configured, you need to ensure your camera is sending PCMA/PCMU or AAC audio. If you can't change your camera's audio codec, you need to [transcode the audio](https://github.com/AlexxIT/go2rtc?tab=readme-ov-file#source-ffmpeg) using go2rtc.
+
+ Note that the low bandwidth mode player is a video-only stream. You should not expect to hear audio when in low bandwidth mode, even if you've set up go2rtc.
+
+2. Frigate shows that my live stream is in "low bandwidth mode". What does this mean?
+ Frigate intelligently selects the live streaming technology based on a number of factors (user-selected modes like two-way talk, camera settings, browser capabilities, available bandwidth) and prioritizes showing an actual up-to-date live view of your camera's stream as quickly as possible.
+
+ When you have go2rtc configured, Live view initially attempts to load and play back your stream with a clearer, fluent stream technology (MSE). An initial timeout, a low bandwidth condition that would cause buffering of the stream, or decoding errors in the stream will cause Frigate to switch to the stream defined by the `detect` role, using the jsmpeg format. This is what the UI labels as "low bandwidth mode". On Live dashboards, the mode will automatically reset when smart streaming is configured and activity stops. You can also try using the _Reset_ button to force a reload of your stream.
+
+ If you are still experiencing Frigate falling back to low bandwidth mode, you may need to adjust your camera's settings per the recommendations above or ensure you have enough bandwidth available.
+
+3. It doesn't seem like my cameras are streaming on the Live dashboard. Why?
+ On the default Live dashboard ("All Cameras"), your camera images will update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any activity is detected, cameras seamlessly switch to a full-resolution live stream. If you want to customize this behavior, use a camera group.
+
+4. I see a strange diagonal line on my live view, but my recordings look fine. How can I fix it?
+   This is caused by incorrect dimensions set in your `detect` width or height (or dimensions that were incorrectly auto-detected), which causes the jsmpeg player's rendering engine to display a slightly distorted image. You should adjust the width and height of your `detect` resolution to a standard aspect ratio (for example: 640x352 becomes 640x360, 800x443 becomes 800x450, and 2688x1520 becomes 2688x1512). If changing the resolution to match a standard aspect ratio (4:3, 16:9, 32:9, etc.) does not solve the issue, you can enable "compatibility mode" in your camera group dashboard's stream settings. Depending on your browser and device, more than a few cameras in compatibility mode may not be supported, so only use this option if changing your `detect` width and height fails to resolve the color artifacts and diagonal line.
+
+5. How does "smart streaming" work?
+ Because a static image of a scene looks exactly the same as a live stream with no motion or activity, smart streaming updates your camera images once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any activity (motion or object/audio detection) occurs, cameras seamlessly switch to a live stream.
+
+ This static image is pulled from the stream defined in your config with the `detect` role. When activity is detected, images from the `detect` stream immediately begin updating at ~5 frames per second so you can see the activity until the live player is loaded and begins playing. This usually only takes a second or two. If the live player times out, buffers, or has streaming errors, the jsmpeg player is loaded and plays a video-only stream from the `detect` role. When activity ends, the players are destroyed and a static image is displayed until activity is detected again, and the process repeats.
+
+ This is Frigate's default and recommended setting because it results in a significant bandwidth savings, especially for high resolution cameras.
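The player lifecycle described in the answer above can be modeled as a tiny state machine. The state and event names here are illustrative only (the real logic lives in Frigate's web UI):

```python
# Toy model of smart streaming: a static image ("still") until activity starts,
# a live player while activity continues, and a jsmpeg fallback on stream errors.
TRANSITIONS = {
    ("still", "activity"): "live",        # motion/objects detected -> live stream
    ("live", "stream_error"): "jsmpeg",   # timeout/buffering/error -> low bandwidth mode
    ("live", "activity_end"): "still",    # players destroyed, static image again
    ("jsmpeg", "activity_end"): "still",
}


def next_player(state: str, event: str) -> str:
    """Return the next player state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```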
+
+6. I have unmuted some cameras on my dashboard, but I do not hear sound. Why?
+ If your camera is streaming (as indicated by a red dot in the upper right, or if it has been set to continuous streaming mode), your browser may be blocking audio until you interact with the page. This is an intentional browser limitation. See [this article](https://developer.mozilla.org/en-US/docs/Web/Media/Autoplay_guide#autoplay_availability). Many browsers have a whitelist feature to change this behavior.
diff --git a/docs/docs/configuration/reference.md b/docs/docs/configuration/reference.md
index 3c055fadf..30b14f687 100644
--- a/docs/docs/configuration/reference.md
+++ b/docs/docs/configuration/reference.md
@@ -572,10 +572,12 @@ go2rtc:
# Optional: Live stream configuration for WebUI.
# NOTE: Can be overridden at the camera level
live:
- # Optional: Set the name of the stream configured in go2rtc
+ # Optional: Set the streams configured in go2rtc
# that should be used for live view in frigate WebUI. (default: name of camera)
# NOTE: In most cases this should be set at the camera level only.
- stream_name: camera_name
+ streams:
+ main_stream: main_stream_name
+ sub_stream: sub_stream_name
# Optional: Set the height of the jsmpeg stream. (default: 720)
# This must be less than or equal to the height of the detect stream. Lower resolutions
# reduce bandwidth required for viewing the jsmpeg stream. Width is computed to match known aspect ratio.
diff --git a/frigate/config/camera/live.py b/frigate/config/camera/live.py
index 9f15f2645..13ae2d04f 100644
--- a/frigate/config/camera/live.py
+++ b/frigate/config/camera/live.py
@@ -1,3 +1,5 @@
+from typing import Dict
+
from pydantic import Field
from ..base import FrigateBaseModel
@@ -6,6 +8,9 @@ __all__ = ["CameraLiveConfig"]
class CameraLiveConfig(FrigateBaseModel):
- stream_name: str = Field(default="", title="Name of restream to use as live view.")
+ streams: Dict[str, str] = Field(
+        default_factory=dict,
+ title="Friendly names and restream names to use for live view.",
+ )
height: int = Field(default=720, title="Live camera view height")
quality: int = Field(default=8, ge=1, le=31, title="Live camera view quality")
diff --git a/frigate/config/config.py b/frigate/config/config.py
index c4c502d26..694a3389f 100644
--- a/frigate/config/config.py
+++ b/frigate/config/config.py
@@ -199,17 +199,18 @@ def verify_config_roles(camera_config: CameraConfig) -> None:
)
-def verify_valid_live_stream_name(
+def verify_valid_live_stream_names(
frigate_config: FrigateConfig, camera_config: CameraConfig
) -> ValueError | None:
"""Verify that a restream exists to use for live view."""
- if (
- camera_config.live.stream_name
- not in frigate_config.go2rtc.model_dump().get("streams", {}).keys()
- ):
- return ValueError(
- f"No restream with name {camera_config.live.stream_name} exists for camera {camera_config.name}."
- )
+    for stream_name in camera_config.live.streams.values():
+        if (
+            stream_name
+            not in frigate_config.go2rtc.model_dump().get("streams", {})
+        ):
+            return ValueError(
+                f"No restream with name {stream_name} exists for camera {camera_config.name}."
+            )
def verify_recording_retention(camera_config: CameraConfig) -> None:
@@ -586,15 +587,15 @@ class FrigateConfig(FrigateBaseModel):
zone.generate_contour(camera_config.frame_shape)
# Set live view stream if none is set
- if not camera_config.live.stream_name:
- camera_config.live.stream_name = name
+ if not camera_config.live.streams:
+ camera_config.live.streams = {name: name}
# generate the ffmpeg commands
camera_config.create_ffmpeg_cmds()
self.cameras[name] = camera_config
verify_config_roles(camera_config)
- verify_valid_live_stream_name(self, camera_config)
+ verify_valid_live_stream_names(self, camera_config)
verify_recording_retention(camera_config)
verify_recording_segments_setup_with_reasonable_time(camera_config)
verify_zone_objects_are_tracked(camera_config)
diff --git a/frigate/util/config.py b/frigate/util/config.py
index a8664ea4e..5b40fe37b 100644
--- a/frigate/util/config.py
+++ b/frigate/util/config.py
@@ -13,7 +13,7 @@ from frigate.util.services import get_video_properties
logger = logging.getLogger(__name__)
-CURRENT_CONFIG_VERSION = "0.15-1"
+CURRENT_CONFIG_VERSION = "0.16-0"
DEFAULT_CONFIG_FILE = "/config/config.yml"
@@ -84,6 +84,13 @@ def migrate_frigate_config(config_file: str):
yaml.dump(new_config, f)
previous_version = "0.15-1"
+ if previous_version < "0.16-0":
+ logger.info(f"Migrating frigate config from {previous_version} to 0.16-0...")
+ new_config = migrate_016_0(config)
+ with open(config_file, "w") as f:
+ yaml.dump(new_config, f)
+ previous_version = "0.16-0"
+
logger.info("Finished frigate config migration...")
@@ -289,6 +296,29 @@ def migrate_015_1(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]
return new_config
+def migrate_016_0(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]]:
+ """Handle migrating frigate config to 0.16-0"""
+ new_config = config.copy()
+
+ for name, camera in config.get("cameras", {}).items():
+ camera_config: dict[str, dict[str, any]] = camera.copy()
+
+ live_config = camera_config.get("live", {})
+ if "stream_name" in live_config:
+ # Migrate from live -> stream_name to live -> streams -> dict
+ stream_name = live_config["stream_name"]
+ live_config["streams"] = {stream_name: stream_name}
+
+ del live_config["stream_name"]
+
+ camera_config["live"] = live_config
+
+ new_config["cameras"][name] = camera_config
+
+ new_config["version"] = "0.16-0"
+ return new_config
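The core of the migration above, isolated for a single camera's `live` section (a standalone sketch, not the actual `migrate_016_0` function): `stream_name: x` becomes `streams: {x: x}`, using the old stream name as its own friendly name.

```python
def migrate_live_config(live: dict) -> dict:
    """Migrate one camera's live config from stream_name to the streams dict."""
    migrated = dict(live)
    if "stream_name" in migrated:
        name = migrated.pop("stream_name")
        # stream_name: x  becomes  streams: {x: x}
        migrated["streams"] = {name: name}
    return migrated
```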
+
+
def get_relative_coordinates(
mask: Optional[Union[str, list]], frame_shape: tuple[int, int]
) -> Union[str, list]:
diff --git a/web/package-lock.json b/web/package-lock.json
index 3ced33ffe..119fc79ea 100644
--- a/web/package-lock.json
+++ b/web/package-lock.json
@@ -54,7 +54,7 @@
"react-day-picker": "^8.10.1",
"react-device-detect": "^2.2.3",
"react-dom": "^18.3.1",
- "react-grid-layout": "^1.4.4",
+ "react-grid-layout": "^1.5.0",
"react-hook-form": "^7.52.1",
"react-icons": "^5.2.1",
"react-konva": "^18.2.10",
@@ -5120,7 +5120,8 @@
"node_modules/fast-equals": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/fast-equals/-/fast-equals-4.0.3.tgz",
- "integrity": "sha512-G3BSX9cfKttjr+2o1O22tYMLq0DPluZnYtq1rXumE1SpL/F/SLIfHx08WYQoWSIpeMYf8sRbJ8++71+v6Pnxfg=="
+ "integrity": "sha512-G3BSX9cfKttjr+2o1O22tYMLq0DPluZnYtq1rXumE1SpL/F/SLIfHx08WYQoWSIpeMYf8sRbJ8++71+v6Pnxfg==",
+ "license": "MIT"
},
"node_modules/fast-glob": {
"version": "3.3.2",
@@ -7275,9 +7276,10 @@
}
},
"node_modules/react-grid-layout": {
- "version": "1.4.4",
- "resolved": "https://registry.npmjs.org/react-grid-layout/-/react-grid-layout-1.4.4.tgz",
- "integrity": "sha512-7+Lg8E8O8HfOH5FrY80GCIR1SHTn2QnAYKh27/5spoz+OHhMmEhU/14gIkRzJOtympDPaXcVRX/nT1FjmeOUmQ==",
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/react-grid-layout/-/react-grid-layout-1.5.0.tgz",
+ "integrity": "sha512-WBKX7w/LsTfI99WskSu6nX2nbJAUD7GD6nIXcwYLyPpnslojtmql2oD3I2g5C3AK8hrxIarYT8awhuDIp7iQ5w==",
+ "license": "MIT",
"dependencies": {
"clsx": "^2.0.0",
"fast-equals": "^4.0.3",
@@ -7624,7 +7626,8 @@
"node_modules/resize-observer-polyfill": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/resize-observer-polyfill/-/resize-observer-polyfill-1.5.1.tgz",
- "integrity": "sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg=="
+ "integrity": "sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg==",
+ "license": "MIT"
},
"node_modules/resolve": {
"version": "1.22.8",
diff --git a/web/package.json b/web/package.json
index 6b3ec1e44..d0bdd01d4 100644
--- a/web/package.json
+++ b/web/package.json
@@ -60,7 +60,7 @@
"react-day-picker": "^8.10.1",
"react-device-detect": "^2.2.3",
"react-dom": "^18.3.1",
- "react-grid-layout": "^1.4.4",
+ "react-grid-layout": "^1.5.0",
"react-hook-form": "^7.52.1",
"react-icons": "^5.2.1",
"react-konva": "^18.2.10",
diff --git a/web/src/components/dynamic/CameraFeatureToggle.tsx b/web/src/components/dynamic/CameraFeatureToggle.tsx
index 4b9dabe95..8a284b82f 100644
--- a/web/src/components/dynamic/CameraFeatureToggle.tsx
+++ b/web/src/components/dynamic/CameraFeatureToggle.tsx
@@ -40,9 +40,9 @@ export default function CameraFeatureToggle({
  const { data: config } = useSWR<FrigateConfig>("config");
+ const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
+ useStreamingSettings();
+
+  const [groupStreamingSettings, setGroupStreamingSettings] =
+    useState<GroupStreamingSettings>(
+      allGroupsStreamingSettings[editingGroup?.[0] ?? ""],
+    );
+
+  const [openCamera, setOpenCamera] = useState<string | null>(null);
+
const birdseyeConfig = useMemo(() => config?.birdseye, [config]);
const formSchema = z.object({
@@ -656,6 +675,16 @@ export function CameraGroupEdit({
setIsLoading(true);
+ // update streaming settings
+ const updatedSettings: AllGroupsStreamingSettings = {
+ ...Object.fromEntries(
+ Object.entries(allGroupsStreamingSettings || {}).filter(
+ ([key]) => key !== editingGroup?.[0],
+ ),
+ ),
+ [values.name]: groupStreamingSettings,
+ };
+
let renamingQuery = "";
if (editingGroup && editingGroup[0] !== values.name) {
renamingQuery = `camera_groups.${editingGroup[0]}&`;
@@ -679,7 +708,7 @@ export function CameraGroupEdit({
requires_restart: 0,
},
)
- .then((res) => {
+ .then(async (res) => {
if (res.status === 200) {
toast.success(`Camera group (${values.name}) has been saved.`, {
position: "top-center",
@@ -688,6 +717,7 @@ export function CameraGroupEdit({
if (onSave) {
onSave();
}
+ setAllGroupsStreamingSettings(updatedSettings);
} else {
toast.error(`Failed to save config changes: ${res.statusText}`, {
position: "top-center",
@@ -704,7 +734,16 @@ export function CameraGroupEdit({
setIsLoading(false);
});
},
- [currentGroups, setIsLoading, onSave, updateConfig, editingGroup],
+ [
+ currentGroups,
+ setIsLoading,
+ onSave,
+ updateConfig,
+ editingGroup,
+ groupStreamingSettings,
+ allGroupsStreamingSettings,
+ setAllGroupsStreamingSettings,
+ ],
);
  const form = useForm<z.infer<typeof formSchema>>({
@@ -762,16 +801,66 @@ export function CameraGroupEdit({
),
].map((camera) => (
- {
- const updatedCameras = checked
- ? [...(field.value || []), camera]
- : (field.value || []).filter((c) => c !== camera);
- form.setValue("cameras", updatedCameras);
- }}
- />
+
+
+ {camera.replaceAll("_", " ")}
+
+
+
+ {camera !== "birdseye" && (
+
+ setOpenCamera(isOpen ? camera : null)
+ }
+ >
+
+
+
+
+
+
+ setOpenCamera(isOpen ? camera : null)
+ }
+ />
+
+ )}
+ {
+ const updatedCameras = checked
+ ? [...(field.value || []), camera]
+ : (field.value || []).filter((c) => c !== camera);
+ form.setValue("cameras", updatedCameras);
+ }}
+ />
+
+
))}
diff --git a/web/src/components/menu/LiveContextMenu.tsx b/web/src/components/menu/LiveContextMenu.tsx
new file mode 100644
index 000000000..f5222592d
--- /dev/null
+++ b/web/src/components/menu/LiveContextMenu.tsx
@@ -0,0 +1,302 @@
+import {
+ ReactNode,
+ useCallback,
+ useEffect,
+ useMemo,
+ useRef,
+ useState,
+} from "react";
+import {
+ ContextMenu,
+ ContextMenuContent,
+ ContextMenuItem,
+ ContextMenuSeparator,
+ ContextMenuTrigger,
+} from "@/components/ui/context-menu";
+import {
+ MdVolumeDown,
+ MdVolumeMute,
+ MdVolumeOff,
+ MdVolumeUp,
+} from "react-icons/md";
+import { Dialog } from "@/components/ui/dialog";
+import { VolumeSlider } from "@/components/ui/slider";
+import { CameraStreamingDialog } from "../settings/CameraStreamingDialog";
+import {
+ AllGroupsStreamingSettings,
+ GroupStreamingSettings,
+} from "@/types/frigateConfig";
+import { useStreamingSettings } from "@/context/streaming-settings-provider";
+import { IoIosWarning } from "react-icons/io";
+import { cn } from "@/lib/utils";
+import { useNavigate } from "react-router-dom";
+
+type LiveContextMenuProps = {
+ className?: string;
+ camera: string;
+ streamName: string;
+ cameraGroup?: string;
+ preferredLiveMode: string;
+ isRestreamed: boolean;
+ supportsAudio: boolean;
+ audioState: boolean;
+ toggleAudio: () => void;
+ volumeState?: number;
+ setVolumeState: (volumeState: number) => void;
+ muteAll: () => void;
+ unmuteAll: () => void;
+ statsState: boolean;
+ toggleStats: () => void;
+ resetPreferredLiveMode: () => void;
+ children?: ReactNode;
+};
+export default function LiveContextMenu({
+ className,
+ camera,
+ streamName,
+ cameraGroup,
+ preferredLiveMode,
+ isRestreamed,
+ supportsAudio,
+ audioState,
+ toggleAudio,
+ volumeState,
+ setVolumeState,
+ muteAll,
+ unmuteAll,
+ statsState,
+ toggleStats,
+ resetPreferredLiveMode,
+ children,
+}: LiveContextMenuProps) {
+ const [showSettings, setShowSettings] = useState(false);
+
+ // streaming settings
+
+ const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
+ useStreamingSettings();
+
+  const [groupStreamingSettings, setGroupStreamingSettings] =
+    useState<GroupStreamingSettings>(
+      allGroupsStreamingSettings[cameraGroup ?? ""],
+    );
+
+ useEffect(() => {
+ if (cameraGroup) {
+ setGroupStreamingSettings(allGroupsStreamingSettings[cameraGroup]);
+ }
+ // set individual group when all groups changes
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, [allGroupsStreamingSettings]);
+
+ const onSave = useCallback(
+ (settings: GroupStreamingSettings) => {
+ if (!cameraGroup || !allGroupsStreamingSettings) {
+ return;
+ }
+
+ const updatedSettings: AllGroupsStreamingSettings = {
+ ...Object.fromEntries(
+ Object.entries(allGroupsStreamingSettings || {}).filter(
+ ([key]) => key !== cameraGroup,
+ ),
+ ),
+ [cameraGroup]: {
+ ...Object.fromEntries(
+ Object.entries(settings).map(([cameraName, cameraSettings]) => [
+ cameraName,
+ cameraName === camera
+ ? {
+ ...cameraSettings,
+ playAudio: audioState ?? cameraSettings.playAudio ?? false,
+ volume: volumeState ?? cameraSettings.volume ?? 1,
+ }
+ : cameraSettings,
+ ]),
+ ),
+ // Add the current camera if it doesn't exist
+ ...(!settings[camera]
+ ? {
+ [camera]: {
+ streamName: streamName,
+ streamType: "smart",
+ compatibilityMode: false,
+ playAudio: audioState,
+ volume: volumeState ?? 1,
+ },
+ }
+ : {}),
+ },
+ };
+
+ setAllGroupsStreamingSettings?.(updatedSettings);
+ },
+ [
+ camera,
+ streamName,
+ cameraGroup,
+ allGroupsStreamingSettings,
+ setAllGroupsStreamingSettings,
+ audioState,
+ volumeState,
+ ],
+ );
+
+ // ui
+
+ const audioControlsUsed = useRef(false);
+
+ const VolumeIcon = useMemo(() => {
+ if (!volumeState || volumeState == 0.0 || !audioState) {
+ return MdVolumeOff;
+ } else if (volumeState <= 0.33) {
+ return MdVolumeMute;
+ } else if (volumeState <= 0.67) {
+ return MdVolumeDown;
+ } else {
+ return MdVolumeUp;
+ }
+ // only update when specific fields change
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, [volumeState, audioState]);
+
+ const handleVolumeIconClick = (e: React.MouseEvent) => {
+ e.stopPropagation();
+ audioControlsUsed.current = true;
+ toggleAudio();
+ };
+
+ const handleVolumeChange = (value: number[]) => {
+ audioControlsUsed.current = true;
+ setVolumeState(value[0]);
+ };
+
+ const handleOpenChange = (open: boolean) => {
+ if (!open && audioControlsUsed.current) {
+ onSave(groupStreamingSettings);
+ audioControlsUsed.current = false;
+ }
+ };
+
+ // navigate for debug view
+
+ const navigate = useNavigate();
+
+ return (
+
+
+ {children}
+
+
+
+ {camera.replaceAll("_", " ")}
+
+ {preferredLiveMode == "jsmpeg" && isRestreamed && (
+
+ )}
+
+ {preferredLiveMode != "jsmpeg" && isRestreamed && supportsAudio && (
+ <>
+
+
+ >
+ )}
+
+
+
+
+
+
+
+
+
+
+
+ {statsState ? "Hide" : "Show"} Stream Stats
+
+
+
+
+ navigate(`/settings?page=debug&camera=${camera}`)}
+ >
+
Debug View
+
+
+ {cameraGroup && cameraGroup !== "default" && (
+ <>
+
+
+ setShowSettings(true)}
+ >
+
Streaming Settings
+
+
+ >
+ )}
+ {preferredLiveMode == "jsmpeg" && isRestreamed && (
+ <>
+
+
+
+
+ >
+ )}
+
+
+
+
+
+
+
+ );
+}
diff --git a/web/src/components/overlay/detail/SearchDetailDialog.tsx b/web/src/components/overlay/detail/SearchDetailDialog.tsx
index 45aabb07c..f15627b71 100644
--- a/web/src/components/overlay/detail/SearchDetailDialog.tsx
+++ b/web/src/components/overlay/detail/SearchDetailDialog.tsx
@@ -673,7 +673,8 @@ export function ObjectSnapshotTab({
{search.data.type == "object" &&
search.plus_id !== "not_enabled" &&
- search.end_time && (
+ search.end_time &&
+ search.label != "on_demand" && (
diff --git a/web/src/components/player/BirdseyeLivePlayer.tsx b/web/src/components/player/BirdseyeLivePlayer.tsx
index 2666ac9f7..286f19216 100644
--- a/web/src/components/player/BirdseyeLivePlayer.tsx
+++ b/web/src/components/player/BirdseyeLivePlayer.tsx
@@ -58,6 +58,7 @@ export default function BirdseyeLivePlayer({
height={birdseyeConfig.height}
containerRef={containerRef}
playbackEnabled={true}
+ useWebGL={true}
/>
);
} else {
diff --git a/web/src/components/player/JSMpegPlayer.tsx b/web/src/components/player/JSMpegPlayer.tsx
index 401e85869..3753a9e46 100644
--- a/web/src/components/player/JSMpegPlayer.tsx
+++ b/web/src/components/player/JSMpegPlayer.tsx
@@ -1,6 +1,7 @@
import { baseUrl } from "@/api/baseUrl";
import { useResizeObserver } from "@/hooks/resize-observer";
import { cn } from "@/lib/utils";
+import { PlayerStatsType } from "@/types/live";
// @ts-expect-error we know this doesn't have types
import JSMpeg from "@cycjimmy/jsmpeg-player";
import React, { useEffect, useMemo, useRef, useState } from "react";
@@ -12,6 +13,8 @@ type JSMpegPlayerProps = {
height: number;
   containerRef: React.MutableRefObject<HTMLDivElement | null>;
playbackEnabled: boolean;
+ useWebGL: boolean;
+ setStats?: (stats: PlayerStatsType) => void;
onPlaying?: () => void;
};
@@ -22,6 +25,8 @@ export default function JSMpegPlayer({
className,
containerRef,
playbackEnabled,
+ useWebGL = false,
+ setStats,
onPlaying,
}: JSMpegPlayerProps) {
const url = `${baseUrl.replace(/^http/, "ws")}live/jsmpeg/${camera}`;
@@ -33,6 +38,9 @@ export default function JSMpegPlayer({
const [hasData, setHasData] = useState(false);
const hasDataRef = useRef(hasData);
const [dimensionsReady, setDimensionsReady] = useState(false);
+ const bytesReceivedRef = useRef(0);
+ const lastTimestampRef = useRef(Date.now());
+  const statsIntervalRef = useRef<NodeJS.Timeout | null>(null);
const selectedContainerRef = useMemo(
() => (containerRef.current ? containerRef : internalContainerRef),
@@ -111,6 +119,8 @@ export default function JSMpegPlayer({
const canvas = canvasRef.current;
let videoElement: JSMpeg.VideoElement | null = null;
+ let frameCount = 0;
+
setHasData(false);
if (videoWrapper && playbackEnabled) {
@@ -123,21 +133,68 @@ export default function JSMpegPlayer({
{
protocols: [],
audio: false,
- disableGl: camera != "birdseye",
- disableWebAssembly: camera != "birdseye",
+ disableGl: !useWebGL,
+ disableWebAssembly: !useWebGL,
videoBufferSize: 1024 * 1024 * 4,
onVideoDecode: () => {
if (!hasDataRef.current) {
setHasData(true);
onPlayingRef.current?.();
}
+ frameCount++;
},
},
);
+
+ // Set up WebSocket message handler
+ if (
+ videoElement.player &&
+ videoElement.player.source &&
+ videoElement.player.source.socket
+ ) {
+ const socket = videoElement.player.source.socket;
+ socket.addEventListener("message", (event: MessageEvent) => {
+ if (event.data instanceof ArrayBuffer) {
+ bytesReceivedRef.current += event.data.byteLength;
+ }
+ });
+ }
+
+ // Update stats every second
+ statsIntervalRef.current = setInterval(() => {
+ const currentTimestamp = Date.now();
+ const timeDiff = (currentTimestamp - lastTimestampRef.current) / 1000; // in seconds
+ const bitrate = (bytesReceivedRef.current * 8) / timeDiff / 1000; // in kbps
+
+ setStats?.({
+ streamType: "jsmpeg",
+ bandwidth: Math.round(bitrate),
+ totalFrames: frameCount,
+ latency: undefined,
+ droppedFrames: undefined,
+ decodedFrames: undefined,
+ droppedFrameRate: undefined,
+ });
+
+ bytesReceivedRef.current = 0;
+ lastTimestampRef.current = currentTimestamp;
+ }, 1000);
+
+ return () => {
+ if (statsIntervalRef.current) {
+ clearInterval(statsIntervalRef.current);
+ frameCount = 0;
+ statsIntervalRef.current = null;
+ }
+ };
}, 0);
return () => {
clearTimeout(initPlayer);
+ if (statsIntervalRef.current) {
+ clearInterval(statsIntervalRef.current);
+ statsIntervalRef.current = null;
+ }
if (videoElement) {
try {
// this causes issues in react strict mode
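The bandwidth figure computed in the interval above is plain arithmetic: bytes received since the last tick, converted to kilobits per second. A standalone restatement (hypothetical helper, mirroring the formula in the player code):

```python
def bitrate_kbps(bytes_received: int, elapsed_ms: int) -> int:
    """Convert bytes received over an elapsed window into kilobits per second."""
    seconds = elapsed_ms / 1000
    # bytes -> bits (×8), per second, then bits -> kilobits (÷1000)
    return round((bytes_received * 8) / seconds / 1000)


print(bitrate_kbps(125_000, 1000))  # → 1000  (125 kB in 1 s is 1000 kbps)
```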
diff --git a/web/src/components/player/LivePlayer.tsx b/web/src/components/player/LivePlayer.tsx
index abf908baa..4bd751469 100644
--- a/web/src/components/player/LivePlayer.tsx
+++ b/web/src/components/player/LivePlayer.tsx
@@ -11,6 +11,7 @@ import { useCameraActivity } from "@/hooks/use-camera-activity";
import {
LivePlayerError,
LivePlayerMode,
+ PlayerStatsType,
VideoResolutionType,
} from "@/types/live";
import { getIconForLabel } from "@/utils/iconUtil";
@@ -20,20 +21,26 @@ import { cn } from "@/lib/utils";
import { TbExclamationCircle } from "react-icons/tb";
import { TooltipPortal } from "@radix-ui/react-tooltip";
import { baseUrl } from "@/api/baseUrl";
+import { PlayerStats } from "./PlayerStats";
type LivePlayerProps = {
cameraRef?: (ref: HTMLDivElement | null) => void;
   containerRef?: React.MutableRefObject<HTMLDivElement | null>;
className?: string;
cameraConfig: CameraConfig;
+ streamName: string;
preferredLiveMode: LivePlayerMode;
showStillWithoutActivity?: boolean;
+ useWebGL: boolean;
windowVisible?: boolean;
playAudio?: boolean;
+ volume?: number;
+ playInBackground: boolean;
micEnabled?: boolean; // only webrtc supports mic
iOSCompatFullScreen?: boolean;
pip?: boolean;
autoLive?: boolean;
+ showStats?: boolean;
onClick?: () => void;
  setFullResolution?: React.Dispatch<React.SetStateAction<VideoResolutionType>>;
onError?: (error: LivePlayerError) => void;
@@ -45,14 +52,19 @@ export default function LivePlayer({
containerRef,
className,
cameraConfig,
+ streamName,
preferredLiveMode,
showStillWithoutActivity = true,
+ useWebGL = false,
windowVisible = true,
playAudio = false,
+ volume,
+ playInBackground = false,
micEnabled = false,
iOSCompatFullScreen = false,
pip,
autoLive = true,
+ showStats = false,
onClick,
setFullResolution,
onError,
@@ -60,6 +72,18 @@ export default function LivePlayer({
}: LivePlayerProps) {
  const internalContainerRef = useRef<HTMLDivElement | null>(null);
+ // stats
+
+  const [stats, setStats] = useState<PlayerStatsType>({
+ streamType: "-",
+ bandwidth: 0, // in kbps
+ latency: undefined, // in seconds
+ totalFrames: 0,
+ droppedFrames: undefined,
+ decodedFrames: 0,
+ droppedFrameRate: 0, // percentage
+ });
+
// camera activity
const { activeMotion, activeTracking, objects, offline } =
@@ -144,6 +168,25 @@ export default function LivePlayer({
setLiveReady(false);
}, [preferredLiveMode]);
+ const [key, setKey] = useState(0);
+
+ const resetPlayer = () => {
+ setLiveReady(false);
+ setKey((prevKey) => prevKey + 1);
+ };
+
+ useEffect(() => {
+ if (streamName) {
+ resetPlayer();
+ }
+ }, [streamName]);
+
+ useEffect(() => {
+ if (showStillWithoutActivity && !autoLive) {
+ setLiveReady(false);
+ }
+ }, [showStillWithoutActivity, autoLive]);
+
const playerIsPlaying = useCallback(() => {
setLiveReady(true);
}, []);
@@ -153,15 +196,19 @@ export default function LivePlayer({
}
let player;
- if (!autoLive) {
+ if (!autoLive || !streamName) {
player = null;
} else if (preferredLiveMode == "webrtc") {
player = (
@@ -293,7 +348,7 @@ export default function LivePlayer({
)}
>
)}
+      {showStats && (
+        <PlayerStats stats={stats} minimal />
+      )}
);
}
diff --git a/web/src/components/player/MsePlayer.tsx b/web/src/components/player/MsePlayer.tsx
index 52cf8f99c..554eb5af1 100644
--- a/web/src/components/player/MsePlayer.tsx
+++ b/web/src/components/player/MsePlayer.tsx
@@ -1,5 +1,9 @@
import { baseUrl } from "@/api/baseUrl";
-import { LivePlayerError, VideoResolutionType } from "@/types/live";
+import {
+ LivePlayerError,
+ PlayerStatsType,
+ VideoResolutionType,
+} from "@/types/live";
import {
SetStateAction,
useCallback,
@@ -15,7 +19,11 @@ type MSEPlayerProps = {
className?: string;
playbackEnabled?: boolean;
audioEnabled?: boolean;
+ volume?: number;
+ playInBackground?: boolean;
pip?: boolean;
+ getStats?: boolean;
+ setStats?: (stats: PlayerStatsType) => void;
onPlaying?: () => void;
  setFullResolution?: React.Dispatch<SetStateAction<VideoResolutionType>>;
onError?: (error: LivePlayerError) => void;
@@ -26,7 +34,11 @@ function MSEPlayer({
className,
playbackEnabled = true,
audioEnabled = false,
+ volume,
+ playInBackground = false,
pip = false,
+ getStats = false,
+ setStats,
onPlaying,
setFullResolution,
onError,
@@ -57,6 +69,7 @@ function MSEPlayer({
const [connectTS, setConnectTS] = useState(0);
  const [bufferTimeout, setBufferTimeout] = useState<NodeJS.Timeout>();
const [errorCount, setErrorCount] = useState(0);
+ const totalBytesLoaded = useRef(0);
  const videoRef = useRef<HTMLVideoElement | null>(null);
  const wsRef = useRef<WebSocket | null>(null);
@@ -316,6 +329,8 @@ function MSEPlayer({
let bufLen = 0;
ondataRef.current = (data) => {
+ totalBytesLoaded.current += data.byteLength;
+
if (sb?.updating || bufLen > 0) {
const b = new Uint8Array(data);
buf.set(b, bufLen);
@@ -508,12 +523,22 @@ function MSEPlayer({
}
};
- document.addEventListener("visibilitychange", listener);
+ if (!playInBackground) {
+ document.addEventListener("visibilitychange", listener);
+ }
return () => {
- document.removeEventListener("visibilitychange", listener);
+ if (!playInBackground) {
+ document.removeEventListener("visibilitychange", listener);
+ }
};
- }, [playbackEnabled, visibilityCheck, onConnect, onDisconnect]);
+ }, [
+ playbackEnabled,
+ visibilityCheck,
+ playInBackground,
+ onConnect,
+ onDisconnect,
+ ]);
// control pip
@@ -525,6 +550,16 @@ function MSEPlayer({
videoRef.current.requestPictureInPicture();
}, [pip, videoRef]);
+ // control volume
+
+ useEffect(() => {
+ if (!videoRef.current || volume == undefined) {
+ return;
+ }
+
+ videoRef.current.volume = volume;
+ }, [volume, videoRef]);
+
// ensure we disconnect for slower connections
useEffect(() => {
@@ -542,6 +577,68 @@ function MSEPlayer({
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [playbackEnabled]);
+ // stats
+
+ useEffect(() => {
+ const video = videoRef.current;
+ let lastLoadedBytes = totalBytesLoaded.current;
+ let lastTimestamp = Date.now();
+
+ if (!getStats) return;
+
+ const updateStats = () => {
+ if (video) {
+ const now = Date.now();
+ const bytesLoaded = totalBytesLoaded.current;
+ const timeElapsed = (now - lastTimestamp) / 1000; // seconds
+ const bandwidth = (bytesLoaded - lastLoadedBytes) / timeElapsed / 1024; // kbps
+
+ lastLoadedBytes = bytesLoaded;
+ lastTimestamp = now;
+
+ const latency =
+ video.seekable.length > 0
+ ? Math.max(
+ 0,
+ video.seekable.end(video.seekable.length - 1) -
+ video.currentTime,
+ )
+ : 0;
+
+ const videoQuality = video.getVideoPlaybackQuality();
+ const { totalVideoFrames, droppedVideoFrames } = videoQuality;
+ const droppedFrameRate = totalVideoFrames
+ ? (droppedVideoFrames / totalVideoFrames) * 100
+ : 0;
+
+ setStats?.({
+ streamType: "MSE",
+ bandwidth,
+ latency,
+ totalFrames: totalVideoFrames,
+ droppedFrames: droppedVideoFrames || undefined,
+ decodedFrames: totalVideoFrames - droppedVideoFrames,
+ droppedFrameRate,
+ });
+ }
+ };
+
+ const interval = setInterval(updateStats, 1000); // Update every second
+
+ return () => {
+ clearInterval(interval);
+ setStats?.({
+ streamType: "-",
+ bandwidth: 0,
+ latency: undefined,
+ totalFrames: 0,
+ droppedFrames: undefined,
+ decodedFrames: 0,
+ droppedFrameRate: 0,
+ });
+ };
+ }, [setStats, getStats]);
+
return (
+
+ Stream Type: {" "}
+ {stats.streamType}
+
+
+ Bandwidth: {" "}
+ {stats.bandwidth.toFixed(2)} kbps
+
+ {stats.latency != undefined && (
+
+ Latency: {" "}
+ 2 ? "text-danger" : ""}`}
+ >
+ {stats.latency.toFixed(2)} seconds
+
+
+ )}
+
+ Total Frames: {" "}
+ {stats.totalFrames}
+
+ {stats.droppedFrames != undefined && (
+
+ Dropped Frames: {" "}
+ {stats.droppedFrames}
+
+ )}
+ {stats.decodedFrames != undefined && (
+
+ Decoded Frames: {" "}
+ {stats.decodedFrames}
+
+ )}
+ {stats.droppedFrameRate != undefined && (
+
+ Dropped Frame Rate: {" "}
+
+ {stats.droppedFrameRate.toFixed(2)}%
+
+
+ )}
+ >
+ );
+
+ const minimalStatsContent = (
+
+
+ Type
+ {stats.streamType}
+
+
+ Bandwidth {" "}
+ {stats.bandwidth.toFixed(2)} kbps
+
+ {stats.latency != undefined && (
+
+ Latency
+ = 2 ? "text-danger" : ""}`}
+ >
+ {stats.latency.toFixed(2)} sec
+
+
+ )}
+ {stats.droppedFrames != undefined && (
+
+ Dropped
+ {stats.droppedFrames} frames
+
+ )}
+
+ );
+
+ return (
+ <>
+
+ {minimal ? minimalStatsContent : fullStatsContent}
+
+ >
+ );
+}
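The MSE stats effect derives a dropped-frame percentage from `getVideoPlaybackQuality()`, guarding the divide-by-zero case with a ternary. Extracted as a pure function for clarity (name is ours):

```typescript
// Percentage of dropped frames out of total frames, returning 0 when no
// frames have been counted yet — the same guard the ternary above uses.
function droppedFrameRate(totalFrames: number, droppedFrames: number): number {
  return totalFrames ? (droppedFrames / totalFrames) * 100 : 0;
}
```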
diff --git a/web/src/components/player/WebRTCPlayer.tsx b/web/src/components/player/WebRTCPlayer.tsx
index 3498c31ff..b4c9ea6b2 100644
--- a/web/src/components/player/WebRTCPlayer.tsx
+++ b/web/src/components/player/WebRTCPlayer.tsx
@@ -1,5 +1,5 @@
import { baseUrl } from "@/api/baseUrl";
-import { LivePlayerError } from "@/types/live";
+import { LivePlayerError, PlayerStatsType } from "@/types/live";
import { useCallback, useEffect, useMemo, useRef, useState } from "react";
type WebRtcPlayerProps = {
@@ -7,9 +7,12 @@ type WebRtcPlayerProps = {
camera: string;
playbackEnabled?: boolean;
audioEnabled?: boolean;
+ volume?: number;
microphoneEnabled?: boolean;
iOSCompatFullScreen?: boolean; // ios doesn't support fullscreen divs so we must support the video element
pip?: boolean;
+ getStats?: boolean;
+ setStats?: (stats: PlayerStatsType) => void;
onPlaying?: () => void;
onError?: (error: LivePlayerError) => void;
};
@@ -19,9 +22,12 @@ export default function WebRtcPlayer({
camera,
playbackEnabled = true,
audioEnabled = false,
+ volume,
microphoneEnabled = false,
iOSCompatFullScreen = false,
pip = false,
+ getStats = false,
+ setStats,
onPlaying,
onError,
}: WebRtcPlayerProps) {
@@ -194,6 +200,16 @@ export default function WebRtcPlayer({
videoRef.current.requestPictureInPicture();
}, [pip, videoRef]);
+ // control volume
+
+ useEffect(() => {
+ if (!videoRef.current || volume == undefined) {
+ return;
+ }
+
+ videoRef.current.volume = volume;
+ }, [volume, videoRef]);
+
useEffect(() => {
videoLoadTimeoutRef.current = setTimeout(() => {
onError?.("stalled");
@@ -215,6 +231,75 @@ export default function WebRtcPlayer({
onPlaying?.();
};
+ // stats
+
+ useEffect(() => {
+ if (!pcRef.current || !getStats) return;
+
+ let lastBytesReceived = 0;
+ let lastTimestamp = 0;
+
+ const interval = setInterval(async () => {
+ if (pcRef.current && videoRef.current && !videoRef.current.paused) {
+ const report = await pcRef.current.getStats();
+ let bytesReceived = 0;
+ let timestamp = 0;
+ let roundTripTime = 0;
+ let framesReceived = 0;
+ let framesDropped = 0;
+ let framesDecoded = 0;
+
+ report.forEach((stat) => {
+ if (stat.type === "inbound-rtp" && stat.kind === "video") {
+ bytesReceived = stat.bytesReceived;
+ timestamp = stat.timestamp;
+ framesReceived = stat.framesReceived;
+ framesDropped = stat.framesDropped;
+ framesDecoded = stat.framesDecoded;
+ }
+ if (stat.type === "candidate-pair" && stat.state === "succeeded") {
+ roundTripTime = stat.currentRoundTripTime;
+ }
+ });
+
+ const timeDiff = (timestamp - lastTimestamp) / 1000; // in seconds
+ const bitrate =
+ timeDiff > 0
+ ? (bytesReceived - lastBytesReceived) / timeDiff / 1000
+ : 0; // in kbps
+
+ setStats?.({
+ streamType: "WebRTC",
+ bandwidth: Math.round(bitrate),
+ latency: roundTripTime,
+ totalFrames: framesReceived,
+ droppedFrames: framesDropped,
+ decodedFrames: framesDecoded,
+ droppedFrameRate:
+ framesReceived > 0 ? (framesDropped / framesReceived) * 100 : 0,
+ });
+
+ lastBytesReceived = bytesReceived;
+ lastTimestamp = timestamp;
+ }
+ }, 1000);
+
+ return () => {
+ clearInterval(interval);
+ setStats?.({
+ streamType: "-",
+ bandwidth: 0,
+ latency: undefined,
+ totalFrames: 0,
+ droppedFrames: undefined,
+ decodedFrames: 0,
+ droppedFrameRate: 0,
+ });
+ };
+ // we need to listen on the value of the ref
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, [pcRef, pcRef.current, getStats]);
+
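The WebRTC stats effect above computes bitrate from deltas of the cumulative `bytesReceived` counter in the `inbound-rtp` report, using the report's own timestamps (milliseconds). A minimal sketch of that delta calculation, with names of our choosing — note it divides bytes by 1000 per second, matching the hunk's arithmetic:

```typescript
// Delta-based bitrate from two samples of a cumulative byte counter,
// as in the WebRTC stats interval: timestamps are in milliseconds.
function bitrateKbps(
  bytesNow: number,
  bytesPrev: number,
  tsNowMs: number,
  tsPrevMs: number,
): number {
  const timeDiffSec = (tsNowMs - tsPrevMs) / 1000;
  // Guard the first sample (and clock glitches) where the diff is <= 0.
  return timeDiffSec > 0 ? (bytesNow - bytesPrev) / timeDiffSec / 1000 : 0;
}
```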
return (
+  setGroupStreamingSettings: React.Dispatch<SetStateAction<GroupStreamingSettings>>;
+  setIsDialogOpen: React.Dispatch<SetStateAction<boolean>>;
+ onSave?: (settings: GroupStreamingSettings) => void;
+};
+
+export function CameraStreamingDialog({
+ camera,
+ groupStreamingSettings,
+ setGroupStreamingSettings,
+ setIsDialogOpen,
+ onSave,
+}: CameraStreamingDialogProps) {
+  const { data: config } = useSWR<FrigateConfig>("config");
+
+ const [isLoading, setIsLoading] = useState(false);
+
+ const [streamName, setStreamName] = useState(
+ Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] || "",
+ );
+  const [streamType, setStreamType] = useState<StreamType>("smart");
+ const [compatibilityMode, setCompatibilityMode] = useState(false);
+
+ // metadata
+
+ const isRestreamed = useMemo(
+ () =>
+ config &&
+ Object.keys(config.go2rtc.streams || {}).includes(streamName ?? ""),
+ [config, streamName],
+ );
+
+  const { data: cameraMetadata } = useSWR<LiveStreamMetadata>(
+ isRestreamed ? `go2rtc/streams/${streamName}` : null,
+ {
+ revalidateOnFocus: false,
+ },
+ );
+
+ const supportsAudioOutput = useMemo(() => {
+ if (!cameraMetadata) {
+ return false;
+ }
+
+ return (
+ cameraMetadata.producers.find(
+ (prod) =>
+ prod.medias &&
+ prod.medias.find((media) => media.includes("audio, recvonly")) !=
+ undefined,
+ ) != undefined
+ );
+ }, [cameraMetadata]);
+
+ // handlers
+
+ useEffect(() => {
+ if (!config) {
+ return;
+ }
+ if (groupStreamingSettings && groupStreamingSettings[camera]) {
+ const cameraSettings = groupStreamingSettings[camera];
+ setStreamName(cameraSettings.streamName || "");
+ setStreamType(cameraSettings.streamType || "smart");
+ setCompatibilityMode(cameraSettings.compatibilityMode || false);
+ } else {
+ setStreamName(
+ Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] ||
+ "",
+ );
+ setStreamType("smart");
+ setCompatibilityMode(false);
+ }
+ }, [groupStreamingSettings, camera, config]);
+
+ const handleSave = useCallback(() => {
+ setIsLoading(true);
+ const updatedSettings = {
+ ...groupStreamingSettings,
+ [camera]: {
+ streamName,
+ streamType,
+ compatibilityMode,
+ playAudio: groupStreamingSettings?.[camera]?.playAudio ?? false,
+ volume: groupStreamingSettings?.[camera]?.volume ?? 1,
+ },
+ };
+
+ setGroupStreamingSettings(updatedSettings);
+ setIsDialogOpen(false);
+ setIsLoading(false);
+ onSave?.(updatedSettings);
+ }, [
+ groupStreamingSettings,
+ setGroupStreamingSettings,
+ camera,
+ streamName,
+ streamType,
+ compatibilityMode,
+ setIsDialogOpen,
+ onSave,
+ ]);
+
+ const handleCancel = useCallback(() => {
+ if (!config) {
+ return;
+ }
+ if (groupStreamingSettings && groupStreamingSettings[camera]) {
+ const cameraSettings = groupStreamingSettings[camera];
+ setStreamName(cameraSettings.streamName || "");
+ setStreamType(cameraSettings.streamType || "smart");
+ setCompatibilityMode(cameraSettings.compatibilityMode || false);
+ } else {
+ setStreamName(
+ Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] ||
+ "",
+ );
+ setStreamType("smart");
+ setCompatibilityMode(false);
+ }
+ setIsDialogOpen(false);
+ }, [groupStreamingSettings, camera, config, setIsDialogOpen]);
+
+ if (!config) {
+ return null;
+ }
+
+ return (
+
+
+
+ {camera.replaceAll("_", " ")} Streaming Settings
+
+
+ Change the live streaming options for this camera group's dashboard.{" "}
+ These settings are device/browser-specific.
+
+
+
+ {!isRestreamed && (
+
+
Stream
+
+
+
Restreaming is not enabled for this camera.
+
+
+
+
+ Info
+
+
+
+ Set up go2rtc for additional live view options and audio for
+ this camera.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+
+
+ )}
+ {isRestreamed &&
+ Object.entries(config?.cameras[camera].live.streams).length > 0 && (
+
+
+ Stream
+
+
+
+
+
+
+ {camera !== "birdseye" &&
+ Object.entries(config?.cameras[camera].live.streams).map(
+ ([name, stream]) => (
+
+ {name}
+
+ ),
+ )}
+
+
+ {supportsAudioOutput ? (
+ <>
+
+
Audio is available for this stream
+ >
+ ) : (
+ <>
+
+
Audio is unavailable for this stream
+
+
+
+
+ Info
+
+
+
+ Audio must be output from your camera and configured
+ in go2rtc for this stream.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+ >
+ )}
+
+
+
+ )}
+
+
+ Streaming Method
+
+
setStreamType(value as StreamType)}
+ >
+
+
+
+
+ No Streaming
+
+ Smart Streaming (recommended)
+
+ Continuous Streaming
+
+
+ {streamType === "no-streaming" && (
+
+ Camera images will only update once per minute and no live
+ streaming will occur.
+
+ )}
+ {streamType === "smart" && (
+
+ Smart streaming will update your camera image once per minute when
+ no detectable activity is occurring to conserve bandwidth and
+ resources. When activity is detected, the image seamlessly
+ switches to a live stream.
+
+ )}
+ {streamType === "continuous" && (
+ <>
+
+ Camera image will always be a live stream when visible on the
+ dashboard, even if no activity is being detected.
+
+
+
+
+ Continuous streaming may cause high bandwidth usage and
+ performance issues. Use with caution.
+
+
+ >
+ )}
+
+
+
+ setCompatibilityMode(!compatibilityMode)}
+ />
+
+ Compatibility mode
+
+
+
+
+ Enable this option only if your camera's live stream is displaying
+ color artifacts and has a diagonal line on the right side of the
+ image.
+
+
+
+
+
+
+
+ Cancel
+
+
+ {isLoading ? (
+
+ ) : (
+ "Save"
+ )}
+
+
+
+
+ );
+}
diff --git a/web/src/components/ui/slider.tsx b/web/src/components/ui/slider.tsx
index 1dde1df67..3cb4165e9 100644
--- a/web/src/components/ui/slider.tsx
+++ b/web/src/components/ui/slider.tsx
@@ -18,7 +18,7 @@ const Slider = React.forwardRef<
-
+
));
Slider.displayName = SliderPrimitive.Root.displayName;
@@ -36,9 +36,9 @@ const VolumeSlider = React.forwardRef<
{...props}
>
-
+
-
+
));
VolumeSlider.displayName = SliderPrimitive.Root.displayName;
@@ -58,7 +58,7 @@ const NoThumbSlider = React.forwardRef<
-
+
));
NoThumbSlider.displayName = SliderPrimitive.Root.displayName;
@@ -78,8 +78,8 @@ const DualThumbSlider = React.forwardRef<
-
-
+
+
));
DualThumbSlider.displayName = SliderPrimitive.Root.displayName;
diff --git a/web/src/context/providers.tsx b/web/src/context/providers.tsx
index fe5e931e7..61b4a6426 100644
--- a/web/src/context/providers.tsx
+++ b/web/src/context/providers.tsx
@@ -5,6 +5,7 @@ import { ApiProvider } from "@/api";
import { IconContext } from "react-icons";
import { TooltipProvider } from "@/components/ui/tooltip";
import { StatusBarMessagesProvider } from "@/context/statusbar-provider";
+import { StreamingSettingsProvider } from "./streaming-settings-provider";
type TProvidersProps = {
children: ReactNode;
@@ -17,7 +18,11 @@ function providers({ children }: TProvidersProps) {
-          {children}
+          <StreamingSettingsProvider>
+            <StatusBarMessagesProvider>
+              {children}
+            </StatusBarMessagesProvider>
+          </StreamingSettingsProvider>
diff --git a/web/src/context/streaming-settings-provider.tsx b/web/src/context/streaming-settings-provider.tsx
new file mode 100644
index 000000000..82558722e
--- /dev/null
+++ b/web/src/context/streaming-settings-provider.tsx
@@ -0,0 +1,68 @@
+import {
+ createContext,
+ useState,
+ useEffect,
+ ReactNode,
+ useContext,
+} from "react";
+import { AllGroupsStreamingSettings } from "@/types/frigateConfig";
+import { usePersistence } from "@/hooks/use-persistence";
+
+type StreamingSettingsContextType = {
+ allGroupsStreamingSettings: AllGroupsStreamingSettings;
+ setAllGroupsStreamingSettings: (settings: AllGroupsStreamingSettings) => void;
+ isPersistedStreamingSettingsLoaded: boolean;
+};
+
+const StreamingSettingsContext =
+  createContext<StreamingSettingsContextType | null>(null);
+
+export function StreamingSettingsProvider({
+ children,
+}: {
+ children: ReactNode;
+}) {
+ const [allGroupsStreamingSettings, setAllGroupsStreamingSettings] =
+    useState<AllGroupsStreamingSettings>({});
+
+ const [
+ persistedGroupStreamingSettings,
+ setPersistedGroupStreamingSettings,
+ isPersistedStreamingSettingsLoaded,
+  ] = usePersistence<AllGroupsStreamingSettings>("streaming-settings");
+
+ useEffect(() => {
+ if (isPersistedStreamingSettingsLoaded) {
+ setAllGroupsStreamingSettings(persistedGroupStreamingSettings ?? {});
+ }
+ }, [isPersistedStreamingSettingsLoaded, persistedGroupStreamingSettings]);
+
+ useEffect(() => {
+ if (Object.keys(allGroupsStreamingSettings).length) {
+ setPersistedGroupStreamingSettings(allGroupsStreamingSettings);
+ }
+ }, [allGroupsStreamingSettings, setPersistedGroupStreamingSettings]);
+
+  return (
+    <StreamingSettingsContext.Provider
+      value={{
+        allGroupsStreamingSettings,
+        setAllGroupsStreamingSettings,
+        isPersistedStreamingSettingsLoaded,
+      }}
+    >
+      {children}
+    </StreamingSettingsContext.Provider>
+  );
+}
+
+// eslint-disable-next-line react-refresh/only-export-components
+export function useStreamingSettings() {
+ const context = useContext(StreamingSettingsContext);
+ if (!context) {
+ throw new Error(
+ "useStreamingSettings must be used within a StreamingSettingsProvider",
+ );
+ }
+ return context;
+}
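The provider above does two things: hydrate in-memory state from persistence once it has loaded, and write back only non-empty settings so an initial `{}` never clobbers stored values. A framework-free model of that flow, with a plain `Map` standing in for `usePersistence` (all names here are ours):

```typescript
// Minimal model of the streaming-settings provider's persistence flow.
type Settings = Record<string, unknown>;

// Hydrate: use persisted settings only once loading has finished.
function hydrate(loaded: boolean, persisted: Settings | undefined): Settings {
  return loaded ? (persisted ?? {}) : {};
}

// Persist: skip empty objects so the initial state never overwrites storage.
function persist(store: Map<string, Settings>, key: string, settings: Settings): void {
  if (Object.keys(settings).length) {
    store.set(key, settings);
  }
}
```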
diff --git a/web/src/hooks/resize-observer.ts b/web/src/hooks/resize-observer.ts
index 57f55817a..1e174af7e 100644
--- a/web/src/hooks/resize-observer.ts
+++ b/web/src/hooks/resize-observer.ts
@@ -17,7 +17,15 @@ export function useResizeObserver(...refs: RefType[]) {
() =>
new ResizeObserver((entries) => {
window.requestAnimationFrame(() => {
- setDimensions(entries.map((entry) => entry.contentRect));
+ setDimensions((prevDimensions) => {
+ const newDimensions = entries.map((entry) => entry.contentRect);
+ if (
+ JSON.stringify(prevDimensions) !== JSON.stringify(newDimensions)
+ ) {
+ return newDimensions;
+ }
+ return prevDimensions;
+ });
});
}),
[],
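The resize-observer change publishes new dimensions only when they structurally differ from the previous ones, keeping the same array reference otherwise so React skips a re-render. The comparison can be isolated like this (`JSON.stringify` is the PR's own equality check; the function name is ours):

```typescript
// Return the previous array (same reference) when dimensions are
// structurally equal, so a React state setter becomes a no-op.
type Rect = { width: number; height: number };

function nextDimensions(prev: Rect[], next: Rect[]): Rect[] {
  return JSON.stringify(prev) !== JSON.stringify(next) ? next : prev;
}
```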
diff --git a/web/src/hooks/use-camera-live-mode.ts b/web/src/hooks/use-camera-live-mode.ts
index edf165951..238ac70cc 100644
--- a/web/src/hooks/use-camera-live-mode.ts
+++ b/web/src/hooks/use-camera-live-mode.ts
@@ -1,16 +1,29 @@
import { CameraConfig, FrigateConfig } from "@/types/frigateConfig";
import { useCallback, useEffect, useState } from "react";
import useSWR from "swr";
-import { LivePlayerMode } from "@/types/live";
+import { LivePlayerMode, LiveStreamMetadata } from "@/types/live";
export default function useCameraLiveMode(
cameras: CameraConfig[],
windowVisible: boolean,
) {
  const { data: config } = useSWR<FrigateConfig>("config");
+ const { data: allStreamMetadata } = useSWR<{
+ [key: string]: LiveStreamMetadata;
+ }>(config ? "go2rtc/streams" : null, { revalidateOnFocus: false });
+
const [preferredLiveModes, setPreferredLiveModes] = useState<{
[key: string]: LivePlayerMode;
}>({});
+ const [isRestreamedStates, setIsRestreamedStates] = useState<{
+ [key: string]: boolean;
+ }>({});
+ const [supportsAudioOutputStates, setSupportsAudioOutputStates] = useState<{
+ [key: string]: {
+ supportsAudio: boolean;
+ cameraName: string;
+ };
+ }>({});
useEffect(() => {
if (!cameras) return;
@@ -18,26 +31,56 @@ export default function useCameraLiveMode(
const mseSupported =
"MediaSource" in window || "ManagedMediaSource" in window;
- const newPreferredLiveModes = cameras.reduce(
- (acc, camera) => {
- const isRestreamed =
- config &&
- Object.keys(config.go2rtc.streams || {}).includes(
- camera.live.stream_name,
- );
+ const newPreferredLiveModes: { [key: string]: LivePlayerMode } = {};
+ const newIsRestreamedStates: { [key: string]: boolean } = {};
+ const newSupportsAudioOutputStates: {
+ [key: string]: { supportsAudio: boolean; cameraName: string };
+ } = {};
- if (!mseSupported) {
- acc[camera.name] = isRestreamed ? "webrtc" : "jsmpeg";
- } else {
- acc[camera.name] = isRestreamed ? "mse" : "jsmpeg";
- }
- return acc;
- },
- {} as { [key: string]: LivePlayerMode },
- );
+ cameras.forEach((camera) => {
+ const isRestreamed =
+ config &&
+ Object.keys(config.go2rtc.streams || {}).includes(
+ Object.values(camera.live.streams)[0],
+ );
+
+ newIsRestreamedStates[camera.name] = isRestreamed ?? false;
+
+ if (!mseSupported) {
+ newPreferredLiveModes[camera.name] = isRestreamed ? "webrtc" : "jsmpeg";
+ } else {
+ newPreferredLiveModes[camera.name] = isRestreamed ? "mse" : "jsmpeg";
+ }
+
+ // check each stream for audio support
+ if (isRestreamed) {
+ Object.values(camera.live.streams).forEach((streamName) => {
+ const metadata = allStreamMetadata?.[streamName];
+ newSupportsAudioOutputStates[streamName] = {
+ supportsAudio: metadata
+ ? metadata.producers.find(
+ (prod) =>
+ prod.medias &&
+ prod.medias.find((media) =>
+ media.includes("audio, recvonly"),
+ ) !== undefined,
+ ) !== undefined
+ : false,
+ cameraName: camera.name,
+ };
+ });
+ } else {
+ newSupportsAudioOutputStates[camera.name] = {
+ supportsAudio: false,
+ cameraName: camera.name,
+ };
+ }
+ });
setPreferredLiveModes(newPreferredLiveModes);
- }, [cameras, config, windowVisible]);
+ setIsRestreamedStates(newIsRestreamedStates);
+ setSupportsAudioOutputStates(newSupportsAudioOutputStates);
+ }, [cameras, config, windowVisible, allStreamMetadata]);
const resetPreferredLiveMode = useCallback(
(cameraName: string) => {
@@ -61,5 +104,11 @@ export default function useCameraLiveMode(
[config],
);
- return { preferredLiveModes, setPreferredLiveModes, resetPreferredLiveMode };
+ return {
+ preferredLiveModes,
+ setPreferredLiveModes,
+ resetPreferredLiveMode,
+ isRestreamedStates,
+ supportsAudioOutputStates,
+ };
}
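The mode-selection logic inside the hook above is a small decision table: restreamed cameras prefer MSE, falling back to WebRTC where `MediaSource` is unavailable, while non-restreamed cameras always use jsmpeg. Extracted as a pure function (name is ours):

```typescript
// Live mode selection from useCameraLiveMode: restreamed cameras use MSE
// when the browser supports it, otherwise WebRTC; everything else jsmpeg.
type Mode = "mse" | "webrtc" | "jsmpeg";

function pickLiveMode(mseSupported: boolean, isRestreamed: boolean): Mode {
  if (!isRestreamed) return "jsmpeg";
  return mseSupported ? "mse" : "webrtc";
}
```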
diff --git a/web/src/index.css b/web/src/index.css
index c657f22eb..9a294ff7c 100644
--- a/web/src/index.css
+++ b/web/src/index.css
@@ -180,6 +180,11 @@ html {
opacity: 0.5 !important;
}
+.react-grid-layout,
+.react-grid-layout .react-grid-item {
+ transition: none !important;
+}
+
.react-lazylog,
.react-lazylog-searchbar {
background-color: transparent !important;
diff --git a/web/src/pages/Settings.tsx b/web/src/pages/Settings.tsx
index 92d992f8a..e64620baa 100644
--- a/web/src/pages/Settings.tsx
+++ b/web/src/pages/Settings.tsx
@@ -37,6 +37,7 @@ import AuthenticationView from "@/views/settings/AuthenticationView";
import NotificationView from "@/views/settings/NotificationsSettingsView";
import SearchSettingsView from "@/views/settings/SearchSettingsView";
import UiSettingsView from "@/views/settings/UiSettingsView";
+import { useSearchEffect } from "@/hooks/use-overlay-state";
const allSettingsViews = [
"UI settings",
@@ -119,6 +120,21 @@ export default function Settings() {
}
}, [tabsRef, pageToggle]);
+ useSearchEffect("page", (page: string) => {
+ if (allSettingsViews.includes(page as SettingsType)) {
+ setPage(page as SettingsType);
+ }
+ return true;
+ });
+
+ useSearchEffect("camera", (camera: string) => {
+ const cameraNames = cameras.map((c) => c.name);
+ if (cameraNames.includes(camera)) {
+ setSelectedCamera(camera);
+ }
+ return true;
+ });
+
useEffect(() => {
document.title = "Settings - Frigate";
}, []);
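The `useSearchEffect` calls above apply `?page=` and `?camera=` query parameters only after validating them against known values. The validation can be sketched without the hook (function and parameter names are ours):

```typescript
// Read a query parameter and return it only when it is one of the
// allowed values, as the Settings page does for "page" and "camera".
function applySearchParam<T extends string>(
  search: string,
  key: string,
  allowed: readonly T[],
): T | undefined {
  const value = new URLSearchParams(search).get(key);
  return value !== null && (allowed as readonly string[]).includes(value)
    ? (value as T)
    : undefined;
}
```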
diff --git a/web/src/types/frigateConfig.ts b/web/src/types/frigateConfig.ts
index 4b293de29..8acdc5396 100644
--- a/web/src/types/frigateConfig.ts
+++ b/web/src/types/frigateConfig.ts
@@ -87,7 +87,7 @@ export interface CameraConfig {
live: {
height: number;
quality: number;
- stream_name: string;
+ streams: { [key: string]: string };
};
motion: {
contour_area: number;
@@ -175,10 +175,18 @@ export interface CameraConfig {
alerts: {
required_zones: string[];
labels: string[];
+ retain: {
+ days: number;
+ mode: string;
+ };
};
detections: {
required_zones: string[];
labels: string[];
+ retain: {
+ days: number;
+ mode: string;
+ };
};
};
rtmp: {
@@ -230,6 +238,24 @@ export type CameraGroupConfig = {
order: number;
};
+export type StreamType = "no-streaming" | "smart" | "continuous";
+
+export type CameraStreamingSettings = {
+ streamName: string;
+ streamType: StreamType;
+ compatibilityMode: boolean;
+ playAudio: boolean;
+ volume: number;
+};
+
+export type GroupStreamingSettings = {
+ [cameraName: string]: CameraStreamingSettings;
+};
+
+export type AllGroupsStreamingSettings = {
+ [groupName: string]: GroupStreamingSettings;
+};
+
export interface FrigateConfig {
audio: {
enabled: boolean;
@@ -326,12 +352,6 @@ export interface FrigateConfig {
camera_groups: { [groupName: string]: CameraGroupConfig };
- live: {
- height: number;
- quality: number;
- stream_name: string;
- };
-
logger: {
default: string;
logs: Record;
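These settings types are updated immutably: `onSaveMuting` in `DraggableGridLayout` rebuilds one group's entries with a new `playAudio` flag while leaving other groups untouched. A simplified sketch of that update under the types above (the function name is ours):

```typescript
// Set playAudio on every camera of one group, immutably, without
// touching other groups — the shape onSaveMuting produces.
type CameraSettings = { playAudio: boolean; volume: number };
type Group = Record<string, CameraSettings>;
type AllGroups = Record<string, Group>;

function setGroupAudio(all: AllGroups, group: string, playAudio: boolean): AllGroups {
  const existing = all[group] ?? {};
  const updated = Object.fromEntries(
    Object.entries(existing).map(([cam, settings]) => [cam, { ...settings, playAudio }]),
  );
  return { ...all, [group]: updated };
}
```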
diff --git a/web/src/types/live.ts b/web/src/types/live.ts
index f6c8e463a..ccc6b5e74 100644
--- a/web/src/types/live.ts
+++ b/web/src/types/live.ts
@@ -32,3 +32,17 @@ export type LiveStreamMetadata = {
};
export type LivePlayerError = "stalled" | "startup" | "mse-decode";
+
+export type AudioState = Record<string, boolean>;
+export type StatsState = Record<string, boolean>;
+export type VolumeState = Record<string, number>;
+
+export type PlayerStatsType = {
+ streamType: string;
+ bandwidth: number;
+ latency: number | undefined;
+ totalFrames: number;
+ droppedFrames: number | undefined;
+ decodedFrames: number | undefined;
+ droppedFrameRate: number | undefined;
+};
diff --git a/web/src/views/live/DraggableGridLayout.tsx b/web/src/views/live/DraggableGridLayout.tsx
index fc2d9bb52..131f44770 100644
--- a/web/src/views/live/DraggableGridLayout.tsx
+++ b/web/src/views/live/DraggableGridLayout.tsx
@@ -1,5 +1,6 @@
import { usePersistence } from "@/hooks/use-persistence";
import {
+ AllGroupsStreamingSettings,
BirdseyeConfig,
CameraConfig,
FrigateConfig,
@@ -20,7 +21,12 @@ import {
} from "react-grid-layout";
import "react-grid-layout/css/styles.css";
import "react-resizable/css/styles.css";
-import { LivePlayerError, LivePlayerMode } from "@/types/live";
+import {
+ AudioState,
+ LivePlayerMode,
+ StatsState,
+ VolumeState,
+} from "@/types/live";
import { ASPECT_VERTICAL_LAYOUT, ASPECT_WIDE_LAYOUT } from "@/types/record";
import { Skeleton } from "@/components/ui/skeleton";
import { useResizeObserver } from "@/hooks/resize-observer";
@@ -42,6 +48,8 @@ import {
} from "@/components/ui/tooltip";
import { Toaster } from "@/components/ui/sonner";
import useCameraLiveMode from "@/hooks/use-camera-live-mode";
+import LiveContextMenu from "@/components/menu/LiveContextMenu";
+import { useStreamingSettings } from "@/context/streaming-settings-provider";
type DraggableGridLayoutProps = {
cameras: CameraConfig[];
@@ -76,8 +84,26 @@ export default function DraggableGridLayout({
// preferred live modes per camera
- const { preferredLiveModes, setPreferredLiveModes, resetPreferredLiveMode } =
- useCameraLiveMode(cameras, windowVisible);
+ const {
+ preferredLiveModes,
+ setPreferredLiveModes,
+ resetPreferredLiveMode,
+ isRestreamedStates,
+ supportsAudioOutputStates,
+ } = useCameraLiveMode(cameras, windowVisible);
+
+ const [globalAutoLive] = usePersistence("autoLiveView", true);
+
+ const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
+ useStreamingSettings();
+
+ const currentGroupStreamingSettings = useMemo(() => {
+ if (cameraGroup && cameraGroup != "default" && allGroupsStreamingSettings) {
+ return allGroupsStreamingSettings[cameraGroup];
+ }
+ }, [allGroupsStreamingSettings, cameraGroup]);
+
+ // grid layout
const ResponsiveGridLayout = useMemo(() => WidthProvider(Responsive), []);
@@ -342,6 +368,105 @@ export default function DraggableGridLayout({
placeholder.h = layoutItem.h;
};
+ // audio and stats states
+
+  const [audioStates, setAudioStates] = useState<AudioState>({});
+  const [volumeStates, setVolumeStates] = useState<VolumeState>({});
+  const [statsStates, setStatsStates] = useState<StatsState>(() => {
+ const initialStates: StatsState = {};
+ cameras.forEach((camera) => {
+ initialStates[camera.name] = false;
+ });
+ return initialStates;
+ });
+
+ const toggleStats = (cameraName: string): void => {
+ setStatsStates((prev) => ({
+ ...prev,
+ [cameraName]: !prev[cameraName],
+ }));
+ };
+
+ useEffect(() => {
+ if (!allGroupsStreamingSettings) {
+ return;
+ }
+
+ const initialAudioStates: AudioState = {};
+ const initialVolumeStates: VolumeState = {};
+
+ Object.entries(allGroupsStreamingSettings).forEach(([_, groupSettings]) => {
+ Object.entries(groupSettings).forEach(([camera, cameraSettings]) => {
+ initialAudioStates[camera] = cameraSettings.playAudio ?? false;
+ initialVolumeStates[camera] = cameraSettings.volume ?? 1;
+ });
+ });
+
+ setAudioStates(initialAudioStates);
+ setVolumeStates(initialVolumeStates);
+ }, [allGroupsStreamingSettings]);
+
+ const toggleAudio = (cameraName: string) => {
+ setAudioStates((prev) => ({
+ ...prev,
+ [cameraName]: !prev[cameraName],
+ }));
+ };
+
+ const onSaveMuting = useCallback(
+ (playAudio: boolean) => {
+ if (!cameraGroup || !allGroupsStreamingSettings) {
+ return;
+ }
+
+ const existingGroupSettings =
+ allGroupsStreamingSettings[cameraGroup] || {};
+
+ const updatedSettings: AllGroupsStreamingSettings = {
+ ...Object.fromEntries(
+ Object.entries(allGroupsStreamingSettings || {}).filter(
+ ([key]) => key !== cameraGroup,
+ ),
+ ),
+ [cameraGroup]: {
+ ...existingGroupSettings,
+ ...Object.fromEntries(
+ Object.entries(existingGroupSettings).map(
+ ([cameraName, settings]) => [
+ cameraName,
+ {
+ ...settings,
+ playAudio: playAudio,
+ },
+ ],
+ ),
+ ),
+ },
+ };
+
+ setAllGroupsStreamingSettings?.(updatedSettings);
+ },
+ [cameraGroup, allGroupsStreamingSettings, setAllGroupsStreamingSettings],
+ );
+
+ const muteAll = () => {
+ const updatedStates: AudioState = {};
+ cameras.forEach((camera) => {
+ updatedStates[camera.name] = false;
+ });
+ setAudioStates(updatedStates);
+ onSaveMuting(false);
+ };
+
+ const unmuteAll = () => {
+ const updatedStates: AudioState = {};
+ cameras.forEach((camera) => {
+ updatedStates[camera.name] = true;
+ });
+ setAudioStates(updatedStates);
+ onSaveMuting(true);
+ };
+
return (
<>
@@ -364,7 +489,7 @@ export default function DraggableGridLayout({
) : (
{
- !isEditMode && onSelectCamera(camera.name);
- }}
- onError={(e) => {
- setPreferredLiveModes((prevModes) => {
- const newModes = { ...prevModes };
- if (e === "mse-decode") {
- newModes[camera.name] = "webrtc";
- } else {
- newModes[camera.name] = "jsmpeg";
- }
- return newModes;
- });
- }}
- onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
+ isRestreamed={isRestreamedStates[camera.name]}
+ supportsAudio={
+ supportsAudioOutputStates[streamName].supportsAudio
+ }
+ audioState={audioStates[camera.name]}
+ toggleAudio={() => toggleAudio(camera.name)}
+ statsState={statsStates[camera.name]}
+ toggleStats={() => toggleStats(camera.name)}
+ volumeState={volumeStates[camera.name]}
+ setVolumeState={(value) =>
+ setVolumeStates({
+ [camera.name]: value,
+ })
+ }
+ muteAll={muteAll}
+ unmuteAll={unmuteAll}
+ resetPreferredLiveMode={() =>
+ resetPreferredLiveMode(camera.name)
+ }
>
+ {
+ !isEditMode && onSelectCamera(camera.name);
+ }}
+ onError={(e) => {
+ setPreferredLiveModes((prevModes) => {
+ const newModes = { ...prevModes };
+ if (e === "mse-decode") {
+ newModes[camera.name] = "webrtc";
+ } else {
+ newModes[camera.name] = "jsmpeg";
+ }
+ return newModes;
+ });
+ }}
+ onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
+ playAudio={audioStates[camera.name]}
+ volume={volumeStates[camera.name]}
+ />
{isEditMode && showCircles && }
-
+
);
})}
@@ -596,41 +768,57 @@ const BirdseyeLivePlayerGridItem = React.forwardRef<
},
);
-type LivePlayerGridItemProps = {
+type GridLiveContextMenuProps = {
+ className?: string;
style?: React.CSSProperties;
- className: string;
onMouseDown?: React.MouseEventHandler;
onMouseUp?: React.MouseEventHandler;
onTouchEnd?: React.TouchEventHandler;
children?: React.ReactNode;
- cameraRef: (node: HTMLElement | null) => void;
- windowVisible: boolean;
- cameraConfig: CameraConfig;
- preferredLiveMode: LivePlayerMode;
- onClick: () => void;
- onError: (e: LivePlayerError) => void;
- onResetLiveMode: () => void;
+ camera: string;
+ streamName: string;
+ cameraGroup: string;
+ preferredLiveMode: string;
+ isRestreamed: boolean;
+ supportsAudio: boolean;
+ audioState: boolean;
+ toggleAudio: () => void;
+ statsState: boolean;
+ toggleStats: () => void;
+ volumeState?: number;
+ setVolumeState: (volumeState: number) => void;
+ muteAll: () => void;
+ unmuteAll: () => void;
+ resetPreferredLiveMode: () => void;
};
-const LivePlayerGridItem = React.forwardRef<
+const GridLiveContextMenu = React.forwardRef<
HTMLDivElement,
- LivePlayerGridItemProps
+ GridLiveContextMenuProps
>(
(
{
- style,
className,
+ style,
onMouseDown,
onMouseUp,
onTouchEnd,
children,
- cameraRef,
- windowVisible,
- cameraConfig,
+ camera,
+ streamName,
+ cameraGroup,
preferredLiveMode,
- onClick,
- onError,
- onResetLiveMode,
+ isRestreamed,
+ supportsAudio,
+ audioState,
+ toggleAudio,
+ statsState,
+ toggleStats,
+ volumeState,
+ setVolumeState,
+ muteAll,
+ unmuteAll,
+ resetPreferredLiveMode,
...props
},
ref,
@@ -644,18 +832,26 @@ const LivePlayerGridItem = React.forwardRef<
onTouchEnd={onTouchEnd}
{...props}
>
- }
- />
- {children}
+ isRestreamed={isRestreamed}
+ supportsAudio={supportsAudio}
+ audioState={audioState}
+ toggleAudio={toggleAudio}
+ statsState={statsState}
+ toggleStats={toggleStats}
+ volumeState={volumeState}
+ setVolumeState={setVolumeState}
+ muteAll={muteAll}
+ unmuteAll={unmuteAll}
+ resetPreferredLiveMode={resetPreferredLiveMode}
+ >
+ {children}
+
);
},
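Reviewer note: the `onSaveMuting` callback above rebuilds the per-group streaming-settings map immutably, applying one `playAudio` value to every camera in the target group while leaving other groups untouched. A minimal standalone sketch of that update pattern (the `AllGroups`/`GroupSettings` types here are simplified stand-ins for the app's `AllGroupsStreamingSettings`, and `setGroupAudio` is an illustrative name, not a function in the PR):

```typescript
// Simplified stand-ins for the app's AllGroupsStreamingSettings shape.
type GroupSettings = Record<string, { playAudio: boolean; volume?: number }>;
type AllGroups = Record<string, GroupSettings>;

// Immutably set playAudio for every camera in one group,
// copying other groups through by reference.
function setGroupAudio(
  all: AllGroups,
  group: string,
  playAudio: boolean,
): AllGroups {
  const existing = all[group] ?? {};
  return {
    // keep every other group as-is
    ...Object.fromEntries(
      Object.entries(all).filter(([key]) => key !== group),
    ),
    // rebuild the target group with the new playAudio on each camera
    [group]: Object.fromEntries(
      Object.entries(existing).map(([cam, s]) => [cam, { ...s, playAudio }]),
    ),
  };
}

const before: AllGroups = {
  yard: { front: { playAudio: false }, back: { playAudio: true } },
  indoor: { kitchen: { playAudio: false } },
};
const after = setGroupAudio(before, "yard", true);
console.log(after.yard.front.playAudio); // true
console.log(after.indoor === before.indoor); // true — untouched group is shared
```

Because the result is a fresh object, passing it to a React state setter (as `setAllGroupsStreamingSettings` does above) triggers a re-render without mutating the previous state.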
diff --git a/web/src/views/live/LiveCameraView.tsx b/web/src/views/live/LiveCameraView.tsx
index af3ed0cee..ccf06de7b 100644
--- a/web/src/views/live/LiveCameraView.tsx
+++ b/web/src/views/live/LiveCameraView.tsx
@@ -17,6 +17,11 @@ import {
DropdownMenuItem,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
+import {
+ Popover,
+ PopoverContent,
+ PopoverTrigger,
+} from "@/components/ui/popover";
import {
Tooltip,
TooltipContent,
@@ -62,29 +67,52 @@ import {
FaMicrophoneSlash,
} from "react-icons/fa";
import { GiSpeaker, GiSpeakerOff } from "react-icons/gi";
-import { TbViewfinder, TbViewfinderOff } from "react-icons/tb";
-import { IoMdArrowRoundBack } from "react-icons/io";
import {
+ TbRecordMail,
+ TbRecordMailOff,
+ TbViewfinder,
+ TbViewfinderOff,
+} from "react-icons/tb";
+import { IoIosWarning, IoMdArrowRoundBack } from "react-icons/io";
+import {
+ LuCheck,
LuEar,
LuEarOff,
+ LuExternalLink,
LuHistory,
+ LuInfo,
LuPictureInPicture,
LuVideo,
LuVideoOff,
+ LuX,
} from "react-icons/lu";
import {
MdNoPhotography,
+ MdOutlineRestartAlt,
MdPersonOff,
MdPersonSearch,
MdPhotoCamera,
MdZoomIn,
MdZoomOut,
} from "react-icons/md";
-import { useNavigate } from "react-router-dom";
+import { Link, useNavigate } from "react-router-dom";
import { TransformWrapper, TransformComponent } from "react-zoom-pan-pinch";
import useSWR from "swr";
import { cn } from "@/lib/utils";
import { useSessionPersistence } from "@/hooks/use-session-persistence";
+import {
+ Select,
+ SelectContent,
+ SelectGroup,
+ SelectItem,
+ SelectTrigger,
+} from "@/components/ui/select";
+import { usePersistence } from "@/hooks/use-persistence";
+import { Label } from "@/components/ui/label";
+import { Switch } from "@/components/ui/switch";
+import axios from "axios";
+import { toast } from "sonner";
+import { Toaster } from "@/components/ui/sonner";
type LiveCameraViewProps = {
config?: FrigateConfig;
@@ -109,17 +137,20 @@ export default function LiveCameraView({
// supported features
+ const [streamName, setStreamName] = usePersistence(
+ `${camera.name}-stream`,
+ Object.values(camera.live.streams)[0],
+ );
+
const isRestreamed = useMemo(
() =>
config &&
- Object.keys(config.go2rtc.streams || {}).includes(
- camera.live.stream_name,
- ),
- [camera, config],
+ Object.keys(config.go2rtc.streams || {}).includes(streamName ?? ""),
+ [config, streamName],
);
const { data: cameraMetadata } = useSWR(
- isRestreamed ? `go2rtc/streams/${camera.live.stream_name}` : null,
+ isRestreamed ? `go2rtc/streams/${streamName}` : null,
{
revalidateOnFocus: false,
},
@@ -209,6 +240,13 @@ export default function LiveCameraView({
const [pip, setPip] = useState(false);
const [lowBandwidth, setLowBandwidth] = useState(false);
+ const [playInBackground, setPlayInBackground] = usePersistence(
+ `${camera.name}-background-play`,
+ false,
+ );
+
+ const [showStats, setShowStats] = useState(false);
+
const [fullResolution, setFullResolution] = useState({
width: 0,
height: 0,
@@ -337,6 +375,7 @@ export default function LiveCameraView({
return (
+
)}
@@ -499,9 +549,13 @@ export default function LiveCameraView({
showStillWithoutActivity={false}
cameraConfig={camera}
playAudio={audio}
+ playInBackground={playInBackground ?? false}
+ showStats={showStats}
micEnabled={mic}
iOSCompatFullScreen={isIOS}
preferredLiveMode={preferredLiveMode}
+ useWebGL={true}
+ streamName={streamName ?? ""}
pip={pip}
containerRef={containerRef}
setFullResolution={setFullResolution}
@@ -816,12 +870,49 @@ function PtzControlPanel({
);
}
+function OnDemandRetentionMessage({ camera }: { camera: CameraConfig }) {
+ const rankMap = { all: 0, motion: 1, active_objects: 2 };
+ const getValidMode = (retain?: { mode?: string }): keyof typeof rankMap => {
+ const mode = retain?.mode;
+ return mode && mode in rankMap ? (mode as keyof typeof rankMap) : "all";
+ };
+
+ const recordRetainMode = getValidMode(camera.record.retain);
+ const alertsRetainMode = getValidMode(camera.review.alerts.retain);
+
+ const effectiveRetainMode =
+ rankMap[alertsRetainMode] < rankMap[recordRetainMode]
+ ? recordRetainMode
+ : alertsRetainMode;
+
+ const source = effectiveRetainMode === recordRetainMode ? "camera" : "alerts";
+
+ return effectiveRetainMode !== "all" ? (
+
+ Your {source} recording retention configuration is set to{" "}
+ mode: {effectiveRetainMode}, so this on-demand recording will
+ only keep segments with {effectiveRetainMode.replaceAll("_", " ")}.
+
+ ) : null;
+}
+
type FrigateCameraFeaturesProps = {
- camera: string;
+ camera: CameraConfig;
recordingEnabled: boolean;
audioDetectEnabled: boolean;
autotrackingEnabled: boolean;
fullscreen: boolean;
+ streamName: string;
+ setStreamName?: (value: string | undefined) => void;
+ preferredLiveMode: string;
+ playInBackground: boolean;
+ setPlayInBackground: (value: boolean | undefined) => void;
+ showStats: boolean;
+ setShowStats: (value: boolean) => void;
+ isRestreamed: boolean;
+ setLowBandwidth: React.Dispatch>;
+ supportsAudioOutput: boolean;
+ supports2WayTalk: boolean;
};
function FrigateCameraFeatures({
camera,
@@ -829,14 +920,124 @@ function FrigateCameraFeatures({
audioDetectEnabled,
autotrackingEnabled,
fullscreen,
+ streamName,
+ setStreamName,
+ preferredLiveMode,
+ playInBackground,
+ setPlayInBackground,
+ showStats,
+ setShowStats,
+ isRestreamed,
+ setLowBandwidth,
+ supportsAudioOutput,
+ supports2WayTalk,
}: FrigateCameraFeaturesProps) {
- const { payload: detectState, send: sendDetect } = useDetectState(camera);
- const { payload: recordState, send: sendRecord } = useRecordingsState(camera);
- const { payload: snapshotState, send: sendSnapshot } =
- useSnapshotsState(camera);
- const { payload: audioState, send: sendAudio } = useAudioState(camera);
+ const { payload: detectState, send: sendDetect } = useDetectState(
+ camera.name,
+ );
+ const { payload: recordState, send: sendRecord } = useRecordingsState(
+ camera.name,
+ );
+ const { payload: snapshotState, send: sendSnapshot } = useSnapshotsState(
+ camera.name,
+ );
+ const { payload: audioState, send: sendAudio } = useAudioState(camera.name);
const { payload: autotrackingState, send: sendAutotracking } =
- useAutotrackingState(camera);
+ useAutotrackingState(camera.name);
+
+ // manual event
+
+ const recordingEventIdRef = useRef(null);
+ const [isRecording, setIsRecording] = useState(false);
+ const [activeToastId, setActiveToastId] = useState(
+ null,
+ );
+
+ const createEvent = useCallback(async () => {
+ try {
+ const response = await axios.post(
+ `events/${camera.name}/on_demand/create`,
+ {
+ include_recording: true,
+ duration: null,
+ },
+ );
+
+ if (response.data.success) {
+ recordingEventIdRef.current = response.data.event_id;
+ setIsRecording(true);
+ const toastId = toast.success(
+
+
+ Started manual on-demand recording.
+
+ {!camera.record.enabled || camera.record.retain.days == 0 ? (
+
+ Since recording is disabled or restricted in the config for this
+ camera, only a snapshot will be saved.
+
+ ) : (
+
+ )}
+
,
+ {
+ position: "top-center",
+ duration: 10000,
+ },
+ );
+ setActiveToastId(toastId);
+ }
+ } catch (error) {
+ toast.error("Failed to start manual on-demand recording.", {
+ position: "top-center",
+ });
+ }
+ }, [camera]);
+
+ const endEvent = useCallback(() => {
+ if (activeToastId) {
+ toast.dismiss(activeToastId);
+ }
+ try {
+ if (recordingEventIdRef.current) {
+ axios.put(`events/${recordingEventIdRef.current}/end`, {
+ end_time: Math.ceil(Date.now() / 1000),
+ });
+ recordingEventIdRef.current = null;
+ setIsRecording(false);
+ toast.success("Ended manual on-demand recording.", {
+ position: "top-center",
+ });
+ }
+ } catch (error) {
+ toast.error("Failed to end manual on-demand recording.", {
+ position: "top-center",
+ });
+ }
+ }, [activeToastId]);
+
+ const handleEventButtonClick = useCallback(() => {
+ if (isRecording) {
+ endEvent();
+ } else {
+ createEvent();
+ }
+ }, [createEvent, endEvent, isRecording]);
+
+ useEffect(() => {
+ // ensure manual event is stopped when component unmounts
+ return () => {
+ if (recordingEventIdRef.current) {
+ endEvent();
+ }
+ };
+ // mount/unmount only
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, []);
+
+ // navigate for debug view
+
+ const navigate = useNavigate();
// desktop shows icons part of row
if (isDesktop || isTablet) {
@@ -888,6 +1089,264 @@ function FrigateCameraFeatures({
}
/>
)}
+
+
+
+
+
+
+
+
+
+
+ {!isRestreamed && (
+
+
Stream
+
+
+
Restreaming is not enabled for this camera.
+
+
+
+
+ Info
+
+
+
+ Set up go2rtc for additional live view options and audio
+ for this camera.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+
+
+ )}
+ {isRestreamed &&
+ Object.values(camera.live.streams).length > 0 && (
+
+
Stream
+
{
+ setStreamName?.(value);
+ }}
+ >
+
+ {Object.keys(camera.live.streams).find(
+ (key) => camera.live.streams[key] === streamName,
+ )}
+
+
+
+
+ {Object.entries(camera.live.streams).map(
+ ([stream, name]) => (
+
+ {stream}
+
+ ),
+ )}
+
+
+
+
+ {preferredLiveMode != "jsmpeg" && isRestreamed && (
+
+ {supportsAudioOutput ? (
+ <>
+
+
Audio is available for this stream
+ >
+ ) : (
+ <>
+
+
Audio is unavailable for this stream
+
+
+
+
+ Info
+
+
+
+ Audio must be output from your camera and
+ configured in go2rtc for this stream.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+ >
+ )}
+
+ )}
+ {preferredLiveMode != "jsmpeg" &&
+ isRestreamed &&
+ supportsAudioOutput && (
+
+ {supports2WayTalk ? (
+ <>
+
+
+ Two-way talk is available for this stream
+
+ >
+ ) : (
+ <>
+
+
+ Two-way talk is unavailable for this stream
+
+
+
+
+
+ Info
+
+
+
+ Your device must support the feature and
+ WebRTC must be configured for two-way talk.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+ >
+ )}
+
+ )}
+
+ {preferredLiveMode == "jsmpeg" && isRestreamed && (
+
+
+
+
+
+ Live view is in low-bandwidth mode due to buffering
+ or stream errors.
+
+
+
setLowBandwidth(false)}
+ >
+
+
+ Reset stream
+
+
+
+ )}
+
+ )}
+ {isRestreamed && (
+
+
+
+ Play in background
+
+
+ setPlayInBackground(checked)
+ }
+ />
+
+
+ Enable this option to continue streaming when the player is
+ hidden.
+
+
+ )}
+
+
+
+ Show stream stats
+
+ setShowStats(checked)}
+ />
+
+
+ Enable this option to show stream statistics as an overlay on
+ the camera feed.
+
+
+
+
+ Debug View
+
+ navigate(`/settings?page=debug&camera=${camera.name}`)
+ }
+ className="ml-2 inline-flex size-5 cursor-pointer"
+ />
+
+
+
+
+
>
);
}
@@ -908,44 +1367,276 @@ function FrigateCameraFeatures({
title={`${camera} Settings`}
/>
-
- sendDetect(detectState == "ON" ? "OFF" : "ON")}
- />
- {recordingEnabled && (
+
+
- sendRecord(recordState == "ON" ? "OFF" : "ON")
+ sendDetect(detectState == "ON" ? "OFF" : "ON")
}
/>
- )}
-
- sendSnapshot(snapshotState == "ON" ? "OFF" : "ON")
- }
- />
- {audioDetectEnabled && (
+ {recordingEnabled && (
+
+ sendRecord(recordState == "ON" ? "OFF" : "ON")
+ }
+ />
+ )}
sendAudio(audioState == "ON" ? "OFF" : "ON")}
- />
- )}
- {autotrackingEnabled && (
-
- sendAutotracking(autotrackingState == "ON" ? "OFF" : "ON")
+ sendSnapshot(snapshotState == "ON" ? "OFF" : "ON")
}
/>
- )}
+ {audioDetectEnabled && (
+
+ sendAudio(audioState == "ON" ? "OFF" : "ON")
+ }
+ />
+ )}
+ {autotrackingEnabled && (
+
+ sendAutotracking(autotrackingState == "ON" ? "OFF" : "ON")
+ }
+ />
+ )}
+
+
+ {!isRestreamed && (
+
+
Stream
+
+
+
Restreaming is not enabled for this camera.
+
+
+
+
+ Info
+
+
+
+ Set up go2rtc for additional live view options and audio for
+ this camera.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+
+
+ )}
+ {isRestreamed && Object.values(camera.live.streams).length > 0 && (
+
+
Stream
+
{
+ setStreamName?.(value);
+ }}
+ >
+
+ {Object.keys(camera.live.streams).find(
+ (key) => camera.live.streams[key] === streamName,
+ )}
+
+
+
+
+ {Object.entries(camera.live.streams).map(
+ ([stream, name]) => (
+
+ {stream}
+
+ ),
+ )}
+
+
+
+ {preferredLiveMode != "jsmpeg" && isRestreamed && (
+
+ {supportsAudioOutput ? (
+ <>
+
+
Audio is available for this stream
+ >
+ ) : (
+ <>
+
+
Audio is unavailable for this stream
+
+
+
+
+ Info
+
+
+
+ Audio must be output from your camera and configured
+ in go2rtc for this stream.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+ >
+ )}
+
+ )}
+ {preferredLiveMode != "jsmpeg" &&
+ isRestreamed &&
+ supportsAudioOutput && (
+
+ {supports2WayTalk ? (
+ <>
+
+
Two-way talk is available for this stream
+ >
+ ) : (
+ <>
+
+
Two-way talk is unavailable for this stream
+
+
+
+
+ Info
+
+
+
+ Your device must support the feature and WebRTC
+ must be configured for two-way talk.
+
+
+ Read the documentation{" "}
+
+
+
+
+
+ >
+ )}
+
+ )}
+ {preferredLiveMode == "jsmpeg" && isRestreamed && (
+
+
+
+
+
+ Live view is in low-bandwidth mode due to buffering or
+ stream errors.
+
+
+
setLowBandwidth(false)}
+ >
+
+ Reset stream
+
+
+ )}
+
+ )}
+
+
+ On-Demand Recording
+
+
+ {isRecording ? "End" : "Start"} on-demand recording
+
+
+ Start a manual event based on this camera's recording retention
+ settings.
+
+
+ {isRestreamed && (
+ <>
+
+
{
+ setPlayInBackground(checked);
+ }}
+ />
+
+ Enable this option to continue streaming when the player is
+ hidden.
+
+
+
+
{
+ setShowStats(checked);
+ }}
+ />
+
+ Enable this option to show stream statistics as an overlay on
+ the camera feed.
+
+
+ >
+ )}
+
+
+ Debug View
+
+ navigate(`/settings?page=debug&camera=${camera.name}`)
+ }
+ className="ml-2 inline-flex size-5 cursor-pointer"
+ />
+
+
+
);
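Reviewer note: `OnDemandRetentionMessage` above warns the user when an on-demand recording will be trimmed by retention. The rule it implements: the effective mode is whichever of the camera's `record.retain.mode` and `review.alerts.retain.mode` is more restrictive, with unknown or missing values coerced to `"all"`. A self-contained sketch of that logic (extracting the comparison into a standalone `effectiveRetainMode` helper, which is an illustrative refactor, not code from the PR):

```typescript
// Retention modes ordered from least to most restrictive,
// mirroring the rankMap in OnDemandRetentionMessage.
const rankMap = { all: 0, motion: 1, active_objects: 2 } as const;
type RetainMode = keyof typeof rankMap;

// Coerce an unknown or missing mode to the default "all".
function getValidMode(mode?: string): RetainMode {
  return mode && mode in rankMap ? (mode as RetainMode) : "all";
}

// The effective mode is the more restrictive (higher-ranked) of the two.
function effectiveRetainMode(
  recordMode?: string,
  alertsMode?: string,
): RetainMode {
  const rec = getValidMode(recordMode);
  const alerts = getValidMode(alertsMode);
  return rankMap[alerts] < rankMap[rec] ? rec : alerts;
}

console.log(effectiveRetainMode("motion", "all")); // "motion"
console.log(effectiveRetainMode(undefined, "active_objects")); // "active_objects"
console.log(effectiveRetainMode("all", "all")); // "all" — no warning rendered
```

The component only renders the warning when the effective mode is not `"all"`, and attributes it to whichever config (camera recording vs. alerts) supplied the stricter mode.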
diff --git a/web/src/views/live/LiveDashboardView.tsx b/web/src/views/live/LiveDashboardView.tsx
index 7642d5a0d..363405023 100644
--- a/web/src/views/live/LiveDashboardView.tsx
+++ b/web/src/views/live/LiveDashboardView.tsx
@@ -28,10 +28,16 @@ import DraggableGridLayout from "./DraggableGridLayout";
import { IoClose } from "react-icons/io5";
import { LuLayoutDashboard } from "react-icons/lu";
import { cn } from "@/lib/utils";
-import { LivePlayerError } from "@/types/live";
+import {
+ AudioState,
+ LivePlayerError,
+ StatsState,
+ VolumeState,
+} from "@/types/live";
import { FaCompress, FaExpand } from "react-icons/fa";
import useCameraLiveMode from "@/hooks/use-camera-live-mode";
import { useResizeObserver } from "@/hooks/resize-observer";
+import LiveContextMenu from "@/components/menu/LiveContextMenu";
type LiveDashboardViewProps = {
cameras: CameraConfig[];
@@ -184,8 +190,13 @@ export default function LiveDashboardView({
};
}, []);
- const { preferredLiveModes, setPreferredLiveModes, resetPreferredLiveMode } =
- useCameraLiveMode(cameras, windowVisible);
+ const {
+ preferredLiveModes,
+ setPreferredLiveModes,
+ resetPreferredLiveMode,
+ isRestreamedStates,
+ supportsAudioOutputStates,
+ } = useCameraLiveMode(cameras, windowVisible);
const cameraRef = useCallback(
(node: HTMLElement | null) => {
@@ -221,9 +232,45 @@ export default function LiveDashboardView({
[setPreferredLiveModes],
);
+ // audio states
+
+ const [audioStates, setAudioStates] = useState({});
+ const [volumeStates, setVolumeStates] = useState({});
+ const [statsStates, setStatsStates] = useState({});
+
+ const toggleStats = (cameraName: string): void => {
+ setStatsStates((prev) => ({
+ ...prev,
+ [cameraName]: !prev[cameraName],
+ }));
+ };
+
+ const toggleAudio = (cameraName: string): void => {
+ setAudioStates((prev) => ({
+ ...prev,
+ [cameraName]: !prev[cameraName],
+ }));
+ };
+
+ const muteAll = (): void => {
+ const updatedStates: Record = {};
+ visibleCameras.forEach((cameraName) => {
+ updatedStates[cameraName] = false;
+ });
+ setAudioStates(updatedStates);
+ };
+
+ const unmuteAll = (): void => {
+ const updatedStates: Record = {};
+ visibleCameras.forEach((cameraName) => {
+ updatedStates[cameraName] = true;
+ });
+ setAudioStates(updatedStates);
+ };
+
return (
{isMobile && (
@@ -346,20 +393,56 @@ export default function LiveDashboardView({
grow = "aspect-video";
}
return (
- onSelectCamera(camera.name)}
- onError={(e) => handleError(camera.name, e)}
- onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
- />
+ isRestreamed={isRestreamedStates[camera.name]}
+ supportsAudio={
+ supportsAudioOutputStates[
+ Object.values(camera.live.streams)?.[0]
+ ]?.supportsAudio ?? false
+ }
+ audioState={audioStates[camera.name]}
+ toggleAudio={() => toggleAudio(camera.name)}
+ statsState={statsStates[camera.name]}
+ toggleStats={() => toggleStats(camera.name)}
+ volumeState={volumeStates[camera.name] ?? 1}
+ setVolumeState={(value) =>
+ setVolumeStates({
+ [camera.name]: value,
+ })
+ }
+ muteAll={muteAll}
+ unmuteAll={unmuteAll}
+ resetPreferredLiveMode={() =>
+ resetPreferredLiveMode(camera.name)
+ }
+ >
+ onSelectCamera(camera.name)}
+ onError={(e) => handleError(camera.name, e)}
+ onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
+ playAudio={audioStates[camera.name] ?? false}
+ volume={volumeStates[camera.name]}
+ />
+
);
})}
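Reviewer note: the dashboard's audio/stats handlers above all follow one pattern — a `Record<string, boolean>` keyed by camera name, updated immutably via React's functional setState. A minimal sketch of the two update shapes used (`toggle` and `setAll` are illustrative names for the bodies of `toggleAudio`/`toggleStats` and `muteAll`/`unmuteAll`):

```typescript
// Per-camera boolean state map, as used for audioStates/statsStates above.
type BoolMap = Record<string, boolean>;

// Flip one camera's flag without mutating the previous map —
// the same updater shape passed to a functional setState call.
function toggle(prev: BoolMap, name: string): BoolMap {
  // !undefined === true, so a camera with no entry yet toggles on
  return { ...prev, [name]: !prev[name] };
}

// Set every listed camera to the same value (muteAll/unmuteAll).
function setAll(names: string[], value: boolean): BoolMap {
  const next: BoolMap = {};
  names.forEach((n) => (next[n] = value));
  return next;
}

console.log(toggle({}, "front")); // { front: true }
console.log(setAll(["front", "back"], false)); // { front: false, back: false }
```

Returning a new object each time is what lets React detect the change; mutating `prev` in place would skip the re-render.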
diff --git a/web/src/views/settings/UiSettingsView.tsx b/web/src/views/settings/UiSettingsView.tsx
index 6386a3cad..e3b5c8c7a 100644
--- a/web/src/views/settings/UiSettingsView.tsx
+++ b/web/src/views/settings/UiSettingsView.tsx
@@ -46,6 +46,25 @@ export default function UiSettingsView() {
});
}, [config]);
+ const clearStreamingSettings = useCallback(async () => {
+ if (!config) {
+ return [];
+ }
+
+ await delData(`streaming-settings`)
+ .then(() => {
+ toast.success(`Cleared streaming settings for all camera groups.`, {
+ position: "top-center",
+ });
+ })
+ .catch((error) => {
+ toast.error(
`Failed to clear camera group streaming settings: ${error.response.data.message}`,
+ { position: "top-center" },
+ );
+ });
+ }, [config]);
+
useEffect(() => {
document.title = "General Settings - Frigate";
}, []);
@@ -84,11 +103,15 @@ export default function UiSettingsView() {
Automatic Live View
-
+
Automatically switch to a camera's live view when activity is
detected. Disabling this option causes static camera images on
- the Live dashboard to only update once per minute.
+ your dashboards to only update once per minute.{" "}
+
+ This is a global setting but can be overridden on each
+ camera in camera groups only.
+
@@ -103,7 +126,7 @@ export default function UiSettingsView() {
Play Alert Videos
-
+
By default, recent alerts on the Live dashboard play as small
looping videos. Disable this option to only show a static
@@ -114,10 +137,10 @@ export default function UiSettingsView() {
-
+
Stored Layouts
-
+
The layout of cameras in a camera group can be
dragged/resized. The positions are stored in your browser's
@@ -133,6 +156,24 @@ export default function UiSettingsView() {
+
+
+
Camera Group Streaming Settings
+
+
+ Streaming settings for each camera group are stored in your
+ browser's local storage.
+
+
+
+
+ Clear All Streaming Settings
+
+
+