{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": "# User Manual\n\n`ethograph` visualizes behavioural recordings across modalities:\n\n| Data | Shown as | File formats / loaders |\n|------|----------|------------------------|\n| **Video** | Napari viewer | `.mp4` (recommended), `.avi`, `.mov` (via [pyav](https://pyav.org/docs/stable/)) |\n| **Pose / bounding box** | Overlay on video | `.h5`, `.csv` (DLC, SLEAP, LightningPose, ... via [movement](https://movement.neuroinformatics.dev)) |\n| **Audio** | Waveform + spectrogram | `.wav`, `.mp3` (via [audioio](https://github.com/bendalab/audioio)) |\n| **Electrophysiology** | Multi-channel trace | `.dat`, `.bin`, `.raw` via [phylib](https://github.com/cortex-lab/phylib) and [supported extensions](https://akseli-ilmanen.github.io/ethograph/getting_started/loading_ephys.html#supported-formats) via [Neo](https://neo.readthedocs.io) |\n| **Spike-sorted units** | Raster / PSTH | [Kilosort](https://github.com/MouseLand/Kilosort) folder, `.nwb` file, {class}`pynapple.TsGroup` |\n| **Features** | Lineplot / heatmap | Kinematics, firing rates, latent variables, model outputs, ... |\n\n```{note}\n`.avi` and `.mov` files have inaccurate frame seeking (off by 1–2 frames). For best results, transcode to `.mp4` with H.264. 
See {doc}`troubleshooting`.\n```\n\n---\n\n\nFeature data can be loaded via three backends:\n\n| Backend | Object | File format |\n|---------|--------|-------------|\n| **xarray** | {class}`xarray.Dataset` / {class}`~ethograph.io.trialtree.TrialTree` | `.nc` |\n| **Pynapple** | {class}`~pynapple.Tsd` / {class}`~pynapple.TsdFrame` | `.npz` or folder |\n| **NWB** | `.nwb` (loaded via {mod}`pynapple`) | `.nwb` |\n\nUnlike a plain numpy array, all three formats carry explicit timestamps with each value.\nThis allows `ethograph` to automatically align data of different sampling rates and\nmodalities (video, audio & electrophysiology).\n" }, { "cell_type": "markdown", "source": "## Labelling\n\nEthograph is a labelling GUI for marking the onset and offset of behavioural\nevents. Press a number key to select a label class, click on the timeseries plot to\nmark the onset, click again to mark the offset, then press **V** to play back\nthe segment you just labelled.\n\n```{image} /_static/videos/label_basic.gif\n:alt: Labelling workflow — select class, click onset, click offset, press V to review\n:width: 100%\n```", "metadata": {} }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Feature dropdowns\n", "\n", "To showcase some `ethograph` functionalities, we will use pose data from a carrion crow performing a tool-use task (Moll et al., 2025¹). The {class}`xarray.Dataset` is from one behavioural trial, and shows position, velocity, speed, and acceleration for 3 keypoints tracked in 3D.\n", "\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "execution": { "iopub.execute_input": "2026-04-13T22:36:26.885749Z", "iopub.status.busy": "2026-04-13T22:36:26.885749Z", "iopub.status.idle": "2026-04-13T22:36:29.012346Z", "shell.execute_reply": "2026-04-13T22:36:29.012346Z" } }, "outputs": [ { "data": { "text/html": [ "
<xarray.Dataset> Size: 206kB\n",
"Dimensions: (time: 1169, space: 3, keypoints: 3, individuals: 1)\n",
"Coordinates:\n",
" * time (time) float64 9kB 0.0 0.005 0.01 0.015 ... 5.83 5.835 5.84\n",
" * space (space) <U1 12B 'x' 'y' 'z'\n",
" * keypoints (keypoints) <U8 96B 'beakTip' 'stickTip' 'pellet'\n",
" * individuals (individuals) <U5 20B 'Crow1'\n",
"Data variables:\n",
" position (time, space, keypoints, individuals) float64 84kB ...\n",
" velocity (time, space, keypoints, individuals) float64 84kB ...\n",
" speed (time, keypoints, individuals) float64 28kB ...\n",
"Attributes:\n",
" source_software: DeepLabCut\n",
" ds_type: poses\n",
" fps: 200.0\n",
" time_unit: seconds\n",
" source_file: C:/Users/aksel/Documents/Code/EthoGraph/data/Moll2025/2...\n",
" trial: 115\n",
" bird: Crow1\n",
" session_date: 2024-12-17\n",
" pellet_position: left\n",
" human_verified: 1
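
The manual notes above that, unlike a plain numpy array, all three feature backends carry explicit timestamps with each value. A minimal sketch of what that enables, using plain `xarray` (not the `ethograph` API) with synthetic data shaped like the crow dataset above — the 1 unit/s motion is made up for illustration:

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for the crow pose data: 200 fps, 3-D space, 3 keypoints.
fps = 200.0
time = np.arange(0, 1.0, 1.0 / fps)  # explicit timestamps in seconds

# Hypothetical motion: every keypoint moves along x at 1 unit/s.
position = np.zeros((time.size, 3, 3, 1))
position[:, 0, :, :] = time[:, None, None]

ds = xr.Dataset(
    {"position": (("time", "space", "keypoints", "individuals"), position)},
    coords={
        "time": time,
        "space": ["x", "y", "z"],
        "keypoints": ["beakTip", "stickTip", "pellet"],
        "individuals": ["Crow1"],
    },
    attrs={"fps": fps, "time_unit": "seconds"},
)

# Derive velocity and speed, analogous to the variables in the dataset above.
ds["velocity"] = ds["position"].differentiate("time")
ds["speed"] = (ds["velocity"] ** 2).sum("space") ** 0.5

# Because every value carries a timestamp, selection is done in seconds
# rather than frame indices — this is what lets data of different sampling
# rates (video, audio, ephys) be aligned on a common time axis.
clip = ds.sel(time=slice(0.1, 0.2))
```

Pynapple objects ({class}`~pynapple.Tsd`, {class}`~pynapple.TsdFrame`) pair values with timestamps in the same spirit, so time-based slicing works the same way regardless of which backend the feature data was loaded from.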