This repository includes routines for training DeepLabCut to track mouse digits and for analysing the resulting trajectories, as well as manual ethograms. Python code is in the `python` folder and Matlab code in the `matlab` folder.
The master branch may contain new, experimental features added since publication. Branch `plosone-revision1` contains the code as of the first revised submission to PLoS ONE (which is also the most recent Biorxiv version). Branches for other versions (including those submitted elsewhere) may be made available in future if anyone is interested.
First, install DeepLabCut following the instructions at https://github.com/AlexEMG/DeepLabCut/blob/master/docs/installation.md. We recommend using the appropriate supplied Anaconda environment for your system.
Next, install Matlab and the required Matlab dependencies.
Finally, download the contents of this repository and add the Matlab folder to your Matlab path. In other words, if you cloned the repository into $repository_dir, add $repository_dir/matlab to your Matlab path.
If you wish to retrain DeepLabCut for whatever reason, open the Jupyter notebook python/dlc_seed_handling.ipynb (ensuring you have the DeepLabCut environment activated if you're using Anaconda) and follow the instructions therein.
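For orientation, the notebook follows the standard DeepLabCut 2.x workflow. As a rough sketch (the config and video paths below are placeholders for your own project files, not files in this repository):

```python
# Rough sketch of the DeepLabCut 2.x retraining workflow the notebook follows.
# The config path and video list are placeholders, not paths in this repository.
def retrain_dlc(config_path, videos):
    """Label frames, retrain the network, and analyse videos with DeepLabCut."""
    import deeplabcut  # imported here so the sketch can be read without DLC installed

    deeplabcut.extract_frames(config_path)           # choose frames to label
    deeplabcut.label_frames(config_path)             # launch the labelling GUI
    deeplabcut.create_training_dataset(config_path)  # build the train/test split
    deeplabcut.train_network(config_path)            # (re)train the network
    deeplabcut.analyze_videos(config_path, videos)   # write out trajectories

# Example (placeholder paths):
# retrain_dlc("path/to/config.yaml", ["path/to/video.avi"])
```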
To check the DeepLabCut-generated trajectories and exclude stretches of poor tracking, run matlab/checkDLCTrajectories.m. This step is unnecessary if you're using the supplied data, as it has already been done, but it is an option if you disagree with our excluded segments or have generated your own trajectories. Running the script spawns a GUI that displays, for each file in turn, the L and D traces (see the paper for definitions) for front-view videos, or just the L trace for ventral- and side-view videos. Click and drag to draw a line through any segments you want to exclude; you don't need to actually cross the trace, as only the X-extent of the line is considered. Lines are allowed to overlap. There is no undo: if you make a mistake, hit Ctrl+C in the Matlab command window, close the GUI, edit the for statement in the script to start from the video you were on, and run it again. Once you're done with a video, hit ESC to move on to the next one. Note that front-view videos are shown twice, so the GUI will appear not to change: the first time you're excluding segments of poor D, the second time segments of poor L (in addition to those already excluded for poor D).
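Since only the X-extent of each drawn line matters and lines may overlap, the exclusions reduce to a union of X-intervals. A minimal Python sketch of that bookkeeping (hypothetical helper names, not the actual Matlab code):

```python
def merge_intervals(intervals):
    """Merge possibly-overlapping (start, end) x-ranges into disjoint ranges."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous range: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def exclude(trace, intervals):
    """Blank out samples whose x-index falls inside any excluded range."""
    merged = merge_intervals(intervals)
    return [None if any(s <= x <= e for s, e in merged) else v
            for x, v in enumerate(trace)]

# Two overlapping drags collapse into a single excluded stretch:
print(merge_intervals([(2, 5), (4, 8), (10, 11)]))  # [(2, 8), (10, 11)]
```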
Once you're happy with the trajectories, run matlab/seedHandlingPaper.m to perform all the analysis and generate the figures and tables for the paper. Note that the figures won't match the published versions exactly, as some editing of appearance and layout was done post hoc in Illustrator. Grab a coffee; the clustering takes a while.
Finally, open python/seed_handling_example_videos.ipynb to generate the example videos shown in the paper. That notebook isn't great about closing its file handles, so you may need to restart the kernel, comment out the cells for videos that have already been generated, and rerun the remaining cells.