Additional information that may be collected to support modelling density
Hint
This page has not yet been populated with information. This information will be available soon! In the meantime, check out the information in other info topics (Concept library TOC).
Assumptions

- Demographic closure (i.e., no births or deaths) (Wearn & Glover-Kapfer, 2017)
- Detection probability of different individuals is equal (Wearn & Glover-Kapfer, 2017)
  - or, for SECR, individuals have equal Detection probability at a given distance from the centre of their home range (Wearn & Glover-Kapfer, 2017)
- Detections of different individuals are independent (Wearn & Glover-Kapfer, 2017)
- Behaviour is unaffected by cameras and marking (Wearn & Glover-Kapfer, 2017)
- Individuals do not lose marks (Wearn & Glover-Kapfer, 2017)
- Individuals are not misidentified (Wearn & Glover-Kapfer, 2017)
- Surveys are independent (Wearn & Glover-Kapfer, 2017)
- For conventional models, geographic closure (i.e., no immigration or emigration) (Wearn & Glover-Kapfer, 2017)
- Spatially explicit models have further assumptions about animal movement (Wearn & Glover-Kapfer, 2017; Rowcliffe et al., 2008; Royle et al., 2009; O’Brien et al., 2011); these include:
  - Home ranges are stable (Wearn & Glover-Kapfer, 2017)
  - Movement is unaffected by the cameras (Wearn & Glover-Kapfer, 2017)
  - Camera locations are placed randomly with respect to the distribution and orientation of home ranges (Wearn & Glover-Kapfer, 2017)
  - Home-range centres follow a defined distribution (e.g., Poisson, or another such as the negative binomial) (Wearn & Glover-Kapfer, 2017)
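The SECR assumption listed above — equal detection probability at a given distance from the home-range centre — is usually formalised as a distance-decay detection function. A minimal sketch of the commonly used half-normal form; the parameter values (g0 = 0.2, sigma = 300 m) are illustrative assumptions, not values from the sources cited:

```python
import math

def halfnormal_detection(d, g0=0.2, sigma=300.0):
    """Half-normal detection function used in many SECR models:
    probability that a camera detects an individual whose home-range
    centre lies d metres away. g0 is detection probability at the
    centre; sigma controls how quickly detection decays with distance."""
    return g0 * math.exp(-(d ** 2) / (2 * sigma ** 2))

# Detection probability declines smoothly with distance from the centre.
for d in (0, 150, 300, 600):
    print(f"{d} m: {halfnormal_detection(d):.4f}")
```

In SECR software, g0 and sigma are estimated from the spatial pattern of recaptures across cameras, which is why each individual must be detected at several locations.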
Pros

- Produces direct estimates of density or population size for explicit spatial regions (Chandler & Royle, 2013)
- Allows researchers to mark a subset of the population, or to take advantage of natural markings (Wearn & Glover-Kapfer, 2017)
- Estimates are fully comparable across space, time, species and studies (Wearn & Glover-Kapfer, 2017)
- Density estimates are obtained in a single model that fully incorporates the spatial information of camera locations and individual detections (Wearn & Glover-Kapfer, 2017)
- Both likelihood-based and Bayesian versions of the model have been implemented in relatively easy-to-use software (DENSITY and SPACECAP, respectively), as well as in associated R packages (Wearn & Glover-Kapfer, 2017)
- Flexibility in study design (e.g., ‘holes’ in the trapping grid) (Wearn & Glover-Kapfer, 2017)
- Open SECR models (Efford, 2004; Borchers & Efford, 2008; Royle & Young, 2008; Royle et al., 2009) exist that allow estimation of recruitment and survival rates (Wearn & Glover-Kapfer, 2017)
- Avoids ad-hoc definitions of the study area and the associated edge effects (Doran-Myers, 2018)
- SECR (Efford, 2004; Borchers & Efford, 2008; Royle & Young, 2008; Royle et al., 2009) accounts for variation in individual Detection probability, can model spatial variation in density, and is more sensitive for detecting moderate-to-major population changes (±20–80%) (Royle & Young, 2008; Royle et al., 2009)
Cons

- Requires that individuals are identifiable (Wearn & Glover-Kapfer, 2017)
- Requires that a minimum number of individuals are trapped (ideally each recaptured multiple times) (Wearn & Glover-Kapfer, 2017)
- Requires that each individual is captured at a number of camera locations (Wearn & Glover-Kapfer, 2017)
- Multiple cameras per station may be required to identify individuals; difficult to implement at large spatial scales as it requires a high density of cameras (Morin et al., 2022)
- May not be precise enough for long-term monitoring (Green et al., 2020)
- Cameras must be close enough together that animals are detected at multiple camera locations (Wearn & Glover-Kapfer, 2017); this may be challenging to implement at large scales because many cameras are needed (Chandler & Royle, 2013)
- Buffering the trapping grid by ½ MMDM (Mean Maximum Distance Moved) will usually underestimate home-range size and thus overestimate density (Parmenter et al., 2003; Noss et al., 2012; Wearn & Glover-Kapfer, 2017)
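The ½ MMDM buffer problem above is simple arithmetic: a smaller buffer gives a smaller effective trapping area, and dividing the same abundance estimate by a smaller area inflates density. A sketch with hypothetical survey numbers (12 individuals, a 4 × 4 km grid, MMDM = 3 km — illustrative assumptions only):

```python
def buffered_density(n_hat, grid_width_m, grid_height_m, buffer_m):
    """Density (individuals per km^2) from an abundance estimate n_hat
    and a rectangular trapping grid buffered on all sides by buffer_m."""
    area_m2 = (grid_width_m + 2 * buffer_m) * (grid_height_m + 2 * buffer_m)
    return n_hat / (area_m2 / 1e6)  # convert m^2 to km^2

# Hypothetical survey: 12 individuals, 4 km x 4 km grid, MMDM = 3 km.
d_half = buffered_density(12, 4000, 4000, 1500)  # 1/2 MMDM buffer
d_full = buffered_density(12, 4000, 4000, 3000)  # full MMDM buffer
print(f"1/2 MMDM buffer: {d_half:.3f} per km^2")
print(f"full MMDM buffer: {d_full:.3f} per km^2")
# The narrower buffer always yields the higher density estimate.
```

SECR avoids this choice entirely by estimating the effective sampling area from the movement model itself, which is one reason it is preferred over buffered conventional estimates.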