There are 24 new articles in PLoS ONE today. As always, you should rate the articles, post notes and comments and send trackbacks when you blog about the papers. You can now also easily place articles on various social services (CiteULike, Connotea, StumbleUpon, Facebook and Digg) with just one click. Here are my own picks for the week – you go and look for your own favourites:
An ethogram is a catalogue of the discrete behaviors typically employed by a species. Traditionally, animal behavior has been recorded by observing study individuals directly. However, this approach is difficult, and often impossible, for behaviors that occur in remote areas and/or at great depth or altitude. The recent development of increasingly sophisticated animal-borne data loggers has started to overcome this problem. Accelerometers are particularly useful in this respect because they can record the dynamic motion of a body during, e.g., flight, walking, or swimming. However, classifying behavior from body-acceleration characteristics typically requires prior knowledge of the behavior of free-ranging animals. Here, we demonstrate an automated procedure for categorizing behavior from body acceleration, together with the release of a user-friendly computer application, “Ethographer”. We evaluated its performance using longitudinal acceleration data collected from a foot-propelled diving seabird, the European shag, Phalacrocorax aristotelis. The time-series data were converted into a spectrum by continuous wavelet transformation. Each second of the spectrum was then categorized into one of 20 behavior groups by unsupervised cluster analysis, using the k-means method. The typical behaviors extracted were characterized by the periodicities of body acceleration. Each categorized behavior was assumed to correspond to the bird being on land, in flight, on the sea surface, diving, and so on. The behaviors classified by the procedure accorded well with those independently defined from depth profiles. Because our approach relies on unsupervised computation of the data, it has the potential to detect previously unknown types of behavior and unknown sequences of behaviors.
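Conceptually the pipeline is simple: cut the acceleration trace into seconds, describe each second by its periodicities, and let an unsupervised clusterer sort the seconds into behavior groups. Here is a rough Python sketch of that idea – not Ethographer's actual code: the synthetic signal, the 32 Hz sampling rate, and the per-second FFT magnitudes (standing in for the paper's continuous wavelet transform) are all my own stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 32  # Hz; hypothetical logger sampling rate

# Synthetic body-acceleration trace: 30 s of fast flapping ("flight")
# followed by 30 s of slow swell ("on the sea surface"), plus noise.
t = np.arange(30 * fs) / fs
flight = np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)
rest = 0.3 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)
acc = np.concatenate([flight, rest])

# Step 1: per-second periodicity features (FFT magnitudes here, where
# the paper uses a continuous wavelet spectrum).
frames = acc.reshape(-1, fs)                   # one row per second
spectra = np.abs(np.fft.rfft(frames, axis=1))

# Step 2: unsupervised k-means on the per-second spectra.
def kmeans(X, k, iters=50):
    # Farthest-point initialization, then standard Lloyd iterations.
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(spectra, k=2)
print(labels)  # the first 30 seconds share one label, the last 30 the other
```

With real logger data you would use a proper wavelet spectrum and k = 20 as in the paper; the point is only that clearly periodic regimes fall into separate clusters without any labeled training data.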
Coat-color proportions and patterns in mice are used as assays for many processes such as transgene expression, chimerism, and epigenetics. In many studies, coat-color readouts are estimated from subjective scoring of individual mice. Here we show a method by which mouse coat color is quantified as the proportion of coat shown in one or more digital images. We use the yellow-agouti mouse model of epigenetic variegation to demonstrate this method. We apply this method to live mice using a conventional digital camera for data collection. We use a raster graphics editing program to convert agouti regions of the coat to a standard, uniform, brown color and the yellow regions of the coat to a standard, uniform, yellow color. We use a second program to quantify the proportions of these standard colors. This method provides quantification that relates directly to the visual appearance of the live animal. It also provides an objective analysis with a traceable record, and it should allow for precise comparisons of mouse coats and mouse cohorts within and between studies.
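The quantification step at the end is easy to reproduce: once the agouti and yellow regions have been recoded to standard, uniform colors in an image editor, the readout is just the fraction of pixels matching each standard color. A rough sketch, with made-up standard RGB values and a toy array in place of a real coat photograph:

```python
import numpy as np

# Hypothetical standard colors the coat regions were recoded to.
BROWN = (139, 69, 19)    # stands in for agouti
YELLOW = (255, 215, 0)   # stands in for yellow

def color_proportions(img, colors):
    """Fraction of pixels exactly matching each standard color."""
    flat = img.reshape(-1, 3)
    total = len(flat)
    return {name: float((flat == np.array(rgb)).all(axis=1).sum()) / total
            for name, rgb in colors.items()}

# Toy 10x10 "coat image": 70 brown pixels, 30 yellow pixels.
img = np.zeros((10, 10, 3), dtype=np.uint8)
img[:7] = BROWN
img[7:] = YELLOW

props = color_proportions(img, {"agouti": BROWN, "yellow": YELLOW})
print(props)  # {'agouti': 0.7, 'yellow': 0.3}
```

Because the regions were flattened to exact standard colors first, an exact-match count is enough; on a raw photograph you would instead need color thresholds, which is precisely the subjectivity the recoding step removes.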