Multi-scale Three Dimensional Behavioral Quantification in Individuals and Social Groups<\/em><\/strong><\/p>\nCurrent methods of measuring the movement of freely behaving animals have limitations: Highly detailed observation of small movements (of a single digit, for example) requires restricting the animal\u2019s range of motion. Studying freely moving behavior in 3D space often means sacrificing resolution, perhaps tracking only overall position or relying on an observer\u2019s description. Automatic video tracking typically requires an unnaturally simple environment, and body parts hidden from the cameras aren\u2019t tracked accurately. Volumetric spatial representation, a technique recently developed to overcome these issues by making high-resolution Artificial Intelligence (AI) predictions over large three-dimensional spaces, requires massive computing power. Adding multiple animals for social observations introduces further complications.<\/p>\n
As a result, there is poor availability of the most desired data: High-resolution, automatic tracking of animals performing natural behaviors in 3D space, alone or in groups, with that motion quantified in a standardized format. Dr. Dunn is working on a new approach that aims to bring that ideal closer. Building on lessons from a 3D geometric machine-learning algorithm his team used to greatly improve the accuracy of predictions, Dr. Dunn and his team are now developing adaptive recurrent image sampling (ARIS), which combines images from multiple cameras to build a model that can measure and predict body position on many scales, even when a part (such as an arm or foot) isn\u2019t directly visible.<\/p>\n
ARIS selectively improves the resolution of fine-scale body features, and uses predictive modeling based on what it knows of its subject (the arrangement and length of limbs, how they are connected, how they move, etc.) \u2013 learned first by parsing enormous amounts of training data from freely behaving rats, then fine-tuned using training data from other species \u2013 to focus on the portion of space where a body part is likely to be. This uses far less computational power than previous 3D volumetric tools. In his research, Dr. Dunn will implement ARIS and record data at multiple scales, from overall position and posture down to the movement of fine features of the hands, feet, and face. Further research will explore its effectiveness with multiple animals interacting. The ability to measure behavior in this new, more precise way has broad implications for the study of neurological disorders that affect movement, for linking brain activity to behavior, and for studying social interactions.<\/p>\n
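The computational advantage described above can be illustrated with a toy sketch. Nothing below is Dr. Dunn's actual implementation; `refine_keypoint`, the grid sizes, and the limb-length prior are all hypothetical stand-ins. The point is that recursively re-sampling a small, shrinking neighborhood around the current estimate \u2013 constrained by a simple skeletal prior \u2013 scores far fewer candidate positions than densely scanning a full 3D volume.

```python
import numpy as np

# Hypothetical coarse-to-fine keypoint search (illustrative names only).
# Instead of scoring a dense volumetric grid, keep a small candidate grid
# around the current estimate and shrink it each pass, while a skeletal
# prior (a roughly fixed limb length) rules out implausible candidates.

def refine_keypoint(score_fn, parent_xyz, limb_length, init_xyz,
                    half_width=0.5, steps=4, grid=5):
    """Recursively re-sample a shrinking 3D grid around the best estimate."""
    est = np.asarray(init_xyz, dtype=float)
    for _ in range(steps):
        # A local grid of grid**3 candidates, not a full-volume scan.
        axes = [np.linspace(c - half_width, c + half_width, grid) for c in est]
        gx, gy, gz = np.meshgrid(*axes, indexing="ij")
        pts = np.stack([gx, gy, gz], axis=-1).reshape(-1, 3)
        # Skeletal prior: keep candidates near the expected limb length.
        dist = np.linalg.norm(pts - parent_xyz, axis=1)
        pts = pts[np.abs(dist - limb_length) < 0.5 * limb_length]
        if len(pts) == 0:
            break
        est = pts[np.argmax([score_fn(p) for p in pts])]
        half_width /= 2.0  # finer spatial scale on the next pass
    return est

# Toy "image evidence": a score that peaks at the true paw position.
true_paw = np.array([1.0, 0.8, 0.2])
score = lambda p: -np.sum((p - true_paw) ** 2)
shoulder = np.array([1.0, 1.0, 1.0])
est = refine_keypoint(score, shoulder, limb_length=0.9, init_xyz=shoulder)
```

With four passes of a 5x5x5 local grid, the search evaluates at most 500 candidates yet localizes the part to a fraction of the initial grid spacing, which is the kind of saving that makes multi-scale 3D prediction tractable.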
\nJeffrey Kieft, Ph.D., <\/strong>Professor, Department of Biochemistry and Molecular Genetics, University of Colorado School of Medicine<\/strong><\/p>\nA New Technology to Control the Transcriptome<\/em><\/strong><\/p>\nMessenger RNA, or mRNA, is recognized as a vital player in the life and health of cells. These RNA molecules are the templates for making protein: they are created within cells to carry instructions to the protein-making machinery, then are destroyed by enzymes. The totality of RNA an organism expresses \u2013 messenger and otherwise \u2013 is called its \u201ctranscriptome.\u201d<\/p>\n
Deficiencies in mRNA and non-coding RNA (ncRNA) are linked to certain neurodegenerative and neurodevelopmental disorders. If there is too little of a specific mRNA or ncRNA in the transcriptome, certain cellular functions may be degraded or disabled. Dr. Kieft is exploring a novel way to manage the transcriptome by slowing the decay of mRNA and ncRNA. Some of the enzymes that destroy these RNAs essentially \u201cchew\u201d them from one end to the other. Drawing on his understanding of how RNA molecules are structured and fold on themselves, Dr. Kieft created an engineered piece of exoribonuclease-resistant RNA (xrRNA) that, when introduced to a compatible mRNA or ncRNA, combines with it and folds to form a \u201cblocking\u201d structure \u2013 literally changing the shape of the RNA by inserting a protrusion that stops the enzymes in their tracks.<\/p>\n
By slowing the decay of the target mRNA and ncRNA, Dr. Kieft sees the opportunity to manage their abundance within the transcriptome. Engineered xrRNAs could recognize only specific targets, link up with them, and create the protection, letting researchers increase the proportion of a target without changing how much of it is created. The approach has the advantage of being less disruptive to the host cell than artificially boosting mRNA production, and the precision with which xrRNA can be engineered offers the potential to target multiple RNAs at once \u2013 and possibly even to fine-tune abundance by precisely managing the rate of decay. Dr. Kieft sees this application, born of basic science studying RNA, as a potentially powerful research tool for neuroscientists, and perhaps even the foundation for therapies in the more distant future.<\/p>\n
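The abundance argument can be made concrete with first-order kinetics. In this toy model (the rate constants are illustrative, not measurements), an RNA's steady-state level is the ratio of its synthesis rate to its decay rate, so halving decay doubles abundance without touching synthesis \u2013 exactly the lever an xrRNA-style block would pull.

```python
# Toy first-order model of RNA abundance: dN/dt = k_syn - k_dec * N.
# At steady state the level settles at N* = k_syn / k_dec, so slowing
# decay raises abundance even though synthesis is unchanged.
k_syn = 10.0          # molecules made per hour (illustrative)
k_dec_normal = 0.50   # fraction degraded per hour, unprotected
k_dec_blocked = 0.25  # slower decay with a protective structure

def steady_state(k_syn, k_dec, dt=0.01, hours=40.0):
    """Integrate the model with simple Euler steps until it settles."""
    n = 0.0
    for _ in range(int(hours / dt)):
        n += (k_syn - k_dec * n) * dt
    return n

normal = steady_state(k_syn, k_dec_normal)    # settles near 20 molecules
blocked = steady_state(k_syn, k_dec_blocked)  # settles near 40 molecules
```

Halving `k_dec` doubles the steady-state level while `k_syn` is untouched, matching the idea of raising a target's proportion without changing how much of it is created.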
\nSuhasa Kodandaramaiah, Ph.D., Benjamin Mayhugh Assistant Professor, Department of Mechanical Engineering, University of Minnesota Twin Cities <\/strong><\/p>\nRobot Assisted Brain-Wide Recordings in Freely Behaving Mice <\/em><\/strong><\/p>\nNeuroscientists studying brain activity during behavior usually face a trade-off: They can use miniaturized head-mounted neural sensors light enough to let a subject animal behave freely, but these offer lower resolution or can\u2019t monitor the whole brain. Or they can use more powerful tools that are far too heavy for the animal to carry, which requires workarounds such as head fixation while the animal walks on a treadmill, or virtual reality environments that still limit the subject\u2019s behavior.<\/p>\n
Dr. Kodandaramaiah is tackling the challenge with a robotic cranial exoskeleton that carries the weight of the neural recording and monitoring hardware while still allowing the subject (in this case a mouse) to rotate its head in all three rotational degrees of freedom: a full 360-degree turn in the yaw (horizontal rotation) axis, and about 50 degrees of motion in the pitch and roll axes, while moving around an arena. The robot has three jointed arms arranged in a triangular configuration, suspended over the subject and meeting at the mounting point on the head. Sensors in the mount detect the motion the mouse is attempting and direct the robot to accommodate it with as little resistive force as possible. The mouse can thus turn and move within an arena typically used for neuroscience experiments, with all the necessary sensory equipment and implant wiring supported by the robot.<\/p>\n
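The follow-the-mouse behavior described above is, in control terms, an admittance-style loop: measure the force the mouse exerts at the mount and command the robot to move along it, so the resistance the animal feels stays small. The sketch below is a one-axis toy; the spring-coupling model and both gains are assumptions for illustration, not the actual controller.

```python
# One-axis toy of a force-minimizing ("admittance") loop, assuming the
# head mount acts like a stiff spring between the mouse and the robot.
k = 50.0   # coupling stiffness at the mount (illustrative)
b = 2.0    # admittance damping: lower b means the robot yields more readily
dt = 0.001
theta_mouse, theta_robot = 0.0, 0.0
omega_mouse = 1.0  # the mouse turns its head at a steady 1 rad/s

for _ in range(5000):  # simulate 5 seconds of head turning
    theta_mouse += omega_mouse * dt
    tau = k * (theta_mouse - theta_robot)  # torque the mouse feels
    theta_robot += (tau / b) * dt          # move along the measured torque

residual_torque = k * (theta_mouse - theta_robot)
```

The robot ends up tracking the head with a small residual torque of roughly `b * omega_mouse`, so tuning the damping down directly lowers the resistance the mouse must work against.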
Removing the need for miniaturization lets researchers use whatever state-of-the-art hardware is available, meaning a robot can theoretically be upgraded to the latest technology soon after its introduction. To get there, Dr. Kodandaramaiah\u2019s team will proceed through several steps \u2013 engineering the exoskeleton; engineering the head-stage with its needed sensors plus high-density electrodes and cameras for external observation of eyes, whiskers, and more; performing benchtop testing; tuning the robot to the inputs a mouse can deliver; determining how to introduce probes; and finally making live recordings. With this mechanical underpinning, Dr. Kodandaramaiah hopes to help researchers get closer to making detailed brain-wide neural recordings of freely behaving subjects over long timescales.<\/p>\n\n\t\t<\/div>\n\t<\/div>\n<\/div><\/div><\/div><\/div>\n<\/div>","protected":false},"excerpt":{"rendered":"July 30, 2021 The McKnight Endowment Fund for Neuroscience (MEFN) announced the three recipients of $600,000 in grant funding through the 2021 McKnight Technological Innovations in Neuroscience Awards, recognizing these projects for their ability to fundamentally change the way neuroscience research is conducted. Each of the projects will receive a total of $200,000 over the …","protected":false},"author":17,"featured_media":42676,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2],"tags":[],"class_list":{"0":"post-47180","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-announcement","8":"post_topic-technology-awards","9":"post_topic-the-mcknight-endowment-fund-for-neuroscience"},"acf":[],"yoast_head":"\n
2021 McKnight Technology Awards - McKnight Foundation<\/title>\n