
2021 McKnight Technology Awards

July 30, 2021

The McKnight Endowment Fund for Neuroscience (MEFN) announced the three recipients of $600,000 in grant funding through the 2021 McKnight Technological Innovations in Neuroscience Awards, recognizing these projects for their ability to fundamentally change the way neuroscience research is conducted. Each of the projects will receive a total of $200,000 over the next two years, advancing the development of these groundbreaking technologies used to map, monitor, and model brain function. The 2021 awardees are:

  • Timothy Dunn, Ph.D., of Duke University, who is working on a system to capture the body motion of subjects using 3D measurements (rather than 2D pixel measurements) by combining multiple video cameras and a new machine-learning algorithm. The method allows high-resolution tracking of small, discrete body parts in freely behaving animals, allows study in naturalistic spaces, and can even track multiple animals interacting socially, a combination of features not available in current systems.
  • Jeffrey Kieft, Ph.D., of the University of Colorado School of Medicine, who is developing a way to engineer RNA to create a form of protection that can reduce the rate at which mRNA is destroyed by enzymes. In this way, researchers would be able to manage the abundance of specific mRNAs in cells, useful for studying and possibly even treating certain disorders.
  • Suhasa Kodandaramaiah, Ph.D., of the University of Minnesota Twin Cities, who is using robotic systems to enable more robust tracking of brain activity in freely moving animals. Using robotics to move the hardware along multiple axes in sync with the animal, this approach allows the use of larger, more powerful, higher resolution monitoring systems than the miniaturized versions often used for experiments in freely moving animals.

Learn more about each of these research projects below.

About the Technological Innovations in Neuroscience Awards

Since the McKnight Technological Innovations in Neuroscience Award was established in 1999, the MEFN has contributed more than $15 million to innovative technologies for neuroscience through this award mechanism. The MEFN is especially interested in work that takes new and novel approaches to advancing the ability to manipulate and analyze brain function. Technologies developed with McKnight support must ultimately be made available to other scientists.

“Again, it has been a thrill to see the ingenuity that our applicants are bringing to new neurotechnologies,” said Markus Meister, Ph.D., chair of the awards committee and the Anne P. and Benjamin F. Biaggini professor of biological sciences at Caltech. “This year, we faced a tough choice among many exciting developments, and our awards span a broad range, from a robotic exoskeleton to support neural recording in rodents, to molecular control of accurate gene expression, to algorithms for 3D tracking of animal behavior.”

This year’s selection committee also included Adrienne Fairhall, Timothy Holy, Loren Looger, Mala Murthy, Alice Ting, and Hongkui Zeng, who chose this year’s Technological Innovations in Neuroscience Awards from a highly competitive pool of 73 applicants.

Letters of intent for the 2022 Technological Innovations in Neuroscience Awards are due Monday, December 6, 2021. An announcement about the 2022 process will go out in August. More information about the awards is available from the MEFN.

2021 McKnight Technological Innovations in Neuroscience Awards

Timothy Dunn, Ph.D., Assistant Professor, Department of Biomedical Engineering, Duke University

Multi-scale Three Dimensional Behavioral Quantification in Individuals and Social Groups

Current methods of measuring movement of freely behaving animals have limitations: Highly detailed observations of small movements of an animal (a single digit, for example) require restricted ranges of motion. Studying freely moving behavior in 3D space often means limiting resolution, perhaps only tracking overall position, or relying on an observer’s description. Automatic video tracking in animals typically requires an unnatural, simple environment, and body parts not visible to cameras aren’t tracked accurately. High-resolution Artificial Intelligence (AI) predictions over large three-dimensional spaces using volumetric spatial representation, a technique recently developed to overcome these issues, require massive computing power. Adding multiple animals for social observations introduces additional issues.

As a result, there is poor availability of the most desired data: High-resolution, automatic tracking of animals in 3D space performing natural behaviors, alone or in groups, and quantification of that motion in a standardized format. Dr. Dunn is working on a new approach that aims to bring that ideal closer. Building on a 3D geometric machine-learning algorithm his team previously used to greatly improve the accuracy of predictions, Dr. Dunn and his team are now developing adaptive recurrent image sampling (ARIS), which combines images from multiple cameras to build a model that can measure and predict body position on many scales, even when a part (such as an arm or foot) isn’t directly visible.

ARIS selectively improves the resolution of fine-scale body features, and uses predictive modeling based on what it knows of its subject (arrangement and length of limbs, how they are connected, how they move, etc.) – learned first by parsing enormous amounts of training data from freely behaving rats and then fine-tuned using training data from other species – to focus on the portion of space where a body part is likely to be. This uses far less computational power than previous 3D volumetric tools. In his research, Dr. Dunn will implement ARIS and record data at multiple scales, from overall position and posture down to the movement of fine features of the hands, feet, and face. Further research will explore its effectiveness with multiple animals interacting. This ability to measure behavior in a new, more precise way has broad implications for the study of neurological disorders that affect movement, linking brain activity to behavior, and studying social interactions.
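The geometric step underpinning any such multi-camera system is triangulation: recovering a keypoint’s 3D position from its 2D detections in several calibrated cameras. The sketch below illustrates only that standard step under assumed inputs; it is not Dr. Dunn’s ARIS implementation, and the projection matrices and pixel detections it expects are hypothetical placeholders.

```python
# Minimal direct linear transform (DLT) triangulation: a standard multi-view
# geometry step, shown for illustration only (not the ARIS algorithm itself).
import numpy as np

def triangulate(proj_mats, pixels):
    """Recover one 3D point from N >= 2 camera views.

    proj_mats: list of 3x4 camera projection matrices (assumed calibrated).
    pixels:    list of (u, v) detections of the same keypoint, one per camera.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the homogeneous
        # point X: u * (P[2] @ X) = P[0] @ X  and  v * (P[2] @ X) = P[1] @ X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Least-squares solution: the right singular vector associated with the
    # smallest singular value.
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)
```

A learned model like the one described above can then refine such estimates, using what it knows about body structure to fill in keypoints that no camera sees directly.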


Jeffrey Kieft, Ph.D., Professor, Department of Biochemistry and Molecular Genetics, University of Colorado School of Medicine

A New Technology to Control the Transcriptome

Messenger RNA, or mRNA, is recognized as a vital player in the life and health of cells. These molecules are created within cells as templates for making protein: they carry instructions to the protein-making machinery, and are then destroyed by enzymes. The totality of mRNA an organism expresses is called its “transcriptome.”

Deficiencies in mRNA and non-coding RNA (ncRNA) are linked to certain neurodegenerative and neurodevelopmental disorders. If there is too little of a specific mRNA or ncRNA in the transcriptome, certain cellular functions may be degraded or disabled. Dr. Kieft is exploring a novel way to manage the transcriptome by slowing the decay of mRNA and ncRNA. Knowing that some of the enzymes that destroy these RNAs essentially “chew” them from one end to the other, Dr. Kieft used his understanding of how RNA molecules are structured and fold on themselves to create an engineered piece of exoribonuclease-resistant RNA (xrRNA). When introduced to a compatible mRNA or ncRNA, the xrRNA combines with it and folds to form a “blocking” structure, literally changing the shape of the RNA by inserting a protrusion that stops the enzymes in their tracks.

By slowing the decay of target mRNAs and ncRNAs, Dr. Kieft sees the opportunity to manage their abundance within the transcriptome. Engineered xrRNAs could recognize only specific targets, link up with them, and create the protection, so researchers can increase the proportion of a target without changing how much of it is created. The approach has the advantage of being less disruptive to the host cell than unnaturally boosting mRNA production, and the precision with which xrRNA can be engineered offers the potential to target multiple RNAs at once, and possibly even to allow fine-tuning by precisely managing the rate of decay. Dr. Kieft sees this application, born of basic science studying RNA, as a potentially powerful research tool for neuroscientists, and perhaps even the foundation for therapies in the more distant future.
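A toy first-order kinetic model makes the abundance argument concrete: at steady state, a transcript’s abundance is its synthesis rate divided by its decay rate, so slowing decay raises abundance without touching synthesis. The sketch below uses made-up rate constants purely for illustration; it is not data or a model from Dr. Kieft’s lab.

```python
# Toy model: d[mRNA]/dt = k_syn - k_decay * [mRNA], which settles at the
# steady state [mRNA] = k_syn / k_decay. All rate constants are hypothetical.

def steady_state_abundance(k_syn, k_decay):
    """Steady-state transcript count under first-order decay."""
    return k_syn / k_decay

k_syn = 10.0         # transcripts synthesized per hour (made-up value)
k_unprotected = 0.5  # per-hour decay by exonucleases (made-up value)
k_protected = 0.1    # slower decay with an xrRNA-style block (made-up value)

print(steady_state_abundance(k_syn, k_unprotected))  # 20.0 transcripts
print(steady_state_abundance(k_syn, k_protected))    # 100.0 transcripts:
# five times more abundant, with the synthesis rate left unchanged
```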


Suhasa Kodandaramaiah, Ph.D., Benjamin Mayhugh Assistant Professor, Department of Mechanical Engineering, University of Minnesota Twin Cities

Robot Assisted Brain-Wide Recordings in Freely Behaving Mice

Neuroscientists studying brain activity during behavior usually have to make a trade-off: They can use miniaturized head-mounted neural sensors that are light enough to let a subject animal behave freely, but that offer lower resolution or cannot monitor the whole brain. Or they can use more powerful tools that are far too heavy for the animal to carry, which requires other solutions, such as immobilizing the head while the animal moves on a treadmill, or presenting virtual reality experiences that nonetheless limit the behavior of the subject.

Dr. Kodandaramaiah is tackling the challenge with a robotic cranial exoskeleton that carries the weight of neural recording and monitoring hardware while still allowing the subject (in this case a mouse) to rotate its head in all three rotational degrees of freedom: a full 360-degree turn in the yaw (horizontal rotation) axis, and about 50 degrees of motion in the pitch and roll axes, while moving around in an arena. The robot has three jointed arms arranged in a triangular configuration, suspended over the subject and meeting at the point of mounting on the head. Sensors in the mount detect what motion the mouse is making and direct the robot to enable the motion with as little resistive force as possible, allowing the mouse to turn and move within an arena typically used for neuroscience experiments, with all the necessary sensory equipment and wires from the implants supported by the robot.
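The “as little resistive force as possible” behavior described above is the hallmark of an admittance-style control loop: sense the force the animal applies, then command motion in that direction. The sketch below is a minimal illustration of that idea under assumed interfaces; the gain value and the torque-in, velocity-out mapping are hypothetical, not specifications of Dr. Kodandaramaiah’s robot.

```python
# Minimal admittance-control step: map the torque the mouse applies at the
# head mount to a velocity command for the robot's arms. The gain is a
# made-up tuning value, not a parameter from the actual system.
import numpy as np

ADMITTANCE_GAIN = 0.8  # rad/s commanded per N*m sensed (hypothetical)

def admittance_step(sensed_torque_yaw_pitch_roll):
    """Command a velocity proportional to the sensed head torque.

    Moving with the mouse, in proportion to how hard it pushes, is what
    makes the hardware feel nearly weightless: a higher gain means less
    resistance but also less damping, so it must be tuned to the animal.
    """
    return ADMITTANCE_GAIN * np.asarray(sensed_torque_yaw_pitch_roll)

# Example: the mouse applies a small yaw torque to turn its head.
print(admittance_step([0.05, 0.0, 0.0]))  # -> [0.04 0.   0.  ] rad/s
```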

Taking out the need for miniaturization allows researchers to use whatever state-of-the-art hardware is available, meaning a robot can theoretically be upgraded to use the latest technology soon after its introduction. To get to that point, Dr. Kodandaramaiah’s team will go through several steps – engineering the exoskeleton; engineering the head-stage with its needed sensors plus high-density electrodes and cameras for external observation of the eyes, whiskers, and more; performing benchtop testing; tuning the robot to the inputs a mouse can deliver; determining how to introduce probes; and finally making live recordings. With this mechanical underpinning, Dr. Kodandaramaiah hopes to help researchers get closer to the point where they can make detailed brain-wide neural recordings of freely behaving subjects over long timescales.

Topic: McKnight Endowment Fund for Neuroscience, Technology Awards

July 2021
