This week, we talk about dinosaur population density, the applications of augmented reality, and more!
Counting T. rexes
Estimating the size of extinct populations is difficult for scientists because fossil evidence is scarce. But recently, paleontologists from the University of California, Berkeley decided to try it. The team set out to estimate the number of Tyrannosaurus rexes that roamed North America during the Cretaceous period.
The researchers used Monte Carlo computer simulation and data relating body mass to population density to arrive at their estimate. In addition, they treated T. rex as a predator and assumed its energy requirements were halfway between those of a lion and a Komodo dragon. They also ignored juvenile T. rexes because the young reptiles may have lived apart from adults and pursued different prey.
After factoring in the uncertainty of ecological variability, the team determined that the standing population size was about 20,000. Given 127,000 generations, there should have been around 2.5 billion T. rexes overall.
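The approach above can be sketched in a few lines of Python. This is a minimal illustration of the Monte Carlo idea, not the study's actual model: the geographic range, the density distribution, and its spread are invented assumptions, tuned only so the median lands near the article's figures. The 127,000-generation count is the one number taken from the article itself.

```python
import random
import statistics

def simulate_total_trex(n_trials=10_000, seed=42):
    """Monte Carlo sketch: propagate ecological uncertainty into a
    total-population estimate. All parameters below are illustrative
    assumptions, not the study's real inputs."""
    rng = random.Random(seed)
    area_km2 = 2.3e6       # assumed geographic range (illustrative)
    generations = 127_000  # generation count cited in the article
    totals = []
    for _ in range(n_trials):
        # Density inferred from body mass is uncertain over orders of
        # magnitude, so draw the exponent from a normal distribution.
        density = 10 ** rng.gauss(-2.06, 0.5)  # individuals per km^2
        standing = density * area_km2          # population at any moment
        totals.append(standing * generations)  # individuals ever alive
    return statistics.median(totals)
```

Running many trials and reporting a median (or a credible interval) is what lets the researchers attach error bars to a number that no single fossil count could ever pin down.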
The framework for this study is now available for use by other researchers. It lays the foundation for estimating how many species paleontologists might have missed when excavating for fossils.
Augmented reality in retail
Augmented reality (AR) is a technology that superimposes virtual information onto physical objects and spaces. AR is used in a variety of fields such as education, tourism, and entertainment. It hit the mainstream in 2016 with the release of the interactive smartphone game Pokémon Go.
Recently, researchers at the City University of Hong Kong and Singapore Management University studied the effect of AR usage in retail. They looked at some of the ways retailers are using the technology to enhance customer experience.
One example of AR use in retail is Lego’s companion app for users of specially designed brick sets. The app features animated characters who interact with the physical Lego sets. Walmart is another retailer examined in the study. The big-box store collaborated with DC Comics and Marvel to create superhero-themed AR displays.
Researchers say that the impact of AR is stronger for premium products and for less popular products. The findings suggest that AR is most effective where customer uncertainty is highest, because the technology helps shoppers evaluate products before they buy.
Mice thinking abstractly
Our brains use categorization to simplify life’s complexity. Grouping information into categories helps us make sense of the world and react quickly to new experiences. Scientists at the Max Planck Institute of Neurobiology have discovered that mouse brains do the same thing.
The researchers studied mice to see if they categorize abstract information in a similar way to humans. In the study, Tobias Bonhoeffer and his team showed mice pictures of striped patterns and gave them a sorting rule. One group had to sort the images into categories based on stripe thickness. Mice in the other group sorted by orientation. Both groups learned the rule and completed the task successfully. They were even able to categorize patterns they had not seen before. When the scientists changed the sorting rules, the mice ditched the old rules and used the new ones.
The study revealed that certain neurons in the prefrontal cortex became active while the mice did their sorting tasks. Also, different groups of neurons reacted selectively to individual categories. Bonhoeffer says that the results show that neurons develop selectivity gradually as the learning process progresses.
The study adds important information about the neuronal basis of abstract learning and reminds us that humans are not the only animals capable of complex thought.
The inner robot
If you’re worried that robots may take over the world, this research may or may not allay your fears.
Italian researchers Antonio Chella and Arianna Pipitone have designed a robot that uses inner speech. Inner speech, the silent self-talk that humans engage in, helps us evaluate situations and, thus, make better decisions. Chella explains that we may find robots more trustworthy if we can hear their thought process.
For the study, the researchers built a cognitive model of inner speech that makes robots able to speak to themselves. They deployed the model in a commercially available Pepper robot.
In one experiment, Pepper was asked to place a napkin in the wrong spot. The robot asked itself if the user was confused about where to put a napkin. It stated its aversion to breaking the rules but also said it did not want to upset the user by not complying with the request.
Pipitone and Chella found that the robot had a higher task-completion rate when it used inner speech. In the study, Pepper outperformed the international standard of functional and moral requirements for collaborative robots. Chella and Pipitone say that machines like navigation apps, phone cameras, and medical robots can benefit from the chatty feature.
Picking up the pace
The inability to walk quickly can be a frustrating problem. That’s why engineers at Stanford University developed a prototype exoskeleton system designed to increase self-selected walking speed. The device attaches around the shin and into a running shoe. It’s powered by motors and controlled by an algorithm.
In the study, ten participants first walked in normal shoes without the exoskeleton, then with the exoskeleton attached but powered off. After that, the subjects used the device in three modes: optimized for speed, optimized for energy, and placebo. The speed-optimized mode increased walking pace by 42 percent and reduced energy use by around 2 percent.
Associate Professor of Mechanical Engineering Steve Collins wonders whether future designs could improve balance or reduce pain caused by weight on joints. The team plans to conduct similar walking tests with older adults and hopes those will yield encouraging results as well.