ֱ̽ of Cambridge - robot /taxonomy/subjects/robot en ‘Palaeo-robots’ to help scientists understand how fish started to walk on land /research/news/palaeo-robots-to-help-scientists-understand-how-fish-started-to-walk-on-land <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/paleo-robots-883x432.jpg?itok=rSGMB0cY" alt="Illustration of palaeo-robots." title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://doi.org/10.1126/scirobotics.adn1125">Writing</a> in the journal <em>Science Robotics</em>, the research team, led by the ֱ̽ of Cambridge, outline how ‘palaeo-inspired robotics’ could provide a valuable experimental approach to studying how the pectoral and pelvic fins of ancient fish evolved to support weight on land.</p> <p>“Since fossil evidence is limited, we have an incomplete picture of how ancient life made the transition to land,” said lead author <a href="https://www.michaelishida.com/">Dr Michael Ishida</a> from Cambridge’s Department of Engineering. “Palaeontologists examine ancient fossils for clues about the structure of hip and pelvic joints, but there are limits to what we can learn from fossils alone. That’s where robots can come in, helping us fill gaps in the research, particularly when studying major shifts in how vertebrates moved.”</p> <p>Ishida is a member of Cambridge’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, led by Professor Fumiya Iida. ֱ̽team is developing energy-efficient robots for a variety of applications, which take their inspiration from the efficient ways that animals and humans move.</p> <p>With funding from the Human Frontier Science Program, the team is developing palaeo-inspired robots, in part by taking their inspiration from modern-day ‘walking fish’ such as mudskippers, and from fossils of extinct fish. “In the lab, we can’t make a living fish walk differently, and we certainly can’t get a fossil to move, so we’re using robots to simulate their anatomy and behaviour,” said Ishida.</p> <p> ֱ̽team is creating robotic analogues of ancient fish skeletons, complete with mechanical joints that mimic muscles and ligaments. Once complete, the team will perform experiments on these robots to determine how these ancient creatures might have moved.</p> <p>“We want to know things like how much energy different walking patterns would have required, or which movements were most efficient,” said Ishida. “This data can help confirm or challenge existing theories about how these early animals evolved.”</p> <p>One of the biggest challenges in this field is the lack of comprehensive fossil records. Many of the ancient species from this period in Earth’s history are known only from partial skeletons, making it difficult to reconstruct their full range of movement.</p> <p>“In some cases, we’re just guessing how certain bones connected or functioned,” said Ishida. “That’s why robots are so useful—they help us confirm these guesses and provide new evidence to support or rebut them.”</p> <p>While robots are commonly used to study movement in living animals, very few research groups are using them to study extinct species. “There are only a few groups doing this kind of work,” said Ishida. 
“But we think it’s a natural fit – robots can provide insights into ancient animals that we simply can’t get from fossils or modern species alone.”</p> <p>The team hopes that their work will encourage other researchers to explore the potential of robotics to study the biomechanics of long-extinct animals. “We’re trying to close the loop between fossil evidence and real-world mechanics,” said Ishida. “Computer models are obviously incredibly important in this area of research, but since robots are interacting with the real world, they can help us test theories about how these creatures moved, and maybe even why they moved the way they did.”</p> <p>The team is currently in the early stages of building their palaeo-robots, but they hope to have some results within the next year. The researchers say they hope their robot models will not only deepen understanding of evolutionary biology, but could also open up new avenues of collaboration between engineers and researchers in other fields.</p> <p>The research was supported by the Human Frontier Science Program. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge. Michael Ishida is a Postdoctoral Research Associate at Gonville and Caius College, Cambridge.</p> <p><em><strong>Reference:</strong><br /> Michael Ishida et al. ‘<a href="https://doi.org/10.1126/scirobotics.adn1125">Paleo-inspired robotics as an experimental approach to the history of life</a>.’ Science Robotics (2024). DOI: 10.1126/scirobotics.adn1125</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The transition from water to land is one of the most significant events in the history of life on Earth. Now, a team of roboticists, palaeontologists and biologists is using robots to study how the ancestors of modern land animals transitioned from swimming to walking, about 390 million years ago.</p> </p></div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. 
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 23 Oct 2024 18:00:00 +0000 sc604 248514 at Robot trained to read braille at twice the speed of humans /research/news/robot-trained-to-read-braille-at-twice-the-speed-of-humans <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/img-4841-dp.jpg?itok=RoYah_Zz" alt="Robot braille reader" title="Robot braille reader, Credit: Parth Potdar" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽research team, from the ֱ̽ of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. ֱ̽robot was able to read the braille at 315 words per minute at close to 90% accuracy.</p> <p>Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. ֱ̽<a href="https://ieeexplore.ieee.org/document/10410896">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>.</p> <p>Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.</p> <p>Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In <a href="https://birlab.org/">Professor Fumiya Iida’s lab</a> in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.</p> <p>“ ֱ̽softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”</p> <p>Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. ֱ̽researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.</p> <p>“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering. 
“Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”</p> <p>The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.</p> <p>The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.</p> <p>Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.</p> <p>“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”</p> <p>“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.</p> <p>In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.</p> <p> </p> <p><em><strong>Reference:</strong><br /> Parth Potdar et al. ‘<a href="https://ieeexplore.ieee.org/document/10410896">High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions</a>.’ IEEE Robotics and Automation Letters (2024). 
DOI: 10.1109/LRA.2024.3356978</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-217601" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/can-robots-read-braille">Can robots read braille?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/xqtA2Z668Ic?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Parth Potdar</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot braille reader</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> ֱ̽text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified. All rights reserved. 
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 29 Jan 2024 06:04:52 +0000 sc604 244161 at Robots cause company profits to fall – at least at first /research/news/robots-cause-company-profits-to-fall-at-least-at-first <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/gettyimages-1408271637-dp.jpg?itok=uZqWd7Is" alt="Robots on a manufacturing line" title="Robots on a manufacturing line, Credit: kynny via Getty Images" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽researchers, from the ֱ̽ of Cambridge, studied industry data from the UK and 24 other European countries between 1995 and 2017, and found that at low levels of adoption, robots have a negative effect on profit margins. But at higher levels of adoption, robots can help increase profits.</p>&#13; &#13; <p>According to the researchers, this U-shaped phenomenon is due to the relationship between reducing costs, developing new processes and innovating new products. While many companies first adopt robotic technologies to decrease costs, this ‘process innovation’ can be easily copied by competitors, so at low levels of robot adoption, companies are focused on their competitors rather than on developing new products. However, as levels of adoption increase and robots are fully integrated into a company’s processes, the technologies can be used to increase revenue by innovating new products.</p>&#13; &#13; <p>In other words, firms using robots are likely to focus initially on streamlining their processes before shifting their emphasis to product innovation, which gives them greater market power via the ability to differentiate from their competitors. ֱ̽<a href="https://ieeexplore.ieee.org/document/10202238">results</a> are reported in the journal <em>IEEE Transactions on Engineering Management</em>.</p>&#13; &#13; <p>Robots have been widely used in industry since the 1980s, especially in sectors where they can carry out physically demanding, repetitive tasks, such as automotive assembly. 
In the decades since, the rate of robot adoption has increased dramatically and consistently worldwide, and the development of precise, electrically controlled robots makes them particularly useful for high-value manufacturing applications requiring greater precision, such as electronics.</p>&#13; &#13; <p>While robots have been shown to reliably raise labour productivity at an industry or country level, what has been less studied is how robots affect profit margins at a similar macro scale.</p>&#13; &#13; <p>“If you look at how the introduction of computers affected productivity, you actually see a slowdown in productivity growth in the 1970s and early 1980s, before productivity starts to rise again, which it did until the financial crisis of 2008,” said co-author Professor Chander Velu from Cambridge’s Institute for Manufacturing. “It’s interesting that a tool meant to increase productivity had the opposite effect, at least at first. We wanted to know whether there is a similar pattern with robotics.”</p>&#13; &#13; <p>“We wanted to know whether companies were using robots to improve processes within the firm, rather than improve the whole business model,” said co-author Dr Philip Chen. “Profit margin can be a useful way to analyse this.”</p>&#13; &#13; <p> ֱ̽researchers examined industry-level data for 25 EU countries (including the UK, which was a member at the time) between 1995 and 2017. While the data did not drill down to the level of individual companies, the researchers were able to look at whole sectors, primarily in manufacturing where robots are commonly used.</p>&#13; &#13; <p> ֱ̽researchers then obtained robotics data from the International Federation of Robotics (IFR) database. By comparing the two sets of data, they were able to analyse the effect of robotics on profit margins at a country level.</p>&#13; &#13; <p>“Intuitively, we thought that more robotic technologies would lead to higher profit margins, but the fact that we see this U-shaped curve instead was surprising,” said Chen.</p>&#13; &#13; <p>“Initially, firms are adopting robots to create a competitive advantage by lowering costs,” said Velu. “But process innovation is cheap to copy, and competitors will also adopt robots if it helps them make their products more cheaply. This then starts to squeeze margins and reduce profit margin.”</p>&#13; &#13; <p> ֱ̽researchers then carried out a series of interviews with an American medical equipment manufacturer to study their experiences with robot adoption.</p>&#13; &#13; <p>“We found that it’s not easy to adopt robotics into a business – it costs a lot of money to streamline and automate processes,” said Chen.</p>&#13; &#13; <p>“When you start bringing more and more robots into your process, eventually you reach a point where your whole process needs to be redesigned from the bottom up,” said Velu. “It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point.”</p>&#13; &#13; <p> ֱ̽researchers say that if companies want to reach the profitable side of the U-shaped curve more quickly, it’s important that the business model is adapted concurrently with robot adoption. 
Only after robots are fully integrated into the business model can companies fully use the power of robotics to develop new products, driving profits.</p>&#13; &#13; <p>A related piece of work being led by the Institute for Manufacturing is a community programme to help small- and medium-sized enterprises (SMEs) to adopt digital technologies including robotics in a low-cost, low-risk way. “Incremental and step changes in this area enable SMEs to get the benefits of cost reduction as well as margin improvements from new products,” said co-author Professor Duncan McFarlane.</p>&#13; &#13; <p>The research was supported by the Engineering and Physical Sciences Research Council (EPSRC) and the Economic and Social Research Council (ESRC), which are both part of UK Research and Innovation (UKRI). Chander Velu is a Fellow of Selwyn College, Cambridge. Duncan McFarlane is a Fellow of St John's College, Cambridge. </p>&#13; &#13; <p> </p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Yifeng P Chen, Chander Velu, Duncan McFarlane. ‘<a href="https://ieeexplore.ieee.org/document/10202238">The Effect of Robot Adoption on Profit Margins</a>.’ IEEE Transactions on Engineering Management (2023). DOI: 10.1109/TEM.2023.3260734</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have found that robots can have a ‘U-shaped’ effect on profits: causing profit margins to fall at first, before eventually rising again.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It’s important that companies develop new processes at the same time as they’re incorporating robots, otherwise they will reach this same pinch point</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Chander Velu</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.gettyimages.co.uk/detail/photo/smart-robot-in-manufacturing-industry-for-industry-royalty-free-image/1408271637?phrase=robot manufacturing&amp;amp;adppopup=true" target="_blank">kynny via Getty Images</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robots on a manufacturing line</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/social-media/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 03 Aug 2023 10:05:12 +0000 sc604 241131 at ֱ̽life robotic: Meet the Cambridge ֱ̽ researchers fostering human wellbeing using robots /stories/Cambridge-roboticists-wellbeing-support-robot-coaches <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p> ֱ̽team is exploring the capacity robots have to inspire self-reflection, and support the work of psychologists and clinicians. </p> </p></div></div></div> Wed, 19 Jul 2023 10:10:48 +0000 sb726 240791 at Robot ‘chef’ learns to recreate recipes from watching food videos /research/news/robot-chef-learns-to-recreate-recipes-from-watching-food-videos <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/untitled-3_1.jpg?itok=RV53FI1P" alt="Robot arm reaching for a piece of broccoli" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> ֱ̽researchers, from the ֱ̽ of Cambridge, programmed their robotic chef with a ‘cookbook’ of eight simple salad recipes. After watching a video of a human demonstrating one of the recipes, the robot was able to identify which recipe was being prepared and make it.</p>&#13; &#13; <p>In addition, the videos helped the robot incrementally add to its cookbook. At the end of the experiment, the robot came up with a ninth recipe on its own. Their <a href="https://ieeexplore.ieee.org/document/10124218">results</a>, reported in the journal <em>IEEE Access</em>, demonstrate how video content can be a valuable and rich source of data for automated food production, and could enable easier and cheaper deployment of robot chefs.</p>&#13; &#13; <p>Robotic chefs have been featured in science fiction for decades, but in reality, cooking is a challenging problem for a robot. 
Several commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.</p>&#13; &#13; <p>Human cooks can learn new recipes through observation, whether that’s watching another person cook or watching a video on YouTube, but programming a robot to make a range of dishes is costly and time-consuming.</p>&#13; &#13; <p>“We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author.</p>&#13; &#13; <p>Sochacki, a PhD candidate in Professor Fumiya Iida’s <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, and his colleagues devised eight simple salad recipes and filmed themselves making them. They then used a publicly available neural network to train their robot chef. The neural network had already been programmed to identify a range of different objects, including the fruits and vegetables used in the eight salad recipes (broccoli, carrot, apple, banana and orange).</p>&#13; &#13; <p>Using computer vision techniques, the robot analysed each frame of video and was able to identify the different objects and features, such as a knife and the ingredients, as well as the human demonstrator’s arms, hands and face. Both the recipes and the videos were converted to vectors and the robot performed mathematical operations on the vectors to determine the similarity between a demonstration and each recipe.</p>&#13; &#13; <p>By correctly identifying the ingredients and the actions of the human chef, the robot could determine which of the recipes was being prepared. The robot could infer that if the human demonstrator was holding a knife in one hand and a carrot in the other, the carrot would then get chopped up.</p>&#13; &#13; <p>Of the 16 videos it watched, the robot recognised the correct recipe 93% of the time, even though it only detected 83% of the human chef’s actions. The robot was also able to detect that slight variations in a recipe, such as making a double portion or normal human error, were variations and not a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its cookbook and made it.</p>&#13; &#13; <p>“It’s amazing how much nuance the robot was able to detect,” said Sochacki. “These recipes aren’t complex – they’re essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots.”  </p>&#13; &#13; <p>The videos used to train the robot chef are not like the food videos made by some social media influencers, which are full of fast cuts and visual effects, and quickly move back and forth between the person preparing the food and the dish they’re preparing. For example, the robot would struggle to identify a carrot if the human demonstrator had their hand wrapped around it – for the robot to identify the carrot, the human demonstrator had to hold up the carrot so that the robot could see the whole vegetable.</p>&#13; &#13; <p>“Our robot isn’t interested in the sorts of food videos that go viral on social media – they’re simply too hard to follow,” said Sochacki. 
“But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes.”</p>&#13; &#13; <p> ֱ̽research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Grzegorz Sochacki et al. ‘<a href="https://ieeexplore.ieee.org/document/10124218">Recognition of Human Chef’s Intentions for Incremental Learning of Cookbook by Robotic Salad Chef</a>.’ IEEE Access (2023). DOI: 10.1109/ACCESS.2023.3276234</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have trained a robotic ‘chef’ to watch and learn from cooking videos, and recreate the dish itself.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Greg Sochacki</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-208991" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/robot-chef-learns-to-recreate-recipes-from-watching-food-videos">Robot ‘chef’ learns to recreate recipes from watching food videos</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-2 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/nx3k4XA3x4Q?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br />&#13; ֱ̽text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/social-media/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 05 Jun 2023 01:00:00 +0000 sc604 239811 at It’s all in the wrist: energy-efficient robot hand learns how not to drop the ball /stories/robotic-hand <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have designed a low-cost, energy-efficient robotic hand that can grasp a range of objects – and not drop them – using just the movement of its wrist and the feeling in its ‘skin’.  </p> </p></div></div></div> Wed, 12 Apr 2023 03:23:34 +0000 sc604 238441 at Robots can help improve mental wellbeing at work – as long as they look right /research/news/robots-can-help-improve-mental-wellbeing-at-work-as-long-as-they-look-right <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/two-robots.jpg?itok=mQbW6APr" alt="Humanoid QT robot and toy-like Misty robot" title="QT robot (left) and Misty robot (right), Credit: Hatice Gunes" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Researchers from the ֱ̽ of Cambridge conducted a study in a tech consultancy firm using two robot wellbeing coaches, where 26 employees participated in weekly robot-led wellbeing sessions for four weeks. Although the robots had identical voices, facial expressions, and scripts for the sessions, the physical appearance of the robot affected how participants interacted with it.</p>&#13; &#13; <p>Participants who did their wellbeing exercises with a toy-like robot said that they felt more of a connection with their ‘coach’ than participants who worked with a humanoid-like robot. ֱ̽researchers say that perception of robots is affected by popular culture, where the only limit on what robots can do is the imagination. When faced with a robot in the real world however, it often does not live up to expectations.</p>&#13; &#13; <p>Since the toy-like robot looks simpler, participants may have had lower expectations and ended up finding the robot easier to talk and connect with. Participants who worked with the humanoid robot found that their expectations didn’t match reality, since the robot was not capable of having interactive conversations.</p>&#13; &#13; <p>Despite the differences between expectations and reality, the researchers say that their study shows that robots can be a useful tool to promote mental wellbeing in the workplace. 
ֱ̽<a href="https://www.repository.cam.ac.uk/handle/1810/345159">results</a> will be reported today (15 March) at the <a href="https://humanrobotinteraction.org/2023/"><em>ACM/IEEE International Conference on Human-Robot Interaction</em></a> in Stockholm.</p>&#13; &#13; <p> ֱ̽World Health Organization recommends that employers take action to promote and protect mental wellbeing at work, but the implementation of wellbeing practices is often limited by a lack of resources and personnel. Robots have shown some early promise for helping address this gap, but most studies on robots and wellbeing have been conducted in a laboratory setting.</p>&#13; &#13; <p>“We wanted to take the robots out of the lab and study how they might be useful in the real world,” said first author Dr Micol Spitale, from Cambridge’s Department of Computer Science and Technology.</p>&#13; &#13; <p> ֱ̽researchers collaborated with local technology company Cambridge Consultants to design and implement a workplace wellbeing programme using robots. Over the course of four weeks, employees were guided through four different wellbeing exercises by one of two robots: either the <a href="https://luxai.com/humanoid-social-robot-for-research-and-teaching/">QTRobot</a> (QT) or the <a href="https://www.mistyrobotics.com/">Misty II robot </a>(Misty).</p>&#13; &#13; <p> ֱ̽QT is a childlike humanoid robot and roughly 90cm tall, while Misty is a 36cm tall toy-like robot. Both robots have screen faces that can be programmed with different facial expressions.</p>&#13; &#13; <p>“We interviewed different wellbeing coaches and then we programmed our robots to have a coach-like personality, with high openness and conscientiousness,” said co-author Minja Axelsson. “ ֱ̽robots were programmed to have the same personality, the same facial expressions and the same voice, so the only difference between them was the physical robot form.”</p>&#13; &#13; <p>Participants in the experiment were guided through different positive psychology exercises by a robot in an office meeting room. Each session started with the robot asking participants to recall a positive experience or describe something in their lives they were grateful for, and the robot would ask follow-up questions. After the sessions, participants were asked to assess the robot with a questionnaire and an interview. Participants did one session per week for four weeks, and worked with the same robot for each session.</p>&#13; &#13; <p>Participants who worked with the toy-like Misty robot reported that they had a better working connection with the robot than participants who worked with the child-like QT robot. Participants also had a more positive perception of Misty overall.</p>&#13; &#13; <p>“It could be that since the Misty robot is more toy-like, it matched their expectations,” said Spitale. “But since QT is more humanoid, they expected it to behave like a human, which may be why participants who worked with QT were slightly underwhelmed.”</p>&#13; &#13; <p>“ ֱ̽most common response we had from participants was that their expectations of the robot didn’t match with reality,” said Professor Hatice Gunes, who led the research. “We programmed the robots with a script, but participants were hoping there would be more interactivity. It’s incredibly difficult to create a robot that’s capable of natural conversation. 
New developments in large language models could really be beneficial in this respect.”</p>&#13; &#13; <p>“Our perceptions of how robots should look or behave might be holding back the uptake of robotics in areas where they can be useful,” said Axelsson.</p>&#13; &#13; <p>Although the robots used in the experiment are not as advanced as C-3PO or other fictional robots, participants still said they found the wellbeing exercises helpful, and that they were open to the idea of talking to a robot in future.</p>&#13; &#13; <p>“ ֱ̽robot can serve as a physical reminder to commit to the practice of wellbeing exercises,” said Gunes. “And just saying things out loud, even to a robot, can be helpful when you’re trying to improve mental wellbeing.”</p>&#13; &#13; <p> ֱ̽team is now working to enhance the robot coaches’ responsiveness during coaching practices and interactions.</p>&#13; &#13; <p> ֱ̽research was supported by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Hatice Gunes is a Staff Fellow of Trinity Hall, Cambridge.</p>&#13; &#13; <p><br />&#13; <em><strong>Reference:</strong><br />&#13; Micol Spitale, Minja Axelsson, and Hatice Gunes. ‘Robotic Mental Well-being Coaches for the Workplace: An In-the-Wild Study on Form.’ Paper presented to the ACM/IEEE International Conference on Human-Robot Interaction, Stockholm, Sweden, 13-16 March 2023.</em></p>&#13; &#13; <p><em>Try a positive <a href="https://www.festival.cam.ac.uk/events/try-positive-psychology-session-robot">psychology session </a>with the robots used in this research as part of the Cambridge Festival on Saturday, 18 March. </em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Robots can be useful as mental wellbeing coaches in the workplace – but perception of their effectiveness depends in large part on what the robot looks like.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Our perceptions of how robots should look or behave might be holding back the uptake of robotics in areas where they can be useful</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Minja Axelsson</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Hatice Gunes</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">QT robot (left) and Misty robot (right)</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; ֱ̽text in this work is licensed under a <a href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. 
We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 15 Mar 2023 00:50:57 +0000 sc604 237651 at Robots can be used to assess children’s mental wellbeing, study suggests /research/news/robots-can-be-used-to-assess-childrens-mental-wellbeing-study-suggests <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/shaking-hands_0.jpg?itok=6DtkvW3H" alt="Robot shaking hands with Dr Micol Spitale" title="Nao robot shaking hands with study co-author Dr Micol Spitale, Credit: Rachel Gardner" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A team of roboticists, computer scientists and psychiatrists from the ֱ̽ of Cambridge carried out a study with 28 children between the ages of eight and 13, and had a child-sized humanoid robot administer a series of standard psychological questionnaires to assess the mental wellbeing of each participant.</p>&#13; &#13; <p> ֱ̽children were willing to confide in the robot, in some cases sharing information with the robot that they had not yet shared via the standard assessment method of online or in-person questionnaires. This is the first time that robots have been used to assess mental wellbeing in children.</p>&#13; &#13; <p> ֱ̽researchers say that robots could be a useful addition to traditional methods of mental health assessment, although they are not intended to be a substitute for professional mental health support. ֱ̽<a href="https://www.repository.cam.ac.uk/handle/1810/338405">results</a> will be presented today at the <em>31st IEEE International Conference on Robot &amp; Human Interactive Communication (RO-MAN)</em> in Naples, Italy.</p>&#13; &#13; <p>During the COVID-19 pandemic, home schooling, financial pressures, and isolation from peers and friends impacted the mental health of many children. Even before the pandemic however, anxiety and depression among children in the UK has been on the rise, but the resources and support to address mental wellbeing are severely limited.</p>&#13; &#13; <p>Professor Hatice Gunes, who leads the <a href="https://cambridge-afar.github.io/">Affective Intelligence and Robotics Laboratory</a> in Cambridge’s <a href="https://www.cst.cam.ac.uk/">Department of Computer Science and Technology</a>, has been studying how socially-assistive robots (SARs) can be used as mental wellbeing ‘coaches’ for adults, but in recent years has also been studying how they may be beneficial to children.</p>&#13; &#13; <p>“After I became a mother, I was much more interested in how children express themselves as they grow, and how that might overlap with my work in robotics,” said Gunes. “Children are quite tactile, and they’re drawn to technology. If they’re using a screen-based tool, they’re withdrawn from the physical world. 
But robots are perfect because they’re in the physical world – they’re more interactive, so the children are more engaged.”</p>&#13; &#13; <p>With colleagues in Cambridge’s Department of Psychiatry, Gunes and her team designed an experiment to see if robots could be a useful tool to assess mental wellbeing in children.</p>&#13; &#13; <p>“There are times when traditional methods aren’t able to catch mental wellbeing lapses in children, as sometimes the changes are incredibly subtle,” said Nida Itrat Abbasi, the study’s first author. “We wanted to see whether robots might be able to help with this process.”</p>&#13; &#13; <p>For the study, 28 participants between ages eight and 13 each took part in a one-to-one 45-minute session with a Nao robot – a humanoid robot about 60 centimetres tall. A parent or guardian, along with members of the research team, observed from an adjacent room. Prior to each session, children and their parent or guardian completed a standard online questionnaire to assess each child’s mental wellbeing.</p>&#13; &#13; <p>During each session, the robot performed four different tasks: 1) asked open-ended questions about happy and sad memories over the last week; 2) administered the Short Mood and Feelings Questionnaire (SMFQ); 3) administered a picture task inspired by the Children’s Apperception Test (CAT), where children are asked to answer questions related to pictures shown; and 4) administered the Revised Children’s Anxiety and Depression Scale (RCADS) for generalised anxiety, panic disorder and low mood.</p>&#13; &#13; <p>Children were divided into three different groups following the SMFQ, according to how likely they were to be struggling with their mental wellbeing. Participants interacted with the robot throughout the session by speaking with it, or by touching sensors on the robot’s hands and feet. Additional sensors tracked participants’ heartbeat, head and eye movements during the session.</p>&#13; &#13; <p>Study participants all said they enjoyed talking with the robot: some shared information with the robot that they hadn’t shared either in person or on the online questionnaire.</p>&#13; &#13; <p>The researchers found that children with varying levels of wellbeing concerns interacted differently with the robot. For children that might not be experiencing mental wellbeing-related problems, the researchers found that interacting with the robot led to more positive response ratings to the questionnaires. However, for children that might be experiencing wellbeing-related concerns, the robot may have enabled them to divulge their true feelings and experiences, leading to more negative response ratings to the questionnaire.</p>&#13; &#13; <p>“Since the robot we use is child-sized, and completely non-threatening, children might see the robot as a confidante – they feel like they won’t get into trouble if they share secrets with it,” said Abbasi. 
“Other researchers have found that children are more likely to divulge private information – like that they’re being bullied, for example – to a robot than they would be to an adult.”</p>&#13; &#13; <p> ֱ̽researchers say that while their results show that robots could be a useful tool for psychological assessment of children, they are not a substitute for human interaction.</p>&#13; &#13; <p>“We don’t have any intention of replacing psychologists or other mental health professionals with robots, since their expertise far surpasses anything a robot can do,” said co-author <a href="https://micolspitalecom.wordpress.com/">Dr Micol Spitale</a>. “However, our work suggests that robots could be a useful tool in helping children to open up and share things they might not be comfortable sharing at first.”</p>&#13; &#13; <p> ֱ̽researchers say that they hope to expand their survey in future, by including more participants and following them over time. They are also investigating whether similar results could be achieved if children interact with the robot via video chat.</p>&#13; &#13; <p> ֱ̽research was supported in part by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and NIHR Cambridge Biomedical Research Centre. Hatice Gunes is a Fellow of Trinity Hall, Cambridge. </p>&#13; &#13; <p><em><strong>Reference:</strong><br />&#13; Nida Itrat Abbasi et al. ‘<a href="https://ras.papercept.net/conferences/conferences/ROMAN22/program/ROMAN22_ContentListWeb_4.html#th601">Can Robots Help in the Evaluation of Mental Wellbeing in Children? An Empirical Study</a>.’ Paper presented to the 31st IEEE International Conference on Robot &amp; Human Interactive Communication (RO-MAN), Naples, Italy, 29 August – 2 September 2022.</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Robots can be better at detecting mental wellbeing issues in children than parent-reported or self-reported testing, a new study suggests.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">Children might see the robot as a confidante – they feel like they won’t get into trouble if they share secrets with it</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Nida Itrat Abbasi</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Rachel Gardner</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Nao robot shaking hands with study co-author Dr Micol Spitale</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; ֱ̽text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. 
Images, including our videos, are Copyright © ֱ̽ of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Wed, 31 Aug 2022 23:53:05 +0000 sc604 234001 at