University of Cambridge - sensors /taxonomy/subjects/sensors en Scientists develop ‘smart pyjamas’ to monitor sleep disorders /research/news/scientists-develop-smart-pyjamas-to-monitor-sleep-disorders <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/smart-pyjamas.jpg?itok=cvWKsZHo" alt="Illustration and photograph of &#039;smart pyjamas&#039;" title="Illustration and photograph of &amp;#039;smart pyjamas&amp;#039;, Credit: Luigi Occhipinti" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The team, led by the University of Cambridge, developed printed fabric sensors that can monitor breathing by detecting tiny movements in the skin, even when the pyjamas are worn loosely around the neck and chest.</p> <p>The sensors embedded in the smart pyjamas were trained using a ‘lightweight’ AI algorithm and can identify six different sleep states with 98.6% accuracy, while ignoring regular sleep movements such as tossing and turning. The energy-efficient sensors only require a handful of examples of sleep patterns to successfully identify the difference between regular and disordered sleep.</p> <p>The researchers say that their smart pyjamas could be useful for the millions of people in the UK who struggle with disordered sleep to monitor their sleep, and how it might be affected by lifestyle changes. The <a href="https://www.pnas.org/doi/10.1073/pnas.2420498122">results</a> are reported in the <em>Proceedings of the National Academy of Sciences (PNAS)</em>.</p> <p>Sleep is vital for human health, yet more than 60% of adults experience poor sleep quality, leading to the loss of between 44 and 54 annual working days, and an estimated one percent reduction in global GDP. Sleep behaviours such as mouth breathing, sleep apnoea and snoring are major contributors to poor sleep quality, and can lead to chronic conditions such as cardiovascular disease, diabetes and depression.</p> <p>“Poor sleep has huge effects on our physical and mental health, which is why proper sleep monitoring is vital,” said Professor Luigi Occhipinti from the Cambridge Graphene Centre, who led the research. “However, the current gold standard for sleep monitoring, polysomnography or PSG, is expensive, complicated and isn’t suitable for long-term use at home.”</p> <p>Home devices that are simpler than PSG, such as home sleep tests, typically focus on a single condition and are bulky or uncomfortable. Wearable devices such as smartwatches, while more comfortable to wear, can only infer sleep quality, and are not effective for accurately monitoring disordered sleep.</p> <p>“We need something that is comfortable and easy to use every night, but is accurate enough to provide meaningful information about sleep quality,” said Occhipinti.</p> <p>To develop the smart pyjamas, Occhipinti and his colleagues built on their earlier work on a <a href="/research/news/smart-choker-uses-ai-to-help-people-with-speech-impairment-to-communicate">smart choker</a> for people with speech impairments.
The team re-designed the graphene-based sensors for breath analysis during sleep, and made several design improvements to increase sensitivity.</p> <p>“Thanks to the design changes we made, the sensors are able to detect different sleep states, while ignoring regular tossing and turning,” said Occhipinti. “The improved sensitivity also means that the smart garment does not need to be worn tightly around the neck, which many people would find uncomfortable. As long as the sensors are in contact with the skin, they provide highly accurate readings.”</p> <p>The researchers designed a machine learning model, called SleepNet, that uses the signals captured by the sensors to identify sleep states including nasal breathing, mouth breathing, snoring, teeth grinding, central sleep apnoea (CSA), and obstructive sleep apnoea (OSA). SleepNet is a ‘lightweight’ AI network that reduces computational complexity to the point where it can be run on portable devices, without the need to connect to computers or servers.</p> <p>“We pruned the AI model to the point where we could get the lowest computational cost with the highest degree of accuracy,” said Occhipinti. “This way we are able to embed the main data processors in the sensors directly.”</p> <p>The smart pyjamas were tested on healthy patients and those with sleep apnoea, and were able to detect a range of sleep states with an accuracy of 98.6%. By treating the smart pyjamas with a special starching step, the researchers were able to improve the durability of the sensors so they can be run through a regular washing machine.</p> <p>The most recent version of the smart pyjamas is also capable of wireless data transfer, meaning the sleep data can be securely transferred to a smartphone or computer.</p> <p>“Sleep is so important to health, and reliable sleep monitoring can be key in preventative care,” said Occhipinti. “Since this garment can be used at home, rather than in a hospital or clinic, it can alert users to changes in their sleep that they can then discuss with their doctor. Sleep behaviours such as nasal versus mouth breathing are not typically picked up in an NHS sleep analysis, but it can be an indicator of disordered sleep.”</p> <p>The researchers are hoping to adapt the sensors for a range of health conditions or home uses, such as baby monitoring, and have been in discussions with different patient groups. They are also working to improve the durability of the sensors for long-term use.</p> <p>The research was supported in part by the EU Graphene Flagship, Haleon, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).</p> <p><strong>Reference:</strong><br /> Chenyu Tang, Wentian Yi et al. ‘<a href="https://www.pnas.org/doi/10.1073/pnas.2420498122">A deep learning-enabled smart garment for accurate and versatile monitoring of sleep conditions in daily life</a>.’ PNAS (2025). DOI: 10.1073/pnas.2420498122</p>
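<p>As a loose, hypothetical illustration of the pruning idea mentioned above (this is not the authors’ SleepNet; every number and name below is made up), the following sketch zeroes out the smallest-magnitude weights of a toy classification layer and checks that inference still runs on the much sparser model:</p> <pre><code># Illustrative sketch only: NOT the SleepNet model described above.
# It shows the general idea of magnitude pruning (dropping the smallest
# weights of a trained layer to cut computation) on a made-up weight matrix.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 6))   # hypothetical dense layer: 64 features -> 6 sleep states
x = rng.normal(size=64)        # one hypothetical feature vector from the breathing signal

def prune_by_magnitude(weights, keep_fraction):
    """Zero out all but the largest-magnitude weights."""
    threshold = np.quantile(np.abs(weights), 1.0 - keep_fraction)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

W_pruned = prune_by_magnitude(W, keep_fraction=0.3)   # keep roughly 30% of weights
sparsity = 1.0 - np.count_nonzero(W_pruned) / W.size

print(f"fraction of weights removed: {sparsity:.0%}")
print("predicted state (full model)  :", int(np.argmax(x @ W)))
print("predicted state (pruned model):", int(np.argmax(x @ W_pruned)))
</code></pre>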
</div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed comfortable, washable ‘smart pyjamas’ that can monitor sleep disorders such as sleep apnoea at home, without the need for sticky patches, cumbersome equipment or a visit to a specialist sleep clinic.</p> </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">We need something that is comfortable and easy to use every night, but is accurate enough to provide meaningful information about sleep quality</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Luigi Occhipinti</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.occhipintigroup.com/" target="_blank">Luigi Occhipinti</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Illustration and photograph of &#039;smart pyjamas&#039;</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 18 Feb 2025 11:06:44 +0000 sc604 248705 at Imperceptible sensors made from ‘electronic spider silk’ can be printed directly on human skin /research/news/imperceptible-sensors-made-from-electronic-spider-silk-can-be-printed-directly-on-human-skin <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/picture1_4.jpg?itok=wncwlNCX" alt="Sensors printed on human fingers" title="Sensors printed on human fingers, Credit: Huang Lab, Cambridge" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The method, developed by researchers from the University of Cambridge, takes its inspiration from spider silk, which can conform and stick to a range of surfaces. These ‘spider silks’ also incorporate bioelectronics, so that different sensing capabilities can be added to the ‘web’.</p> <p>The fibres, at least 50 times smaller than a human hair, are so lightweight that the researchers printed them directly onto the fluffy seedhead of a dandelion without collapsing its structure. When printed on human skin, the fibre sensors conform to the skin and expose the sweat pores, so the wearer doesn’t detect their presence. Tests of the fibres printed onto a human finger suggest they could be used as continuous health monitors.</p> <p>This low-waste and low-emission method for augmenting living structures could be used in a range of fields, from healthcare and virtual reality, to electronic textiles and environmental monitoring. The <a href="https://www.nature.com/articles/s41928-024-01174-4">results</a> are reported in the journal <em>Nature Electronics</em>.</p> <p>Although human skin is remarkably sensitive, augmenting it with electronic sensors could fundamentally change how we interact with the world around us. For example, sensors printed directly onto the skin could be used for continuous health monitoring, for understanding skin sensations, or could improve the sensation of ‘reality’ in gaming or virtual reality applications.</p> <p>While wearable technologies with embedded sensors, such as smartwatches, are widely available, these devices can be uncomfortable, obtrusive and can inhibit the skin’s intrinsic sensations.</p> <p>“If you want to accurately sense anything on a biological surface like skin or a leaf, the interface between the device and the surface is vital,” said Professor Yan Yan Shery Huang from Cambridge’s Department of Engineering, who led the research. “We also want bioelectronics that are completely imperceptible to the user, so they don’t in any way interfere with how the user interacts with the world, and we want them to be sustainable and low waste.”</p> <p>There are multiple methods for making wearable sensors, but these all have drawbacks.
Flexible electronics, for example, are normally printed on plastic films that don’t allow gas or moisture to pass through, so it would be like wrapping your skin in cling film. Other researchers have recently developed flexible electronics that are gas-permeable, like artificial skins, but these still interfere with normal sensation, and rely on energy- and waste-intensive manufacturing techniques.</p> <p>3D printing is another potential route for bioelectronics since it is less wasteful than other production methods, but leads to thicker devices that can interfere with normal behaviour. Spinning electronic fibres results in devices that are imperceptible to the user, but don't have a high degree of sensitivity or sophistication, and they’re difficult to transfer onto the object in question.</p> <p>Now, the Cambridge-led team has developed a new way of making high-performance bioelectronics that can be customised to a wide range of biological surfaces, from a fingertip to the fluffy seedhead of a dandelion, by printing them directly onto that surface. Their technique takes its inspiration in part from spiders, who create sophisticated and strong web structures adapted to their environment, using minimal material.</p> <p>The researchers spun their bioelectronic ‘spider silk’ from PEDOT:PSS (a biocompatible conducting polymer), hyaluronic acid and polyethylene oxide. The high-performance fibres were produced from water-based solution at room temperature, which enabled the researchers to control the ‘spinnability’ of the fibres. The researchers then designed an orbital spinning approach to allow the fibres to morph to living surfaces, even down to microstructures such as fingerprints.</p> <p>Tests of the bioelectronic fibres, on surfaces including human fingers and dandelion seedheads, showed that they provided high-quality sensor performance while being imperceptible to the host.</p> <p>“Our spinning approach allows the bioelectronic fibres to follow the anatomy of different shapes, at both the micro and macro scale, without the need for any image recognition,” said Andy Wang, the first author of the paper. “It opens up a whole different angle in terms of how sustainable electronics and sensors can be made. It’s a much easier way to produce large area sensors.”</p> <p>Most high-resolution sensors are made in an industrial cleanroom and require the use of toxic chemicals in a multi-step and energy-intensive fabrication process. The Cambridge-developed sensors can be made anywhere and use a tiny fraction of the energy that regular sensors require.</p> <p>The bioelectronic fibres, which are repairable, can be simply washed away when they have reached the end of their useful lifetime, and generate less than a single milligram of waste: by comparison, a typical single load of laundry produces between 600 and 1500 milligrams of fibre waste.</p> <p>“Using our simple fabrication technique, we can put sensors almost anywhere and repair them where and when they need it, without needing a big printing machine or a centralised manufacturing facility,” said Huang. “These sensors can be made on-demand, right where they’re needed, and produce minimal waste and emissions.”</p> <p>The researchers say their devices could be used in applications from health monitoring and virtual reality, to precision agriculture and environmental monitoring.
In future, other functional materials could be incorporated into this fibre printing method, to build integrated fibre sensors for augmenting living systems with display, computation, and energy conversion functions. The research is being commercialised with the support of Cambridge Enterprise, the University’s commercialisation arm.</p> <p>The research was supported in part by the European Research Council, Wellcome, the Royal Society, and the Biotechnology and Biological Sciences Research Council (BBSRC), part of UK Research and Innovation (UKRI).</p> <p><em><strong>Reference:</strong><br /> Wenyu Wang et al. ‘<a href="https://www.nature.com/articles/s41928-024-01174-4">Sustainable and imperceptible augmentation of living structures with organic bioelectronic fibres</a>.’ Nature Electronics (2024). DOI: 10.1038/s41928-024-01174-4</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a method to make adaptive and eco-friendly sensors that can be directly and imperceptibly printed onto a wide range of biological surfaces, whether that’s a finger or a flower petal.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Huang Lab, Cambridge</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Sensors printed on human fingers</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 24 May 2024 09:23:44 +0000 sc604 246131 at Major investment in doctoral training announced /research/news/major-investment-in-doctoral-training-announced <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/gettyimages-1457151572-dp.jpg?itok=h6mrjT0o" alt="Two people working on circuit boards in an office" title="Two people working on circuit boards, Credit: Phynart Studio via Getty Images" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The 65 Engineering and Physical Sciences Research Council (EPSRC) Centres for Doctoral Training (CDTs) will support leading research in areas of national importance, including net zero, AI, defence and security, healthcare and quantum technologies. The £1 billion in funding – from government, universities and industry – represents the UK’s biggest-ever investment in engineering and physical sciences doctoral skills.</p> <p>The University of Cambridge will lead two of the CDTs and is a partner in a further five CDTs. The funding will support roughly 150 Cambridge PhD students over the next five years.</p> <p>The CDT in Future Infrastructure and Built Environment: Unlocking Net Zero (FIBE3 CDT), led by Professor Abir Al-Tabbaa from the Department of Engineering, will focus on meeting the needs of the infrastructure and construction sector in its pursuit of net zero by 2050 and is a collaboration between Cambridge, 30+ industry partners and eight international academic partners.</p> <p>“The infrastructure sector is responsible for significant CO2 emissions, energy use and consumption of natural resources, and it’s key to unlocking net zero,” said Al-Tabbaa. “This CDT will develop the next generation of highly talented doctoral graduates who will be equipped to lead the design and implementation of the net zero infrastructure agenda in the UK.”</p> <p>The FIBE3 CDT will provide more than 70 fully funded studentships over the next five years. The £8.1M funding from EPSRC is supported by £1.3M funding from the University and over £2.5M from industry as well as over £8.9M of in-kind contributions. Recruitment is underway for the first FIBE3 CDT cohort, to start in October.</p> <p>The CDT in Sensor Technologies and Applications in an Uncertain World, led by Professor Clemens Kaminski from the Department of Chemical Engineering and Biotechnology, will cover the entire sensor research chain – from development to end of life – and will emphasise systems thinking, responsible research and innovation, co-creation, and cohort learning.</p> <p>“Our CDT will provide students with comprehensive expertise and skills in sensor technology,” said Kaminski.
“This programme will develop experts who are capable of driving impactful sensor solutions for industry and society, and can deal with uncertain data and the consequences of a rapidly changing world.”</p> <p>The University is also a partner in:</p> <ul> <li>EPSRC Centre for Doctoral Training in 2D Materials of Tomorrow (2DMoT), led by Professor Irina Grigorieva from the University of Manchester</li> <li>EPSRC Centre for Doctoral Training Developing National Capability for Materials 4.0 and Henry Royce Institute, led by Professor William Parnell from the University of Manchester</li> <li>EPSRC Centre for Doctoral Training in Superconductivity: Enabling Transformative Technologies, led by Professor Antony Carrington from the University of Bristol</li> <li>EPSRC Centre for Doctoral Training in Aerosol Science: Harnessing Aerosol Science for Improved Security, Resilience and Global Health, led by Professor Jonathan Reid from the University of Bristol</li> <li>EPSRC Centre for Doctoral Training in Photonic and Electronic Systems, led by Professor Alwyn Seeds from University College London</li> </ul> <p>“As innovators across the world break new ground faster than ever, it is vital that government, business and academia invest in ambitious UK talent, giving them the tools to pioneer new discoveries that benefit all our lives while creating new jobs and growing the economy,” said Science and Technology Secretary Michelle Donelan. “By targeting critical technologies including artificial intelligence and future telecoms, we are supporting world-class universities across the UK to build the skills base we need to unleash the potential of future tech and maintain our country’s reputation as a hub of cutting-edge research and development.”</p> <p>“The Centres for Doctoral Training will help to prepare the next generation of researchers, specialists and industry experts across a wide range of sectors and industries,” said Professor Charlotte Deane, Executive Chair of the Engineering and Physical Sciences Research Council, part of UK Research and Innovation. “Spanning locations across the UK and a wide range of disciplines, the new centres are a vivid illustration of the UK’s depth of expertise and potential, which will help us to tackle large-scale, complex challenges and benefit society and the economy. The high calibre of both the new centres and applicants is a testament to the abundance of research excellence across the UK, and EPSRC’s role as part of UKRI is to invest in this excellence to advance knowledge and deliver a sustainable, resilient and prosperous nation.”</p> <p>More than 4,000 doctoral students will be trained over the next nine years, building on EPSRC’s long-standing record of sustained support for doctoral training.</p> <p>Total investment in the CDTs includes:</p> <ul> <li>£479 million by EPSRC, including £16 million of additional UKRI funding to support CDTs in quantum technologies</li> <li>Over £7 million from the Biotechnology and Biological Sciences Research Council, also part of UKRI, to co-fund three CDTs</li> <li>£16 million by the MOD to support two CDTs</li> <li>£169 million by UK universities</li> <li>plus a further £420 million in financial and in-kind support from business partners</li> </ul> <p>This investment includes an additional £135 million for CDTs which will start in 2025. More than 1,400 companies, higher education institutions, charities and civic organisations are taking part in the centres for doctoral training.
CDTs have a significant reputation for training future UK academics, industrialists and innovators who have gone on to develop the latest technologies.</p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Sixty-five Centres for Doctoral Training – which will train more than 4000 doctoral students across the UK – have been announced by Science, Innovation and Technology Secretary Michelle Donelan.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Phynart Studio via Getty Images</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Two people working on circuit boards</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Tue, 12 Mar 2024 14:23:55 +0000 Anonymous 245071 at Sensors made from ‘frozen smoke’ can detect toxic formaldehyde in homes and offices /research/news/sensors-made-from-frozen-smoke-can-detect-toxic-formaldehyde-in-homes-and-offices <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/pia23343orig-dp.jpg?itok=KtbikhpC" alt="A block of silica aerogel being held in a person&#039;s hand" title="Silica aerogel, Credit: NASA/JPL-Caltech" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The researchers, from the University of Cambridge, developed sensors made from highly porous materials known as aerogels. By precisely engineering the shape of the holes in the aerogels, the sensors were able to detect the fingerprint of formaldehyde, a common indoor air pollutant, at room temperature.</p> <p>The proof-of-concept sensors, which require minimal power, could be adapted to detect a wide range of hazardous gases, and could also be miniaturised for wearable and healthcare applications.
The <a href="https://www.science.org/doi/full/10.1126/sciadv.adk6856">results</a> are reported in the journal <em>Science Advances</em>.</p> <p>Volatile organic compounds (VOCs) are a major source of indoor air pollution, causing watery eyes, burning in the eyes and throat, and difficulty breathing at elevated levels. High concentrations can trigger attacks in people with asthma, and prolonged exposure may cause certain cancers.</p> <p>Formaldehyde is a common VOC and is emitted by household items including pressed wood products (such as MDF), wallpapers and paints, and some synthetic fabrics. For the most part, the levels of formaldehyde emitted by these items are low, but levels can build up over time, especially in garages where paints and other formaldehyde-emitting products are more likely to be stored.</p> <p>According to a 2019 <a href="https://www.globalactionplan.org.uk/news/nearly-half-of-uk-homes-have-high-indoor-air-pollution-new-report">report</a> from the campaign group Clean Air Day, a fifth of households in the UK showed notable concentrations of formaldehyde, with 13% of residences surpassing the recommended limit set by the World Health Organization (WHO).</p> <p>“VOCs such as formaldehyde can lead to serious health problems with prolonged exposure even at low concentrations, but current sensors don’t have the sensitivity or selectivity to distinguish between VOCs that have different impacts on health,” said <a href="https://www.nanoengineering.eng.cam.ac.uk/">Professor Tawfique Hasan</a> from the <a href="https://www.graphene.cam.ac.uk/">Cambridge Graphene Centre</a>, who led the research.</p> <p>“We wanted to develop a sensor that is small and doesn’t use much power, but can selectively detect formaldehyde at low concentrations,” said Zhuo Chen, the paper’s first author.</p> <p>The researchers based their sensors on aerogels: ultra-light materials sometimes referred to as ‘liquid smoke’, since they are more than 99% air by volume. The open structure of aerogels allows gases to easily move in and out. By precisely engineering the shape, or morphology, of the holes, the aerogels can act as highly effective sensors.</p> <p>Working with colleagues at Warwick University, the Cambridge researchers optimised the composition and structure of the aerogels to increase their sensitivity to formaldehyde, making them into filaments about three times the width of a human hair. The researchers 3D printed lines of a paste made from graphene, a two-dimensional form of carbon, and then freeze-dried the graphene paste to form the holes in the final aerogel structure. The aerogels also incorporate tiny semiconductors known as quantum dots.</p> <p>The sensors they developed were able to detect formaldehyde at concentrations as low as eight parts per billion, which is 0.4 percent of the level deemed safe in UK workplaces. The sensors also work at room temperature, consuming very low power.</p> <p>“Traditional gas sensors need to be heated up, but because of the way we’ve engineered the materials, our sensors work incredibly well at room temperature, so they use between 10 and 100 times less power than other sensors,” said Chen.</p> <p>To improve selectivity, the researchers then incorporated machine learning algorithms into the sensors. The algorithms were trained to detect the ‘fingerprint’ of different gases, so that the sensor was able to distinguish the fingerprint of formaldehyde from other VOCs.</p>
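<p>As a rough, hypothetical illustration of this kind of fingerprint classification (the gas names, channel counts and data below are invented, and this is not the algorithm used in the paper), a classifier can be trained on simulated response patterns and then asked to name the gas behind a new reading:</p> <pre><code># Illustrative sketch with synthetic data: not the algorithm from the paper.
# Each gas is assumed to produce a characteristic response pattern ("fingerprint")
# across the sensor's readout channels; a classifier learns to tell them apart.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
gases = ["formaldehyde", "ethanol", "acetone"]      # hypothetical VOC classes
fingerprints = rng.normal(size=(len(gases), 16))    # made-up mean response per gas, 16 channels

# Simulate noisy readings scattered around each fingerprint
X = np.vstack([fp + 0.3 * rng.normal(size=(200, 16)) for fp in fingerprints])
y = np.repeat(np.arange(len(gases)), 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("predicted gas for one new reading:", gases[clf.predict(X_test[:1])[0]])
</code></pre>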
<p>“Existing VOC detectors are blunt instruments – you only get one number for the overall concentration in the air,” said Hasan. “By building a sensor that can detect specific VOCs at very low concentrations in real time, it can give home and business owners a more accurate picture of air quality and any potential health risks.”</p> <p>The researchers say the same technique could be used to develop sensors to detect other VOCs. In theory, a device the size of a standard household carbon monoxide detector could incorporate multiple different sensors within it, providing real-time information about a range of different hazardous gases. “At Warwick, we're developing a low-cost multi-sensor platform that will incorporate these new aerogel materials and, coupled with AI algorithms, detect different VOCs,” said co-author Professor Julian Gardner from Warwick University.</p> <p>“By using highly porous materials as the sensing element, we’re opening up whole new ways of detecting hazardous materials in our environment,” said Chen.</p> <p>The research was supported in part by the Henry Royce Institute, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Tawfique Hasan is a Fellow of Churchill College, Cambridge.</p> <p><em><strong>Reference:</strong><br /> Zhuo Chen et al. ‘<a href="https://www.science.org/doi/full/10.1126/sciadv.adk6856">Real-time, noise and drift resilient formaldehyde sensing at room temperature with aerogel filaments</a>.’ Science Advances (2024). DOI: 10.1126/sciadv.adk6856</em></p> </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a sensor made from ‘frozen smoke’ that uses artificial intelligence techniques to detect formaldehyde in real time at concentrations as low as eight parts per billion, far beyond the sensitivity of most indoor air quality sensors.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://images.nasa.gov/details/PIA23343" target="_blank">NASA/JPL-Caltech</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Silica aerogel</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved.
We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/public-domain">Public Domain</a></div></div></div> Fri, 09 Feb 2024 19:00:00 +0000 sc604 244381 at Robot trained to read braille at twice the speed of humans /research/news/robot-trained-to-read-braille-at-twice-the-speed-of-humans <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/img-4841-dp.jpg?itok=RoYah_Zz" alt="Robot braille reader" title="Robot braille reader, Credit: Parth Potdar" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The research team, from the University of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. The robot was able to read the braille at 315 words per minute at close to 90% accuracy.</p> <p>Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. The <a href="https://ieeexplore.ieee.org/document/10410896">results</a> are reported in the journal <em>IEEE Robotics and Automation Letters</em>.</p> <p>Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.</p> <p>Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In <a href="https://birlab.org/">Professor Fumiya Iida’s lab</a> in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.</p> <p>“The softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”</p> <p>Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together.
The researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.</p> <p>“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering. “Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”</p> <p>The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.</p> <p>The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.</p> <p>Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.</p> <p>“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”</p> <p>“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.</p> <p>In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.</p> <p><em><strong>Reference:</strong><br /> Parth Potdar et al. ‘<a href="https://ieeexplore.ieee.org/document/10410896">High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions</a>.’ IEEE Robotics and Automation Letters (2024). DOI: 10.1109/LRA.2024.3356978</em></p>
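<p>As a rough sketch of how blurred/sharp training pairs of the kind described above might be generated (this is not the authors’ pipeline; the image, kernel size and numbers are invented), artificial horizontal motion blur can be applied to a sharp image so that a model can later be trained to undo it:</p> <pre><code># Illustrative sketch only: one simple way to create (blurred, sharp) training pairs
# by applying artificial horizontal motion blur to a sharp image, in the spirit of
# the approach described above. Not the authors' actual pipeline.
import numpy as np
from scipy.ndimage import convolve

def motion_blur(image, kernel_length=9):
    """Apply a horizontal motion-blur kernel, mimicking a sensor sliding sideways."""
    kernel = np.full((1, kernel_length), 1.0 / kernel_length)
    return convolve(image, kernel, mode="nearest")

rng = np.random.default_rng(2)
sharp = (rng.random((64, 64)) > 0.97).astype(float)   # toy image of scattered "braille dots"
blurred = motion_blur(sharp)

# A deblurring model would then be trained to map `blurred` back to `sharp`.
print("number of dots in sharp image:", int(sharp.sum()))
print("peak intensity after blurring:", round(float(blurred.max()), 3))
</code></pre>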
</div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.</p> </p></div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-217601" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/can-robots-read-braille">Can robots read braille?</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-1 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/xqtA2Z668Ic?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Parth Potdar</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Robot braille reader</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license"><img alt="Creative Commons License." src="/sites/www.cam.ac.uk/files/inner-images/cc-by-nc-sa-4-license.png" style="border-width: 0px; width: 88px; height: 31px;" /></a><br /> The text in this work is licensed under a <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Mon, 29 Jan 2024 06:04:52 +0000 sc604 244161 at It’s all in the wrist: energy-efficient robot hand learns how not to drop the ball /stories/robotic-hand <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have designed a low-cost, energy-efficient robotic hand that can grasp a range of objects – and not drop them – using just the movement of its wrist and the feeling in its ‘skin’.
</p> </p></div></div></div> Wed, 12 Apr 2023 03:23:34 +0000 sc604 238441 at Artificial intelligence powers record-breaking all-in-one miniature spectrometers /research/news/artificial-intelligence-powers-record-breaking-all-in-one-miniature-spectrometers <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/chip-on-fingertip.jpg?itok=98mjDymU" alt="On-chip spectrometer on a fingertip" title="On-chip spectrometer on a fingertip, Credit: Suvi-Tuuli Akkanen, Mikko Turunen, Vincent Pelgrin. Aalto University." /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>We see light and colours around us every day. However, to analyse the information it carries, we must analyse light using spectrometers, in the lab. These devices detect sparkles and substances that our eyes would otherwise not notice.</p> <p>Now, an international team of researchers, including the University of Cambridge, have designed a miniaturised spectrometer that breaks all current resolution records, and does so in a much smaller package, thanks to computational programmes and artificial intelligence.</p> <p>The new miniaturised devices could be used in a broad range of sectors, from checking the quality of food to analysing starlight or detecting faint clues of life in outer space. The <a href="https://www.science.org/doi/10.1126/science.add8544">results</a> are reported in the journal <em>Science</em>.</p> <p>Traditionally, spectrometers rely on bulky components to filter and disperse light. Modern approaches simplify these components to shrink footprints, but still suffer from limited resolution and bandwidth. Additionally, traditional spectrometers are heavy and take up extraordinary amounts of space, which limits their applications in portable and mobile devices.</p> <p>To tackle these problems, and shrink the size of the system, researchers have coupled layered materials with artificial intelligence algorithms. The result is an all-in-one spectrometer thousands of times smaller than current commercial systems. At the same time, it offers performance comparable to benchtop systems. In other words, these new spectrometers will provide portable alternatives to uncover otherwise invisible information, without even going into the lab.</p> <p>“We eliminate the need for detector arrays, dispersive components, and filters. It’s an all-in-one, miniaturised device that could revolutionise this field,” said Dr Hoon Hahn Yoon, from Aalto University in Finland, first author of the paper. This spectrometer-on-chip technology is expected to offer high performance and new usability across science and industry.</p> <p>The detector uses van der Waals heterostructures – a ‘sandwich’ of different ingredients, including graphene, molybdenum disulfide, and tungsten diselenide. Different combinations of material components enable light detection beyond the visible spectrum, as far as the near-infrared region. This means the spectrometer detects more than just colour, enabling applications such as chemical analysis and night vision.</p> <p>“We detect a continuum spectrum of light, opening a world of possibilities in a myriad of markets,” said Yoon.
“Exploring other material combinations could uncover further functionalities, including even broader hyperspectral detection and improved resolution.”</p> <p>Artificial intelligence is a key aspect of these devices, commonly called ‘computational’ spectrometers. This technology compensates for the inherent noise increase that inevitably occurs when the optical component is wholly removed.</p> <p>“We were able to use mathematical algorithms to successfully reconstruct the signals and spectra, it’s a profound and transformative technological leap,” said lead author Professor Zhipei Sun, also from Aalto University, and a former member of Cambridge’s Department of Engineering. “The current design is just a proof-of-concept. More advanced algorithms, as well as different combinations of materials, could soon provide even better miniaturised spectrometers.”</p> <p>Spectrometers are used for toxin detection in food and cosmetics, cancer imaging, and in spacecraft – including the James Webb Space Telescope. And they will soon become more common thanks to the development and advancement of technologies such as the Internet of Things and Industry 4.0.</p> <p>The detection of light – and the full analysis of spectroscopic information – has applications in sensing, surveillance, smart agriculture, and more. Among the most promising applications for miniaturised spectrometers are chemical and biochemical analysis, thanks to the capabilities of the devices to detect light in the infrared wavelength range.</p> <p>The new devices could be incorporated into instruments like drones, mobile phones, and lab-on-a-chip platforms, which can carry out several experiments in a single integrated circuit. The latter also opens up opportunities in healthcare. In this field, spectrometers and light-detectors are already key components of imaging and diagnostic systems – the new miniaturised devices could enable the simultaneous visualisation and detection of ‘chemical fingerprints’, leading to possibilities in the biomedical area.</p> <p>“Our miniaturised spectrometers offer high spatial and spectral resolution at the micrometre and nanometre scales, which is particularly exciting for responsive bio-implants and innovative imaging techniques,” said co-author Professor Tawfique Hasan, from the Cambridge Graphene Centre.</p> <p>This technology has huge potential for scalability and integration, thanks to its compatibility with well-established industrial processes. It could pave the way for a next generation of smartphone cameras that evolve into hyperspectral cameras, capturing information that conventional colour cameras cannot. Researchers hope their contribution is a stepping stone towards the development of more advanced computational spectrometers, with record-breaking accuracy and resolution. This example, they say, is just the first of many.</p> <p><em><strong>Reference:</strong><br /> Hoon Hahn Yoon et al. ‘<a href="https://www.science.org/doi/10.1126/science.add8544">Miniaturized Spectrometers with a Tunable van der Waals Junction</a>.’ Science (2022). DOI: 10.1126/science.add8544.</em></p>
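<p>As a back-of-the-envelope illustration of the ‘computational spectrometer’ idea described above (all numbers here are invented and this is not the reconstruction algorithm used in the paper), a spectrum can be recovered from a handful of detector readings with a regularised least-squares solver once the device’s wavelength response has been calibrated:</p> <pre><code># Toy illustration of computational spectral reconstruction; not the paper's method.
import numpy as np

rng = np.random.default_rng(3)
n_wavelengths, n_readings = 100, 16

# Hypothetical calibration: how strongly each detector state responds to each wavelength.
R = rng.random((n_readings, n_wavelengths))

# A made-up "true" spectrum with two peaks, and the noisy readings it would produce.
wl = np.linspace(0.0, 1.0, n_wavelengths)
true_spectrum = np.exp(-((wl - 0.3) / 0.05) ** 2) + 0.5 * np.exp(-((wl - 0.7) / 0.08) ** 2)
readings = R @ true_spectrum + 0.01 * rng.normal(size=n_readings)

# Ridge-regularised reconstruction: minimise ||R s - readings||^2 + lam * ||s||^2
lam = 1e-2
s_hat = np.linalg.solve(R.T @ R + lam * np.eye(n_wavelengths), R.T @ readings)

relative_error = np.linalg.norm(s_hat - true_spectrum) / np.linalg.norm(true_spectrum)
print(f"relative reconstruction error: {relative_error:.2f}")
</code></pre>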
</div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Using Artificial Intelligence (AI) to replace optical and mechanical components, researchers have designed a tiny spectrometer that breaks all current resolution records.</p> </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="/" target="_blank">Suvi-Tuuli Akkanen, Mikko Turunen, Vincent Pelgrin. Aalto University.</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">On-chip spectrometer on a fingertip</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br /> The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p> </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Thu, 20 Oct 2022 18:00:00 +0000 sc604 234761 at Self-healing materials for robotics made from ‘jelly’ and salt /research/news/self-healing-materials-for-robotics-made-from-jelly-and-salt <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/selfhealingrobotics.jpg?itok=IX6Jk8iI" alt="" title="Credit: None" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The low-cost jelly-like materials, developed by researchers at the University of Cambridge, can sense strain, temperature and humidity. And unlike earlier self-healing robots, they can also partially repair themselves at room temperature.</p>&#13; &#13; <p>The <a href="https://www.nature.com/articles/s41427-022-00357-9">results</a> are reported in the journal <em>NPG Asia Materials</em>.</p>&#13; &#13; <p>Soft sensing technologies could transform robotics, tactile interfaces and wearable devices, among other applications.
However, most soft sensing technologies aren’t durable and consume high amounts of energy.</p>&#13; &#13; <p>“Incorporating soft sensors into robotics allows us to get a lot more information from them, like how strain on our muscles allows our brains to get information about the state of our bodies,” said David Hardman from Cambridge’s Department of Engineering, the paper’s first author.</p>&#13; &#13; <p>As part of the EU-funded SHERO project, Hardman and his colleagues have been working to develop soft sensing, self-healing materials for robotic hands and arms. These materials can detect when they are damaged, take the necessary steps to temporarily heal themselves and then resume work – all without the need for human interaction.</p>&#13; &#13; <p>“We’ve been working with self-healing materials for several years, but now we’re looking into faster and cheaper ways to make self-healing robots,” said co-author Dr Thomas George-Thuruthel, also from the Department of Engineering.</p>&#13; &#13; <p>Earlier versions of the self-healing robots needed to be heated in order to heal, but the Cambridge researchers are now developing materials that can heal at room temperature, which would make them more useful for real-world applications.</p>&#13; &#13; <p>“We started with a stretchy, gelatine-based material which is cheap, biodegradable and biocompatible and carried out different tests on how to incorporate sensors into the material by adding in lots of conductive components,” said Hardman.</p>&#13; &#13; <p>The researchers found that printing sensors containing sodium chloride – salt – instead of carbon ink resulted in a material with the properties they were looking for. Since salt is soluble in the water-filled hydrogel, it provides a uniform channel for ionic conduction – the movement of ions.</p>&#13; &#13; <p>When measuring the electrical resistance of the printed materials, the researchers found that changes in strain resulted in a highly linear response, which they could use to calculate the deformations of the material. Adding salt also enabled sensing of stretches of more than three times the sensor’s original length, so that the material can be incorporated into flexible and stretchable robotic devices.</p>&#13; &#13; <p>The self-healing materials are cheap and easy to make, either by 3D printing or casting. They are preferable to many existing alternatives since they show long-term strength and stability without drying out, and they are made entirely from widely available, food-safe, materials.</p>&#13; &#13; <p>“It’s a really good sensor considering how cheap and easy it is to make,” said George-Thuruthel. “We could make a whole robot out of gelatine and print the sensors wherever we need them.”</p>&#13; &#13; <p>The self-healing hydrogels bond well with a range of different materials, meaning they can easily be incorporated with other types of robotics. For example, much of the research in the <a href="https://birlab.org/">Bio-Inspired Robotics Laboratory</a>, where the researchers are based, is focused on the development of artificial hands. Although this material is a proof-of-concept, if developed further, it could be incorporated into artificial skins and custom-made wearable and biodegradable sensors.</p>
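<p>As a simple, hypothetical illustration of the highly linear resistance-to-strain response described above (the resistance values below are invented and this is not data from the paper), a linear calibration can be fitted from a few strain and resistance pairs and then inverted to estimate an unknown deformation:</p> <pre><code># Illustrative sketch with made-up numbers: fit a linear calibration between applied
# strain and measured resistance, then invert it to estimate strain from a new reading.
# This is not data or code from the paper.
import numpy as np

strain = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                     # applied strain (stretch ratio - 1)
resistance = np.array([1000.0, 1480.0, 2010.0, 2490.0, 3020.0])  # measured resistance in ohms (hypothetical)

slope, intercept = np.polyfit(strain, resistance, 1)             # least-squares straight-line fit

def estimate_strain(measured_resistance):
    """Invert the linear calibration to recover strain from a resistance reading."""
    return (measured_resistance - intercept) / slope

print(f"calibration: resistance = {slope:.0f} * strain + {intercept:.0f} ohms")
print(f"estimated strain at 1750 ohms: {estimate_strain(1750.0):.2f}")
</code></pre>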
&#13; &#13; <p>This work was supported by the <a href="https://katamaluku.id/">Self-HEaling soft RObotics (SHERO)</a> project, funded under the Future and Emerging Technologies (FET) programme of the European Commission.</p>&#13; &#13; <p><em><strong>Reference:</strong><br /> David Hardman, Thomas George-Thuruthel, and Fumiya Iida. ‘<a href="https://www.nature.com/articles/s41427-022-00357-9">Self-Healing Ionic Gelatin/Glycerol Hydrogels for Strain Sensing Applications</a>.’ NPG Asia Materials (2022). DOI: 10.1038/s41427-022-00357-9</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Researchers have developed self-healing, biodegradable, 3D-printed materials that could be used in the development of realistic artificial hands and other soft robotics applications.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">It’s a really good sensor considering how cheap and easy it is to make</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Thomas George-Thuruthel</div></div></div><div class="field field-name-field-media field-type-file field-label-hidden"><div class="field-items"><div class="field-item even"><div id="file-192031" class="file file-video file-video-youtube"> <h2 class="element-invisible"><a href="/file/self-healing-robot-developed-by-cambridge-uni-engineers">Self healing robot developed by Cambridge Uni engineers</a></h2> <div class="content"> <div class="cam-video-container media-youtube-video media-youtube-2 "> <iframe class="media-youtube-player" src="https://www.youtube-nocookie.com/embed/eVH0YCeI464?wmode=opaque&controls=1&rel=0&autohide=0" frameborder="0" allowfullscreen></iframe> </div> </div> </div> </div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div> Fri, 18 Feb 2022 16:54:17 +0000 sc604 229951 at