Figure 1. Overview of the gesture-recognition wristband (GRW): a) Configuration of the proposed GRW. b) Schematic illustration of a single nanofiber-based pressure-sensor unit. c) Cross section of the wrist with GRW. d) Photograph of the GRW.
One key to the success of the GRW in this work is its highly sensitive NFPSU, shown in Figure 1b. It consists of a filmy optical-nanofiber sensor, a soft liquid sac, and a rigid three-dimensional (3D)-printed resin shell. The optical nanofiber, serving as a light waveguide to detect skin deformation with high sensitivity, is U-shaped for increased compactness and is encapsulated by polydimethylsiloxane (PDMS) films (length: 21 mm; width: 14 mm; thickness: 200 μm) to form the filmy optical-nanofiber sensor. A soft liquid sac (length: 27 mm; width: 23 mm; thickness: 2.5 mm) filled with glycerol is sealed by the filmy optical-nanofiber sensor; it transfers the pressure generated by skin deformation to the optical nanofiber. The rigid 3D-printed resin shell isolates the sensor from mechanical stimuli other than those generated by finger movements. The sac base (length: 6 mm; width: 6 mm; thickness: 200 μm) protrudes through the hole in the center of the resin shell to contact the skin surface and collect pressure signals. Because the liquid sac covers 2-3 adjacent tendons, both the cross-talk between acquired signals and the position-sensitive response are effectively mitigated. Benefiting from the flexible sac base (200 μm thick), the liquid sac transduces the movements of the wrist tendons with high fidelity and excellent reversibility.
The locations of the three sensors were chosen by considering the distribution of the wrist tendons, subcutaneous fat, and blood vessels. To avoid the impact of the wrist pulse on hand-gesture recognition, two NFPSUs on the dorsal wrist (back side) were used to detect the movements of the thumb and middle finger, and one NFPSU on the volar wrist (palm side) was used to detect flexion of the index and ring fingers. As shown in Figure 1c, the three NFPSUs can recognize 12 hand gestures, and the interference of the wrist pulse is negligible.
To make the GRW suitable for testers with different physiques and to avoid random sliding between the skin and the NFPSUs, we used elastic straps to connect the three NFPSUs, as shown in Figure 1d. Note that we do not recommend wearing the wristband too tightly. Each NFPSU has dimensions of 3 × 2.5 × 0.3 cm³ (comparable to a coin) and can be comfortably worn on the wrist.
Figure 2. Operation and characterization of the nanofiber-based pressure-sensor unit (NFPSU): a) Working principle of the NFPSU. b) Simulated optical-transmission distribution of 630-nm-wavelength light guided along an 800-nm-diameter optical nanofiber with a bending radius of 20 μm. c) SEM image of the optical nanofiber. d) Micrograph of a U-shaped optical nanofiber guiding 633-nm light. e) NFPSU's wavelength-dependent transmittance response to pressure in the range 0-31.8 kPa. Inset: transmittance response at a wavelength of 630 nm. f) Stability test over 6000 repeated loading cycles. g) Response to a pressure signal with a frequency of 800 Hz.
Figure 2a illustrates the working principle of the proposed NFPSU. When skin deformations related to finger movements occur, the pressure applied to the bottom of the attached soft liquid sac changes. According to Pascal's principle, pressure applied at any point of an incompressible liquid is transmitted to every other point of the liquid in real time.[35] Therefore, the pressure applied to the bottom of the soft-liquid-sac base is transmitted to the top surface, deforming the filmy optical-nanofiber sensor in contact with it. In this way, the external mechanical signals related to finger movements, captured at any point of the soft-liquid-sac base, are transmitted to the optical-nanofiber sensor with high fidelity, eliminating the impact of position drift on the sensing signals. As shown in Figure 2b, when the optical nanofiber is slightly bent under pressure, the well-confined symmetric mode of an 800-nm-diameter nanofiber at the input port evolves into an asymmetric profile with clear optical leakage, making the nanofiber highly sensitive to mechanical stimuli. The model relating nanofiber deformation to optical intensity is provided in the Supporting Information; as shown in Figure S1, the bending radius decreases as the applied normal force increases. Moreover, based on our previous studies, an 800-nm-diameter nanofiber embedded in PDMS is insensitive to temperature changes.[36,37]
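To make the trend concrete, the following is a minimal numerical sketch of the bend-loss behavior described above, assuming a generic Marcuse-type exponential dependence of the attenuation on bend radius; the coefficients and the bent-arc length are hypothetical placeholders, not the model given in the Supporting Information.

```python
import numpy as np

# Minimal sketch (not the authors' model): transmittance of a bent nanofiber
# under a generic exponential bend-loss law.  As the applied force grows, the
# bend radius shrinks and the leakage-induced loss increases.
C1 = 5.0e4          # hypothetical loss-strength coefficient (1/m)
C2 = 2.0e5          # hypothetical radius-sensitivity coefficient (1/m)
ARC_LENGTH = 50e-6  # assumed length of the bent region (m)

def bend_transmittance(bend_radius_m: float) -> float:
    """Transmittance through the bent region for a given bend radius."""
    alpha = C1 * np.exp(-C2 * bend_radius_m)  # attenuation coefficient (1/m)
    return np.exp(-alpha * ARC_LENGTH)

for radius_um in (100, 50, 20, 10):
    t = bend_transmittance(radius_um * 1e-6)
    print(f"bend radius {radius_um:3d} um -> transmittance {t:.3f}")
```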
In this study, the optical nanofiber was fabricated by heating and stretching a standard silica single-mode fiber (SMF).[38] The as-fabricated optical nanofiber showed excellent flexibility, far exceeding that of standard silica or polymer optical fibers; for example, the bending radius could be made smaller than 10 µm (Figure 2c). Owing to its smooth surface and geometric uniformity (Figure 2c), the as-fabricated fiber offered a transmission greater than 99%[38] and a tensile strength higher than that of spider silk.[39]
Figure 2d shows a U-shaped nanofiber guiding 633-nm-wavelength laser light. The bright red light along the fiber indicates the presence of an evanescent field outside the optical nanofiber. Generally, as the fiber diameter decreases, the fractional power of the light guided outside the fiber increases approximately exponentially, and a stronger evanescent field yields higher sensitivity. However, a thinner optical nanofiber loses mechanical stability when it is manipulated into a U-shaped structure. In this work, an 800-nm-diameter optical nanofiber was chosen as a trade-off between high sensitivity and mechanical stability. For high compactness, the curved end of the U-shaped optical-nanofiber sensor was intentionally positioned slightly over the edge of the liquid sac to ensure that deformation occurs in the sensitive region of the nanofiber.
To investigate the sensor's pressure response, we used a mechanical testing system (Figure S2). Figure 2e shows the wavelength-dependent transmittance response to applied static pressure in the range 0-31.8 kPa. With increasing wavelength, the transmittance of the nanofiber decreases and the sensitivity increases as a result of the increasing fractional power of the evanescent field.[19] Defining the sensitivity as S = ΔT/ΔP, where ΔT is the change in transmittance and ΔP is the change in pressure, the sensor shows a lower sensitivity of -0.01 kPa⁻¹ when the applied pressure is below 6.4 kPa and a higher sensitivity of -0.03 kPa⁻¹ in the high-pressure range (6.4-31.8 kPa) (inset of Figure 2e).
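As a concrete illustration of this definition, the short sketch below fits the two pressure regimes with straight lines and reports the slopes as sensitivities; the (pressure, transmittance) arrays are illustrative placeholders, not the measured data of Figure 2e.

```python
import numpy as np

# Sensitivity S = dT/dP evaluated as the slope of a linear fit in each regime.
# The arrays below are illustrative placeholders, not the data of Figure 2e.
pressure_kpa  = np.array([0, 2, 4, 6.4, 10, 15, 20, 25, 31.8])
transmittance = np.array([1.00, 0.98, 0.96, 0.94, 0.83, 0.68, 0.53, 0.38, 0.18])

low  = pressure_kpa <= 6.4            # low-pressure regime  (0-6.4 kPa)
high = pressure_kpa >= 6.4            # high-pressure regime (6.4-31.8 kPa)

s_low,  _ = np.polyfit(pressure_kpa[low],  transmittance[low],  1)
s_high, _ = np.polyfit(pressure_kpa[high], transmittance[high], 1)

print(f"S (0-6.4 kPa)    = {s_low:+.3f} per kPa")   # about -0.01 kPa^-1 in the paper
print(f"S (6.4-31.8 kPa) = {s_high:+.3f} per kPa")  # about -0.03 kPa^-1 in the paper
```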
The NFPSU demonstrated high durability under a pressure of 31.8 kPa at a frequency of 0.55 Hz (Figure 2f): after 6000 cycles, the sensor performance showed little change. In addition, a sinusoidal mechanical signal with a high frequency of 800 Hz was applied to evaluate the temporal frequency response of the NFPSU. The stable performance shown in Figure 2g demonstrates the sensor's ability to capture mechanical signals related to finger movements in real time.
In addition, Figure S3 shows the temperature response of an NFPSU in the range of 30-60 °C; the spectrum is essentially independent of temperature. Moreover, skin temperature is relatively stable, which further reduces the impact of temperature on the sensor.
 
Figure 3. Arterial-pulse measurement with nanofiber-based pressure-sensor unit (NFPSU): a) Photograph of NFPSU on tester’s wrist. b, c) Measured arterial-pulse waveforms of a male and a female tester before and after exercise. d) Schematic of nine-point grid with an area of 1×1 cm2 on a tester’s wrist. e) Measured arterial-pulse waveform at different locations (red dots) in the grid. f) Typical pulse waveform showing P, T, and D waves.
To characterize its response to tiny mechanical stimuli, the NFPSU was used to monitor wrist pulses in real time (Figure 3a). Owing to its high sensitivity, it readily captured wrist-pulse waveforms with high resolution; the two distinguishable peaks and the late systolic augmentation shoulder agree very well with the expected shape of a noninvasive radial-artery pressure wave.[40] The trace clearly shows that the pulse rate before exercise was ~70 beats/min and the pulse shape was regular and repeatable. After exercise, the pulse rate increased to ~90 beats/min, and the shape and intensity became irregular (Figure 3b,c).
To prove that the NFPSU response is position independent, we drew a nine-point grid with an area of 1 × 1 cm² on the wrist skin, centered on the wrist artery (Figure 3d). The NFPSU was positioned point by point on the grid to obtain pulse signals. Figure 3e presents the arterial-pulse signals recorded by the NFPSU at the various positions in the grid (red dots in the figure). Because the whole experiment lasted more than one hour, the heart rate fluctuated slightly in the range of 60-70 beats/min. Each of the nine pulse waveforms in Figure 3e resembles a typical pulse waveform (Figure 3f) consisting of a percussion wave (P-wave), a tidal wave (T-wave), and a diastolic wave (D-wave). Thus, we believe that high-fidelity pulse signals can be obtained anywhere within a maximum distance of √(5² + 5²) ≈ 7.07 mm from the wrist artery. Such position-independent sensing is crucial when the GRW is worn for a long time.
Figure 4. Processing procedure for data collected in gesture-recognition experiment. (ML: machine learning; SVM: support-vector machine)
For gesture recognition, the proposed GRW with three NFPSUs was placed on the tester's wrist to obtain mechanical signals related to hand gestures. Data-processing methods were employed to extract the characteristics of each gesture signal, as shown in Figure 4. Light was launched into the three sensors and collected using a CMOS camera. The time-varying output data, in the form of CMOS images (1280 × 720 pixels), were then transferred to and processed by a computer. By extracting the change in the gray level of the CMOS images over time, we obtained the time-domain output light intensities of all three sensor channels as they varied with different gestures. (The time-domain signal was extracted automatically using a change-point-detection algorithm or a threshold-setting method.) To avoid the influence of ambient light, the CMOS camera was carefully packaged and shaded, and an algorithm was used to remove the background noise. Subsequently, data consolidation was achieved by an end-to-end merge of the time-domain data obtained from the three sensors. Each consolidated trace was then labeled with the corresponding gesture and collected into a dataset of integrated time-domain signals containing all gestures and their gesture labels. Because some degree of random motion is inevitable when a person wears the GRW, a machine-learning algorithm based on support-vector classification (SVC) was introduced to relearn the tester's gestures every time the wearing condition changed. In this study, an SVM classification model was trained using the consolidated dataset and subsequently used as a classifier to detect gestures. Once real-time gesture-related data were collected, the trained SVC model was used to predict the gesture from the real-time data and return the result.
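The sketch below illustrates this chain of steps under assumed details: each frame is treated as a 720 × 1280 gray-level array, each sensor channel is read out as the mean gray level of a fixed region of interest (ROI), the three traces are merged end-to-end into one feature vector, and a scikit-learn SVC is trained on the labeled vectors. The ROI coordinates, window length, and SVC settings are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical ROI (row slice, column slice) for each NFPSU output spot.
ROIS = {1: (np.s_[100:150], np.s_[200:250]),
        2: (np.s_[100:150], np.s_[600:650]),
        3: (np.s_[100:150], np.s_[1000:1050])}

def channel_traces(frames):
    """frames: (n_frames, 720, 1280) gray-level stack -> concatenated traces."""
    traces = []
    for rows, cols in ROIS.values():
        trace = frames[:, rows, cols].mean(axis=(1, 2))  # intensity vs. time
        trace -= trace[0]                                # crude background removal
        traces.append(trace)
    return np.concatenate(traces)                        # end-to-end merge of 3 channels

def build_dataset(recordings, labels):
    """recordings: list of equal-length frame stacks; labels: gesture per stack."""
    X = np.stack([channel_traces(f) for f in recordings])
    return X, np.asarray(labels)

# Training and real-time prediction (kernel and C are assumptions):
# X, y = build_dataset(train_recordings, train_labels)
# clf = SVC(kernel="rbf", C=1.0).fit(X, y)
# gesture = clf.predict(channel_traces(new_frames)[None, :])[0]
```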
Figure 5a shows the mechanical signals obtained by our GRW for twelve fundamental gestures. The effects of the different gestures on the output intensity are readily visible: the cross section of the wrist was altered by the gesture-related movement of even a single tendon, and the wearing conditions of the GRW changed accordingly. Even for similar gestures (e.g., Gestures 1 and 2), notable differences were observed in the corresponding time-varying output of NFPSU 2, which can be attributed to the high sensitivity of the NFPSUs. Although the machine-learning algorithm effectively addresses the inevitable problem of random wristband motion, disturbances in the GRW sensor output caused by sliding between the sensors and the skin surface may still occur during long-term wear, reducing the recognition accuracy. However, owing to the position-independent response of the NFPSU, the effect of such sliding on the results is insignificant. Consequently, a stable output of the NFPSUs during long-term wear is achieved even with very few sensors.
Figure 5. Results of the gesture-recognition experiment: a) Gestures with corresponding time-domain signals measured by the three nanofiber-based pressure-sensor units (NFPSUs). b) Hands of four testers with different physiques. c-f) Classifiers obtained from different testers.
To characterize the adaptability and gesture-recognition accuracy of the GRW, four testers with different physiques were recruited (Figure 5b). Each tester performed each gesture ten times to update the corresponding database. The corresponding classifiers were obtained by training new SVM classification models with the updated databases and gesture labels. Figure 5c shows the classifier obtained from Tester 1: twelve gestures were successfully recognized with an accuracy of 93.2%, which is comparable to or slightly higher than that reported for GRWs with more than five electrical sensors.[16-18] To demonstrate the adaptability of the wristband while saving the testers' time, only five representative gestures were included for the other testers; the resulting classifiers are shown in Figure 5d-f. The slight fluctuations in recognition accuracy may be attributed to the different physiques. Specifically, the subcutaneous fat of the chubby tester (Tester 2) reduced the degree of finger-movement-related deformation, which slightly decreased the recognition accuracy. Nevertheless, the excellent adaptability of the proposed GRW is evident from the high recognition accuracy (92%-94%) for all the testers, regardless of physique. In addition, calibration is required each time the wristband is put on; it is performed by repeating each gesture ten times and takes about ten minutes.
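A minimal sketch of how such a per-tester classifier could be scored is given below, assuming a feature matrix X and label vector y built as in the earlier sketch; the cross-validation split, kernel, and C are assumptions, while the 92%-94% accuracies are the values reported in the paper.

```python
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

def evaluate_classifier(X, y, folds=5):
    """Cross-validated accuracy and gesture-vs-gesture confusion matrix."""
    clf = SVC(kernel="rbf", C=1.0)                # assumed settings
    y_pred = cross_val_predict(clf, X, y, cv=folds)
    print(f"accuracy: {accuracy_score(y, y_pred):.1%}")
    print(confusion_matrix(y, y_pred))            # rows: true gesture, cols: predicted
```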
Figure 6. Remote control of robotic hand via proposed gesture-recognition wristband (GRW): a) Images of a robotic hand performing gestures based on the results obtained from a tester wearing the GRW. b) Images of a tester playing rock-paper-scissors with a robotic hand.
Robotic hands are widely used in modern industry, serving as an efficient means of improving productivity and working conditions. Humanoid robotic hands, which can perform more complicated tasks involving various gestures, have wide application prospects, e.g., remote surgical operation, sign-language translation, and virtual/augmented-reality interactions. To further demonstrate the use of the proposed GRW as an HMI device, a robotic hand was used to perform specific movements based on the gesture-recognition results, as shown in Figure 6a. A schematic of the entire experiment is shown in Figure S4 in the Supporting Information. All four testers, wearing GRWs, controlled the robotic hand through gestures almost in real time (Videos S1-S4). Specifically, there is a time delay of ~1 s between the tester's action and the response of the robotic hand in the videos. We attribute this delay mainly to the robotic hand, which is driven by servo motors, because the response times of the NFPSU and the CMOS-based signal-collection system are only 12 ms and 150 ms, respectively. Additionally, each tester played rock-paper-scissors with the robotic hand (Video S5). Since the robotic hand recognized the tester's gesture before executing its own move, it always won, as shown in Figure 6b.
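For illustration, the following is a minimal sketch of such a control loop; the serial port, baud rate, and command bytes sent to the robotic hand are entirely hypothetical, since the uHand 2.0/STM32 protocol is not described here, and the feature-extraction function (e.g., channel_traces from the earlier sketch) is passed in by the caller.

```python
import serial  # pyserial

# Hypothetical gesture-to-command mapping; the real firmware protocol differs.
GESTURE_TO_COMMAND = {"rock": b"G01\n", "paper": b"G02\n", "scissors": b"G03\n"}

def control_loop(classifier, read_frames, extract_features, port="/dev/ttyUSB0"):
    """Predict the wearer's gesture from one acquisition window and forward it
    to the robotic hand over a serial link (port and baud rate are assumed)."""
    hand = serial.Serial(port, baudrate=115200, timeout=1)
    while True:
        frames = read_frames()                    # one gesture window of frames
        features = extract_features(frames)       # e.g., channel_traces(frames)
        gesture = classifier.predict(features[None, :])[0]
        hand.write(GESTURE_TO_COMMAND[gesture])   # robotic hand executes the move
```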
3. Conclusion
In this study, we proposed and demonstrated an optical-nanofiber-enabled GRW for HMI with the assistance of machine learning. To overcome the position-dependent response of a filmy optical-nanofiber sensor, we used a soft liquid sac to transfer pressure stimuli to the sensor. The pressure sensor exhibited position independence within a distance of 7.07 mm, together with a good linear pressure response in the range 6.4-31.8 kPa. With the assistance of the SVM machine-learning algorithm, the three-sensor GRW successfully recognized twelve hand gestures with a maximum accuracy of 94%. Moreover, the GRW was able to control and play games with a robotic hand, demonstrating its significant potential for use as an immersive HMI terminal. To recognize more gestures with higher accuracy, one can include more NFPSUs in the GRW or optimize the SVM model. With the assistance of micro/nanofiber angle sensors,[30,31] it should also be possible for the GRW to recognize wrist-related gestures. The proposed GRW thus offers a promising solution for application scenarios in virtual/augmented reality and the metaverse that require gesture recognition, e.g., remote control of machines or translation of sign language.
4. Experimental Methods
Materials: PDMS (Sylgard 184 silicone elastomer) was purchased from Dow Corning. The optical nanofiber was fabricated from a standard silica SMF (G.652, cladding diameter: 125 μm, core diameter: 9 μm; Corning Inc.).
Fabrication: First, the nanofiber was fabricated by heating and stretching the SMF. The uniform waist region had a diameter of 800 nm and a length of approximately 1.5 cm. By accurately measuring the time intervals between drops in the transmitted laser intensity during the fiber-pulling process, we could precisely determine when to stop heating and pulling for a target diameter (a minimal sketch of this drop-interval timing is given after this paragraph). Both the accuracy and the standard deviation of the diameter can be kept below 5 nm for target micro/nanofiber diameters ranging from 800 to 1300 nm.[41] The stretched nanofiber was connected at both ends to the unstretched SMF through conical tapered transition regions. Subsequently, the PDMS monomer and curing agent were mixed in a ratio of 10:1. After degassing for 30 minutes, the uncured PDMS was cast on a glass slide and spin-coated (500 rpm, 5 minutes). The glass slide coated with PDMS was heated at 80 °C for 30 minutes to form a PDMS membrane with a thickness of 200 μm; the thickness can be readily adjusted by controlling the spin-coating speed.
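The sketch below illustrates only the timing idea referenced above for diameter control during pulling: detecting successive intensity drops in a recorded transmission trace and reporting the intervals between them. The sampling rate and drop threshold are assumptions, and the mapping from interval to waist diameter follows the calibration of ref. [41], which is not reproduced here.

```python
import numpy as np

SAMPLE_RATE_HZ = 1000     # assumed sampling rate of the transmission monitor
DROP_THRESHOLD = 0.02     # assumed minimum fractional drop per sample

def drop_intervals(intensity):
    """Return the time intervals (s) between successive transmission drops."""
    frac_change = np.diff(intensity) / intensity[:-1]
    drop_idx = np.flatnonzero(frac_change < -DROP_THRESHOLD)
    return np.diff(drop_idx) / SAMPLE_RATE_HZ

# Pulling is stopped once the measured interval reaches the value expected for
# the target diameter (e.g., 800 nm), per the calibration in ref. [41].
```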
The optical nanofiber was then formed into a U-shape and embedded between two PDMS membranes, realizing a membrane-nanofiber-membrane sandwich structure, i.e., the filmy optical-nanofiber sensor. To form the soft liquid sac, we cast the degassed PDMS precursor into a custom-made mold. The region of the sac contacting the skin surface, referred to as the sac base, was designed with a thickness of ~200 μm. After curing, the sac was demolded and filled with glycerol, chosen for its non-toxicity and chemical stability. The liquid-filled sac was then sealed using the filmy optical-nanofiber sensor. Finally, the sealed optical-nanofiber sensor with the soft liquid sac was embedded and fixed in a rigid 3D-printed resin shell using ultraviolet-curable glue, completing the assembly of the NFPSU. The detailed NFPSU fabrication process is shown in Video S6.
Characterization and measurement: A motorized force tester (ESM 303, Mark-10 Inc.) was used to control the applied pressure and maintain the operating cycle. A tungsten light source (SLS201L/M, Thorlabs Inc.) and a spectrometer (USB2000+, Ocean Optics) were used to inject and collect the light signals for characterization of the NFPSU. In the gesture-recognition experiments, an LED (GCI-0604, Daheng Optics) was used as the light source, and a CMOS camera (OV5640, OmniVision Technologies) was used to collect the output signals. A robotic hand (uHand 2.0, Hiwonder Inc.) controlled by an STM32 microcontroller was used to perform the recognized gestures.
Experiments involving human subjects were performed with the full, informed consent of the volunteers, who are also the authors of the manuscript.
Supporting Information
Supporting information is available from the Wiley Online Library or from the corresponding author.
Acknowledgements
This study was supported by the National Natural Science Foundation of China (61975173, 62105299, 92148205), the Major Scientific Research Project of Zhejiang Lab (No. 2019MC0AD01), and the Key Research and Development Project of Zhejiang Province (No. 2021C05003, 2022C03103).
References
[1]    S. H. Ko, J. Rogers, Functional Materials and Devices for XR (VR/AR/MR) Applications, Adv. Funct. Mater. 2021, 31, 2106546.
[2]    M. Wang, T. Wang, Y. Luo, K. He, L. Pan, Z. Li, Z. Cui, Z. Liu, J. Tu, X. Chen, Fusing Stretchable Sensing Technology with Machine Learning for Human-Machine Interfaces, Adv. Funct. Mater. 2021, 31, 2008807.
[3]    G. Gao, F. Yang, F. Zhou, J. He, W. Lu, P. Xiao, H. Yan, C. Pan, T. Chen, Z. L. Wang, Bioinspired Self-Healing Human-Machine Interactive Touch Pad with Pressure-Sensitive Adhesiveness on Targeted Substrates, Adv. Mater. 2020, 32, 2004290.
[4]    Z. Sun, M. Zhu, X. Shan, C. Lee, Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions, Nat. Commun. 2022, 13, 5224.
[5]    J. Yang, S. Liu, Y. Meng, W. Xu, S. Liu, L. Jia, G. Chen, Y. Qin, M. Han, X. Li, Self-Powered Tactile Sensor for Gesture Recognition Using Deep Learning Algorithms, ACS Appl. Mater. Interfaces 2022, 14, 25629.
[6]    S. S. Rautaray, A. Agrawal, Vision based hand gesture recognition for human computer interaction: a survey, Artif. Intell. Rev. 2015, 43, 1.
[7]    P. K. Pisharady, P. Vadakkepat, A. P. Loh, Attention Based Detection and Recognition of Hand Postures Against Complex Backgrounds, Int. J. Comput. Vis. 2012, 101, 403.
[8]    A. Moin, A. Zhou, A. Rahimi, A. Menon, S. Benatti, G. Alexandrov, S. Tamakloe, J. Ting, N. Yamamoto, Y. Khan, F. Burghardt, L. Benini, A. C. Arias, J. M. Rabaey, A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition, Nat. Electron. 2020, 4, 54.
[9]    W. Geng, Y. Du, W. Jin, W. Wei, Y. Hu, J. Li, Gesture recognition by instantaneous surface EMG images, Sci. Rep. 2016, 6, 36571.
[10] Z. Zhou, K. Chen, X. Li, S. Zhang, Y. Wu, Y. Zhou, K. Meng, C. Sun, Q. He, W. Fan, E. Fan, Z. Lin, X. Tan, W. Deng, J. Yang, J. Chen, Sign-to-speech translation using machine-learning-assisted stretchable sensor arrays, Nat. Electron. 2020, 3, 571.
[11] F. Wen, Z. Zhang, T. He, C. Lee, AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove, Nat. Commun. 2021, 12, 5378.
[12] M. Zhu, Z. Sun, Z. Zhang, Q. Shi, T. He, H. Liu, T. Chen, C. Lee, Haptic-feedback smart glove as a creative human-machine interface (HMI) for virtual/augmented reality applications, Sci. Adv. 2020, 6, eaaz8693.
[13] S. Shin, H. Yoon, B. Yoo, Hand Gesture Recognition Using EGaIn-Silicone Soft Sensors, Sensors 2021, 21, 3204.
[14] J. Nassour, H. Amirabadi, S. Weheabby, A. Ali, H. Lang, F. Hamker, A robust data-driven soft sensory glove for human hand motions identification and replication, IEEE Sens. J. 2020, 20, 12972.
[15] M. Wang, Z. Yan, T. Wang, P. Cai, S. Gao, Y. Zeng, C. Wan, H. Wang, L. Pan, J. Yu, S. Pan, K. He, J. Lu, X. Chen, Gesture recognition using a bioinspired learning architecture that integrates visual data with somatosensory data from stretchable sensors, Nat. Electron. 2020, 3, 563.
[16] P. Tan, X. Han, Y. Zou, X. Qu, J. Xue, T. Li, Y. Wang, R. Luo, X. Cui, Y. Xi, L. Wu, B. Xue, D. Luo, Y. Fan, X. Chen, Z. Li, Z. L. Wang, Self-powered gesture recognition wristband enabled by machine learning for full keyboard and multi-command input, Adv. Mater. 2022, e2200793.
[17] P. B. Shull, S. Jiang, Y. Zhu, X. Zhu, Hand Gesture Recognition and Finger Angle Estimation via Wrist-Worn Modified Barometric Pressure Sensing, IEEE Trans. Neural Syst. Rehabilitation Eng. 2019, 27, 724.
[18] X. Liang, R. Ghannam, H. Heidari, Wrist-Worn Gesture Sensing With Wearable Intelligence, IEEE Sens. J. 2019, 19, 1082.
[19] L. Zhang, J. Pan, Z. Zhang, H. Wu, N. Yao, D. Cai, Y. Xu, J. Zhang, G. Sun, L. Wang, W. Geng, W. Jin, W. Fang, D. Di, L. Tong, Ultrasensitive skin-like wearable optical sensors based on glass micro/nanofibers, Opto-Electron. Adv. 2020, 3, 19002201.
[20] S. Wang, X. Ni, L. Li, J. Wang, Q. Liu, Z. Yan, L. Zhang, Q. Sun, Noninvasive Monitoring of Vital Signs Based on Highly Sensitive Fiber Optic Mattress, IEEE Sens. J. 2020, 20, 6182.
[21] A. Leber, B. Cholst, J. Sandt, N. Vogel, M. Kolle, Stretchable Thermoplastic Elastomer Optical Fibers for Sensing of Extreme Deformations, Adv. Funct. Mater. 2018, 29, 1802629.
[22] S. Pant, S. Umesh, S. Asokan, A Novel Approach to Acquire the Arterial Pulse by Finger Plethysmography Using Fiber Bragg Grating Sensor, IEEE Sens. J. 2020, 20, 5921.
[23] J. Guo, M. Niu, C. Yang, Highly flexible and stretchable optical strain sensing for human motion detection, Optica 2017, 4, 1285.
[24] H. Bai, S. Li, J. Barreiros, Y. Tu, C. R. Pollock, R. F. Shepherd, Stretchable distributed fiber-optic sensors, Science 2020, 370, 848.
[25] S. Ma, X. Wang, P. Li, N. Yao, J. Xiao, H. Liu, Z. Zhang, L. Yu, G. Tao, X. Li, L. Tong, L. Zhang, Optical Micro/Nano Fibers Enabled Smart Textiles for Human-Machine Interface, Adv. Fiber Mater. 2022, 4, 1108.
[26] L. Zhang, Y. Tang, L. Tong, Micro-/Nanofiber Optics: Merging Photonics and Material Science on Nanoscale for Advanced Sensing Technology, iScience 2020, 23, 100810.    
[27] J.-h. Li, J.-h. Chen, F. Xu, Sensitive and Wearable Optical Microfiber Sensor for Human Health Monitoring, Adv. Mater. Technol. 2018, 3, 1800296.
[28] L. Y. Li, Y. F. Liu, C. Y. Song, S. F. Sheng, L. Y. Yang, Z. J. Yan, D. J. J. Hu, Q. Z. Sun, Wearable Alignment-Free Microfiber-Based Sensor Chip for Precise Vital Signs Monitoring and Cardiovascular Assessment, Adv. Fiber Mater. 2022, 4, 475.
[29] W. Yu, N. Yao, J. Pan, W. Fang, X. Li, L. Tong, L. Zhang, Highly sensitive and fast response strain sensor based on evanescently coupled micro/nanofibers, Opto-Electron. Adv. 2022, 5, 210101.
[30] J. Pan, Z. Zhang, C. Jiang, L. Zhang, L. Tong, A multifunctional skin-like wearable optical sensor based on an optical micro-/nanofibre, Nanoscale 2020, 12, 17538.
[31] Z. Zhang, Y. Kang, N. Yao, J. Pan, W. Yu, Y. Tang, Y. Xu, L. Wang, L. Zhang, L. Tong, A Multifunctional Airflow Sensor Enabled by Optical Micro/nanofiber, Adv. Fiber Mater. 2021, 3, 359.
[32] Y. Li, S. Tan, L. Yang, L. Li, F. Fang, Q. Sun, Optical Microfiber Neuron for Finger Motion Perception, Adv. Fiber Mater. 2022, 4, 226.
[33] L. Zhao, B. Wu, Y. Niu, S. Zhu, Y. Chen, H. Chen, J.-h. Chen, Soft Optoelectronic Sensors with Deep Learning for Gesture Recognition, Adv. Mater. Technol. 2022, 7, 2101698.
[34] M. H. Syu, Y. J. Guan, W. C. Lo, Y. K. Fuh, Biomimetic and porous nanofiber-based hybrid sensor for multifunctional pressure sensing and human gesture identification via deep learning method, Nano Energy 2020, 76, 105029.
[35] X. Fan, Y. Huang, X. Ding, N. Luo, C. Li, N. Zhao, S.-C. Chen, Alignment-Free Liquid-Capsule Pressure Sensor for Cardiovascular Monitoring, Adv. Funct. Mater. 2018, 28, 1805045.
[36] N. Yao, X. Wang, S. Ma, X. Song, S. Wang, Z. Shi, J. Pan, S. Wang, J. Xiao, H. Liu, L. Yu, Y. Tang, Z. Zhang, X. Li, W. Fang, L. Zhang, L. Tong, Single optical microfiber enabled tactile sensor for simultaneous temperature and pressure measurement, Photonics Res. 2022, 10, 2040.
[37] Y. Tang, L. Yu, J. Pan, N. Yao, W. Geng, X. Li, L. Tong, L. Zhang, Z. Zhang, A. Song, Optical nanofiber skins for multifunctional humanoid tactility, Adv. Intell. Syst. 2023, 2200203.
[38] N. Yao, S. Linghu, Y. Xu, R. Zhu, N. Zhou, F. Gu, L. Zhang, W. Fang, W. Ding, L. Tong, Ultra-Long Subwavelength Micro/Nanofibers With Low Loss, IEEE Photon. Technol. Lett. 2020, 32, 1069.
[39] G. Brambilla, D. N. Payne, The ultimate strength of glass silica nanowires, Nano Lett. 2009, 9, 831.
[40] W. Nichols, Clinical measurement of arterial stiffness obtained from noninvasive pressure waveforms, Am. J. Hypertens. 2005, 18, 3S.
[41] Y. Xu, W. Fang, L. Tong, Real-time control of micro/nanofiber waist diameter with ultrahigh accuracy and precision, Opt. Express 2017, 25, 10434.