Tactile Optimization and User-Centric Haptics Laboratory (TOUCH Lab)
Research Field
Dr. Aaron Raymond See is a multidisciplinary researcher specializing in haptics, assistive device development, and visuotactile sensing technologies. He was born in Manila, Philippines, and received his B.S. degree in Electronics and Communications Engineering from De La Salle University, Manila in 2006. He earned his M.S. and Ph.D. degrees in Electrical Engineering, with a focus on Biomedical Engineering, from Southern Taiwan University of Science and Technology in 2010 and 2014, respectively.
Following his doctoral studies, Dr. See pursued postdoctoral research in neuroscience at the Brain Research Center, National Tsing Hua University, where he focused on sensorimotor integration and neural feedback systems. He currently serves as an Associate Professor at National Chin-Yi University of Technology (NCUT), where he leads the TOUCH Lab—a research group advancing human-machine interaction through haptic technologies and sensory augmentation.
His research interests include:
Development of wearable haptic devices and visuotactile feedback systems
Integration of biomedical sensors for assistive and rehabilitative applications
AI-driven solutions for medical training, vocational learning, and industrial automation
NCUT Tactile Optimization and User-Centric Haptics (TOUCH) Lab
The TOUCH Lab at National Chin-Yi University of Technology is dedicated to advancing research and innovation in haptic technologies, visuotactile sensing, and human-machine interaction. Our interdisciplinary team develops impactful solutions for manufacturing automation, industrial and medical training, and assistive technologies, with a strong focus on improving quality of life in underserved communities.
What We Do
We specialize in:
Haptics and Visuotactile Sensors: Creating integrated systems that enhance spatial perception and interaction for visually impaired users, as well as in medical and industrial training environments.
Automation and Intelligent Systems: Designing AI-powered control systems and optimizing automation processes for smart industry and smart healthcare applications.
Assistive Device Development: Engineering low-cost, high-impact assistive tools tailored for use in developing countries, with the aim of supporting education, rehabilitation, and independent living.
Current Projects
Our ongoing research projects involve:
Development of wearable visuotactile sensors for real-time feedback in rehabilitation.
Creation of haptic-enhanced interfaces for immersive medical training simulations and precision tasks in manufacturing.
Advanced human-machine interfaces that integrate machine learning, biomedical signal processing, and sensor fusion.
Exploration of biomedical image analysis, electronic circuit design, materials engineering, and soft robotics for novel applications in rehabilitation and smart prosthetics.
Join Us
We are actively seeking motivated students who are passionate about applying engineering for social good. Whether you're interested in sensor development, machine learning, or hands-on prototyping, the TOUCH Lab provides a collaborative environment where engineering innovation meets meaningful societal impact.
My research lies at the forefront of haptic technology and visuotactile sensory systems, with a strong emphasis on developing intelligent assistive and rehabilitative devices. I explore how tactile and visual information can be integrated to improve human-machine interaction, enhance sensory perception, and support neuroplasticity in individuals with somatosensory deficits or visual impairments.
Key Research Areas:
Wearable Haptics & Tactile Interfaces: Development of wearable devices that simulate or enhance tactile feedback for rehabilitation, navigation, and learning applications—particularly for individuals with visual or motor disabilities.
Visuotactile Sensor Fusion: Integration of visual sensing (e.g., 3D cameras, depth sensors) with tactile sensing arrays to create systems that support spatial awareness, object recognition, and environmental mapping through multimodal feedback.
Neuroadaptive Feedback Systems: Use of real-time signal processing and machine learning to deliver personalized, adaptive tactile stimuli for rehabilitation and cognitive enhancement.
Embedded Biomedical Sensing Systems: Design of compact and low-power biosensing modules capable of measuring physiological parameters (e.g., EEG, EMG, GSR) to inform feedback systems or monitor patient health.
Sleep Science & Somatosensory Modulation: Investigation of the role of vibrotactile stimulation in improving sleep quality, circadian regulation, and stress resilience, using non-invasive intervention platforms.
My research team adopts a multidisciplinary and application-driven approach, combining elements from biomedical engineering, human-computer interaction, sensor design, and AI-based signal analysis. Our goal is to develop impactful technologies that bridge engineering innovation and clinical relevance, while cultivating a hands-on, exploratory learning culture for students. We welcome students passionate about tactile computing, assistive technologies, and sensor integration, and seek collaborators across neuroscience, rehabilitation, and robotics.
Dr. See's team has earned numerous engineering and startup awards, including:
- 2025 Best AI Awards, AI Application International Group – NT$ 1,000,000
- 2025 iENA Nuremberg, Germany – Silver Medal
- Taiwan Innotech Expo (2021 & 2025), Taipei, Taiwan – Silver Medal
- 2025 Global Student Innovation Challenge - Rehabilitation Engineering and Assistive Technology (gSIC-REAT) Bangkok, Thailand – Merit Award
- 2025 & 2024 Global Student Innovation Challenge – Rehabilitation Engineering and Assistive Technology, gSIC i-CREATe Taiwan Challenge – First Place (NT$ 20,000)
- 2022 U-Start Innovation and Entrepreneurship Outstanding Startup (NT$ 800,000)
- 2022 TIC 100 Social Innovation Competition – Social Impact Award (NT$ 100,000)
- 2022 FutureTech Award
- 2021 Macronix Golden Silicon Awards – Silver Award, Outstanding Creativity Award (NT$ 230,000)
- 2021 Macronix Golden Silicon Awards – Outstanding Advisor
- 2021 FITI Outstanding Entrepreneurship Award – AI Prosthetic Arm (NT$ 2,000,000)
- 2020 Chinese Vocational and Industrial Education Association Golden Excellence Administration Award
- 2024 & 2019 Corning Future Innovator “Teacher Contribution Award”
- 2019 Malaysia Novel Research and Innovation Competition – Social Impact Award (RM 2,000)
- 2018-2022 Excellent Teacher Award at Southern Taiwan University of Science and Technology
Dr. Aaron Raymond See, originally from Manila, Philippines, earned his B.S. in Electronics and Communications Engineering from De La Salle University, Manila, in 2006. He pursued further studies at Southern Taiwan University of Science and Technology, obtaining his Master's and Ph.D. in Electrical Engineering, specializing in Biomedical Engineering, in 2010 and 2014, respectively. He then engaged in postdoctoral research in Neuroscience at the Brain Research Center, National Tsing Hua University, in Hsinchu, Taiwan. Dr. See also enhanced his educational expertise with specialized training in Designing Student-Centered Learning at Olin College of Engineering in the USA. His international experience was further enriched as a visiting scholar at the Warsaw School of Economics in Poland in July 2019 and at HTWG Konstanz University of Applied Sciences in Germany from June to July 2023.
Job Description
Mechanoluminescent Sensor Development
Assist in the fabrication and characterization of mechanoluminescent (ML) tactile sensors
Prepare ML materials and multilayer sensor structures (e.g., elastomers, optical layers)
Study the relationship between mechanical stimulation and optical emission patterns
Optical Measurement & Data Collection
Set up camera-based optical systems to capture ML responses under force, pressure, or deformation
Conduct calibration, repeatability, and sensitivity testing
Support experiments for wearable or rehabilitation-oriented sensor configurations
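The calibration task above can be sketched with a toy example: a least-squares fit of emission intensity against applied force. All numbers below are made up for illustration, not measured data, and real analysis would likely use NumPy or `statistics.linear_regression`.

```python
def fit_linear(x, y):
    """Ordinary least-squares fit of y = slope * x + offset."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    offset = mean_y - slope * mean_x
    return slope, offset

# Hypothetical calibration points: applied force (N) vs. mean emission intensity
forces = [0.5, 1.0, 1.5, 2.0, 2.5]
intensities = [12.0, 22.0, 31.0, 42.0, 52.0]
slope, offset = fit_linear(forces, intensities)  # slope acts as sensitivity (counts/N)
```

The fitted slope is one simple sensitivity figure; repeating the fit across trials gives a basic repeatability check.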
Data Analysis & Signal Processing
Perform basic image processing and signal analysis on ML-generated visual data
Assist with multimodal data handling related to visuotactile sensor fusion
Contribute to exploratory analysis for machine learning–ready datasets (as appropriate)
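As a minimal illustration of the image analysis involved, the mean intensity over a region of interest in one grayscale frame can be computed as below. The frame and ROI are toy values; real pipelines would operate on NumPy arrays from camera captures.

```python
def roi_mean_intensity(frame, top, left, height, width):
    """Mean pixel intensity over a rectangular region of interest."""
    region = [row[left:left + width] for row in frame[top:top + height]]
    total = sum(sum(row) for row in region)
    return total / (height * width)

# Toy 4x4 grayscale frame standing in for one camera image of an ML emission
frame = [
    [0, 0, 0, 0],
    [0, 80, 90, 0],
    [0, 85, 95, 0],
    [0, 0, 0, 0],
]
mean_val = roi_mean_intensity(frame, 1, 1, 2, 2)  # central 2x2 hotspot
```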
Research Support & Documentation
Maintain detailed lab notebooks and experimental records
Assist in preparing figures, technical summaries, or draft sections for academic publications
Participate in lab meetings and interdisciplinary research discussions
Preferred Intern Educational Level
Undergraduate or graduate student in Biomedical Engineering, Electrical Engineering, Mechanical Engineering, Materials Science, Robotics, Physics, or related fields
Skill sets or Qualities
- Basic understanding of mechanics, optics, or materials science
- Familiarity with laboratory experimentation and prototyping
- Introductory programming skills in Python or MATLAB
- Willingness to learn image-based sensing and experimental data analysis
- Strong curiosity and intrinsic motivation for research
- Carefulness and patience in experimental work
- Ability to follow protocols while thinking critically about results
- Willingness to iterate, troubleshoot, and learn from failure
- Interest in applying engineering skills to real-world societal challenges
Job Description
Key Responsibilities
Image Processing & Data Preparation
- Process and analyze image sequences generated by visuotactile and mechanoluminescent sensors
- Implement image preprocessing, segmentation, and feature extraction pipelines
- Perform sensor calibration, normalization, and noise reduction
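One of the normalization steps above could be sketched, assuming grayscale frames, as a simple min-max rescale. This is an illustrative helper in plain Python; production code would use the NumPy or OpenCV equivalents.

```python
def minmax_normalize(frame):
    """Rescale a grayscale frame to the [0, 1] range (min-max normalization)."""
    flat = [v for row in frame for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1  # avoid division by zero on a flat frame
    return [[(v - lo) / span for v in row] for row in frame]

raw = [[10, 30], [50, 90]]
norm = minmax_normalize(raw)
```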
AI & Machine Learning
- Develop and evaluate machine learning or deep learning models for visuotactile data interpretation
- Explore methods for force estimation, contact localization, pattern recognition, or state classification
- Assist in developing multimodal learning pipelines combining visual, tactile, and biomedical signals
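A deliberately simple sketch of state classification from tactile features: a nearest-centroid classifier with invented class labels and centroids. Real work in the lab would use learned models (e.g., CNNs), but the idea of mapping a feature vector to a contact state is the same.

```python
def nearest_centroid(sample, centroids):
    """Assign a feature vector to the class whose centroid is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Hypothetical 2-D features (e.g., mean intensity, emission area) per contact state
centroids = {
    "no_contact": [0.1, 0.1],
    "light_touch": [0.5, 0.3],
    "firm_press": [0.9, 0.8],
}
label = nearest_centroid([0.85, 0.75], centroids)
```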
Signal Processing & System Integration
- Apply time-series and spatial signal processing techniques to sensor data
- Support integration of AI models with embedded or real-time systems (as appropriate)
- Collaborate with haptics and sensor teams to ensure algorithm–hardware compatibility
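A basic time-series smoothing step of the kind mentioned above, sketched as an exponential moving average over a toy sensor stream; real pipelines might instead use SciPy filters or learned denoisers.

```python
def ema(signal, alpha=0.3):
    """Exponential moving average: a minimal causal smoother for noisy sensor streams."""
    smoothed = []
    prev = signal[0]
    for v in signal:
        prev = alpha * v + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

# Toy tactile-sensor stream containing a single noise spike
stream = [0.0, 0.1, 0.9, 0.1, 0.0]
smoothed = ema(stream, alpha=0.5)  # the spike at index 2 is attenuated
```

Because the filter is causal (it only looks at past samples), it is compatible with the real-time and embedded settings mentioned above.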
Research Support & Documentation
- Maintain reproducible code and annotated datasets
- Assist in preparing figures, experimental results, and drafts for academic publications
- Participate in lab meetings, paper discussions, and technical reviews
Preferred Intern Educational Level
Undergraduate or graduate student in Computer Science, Electrical Engineering, Biomedical Engineering, Robotics, AI, or related fields
Skill sets or Qualities
Technical Skills
- Proficiency in Python
- Familiarity with image processing and computer vision concepts
- Basic understanding of machine learning (e.g., regression, classification, CNNs)
- Experience with at least one of the following: OpenCV, NumPy, PyTorch, TensorFlow, or MATLAB
Preferred / Bonus Skills
- Experience with deep learning for vision or multimodal data
- Familiarity with ROS, real-time systems, or embedded AI
- Knowledge of signal processing, sensor fusion, or time-series analysis
- Interest in assistive technologies, rehabilitation engineering, or haptics