Virtual Glove Puppets
（1）Glove puppetry is an important traditional art in Taiwan. Recently, the high cost of traditional performances (professional stages, expensive puppets, and long preparation time) has made the art less competitive.
（2）We provide a method that lets users control virtual puppets through the computer. Users can choose different puppets, scenes, and music according to their own preferences through the system we constructed.
（3）The virtual glove puppetry system aims to let more users enjoy the pleasure of this traditional art at a lower cost.
（1）Leap Motion supports input through palm and finger movements without requiring hand contact or touch. It uses patented motion-sensing technology for human-computer interaction.
（2）The characters in Taiwanese glove puppetry are closely tied to religion and folk life. We analyze the principles of the puppet model's motion, combine them with the input control signals to swing the arms, and change the puppet's motion accordingly; the resulting images are then presented to users through OpenGL.
（3）Unity can be used to develop stand-alone games for Windows, macOS, and Linux, or for mobile devices running iOS and Android. We use Unity for stage construction and for presenting performances.
Blind Auxiliary System
（1）To bring guide robots to Taiwanese streets, we use a deep learning method in robot vision to train a model that identifies 14 common signboards in Taiwan, helping blind people instantly know which stores are nearby.
（2）The 14 categories of signboards are
（1）Our method uses a hierarchical structure: a well-known object detection network produces the first-stage output, and 14 binary classifiers, one per signboard category, perform a second confirmation.
（2）The object detection network is based on a convolutional neural network (CNN). Features are extracted by the earlier convolutional layers, and several additional convolutional layers predict the offset of each candidate box and the score of each category.
（3）Since the object detection network may confuse objects with similar characteristics, such as post office signs with other green signboards that resemble them, a binary classifier is trained for each category to decide whether a detection truly belongs to that category.
（4）Through these per-category classifiers, misjudged images can be filtered out, improving the overall accuracy.
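The two-stage filtering above can be sketched as follows. The `Detection` struct and the callback-based `filterDetections` helper are illustrative names of ours, not the actual implementation, which would run a trained CNN per category:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Hypothetical detection produced by the first-stage object detector.
struct Detection {
    std::string category;  // one of the 14 signboard classes
    float score;           // detector confidence
};

// Second stage: a per-category binary classifier decides whether each
// detection really belongs to its class. The classifier is passed in as
// a callback here; a real system would run a small CNN per class.
std::vector<Detection> filterDetections(
    const std::vector<Detection>& candidates,
    const std::function<bool(const Detection&)>& binaryVerifier) {
    std::vector<Detection> confirmed;
    for (const auto& d : candidates) {
        if (binaryVerifier(d)) confirmed.push_back(d);
    }
    return confirmed;
}
```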
（1）The Information and Communication Technology (ICT) industry refers to industries whose products (including goods and services) are mainly intended to process and disseminate information through electronic tools (including transmission and display).
（2）Knowledge and skills in the ICT industry evolve constantly. With the vast variety of jobs available, it is impractical to teach students the skills for every job requirement. This issue inspired us to develop the Learning Content Recommender (LCRec), which helps students find appropriate learning content based on required job skills.
（1）We carried out experiments among professionals, academics, and students to test the usefulness of LCRec, and evaluated the feedback.
（2）LCRec successfully used Knowledge Units from CS2013, Wikipedia, and essential skills from job-hunting websites to help entry-level job seekers find the learning content they need to study.
（3）It also lets academics see the skills industry needs and consider enhancing the curriculum with new skills.
Virtual Instrument Research
（1）Human-Computer Interaction is becoming a major component of computer-science-related fields, allowing humans to communicate with machines in much simpler ways and opening new dimensions of research.
（2）Kinect, the 3D sensing device Microsoft introduced mainly for the computer-games domain, is now used in many other areas.
（3）We use Kinect to control sound signals and produce aesthetic music.
（1）Kinect is a sensor that captures depth and color information of the user in front of it using RGB and infrared cameras. It can also capture sound input through an array of microphones.
（2）Musical Instrument Digital Interface (MIDI)
（3）MIDI is a technical standard that describes a protocol, a digital interface, and connectors, allowing a wide variety of electronic musical instruments, computers, and other related devices to connect and communicate with one another.
（4）A MIDI controller is hardware or software that generates and transmits MIDI data to MIDI-enabled devices.
（5）Our Virtual Instrument is a novel MIDI controller design based on Kinect!
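For background on what such a controller emits: a MIDI channel-voice message is just three bytes, per the MIDI standard. The helper below is a sketch of that standard encoding; the function names are ours:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// A MIDI channel-voice message is three bytes: a status byte (message
// type in the high nibble, channel 0-15 in the low nibble) followed by
// two data bytes in the range 0-127. Note On is 0x90, Note Off is 0x80.
std::vector<uint8_t> noteOn(uint8_t channel, uint8_t note, uint8_t velocity) {
    return {static_cast<uint8_t>(0x90 | (channel & 0x0F)),
            static_cast<uint8_t>(note & 0x7F),
            static_cast<uint8_t>(velocity & 0x7F)};
}

std::vector<uint8_t> noteOff(uint8_t channel, uint8_t note) {
    return {static_cast<uint8_t>(0x80 | (channel & 0x0F)),
            static_cast<uint8_t>(note & 0x7F), 0};
}
```

For example, `noteOn(0, 60, 100)` starts middle C on channel 1 at velocity 100.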
（1）Visual C++: Programming Language
（2）OpenNI: Kinect Driver and API functions
（3）OpenCV: Visual Interaction display
（4）MIDI: Music signal to control MIDI instrument
（5）Cubase and VST instrument: MIDI instrument (Audio Library)
（6）Once the depth information is captured by Kinect, the user's skeleton with 24 joints can be obtained through OpenNI functions.
MIDI Programming and Audio
（1）For MIDI signals, we use the RtMidi API by Gary P. Scavone of McGill University, sending MIDI messages to the audio library to control each note's key, duration, and velocity.
（2）The VST instruments from Steinberg serve as the audio library and can simulate the sound of real instruments.
（3）Using MIDI as the input mechanism, the VST instruments vividly output the sounds as instructed.
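As a sketch of how a note's key, duration, and velocity map onto MIDI traffic: a sustained note is a Note On followed, after the duration, by a Note Off. In the real system the byte vectors would go out through RtMidi (e.g. `RtMidiOut::sendMessage`); the sender and timing callbacks below are stand-ins of ours:

```cpp
#include <cstdint>
#include <functional>
#include <vector>

using MidiMessage = std::vector<uint8_t>;
using MidiSender = std::function<void(const MidiMessage&)>;

// Plays one note on channel 0: Note On now, Note Off after durationMs.
// The send/wait callbacks are illustrative; real code would use an
// RtMidiOut port and a timer.
void playNote(uint8_t key, uint8_t velocity, int durationMs,
              const MidiSender& send,
              const std::function<void(int)>& waitMs) {
    send({0x90, key, velocity});  // Note On
    waitMs(durationMs);           // hold for the note's duration
    send({0x80, key, 0});         // Note Off
}
```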
（1）Regions in front of the user are identified as the Kick, Snare, Hi-Hat, and Cymbal.
（2）The left hand, right hand, and right knee are used as triggers against the regions specified above.
（3）When the coordinate of a triggering joint crosses a specified threshold with respect to a defined region of the virtual drum set, the program sends a MIDI signal, which then triggers the sound in the audio library.
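The thresholded trigger can be sketched like this. The pad layout, coordinate convention (z as distance from the sensor), and all numbers are illustrative, not the system's actual values:

```cpp
#include <cassert>

// A virtual drum pad is a rectangle in front of the user plus a depth
// threshold; a hit is registered when the tracked joint (hand or knee)
// is over the pad and has pushed past that threshold.
struct Pad { float xMin, xMax, yMin, yMax, zThreshold; };
struct Joint { float x, y, z; };

// Returns true when the joint is inside the pad's rectangle and closer
// to the pad than the threshold; the caller then emits the pad's Note On.
bool padHit(const Pad& pad, const Joint& j) {
    bool inside = j.x >= pad.xMin && j.x <= pad.xMax &&
                  j.y >= pad.yMin && j.y <= pad.yMax;
    return inside && j.z < pad.zThreshold;
}
```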
（1）As with the drum, the user's skeleton is captured first using Kinect and OpenNI.
（2）Then, according to the skeleton coordinates, we set a chord-selection area in front of the user's left hand, divided into six regions, each representing a chord.
（3）To play the virtual guitar, we also place a virtual guitar string in front of the user's right hand. When the right-hand coordinate falls within a specific interval, the program sends a MIDI signal to the audio library, which triggers the corresponding sound.
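The chord selection and the strum can be sketched as two small checks. The vertical six-way split and the sign-change strum test are our illustrative reading of the description, not the actual implementation:

```cpp
#include <cassert>

// Left hand: pick one of six chords by vertical position inside the
// chord-selection area (region boundaries are illustrative).
int selectChord(float leftHandY, float areaTop, float areaBottom) {
    if (leftHandY > areaTop || leftHandY < areaBottom) return -1;  // outside
    float h = (areaTop - areaBottom) / 6.0f;
    int idx = static_cast<int>((areaTop - leftHandY) / h);
    return idx > 5 ? 5 : idx;  // chord index 0..5
}

// Right hand: a strum fires when the hand crosses the virtual string,
// i.e. the previous frame was on one side and the current frame on the other.
bool strummed(float prevX, float currX, float stringX) {
    return (prevX - stringX) * (currX - stringX) < 0.0f;
}
```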
（1）The program virtually draws a circle around the user and divides the circumference into several intervals. The user's hands then control the key and the volume.
（2）The closer a hand gets to the circumference, the louder the volume. As with the drum and guitar, the sound comes from MIDI signals, which we connect to different audio libraries.
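The two mappings can be sketched as follows: the hand's angle around the user picks the interval (and thus the key), and its distance toward the circumference scales the volume into the MIDI velocity range. The interval count and mapping details are illustrative:

```cpp
#include <cassert>
#include <cmath>

const float kPi = 3.14159265f;

// The circle around the user is split into numIntervals arcs; the hand's
// angle relative to the user's center (at the origin) selects the key.
int keyFromAngle(float handX, float handZ, int numIntervals) {
    float angle = std::atan2(handZ, handX);  // -pi..pi
    float t = (angle + kPi) / (2.0f * kPi);  // normalized to 0..1
    int idx = static_cast<int>(t * numIntervals);
    return idx >= numIntervals ? numIntervals - 1 : idx;
}

// Volume grows as the hand approaches the circumference (radius r),
// mapped onto the MIDI velocity range 0..127.
int volumeFromRadius(float handDist, float r) {
    float t = handDist / r;
    if (t > 1.0f) t = 1.0f;
    if (t < 0.0f) t = 0.0f;
    return static_cast<int>(t * 127.0f);
}
```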
Advanced Spider King
The Virtual Cello
- Use two sensors to play
- Provide 17 scale tones in total for the instrument
- Can switch to other MIDI sounds, such as violin
- Read the X- and Y-axis gyroscope values from the sensor
- When a gyroscope value exceeds the threshold, send a message to the cello program
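The gyroscope check in the last two bullets can be sketched as an edge-triggered threshold, so one bow stroke sends one message rather than one per frame. The struct and the threshold value are illustrative:

```cpp
#include <cmath>

// Bowing trigger: the sensor streams gyroscope readings, and a message
// goes to the cello program when a reading first exceeds the threshold.
// The trigger re-arms only after the value falls back below the threshold.
struct BowTrigger {
    float threshold;
    bool armed = true;

    // Returns true exactly once per excursion above the threshold.
    bool update(float gyro) {
        float mag = std::fabs(gyro);
        if (armed && mag > threshold) { armed = false; return true; }
        if (mag < threshold) armed = true;
        return false;
    }
};
```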
The Virtual Piano (Leap motion)
- Use OpenGL to design the piano
- Separate keys into the upper and lower arrays
- Pre-select keys using colors to match keys and fingers
- Use a relative average threshold to trigger sound
- Use the two leftmost keys at the same time to trigger chords
- Use hand gestures (move left or right) to shift to higher or lower keys
Keynote Speech and Plenary Talk at International Conferences
- Keynote Speech, “The MINE Virtual Band – Music Performance via Sensors,” The Third International Conference on Visual Informatics 2013 (IVIC’13), Malaysia, November 2013
All the virtual instruments presented here (guitar, drum, Spider King) and several other virtual instruments were presented in a live concert at NCU on May 9, 2013.
—104 CSIE Concert – MINE Studio Virtual Instrument Performance (1/2)
—104 CSIE Concert – MINE Studio Virtual Instrument Performance (2/2)
—103 CSIE Concert – MINE Studio Virtual Instrument Performance (1/2)
—103 CSIE Concert – MINE Studio Virtual Instrument Performance (2/2)
—102 CSIE Parent-teacher meeting
—102 CSIE Concert – MINE Studio Virtual Instrument Performance
—102 CSIE MINE Lab – MINE Studio Virtual Instrument Demonstration
—101 CSIE MINE Lab – MINE Studio Virtual Instrument Demonstration