====== VIVE Facial Tracker ======
You can use the [[https://www.vive.com/us/accessory/facial-tracker/|VIVE Facial Tracker]] to track the movement of the mouth.\\
\\
  * To apply the mouth movement to any VRM format model, VirtualCast combines the five preset BlendShapes of the VRM format (A, I, U, E, and O) to create the corresponding mouth shapes.\\
\\
{{:virtualcast:controller:118274476_971981893266896_7251530162495185971_n.jpg?200|}}
{{:virtualcast:controller:118402321_631977677704079_941384337031321510_n.jpg?200|}}
As a prerequisite, you need to have [[virtualcast:controller:viveproeye#sr_runtime%E3%81%AE%E5%AE%9F%E8%A1%8C|SR_Runtime]] installed, and the lip module must be recognized properly.\\
{{:virtualcast:controller:lipok2.png|}}\\
  * When the lip module is recognized
----
===== Settings in VirtualCast =====
You need to change a setting from the [[en/virtualcast/setting|settings]] in the title screen.\\
Turn on the check box: [Device] > [VIVE] > [VIVE Facial Tracker].\\
{{:en:virtualcast:controller:facial_en.png?600|}}\\
When everything is set properly, the lip module is activated during calibration and the tracking of the mouth starts.\\
{{:virtualcast:controller:lipok.png}}
  * When the lip module is active
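As a rough illustration of the idea behind combining the five VRM preset BlendShapes, the sketch below maps a few hypothetical normalized lip-tracker signals (''jaw_open'', ''mouth_wide'', ''mouth_pout'' are invented names, not actual SRanipal or VirtualCast parameters) to weights for the A, I, U, E, and O presets. This is not VirtualCast's actual mapping, only a minimal sketch of the technique.

```python
def lip_to_vrm_blendshapes(jaw_open: float, mouth_wide: float,
                           mouth_pout: float) -> dict:
    """Map hypothetical lip-tracker signals (each 0.0..1.0) to weights
    for the five VRM preset BlendShapes: A, I, U, E, O."""
    def clamp(v: float) -> float:
        return max(0.0, min(1.0, v))

    return {
        # "A": jaw open with a neutral (neither wide nor pursed) mouth
        "A": clamp(jaw_open * (1.0 - mouth_wide) * (1.0 - mouth_pout)),
        # "I": wide, mostly closed mouth
        "I": clamp(mouth_wide * (1.0 - jaw_open)),
        # "U": pursed lips, mouth mostly closed
        "U": clamp(mouth_pout * (1.0 - jaw_open)),
        # "E": wide and open
        "E": clamp(mouth_wide * jaw_open),
        # "O": rounded and open
        "O": clamp(mouth_pout * jaw_open),
    }
```

The resulting weights would then be applied to the avatar's preset BlendShapes each frame; the actual signal names and blending formula used by VirtualCast are not public, so treat the arithmetic above purely as an example of weighted BlendShape mixing.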