VBridger is a face tracking plugin designed for VTube Studio and Live2D that allows the user to make better use of iPhone X ARKit tracking on their Live2D model.
All Reviews:
Mostly Positive (75) - 76% of the 75 user reviews for this software are positive.
Release Date:
April 10, 2022

Traditional Chinese Not Supported

This product does not support the language of your current location. Please check the list of supported languages before purchasing.

Buy VBridger

Buy VBridger + Editor Bundle (?)

Includes 2 items: VBridger, VBridger - Developer Mode

-15%
HK$ 119.85

About This Software

VBridger allows VTubers to use face tracking to its fullest potential

VBridger allows for the augmentation of tracking data, letting users combine, mix, and modify live data for use with VTuber avatars. It comes with several base settings and samples for different types of models. If you have a standard Live2D model, you can use our VTS Compatible settings to enhance the tracking quality. If you have a model rigged for ARKit using our parameter guide, you will be able to use our ARKit settings for advanced expression control. If you have an ARKit-compatible VRM model, you can use VMC to send tracking data, allowing you to use input curves to tune and calibrate the tracking to better fit your face.

With the VBridger - Editor DLC, riggers can unlock the full potential of VBridger and their rigs by gaining the ability to create new outputs and custom controls for their models. Use your face to toggle VRM expressions via VMC, or create logic-based expressions to add flutter to your Live2D wings; the possibilities are endless.

Currently available input sources:
• Tracking data from the iPhone ARKit FACS-based face tracking system via the following apps:
• iFacialMocap (iPhone)
• FaceMotion3D (iPhone)
• MeowFace (Android)
• VTubeStudio (iPhone)
• MediaPipe (Webcam)
• NVIDIA (Webcam) *Requires the NVIDIA webcam DLC from VTube Studio (it's free!)
• Additionally, use your microphone to generate audio inputs.


Currently available output software:
• VTube Studio, via the VTube Studio API, allowing for the control of Live2D models.
• Virtual Motion Capture (VMC) Protocol, allowing any VMC-compatible app to receive face data from VBridger. As long as an output shares the name of a BlendShapeClip on your VRM, you can control it with VBridger (see the sketch below).
VMC output only covers the model's facial tracking; it cannot send head rotation, eye rotation, or body control at the moment.
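For reference, the VMC protocol transports blendshape values as OSC messages over UDP ("/VMC/Ext/Blend/Val" followed by "/VMC/Ext/Blend/Apply"). The following is a minimal, hypothetical Python sketch of a VMC-compatible receiver using the python-osc package; the host and port (39539, a common VMC default) are assumptions and may differ from your VBridger output settings.

    # Minimal sketch of a VMC-compatible blendshape receiver (not an official
    # VBridger tool). Assumes OSC over UDP on port 39539, a common VMC default;
    # match this to whatever port your VBridger VMC output is configured to use.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    blendshapes = {}

    def on_blend_val(address, name, value):
        # /VMC/Ext/Blend/Val carries one clip name and its weight.
        # The name must match a BlendShapeClip on your VRM to drive it.
        blendshapes[name] = value

    def on_blend_apply(address, *args):
        # /VMC/Ext/Blend/Apply marks the end of one frame of blendshape data.
        print(blendshapes)

    dispatcher = Dispatcher()
    dispatcher.map("/VMC/Ext/Blend/Val", on_blend_val)
    dispatcher.map("/VMC/Ext/Blend/Apply", on_blend_apply)

    BlockingOSCUDPServer(("127.0.0.1", 39539), dispatcher).serve_forever()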

More on the way!

System Requirements

    Minimum:
    • Requires a 64-bit processor and operating system
    • OS *: Windows 7+
    • Processor: AMD / Intel CPU running at 2.5 GHz or higher
    • Memory: 1 GB RAM
    • Graphics: AMD/NVIDIA graphics card with at least 2 GB of dedicated VRAM and DirectX 11+
    • DirectX: Version 11
    • Network: Broadband Internet connection
    • Storage: 500 MB available space
    Recommended:
    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: AMD / Intel CPU running at 3.0 GHz or higher
    • Memory: 4 GB RAM
    • Graphics: AMD/NVIDIA graphics card with at least 4 GB of dedicated VRAM and DirectX 11+
    • DirectX: Version 11
    • Storage: 1 GB available space
* Starting January 1st, 2024 (PT), the Steam Client will only support Windows 10 and later versions.
