Projects
Audio-Driven Animation
A project that uses audio input and Python scripting to dynamically control Live2D models with a custom lip-sync solution. Here's a demo of it driving a model with AI-generated audio created from my online content.
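For anyone curious how audio-driven mouth movement can be wired up in Python, here is a minimal sketch of the general idea rather than this project's actual code: it reads microphone audio with the sounddevice package, computes RMS loudness with numpy, and smooths it into a 0-1 mouth-open value that a host application's parameter API could then consume. The gain and smoothing constants are illustrative assumptions.

# Minimal sketch: map microphone loudness to a 0-1 "mouth open" value.
# Constants below are illustrative, not this project's actual tuning.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16000
BLOCK_SIZE = 512      # roughly 32 ms of audio per callback
SMOOTHING = 0.6       # simple low-pass to avoid jittery mouth movement

mouth_open = 0.0

def audio_callback(indata, frames, time, status):
    global mouth_open
    # Root-mean-square amplitude of the current block, scaled into 0..1.
    rms = float(np.sqrt(np.mean(indata[:, 0] ** 2)))
    target = min(rms * 20.0, 1.0)  # 20.0 is an arbitrary gain factor
    mouth_open = SMOOTHING * mouth_open + (1 - SMOOTHING) * target
    # mouth_open would then be written to the model's mouth parameter
    # through whatever API the host application exposes.

with sd.InputStream(callback=audio_callback, channels=1,
                    samplerate=SAMPLE_RATE, blocksize=BLOCK_SIZE):
    sd.sleep(10_000)  # run for 10 seconds in this demo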
VTS Fullbody Tracking
A plugin that enables full-body tracking for Live2D models using camera-based motion detection.
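For context, camera-based body tracking in Python is commonly built on OpenCV plus MediaPipe Pose; the sketch below shows that general pattern (webcam frame in, 33 body landmarks out) as an illustration of the technique, not necessarily this plugin's actual implementation.

# Illustrative only: one common way to get body landmarks from a webcam.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # default webcam
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV delivers BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 normalized landmarks (shoulders, hips, wrists, ...)
            # that could be mapped onto Live2D body parameters.
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.LEFT_WRIST]
            print(f"left wrist: x={wrist.x:.2f}, y={wrist.y:.2f}")
cap.release()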
vts-fullbody-tracking.gitlab.io
PyNizima
A Python library that interacts with the NizimaLIVE API via WebSocket, allowing developers to create plugins within Nizima LIVE.
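Under the hood this is ordinary WebSocket messaging; the sketch below shows a bare connection using Python's websockets package. The URL and message shape are placeholders, not the documented Nizima LIVE protocol, and PyNizima exists precisely to wrap those details for you.

# Bare-bones WebSocket round trip; endpoint and payload are placeholders.
import asyncio
import json
import websockets

async def main():
    # Placeholder address; the real host/port comes from Nizima LIVE's settings.
    async with websockets.connect("ws://localhost:22022") as ws:
        await ws.send(json.dumps({"Type": "Example", "Data": {}}))  # hypothetical message
        reply = json.loads(await ws.recv())
        print(reply)

asyncio.run(main())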
Github - PyNizima
Live2D Rigging
Discover More of My Work
Follow me on social media for free resources, tutorials, behind-the-scenes content, and exclusive updates on my latest experiments.
About Me
I'm a Tech who loves testing features, learning new tools, and experimenting with processes. I explore limits and possibilities to gain a deep understanding of how things work, analysing the pros and cons and the most and least efficient approaches for different situations.