Become a VTuber with nizima LIVE, the Official Live2D App
This time we're introducing "nizima LIVE", tracking software from the company behind "Live2D", the essential tool for creating VTuber models!
This site also explains how to use software such as Facerig, Animaze, prprlive, and VTubeStudio, so I'll summarize nizima LIVE in the same way.
What are the features of nizima LIVE?
Being the official Live2D app, nizima LIVE has many excellent features.
In particular, I think the following two are revolutionary.
- Lots of parameters, yet easy to understand!
- Collaboration streaming is easy, with no more complicated setup!
"Vtuber tool that can be used by beginners to professionals for the best Live2D expression"As it is labeled as
It has intuitive clarity and depth of setting.
There is a big difference between the conventional tracking software that asks the creator to set up the avatar and the culture that "the user just touches" to
"the user can also adjust".
Facerig → VTubeStudio → nizima LIVE
Will it seize the next era of supremacy, like a Sengoku warlord...?
Let's see how it does...!
What is the tracking accuracy?
My impression is that it feels quite accurate.
Although it tracks fewer points than Animaze, it captures the movement you intend using nothing but a webcam.
A weak point of Facerig and Animaze is that they output the captured facial expression directly as-is, which makes them fragile with half-closed blinks and tilted faces.
In nizima LIVE, as in VTubeStudio, the tracking results seem to be adjusted so the model looks cute or cool, and complemented so that it doesn't look broken.
I suspect that is also why it feels so accurate.
Free trial available
You can use nizima LIVE for free for a set amount of time after launching it.
Once that time runs out, a popup appears and the app stops working, but you can simply restart it and keep using it without any problem.
Each session is fairly short (40 minutes), though, so if you want to use it for streaming or videos, sign up for a paid plan!
At ¥550 per month it's cheap. A Starbucks drink costs more!
| | nizima LIVE for indie (general users / small businesses) | nizima LIVE for business (medium-sized or larger businesses) |
| --- | --- | --- |
| Target audience | Most recent annual sales under ¥10 million | Most recent annual sales of ¥10 million or more |
| Monthly plan (tax included) | ¥550/month | ¥3,300/month |
| Annual plan (tax included) | ¥5,280/year (¥440/month) | ¥31,680/year (¥2,640/month) |
| Payment method | Credit card (VISA/Mastercard, JCB) | Credit card (VISA/Mastercard, JCB) |
| Payment terms | Lump-sum prepayment | Lump-sum prepayment |
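For reference, here is a quick check of how much the indie annual plan saves compared with paying monthly, using the prices from the table above (just a throwaway Python snippet):

```python
# Discount on the indie plan, using the prices in the table above (tax included).
monthly_price = 550      # yen per month
annual_price = 5_280     # yen per year

yearly_if_paid_monthly = monthly_price * 12          # 6,600 yen
savings = yearly_if_paid_monthly - annual_price      # 1,320 yen

print(f"Paying monthly for a year: {yearly_if_paid_monthly} yen")
print(f"Annual plan: {annual_price} yen ({annual_price / 12:.0f} yen/month)")
print(f"Savings: {savings} yen ({savings / yearly_if_paid_monthly:.0%} off)")
```

In other words, the annual plan works out to ¥440/month, about 20% cheaper than paying month by month.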
Works with iPhone
Even if you don't have a webcam, it works as long as you have an iPhone X or newer!
If anything, the tracking results will be more accurate than with a webcam.
Because depth is detected by the "TrueDepth" face-recognition sensor, it captures a proper 3D image, whereas a webcam tends to produce flat, planar data.
3D model cannot be used
Note that, unlike Animaze, it does not handle both Live2D and 3D; it is for Live2D only.
How to use nizima LIVE
Honestly, I don't think you'll run into many problems with nizima LIVE.
You won't get stuck the way you might with Facerig or Animaze!
That said, there are a lot of elements you can change, so let me introduce them!
Basic operation
- Scaling the model: mouse wheel
- Moving the model: middle click (wheel button)
It's genuinely easy-to-understand software, and things like camera settings and background transparency are obvious at a glance from the UI, so I'll pick out only the parts that are harder to figure out.
How to set facial expressions
You'll need to ask the avatar's creator to prepare expression variations, but you can then assign those expressions to whatever shortcut keys you like.
There is also a device aimed at VTubers called the "Elgato STREAM DECK".
It's excellent: you can assign your favorite shortcut key and image to each button and switch expressions mid-stream.
How to connect with OBS
Facerig can present itself to other apps as a virtual webcam, but nizima LIVE cannot.
Instead, it has a background-transparency function, so capturing the nizima LIVE window directly will be the standard approach!
You can connect it with the same steps as the 3tene capture method on this page.
How to set up with iPhone
If you have an iPhone X or later, setup is easy: just connect the PC and the iPhone to the same network.
Conversely, it cannot be used if, say, the computer is on a wired connection while the iPhone is on a separate line such as 4G.
Simply make sure the IP address entered on the iPhone matches the one shown in nizima LIVE on the PC, and the two will work together!
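If you're not sure whether both devices are on the same network, checking the PC's local IP address is a good first step. Here is a minimal Python sketch (the 8.8.8.8 address is only used to discover which network interface would be used; no data is actually sent):

```python
import socket

# Print this PC's local IPv4 address so you can confirm the iPhone and PC are
# on the same network (the first three number groups should match, e.g. both
# 192.168.1.x). Connecting a UDP socket does not transmit anything; it just
# tells us which local address the OS would use for outgoing traffic.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    s.connect(("8.8.8.8", 80))
    print("Local IP:", s.getsockname()[0])
finally:
    s.close()
```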
How to import Live2D
Importing a model into nizima LIVE is very easy.
Facerig and VTubeStudio require steps like "put the model in this specific folder", but nizima LIVE is built around importing a file.
Clicking the import button opens Windows Explorer in a new window,
so load the corresponding file there.
You can import simply by opening the ".model3.json" file.
The basic export work is the same as for Facerig, so please refer to that article for how to output the model.
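As a quick sanity check before importing, you can verify that the export folder contains a ".model3.json" file and that the files it references actually exist. The following Python sketch is only an illustration: the key names ("FileReferences", "Moc", "Textures", "Physics") follow a typical Cubism export, and the folder name "MyModel" is hypothetical.

```python
import json
from pathlib import Path

def check_model_folder(folder: str) -> None:
    """Find the .model3.json in a Live2D export folder and check its references."""
    folder_path = Path(folder)
    json_files = list(folder_path.glob("*.model3.json"))
    if not json_files:
        print("No .model3.json found - this is the file nizima LIVE imports.")
        return
    model_json = json_files[0]
    print("Found:", model_json.name)
    # "FileReferences" lists the moc3, texture and physics files the model needs.
    refs = json.loads(model_json.read_text(encoding="utf-8")).get("FileReferences", {})
    referenced = [refs.get("Moc"), refs.get("Physics"), *refs.get("Textures", [])]
    for rel in filter(None, referenced):
        status = "OK" if (folder_path / rel).exists() else "MISSING"
        print(f"{status}: {rel}")

check_model_folder("MyModel")  # hypothetical folder name - use your export folder
```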
Set parameters
If you have no complaints about how your model moves, you don't need to touch these settings.
- Blinking is so sensitive that the model blinks even though you haven't closed your eyes
- The mouth doesn't open and close properly
- The body doesn't follow even when you twist it
Change the parameters when there's a problem like these that you want to solve!
- Motion magnification: changes how the reading of your expression is reflected. If your eyes are wide open but the model's are only half open, try increasing the value.
- Smoothing: sets how quickly the animation switches. If the model reacts too sensitively, for example blinks start to flicker, try increasing the value (see the sketch after this list).
- Waveform editing: changes the degree to which the tracking result is reflected in the model. Its behavior resembles motion magnification, but it makes it easy to see what happens when you change an expression.
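To give an intuition for what smoothing does (this is a conceptual sketch, not nizima LIVE's actual code): it essentially blends each new tracking reading with the previous value, like an exponential moving average, so a higher setting means slower, steadier changes and fewer flickering blinks.

```python
# Conceptual sketch of "smoothing" (not nizima LIVE's actual implementation).
# smoothing in [0, 1): 0 follows the raw tracking exactly; closer to 1 is steadier.
def smooth(previous: float, new_reading: float, smoothing: float) -> float:
    return smoothing * previous + (1.0 - smoothing) * new_reading

value = 0.0
for raw in [0.0, 1.0, 1.0, 0.0, 1.0]:  # jittery blink readings from the camera
    value = smooth(value, raw, smoothing=0.6)
    print(round(value, 3))  # the output rises and falls more gently than the input
```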
That said, it isn't always a good idea to change the movement sensitivity or physics calculation.
Depending on the creator, the model may already move exactly as intended and come with a request not to change it, so adjust carefully!
The closest equivalent in VTubeStudio would be the wind-sway settings.
The body can end up moving unnaturally, so be careful not to set things inappropriately in nizima LIVE either.
Set up to your liking!
Most of the tracking software released recently is very easy to use.
VTubeStudio in particular has been my most-used app over the last few months, because its UI is easy to understand and the tracking results don't look unnatural.
Its drawback, however, is that some of the parameters are in English and hard to understand.
Since nizima LIVE is official Live2D software, it is simple and beginner-friendly, and even
if you want to make complicated settings, they are mostly in Japanese, so it's not that difficult.
I think it will become the next-generation standard software, solving the drawbacks and problems of what came before.