Niji and Hololive are both seriously hindered by the subpar in-house model tracking apps they force their talents to use. Those apps are old, outdated, and archaic compared to VTube Studio, which most indies use.
From personal experience working with Live2D models and being close to riggers, the difference between the showcases riggers post for these corpos and the actual on-stream tracking points to a lot of incompetence in the software.
I'd be surprised if the girls can actually adjust the in/out values that control the sensitivity of their tracking parameters. (That's something you can easily adjust per person in VTube Studio, so it doesn't have to be insanely difficult to smile, frown, or look angry.)
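For anyone who hasn't touched this stuff: an in/out mapping is just a linear remap from the raw tracking range to the model parameter range. A minimal sketch of the idea (the function and values here are illustrative, not VTube Studio's actual code):

```python
# Hypothetical sketch of per-parameter in/out mapping: clamp the raw
# tracking input to [in_min, in_max], then scale it linearly onto the
# model parameter's [out_min, out_max] range.
def remap(value, in_min, in_max, out_min, out_max):
    value = max(in_min, min(in_max, value))       # clamp raw input
    t = (value - in_min) / (in_max - in_min)      # normalize to 0..1
    return out_min + t * (out_max - out_min)      # scale to output range

# Someone with a subtle natural smile can narrow the input range so a
# small mouth movement still drives most of the smile parameter:
print(remap(0.25, 0.0, 0.4, 0.0, 1.0))  # 0.625 -> over half-smile from a slight smile
```

That per-person tuning is the whole point: the same raw face data can feel expressive or dead depending on these ranges.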
Hololive's software doesn't even output at the correct FPS for many Live2D riggers' physics. Not to mention, for some reason there is no arm parameter in their software, so arms that clearly have physics in the showcases stay stiff on stream. (Most models are rigged so that the arms and body move subtly with the head tracking.) Many of the live models simply can't be used to their fully rigged capabilities.
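Why does the wrong FPS matter? Riggers tune physics constants per frame, so if the runtime steps the simulation at a different rate than the rig was tuned for, the same sway settings play out slower or faster in real time. A toy illustration (this is not Live2D's actual solver, just the general failure mode):

```python
# Toy model of frame-rate-dependent physics: a simple "follow" motion
# like the ones driving hair or arm sway, tuned with a per-frame constant.
def settle_time(fps, follow=0.1, target=1.0, eps=0.01):
    """Counts how many seconds of wall-clock time the motion takes to
    settle when the per-frame-tuned constant runs at a given FPS."""
    pos, frames = 0.0, 0
    while abs(target - pos) > eps:
        pos += (target - pos) * follow  # constant tuned by the rigger per frame
        frames += 1
    return frames / fps

# Same tuning, different FPS: at half the frame rate the motion takes
# twice as long in real time, so the physics feel sluggish and wrong.
print(settle_time(60))
print(settle_time(30))
```

The per-frame settle count is identical, so halving the FPS exactly doubles the real-time settle duration, which is why physics tuned for one frame rate looks broken at another.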
What's worse is that Hololive gives their talents iPhone tracking, but they seem to completely refuse to utilize the best parts of the Face ID face tracking those iPhones provide for their software. The result is no good eye and mouth tracking, no eyebrow tracking, and none of the models being equipped with cheek puff / mouth x / tongue out parameters. Nowadays most models are made with full mouth forms too, but Hololive often gets models with a restricted set of mouth forms.
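To be clear about what's being thrown away: the iPhone's ARKit face tracking already reports weights for things like cheekPuff and tongueOut every frame, and hooking them up is basically a lookup table from blendshape names to Live2D parameters. A hedged sketch (the Live2D parameter IDs are illustrative, since every model defines its own, but the ARKit blendshape names on the left are real):

```python
# A few of the blendshapes ARKit face tracking actually reports, mapped
# onto typical Live2D parameter names (parameter IDs are illustrative).
ARKIT_TO_LIVE2D = {
    "jawOpen":        "ParamMouthOpenY",
    "mouthSmileLeft": "ParamMouthForm",
    "browInnerUp":    "ParamBrowLY",
    "cheekPuff":      "ParamCheekPuff",  # the data is there even if the app ignores it
    "tongueOut":      "ParamTongueOut",
    "mouthLeft":      "ParamMouthX",     # side-to-side mouth shift ("mouth x")
}

def apply_frame(blendshapes: dict) -> dict:
    """Turns one frame of ARKit blendshape weights (0.0-1.0) into
    Live2D parameter values, dropping anything the model can't use."""
    return {
        ARKIT_TO_LIVE2D[name]: weight
        for name, weight in blendshapes.items()
        if name in ARKIT_TO_LIVE2D
    }

# noseSneerLeft is dropped because this model has no matching parameter:
print(apply_frame({"jawOpen": 0.6, "cheekPuff": 0.8, "noseSneerLeft": 0.2}))
```

The phone is already doing the hard part; not wiring those weights to parameters is a software choice, not a hardware limit.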
Seriously, in Hololive's case they shouldn't have to toggle their emotions like they do just to show a subtle facial expression.
It's worth getting at least a shallow understanding of how Live2D art, rigging, and tracking work to fully appreciate how much their software is a complete debuff for the girls, when they could be so much more expressive.