>>2884561
Tried it. Was a bit confused for a while, but figured out what was happening.
From the top:
1) Base setup, 1280x720
2) 1280x540, to show how a different screen aspect ratio leads to squished output on a screen if it isn't adjusted for. The center screen has an 8:5 ratio, while the screen on the left is 16:9. (For anyone using this effect: you can have multiple screens with different sizes or ratios that use texture coordinate bounds other than 0 to 1, in order to sample part of the screen without distortion; see the first sketch after this list.)
3) First test, removing three minus signs on line 102. Only the skydome was visible, so I increased the viewing angle to get something to show up; oddly, this had no effect on the skydome, which might be some weird interaction caused by the differing math for pmx objects (whose world matrix is always identity) and accessories.
4) Result of changing SubCameraMatrix to use <+カメラtest> (a sketch of that kind of bone binding follows this list).
5) Result of using <+カメラtest> and also removing all three uses of inverseCtrl.
6) Demonstration of the relative positions of the bones: +カメラtest has the 'opposite' rotation of カメラ and the same base position, but is not moved when カメラ moves.
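Here's roughly what the UV-bounds trick from 2) looks like; this is a minimal sketch, not the effect's actual code, and RemapScreenUV / SubViewSampler are made-up names:

    // Remap a screen mesh's 0-1 UVs to a sub-rectangle of the rendered
    // texture, so a screen with a different aspect ratio can show part
    // of the render without squishing it.
    float2 RemapScreenUV(float2 uv, float2 uvMin, float2 uvMax)
    {
        return uvMin + uv * (uvMax - uvMin);
    }

    // e.g. sample only the vertical middle 75% of the render:
    // float2 uv = RemapScreenUV(Tex, float2(0.0, 0.125), float2(1.0, 0.875));
    // float4 color = tex2D(SubViewSampler, uv);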
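For 4) and 5): the usual way an MME effect grabs a bone's world matrix is the CONTROLOBJECT semantic, so the SubCameraMatrix change presumably amounts to something like the declaration below ("sub_camera.pmx" is just a placeholder for whatever model owns the bone, and the last comment is my guess at what inverseCtrl is for):

    // World matrix of the +カメラtest bone, taken from the controlling model.
    float4x4 SubCameraMatrix : CONTROLOBJECT < string name = "sub_camera.pmx"; string item = "+カメラtest"; >;

    // Presumably the inverseCtrl steps invert a matrix like this one to turn
    // the bone's world transform into a view matrix for the sub camera.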
It turns out CameraPosition does not affect the vertex shader: it's only used for calculating edge width, ShadowMapDistance, and specular color. The vertex shader just uses the position from the last row of the matrix.
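To spell that out (a sketch; the CameraPosition declaration is standard MME, SubCameraMatrix is assumed to be this effect's sub-camera bone matrix, and the static initializer follows the usual MME pattern):

    // MME camera position semantic; here it only feeds edge width,
    // ShadowMapDistance, and specular, not the vertex transform.
    float3 CameraPosition : POSITION < string Object = "Camera"; >;

    // The vertex path instead reads the position out of the translation row
    // (the last row, in MMD's row-vector convention) of the matrix:
    static float3 SubCameraPosition = SubCameraMatrix[3].xyz;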
I didn't try combining the position from the original カメラ (from its matrix) with the rotation from +カメラtest, because it's clear that the only reason the outputs differ is that the カメラ bone has a non-zero translation, which moves it away from +カメラtest. If the movement comes only from parents, like rotation of the Viewpoint bone, they have the same matrices.
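For reference, combining the two would just mean taking the rotation rows from one bone matrix and the translation row from the other; a sketch, with CameraCtrl and CameraTestCtrl as hypothetical CONTROLOBJECT bindings:

    float4x4 CameraCtrl     : CONTROLOBJECT < string name = "sub_camera.pmx"; string item = "カメラ"; >;
    float4x4 CameraTestCtrl : CONTROLOBJECT < string name = "sub_camera.pmx"; string item = "+カメラtest"; >;

    // Rotation (rows 0-2) from +カメラtest, translation (row 3) from カメラ.
    static float4x4 CombinedCtrl = float4x4(
        CameraTestCtrl[0],
        CameraTestCtrl[1],
        CameraTestCtrl[2],
        CameraCtrl[3]);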
(Note that in an extreme case, one can output a matrix's values somewhere as screen colors to figure out what's going on, but I didn't have to do that here.)
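If anyone does need that trick, it can look like this (a sketch assuming SubCameraMatrix is declared as above; values outside 0 to 1 have to be scaled into the visible range first):

    // Debug pixel shader: paint the four matrix rows as four horizontal
    // color bands, so the values can be color-picked from a screenshot.
    float4 DebugMatrixPS(float2 uv : TEXCOORD0) : COLOR0
    {
        float4 row = SubCameraMatrix[0];
        if (uv.y > 0.25) row = SubCameraMatrix[1];
        if (uv.y > 0.50) row = SubCameraMatrix[2];
        if (uv.y > 0.75) row = SubCameraMatrix[3];
        // shift values around -1..1 into the visible 0..1 range
        return row * 0.5 + 0.5;
    }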
Long comment