>>4052439 >>4051286
Both of you are wrong. From what I remember from the original poster of the bottom graphic, he used average live viewers, not max.
The formula is watched time = duration * (average viewers / views at end of stream). He then inferred that max live viewers got filtered by the same amount as average live viewers, which seems reasonable.
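If you want to sanity-check that formula, here's a quick sketch (all the numbers are made up for illustration, not taken from the graphic):

```python
# Average watched time per viewer, per the formula above.
# The inputs below are made-up example numbers, not real data.

def avg_watched_time(duration_h, avg_live_viewers, views_end_of_stream):
    # watched time = duration * (average viewers / views at end of stream)
    return duration_h * (avg_live_viewers / views_end_of_stream)

# e.g. a 2-hour stream averaging 5,000 concurrent live viewers,
# with 20,000 total views by the end of the stream:
print(avg_watched_time(2.0, 5000, 20000))  # 0.5 hours per viewer
```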
On the X*Y = Z example, you should probably think of it as X*A = Z instead. A is constant, so if Z has gone down, it's because X has gone down. Before or after that day in December, the duration or end-of-stream views can fluctuate, but on average you could say they are constant; likewise the ratio of average viewers to end-of-stream views should fluctuate but stay constant, on average, from one week to the next, which would mean the average watched hours also stay constant (you can infer this by logic: there's no reason the average watched time should fall for every single vtuber on the same day). You can see on sites like vnuma that durations stayed nearly the same, as did end-of-stream live views, but the actual live numbers (average or max) fell.
And someone said that rising viewer numbers would pull the average watched time up - no, it would not. More viewers pull both live viewers and end-of-stream views up by the same multiplier, so the live/end-of-stream ratio stays the same and the average watched hours don't change. But if you said that more viewers would pull the total watched time up, then yes, that's right. I've written too much and can't be bothered to go find that comment now.
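To spell out that last point with made-up numbers: scaling both live viewers and end-of-stream views by the same multiplier leaves the per-viewer average untouched but raises the total watched hours.

```python
# Made-up example numbers: double the audience and see what changes.
duration = 2.0     # stream length in hours
avg_live = 5000    # average concurrent live viewers
views_end = 20000  # views at end of stream

avg_watched = duration * (avg_live / views_end)  # per-viewer average
total_watched = duration * avg_live              # total watched hours

# Now scale everything viewer-related by the same multiplier:
k = 2
avg_watched_2x = duration * (k * avg_live) / (k * views_end)
total_watched_2x = duration * (k * avg_live)

print(avg_watched, avg_watched_2x)      # 0.5 0.5 -> average unchanged
print(total_watched, total_watched_2x)  # 10000.0 20000.0 -> total doubles
```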