I think the question is what you mean by “overall activation differences” between two groups. We wrote a little piece on this (quite dense, but maybe useful); if you cannot access it behind the paywall, there is also a preprint. Check out chapter 5.2 for that specific question.
In short, removing the mean across voxels only removes “overall activation differences” if the mean effect is the same in all voxels (which is almost never the case). If the effect varies between voxels, you will just spread the mean signal across voxels. You can take the sensitivity of each voxel into account either by running a PCA on your data and taking the first component (more involved, not mentioned in our paper, and possibly affected by other factors), or by calculating a mean pattern across both groups, regressing that pattern out of each subject’s data, and working on the residuals. This is equivalent to identifying a direction in multivariate space that is occupied by both groups, i.e. the overall pattern is the same but differs in amplitude. I think that is what most people have in mind when thinking about “overall response differences”. We also mention in our paper when this approach no longer works.
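A minimal sketch of the second option (mean-pattern regression) in NumPy, assuming your data are already arranged as a samples-by-voxels matrix; the variable names and the random toy data are mine, not from the paper:

```python
import numpy as np

# Hypothetical data: rows = subjects/samples (both groups stacked),
# columns = voxels. Random data just to make the sketch runnable.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))

# Mean pattern across both groups (one value per voxel)
mean_pattern = X.mean(axis=0)

# Regress the mean pattern out of each row: find the per-subject
# scaling b_i that minimizes ||x_i - b_i * mean_pattern||^2 ...
b = X @ mean_pattern / (mean_pattern @ mean_pattern)

# ... and keep the residuals, which carry no component along the
# shared direction (the "overall response" axis)
residuals = X - np.outer(b, mean_pattern)
```

Each residual is, by construction, orthogonal to the mean pattern, so any “same pattern, different amplitude” effect is removed before the group comparison.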
Another possible approach would be to use the Haufe method on your SVM weights to reconstruct the discriminative activation pattern (which gives you a contribution of each voxel that is unaffected by the noise covariance) and then visually inspect that pattern (reviewers used to ask for this a lot in the early days, to check whether it is just a “simple” blob-like response). If most values are positive or negative, that indicates that the overall activation is higher in one condition than the other. But note that this is not a statistical test of the effect.
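For a linear model, the Haufe transformation amounts to multiplying the weight vector by the data covariance, A = Cov(X) @ w. A rough sketch with scikit-learn, on made-up two-class data (the simulated class effect and all names are my own, for illustration only):

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical two-class data: rows = samples, columns = voxels
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
y = rng.integers(0, 2, size=100)
X[y == 1] += 0.5                      # simple additive class effect

clf = LinearSVC(C=1.0).fit(X, y)
w = clf.coef_.ravel()                 # SVM weight vector (a "filter")

# Haufe et al. (2014): activation pattern A = Cov(X) @ w (up to scaling);
# unlike w itself, A is interpretable as each voxel's contribution
A = np.cov(X, rowvar=False) @ w

# Sign summary: fraction of voxels contributing positively
frac_positive = np.mean(A > 0)
```

If `frac_positive` is close to 1 (or 0), the pattern is dominated by one sign, hinting at an overall activation difference; as said above, this is a descriptive check, not a statistical test.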
Finally, you may ask yourself why it would matter whether the overall activation differs. If you can show something similar with more classical analyses: great! If you want to make claims about “fine-scale patterns”, that is difficult anyway. So, in most cases I don’t see a reason to control for overall activation differences.