Explanations are convenient for large firms working in algorithms, AI, and ML
They don't give users control
They don't shake up power relations
They don't shed light on systems as a whole
It's irresponsible of researchers to jump on the explanation bandwagon without critically examining explanations themselves