Oh I absolutely do, and IMO this scientific irresponsibility is
far more dangerous within fields that actually call themselves "sciences" -- at least these "indexes" don't claim to be scientific (even if some people, my former self included, erroneously assume otherwise). There are certain fields with the word "science" in the title that are
notoriously awful in this respect. (I won't name names, for the sake of keeping the discussion simple, but they're not hard to find.)
Even the distinction between "hard science" and "soft science" is slippery, IMO. Rather than speaking of "hard" vs. "soft" science, we should focus on distinguishing good science from bad science. Even science performed on incredibly difficult and complex topics (with immensely numerous confounding variables, near-impossibility of controlled trials/experiments, etc.) can still be done correctly! One simply needs to adhere to statistical rigor: qualify conclusions drawn from data with appropriate levels of uncertainty, withhold conclusions that lack predictive power or statistical significance, and publish meaningful negative results just as readily as positive ones.
This trend of lax rigor within fields that call themselves "sciences" is incredibly dangerous: it threatens to erode the credibility of everything labeled "scientific" in the eyes of the general public -- the vast majority of whom do not have the time, energy, or expertise to evaluate each field or publication to determine how rigorous and honest it actually is.
When we read about problems of anti-intellectualism and public distrust in science, the first place we should look is to the "sciences" (and the bad journalism) that justify this distrust.