Do you have any examples of Dems telling people things weren't bad? The closest thing I can think of is Dems saying "we know things are bad, but we're working on them and they're getting better." It feels like a Republican talking point that Dems think things are good.
"Bidenomics" is a right-wing attack phrase, and I've never heard Biden say the economy is doing great. So I'm not sure what your point was here.