The NFL has become too political, and most people just want to watch and talk football.
They have turned off many of their viewers with the anthem controversy, which is not smart.
The NFL needs to cherish all of its fans and market to them effectively.
If the NFL wants to make social statements, it should make statements that all fans can appreciate, like supporting the fight against breast cancer.
I really believe the NFL has made terrible mistakes by alienating half of America, and it should focus on marketing to all of America.
I would feel the same way if the NFL came out against causes on the left.