Here's what we're talking about on Leid Stories today.
Is the United States currently in deep crisis or not?
Why the hesitancy by the Biden administration to officially declare a national crisis and be clear about what must be done to avoid further downward spirals?
Why such a strong effort to "calm" the people by making them think that things are under control?
Why the unwillingness to face the facts?
Why has America's all-time pandemic -- racism and race oppression -- never been a top priority to tackle and solve? Is it accepted that it is and will continue to be a permanent reality of American life that will never be eradicated?