It’s easy to see that far too many of our public schools have failed the youth and young adults of our nation since the 1960s. The increasingly authoritarian mindset of the Democrats, and their belief that they know better for your kids than you do, is horrific as well. We need to eliminate the federal Department of Education at the very least, and ultimately privatize the entire system. It is unacceptable, and destructive both to our society and to their own lives, to have citizens coming out of high school and college thinking anything about socialism or communism is good, not understanding taxation or basic economics, spewing divisive social-justice talking points, believing they’re owed something, and thinking government is a charity and more efficient than the free market. Colleges especially, with their worthless classes and degrees like gender studies, are no longer benefiting society; they’re creating customers and guaranteeing more future tax revenue and authority for themselves.
Mind you, I have been a public school teacher for the majority of my adult life, so this isn’t coming from some “conspiracy theorist” or whatever rubbish demonizing label, the kind of talking point that is also far too often found in our schools, media, and politics. As I teach my students: always question everyone in the media, government, and education. Don’t believe a word I or they say; look into all of this yourselves. If I’m wrong, tell me how, factually, instead of just emotionally attacking me. Thank you.