July 28th, 2015 // 8:12 am @ Oliver DeMille
by Oliver DeMille
I don’t mean it. I’m going to propose it, but I don’t really want it. Or think it’s a good idea. This proposal is meant to be ironic. But it still needs to be said, because there is far too much truth to it.
Supreme Parliament of the United States
In the wake of recent Supreme Court decisions, it’s clear that the Court doesn’t just try cases. It now writes law. It isn’t only a Supreme Court, it’s the de facto Supreme Parliament of the United States as well.
The Court uses some decisions to simply rewrite the laws of the nation, including the laws of the states. It’s been doing that for some time,[i] of course, but now it’s taking this approach to a whole new level. It has decided that the 9th and 10th Amendments are outdated, and it just ignores them.
For example, the Court labels Obamacare a “tax”, even though the Congress and President who proposed and passed it never called it that, and even though it skirts many state laws. The Court just makes up its own way.
Forget the actual case at hand; the Court is convinced that it has the power to create whatever it chooses out of thin air. Whatever the Court says, goes. Call it a “tax”. And call marriage a Constitutional right, even though the word “marriage” and the concept of marriage are never even mentioned in the Constitution or any of the Founders’ commentaries on the federal Constitution.
The Framers specifically left any and all decisions about marriage to the states. The Court has amended the Constitution without even using an official amendment.[ii] Many times. Just because it wants to.[iii] I’m not saying the Court got any of these recent decisions wrong, or right. That’s not my point. In fact, my point is much more important than any of these cases. I’m saying the Court has no authority in the Constitution to make many of its decisions.[iv]
It gave itself the power to do these things.[v] It just took the power. Such power didn’t come from the people or the Constitution.[vi] Such power isn’t legitimate authority. It is, to use the precise, technical word that the Founding generation used for this exact behavior: “tyranny”.
Whether you love the current Court’s decisions, hate them, or fall somewhere in the middle, the bigger picture is beyond the cases. The Court is now boldly and fully engaged in Judicial Tyranny.[vii]
The new rules of the Court: Just do whatever you want. You’re the Court, after all. Oh, and that pesky reality that the Constitution doesn’t give the Court the authority to do more than half of what it now does? No problem. Since you’re the Court, just announce that the Constitution does, in fact, give you such authority. In fact, decree that the Court has the Constitutional authority to do whatever you decide to do.
Jefferson warned that this very thing was the biggest danger to the Constitution and to American freedoms. And his prediction has come true. The Legislative Court has become one of the greatest dangers to our freedoms. Five lawyers literally have the power to do whatever they want.
The Proposed Change!
So here’s the proposal. I heard it on a radio show, and it made me laugh. Then it made me think. Then it made me mad. Check this out:
Since the Supreme Court now makes up any law it wants just by writing it up in a majority opinion, without bothering about what the House or Senate does, let’s balance the budget by just disbanding Congress. Why pay Representatives and Senators and their staff when the Court is just going to write up laws on its own anyway?
That’s the proposal. Let’s just get rid of Congress and let the Court keep doing its thing.
Again, I don’t really mean it. But at this rate, the Court is on pace to do this anyway. And in the meantime, it’s already behaving as the Supreme Court and the Supreme Parliament all in one.
One More Thing
By the way, the real solution is for Congress to pass legislation ending the use of precedent in the courts and limiting every Supreme Court decision to the scope of that one case. This will send many in the current generation of lawyers into a tizzy, but it’s the right thing to do. Assuming that we want to remain free. Such a change will immediately return the Court to its Article III powers.
Or, barring this solution, if we’re going to keep with the bad tradition of common law precedent,[viii] amend the Constitution so that 2/3 of the state Supreme Courts can overturn any decision of the Supreme Court. (More on Common Law in footnote “viii”.)
If we don’t do one of these, we literally might as well adopt the proposal above—because the Court is now operating as both the Judicial Branch and a Higher Legislative Branch.
[i] See, for example: Martin v. Hunter’s Lessee; Cohens v. Virginia; McCulloch v. Maryland; Gibbons v. Ogden; Missouri v. Holland; New York ex rel. Cohn v. Graves; U.S. v. Butler; U.S. v. Curtiss-Wright Export Corp.; Woods v. Cloyd W. Miller Co.; among others. See also: Bruno Leoni, Freedom and the Law, 3-25, 133-171.
[ii] Compare Article VI of the United States Constitution to Article III.
[iii] Some scholars and jurists will balk at this, arguing instead that the court “finds” or “discovers” the Constitutional meaning in the law. But while the Court may employ technical and/or logical language to support its decisions, it still utilizes its will. It may claim that its decisions are “findings,” and at times they are, but they are still always decisions. (If they were truly “findings,” matters of law without personal choice, all cases would be decided by 9-0 votes. Will is part of each decision.) Moreover, despite what is taught in some law school courses, the Framers clearly understood votes of the Justices to be acts of will, not mechanized requirements demanded by the laws.
[iv] Read Article III word for word. No such powers are granted.
[v] Review the cases listed in footnote “i” above. See also: John E. Nowak, Ronald D. Rotunda (Thomson-West), Constitutional Law, Seventh Edition, pp. 1-16, 138-156, 397-398.
[vi] Article III.
[vii] See how Raoul Berger warned of this a generation ago: Raoul Berger, Government by Judiciary.
[viii] Many in the legal profession argue that the Framers preferred Common Law to the other alternatives. Certainly there are a number of quotes from prominent founding leaders that at face value seem to support this view. In reality, most of the Framers preferred Common Law to Romano-Germanic Codifications. This was the major legal debate of the era, in Europe at least. Thus the Justinian model was soon to be followed by the Napoleonic Code. So when the Framers sided with Common Law over the Romano-Germanic model, it was taken as a blanket endorsement of the Common Law. However, some of the top Founding Fathers, including both Jefferson and Madison, preferred a third model, the Anglo-Saxon code and system, over Common Law. For excellent background on these competing systems, see: Rene David and John E. C. Brierley, Major Legal Systems in the World Today; John William Burgess, The Reconciliation of Government with Liberty; Theodore F.T. Plucknett, A Concise History of the Common Law. In short, common law builds on precedent; the Constitution the Framers wrote didn’t require the use of precedent by the Judiciary. In the Framers’ model, the Court was “supreme” in deciding any one case. Period. This keeps the Court separated in the judicial realm. It is an independent judiciary, unlike in Britain, because it has sole authority to provide the final determination in any one case. But separation of powers gives it no authority to use dicta or precedent to influence later cases. Any allowance of precedent creates the need to explain a decision, and moves into the realm of legislation. Common Law was not the intent of the Framers. Once the Constitution was ratified, however, the attorneys of the era, trained in the Common Law, simply kept practicing their system without change. The Anglo-Saxon code and model was quickly lost, to the detriment of American freedom. Most attorneys are unaware of this. An even lower percentage of non-attorney citizens understand it.
We lose our freedoms in many cases simply because we don’t know better.
June 18th, 2015 // 11:48 am @ Oliver DeMille
News after News
If Elizabeth Warren runs for president in 2016, it will change the whole dynamic of the election. So far she has said she’s not running, but we’ve heard that before from politicians who later changed their minds.
Since she first told everyone she isn’t going to run, a lot has happened. For example, the Right has visited an onslaught of negative press on Hillary Clinton. It’s been one criticism after another. Whether the stories are justified or not, this approach by the Right is news. As soon as one story dies down in the press, the Right pushes another one.
It’s a kind of shock and awe approach to negative politics. Break one story, let it run its course, fuel it as much as possible, and be ready to put out the next story when the current one starts to lose steam. If it’s not Benghazi, it’s Server-Gate. Then foreign donations to the Clinton Foundation. Followed by ducking reporters. Or FIFA.
The list is long and growing. No doubt Mrs. Clinton’s critics have an even longer list of additional news stories planned and ready to come out (especially now that her old emails will be released a few at a time for many months ahead).
As this continues, Clinton’s support may eventually take a more significant hit than it has so far. And she’s already down a bit in the polls.
But Democrats aren’t very worried. Mrs. Clinton is still very popular, and if something big comes up that derails her candidacy, there’s always Elizabeth Warren.
In many ways Senator Warren is a stronger candidate than former-First Lady/Senator/Secretary Clinton. She has fewer enemies, and her name doesn’t open the bank accounts of Right-Wing donors like Hillary’s does. Warren isn’t weighed down by a huge, national negative legacy from the past like Hillary—no Bill, no Benghazi, no Clinton Foundation, no sense of “the Clintons have their own set of rules”.
And Warren isn’t closely associated with President Obama like Hillary. In fact, Warren is known by many Americans mainly for standing against Obama and getting him to argue with her publicly. This makes many independents sit up and take notice. In fact, in recent polling over 60% of independents say they don’t trust Mrs. Clinton.
Beyond these things, Warren commands nearly all the positives for a candidate that Hillary enjoys: strong support among women, minorities, youth, and the Democratic base. She is well liked in swing states, she is known as more politically liberal than Hillary, and she is portrayed in the media as tough, independent, and dedicated.
Warren is an excellent public speaker, a mix of fiery and thoughtful, and, if anything, she’s more believable than Clinton.
In a sense, Warren is the anti-Obama of the Democratic Party. She could easily become the anti-Hillary in the presidential primaries, or the post-Hillary rescuer of the Party if Hillary hits a major scandal or roadblock.
In short, Warren is the real deal. She’s “legit,” as today’s youth like to say. I personally disagree with her on many political issues, but I’m impressed with how she’s running her campaign for president in 2016.
Playing ‘What If’
True: officially there is no such campaign. But if there were, she’s doing exactly the savvy thing right now:
- keep getting President Obama to mention you and sometimes argue with you
- say you’re not going to run and let the Right spend its money, time and energy tearing down Hillary
- keep using your position in the Senate to weigh in on important American issues and look increasingly like a leader
- stay out of the political fray and stick to governing for as long as possible—letting the host of Democratic and Republican candidates wear each other down
If she runs, she’ll almost certainly cause the Republican candidates more problems in the 2016 election than Hillary Clinton would.
With all that said, John Kasich may be the Elizabeth Warren of the Republican field. His role as Governor of Ohio gives him support in a vital swing state, and his rough-and-tumble record and disarming approach could be popular among many other swing voters as well.
Both Warren and Kasich have the “Credible” factor and the “Cool” thing going for them—they don’t come across as much like politicians as the other current presidential candidates do. Many voters’ experience with Warren and Kasich is that they’re the “grown-ups” in current politics.
Whether you agree or disagree, and however you feel about their politics, Elizabeth Warren and John Kasich are worth watching in the next twelve months. Either one, or both, could be a surprise major player in 2016.
*Image Credit: Tim Pierce.
June 1st, 2015 // 6:15 pm @ Oliver DeMille
(Should Presidents be Treated Like Royalty?)
News and Fans
A strange thing happened recently. President Obama visited my state, just a quick flight and a few meetings. But to watch the state news reports, you would think the most important celebrity in all of history was visiting. The media simply fawned over the president. The news channels broke into all the nightly television programs and gave special reports. Not on some major announcement, but on…wait for it…the President arriving at his hotel.
Reporters were on the scene. Anchors cut back and forth between the reporters and expert guests opining on how long the President’s flight had been and how much longer before he would arrive at the hotel. This went on and on. No news. No policies. No proposals. No events. No actual happenings.
Just a president and his entourage arriving in a plane, being driven to the hotel, and staying overnight. The reporters and other news professionals—on multiple channels—were positively giddy.
I shook my head in amazement. The American founders would have been…well, upset. At least Jefferson would. Adams? That’s a different story altogether. In fact, it got me thinking.
John Adams, the second president of the United States, made a serious mistake during his term in office. He had served for a long time as ambassador to the English crown, and he picked up some bad habits in London. He witnessed the way members of Parliament treated the top ministers in the British Cabinet, and how the ministers themselves fawned over the King.
He had, likewise, watched how the regular people in England bowed and scraped to the aristocracy. Almost everyone in Britain saw every other person as either a superior, an inferior, or, occasionally, an equal. This class system dominated the culture.
And, as an ambassador of the American colonies, Adams had been pretty low on the pecking order in London. He had seen how this aristocratic structure gave increased clout to the King, to Cabinet members, and to the aristocrats. So when he became president, he wanted the U.S. Presidency to have this same sense of authority, power, and awe.
In fairness, reading between the lines, I think President Adams’ heart was in the right place: He thought this would help the new nation gain credibility in the sight of other nations. He wanted the president to be treated like royalty.
But Adams forgot something very important.
Aristos and Servants
Americans weren’t raised in such a class system. They found such bowing-and-scraping to the aristos both offensive and deeply disturbing. For them, this was one of the medieval vestiges of the old world that America (and Americans) had gratefully left behind. The principle behind this was simple: if we treat leaders like aristos and kings, they’ll start acting like aristos and kings, and our freedoms will be in grave danger.
Moreover, Americans felt that such “prostration before their betters” was especially inappropriate where government officials were concerned. “Government leaders work for us,” was the typical American view. And if the citizens start acting like their leaders are aristocrats and royalty, they’ll stop thinking and acting like fiercely independent free citizens.
But still Adams wanted the people to treat the president like royalty. Jefferson was appalled at Adams’ words and behavior in this regard, and his criticisms of President Adams on this matter fueled a rift between these two men that lasted almost two decades. At times, the biting rivalry and harsh words bordered on hatred—if not of each other, certainly of the other’s behavior.
Jefferson believed in a democratic culture, where each person would be judged by his choices and character—not by class background. Even more importantly, he believed that, in fact, government officials do work for the people—and the higher the office, the more they are servants of the people.
As president himself, Jefferson went out of his way to avoid and reject any “airs” or special perks of office. He saw himself as simply a man, one of the citizens, who had the responsibility to serve all the other citizens. He wasn’t superior to them—he was the least of them, their servant, their humble servant.
After 8 years as president, Jefferson had established this as the way each president should be viewed: a regular citizen, serving as president while elected, but not treated as inherently royal, superior, or in any way better than anyone else. No perks, no special treatment. Yes, security for the Commander in Chief obviously needed to be different than for the average person. But other than that, no aristocratic fawning was appropriate.
Madison followed the same model, and after him Monroe. So did John Quincy Adams and other presidents through the eras of Andrew Jackson, Abraham Lincoln, Teddy Roosevelt, and FDR. Truman angrily rejected any attempts to put him on an aristocratic pedestal.
Eisenhower rebuffed aristocratic perks both as a general and later as president, and Kennedy did the same in many ways—leaving his expensive upper-class clothes at home and dressing more like the people he worked with each day while in Washington. Reagan openly pushed against any aristocratic airs; as a movie star he was accustomed to celebrity, but clearly avoided it as president. He wanted to be a man of the people, a servant.
And even while Adams tried to promote a higher level of fawning over the president, at least the media of his era had the good sense to criticize him for it. To take him to task. To note that such is not the American way.
But something changed, somewhere along the way. Was it because Kennedy was considered American “royalty” even before the presidency, or because Reagan was already a Hollywood celebrity before he occupied the White House? Or did the change come during the Clinton era, or the Bush years? (Ironic that such fawning over the Chief Executive came after Nixon.)
Or was it the way Hollywood portrayed presidents—with actors like Michael Douglas or Martin Sheen glibly spouting a sense of arrogant power and aristocratic entitlement? Did entertainment capture our national imagination?
When did we start seeing the world as classes of royals, aristos, and “regular” people? When did “celebrity” become an accepted reason for special treatment in America?
Should we respect the offices held by our political leaders? Absolutely. Should we honor those who serve? Absolutely. Too often the level of civic discourse today is coarse and angry. More respect is needed. And, just as importantly, less aristocratic fawning is needed.
I once wrote about experiencing this trend firsthand while hosting a visiting dignitary many years ago. After an evening event, we went to a restaurant. The dignitary and I naturally walked to the back of the line, but one of the other hosts hurried to the front of the line and arranged for a “VIP table.” As we walked past the rest of the line, people vocally grumbled. We heard words like: “Hey, don’t cut in line,” and “You can wait your turn, just like the rest of us.”
When the other host responded by telling the people in line the dignitary’s title and name, the frustration grew louder. “We’re Americans too,” one man said. “We don’t worship titles in this country. Wait your turn.” A number of similar words were exchanged. The dignitary demanded that we go back to the end of the line just like everyone else.
Fast forward over two decades later. A similar scene. I walked with a dignitary to the back of a large restaurant line, and someone in our group arranged for us to go straight to our table. As we passed the line, people asked, “Who’s cutting in line?”
But these voices weren’t angry. They were just curious. “It’s a celebrity,” someone said. “Oh, okay,” was the general consensus. “Of course. Let him ahead, then.” Apart from trying to see if they recognized the person, everyone in the line seemed totally content to let a celebrity pass.
The earlier line had been made up mostly of people from the Greatest Generation and the Boomers, while the second line consisted mainly of Gen X and Millennials. The U.S. clearly changed during the two decades between these experiences. But is the change a good one?
It’s an important question: Can we maintain a democratically free society when the broad culture sees people as “superiors” and “inferiors”? When we as Americans have come to see celebrities as naturally deserving special perks, and government officials as people to be submitted to? Won’t that change how we vote?
Indeed, it already has.
Focus and Fawning
Would John Adams like the way Americans now treat their presidents and other celebrities? What would the British aristos of the American Founding era think of Americans today? Are we just as “domesticated” as the London populace of their time? We don’t exactly “bow” yet, but have we become a nation that “bows and scrapes” before people we see as “our betters?”
And if this applies to “power celebrities,” meaning government leaders, what does that say about the American people as wise stewards and overseers of freedom? Politeness is good, no doubt. But which is classier:
- A people who move aside for aristos, craning their necks to catch a glimpse of celebrity and snap a photo on their phones, or,
- A people who think everyone is an important leader because we are all citizens of a great, free nation?
Whatever Adams would think of the way many Americans now turn giddy in the presence of political leaders, Jefferson, Washington, Franklin and Madison would not like it. Jefferson would certainly call foul. This is not the way to maintain a true democratic republic.
Treat everyone like aristos, or no one. Period. That’s the only way to be equal and fair.
Don’t treat some like aristos and others like commoners. Because if we do, freedom will quickly decline. (It will decline more for the commoners than the aristos, by the way.) Even Adams hated such obsequiousness when he witnessed it in Europe.
Clearly, as mentioned, the security of top leaders is very important. So is the security of everyone else, but it is true that top officials and celebrities often face more frequent threats. Also, more respect in our civil discourse is needed. But the servile toadying to political celebrity by many in our current populace and even media is a disturbing trend.
Again, we should certainly be respectful of important office. But sycophancy isn’t the American way—as Adams learned when the “commoners” refused to give him a second term. In fact, they instead entrusted the presidency of their nation to the leading voice of democracy and equal treatment for all citizens: Thomas Jefferson.
We should be so wise. Give us a president who stands for what matters—boldly, openly, and effectively—while simultaneously rejecting the perks and “airs” of aristocracy, or worse, an Americanized royalty.
Give us a president who vocally reminds us that he or she is just one of the citizens. And who openly remembers and operates on the principle that we, the people, are the real leaders of the nation. It’s time for the media’s servility toward the White House, and the White House’s smug superiority, to be replaced with what our nation actually needs: quality, grown-up leadership by someone who truly understands what makes freedom tick. The people.
As for the media, focus more on policy and principles, less on ratings and the “royals who lead us.” And none at all on what hotel somebody stays in. There are people with real needs in this country, and current policies are hurting millions of American citizens. Let’s fix them. Starting with spending our media airtime on what really matters.
April 28th, 2015 // 4:05 pm @ Oliver DeMille
Will College Get in the Way of Your Kids’ Education?
Even asking this question is downright politically incorrect. For many in the current generation of parents, it is akin to cultural heresy. But let’s think about it. The top colleges depend on the SAT, or in a few cases, the ACT, and later, once students are accepted and enrolled, these schools frequently employ multiple-choice exams of the same ilk.
If such testing is our national educational scoreboard—and it is—we have a problem. As you know if you’ve read my articles “Homeschooling and Testing, Parts I and II,” I’m not a real fan of the multiple-choice test as our national standard.
Actually, I readily embrace multiple-choice exams as one part of a student’s learning experience, along with essay exams, papers and projects as exams, oral exams, stand-and-show exams, and other types of testing that allow each student to truly demonstrate what he or she has learned, understands, and can do with the knowledge. But I’m against a one-size-fits-all approach to testing (or to education itself, for that matter).
Like Rachel Lynde teaches in Anne of Green Gables, people who have never been parents usually think that there is one good way to parent—but people who have had children know that there is truly a different right way to parent for each child. People truly are different. This lesson is essential, and our modern testing system forgets about it.
That’s bad for our nation and our future.
So, yes, I’m opposed to the way many modern classrooms teach to the standardized multiple-choice tests instead of teaching to the student. Each student’s learning should be individualized, personalized, and guided by one or more caring, committed mentors.
Education Job Training
Ideal? Yes! And that’s what our children deserve. Every single one of them. Why would we settle for anything less than the highest ideals in something as important as the education of our children and grandchildren?
Shame on anyone who suggests anything less than the ideal!
When I see a nation that calls itself a meritocracy, but whose highest court, presidency, and a disproportionate number of top corporate leaders have come only from the Ivy League—at least for the past two decades—and when getting into the League requires high scores on standardized multiple-choice tests, I shake my head. It’s a serious problem.
Why? Because such a system is not a meritocracy at all. There are many statistics about making more money in life after graduating from a top school, and about making more if you’ve finished college, but one really stands out. Most people don’t know about it, but it’s true. The statistics are clear: on average, the richer the family of the student who takes the SAT, the better the student scores. (Leon Botstein, TIME, March 24, 2015)
So, of course these students make more money in their life. In fact, the students of the wealthy who don’t complete top colleges also do financially better, on average, than the students of the poor or middle class—including many of those who do graduate from college. Call it elitist, aristocratic, or whatever you want, but this system isn’t a genuine meritocracy.
The common response to such a conversation is that universities need such tests to decide which applicants to accept—since college is all about job training nowadays. Well…I’m going to let that pass without argument, even though I very much disagree that universities should be mainly about career training.
But if schools are going to focus on career prep, then multiple-choice tests as the constant in the system are deeply flawed. As Leon Botstein wrote in TIME magazine: “The SAT is part hoax, part fraud.” The magazine also called it “a scam”. (Ibid., cover, 17) Why? Because, in Botstein’s words:
“[T]he test can’t predict a kid’s future…. As every adult recognizes, knowing something, or knowing how to do something, in real life is never defined by being able to choose a ‘right’ answer from a set of possible options (some of them are intentionally misleading) put forward by nameless test designers.
“No scientist, engineer, writer, psychologist, artist or physician pursues his or her vocation by getting right answers from a set of prescribed alternatives that trivialize complexity and ambiguity.” (Ibid., emphasis added.)
Quality tests reflect reality, not an imaginary matrix that allows a few experts to determine what they want our whole nation of children to learn (or not learn), which type of learner they prefer to favor, and whom they want to succeed. The key phrase in the whole subject of standardized, national testing such as the ACT and SAT comes from the same TIME article: “nameless test designers.”
A Skewed Approach
This is incredibly important to the future of our society. If the test writers were widely and openly known, by name, with a bio and a video of each of them talking about their core beliefs, political and cultural views, most important aspirations and goals, and perspectives on various important current issues and societal concerns, such tests would almost immediately end.
Parents and educators would watch such video clips (easily posted online with today’s technology), and the tests would immediately lose their credibility. I don’t say this because I know the test designers. I don’t. But I am convinced, based on reading and analyzing many of these exams, that they have an agenda.
How could they not? They are the Choosers in our “meritocracy.” They are the Givers. Their exams determine who will win and who will lose the contest—to get into the top institutions of higher learning. They are much more powerful than the Electoral College or the Senate Judiciary Committee. In fact, to a large extent, they unofficially but actually choose who will or won’t be eligible for selection by these very groups in the future.
If they openly told us who they are, why they write the tests the way they do, and what they stand for and against, we could evaluate whether we want our kids and grandkids to have the kind of education that is specifically taught in most schools (for the very purpose of excelling on exams written by, planned, and designed by these people). With such transparency, even if they maintained the support of a few, they would lose the support of many others.
And that would be good for our nation. Other types of testing would arise, and the market would support other competing agendas—by other test designers with competing views and goals—not just the one prescribed by this generally unknown group of “Givers” who select our future national leaders each time they create, approve, and/or reject a test question.
If we’re going to continue to be a nation run by the test designers (whom parents and teachers unwittingly hold up as the key to each child’s future), we should interview each of them and know that our kids are in good hands. If they aren’t open to such transparency, then why do we give them so much power over our children? And over our nation’s future?
Building a child’s education around meeting the demands and agendas of this group of unseen people—simply because it is “necessary for college,” at least if we teach to the test (and the more people care about the test, the more likely they are to do this)—is a bit strange. It’s akin to living in the 19th Century and using leeches to bleed a sick person because the “experts” assure us it is an excellent practice.
Really? Is that how we want to approach education?
The Way Things Are
I’m all for great education. For rigor and depth and intense study—especially among teens and college-age learners. I think testing is an important part of quality learning.
But the current system has it all wrong.
If your young person gains an education that is defined and driven by the standardized national tests in math, science, social studies, and language arts, then yes, college has very much gotten in the way of his or her education. It has narrowed it, milked out much of the natural passion for learning, and infused it with large doses of rote.
That’s what teaching to the tests usually requires. It has also bent your child’s education in a direction with a specific social/political agenda that many parents don’t support—but don’t know about.
Of course, it’s important to note that for the most part teachers have no control over this. Many teachers manage to do an excellent job of real teaching even though they have this system to overcome. And, as mentioned, few parents realize what is actually going on. They just want their kids to get into a good college.
In truth, I have no problem with young people taking these standardized exams. To get into college, it’s necessary. So take them. Sure.
But the problem arises: 1) in throwing out some of your child’s great educational opportunities to focus on teaching to the tests, and 2) in seeing his or her test scores and thinking you’ve won, or lost.
And, on the national level, the even worse problem is 3) that we’re no longer much of a meritocracy, because we’ve turned over the determination of “merit” almost exclusively to a group of behind-the-scenes “Givers”.
Far too many parents, educators, and others just go along with all three of these problems because “that’s the way things are.”
But they shouldn’t be this way.
And that’s my point. That’s what the idea of America was once all about—to make things the way they should be, to improve things that aren’t right, to fix the broken, and to choose true principles over bad customs. To put what’s right ahead of mere profit or promotion.
The Important Questions
These are by far the most important lessons in anyone’s education.
Without these lessons, no education is complete.
We can still teach these things, and parents have a lot of power to do so. But these things aren’t being taught on the SAT or ACT, nor in the teach-to-the-test lessons that precede these exams and dominate many schools. Nor, sadly, are they taught in most of the college classrooms that come after the long-coveted Elite University acceptance letter.
They’re going to have to be taught elsewhere.
Which brings us to this one central, vital, question:
How, where, and from whom are your kids going to learn them?
Not most schools. Not most universities. Not TV or the movies. Not surfing the Internet.
If parents don’t teach them, they probably won’t get taught.
And that’s going to drastically influence the future of America.
Which brings us to the final test, the real exam, the essential question every parent needs to answer:
What are you going to do about this in your home?
Here’s what we’re doing about it:
April 17th, 2015 // 8:33 am @ Oliver DeMille
(How the White House is Touting Misleading Economic “Recovery” Numbers)
Lies and Facts
Mark Twain popularized the idea that there are lies, darn lies, and statistics. The implication is that statistics are often the worst lies of all, because most people don’t really understand what they mean.
Lenin added that when money is part of the equation, very few people understand the numbers and what is really going on.
To bring this home, the White House keeps assuring us that the Great Recession is over, and that the U.S. economy is now doing much better. In the 2015 State of the Union Address, for example, President Obama tried to put the nation at ease about the economy. He told us that the economy is in recovery, the worst is past, and we can turn our thoughts to other topics.
Since that speech, the White House has repeatedly reassured us that the economy is now doing well.
But the facts simply don’t tell the same story. In fact, they tell a different tale indeed.
Recovery and Disaster
One of the biggest statistical “lies” is that unemployment is now under control. But this is a purely statistical fiction, based on the way the government calculates unemployment numbers.
Officially, according to the Bureau of Labor Statistics, the unemployment rate is now down to 5.5%. To compare, the average unemployment rate under the Bush Administration was 5.3%, and under the Obama Administration it has averaged 8.3%. So getting down to 5.5% is good. But it’s also misleading.
The 5.5 rate is this low only because the Labor Participation Rate (the number of people who have a job or are actively trying to get one) is way down. It’s currently only 62.7%, matching the lowest since 1978, which means that a lot of unemployed people have given up trying to get a job.
As columnist George Will put it: “If the work force participation today were as high as it was on the day Barack Obama was inaugurated, the unemployment rate in this country would be 9.7%.” That’s not a recovery; it’s a major economic disaster. And it’s getting worse, because the participation rate is still falling.
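Will’s 9.7% figure can be roughly checked with back-of-the-envelope arithmetic. Here is a sketch of the calculation, assuming a January 2009 participation rate of about 65.7% (a Bureau of Labor Statistics figure the article itself does not give):

```python
# Rough check of George Will's claim. The 65.7% January 2009
# participation rate is an assumed figure from BLS data, not
# something stated in the article.

current_participation = 0.627   # participation rate cited above
past_participation = 0.657      # assumed Jan 2009 participation rate
official_rate = 0.055           # official unemployment rate (5.5%)

# Normalize to a civilian population of 1, so the labor force
# equals the participation rate.
labor_force = current_participation
unemployed = official_rate * labor_force

# If participation were still at the 2009 level, the people who left
# the labor force would count as unemployed instead of disappearing
# from the statistics entirely.
adjusted_labor_force = past_participation
adjusted_unemployed = unemployed + (past_participation - current_participation)

adjusted_rate = adjusted_unemployed / adjusted_labor_force
print(f"{adjusted_rate:.1%}")  # roughly 9.8%, close to Will's 9.7%
```

The small gap between 9.8% and 9.7% comes from rounding in the published rates; the point stands either way.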
Numbers and Reality
Also, a large number of those who are now “employed” according to the government statistics are actually working in jobs that pay much less or offer a lot fewer hours than those they had before the downturn. Such people don’t consider themselves out of the Recession—and won’t until they get back to making as much as they did before.
For example, counting a person who lost a $32,000-a-year job and is now making $11 an hour for a shorter work week as “employed” might make sense on the statistical report, but it’s not good for the worker or his or her family. That’s an almost 50% pay cut, while the cost of living is still going up. And given the new Obamacare regulations, the number of hours isn’t likely to go up for these people.
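The arithmetic behind that “almost 50%” can be sketched as follows. The 29-hour week is my assumption (the article only says “a shorter work week”; 29 hours sits just under Obamacare’s 30-hour full-time threshold):

```python
# Illustration of the pay cut described above. The 29-hour week is
# an assumed figure; the article does not specify the hours.

old_salary = 32_000      # previous annual salary
hourly_wage = 11         # new hourly wage
hours_per_week = 29      # assumed shorter work week
weeks_per_year = 52

new_salary = hourly_wage * hours_per_week * weeks_per_year
pay_cut = 1 - new_salary / old_salary

print(f"new annual pay: ${new_salary:,}")  # $16,588
print(f"pay cut: {pay_cut:.0%}")           # about 48%
```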
The current unemployment statistics may look good on television, or play well in White House press briefings, but only because most people don’t know what the statistics actually mean or how they are calculated.
A lack of real, widespread education makes a people easy to sway, and easy to control. Too many of today’s citizens are accustomed to simply accepting whatever the experts and officials say without really thinking or even questioning. This reality makes freedom a lot more difficult to maintain.