Sunday, November 18, 2018

The Social Characteristics of Capitalism


In my last post, I mentioned that intellectuals are most likely to hold a Marxist world view and espouse Socialism as the ideal economic system.  Marx lived and wrote at the time when the dominant economic theory of Mercantilism was being replaced by a new theory that came to be called Capitalism.

In the world of Mercantilist economics, the entire wealth of the world was viewed as fixed.  Therefore, if one person or nation became wealthier, the increase had to come from someone else's becoming poorer.  The size of the economic pie was finite.  Economics was a zero sum game.  

In societies based on rank, status or caste, such as Europe through the mid-19th century, India, and some parts of Africa today, a person’s station in life is fixed.  He or she is born into a certain station and his position in society is rigidly determined by the laws and customs which assign each person either privileges and duties or disabilities.  Exceptionally good luck, such as saving the King’s life, may in rare cases raise someone into a higher rank.  Very bad luck, like getting caught stealing from a prominent person, can result in a person’s losing their status and being assigned to an even lower class.  But, as a rule, the conditions of the individual members of a definite rank or class can only improve or decline with a change in the conditions of the whole class.

In such societies, the individual is, primarily, not a citizen of a nation.  He/she is a member of his or her class or estate.  On coming into contact with a countryman belonging to another rank, he or she feels no sense of community.  There is only the gulf that separates the one from the other's status.

In Europe during the late Middle Ages, the diversity between classes was reflected not only in language, but also in dress.  The aristocrats spoke French.  The lower classes clung to their own native language, broken into local dialects, which the upper classes couldn’t even understand.  The various ranks also dressed differently.  No one could fail to recognize the rank of a stranger.

The main criticism leveled against the 18th century principle of equality under the law was that it abolished the privileges of rank and dignity.  It has, said the critics, atomized society, dissolving the natural subdivisions into faceless masses.  These masses are now supreme, and their materialism, their desire for creature comforts, has superseded the respectable standards of days gone by.  Now, money is king.  Quite worthless people enjoy riches, while the meritorious and worthy go empty-handed.

This criticism implies that, under the old ways, the aristocrats were distinguished by their superior virtue and that they owed their rank and their revenues to their moral and cultural superiority.  While the progressive foes of Capitalism disagree with regard to this evaluation of the old standards, they fully agree with condemning the standards of Capitalistic society.  As they see it, those who acquire wealth and prestige are not those who deserve well from their fellow citizens, but frivolous, unworthy people.

Now, nobody ever contended that under free-market Capitalism, those who do best are those who ought to be preferred.  What the democracy of the market brings about is not rewarding people according to their true merits, their inherent morality or worth.  What makes a person more or less wealthy is not the evaluation of his contribution from any absolute principle of justice or fairness, but an evaluation on the part of his fellow men, who apply the yardstick of their own personal wants and desires.  This is what the democracy of the market means.  The consumer is king.  The consumer wants to be satisfied.

Millions of people like to drink Pepsi.  Millions like detective stories, mystery movies, tabloid newspapers, football, whiskey, cigarettes, chewing gum, etc.  The entrepreneurs who provide these things in the best and cheapest way succeed in getting rich.  What counts in the frame of the market is not academic or moralistic judgments of value, but the valuation actually manifested by people in buying or not buying.

To the grumbler who complains about the unfairness of the market system, only one piece of advice can be given.  If you want to acquire wealth, then try to satisfy the public by offering them something that is cheaper or that they like better.  Try to supersede Pepsi by mixing another beverage.  Equality under the law gives you the power to challenge every millionaire.  In a market not sabotaged by government-imposed restrictions, it is exclusively your fault if you do not outstrip the chocolate king, the movie star, the computer software writer, or whoever.

But if, instead of the riches you might acquire by engaging in providing commercial goods or services, you prefer the personal satisfaction you might get from writing poetry or philosophy or music, you are free to do so.  Of course, you won’t make as much money as those who serve the majority of consumers.  Those who satisfy the wants of a smaller number of people collect fewer votes - dollars - than those who satisfy the wants of many.

It’s important to realize that the opportunity to compete for the prizes society has to allocate is a social institution.  It can’t remove or even alleviate the innate handicaps by which nature discriminates against many people.  It cannot change the fact that many are born sick or become disabled later in life.  The biological equipment of people rigidly restricts the fields in which they can serve.  Danny DeVito won’t ever be able to compete with Michael Jordan in basketball.

In the same manner, the class of those who have the ability to think for themselves is separated by an unbridgeable gulf from the class of those who can’t.  In a society based on caste, the individual can credit fate for the conditions of life beyond his or her control.  He is a slave because the supernatural powers that determine what people will become have assigned him to his rank.  It’s not his doing or the result of any mistakes he made and, therefore, there is no reason for him to be ashamed of his humble station in life.

His wife can’t find fault.  If she were to complain to him: “Why aren’t you a duke?  If you were a duke, I would be a duchess,” he would simply reply: “If I had been born the son of a duke, I wouldn’t have married you, a slave girl, but I would have married the daughter of another duke.  Your not being a duchess is your own fault; why weren’t you more clever in choosing your parents?”

It’s another thing entirely under Capitalism.  Here everybody’s station in life depends on his own doing, the choices he makes.  Everybody whose ambitions have not been gratified knows very well that he has missed chances, or made mistakes, and that he has tried and been found wanting by his fellowman.  If his wife criticizes him:  “Why do you make only $150 a week?  If you were as smart as our next door neighbor, Joe, you’d be a foreman by now and I would enjoy a better life,” he becomes conscious of his own inferiority and feels humiliated.

The much maligned unfairness of Capitalism consists in the fact that it handles everybody according to their contribution to the well-being of their fellowman, as judged by their fellowman.
The dominance of the principle “to each according to his accomplishment”, rather than the Marxist principle “to each according to his need”, doesn’t allow any excuse for personal shortcomings.
Everybody knows very well that there are people like her who succeeded where she failed.
Everybody knows that many of those whom she envies are self-made people who started from the same point that she did.  Worse than that, she knows that everyone else in her circle of friends knows it too.

What makes many feel unhappy under Capitalism is the fact that the economic system grants to each the opportunity to attain the most desirable positions.  Of course, these can only be attained by a few.
Whatever a man may have gained for himself, it is usually only a fraction of what his ambition has motivated him to win.  Right before his eyes, there are people who have succeeded where he has failed.  There are those who have outstripped him and against whom he nurtures, at least subconsciously, a feeling of resentment.

This is the attitude of the tramp against the person with a regular job, the factory hand against the foreman, the middle-manager against the vice-president, the vice-president against the company’s president, the person who makes $50,000 a year against the millionaire and so on.  Everyone’s sense of self-assurance and self-worth is undermined by the sight of those who have given proof of greater ability.  It’s human nature for everyone to overrate their own worth and what they consider their just rewards.

This suffering from frustrated ambition is peculiar to people who live in a free society.  It’s not caused by the freedom everyone has to compete, but by the fact that, in such a society, the inequality of people with regard to intellectual abilities, will power, motivation, and energy become clearly visible.  The gulf between what a person is and achieves, and what they think of their own abilities and achievements, is starkly revealed.  Day-dreams and demands for a fair world which would treat them according to their real worth are the refuge of all those afflicted by their lack of self-knowledge.

Therefore, it’s no wonder that the very success of economic and political freedom under Capitalism in the United States, reduced its appeal to later thinkers.  The narrowly limited government of the late 19th century possessed little concentrated power that endangered the ordinary person.  The other side of that coin was that it also possessed little power that would enable good people to do good.  And, in an imperfect world there were, and are, still many evils.

In fact, the very progress of society made the evils that were left seem worse.  This was the milieu in which Marx lived and wrote.  It was society on the cusp, transitioning from Mercantilism to Capitalism.  Marx saw poverty and naturally concluded that it must be the result of ill-gotten gains on the part of the bourgeoisie.  But Marx was nothing but a clerk who came in in the middle of the movie.  He saw the factories of the Industrial Revolution in the hands of private owners, while those who worked in those factories struggled for their very survival.  Never did he consider the risks involved in building the factories, inventing and building the machines, or any of the other a priori requirements that made the whole thing run.

Listening to Marx and his labor theory of value, people took the favorable developments for granted.  They forgot the danger to freedom from a strong government.  Instead, they were attracted by the good that a stronger government could accomplish—if only the government was in the right people’s hands.

These ideas began to influence government policy in Great Britain by the beginning of the 20th century.  They gained more and more acceptance among intellectuals in the U.S. during what is called the Progressive Era, but they had little effect on government policy until the Great Depression.
Contrary to popular notions, the depression was produced by a failure of the government in one key area - money - where the government had exercised exclusive authority since the ratification of the Constitution.  However, the government’s responsibility for the depression was not - and still is not - recognized.  Instead, the depression is still widely interpreted as a failure of free-market Capitalism.
That myth led the public to join the intellectuals in a complete change of view about the relative responsibilities of individuals and government.

Emphasis on the responsibility of the individual for his own fate was replaced by an emphasis on the individual as simply a cog in the great wheel of life, a pawn being thrashed about by forces beyond his control.  The earlier view that government’s role is to serve as an umpire to prevent individuals from coercing one another was replaced by the view that government’s role is to serve as a parent, charged with the duty of coercing some to give aid to others.  The hatred of Capitalism by intellectuals, and their embrace of Marxism, is directly related to the earlier discussion of how Capitalism, as a system, reveals the failure of people to conduct their pursuits with an eye toward meeting the demands of the consumer.

Intellectuals, such as doctors, lawyers, artists and writers, scientists, professors and teachers, etc., resent Capitalism precisely because it assigns to some a position that they themselves would like to have.  The so-called common man, as a rule, doesn’t have the opportunity to associate with people who have succeeded better than he.  He or she moves in the circle of other common people.
He or she never meets his boss socially and never learns from personal experience how different an entrepreneur, or an executive, is with regard to those abilities which are required for successfully serving the consumer.  Therefore, their envy and resentment are not directed against another living person, but against abstractions like management, capital, and Wall Street.  One can’t hate such an abstraction with the same bitterness that one may bear against a fellow human whom one associates with daily.

It’s different with those whom the special conditions of their occupation or their family ties bring into personal contact with the winners of the prizes that they believe should have been given to them.  With them, the feelings of frustrated ambition become especially piercing because they engender hatred of concrete human beings.  This is the case with people who are commonly termed intellectuals.  Let’s take, for instance, doctors.

Their daily routine and experience make every doctor cognizant of the fact that there exists a hierarchy in which all medical men are graded according to their merits and achievements.
The more famous and skilled set the standards of methods and innovations that the regular doctor must follow.  He must learn and practice those methods to keep up-to-date.  These eminent doctors were his classmates in medical school, they served with him as interns, and they attend the same medical meetings he does.  Some are his friends and they all address him with the utmost cordiality.

But they tower above him in the appreciation of the public and also in the amount of income they earn.  When he compares himself to them, he feels humiliated.  But, he must be careful not to let anyone notice his resentment and envy.  So he diverts his anger toward another target.  He blames the system and the evils of Capitalism.  If it weren’t for the unfairness of the system, his abilities and talents would have brought him the riches he deserves.

It’s the same with many lawyers and teachers, artists and actors, writers and journalists, engineers and chemists - people who are commonly called intellectuals.  They, too, are angered by the rise of their more successful colleagues and former schoolmates.  The anti-capitalistic bias of the intellectuals is a phenomenon that is not limited to the U.S., but it is more bitter here than in the European countries.  To understand why, you must understand the basic difference between Society in Europe and society in America.

In Europe, (capital S) Society includes all those who are prominent in any field.  Statesmen and government leaders, the heads of civil service departments, publishers and editors, prominent writers, scientists, artists, actors, lawyers, and doctors, as well as members of the aristocratic families, all make up what is considered the good society.  They come into contact with one another at dinners, teas, and charity balls.  They go to the same restaurants, hotels and resorts.  Access to European Society is open to anybody who has distinguished themselves in any field.  It may be easier for people of noble ancestry and great wealth, but neither riches nor titles can give a member of this set the rank and prestige that comes with personal distinction in their field.

(Capital S) Society, in this sense, is foreign to Americans.  What is called society in America almost exclusively consists of the richest families.  There is little, if any, social interaction between the successful businessmen and the authors, actors, artists and scientists, no matter how famous the latter may be in their field.  Most of the socialites are not interested in books and ideas.  When they get together, they usually gossip about other people and talk about sports like polo and tennis.  But even those who do like to read consider writers, scientists, and artists as people with whom they do not want to associate.  There is an almost insurmountable gulf which separates society from the intellectuals.  Consequently, American authors, scientists, and professors are prone to consider the wealthy businessman as a barbarian, someone exclusively intent on making money.

The professor despises the alumni who are more interested in the college's football team than in its scholastic achievement.  He is insulted if he learns that the coach gets a higher salary than a professor of philosophy.  Those whose research has given rise to new methods of production hate the businessman, whom they view as simply interested in the cash value of the research rather than its intellectual value.  Therefore, it’s significant that a large number of American professors sympathize with socialism.

If a group of people secludes itself from the rest of the nation in the way American socialites do, they naturally become the target of hostile criticism from those they keep out.  What they fail to see is that their self-chosen segregation isolates them and kindles animosities which make the intellectuals even more inclined to favor anti-Capitalistic policies.

Friday, November 16, 2018

Further Absurdity in Texas Education

Previously, I related how the Texas State Board of Education (SBOE) regards History as a laundry list of facts, dates, people, and events that must be taught.  This list goes by the name Texas Essential Knowledge and Skills (TEKS).  Further advancing the absurdity is the ongoing effort to "streamline" the TEKS.  As this article just published in the Texas Tribune reports, the streamlining was brought about by teachers complaining that there simply isn't enough time in the school year to teach all those facts.

https://www.texastribune.org/2018/11/13/hillary-clinton-helen-keller-state-board-education-texas/?fbclid=IwAR18yj4y3y7uV2qGBjT_lawJMLcP_xGUKez_ysjUlbX4NOe4xjKw7AO1dFc

Take a look at the makeup of the workgroups.  "Work groups made up of teachers, historians and curriculum experts were tasked with cutting repetitive and unnecessary requirements out of the social studies standards."   What's missing are business owners to provide insight into exactly what the marketplace is seeking in terms of "knowledge and skills".  They could tell the Board that knowledge of Hillary Clinton, Helen Keller, and Moses is NOT what they're looking for.  The global marketplace of the 21st century needs people who can THINK, not simply regurgitate facts from the past.

Another absurd, if not downright pernicious, aspect of the entire education system in Texas is the standardized test that students must pass in order to graduate.  This is the State of Texas Assessment of Academic Readiness (STAAR).  The salient question is readiness for WHAT?  Isn't education intended to prepare one for life as a productive member of society?  In researching the STAAR test, I found that "High school students must pass Algebra I, English I, English II, Biology and U.S. History end-of-course exams to graduate."  

If a young person intends to pursue a career in law enforcement, of what practical use is Algebra or Biology?  And one can live a happy, productive life, contributing to society, without ever thinking twice about William Shakespeare.  Such a test reflects the arrogance of those who still subscribe to the 19th century attitude of what constitutes an "educated" person.  Of course, those who advocate for the continuation of such a curriculum must do so in order to protect their jobs.  If Algebra ceased to be a required subject, there wouldn't be a need for so many Math teachers.

"Individual graduation committees must be established for students in 11th or 12th grade who have failed up to two of the EOCs. The committee determines whether a student can graduate despite failing the exams. The committee is composed of the principal/designee, the teacher of each course for which the student failed the EOC, the department chair or lead teacher supervising the course teacher and the student’s parent (or the student if at least age 18)."  So, here you have a young person's entire future in the hands of education bureaucrats.  Although the parent is included, practical experience with public education officials for the last 35 years has demonstrated that the parent is considered the LEAST important person in the entire process, even though it is the parents, as taxpayers, who make the entire system run.

Another trend prevalent in Texas has been the hiring of Curriculum Coordinators or Assistant Superintendents for Curriculum by school districts.  When one looks at the prerequisites in a job listing, one finds a requirement for a Master's degree in Education Administration.  No regard is given to the discipline in which someone received their undergraduate degree and taught.  So, you end up with someone who majored in and taught Biology being given full authority to tell a History teacher how to teach the discipline.  It is only in the field of public education that this occurs.  At the college level, a Biology professor wouldn't dream of trying to tell a History professor how to teach his class.  If he did, the History professor would promptly tell him to LEAVE.

As I've related before, the fundamental underpinning for the introduction of compulsory schooling in America in 1852 was the notion that the child is the property of the State.  In 1853, the Boston School Committee stated:


“The parent is not the absolute owner of the child.  The child is a member of the community, has certain rights, and is bound to perform certain duties, and so far as these relate to the public, Government has the same right of control over the child that it has over the parent…Those children should be brought within the jurisdiction of the Public Schools, from whom, through their vagrant habits, our property is most in danger, and who, of all others, most need the protecting power of the State.”

This insidious attitude continues to the present day in the actions of the SBOE and, in fact, by the entire system of public education.  History is supposed to be written in as unbiased a manner as possible.  Although every historian brings his/her own personal views to their interpretation, those who are honest will strive to control for such biases.  As David Hackett Fischer wrote almost 50 years ago, “A historian is essentially trained to be objective in his selection, analysis and interpretation of evidence. Unless he tries as much as possible to be objective, his person and work would hardly be respected.”

History should be taught the same way.  What the SBOE is doing is fighting over "whose" facts should be taught, based on ideology.  None of this prepares young people for real life.

Through the Scholarship of Teaching and Learning in History, we have 100 years of data showing that the "facts first" approach to teaching History doesn't work.  Unless they are professional historians, humans don't walk around with a bunch of facts about the past just waiting for a propitious moment to use them.  But, even historians cannot know everything about everything.  Someone who wrote their PhD dissertation on Medieval Europe will likely not remember many of the "facts" learned in their freshman American History survey course.  

People first think of questions to be answered, such as "Why are things the way they are?  How did they get this way?"  Teaching them that History is just "what happened in the past", without giving them the tools to research the past and find the answers hampers them in meeting the demands of the 21st century.


Sunday, November 11, 2018

When Did Privileges Become "Rights" and "Responsibilities"?

As Dr. Thomas Sowell of the Hoover Institution has stated, "Many things are believed because they are demonstrably true, but many are believed simply because they have been asserted over and over again."  For at least 100 years now, school children have been indoctrinated to believe that the rights granted in the U.S. Constitution are somehow "gifts" of a benevolent government and that enjoying those rights requires action, stated as responsibilities.  A random search using the phrase "rights and responsibilities of citizens in a democracy" yields a multitude of "hits", all saying pretty much the same things. 

The website of the Department of Homeland Security provides a chart:

In reading the left-hand column above, it becomes clear that someone, somehow, at some time, conflated Freedoms with Rights, intermingling the two as if they were the same.  Although it's impossible to pin the blame on any individual, it is pretty clear that this conflation of freedoms, or privileges, with rights and responsibilities began in the Progressive Era.  It was during this period that the Americanization Movement determined to "Americanize" the millions of immigrants who came.  The Progressives also viewed government as a force to be used to bring about social change.  This ran counter to how Americans had viewed government before.  In order to change attitudes, it became necessary to change expectations.  In order to ensure allegiance to the nation state, people had to be convinced that living in a free country isn't really free.  Because that freedom is granted by the government, citizens owe a debt of gratitude, in the form of certain responsibilities, to that government.

The best way to begin the indoctrination and to impart that message was in the schools.  Having begun in Massachusetts in 1852, by the turn of the 20th century, compulsory schooling had spread to almost every state.  Until that time, there was no written "American History" as we think of it now, a discrete "subject" to be learned in school.  As a captive audience, children would be taught a particular version of that history, designed to inculcate Patriotism.  This was also a period when there was great debate over the appropriate military policy to provide for the common defense.  Those who opposed a large professional, standing army took the position that America should rely on a "citizen army", ready to defend the country if called upon.  In order to ensure that young men would fight, it was believed they must be inculcated with patriotic fervor.

However, when carried too far, Patriotism becomes extreme Nationalism, which, in turn, can become Nativism.  In fact, it was Nativism that led to the creation of compulsory schooling in the first place.  This becomes a bludgeon to be used against those who don't agree.

To those of the founding generation, rights derived from natural law.  The "unalienable Rights" Jefferson wrote of in the Declaration of Independence were endowments from the "Creator", however one chose to define such an entity.  And, then, "That to secure these rights, Governments are instituted among Men."  It was clear, at least to such men as Washington, Adams, Hamilton, Franklin, and Madison, that the government was to protect those rights.  It did not bestow them.  While we can't sit down with them and have a conversation to determine exactly what they were thinking at the time, we do have their writings.  And, at least in terms of the Constitution, it's pretty clear that what they said is what they meant.

Perhaps the best source on the Constitution and Bill of Rights is the man who wrote them both, James Madison.  Although the principal advocate for passing the Bill of Rights in the House of Representatives, Madison personally believed it was wholly unnecessary.  A listing of rights could be dangerous, leading to the erroneous conclusion that only those rights specifically listed were actually protected.  Furthermore, most states already had bills of rights; a federal list would be redundant.

That argument was sufficient to ratify the Constitution, but many states ratified the document with the recommendation that a bill of rights be added immediately.  Those who opposed the Constitution, the Anti-Federalists, hoped that they could leverage these recommendations to achieve more than just a bill of rights. Many hoped that they could use this wedge to force a new constitutional convention that would make changes to the new government’s taxation and commerce clauses.  Madison then decided to co-opt the Anti-Federalists' arguments and propose a federal Bill of Rights himself.

But, when reading the first 10 amendments, it is clear that those "rights" are actually "freedoms", either a freedom TO, or freedom FROM. There is no "right to vote in elections of public officials" or "right to run for public office" in the Bill of Rights.  While voting and running for office are essential features of democracy, such ideas, as we think of them today, would have never entered the Founders' minds.  After all, only those who owned property were allowed to vote at the time.  If voting were a right, every citizen would have enjoyed that right from the beginning.

To those who maintain that voting is a right which carries a concomitant responsibility, one must ask why black citizens were denied that right well into the 20th century.  The 15th Amendment was ratified in 1870.  But, as I've written before, the thousands of African American men who fought in the World Wars considered themselves good citizens.  When they came home, they found that wasn't necessarily the case, at least in the states of the former Confederacy.  It took the Voting Rights Act of 1965 to rectify this injustice.  As for the right of running for public office, don't forget the "white primaries" the Democratic Party used in the 20th century to ensure blacks could not run.

As to "responsibilities", the Founders wrote in terms of people's responsibilities toward one another, not to the government.  They repeatedly talked about government being a necessary evil because of human nature.   As Alexander Hamilton explained in The Federalist Papers, No. 15: “Why has government been instituted at all?  Because the passions of men will not conform to the dictates of reason and justice, without constraint.”  In Federalist No. 55, Madison wrote, “As there is a degree of depravity in mankind which requires a certain degree of circumspection and distrust, so also there are other qualities in human nature which justify a certain portion of esteem and confidence.”  In order to control those passions, they repeatedly spoke of the need for Civic Virtue, which they viewed as necessary for the Constitution to operate successfully and endure.  Without virtue "nothing less than the chains of despotism can restrain them from destroying and devouring one another."

In a recent article in EdWeek (Vol. 38, Issue 10, Pages 1, 12-15), entitled "How History Class Divides Us", Stephen Sawchuk puts forth the argument that the extreme polarization we're experiencing today is due to History and Civics being removed from the curriculum as required subjects in many states.  I submit that the polarization is because of how History and Civics were taught throughout the 20th century.  Those who were left out of the original, biased version (women, blacks, Hispanics, etc.) began to write alternative versions.  While it should have come as no surprise that they would do so, conservative zealots who want to turn the clock back to the 1950s characterize this as "revisionist" history, put forth by "pointy-headed Liberal professors".

It certainly WAS revisionist, in the sense of telling the story from another perspective.  What the conservatives fail to understand is that, by continuing to demand a Back to the Future approach to American History, they play right into the hands of their "enemies" on the Left.  We see this being played out today on college campuses, through the suppression of free speech and the destruction of property, remnants of the 1960s.

Instead of drilling students with an endless litany of facts in History and Civics, continuing to stress the need to "know" all that stuff in order to fulfill one's "responsibilities" to the State, students should be taught the Civic Virtues the Founders had in mind as necessary to live peacefully together in society.  Beyond paying one's taxes, which keep the entire engine going, one need not perform any other particular "actions" to be a good citizen.  One can live a happy, productive life (as one defines it) simply by treating one's fellow citizens civilly.







Friday, October 19, 2018

Democratic Socialism – Putting Lipstick on a Pig


Since the Progressive Era in America, the Left has struggled mightily to label itself in a way that makes its world view palatable to the American people.  Realizing early on that Karl Marx’s theory of “historical materialism” was simply not accepted by the majority of Americans, the Left has always sought to cloak its Marxist vision of economics so as to fool as many as it can.  First, they called themselves Progressives, then Social Democrats, then Liberals.  Now, they have become Progressives once more, with many of them embracing the label Democratic Socialism to describe the same fundamental world view.

According to the Democratic Socialists of America, “Democratic socialists believe that both the economy and society should be run democratically—to meet public needs, not to make profits for a few.”   This is a purposeful juxtaposition of a type of government with an economic system, designed to confuse the millions who either never took Civics in high school, or have forgotten what they learned.  Apparently, it is believed that so long as people get to vote, everything is fine.  Unfortunately, there are millions of suckers out there who believe it, too.

The trouble is, using that criterion as a measure of legitimacy, the old Soviet Union was “democratic”.  There were elections in the Soviet Union.  But, there was only one political Party for which the people could vote – the Communist Party.




It has been recognized for well over two thousand years that there are just a few types of government: Monarchy, Aristocracy, Democracy, Republic, and Communism.

Plato
(429-347 B.C.)

The best form of government in Plato’s “Ideal State” was Monarchy, a “Philosopher King”, assisted by an Aristocracy, the “Guardians”.  In this Ideal State there will be three classes of citizens - the producing class, the warrior class, and the ruling class, each performing its proper function.  He claimed that each man, by the nature of his talents, belonged to one or the other of these classes and that there should be no overlapping, no moving back and forth from one class to another, or belonging to more than one class.  If such things happened, Plato declared, it would be "the ruin of the State."  Plato's Philosopher King was a man whose wisdom and understanding were refined beyond those of other men.  In other words, while everyone is born with the ability to reason, in the end only a few go beyond the understanding of others.  This gives them the right and authority to rule.

In his later years, Plato turned from considerations of the ideal state to more practical political matters.  While he turned away from the imaginary, he still utilized rational analysis in his later writings.  In The Statesman, he set forth what has become the classical understanding of the possible forms of government.  They are rule by one, rule by a few, and rule by the many.

He further divided these into what he termed the legitimate and perverted models of these forms of government.  He considered monarchy as the legitimate form of rule by one.  Its perversion would be rule by a tyrant or tyranny.  An aristocracy would be rule by the few and oligarchy would be its perversion.  Interestingly enough, he termed rule by the many as democracy, but gave no distinctive names to either its legitimate or perverted forms.

Aristotle
(384-322 B.C.)

Aristotle elaborated on Plato's thinking regarding the forms of government and made distinctions between acceptable and unacceptable practices.  In theory, or ideally, Aristotle said that monarchy - rule by one - might be the best form of government.  Thus, he said, "If there be some one person, or more than one whose virtue is so pre-eminent that the virtues or the political capacity of all the rest admit of no comparison with his or theirs, he or they can be no longer regarded as part of the state.  Such a one may be truly deemed a God among men."

He doubted, however, that such a situation would occur except in very rare cases, or that if it did, it would be quite as good as might be imagined.  In reality, the wisest and best intentioned would need the counsel of other men as well as their assistance in ruling, and law would be preferable to personal rule.  Besides, the most likely result of rule by one would be tyranny, which is the perversion of rule by one.  It occurs when a man rules in his own interest and for his own purposes, rather than in the interest of those whom he governs.

Aristotle followed Plato in stating what he viewed as the legitimate and illegitimate forms.  The rule of the few, if it is a good government, Aristotle called an aristocracy.  It would be the rule of the best qualified men in the country who would be expected to rule in the best interest of all the people.  The perversion of aristocracy is oligarchy, which is the rule by the few in their own interest.  The main point he made was that oligarchies tend to keep power perpetually in the hands of a few who use the government as if it were their personal possession.  Over time they become tyrannical.

Rule by the many has the potential for being the best form of government, according to Aristotle.  More precisely, he believed that the best government would be one which included both the few - men of wealth and high intellect - as well as the many - including those from both the lower and middle ranks.
The middle class provided the best hope for good government. He wrote that "It is plain, then, that the most perfect political community must be amongst those who are in the middle rank, and those states are best instituted wherein these are a larger and more respectable part; or, if that cannot be, at least greater than either of the other classes; so that being thrown into the balance it may prevent either scale from preponderating."

Since the majority of the populace would have some part in governing, the laws would be more readily obeyed.  Such a government would be termed a "polity" or constitutional government.  Interestingly, Democracy was the term Aristotle used to describe the perverted form of rule by the many.  He did not object to rule by the many so long as it was rule by law and moderated by thoughtful and experienced men.  However, the perversion occurs when "the multitude have the supreme power and supersede the law by the decrees.  This is a state of affairs brought about by demagogues."   In other words, rule by emotion.  Such a government acts not in what is good for the people of the country but what appeals to the worst inclinations of the people collectively.  Its tendency is toward mob rule.

Madison, Jefferson, Washington, Adams, et al.

The impact of Greek thought and practice on the founding of the United States was more indirect than direct.  The U.S. was never composed of city-states.  The American idea of the rule of law was taken more from the Roman and British example than that of Athens.  The Roman influence on the political institutions and practices of the United States was very great.  Rome had a constitution going back to the Twelve Tables in 450 B.C. and forward through many changes in governmental arrangements until the very end of the Roman Republic.

The United States was styled a republic on the model of Rome and our Constitution provides that the states are guaranteed republican governments as well.   But, the Founders were also well aware that the Roman Republic disintegrated and Roman government reverted to the rule of one when Julius Caesar had himself declared dictator for life in 44 B.C.

They also tended to view Democracy, at least what is called “direct” democracy, in much the same way Aristotle had.  If all the people had an equal voice, without any method of controlling their passions, the majority could always vote away the rights of the minority.  This is why we were given a Representative Republic, with separation of powers and checks and balances, rather than a Democracy.  To call America a “democracy” is simply a glib and easy reference to the fact that people get to vote.  But, as stated previously, so could the people of the Soviet Union.

The last type of government, Communism, is not a government at all.  It describes a Utopian State where the nature of man has been “perfected” so that no government is needed.  Despite calling the governments of the Soviet Union and China Communist, they are no such thing because a government did, and does, exist. 

Capitalism and Socialism

Neither capitalism nor socialism is a type of government.  They are economic systems.  The purpose stated by the Democratic Socialists, to meet public need, is more a statement of Utilitarianism, the greatest good for the greatest number.  Utilitarianism is a moral theory, not a form of government.  But, as an economic system, socialism can only exist where there is Plato’s perverted rule of the few – Oligarchy, ultimately becoming autocratic and totalitarian.  That's the only way to keep the people in order.  Putting lipstick on this pig doesn't change its fundamental nature.

What is most ironic is that those who most favor "Democratic Socialism" are intellectuals. What they don't seem to realize is that, in every country that has imposed a socialist economy, the intellectuals are the first ones killed by those in power.  They should be careful what they wish for.


Monday, October 15, 2018

The Carnegie Unit and the Core Curriculum - Both Are Dead, But Refuse to Be Buried

For over 100 years, the system used to judge a student’s achievement, both in high school and in college, has been based on the Carnegie Unit.  In Texas since 1987, there has been a standard core curriculum, requiring every college student to take a minimum number of credit hours in particular courses in order to earn a degree from state funded institutions.  Are either still accurate measures of education in the 21st century?

In 1905, retired steel magnate and philanthropist Andrew Carnegie, then the world’s richest man, wrote a letter to college presidents declaring his intention to establish a pension system for “one of the poorest paid but highest professions in our nation”—college professors.    He created the Carnegie Foundation for the Advancement of Teaching to run the system and sent a ten million dollar check to the Foundation’s trustees, led by Harvard President Charles Eliot, to finance it.  Carnegie said, “I have reached the conclusion that the least rewarded of all the professions is that of the teacher in our higher educational institutions . . . I have, therefore, transferred to you and your successors, as Trustees, $10,000,000 . . . to provide retiring pensions for the teachers of universities, colleges and technical schools.” 

To determine which institutions were eligible to take part in the Carnegie pension system, the Foundation had to define what a college was.  To be ranked as a college, and thus be eligible to participate in the Carnegie pension plan, an institution “must have at least six professors giving their entire time to college and university work, a course of four full years in liberal arts and sciences, and should require for admission, not less than the usual four years of academic or high school preparation, or its equivalent.” 

Before long, the Carnegie Unit became the central organizing feature of the American educational enterprise, a common currency enabling countless academic transactions among students, faculty, and administrators at myriad public, nonprofit, private, and for-profit institutions, as well as between education policy makers at every level of government.   The Carnegie Foundation established the Carnegie Unit over a century ago as a rough gauge of student readiness for college-level academics.  It sought to standardize students’ exposure to subject material by ensuring they received consistent amounts of instructional time.  The problem is that, while the universal and portable hour may make for a more efficient system, the unit also promotes the false perception that time equals learning, in the same way for all students. This was never the intent when the Carnegie Unit was first created. 

But, Ernest L. Boyer, former Carnegie Foundation President and U.S. Commissioner of Education under President Jimmy Carter, didn’t ignore this false perception.  In 1993, Boyer said, “I am convinced the time has come to bury, once and for all, the old Carnegie unit. Further, since the Foundation I now head created this academic measurement a century ago, I feel authorized this morning to officially declare the Carnegie unit obsolete.” 

The fact that educators and legislators today, 25 years later, are still requiring “seat time” to measure student achievement is a testament to not only their lack of creative thought, but also to a certain degree of professional hubris.   The Education establishment has convinced lawmakers that each and every student must complete a certain number of credit hours in particular subjects or the nation will suffer dire consequences.

In 1987, the 70th Texas Legislature passed House Bill (HB) 2183, which established the first core curriculum legislation, with a general intent to ensure quality in higher education.   Senate Bill (SB) 148, passed by the 75th Texas Legislature in January 1997, repealed all earlier legislation and sought to resolve concerns regarding the transfer of lower-division course credit among Texas public colleges and universities, while maintaining the core curriculum as one of the fundamental components of a high-quality, undergraduate educational experience.  More recent sessions of the Texas Legislature have fine-tuned the existing laws regarding core curriculum, but the essentials of the statutes have not changed since 1997.

This presupposes that there is some certain body of knowledge that everyone needs to know and that elected officials are somehow capable of deciding what that knowledge must be.  The oldest example of this is the legislative requirement that, at state funded colleges and universities, all students must take 6  hours of Government and 6 hours of American History in order to receive a degree, regardless of their major.  What was the genesis of this 1955 requirement? 

A look at the legislative history shows that it was a reaction to the “communist scare” of the late 40s/early 50s that resulted in such things as the House Un-American Activities Committee, the Hollywood Blacklist, and the antics of Senator Joseph McCarthy.  Apparently, it was believed that if students received a certain amount of instruction in those subjects, they would somehow be immune from the entreaties of communism.

In 2013, an effort was made to further codify the American History requirement.  Then State Senator Dan Patrick of Houston introduced a bill providing that only broad-based, survey courses could count to meet the 6 hour requirement.  In testimony supporting HB 1938, numerous individuals cited their understanding of the 1955 law’s purpose - that American history be part of a common core for all to enhance students’ civic knowledge and citizenship skills.  Fortunately, that bill did not pass.  But, it is sure to come up again in further sessions, particularly since Patrick is now Lieutenant Governor.

But, does a broad-based survey course in History actually do that?  First of all, the textbooks for such courses are far too expensive and become door stops at the end of the semester, unless they can be sold back to the bookstore, usually for less than 1/3 the original price.  Secondly, survey texts do not reflect current scholarship.  They are not based on primary or secondary sources.  They are based on each other.  Finally, survey courses promote mindless memorization of facts that can be easily tested.  This represents the lowest order of learning and does not represent real knowledge.

Another salient question that must be asked is just what are “citizenship skills”, and precisely who is authorized to judge?  In 1955, those making the judgment were the same legislators who thumbed their noses at the 1954 Brown v. Board of Education decision, requiring school desegregation “with all deliberate speed.”  Texas legislatures through the mid-60s steadfastly refused to comply with the Supreme Court’s ruling in that case and kept Jim Crow laws in effect in Texas.  Is that what is meant by good citizenship?  Is that the example that should be followed today?  Those who cite the 1955 law as somehow sacrosanct should think twice.  It might not be wise to hold up those 1955 legislators as paragons of wisdom and virtue.

Texas is actually one of the few states that require either History or Government, by law.  In other states, these are electives for those not majoring in these disciplines.    If one is concerned about what an individual actually needs to know to be a good citizen, an objective measure is the test given by U.S. Citizenship and Immigration Services to aliens seeking U.S. citizenship.  Students need not sit through comprehensive survey courses to learn the basics of citizenship.  No employer is going to ask a job candidate to explain what he/she knows about the Haymarket Riot of 1886, or the Progressive Era, or how our system of checks and balances works.  Knowledge of history is important.  But, it’s not that important to everyone, regardless of what History professors may say.

Most often, this is expressed in course outlines and syllabi with language that paraphrases the famous quote by George Santayana, “Those who cannot remember the past are condemned to repeat it."  But, people live in the present.   Historians do not perform heart transplants, improve highway design, or arrest criminals.  History is important in that it offers a storehouse of information about how people and societies behave.  But, we shouldn’t continue to overemphasize its importance, or decide for others how much importance it should have in their lives.  Most importantly, we must recognize that “seat time” is not the true measure of how one acquires that knowledge.

We’re living in the 21st century.  Today, we have communication technologies that were inconceivable in 1955, and when Boyer was writing in 1993. We can deliver information anywhere, anytime, and to nearly anyone. We can personalize instructional materials, teaching tools, and assessments.  Massive Open Online Courses (MOOCS), and other online courses that are slightly less massive and less open, may not be the future of education, but they are surely an example of how education is fast becoming more accessible, portable, and asynchronous.  Measuring student learning by “seat time” in this new educational era is obsolete, just as Boyer stated 25 years ago.



Friday, October 12, 2018

“Advance and Transfer” – Changing the Course of History Pedagogy

“Advance and Transfer” is a nautical term that describes what happens when a ship’s wheel is turned to a new course.  Due to its forward momentum, the boat or ship will continue on the previous course until the rudder ultimately causes a change in direction.  The larger the ship, the longer this transfer takes.  So it is with pedagogy in History.

In his 2006 JAH article “Uncoverage: Toward a Signature Pedagogy for the History Survey”, Dr. Lendol Calder quoted the late Roland Marchand: “Why are historians so incurious about learning?"  In speaking with public school history teachers, as well as professors at some local colleges, I’ve yet to meet anyone who is even aware of the Scholarship of Teaching and Learning (SoTL) in History and the work that’s been done over the past 12 years, researching how humans learn and, consequently, how they should be taught.  Despite some innovative attempts to depart from the “facts first” emphasis on content, such as the Amherst Project in the 1960s, History continues to be taught as it’s been taught for the last 125 years.

For public school teachers, this is somewhat understandable.  As Diane Ravitch pointed out, a large percentage of those teaching History are teaching out-of-field, defined as having neither majored nor minored in History in college.  Often, History teacher is spelled C-O-A-C-H.  “There appears to be the presumption that teaching history requires no special skills beyond the ability to stay a few pages ahead of the students in the textbook.”(1)

Even if they were inclined to keep up with current scholarship, they are often constrained by the directives of Boards of Education as to what must be taught and, in some cases, even how it is to be taught.  This, too, is understandable.  Individuals elected to such Boards were themselves taught that way.  History pedagogy is mired in 19th century assumptions and 20th century methods.  What were some of those 19th century assumptions?

In 1892, Harvard president Charles William Eliot headed up the Committee of Ten that standardized the curriculum which, in large measure, is still followed today in the public schools.  Starting with the view that there is a body of knowledge that everyone must know to be considered educated, the Committee’s recommendation was that "...every subject which is taught at all in a secondary school should be taught in the same way and to the same extent to every pupil so long as he pursues it, no matter what the probable destination of the pupil may be, or at what point his education is to cease."(2)

The key phrase there was the stricture that every subject be taught the same way.  Mathematics, the Sciences, and even English, with its grammatical rules, all rely on facts and lend themselves to rote memorization.  Neither the Pythagorean Theorem nor Newton’s Laws is based on interpretation.  In order to "do" Math, one must know that 2+2=4.

Another set of assumptions led directly to the imposition of compulsory school laws.  Contrary to popular thinking, compulsory schooling was not intended for education, in the sense of spreading literacy.  In fact, America was perhaps the most literate country in the world before the creation of government schools.  In 1812, Pierre DuPont published Education in the United States, a book in which he expressed his amazement at the high rate of literacy he saw here compared to Europe.  Forty years before the passage of our first compulsory school laws, DuPont found that fewer than 4 of every 1,000 people in the U.S. could not read and do numbers well.

Compulsory schooling was implemented for the purpose of social and behavioral control.  It is not coincidental that the first compulsory school law in Massachusetts came in 1852.  The late 1840s saw a huge influx of immigrants to the United States – the Irish, following the Potato Famine that began in 1845.

The cultural elites, especially in New England, viewed these newcomers with fear and suspicion.  In 1853, the Boston School Committee stated:

“The parent is not the absolute owner of the child.  The child is a member of the community, has certain rights, and is bound to perform certain duties, and so far as these relate to the public, Government has the same right of control over the child that it has over the parent…Those children should be brought within the jurisdiction of the Public Schools, from whom, through their vagrant habits, our property is most in danger, and who, of all others, most need the protecting power of the State.”(3)

Horace Mann, then Boston’s Commissioner of Education, put it more succinctly, “With the old not much can be done; but with their children, the great remedy is education.  The rising generation must be taught as our children are taught.  We say must be, because in many cases this can only be accomplished by coercion.  Children must be gathered up and forced into schools and those who resist and impede this plan, whether parents or priests, must be held accountable and punished.”(4)

In essence, children would be considered the property of the State, at least in the eyes of the WASP elite.  An integral part of their being “taught as our children are taught” was the creation of an American Creed.  This led to what Bruce VanSledright described as collective memorialization of an American creation myth, the “freedom quest, nation building narrative” of American exceptionalism.  "In a nation that has built itself off the backs of waves of immigrants, the push to use history education to 'Americanize' the hordes of 'outsiders' lobbies incessantly.  To sow allegiance to the nation state requires constant maintenance."(5)  Then, the immigration laws began to be manipulated to control the flow of those outsiders.  Only those judged capable of being properly Americanized were allowed in.  Such manipulation continues to the present day.

Added to this perceived need to “Americanize” the millions of immigrants who came to the United States in the 19th century was the cataclysm of World War I.  As Sipress and Voelker pointed out, “The Great War helped trigger a wave of anxiety regarding the basic education necessary to sustain democracy in a world that had recently proven so menacing to peace and stability.”(6) This led to what has become something of a shibboleth, that the study of History is “essential for good citizenship in a democracy.”

This revealed wisdom seems to be uniquely American, however.  Both Great Britain and Germany are democracies.  Yet, in a search of the reasons cited for studying history at the University of London and Humboldt University of Berlin, one does not find this citizenship essentiality.  But, conceding for a moment that there is such an essentiality, it raises a number of questions.

How does force feeding students a litany of dates, facts, and names contribute to good citizenship?  What is a good citizen?  Who gets to decide?

 In 1783, immediately following the Revolutionary War, George Washington wrote Sentiments on a Peace Establishment, setting forth his views on what peacetime military force would be needed to provide for the common defense. In this document, he said, “It may be laid down, as a primary position, and the basis of our system, that every Citizen who enjoys the protection of a free Government, owes not only a proportion of his property, but even of his personal services to the defense of it.”

To George Washington, military service was essential for good citizenship. Thomas Jefferson, on the other hand, believed that such would lead to a large standing army.  And, he believed that the “yeoman farmer”, ready to serve when called upon, was more in keeping with a free society.  Here we have diametrically opposing views on what constitutes good citizenship.

When the United States entered World War I, the Army developed classification tests to determine assignments to various military occupations. “Army Alpha” tested those who were fluent in English, while "Army Beta" was given to those who were illiterate or not yet fluent in English.  It was found that over 50% of American males tested could not read above a 4th grade level. Most lived on farms and had never traveled more than 10 miles from home.

In World War II, military technology had advanced much farther than in the previous war.  Unlike wars in the past, men could not simply be taken from their villages, given a weapon, and shoved into the ranks to create an effective army.  With the rapid advances in technology, World War II would demand men with complex skills to operate and repair the weapons of war.  Millions of men began their classification by taking the Army General Classification Test (AGCT).

Despite an intervening 20 years of supposed progress in education, immediately prior to Pearl Harbor over 347,000 men who registered for Selective Service merely made marks on their registration cards due to their inability to sign their names.  The problem of illiteracy was such a big issue that the Army struggled with finding a policy that would allow it to meet its manpower requirements while still maintaining a high level of efficiency needed to fight and win the war.  Despite all the "progressive" education of government schools, literacy in America went DOWN.

In both World Wars, thousands of American men could barely read. But, they served their country in its hour of need.  They certainly didn't need to "study" History to be good citizens.  In large measure, the GI Bill was enacted after World War II because of the low level of education that had existed at the start of the war.  It was realized that the advance of technology, particularly the technology of modern war, would require a higher level in the future.

One argument always put forth as a measure of good citizenship is voting. Voting is a matter of conscience.  But, it must be remembered that thousands of African American men either volunteered or were conscripted to fight in both world wars. They considered themselves good citizens.  Yet, they were denied the right to vote through such devices as the poll tax and literacy tests.  It took the passage of the Voting Rights Act in 1965, 20 years after the war, to correct this injustice.

Today, there are approximately 25,000 non-citizens serving in the American armed forces, with another 8,000 or so entering each year.  The services have a special program to help these men and women become citizens by the end of their first enlistment.  If they do not, they are not allowed to remain.  What do these non-citizens learn?  They learn the information necessary to pass the citizenship test given by the Federal government.  Are they to be considered “bad” citizens because they didn’t take the History survey course that’s part of the general education curriculum?

Most of the resistance to change comes from academia itself, a combination of hubris, laziness, and a desire for job security.  Elected officials, who exercise a great deal of control over education, act on the advice of those considered “experts” in their field, as if a PhD granted 40 years ago confers the wisdom of the ages.  I personally know professors who haven’t changed their lectures in at least 15 years.  Even the proposed adoption of a different History textbook brings opposition.

There is also a very strong ideological component to maintaining the status quo ante.  As James W. Loewen stated, “History can be a weapon.”(7)  By sticking to a “facts first” approach, those so inclined are willing and able to ensure a particular set of “facts”, and the interpretation of those facts, are taught.  In the Afterword of his A People’s History of the United States, the late Howard Zinn said, “What struck me as I began to study history was how nationalist fervor--inculcated from childhood on by pledges of allegiance, national anthems, flags waving and rhetoric blowing--permeated the educational systems of all countries, including our own.”(8)

However, in his zeal to ameliorate this nationalist fervor, Zinn committed what David Hackett Fischer termed the “converse fallacy of difference”(9), rendering a special judgment upon a group for a quality which is not special to it.  Two entire generations of American students have been indoctrinated to view Western Civilization in general, and America in particular, as the focus of evil in the world.  But, at least Zinn was open and above board in stating that he wrote from a particular ideological perspective.  Unfortunately, most textbook writers are not so honest.

The resistance to change stems mainly from a failure to consider the perspective of the students, a legacy of the Behaviorist theory of learning that dominated the 20th century.  This cast the teacher or professor as the font of all knowledge, with the mind of the student the empty vessel into which that knowledge would be “poured.”  Although this epistemological view has been replaced by Cognitive and Constructivist theories, the pedagogical course of the “ship” of History continues to advance, resisting a transfer to a new heading.

Although probably not consciously intended, the concept of “Backward Design”(10), developed by Grant Wiggins and Jay McTighe, follows the management method developed by W. Edwards Deming.  In essence, Deming advocated working backward from the customer, studying each process within a company, and making whatever changes are necessary to ensure the customer’s expectations are met.  Who are education’s “customers?”

In all cases, the students are the customers.  At the secondary school level, the expectations to be met are those of the parents and other citizens who pay the taxes that fund the public schools.  All too often, however, the attitude of teachers and administrators is not to meet those expectations, but to tell the customers what their expectations should be.  At the college level, it’s clear that the students are the customers.  They are the ones paying tuition.

However, society at large can also be viewed as a customer.  Businesses must use the “product” being produced by the educational system.  Little thought has been given to what knowledge and skills are needed in the 21st century.  YouSeeU is a global leader in soft skill development for higher education and corporate training.  In 2015, YouSeeU published a white paper entitled Curbing Global Automation: Why Our Future Rests in Soft Skills.  These are the skills that cannot be replicated by machines:

Social Skills – The ability to get along with others
Communication Skills – Oral, written, non-verbal & listening skills
Higher Order Thinking Skills – Problem solving, critical thinking & decision making
Self-Control – Ability to control impulses & focus attention
Positive Self-Concept – Self-confidence, self-efficacy & self-awareness

The discipline of history particularly lends itself to developing the higher order thinking skills because that’s what historians DO.  However, the most significant change must be in what students think ABOUT.  Expecting 21st century students to “think critically” about the Salem witch trials is putting a square peg in a round hole.  Rather than having non-History majors remember who the Robber Barons were, it would be more appropriate to have them think about why they came to be called Robber Barons in the first place.  Did they possess no redeeming qualities at all?  What contributions to society did muckrakers like Ida Tarbell make, other than fanning the flames of envy?

As Fischer also pointed out 48 years ago, “there are no facts which everyone needs to know – not even facts of the first historiographical magnitude.  What real difference can knowledge of the fact of the fall of Babylon or Byzantium make in the daily life of anyone except a professional historian?”(11)

To keep students engaged, they must see the relevance of the material to their lives and their aspirations for the future.  Professors who continue to decide a priori what they believe students should know--whether based on their own particular interests, because they believe their CV grants them the right to do so, or simply because “that’s the way we’ve always done it”--are doing a grave disservice to both the students and to the larger society.

Endnotes:


1 Ravitch, Diane, “The Educational Background of History Teachers”, Knowing, Teaching & Learning History: National and International Perspectives, New York University Press (New York, 2000), p. 144
2 National Education Association of the United States, Committee of Ten on Secondary School Studies, Report of the Committee of Ten on Secondary School Studies: With the Reports of the Conferences Arranged by the Committee, American Book Co. (New York, 1894), p. 17
3 Nasaw, David, Schooled to Order: A Social History of Public Schooling in the United States, Oxford University Press (New York, 1979), p. 77
4 Ibid., p. 78
5  VanSledright, Bruce A., The Challenge of Rethinking History Education: On Practices, Theories, and Policy, Routledge (New York, 2011), pp. 21-22
6 Sipress, Joel and Voelker, David, “The End of the History Survey Course: The Rise and Fall of the Coverage Model”, OAH Journal of American History, March 2011, p. 1055
7 Loewen, James W., Teaching What Really Happened, Teachers College Press (New York, 2010), p. 12
8 Zinn, Howard, A People’s History of the United States, 1492-Present, Harper Collins (New York, 1980), p. 685
9 Fischer, David Hackett, Historians’ Fallacies-Toward a Logic of Historical Thought, Harper Torchbooks (New York, 1970), p. 223
10 Wiggins, Grant and McTighe, Jay, Understanding by Design, ASCD (Alexandria, 1998), pp. 98-114
11 Fischer, op. cit., p. 311


Wednesday, October 10, 2018

The Absurdity of Public Education in Texas

In 2010, Texas became the laughingstock of the country as Americans witnessed the battle over rewriting the Texas Essential Knowledge and Skills (TEKS) that provide the blueprint for Texas textbooks—and standardized tests, teacher training standards, indeed, the entire curriculum.  TEKS is a gigantic “laundry list” approach to history, reflecting a methodology that focuses on “content mastery” as a measure of knowledge, as if simply being able to remember who (insert famous historical figure’s name here) was is necessary to go to college, get a job, or simply live a happy, productive life.

As the Dallas Morning News reported in September 2014, the ongoing debate over what are called “social studies skills” reflects a culture war, with conservatives on the State Board of Education maintaining that “social studies education has been too heavily influenced by left-leaning relativistic ‘empowerment’ rhetoric that reduces American history to a roll call of oppressed minorities and a sneering parade of robber barons.”

On the other side of the argument are those who charge that the texts “exaggerate the role of religion in the nation’s founding; deliberately downplay the separation of church and state; and, shamefully, minimize the disparities imposed by segregation.”

Adding to the absurdity is the fact that the problem-solving and decision-making skills outlined in the TEKS as being important for high school seniors are, word-for-word, the exact same standards held out for kindergartners.  As Dr. Keith Erekson put it, “I will leave it to the late-night comedians to identify the careers best suited for Texas high school graduates with kindergarten-level problem-solving skills, but I will say that the phrase ‘All I really need to know I learned in Kindergarten’ makes a far better bumper sticker than educational philosophy.”

To illustrate this absurdity, here are a few of the TEKS set down for Kindergarten students.  Keep in mind, we're talking about 5- and 6-year-olds, who cannot go to the bathroom in school without an adult.  In the Social Studies curriculum, these children are supposed to learn:

1) Students identify the role of the U.S. free enterprise system within the parameters of this course and understand that this system may also be referenced as capitalism or the free market system.  Children at this age don't even know how to count money.  How are they to understand capitalism or free market economics?

2) Students identify and discuss how the actions of U.S. citizens and the local, state, and federal governments have either met or failed to meet the ideals espoused in the founding documents.  This requires a significant degree of reasoning skill.  Even high school students have difficulty with such reasoning.

3) History. The student understands how patriots and good citizens helped shape the community, state, and nation.  The student is expected to:

   (A) identify the contributions of historical figures, including Stephen F. Austin, George Washington, Christopher Columbus, and José Antonio Navarro who helped to shape the state and nation; and

   (B) identify contributions of patriots and good citizens who have shaped the community.

This begins what Bruce VanSledright has termed "collective memorialization".  "In a nation that has built itself off the backs of waves of immigrants, the push to use history education to 'Americanize' the hordes of 'outsiders' lobbies incessantly.  To sow allegiance to the nation state requires constant maintenance."  It is indoctrination instead of education.  It introduces the idea that there are people, other than the child's own parents, who have the right and ability to identify "patriots" and what constitutes a "good" citizen.  But, even "bad" citizens have shaped the nation.   

4) History. The student understands the concept of chronology. The student is expected to: 

   (A) place events in chronological order; and

   (B) use vocabulary related to time and chronology, including before, after, next, first, last, yesterday, today, and tomorrow.

While youngsters do need to begin to understand the concept of time, this antiquated method of teaching history leads to the view, voiced by many high school students, that history is "just one darned thing after another."

5) Social studies skills. The student applies critical-thinking skills to organize and use information acquired from a variety of valid sources, including electronic technology.  The student is expected to:

(A) obtain information about a topic using a variety of valid oral sources such as conversations, interviews, and music;

(B) obtain information about a topic using a variety of valid visual sources such as pictures, symbols, electronic media, print material, and artifacts;

(C) sequence and categorize information.

In most cases, the only history class(es) an elementary education major ever took in college was the freshman survey course, utilizing a textbook, which represents only one source.  How, then, is a non-history major to judge the validity of a source?  But, leaving aside that lack of qualification, kindergartners can't even READ yet.  How in the world are they to apply "critical-thinking skills" to ANYTHING?  In most cases, the teacher tells them what to think.

6) Social studies skills. The student communicates in oral and visual forms. The student is expected to:

    (A) express ideas orally based on knowledge and experiences

WHAT knowledge and experience?  They're 5 and 6 YEARS OLD!


7) Social studies skills. The student uses problem-solving and decision-making skills, working independently and with others, in a variety of settings. The student is expected to:

    (A) use a problem-solving process to identify a problem, gather information, list and consider options, consider advantages and disadvantages, choose and implement a solution, and evaluate the effectiveness of the solution; and

    (B) use a decision-making process to identify a situation that requires a decision, gather information, generate options, predict outcomes, take action to implement a decision, and reflect on the effectiveness of that decision.

How does someone who can't even tie his shoelaces do any of that?  

The public education establishment is very much like a Medieval Guild.  It makes up its own criteria for membership and sets the rules for entry.  Only those who undergo the required training and learn the secret handshake are considered qualified to impart knowledge.  As anyone who has dealt with teachers and administrators knows, it is also highly secretive.  The teacher and the school itself are allowed to determine what might, or might not, be "disruptive" to the learning process.  Parents who would like to sit in and audit what is being taught are viewed with suspicion.  Often such a request will be denied.

Parents or citizens who express concern are met with standard replies that are taught in many graduate Education Administration courses to deflect such concerns and maintain control.  They will be given such dodges as:

1) You’re too inexperienced to understand, or

2) That all experts disagree with your point of view, or

3) That scientific evidence proves you wrong, or

4) That you’re trying to impose your morals or values on others, or

5) That you’re the only person in the whole community who has raised that issue.

Now, all of these assertions may be completely false.  But most parents don’t have the time or resources to prove it.  School officials KNOW that.

For the past 85 years or so, the education establishment has religiously worked to convince the general public that it represents a profession, rather than a trade.  When he retired from the Presidency of Harvard in 1933, Abbott Lawrence Lowell told the Board of Trustees that Harvard's Graduate School of Education was "a kitten that ought to be drowned”.  In 2013, Harvard stopped conferring the Doctor of Education (EdD) degree.  The Board of Trustees finally acted on the recommendations of the rest of the faculty that this degree is not, and never was, the equivalent of the PhD.  Instead, it was created by the education establishment itself for those seeking the same stature as academic scholars.

What the general public doesn't know is that, at colleges and universities with "schools" or "departments" of Education, those departments are considered the trash dumps of the university by the rest of the faculty, when they think of them at all.

Hard data on education student qualifications have consistently shown their mental test scores to be at, or near, the bottom among all categories of students.  In 1952, the U.S. Army had college students tested for draft deferments during the Korean War.  More than half the students in the humanities, the social, biological and physical sciences, and mathematics passed, compared to only 27% of those majoring in Education.

In 1981, students majoring in education scored lower on both verbal and quantitative SATs than students majoring in art, music, theatre, all the sciences, mathematics, business, and health occupations.  In 1994, a review of the SAT scores of Education majors was done as part of a graduate school research project at Tarleton State University, which has an Education department.  A search of the Registrar’s computer records was conducted.  Student names were not part of the search criteria, only SAT scores by major in each department.  The earlier data was confirmed.  Education majors ranked in the lowest group, by mental and verbal ability, of all students in the university.

At the graduate level, it is much the same story, with students in numerous other fields outscoring education students on the Graduate Record Examination – by from 91 points composite to 259 points, depending on the field.  The pool of graduate students in education supplies not only teachers, counselors, and administrators, but also professors of education and those who speak for the Education establishment, advising such people as elected officials. 

Perhaps this is why, at some point in the past, the Texas legislature inserted a provision in the Local Government Code making teachers' college transcripts private.  As public employees, teachers' salaries are a matter of public record.  But parents have no way of determining what kind of student a teacher was in college.  Did they score so low on the SAT or ACT that they were required to take remedial classes before being allowed to do college work?  At least half of all students leaving Texas high schools today must do so.  Were they ever put on academic probation?  Did they start off declaring an academic major, only to switch to Education because they found the other discipline too hard?

Although local Boards of Education are still elected, this is a nostalgic throwback to the early 20th century.  How and what is taught in the public schools has been taken out of the hands of local officials and now rests with educational bureaucrats who have a vested interest in the status quo.