Friday, October 19, 2018

Democratic Socialism – Putting Lipstick on a Pig


Since the Progressive Era in America, the Left has struggled mightily to label itself in a way that makes its world view palatable to the American people.  Realizing early on that Karl Marx’s theory of “historical materialism” was simply not accepted by the majority of Americans, the Left has always sought to cloak its Marxist vision of economics so as to fool as many as it can.  First, they called themselves Progressives, then Social Democrats, then Liberals.  Now, they have become Progressives once more, with many of them proclaiming Democratic Socialism to describe the same fundamental world view.

According to the Democratic Socialists of America, “Democratic socialists believe that both the economy and society should be run democratically—to meet public needs, not to make profits for a few.”   This is a purposeful juxtaposition of a type of government with an economic system, designed to confuse the millions who either never took Civics in high school, or have forgotten what they learned.  Apparently, it is believed that so long as people get to vote, everything is fine.  Unfortunately, there are millions of suckers out there who believe it, too.

The trouble is, using that criterion as a measure of legitimacy, the old Soviet Union was “democratic”.  There were elections in the Soviet Union.  But, there was only one political party for which the people could vote – the Communist Party.




It has been recognized for at least 3,000 years that there are just a few types of government: Monarchy, Aristocracy, Democracy, Republic, and Communism.

Plato
(429-347 B.C.)

The best form of government in Plato’s “Ideal State” was Monarchy, a “Philosopher King”, assisted by an Aristocracy, the “Guardians”.  In this Ideal State there would be three classes of citizens - the producing class, the warrior class, and the ruling class, each performing its proper function.  He claimed that each man, by the nature of his talents, belonged to one or the other of these classes and that there should be no overlapping, no moving back and forth from one class to another, or belonging to more than one class.  If such things happened, Plato declared, it would be "the ruin of the State."  Plato's Philosopher King was a man whose wisdom and understanding were refined beyond those of other men.  In other words, while everyone is born with the ability to reason, in the end only a few go beyond the understanding of others.  This gives them the right and authority to rule.

In his later years, Plato turned from considerations of the ideal state to more practical political matters.  While he turned away from the imaginary, he still utilized rational analysis in his later writings.  In The Statesman, he set forth what has become the classical understanding of the possible forms of government.  They are rule by one, rule by a few, and rule by the many.

He further divided these into what he termed the legitimate and perverted models of these forms of government.  He considered monarchy as the legitimate form of rule by one.  Its perversion would be rule by a tyrant or tyranny.  An aristocracy would be rule by the few and oligarchy would be its perversion.  Interestingly enough, he termed rule by the many as democracy, but gave no distinctive names to either its legitimate or perverted forms.

Aristotle
(384-322 B.C.)

Aristotle elaborated on Plato's thinking regarding the forms of government and made distinctions between acceptable and unacceptable practices.  In theory, or ideally, Aristotle said that monarchy - rule by one - might be the best form of government.  Thus, he said, "If there be some one person, or more than one whose virtue is so pre-eminent that the virtues or the political capacity of all the rest admit of no comparison with his or theirs, he or they can be no longer regarded as part of the state.  Such a one may be truly deemed a God among men."

He doubted, however, that such a situation would occur except in very rare cases, or that if it did, it would be quite as good as might be imagined.  In reality, the wisest and best intentioned would need the counsel of other men as well as their assistance in ruling, and law would be preferable to personal rule.  Besides, the most likely result of rule by one would be tyranny, which is the perversion of rule by one.  It occurs when a man rules in his own interest and for his own purposes, rather than in the interest of those whom he governs.

Aristotle followed Plato in stating what he viewed as the legitimate and illegitimate forms.  The rule of the few, if it is a good government, Aristotle called an aristocracy.  It would be the rule of the best qualified men in the country who would be expected to rule in the best interest of all the people.  The perversion of aristocracy is oligarchy, which is the rule by the few in their own interest.  The main point he made was that oligarchies tend to keep power perpetually in the hands of a few who use the government as if it were their personal possession.  Over time they become tyrannical.

Rule by the many has the potential for being the best form of government, according to Aristotle.  More precisely, he believed that the best government would be one which included both the few - men of wealth and high intellect - as well as the many - including those from both the lower and middle ranks.

The middle class provided the best hope for good government.  He wrote that "It is plain, then, that the most perfect political community must be amongst those who are in the middle rank, and those states are best instituted wherein these are a larger and more respectable part; or, if that cannot be, at least greater than either of the other classes; so that being thrown into the balance it may prevent either scale from preponderating."

Since the majority of the populace would have some part in governing, the laws would be more readily obeyed.  Such a government would be termed a "polity" or constitutional government.  Interestingly, Democracy was the term Aristotle used to describe the perverted form of rule by the many.  He did not object to rule by the many so long as it was rule by law and moderated by thoughtful and experienced men.  However, the perversion occurs when "the multitude have the supreme power and supersede the law by the decrees.  This is a state of affairs brought about by demagogues."  In other words, rule by emotion.  Such a government acts not in what is good for the people of the country but what appeals to the worst inclinations of the people collectively.  Its tendency is toward mob rule.

Madison, Jefferson, Washington, Adams, et al

The impact of Greek thought and practice on the founding of the United States was more indirect than direct.  The U.S. was never composed of city-states.  The American idea of the rule of law was taken more from the Roman and British example than that of Athens.  The Roman influence on the political institutions and practices of the United States was very great.  Rome had a constitution going back to the Twelve Tables in 450 B.C. and forward through many changes in governmental arrangements until the very end of the Roman Republic.

The United States was styled a republic on the model of Rome and our Constitution provides that the states are guaranteed republican governments as well.  But, the Founders were also well aware that the Roman Republic disintegrated and Roman government reverted to the rule of one when Julius Caesar was declared dictator for life in 44 B.C.

They also tended to view Democracy, at least what is called “direct” democracy, in much the same way Aristotle had.  If all the people had an equal voice, without any method of controlling their passions, the majority could always vote away the rights of the minority.  This is why we were given a Representative Republic, with separation of powers and checks and balances, rather than a Democracy.  To call America a “democracy” is simply a glib and easy reference to the fact that people get to vote.  But, as stated previously, so could the people of the Soviet Union.

The last type of government, Communism, is not a government at all.  It describes a Utopian State where the nature of man has been “perfected” so that no government is needed.  Though the governments of the Soviet Union and China have been called Communist, they are no such thing, because a government did, and does, exist.

Capitalism and Socialism

Neither capitalism nor socialism is a type of government.  They are economic systems.  The purpose stated by the Democratic Socialists, to meet public need, is more a statement of Utilitarianism, the greatest good for the greatest number.  Utilitarianism is a moral theory, not a form of government.  But, as an economic system, socialism can only exist where there is Plato’s perverted rule of the few – Oligarchy, ultimately becoming autocratic and totalitarian.  That's the only way to keep the people in order.  Putting lipstick on this pig doesn't change its fundamental nature.

What is most ironic is that those who most favor "Democratic Socialism" are intellectuals. What they don't seem to realize is that, in every country that has imposed a socialist economy, the intellectuals are the first ones killed by those in power.  They should be careful what they wish for.


Monday, October 15, 2018

The Carnegie Unit and the Core Curriculum - Both Are Dead, But Refuse to Be Buried

For over 100 years, the system used to judge a student’s achievement, both in high school and in college, has been based on the Carnegie Unit.  In Texas since 1987, there has been a standard core curriculum, requiring every college student to take a minimum number of credit hours in particular courses in order to earn a degree from state funded institutions.  Are either still accurate measures of education in the 21st century?

In 1905, retired steel magnate and philanthropist Andrew Carnegie, then the world’s richest man, wrote a letter to college presidents declaring his intention to establish a pension system for “one of the poorest paid but highest professions in our nation”—college professors.    He created the Carnegie Foundation for the Advancement of Teaching to run the system and sent a ten million dollar check to the Foundation’s trustees, led by Harvard President Charles Eliot, to finance it.  Carnegie said, “I have reached the conclusion that the least rewarded of all the professions is that of the teacher in our higher educational institutions . . . I have, therefore, transferred to you and your successors, as Trustees, $10,000,000 . . . to provide retiring pensions for the teachers of universities, colleges and technical schools.” 

To determine which institutions were eligible to take part in the Carnegie pension system, the Foundation had to define what a college was.  To be ranked as a college, and thus be eligible to participate in the Carnegie pension plan, an institution “must have at least six professors giving their entire time to college and university work, a course of four full years in liberal arts and sciences, and should require for admission, not less than the usual four years of academic or high school preparation, or its equivalent.” 

Before long, the Carnegie Unit became the central organizing feature of the American educational enterprise, a common currency enabling countless academic transactions among students, faculty, and administrators at myriad public, nonprofit, private, and for-profit institutions, as well as between education policy makers at every level of government.   The Carnegie Foundation established the Carnegie Unit over a century ago as a rough gauge of student readiness for college-level academics.  It sought to standardize students’ exposure to subject material by ensuring they received consistent amounts of instructional time.  The problem is that, while the universal and portable hour may make for a more efficient system, the unit also promotes the false perception that time equals learning, in the same way for all students. This was never the intent when the Carnegie Unit was first created. 

But, Ernest L. Boyer, former Carnegie Foundation President and U.S. Commissioner of Education under President Jimmy Carter, didn’t ignore this false perception.  In 1993, Boyer said, “I am convinced the time has come to bury, once and for all, the old Carnegie unit. Further, since the Foundation I now head created this academic measurement a century ago, I feel authorized this morning to officially declare the Carnegie unit obsolete.” 

The fact that educators and legislators today, 25 years later, are still requiring “seat time” to measure student achievement is a testament to not only their lack of creative thought, but also to a certain degree of professional hubris.   The Education establishment has convinced lawmakers that each and every student must complete a certain number of credit hours in particular subjects or the nation will suffer dire consequences.

In 1987, the 70th Texas Legislature passed House Bill (HB) 2183, which established the first core curriculum legislation, with a general intent to ensure quality in higher education.   Senate Bill (SB) 148, passed by the 75th Texas Legislature in January 1997, repealed all earlier legislation and sought to resolve concerns regarding the transfer of lower-division course credit among Texas public colleges and universities, while maintaining the core curriculum as one of the fundamental components of a high-quality, undergraduate educational experience.  More recent sessions of the Texas Legislature have fine-tuned the existing laws regarding core curriculum, but the essentials of the statutes have not changed since 1997.

This presupposes that there is some certain body of knowledge that everyone needs to know and that elected officials are somehow capable of deciding what that knowledge must be.  The oldest example of this is the legislative requirement that, at state funded colleges and universities, all students must take 6 hours of Government and 6 hours of American History in order to receive a degree, regardless of their major.  What was the genesis of this 1955 requirement?

A look at the legislative history shows that it was a reaction to the “communist scare” of the late 40s/early 50s that resulted in such things as the House Un-American Activities Committee, the Hollywood Blacklist, and the antics of Senator Joseph McCarthy.  Apparently, it was believed that if students received a certain amount of instruction in those subjects, they would somehow be immune from the entreaties of communism.

In 2013, an effort was made to further codify the American History requirement.  Then State Senator Dan Patrick of Houston introduced a bill providing that only broad-based, survey courses could count toward the 6-hour requirement.  In testimony supporting HB 1938, numerous individuals cited their understanding of the 1955 law’s purpose - that American history be part of a common core for all to enhance students’ civic knowledge and citizenship skills.  Fortunately, that bill did not pass.  But, it is sure to come up again in future sessions, particularly since Patrick is now Lieutenant Governor.

But, does a broad-based survey course in History actually do that?  First of all, the textbooks for such courses are far too expensive and become door stops at the end of the semester, unless they can be sold back to the bookstore, usually for less than 1/3 the original price.  Secondly, survey texts do not reflect current scholarship.  They are not based on primary or secondary sources.  They are based on each other.  Finally, survey courses promote mindless memorization of facts that can be easily tested.  This represents the lowest order of learning and does not represent real knowledge.

Another salient question that must be asked is just what are “citizenship skills”, and precisely who is authorized to judge?  In 1955, those making the judgment were the same legislators who thumbed their noses at the 1954 Brown v. Board of Education decision and its 1955 mandate that school desegregation proceed “with all deliberate speed.”  Texas legislatures through the mid-60s steadfastly refused to comply with the Supreme Court’s ruling in that case and kept Jim Crow laws in effect in Texas.  Is that what is meant by good citizenship?  Is that the example that should be followed today?  Those who cite the 1955 law as somehow sacrosanct should think twice.  It might not be wise to hold up those 1955 legislators as paragons of wisdom and virtue.

Texas is actually one of the few states that require either History or Government by law.  In other states, these are electives for those not majoring in these disciplines.  If one is concerned about what an individual actually needs to know to be a good citizen, an objective measure is the test given by U.S. Citizenship and Immigration Services to aliens seeking U.S. citizenship.  Students need not sit through comprehensive survey courses to learn the basics of citizenship.  No employer is going to ask a job candidate to explain what he/she knows about the Haymarket Riot of 1886, or the Progressive Era, or how our system of checks and balances works.  Knowledge of history is important.  But, it’s not that important to everyone, regardless of what History professors may say.

Most often, this is expressed in course outlines and syllabi with language that paraphrases the famous quote by George Santayana, “Those who cannot remember the past are condemned to repeat it."  But, people live in the present.  Historians do not perform heart transplants, improve highway design, or arrest criminals.  History is important in that it offers a storehouse of information about how people and societies behave.  But, we shouldn’t continue to overemphasize its importance, or decide for others how much importance it should have in their lives.  Most importantly, we must recognize that “seat time” is not the true measure of how one can acquire the knowledge.

We’re living in the 21st century.  Today, we have communication technologies that were inconceivable in 1955, and when Boyer was writing in 1993.  We can deliver information anywhere, anytime, and to nearly anyone.  We can personalize instructional materials, teaching tools, and assessments.  Massive Open Online Courses (MOOCs), and other online courses that are slightly less massive and less open, may not be the future of education, but they are surely an example of how education is fast becoming more accessible, portable, and asynchronous.  Measuring student learning by “seat time” in this new educational era is obsolete, just as Boyer stated 25 years ago.



Friday, October 12, 2018

“Advance and Transfer” – Changing the Course of History Pedagogy

“Advance and Transfer” is a nautical term that describes what happens when a ship’s wheel is turned to a new course.  Due to its forward momentum, the boat or ship will continue on the previous course until the rudder ultimately causes a change in direction.  The larger the ship, the longer this transfer takes.  So it is with pedagogy in History.

In his 2006 JAH article “Uncoverage: Toward a Signature Pedagogy for the History Survey,” Dr. Lendol Calder quoted the late Roland Marchand: “Why are historians so incurious about learning?”  In speaking with public school history teachers, as well as professors at some local colleges, I’ve yet to meet anyone who is even aware of the Scholarship of Teaching and Learning (SoTL) in History and the work that’s been done over the past 12 years, researching how humans learn and, consequently, how they should be taught.  Despite some innovative attempts to depart from the “facts first” emphasis on content, such as the Amherst Project in the 1960s, History continues to be taught as it’s been taught for the last 125 years.

For public school teachers, this is somewhat understandable.  As Diane Ravitch pointed out, a large percentage of those teaching History are teaching out-of-field, defined as having neither majored nor minored in History in college.  Often, History teacher is spelled C-O-A-C-H.  “There appears to be the presumption that teaching history requires no special skills beyond the ability to stay a few pages ahead of the students in the textbook.”(1)

Even if they were inclined to keep up with current scholarship, they are often constrained by the directives of Boards of Education as to what must be taught and, in some cases, even how it is to be taught.  This, too, is understandable.  Individuals elected to such Boards were themselves taught that way.  History pedagogy is mired in 19th century assumptions and 20th century methods.  What were some of those 19th century assumptions?

In 1892, Harvard president Charles William Eliot headed up the Committee of Ten that standardized the curriculum which, in large measure, is still followed today in the public schools.  Starting with the view that there is a body of knowledge that everyone must know to be considered educated, the Committee’s recommendation was that "...every subject which is taught at all in a secondary school should be taught in the same way and to the same extent to every pupil so long as he pursues it, no matter what the probable destination of the pupil may be, or at what point his education is to cease."(2)

The key phrase there was the stricture that every subject be taught the same way.  Mathematics, the Sciences, and even English, with its grammatical rules, all rely on facts and lend themselves to rote memorization.  Neither the Pythagorean Theorem nor Newton’s Laws are based on interpretation.  In order to "do" Math, one must know that 2+2=4.

Another set of assumptions led directly to the imposition of compulsory school laws.  Contrary to popular thinking, compulsory schooling was not intended for education, in the sense of spreading literacy.  In fact, America was perhaps the most literate country in the world before the creation of government schools.  In 1812, Pierre DuPont published Education in the United States, a book in which he expressed his amazement at the high rate of literacy he saw here compared to Europe.  Forty years before the passage of our first compulsory school laws, DuPont found that fewer than 4 of every 1,000 people in the U.S. could not read and do numbers well.

Compulsory schooling was implemented for the purpose of social and behavioral control.  It is not coincidental that the first compulsory school law in Massachusetts came in 1852.  The late 1840s saw a huge influx of immigrants to the United States – the Irish, following the Potato Famine that began in 1845.

The cultural elites, especially in New England, viewed these newcomers with fear and suspicion.  In 1853, the Boston School Committee stated:

“The parent is not the absolute owner of the child.  The child is a member of the community, has certain rights, and is bound to perform certain duties, and so far as these relate to the public, Government has the same right of control over the child that it has over the parent…Those children should be brought within the jurisdiction of the Public Schools, from whom, through their vagrant habits, our property is most in danger, and who, of all others, most need the protecting power of the State.”(3)

Horace Mann, who had served as Secretary of the Massachusetts Board of Education, put it more succinctly, “With the old not much can be done; but with their children, the great remedy is education.  The rising generation must be taught as our children are taught.  We say must be, because in many cases this can only be accomplished by coercion.  Children must be gathered up and forced into schools and those who resist and impede this plan, whether parents or priests, must be held accountable and punished.”(4)

In essence, children would be considered the property of the State, at least in the eyes of the WASP elite.  An integral part of their being “taught as our children are taught” was the creation of an American Creed.  This led to what Bruce VanSledright described as collective memorialization of an American creation myth, the “freedom quest, nation building narrative” of American exceptionalism.  "In a nation that has built itself off the backs of waves of immigrants, the push to use history education to 'Americanize' the hordes of 'outsiders' lobbies incessantly.  To sow allegiance to the nation state requires constant maintenance."(5)  Then, the immigration laws began to be manipulated to control the flow of those outsiders.  Only those judged capable of being properly Americanized were allowed in.  Such manipulation continues to the present day.

Added to this perceived need to “Americanize” the millions of immigrants who came to the United States in the 19th century was the cataclysm of World War I.  As Sipress and Voelker pointed out, “The Great War helped trigger a wave of anxiety regarding the basic education necessary to sustain democracy in a world that had recently proven so menacing to peace and stability.”(6) This led to what has become something of a shibboleth, that the study of History is “essential for good citizenship in a democracy.”

This revealed wisdom seems to be uniquely American, however.  Both Great Britain and Germany are democracies.  Yet, in a search of the reasons cited for studying history at the University of London and Humboldt University of Berlin, one does not find this citizenship essentiality.  But, conceding for a moment that there is such an essentiality, it raises a number of questions.

How does force feeding students a litany of dates, facts, and names contribute to good citizenship?  What is a good citizen?  Who gets to decide?

In 1783, immediately following the Revolutionary War, George Washington wrote Sentiments on a Peace Establishment, setting forth his views on what peacetime military force would be needed to provide for the common defense.  In this document, he said, “It may be laid down, as a primary position, and the basis of our system, that every Citizen who enjoys the protection of a free Government, owes not only a proportion of his property, but even of his personal services to the defense of it.”

To George Washington, military service was essential for good citizenship. Thomas Jefferson, on the other hand, believed that such would lead to a large standing army.  And, he believed that the “yeoman farmer”, ready to serve when called upon, was more in keeping with a free society.  Here we have diametrically opposing views on what constitutes good citizenship.

When the United States entered World War I, the Army developed classification tests to determine assignments to various military occupations.  “Army Alpha” tested those who were fluent in English, while "Army Beta" was given to those who were illiterate or not yet fluent in English.  It was found that over 50% of American males tested could not read above a 4th grade level.  Most lived on farms and had never traveled more than 10 miles from home.

In World War II, military technology had advanced much farther than in the previous war.  Unlike wars in the past, men could not be taken from their villages, given a weapon, and shoved into the ranks to create an effective army.  With the rapid advances in technology, World War II would demand men with complex skills to operate and repair the weapons of war.  Millions of men began their classification by the Army by taking the Army General Classification Test (AGCT).

Despite an intervening 20 years of supposed progress in education, immediately prior to Pearl Harbor over 347,000 men who registered for Selective Service merely made marks on their registration cards due to their inability to sign their names.  The problem of illiteracy was so severe that the Army struggled to find a policy that would allow it to meet its manpower requirements while still maintaining the high level of efficiency needed to fight and win the war.  Despite all the "progressive" education of government schools, literacy in America went DOWN.

In both World Wars, thousands of American men could barely read. But, they served their country in its hour of need.  They certainly didn't need to "study" History to be good citizens.  In large measure, the GI Bill was enacted after World War II because of the low level of education that had existed at the start of the war.  It was realized that the advance of technology, particularly the technology of modern war, would require a higher level in the future.

One argument always put forth as a measure of good citizenship is voting. Voting is a matter of conscience.  But, it must be remembered that thousands of African American men either volunteered or were conscripted to fight in both world wars. They considered themselves good citizens.  Yet, they were denied the right to vote through such devices as the poll tax and literacy tests.  It took the passage of the Voting Rights Act in 1965, 20 years after the war, to correct this injustice.

Today, there are approximately 25,000 non-citizens serving in the American armed forces, with another 8,000 or so entering each year.  The services have a special program to help these men and women become citizens by the end of their first enlistment.  If they do not, they are not allowed to remain.  What do these non-citizens learn?  They learn the information necessary to pass the citizenship test given by the Federal government.  Are they to be considered “bad” citizens because they didn’t take the History survey course that’s part of the general education curriculum?

Most of the resistance to change comes from academia itself, a combination of hubris, laziness, and a desire for job security.  Elected officials, who exercise a great deal of control over education, act on the advice of those considered “experts” in their field, as if a PhD granted 40 years ago confers the wisdom of the ages.  I personally know professors who haven’t changed their lectures in at least 15 years.  Even the proposed adoption of a different History textbook brings opposition.

There is also a very strong ideological component to maintaining the status quo ante.  As James W. Loewen stated, “History can be a weapon.”(7)  By sticking to a “facts first” approach, those so inclined are willing and able to ensure a particular set of “facts”, and the interpretation of those facts, are taught.  In the Afterword of his A People’s History of the United States, the late Howard Zinn said, “What struck me as I began to study history was how nationalist fervor--inculcated from childhood on by pledges of allegiance, national anthems, flags waving and rhetoric blowing--permeated the educational systems of all countries, including our own.”(8)

However, in his zeal to ameliorate this nationalist fervor, Zinn committed what David Hackett Fischer termed the “converse fallacy of difference”(9), rendering a special judgment upon a group for a quality which is not special to it.  Two entire generations of American students have been indoctrinated to view Western Civilization in general, and America in particular, as the focus of evil in the world.  But, at least Zinn was open and above board in stating that he wrote from a particular ideological perspective.  Unfortunately, most textbook writers are not so honest.

The resistance to change stems mainly from a failure to consider the perspective of the students, following the Behaviorist theory of learning that dominated the 20th century.  This cast the teacher or professor as the font of all knowledge, with the mind of the student the empty vessel into which that knowledge would be “poured.”  Although this epistemological view has been replaced by Cognitive and Constructivist theories, the pedagogical course of the “ship” of History continues to Advance, resisting a Transfer to a new heading.

Although probably not consciously intended, the concept of “Backward Design”(10), developed by Grant Wiggins and Jay McTighe, follows the management method developed by W. Edwards Deming.  In essence, Deming advocated working backward from the customer, studying each process within a company, and making whatever changes are necessary to ensure the customer’s expectations are met.  Who are education’s “customers?”

In all cases, the students are the customers.  At the secondary school level, the expectations to be met also include those of the parents and other citizens who pay the taxes that fund the public schools.  All too often, however, the attitude of teachers and administrators is not to meet those expectations, but to tell the customers what their expectations should be.  At the college level, it’s even clearer that the students are the customers.  They are the ones paying tuition.

However, society at large can also be viewed as a customer.  Businesses must use the “product” being produced by the educational system.  Little thought has been given to what knowledge and skills are needed in the 21st century.  YouSeeU is a global leader in soft skill development for higher education and corporate training.  In 2015, YouSeeU published a white paper entitled Curbing Global Automation: Why Our Future Rests in Soft Skills.  These are the skills that cannot be replicated by machines:

Social Skills – The ability to get along with others
Communication Skills – Oral, written, non-verbal & listening skills
Higher Order Thinking Skills – Problem solving, critical thinking & decision making
Self-Control – Ability to control impulses & focus attention
Positive Self-Concept – Self-confidence, self-efficacy & self-awareness

The discipline of history particularly lends itself to developing the higher order thinking skills because that’s what historians DO.  However, the most significant change must be in what students think ABOUT.  Expecting 21st century students to “think critically” about the Salem witch trials is putting a square peg in a round hole.  Rather than having non-History majors remember who the Robber Barons were, it would be more appropriate to have them think about why they came to be called Robber Barons in the first place.  Did they possess no redeeming qualities at all?  What contributions to society did muckrakers like Ida Tarbell make, other than fanning the flames of envy?

As Fischer also pointed out 48 years ago, “there are no facts which everyone needs to know – not even facts of the first historiographical magnitude.  What real difference can knowledge of the fact of the fall of Babylon or Byzantium make in the daily life of anyone except a professional historian?”(11)

To keep students engaged, they must see the relevance of the material to their lives and their aspirations for the future.  Professors who continue to decide a priori what students should know, whether based on their own particular interests, on a belief that their CV grants them the right to do so, or simply because “that’s the way we’ve always done it”, are doing a grave disservice both to the students and to the larger society.

Endnotes:


1 Ravitch, Diane, “The Educational Background of History Teachers”, Knowing, Teaching & Learning History: National and International Perspectives, New York University Press (New York, 2000), p. 144
2 National Education Association of the United States, Committee of Ten on Secondary School Studies, Report of the Committee of Ten on Secondary School Studies: With the Reports of the Conferences Arranged by the Committee, published for the National Education Association by the American Book Co. (1894), p. 17
3 Nasaw, David, Schooled to Order: A Social History of Public Schooling in the United States, Oxford University Press (New York, 1979), p. 77
4 Ibid., p. 78
5 VanSledright, Bruce A., The Challenge of Rethinking History Education: On Practices, Theories, and Policy, Routledge (New York, 2011), pp. 21-22
6 Sipress, Joel and Voelker, David, “The End of the History Survey Course: The Rise and Fall of the Coverage Model”, OAH Journal of American History, March 2011, p. 1055
7 Loewen, James W., Teaching What Really Happened, Teachers College Press (New York, 2010), p. 12
8 Zinn, Howard, A People’s History of the United States, 1492-Present, Harper Collins (New York, 1980), p. 685
9 Fischer, David Hackett, Historians’ Fallacies: Toward a Logic of Historical Thought, Harper Torchbooks (New York, 1970), p. 223
10 Wiggins, Grant and McTighe, Jay, Understanding by Design (Alexandria, 1998), pp. 98-114
11 Fischer, op. cit., p. 311


Wednesday, October 10, 2018

The Absurdity of Public Education in Texas

In 2010, Texas became the laughingstock of the country as Americans witnessed the battle over rewriting the Texas Essential Knowledge and Skills (TEKS) that provide the blueprint for Texas textbooks—and standardized tests, teacher training standards, indeed, the entire curriculum.  TEKS is a gigantic “laundry list” approach to history, reflecting a methodology that focuses on “content mastery” as a measure of knowledge, as if simply being able to remember who (insert famous historical figure’s name here) was is necessary to go to college, get a job, or simply live a happy, productive life.

As the Dallas Morning News reported in September 2014, the ongoing debate over what are called “social studies skills” reflects a culture war, with conservatives on the State Board of Education maintaining that “social studies education has been too heavily influenced by left-leaning relativistic ‘empowerment’ rhetoric that reduces American history to a roll call of oppressed minorities and a sneering parade of robber barons.”

On the other side of the argument are those who charge that the texts “exaggerate the role of religion in the nation’s founding; deliberately downplay the separation of church and state; and, shamefully, minimize the disparities imposed by segregation.”

Adding to the absurdity is the fact that the problem-solving and decision-making skills outlined in the TEKS as being important for high school seniors are, word-for-word, the exact same standards held out for kindergartners.  As Dr. Keith Erekson put it, “I will leave it to the late-night comedians to identify the careers best suited for Texas high school graduates with kindergarten-level problem-solving skills, but I will say that the phrase ‘All I really need to know I learned in Kindergarten’ makes a far better bumper sticker than educational philosophy.”

To illustrate this absurdity, here are a few of the TEKS set down for Kindergarten students.  Keep in mind, we're talking about 5- and 6-year-olds, who cannot go to the bathroom in school without an adult.  In the Social Studies curriculum, these children are supposed to learn:

1) Students identify the role of the U.S. free enterprise system within the parameters of this course and understand that this system may also be referenced as capitalism or the free market system.  Children at this age don't even know how to count money.  How are they to understand capitalism or free market economics?

2) Students identify and discuss how the actions of U.S. citizens and the local, state, and federal governments have either met or failed to meet the ideals espoused in the founding documents.  This requires a significant degree of reasoning skill.  Even high school students have difficulty with such reasoning.

3) History. The student understands how patriots and good citizens helped shape the community, state, and nation.  The student is expected to:

   (A) identify the contributions of historical figures, including Stephen F. Austin, George Washington, Christopher Columbus, and José Antonio Navarro who helped to shape the state and nation; and

   (B) identify contributions of patriots and good citizens who have shaped the community.

This begins what Bruce VanSledright has termed "collective memorialization".  "In a nation that has built itself off the backs of waves of immigrants, the push to use history education to 'Americanize' the hordes of 'outsiders' lobbies incessantly.  To sow allegiance to the nation state requires constant maintenance."  It is indoctrination instead of education.  It introduces the idea that there are people, other than the child's own parents, who have the right and ability to identify "patriots" and what constitutes a "good" citizen.  But, even "bad" citizens have shaped the nation.   

4) History. The student understands the concept of chronology. The student is expected to: 

   (A) place events in chronological order; and

   (B) use vocabulary related to time and chronology, including before, after, next, first, last, yesterday, today, and tomorrow.

While youngsters do need to begin to understand the concept of time, this antiquated method of teaching history leads to the view, voiced by many high school students, that history is "just one darned thing after another."

5) Social studies skills. The student applies critical-thinking skills to organize and use information acquired from a variety of valid sources, including electronic technology.  The student is expected to:

(A) obtain information about a topic using a variety of valid oral sources such as conversations, interviews, and music;

(B) obtain information about a topic using a variety of valid visual sources such as pictures, symbols, electronic media, print material, and artifacts;

(C) sequence and categorize information.

In most cases, the only history class(es) an elementary education major ever took in college was the freshman survey course, utilizing a textbook, which represents only one source.  How, then, is a non-history major to judge the validity of a source?  But, leaving aside that lack of qualification, kindergartners can't even READ yet.  How in the world are they to apply "critical-thinking skills" to ANYTHING?  In most cases, the teacher tells them what to think.

6) Social studies skills. The student communicates in oral and visual forms. The student is expected to:

    (A) express ideas orally based on knowledge and experiences

WHAT knowledge and experience?  They're 5 and 6 YEARS OLD!


7) Social studies skills. The student uses problem-solving and decision-making skills, working independently and with others, in a variety of settings. The student is expected to:

    (A) use a problem-solving process to identify a problem, gather information, list and consider options, consider advantages and disadvantages, choose and implement a solution, and evaluate the effectiveness of the solution; and

    (B) use a decision-making process to identify a situation that requires a decision, gather information, generate options, predict outcomes, take action to implement a decision, and reflect on the effectiveness of that decision.

How does someone who can't even tie his shoelaces do any of that?  

The public education establishment is very much like a Medieval Guild.  It makes up its own criteria for membership and sets the rules for entry.  Only those who undergo the required training and learn the secret handshake are then considered qualified to impart knowledge.  As anyone who has dealt with teachers and administrators knows, it is also highly secretive.  The teacher and the school itself are allowed to determine what might, or might not, be "disruptive" to the learning process.  Parents who would like to sit in and audit what is being taught are viewed with suspicion.  Often such a request will be denied.

Parents or citizens who express concern are met with standard replies that are taught in many graduate Education Administration courses to deflect such concerns and maintain control.  They will be given such dodges as:

1) You’re too inexperienced to understand, or

2) That all experts disagree with your point of view, or

3) That scientific evidence proves you wrong, or

4) That you’re trying to impose your morals or values on others, or

5) That you’re the only person in the whole community who has raised that issue.

Now, all of these assertions may be completely false.  But most parents don’t have the time or resources to prove it.  School officials KNOW that.

For the past 85 years or so, the education establishment has religiously worked to convince the general public that it represents a profession, rather than a trade.  When he retired from the Presidency of Harvard in 1933, Abbott Lawrence Lowell told the Board of Trustees that Harvard's Graduate School of Education was "a kitten that ought to be drowned”.  In 2013, Harvard stopped conferring the Doctor of Education (EdD) degree.  The Board of Trustees finally acted on the recommendations of the rest of the faculty that this degree is not, and never was, the equivalent of the PhD.  Instead, it was created by the education establishment itself for those seeking the same stature as academic scholars.

What the general public doesn't know is that, at colleges and universities with "schools" or "departments" of Education, those departments are considered the trash dumps of the university by the rest of the faculty, when they think of them at all.

Hard data on education student qualifications have consistently shown their mental test scores to be at, or near, the bottom among all categories of students.  In 1952, the U.S. Army had college students tested for draft deferments during the Korean War.  More than half the students in the humanities, the social, biological and physical sciences, and mathematics passed, compared to only 27% of those majoring in Education.

In 1981, students majoring in education scored lower on both verbal and quantitative SATs than students majoring in art, music, theatre, all the sciences, mathematics, business, and health occupations.  In 1994, a review of the SAT scores of Education majors was done as part of a graduate school research project at Tarleton State University, which has an Education department.  A search of the Registrar’s computer records was conducted.  Student names were not part of the search criteria, only SAT scores by major in each department.  The earlier data was confirmed.  Education majors ranked in the lowest group, by mental and verbal ability, of all students in the university.

At the graduate level, it is much the same story, with students in numerous other fields outscoring education students on the Graduate Record Examination by margins ranging from 91 to 259 composite points, depending on the field.  The pool of graduate students in education supplies not only teachers, counselors, and administrators, but also professors of education and those who speak for the Education establishment, advising such people as elected officials.

Perhaps this is why, at some point in the past, the Texas legislature inserted a provision in the Local Government Code making teachers' college transcripts private.  As public employees, teachers' salaries are a matter of public record.  But, parents have no way of determining what kind of student a teacher was in college.  Did they score so low on the SAT or ACT that they were required to take remedial classes before being allowed to do college work?  At least 1/2 of all students leaving Texas high schools today must do so.  Were they ever put on academic probation?  Did they start off declaring an academic major, only to switch to Education because they found the other discipline too hard?

Although local Boards of Education are still elected, this is a nostalgic throwback to the early 20th century.  How and what is taught in the public schools has been taken out of the hands of local officials and handed to educational bureaucrats who have a vested interest in the status quo.