
Credentialing vs. Educating


ISSUE: Spring 2004

In addition to their other major expenses, some North American nuclear families bear the cost of a four-year college or university degree for the family’s child or children. The cost has become as necessary as the cost of a car, and for a similar reason: without it, access to a remunerative job is difficult or even impossible. It has long been recognized that getting an education is effective for bettering oneself and one’s chances in the world. But a degree and an education are not necessarily synonymous.

Credentialing, not educating, has become the primary business of North American universities. This is not in the interest of employers in the long run. But in the short run, it is beneficial for corporations’ departments of human resources, the current name for personnel departments. People with the task of selecting successful job applicants want them to have desirable qualities such as persistence, ambition, and the ability to cooperate and conform, to be a “team player.” At a minimum, achieving a four-year university or college degree, no matter in what subject, seems to promise these traits. From the viewpoint of a government agency’s or corporation’s department of human resources, the institution of higher learning has done the tedious first winnowing or screening of applicants. For the applicant, this means that a résumé without one or more degrees from a respected institution will not be taken seriously enough even to be considered, no matter how able or informed the applicant may be.

The credential is not a passport to a job, as naive graduates sometimes suppose. It is more basic and necessary: a passport to consideration for a job. A degree can also be a passport out of an underclass, or a safety strap to prevent its holder from sinking into an underclass. Without it, as North American high school students are forever being warned, they will be doomed to a work life of “flipping hamburgers.” With it, all manner of opportunities may be accessible.

University credentialing thus efficiently combines the services to employers that in simpler and more frugal days were provided by First Class or Eagle rank in the Boy Scouts, and the services to aspiring climbers that in olden days were provided by a College of Heralds with its monopoly on granting the coats of arms that separated their possessors from the underclass. A coat of arms didn’t really certify that its possessor could wield a bow or a battle-ax. That wasn’t the point. Students themselves understand perfectly well what they are buying with four years of their youth and associated tuition and living costs. While a degree in some subject has become indispensable, one in a field with a currently promising job market and good pay is thought to be even better; thus student enrollment statistics have become an unofficial appendix to stock market performance. In the summer of 2002, when Internet and other high-tech stocks had gone into the doldrums, the Washington Post surveyed enrollment figures in undergraduate computer science departments in the Washington, D.C., area; it reported:

At Virginia Tech, enrollment of undergraduates in the computer science department will drop 25 percent this year to 300. At George Washington University, the number of incoming freshmen who plan to study computer science fell by more than half this year… . In 1997, schools with Ph.D. programs in computer science and computer engineering granted 8,063 degrees… . [T]he numbers rose through 2001 when 17,048 degrees were awarded… . Nine hundred of the 2,000 plus undergraduates studying information technology and engineering at George Mason University were computer science majors last year. This year the enrollment in that major is down to 800, although a newly created and more general information technology major has attracted 200 students… . “Having it ease off for a while is a bit of a relief,” said a [George Mason] dean. “Particularly with the field as it has been, they don’t want to spend four years on something and then not get a job.”

The two students whose comments were included in the newspaper’s report, apparently as representative of student thinking, advanced somewhat different reasons for shifts from earlier plans. One, who was switching to an unspecified engineering major, said he wanted to do something “more social and more interesting than working with computers. . . . Besides, you can’t get the chicks with that anymore.” The other, who was switching to business marketing, said, “Technology comes natural to people my age. It’s not fascinating anymore.” In the meantime, the Post reported, the U.S. Department of Labor was contradictorily projecting that “software engineering will be the fastest growing occupation between 2000 and 2010 with other computer-related industries trailing close behind.”

All universities possess their own subcultures, and so do departments within universities, varying to the point of being indifferent or even antagonistic to one another, so a generalization cannot describe all accurately. But it is safe to say that credentialing as the primary business of institutions of higher learning got under way in the 1960s. Students were the first to notice the change. In the unrest and turbulence of that decade, one thread of complaint came from students who claimed they were shortchanged in education. They had expected more personal rapport with teachers, who had become only remote figures in large, impersonal lecture halls. The students were protesting attempts to transmit culture that omitted acquaintance with personal examples and failed to place them on speaking terms with wisdom.

In another decade, however, students dropped that cause, apparently taking it for granted that credentialing is the normal primary business of institutions of higher learning and that its cost is an unavoidable fee for initiation into acceptable adulthood. If a student takes out a loan to meet the expense, he or she may reach early middle age by the time the loan is paid off. The guarantee behind the loan is the valuable credential itself.

“College degree worth millions, survey finds,” my morning paper tells me in July 2002. Every summer for years readers have been given similar tidings, buttressed by statistics, sometimes from government, sometimes from universities themselves. The survey in this case had been made by the U.S. Census Bureau, which reported, the paper said, that “someone whose education does not go beyond high school and who works full time can expect to earn about $1.2 million between ages 25 and 64… . Graduating from college and earning advanced degrees translate into higher lifetime earnings: an estimated $4.4 million for doctors, lawyers and others with professional degrees; $2.5 million for college graduates,” that is, those with a bachelor’s degree.

At this point in the news report, a policy analyst (presumably with a degree to validate the title) working for the American Council on Education, identified as “a higher education advocacy group,” chimed in with the moral: “Not all students look at college as an investment, but I’m sure their parents do. The challenge is to convince those high school students on the margins that it is really worth their time to go to college.” The survey found that men with professional degrees may expect to earn almost $2 million more than “women with the same level of education,” a difference attributed to the time out that women take to bear and rear children.

The trends in the United States have been followed in Canada, with the usual time lag. A forum panelist in Toronto, asked by a troubled parent, “When did we decide to change the way we thought about public education?” replied in an essay published in 2003: “Today’s youngsters have had it drummed into their heads that a post-secondary education is the key to a good job… . [It] is no longer considered as an investment that society makes in the next generation; it is seen as an investment that students make in themselves.” The panelist/essayist assigned the start of the Canadian change to the late 1980s, tracing it to a decline, then and through the 1990s, in public funding for universities and colleges even as their enrollments grew from 15 percent of high school graduates in 1975 to 20 percent in 2001, with educators and legislators expecting the figure to reach 25 percent in the near future.

Expansion of first-rate faculty—memorable teachers of the kind the 1960s student protesters were mourning—has not kept pace with expansion of enrollments and courses offered; professors lack the time and energy they could once devote to personal contact with students. Slack has been taken up by what became known as “gypsy faculty,” lecturers hoping for permanent appointments as they move from university to university, and by graduate students as part of their apprenticeships. So many papers to mark, relative to numbers and qualities of mentors to mark them, changed the nature of test papers. Some came to consist of “True or false?” and “Which of the following is correct?” types of questions, fit for robots to answer and to rate, rather than stimulants and assessments of critical thinking and depth of understanding.

In the meantime, rejoicing that university education has become a growth industry, administrators and legislators seek increasingly to control problems of scale by applying lessons from profit-making enterprises that turn expanded markets to advantage by cutting costs. Increased output of product can be measured more easily as numbers of credentialed graduates than as numbers of educated graduates. Quantity trumps quality.

Community colleges that grant two-year diplomas in applied arts and sciences represent a midstation in life, like a second-class ticket in the traditional European transportation arrangement of first and second classes and third class or steerage. Two-year community colleges supply the economy with technicians of many kinds for hospitals and clinics, draftsmen for architectural and engineering firms, designers of graphics, lighting, and costumes for television shows, expositions, and plays, and many, many other skilled workers and craftsmen. Community colleges have typically maintained an admirably close connection between education and training and the diploma credential. But this, too, is now on the verge of a transformation into credentialing disconnected from education. In my home province of Ontario, Canada, a few community colleges have already promoted themselves into “an elite level” by gaining government licenses to grant four-year degrees that will upstage diplomas. The push for this change came from community college administrators, although they were divided about its desirability. Some feared it would “compromise access.” One, who applauded it, argued that his institution needed a degree-granting license “in order for our college to compete in a sophisticated economy where a degree is the currency of the realm.”

To put it crassly, first-class, elite-level tickets cost more than second-class tickets. “Undergrad Tuition Fees up 135% over 11 Years,” shouts a headline over a 2002 analysis by Statistics Canada, the country’s census bureau. Cuts in government funding have caused budgets to fall far behind fifteen years of compounded inflation, despite cost cutting. Appended to the newspaper report of the cost increases was a comment by the chair of a national student federation, who noted, “It’s no longer just the poorest of the poor who are denied. It’s creeping up the income ladder.”

As currency of the realm, credentials are attractive to counterfeiters. So it is not surprising to learn that “experts” (credentials unspecified) estimate that 30 percent of job applicants concoct false résumés, or that a former mayor of San Francisco, when told that his chief of police had lied about his college degrees, pooh-poohed the revelation with the comment, “I don’t know who doesn’t lie on their resumes.” It is surprising, however, to learn that captains of industry give in to the temptation. After an unsuccessful career as an executive at General Motors, the successful chief executive of Bausch & Lomb, a venerable and respected maker of lenses and eye-care products, was shown not to possess a master of business administration degree, as claimed in his biographical materials. His competence was affirmed by his corporation’s board, and neither he nor the company suffered from the revelation, except for a sharp but temporary drop in the company’s stock. Other executives have been less fortunate. The chief executive of Veritas Software was fired for falsely claiming an MBA from Stanford University, and others have been publicly embarrassed. One told the press, “At some point I probably felt insecure and it perpetuated itself.” The Bausch & Lomb president, standing on his dignity, wins the arm’s-length prize: “I’m embarrassed,” he told the press, “that some of this incorrect information appeared in some of our published materials on my background. Clearly, it’s my obligation to proofread such things carefully and ensure their accuracy.”

Credentialing is an indirect legacy of the Great Depression of the 1930s. Along with much else in North American culture, credentialing’s origins and appeal cannot be understood without harking back to the Depression years. The physical and financial hardships of America in the years 1930–39 were mild in comparison with hardships endured in the 20th century by societies that suffered famine, genocide, ethnic cleansing, oppression, bombing raids, or defeat in war. The Depression, however, exerted a lasting influence on Americans, out of all proportion to its short duration and relatively mild ordeals. Nobody understood what was happening when jobs and savings vanished and stagnation settled on the United States and Canada. Even now, some seventy years later, economists continue to dispute the Depression’s causes. Mass unemployment was the single greatest disaster. At its worst, it idled some 25 percent of workers in the United States and Canada, and higher percentages in hard-hit localities. When one considers all the others who directly and indirectly depended on those workers, unemployment or its effects touched almost everyone other than the exceptionally rich or sheltered. Government make-work and semiwelfare programs, some of them admirably ingenious and constructive, helped but did not cure, and they had their own insecurities and humiliations.

Some people spent most of the Depression years standing in lines for a chance at a temporary job, for delayed pay from bankrupt companies, for lost savings in failed banks, for bowls of soup or loaves of day-old bread. One sees the anxious rows of pinched faces in photographs of the time. Also in photographs one sees rallies of protesters with their signs, quailing before mounted police and raised billy clubs. Often with incredible bravery, and always with incalculable expenditures of time, scrimped savings, and hopes, protesters devoted themselves to political activities that they had convinced themselves would be beneficial. Quieter involvement with intellectual schemes, like technocracy, social credit, and EPIC (End Poverty in California, the platform of Upton Sinclair’s unsuccessful campaign for election as that state’s governor), was a comfort to many. Others busied themselves politically with combat against those who espoused Marxism, Trotskyism, or other radical politics. Some of these, too, and their opponents turn up in photographs of sessions of the U.S. House of Representatives Committee on Un-American Activities.

However, most attempts at living through the Depression are not documented in photographs at all; they were only very lightly touched on in films, and almost as lightly in music. People who weren’t used to being idle and unwanted tried to keep busy somehow; but even jobs at no pay, valued for learning and experience, were hard to get. Architects made jigsaw puzzles and renderings of ghastly, inhuman utopian cities and sold them if customers could be found. I got a job without pay, for a year, on the Scranton, Pennsylvania, morning paper. It was my journalism school.

For individuals, the worst side effect of unemployment was repeated rejection with its burden of shame and failure. Many quietly despaired that the world had a place for them. This hopelessness, at the time, seemed endless. Would life always be like this? Something unfathomable, without visible cause, had engulfed everyone’s expectations and plans. For someone in her teens or early twenties, as I was during this time, it wasn’t really so bad. My friends and I could make stories out of our rejections and frugalities and the strange people we met up with in our futile searches, and could bask in the gasps or laughs we generated. It was harder for people in their thirties, who had gotten launched (they thought) in careers that so soon came to nothing. For people in their forties or fifties, rejections and idleness could be devastating. The parents of some of my friends never recovered ease with themselves, their families, or society after this demoralizing break in their lives. It was harder on men who had been family breadwinners than on women who devoted themselves to homemaking and child-rearing, as most did after marriage.

My father, a doctor, worked long hours, seven days a week, and in spite of weariness, he stayed in good spirits because he was needed and, especially, because his work interested him. But like everyone else, he worried about getting by. In our little city of Scranton, where the chief industry was mining expensive, high-grade anthracite coal, the Great Depression was intensified because, in effect, it had started four years early with a long and bitter coal strike and subsequent loss of markets.

Few of my father’s patients were able to pay him as the effects of mass unemployment spread. He told me one Saturday evening in 1936 that he had to earn $48 a day merely to pay for his office rent, his subscriptions to medical journals, office supplies, and the salary of his assisting nurse. To me that seemed an incomprehensibly formidable sum; I was earning $12 a week in New York as a stenographer in a candy manufacturing company that soon went into bankruptcy. He expressed relief as he told me about the $48; that day, he had broken even, thanks to twelve hours of hard, concentrated work in his office and the hospitals where he was a visiting physician and surgeon. He was not unique. Countless Americans who thought of themselves as the backbone of the country kept doing their work, regardless of the struggle, and helped hold things together.

When the stagnation lifted, at first tentatively in 1938–39 as the armament economy clicked in, and then in full force in 1942 after America entered the war, the change was miraculous. It was too late for my father, who had died in 1937. Everyone knew it was ghoulish to delight in jobs and prosperity at the price of war; nevertheless, everyone I knew was grateful that suddenly good jobs and pay raises showered like rain after a drought. It seemed that the world did need us, and had places for us.

After the war was over, during the euphoria of victory and the minor booms of the Marshall Plan and the Korean War, a consensus formed and hardened across North America. If it had been voiced, it would have gone something like this: We can endure meaningful trials and overcome them. But never again—never, never—will we suffer the meaningless disaster of mass unemployment. Cultures take purposes for themselves, cling tenaciously to them, and exalt them into the purposes and meanings of life itself. For instance, in ancient Rome the ideal of service to the state was the overriding cultural purpose. After the republic was succeeded by the empire, Virgil added a slightly new spin, in a passage of the Aeneid cited with reverence by the emperor Augustus: “Your task, Roman, is this: to rule the peoples. This is your special genius: to enforce the habits of peace, to spare the conquered, to subdue the proud.” In medieval Western Europe and in early colonial Puritan America, the purpose of life, which had been reshaped during the Dark Age, became the salvation of souls, one’s own and others’, for the Christian Kingdom of Heaven.

In the founding period of the United States, a time when the Copernican, Newtonian, and Cartesian Enlightenment had succeeded both medievalism and the Renaissance, the cultural purpose became Independence. Not for nothing was the charter of reasons behind the war of separation from Britain called the Declaration of Independence, and July fourth called Independence Day. An accompanying cult developed around Liberty, as symbolized by both the Liberty Bell and the aims of the French Revolution. Independence and Liberty were succeeded by the related Freedom, indeed by two conflicting versions of freedom: the political freedom of states’ rights, offshoot of Independence, and the social freedom of abolition of slavery, offshoot of Liberty. In the decades after the Civil War and the bloodletting that seemed briefly to resolve the conflict between concepts of freedom, there was no obvious American cultural consensus on the purpose of life, although there were contenders, such as the Manifest Destiny of America’s push westward, which had already risen to its height in the 1840s with the Mexican War, the annexation of Texas, and the purchase of California and New Mexico. Manifest Destiny was extended at the turn of the century to the Caribbean and the Pacific with the Spanish-American War and Theodore Roosevelt’s subsequent presidency, an extension taken by Americans to mean American rule over the Western Hemisphere.

The start of the 20th century and the decades immediately before and after were a time of reforming ferment as Americans sought to perfect their society by eliminating child labor, extending the vote to women, combating corruption and fraud, embracing public health measures and their enforcement, prohibiting the sale of alcohol, outlawing monopolies as restraints on trade, initiating environmental conservation through national parks (a favorite of Theodore Roosevelt), improving working conditions and protecting the rights of labor, and many other practical reforms into which their proponents threw themselves with ardor as great as if each of these aims were indeed the purpose of life.

The reforming spirit carried into the Great Depression years, with President Franklin Roosevelt’s promotion of the Four Freedoms, linking economic aims (freedom from want) to human rights (freedom from fear), and his practical measures for making the links tangible, among them his successful advocacy of collective bargaining under the Robert F. Wagner proposals that became the National Labor Relations Act and his institution of a regulatory Securities and Exchange Commission (SEC), making rules for public corporations’ disclosures and reining in speculative manipulations in corporate stocks. Eleanor Roosevelt, Franklin’s wife, for her part, channeled her lifelong experience with smallish reform movements into advocacy of the United Nations and, most notably, into that body’s formulation and acceptance of a declaration of universal human rights, her chief legacy and monument. Among all these and other contenders for the American purpose of life, one seemed to win out, less with fanfare than with simple, quiet acceptance: the American dream, the ideal that each generation of whites, whether immigrant or native-born, was to become more successful and prosperous than the parent generation.

From the 1950s on, American culture’s gloss on the purpose of life became assurance of full employment: jobs. Arguably, this has remained the American purpose of life, in spite of competition from the Cold War with the Soviet Union, and maybe even from the War on Terrorism, in which postwar reconstruction is being linked with contracts for American companies and hence jobs for Americans.

How does a culture reveal its concept of the purpose of life? One telling sign is that a cultural purpose enables perpetrators and witnesses to regard horrific deeds as righteous. Republican Rome’s defeat of Carthage and its people—Virgil’s cant notwithstanding—was as gruesome a scene of murder of helpless and innocent people as has been recorded; it was deemed glorious by Romans because they construed it as a righteous act of preemptive protection for the state. Looting and massacres by 16th-century Spanish conquistadors in South and Central America were justified by the same cultural drives for salvation of souls that justified the labors, sacrifices, and risks of Spanish missionaries. The aggressions of crusading soldiers and kings against Muslim “infidels” in the Middle East and Christian heretics in France; the tortures and executions in Europe by Catholic inquisitors and Protestant witch-hunters; the persecutions and forced conversions of Jews; the oppressions by Puritans in Britain and New England—these and other deeds that created hell on earth were all righteously justified as defeats of the devil and salvation of souls.

In 1956, when Congress passed legislation funding the Interstate Highway System—a government program then unprecedented in America for its vast physical scope and vast cost—the ostensible reason for the program was to allow residents and workers to evacuate cities and towns speedily and efficiently in case of emergency (a Roman type of purpose). However, memories of the Great Depression were sufficiently fresh for everyone to recognize instantly the real and serious purpose of the program: full employment, a guarantee of jobs—jobs building roads; jobs designing, manufacturing, servicing, and repairing automobiles; jobs refining and transporting oil and filling gasoline tanks. President Dwight D. Eisenhower himself acknowledged this purpose in his remarks about the automobile as a mainstay of the economy and employment, when he spoke at the George Washington Bridge in New York at a celebration as the highway program was getting under way.

To settle on the auto industry as the instrument for achieving jobs, the grand cultural purpose of life, was so apt that it may have been all but inevitable. Automobiles, for those who could afford them, were loaded with references and romances from earlier American purposes and meanings of life: independence, freedom, the success of getting a better car than one’s parents could afford—moving up from a Pontiac to a Buick. Nobody recognized and approved the job-making purpose of the highway system more heartily than mayors and other elected officials. The shortsighted destruction of community in America was easily trumped by the righteousness of full employment.

Conflicts between highway building and community values—or any other values—set a pattern that has persisted after memories of the Great Depression have faded. To foreigners, it seems inconsistent that America promotes globalization of trade yet also gives subsidies to American agriculture that sorely hurt poor African economies; claps tariffs on Canadian sawn lumber and Brazilian steel; and hangs border security costs on Canadian exports that competing products made in America do not bear when they cross the border. To American trade negotiators and lobbyists, however, there is no inconsistency in contradictory policies that, each in its own way, are calculated to promote jobs for Americans. What is inconsistent about that?

Any institution, including a government agency, that is bent upon ecological destruction or an outrage on the built environment argues its case or bullies its opponents by righteously citing the jobs that supposedly will materialize or, even more effectively, the jobs that may be forfeited or jeopardized if the ugly deed is not done. To this day, no alternative disaster, including possible global warming, is deemed as dire a threat as job loss. At a time in 2002 when Canada’s Arctic was unmistakably melting—and unexpectedly rapidly, too—the premier of Ontario was asked whether he would support the Canadian prime minister in his professed intent to ratify the Kyoto accord on reduction of climate-warming fossil-fuel emissions, or would instead follow the lead of U.S. President George W. Bush in repudiating the accord. His reply: “We’re not going to put ourselves at a huge disadvantage and cost Ontarians hundreds of thousands of jobs … while our American neighbors to the south—God bless them—are not doing something about reducing their emissions.” That is the Great Depression speaking, on both sides of the border. As exalted cultural purposes of life go, a job for everyone is less brutal and deluded than most cultural ideals. But as my grandmother used to say, “You can run anything into the ground.”

Whether jobs have been succeeded by the War on Terrorism as the American purpose of life is still unclear. The swift surrender of entitlements to a speedy trial, to protection against being held without legal counsel or charges, and to privacy, along with, in the case of captured combatants, the abrogation of the Geneva Conventions on treatment of prisoners of war, argues that the exigencies of outmaneuvering putative terrorists have overridden other values, including economic prudence. As Margaret Atwood has pointed out, the surrender of civil rights is “a recipe for widespread business theft … and fraud.” Perhaps we must wait for new arrangements for control of Middle Eastern oil and reconstruction of Afghanistan and Iraq to learn whether the purpose of American life has actually switched away from providing jobs and earning profits.

It has been truly said that the past lives on in the present. So it is with credentialing’s origins. It emerged partly out of America’s humiliation and worry when the Soviet Union, with its Sputnik, had beaten America into space, and partly from the still-fresh dread of the Depression. Credentialing emerged, mostly in California at first, in the late 1950s, when it dawned upon university administrators there that modern economic development, whether in the conquest of space or any other field, depended on a population’s funds of knowledge—a resource that later came to be known as human capital. It followed that development’s most culturally valuable product—jobs—also depended on knowledge. The administrators were quite right, and it was brainy of them to reason that the more of this crucial resource their institutions could nurture and certify, the better for all concerned.

Initially there was no conflict between this aim and the quality of the education that administrators expected their institutions to supply. That conflict began to arise in the 1960s, partly out of universities’ attempts to take on many new tasks at once as they engaged with the communities that supported them. Under the civic banner of the “multiversity,” they aimed at furthering every good thing they set their abundant intellects to. Far from elevating credentialing above educating, they were sweepingly enlarging the idea of educating to embrace whatever skills seemed needed, from cost-benefit analysis to marketing. Administrators surely did not recognize how much these enlarged ambitions, coupled with the promise of riches to society from credentialed graduates, would change universities themselves.

As always in a culture, everything that happened connected with everything else. In this case, multiversity educational expansion had connections with a constructive U.S. government program for war veterans.

After World War II and then the Korean War, the government provided tuition and encouragement for veterans who had the desire and qualifications to attend universities or colleges. Tens of thousands of former GIs, many from families in which nobody had ever before been given a chance at higher education, took advantage of this opportunity. On the whole, the veterans were noted for applying themselves more seriously than students just out of high school. They also swelled student enrollments. When the stream of GI students ran dry, their hunger for education was missed in university communities, along with their government-guaranteed tuitions. Credentialing emerged as a growth industry in the 1960s, just when universities needed it to address problems of their own.

The more successful credentialing became as a growth industry, the more it dominated education, from the viewpoints of both teachers and students. Teachers could not help despairing of classes whose members seemed less interested in learning than in doing the minimum work required to get by and get out. Enthusiastic students could not help despairing of institutions that seemed to think of them as raw material to process as efficiently as possible rather than as human beings with burning questions and confusions about the world and doubts about why they were sinking time and money into this prelude to their working lives.

Students who are passionate about learning, or could become so, do exist. Faculty members who love their subjects passionately and are eager to teach what they know and to plumb its depths further also exist. But institutions devoted to respecting and fulfilling these needs as their first purposes have become rare, under pressure of different necessities. Similar trends in Britain have begun to worry some educators there. My impression is that university-educated parents and grandparents of students presently in university do not realize how much the experience has changed since their own student days, nor do the students themselves, since they have not experienced anything else. Only faculty who have lived through the loss realize what has been lost.
