Strong Nuclear Family Is Crucial To Nation’s Financial Stability

Our contemporary immersion in political correctness and assumed “rights” regarding the basic building block of society has, over the past few decades, steadily eroded not only our sociological strength but our economic viability as a country. The fundamental significance of the family unit, and the hard data evidencing the undeniable importance of the intact nuclear family, have been ignored; and the longer we pander to bad public policy grounded in political correctness, the more rapidly our society will degenerate.

A few years ago, drawing heavily from government data and peer-reviewed sociological and economic research, Robert I. Lerman and W. Bradford Wilcox published an extensive research report, “For Richer, For Poorer,” through the American Enterprise Institute and the Institute for Family Studies, confirming the fundamental role the intact nuclear family plays in society. Lerman is a professor of economics at American University and a senior fellow at the Urban Institute in Washington, D.C.; Wilcox is a professor of sociology at the University of Virginia.

Their executive summary states: “All the latest evidence confirms that the institution of marriage is a key to productive adulthood, the cornerstone of a stable family, and the basic unit of a healthy community. Its effects go well beyond the married couple. It shapes our whole society, from workforce participation to economic inequality to the effectiveness of education. Children raised by married parents have better odds of succeeding in school, excelling at work, and building a stable relationship of their own.”

Drawing from Department of Labor data, they showed that American families experienced an average 80% increase in real income from 1950 to 1979. Family income inequality was relatively low, and more than 89% of prime working-age men were employed. All of those trends have since reversed and are accelerating to the downside, with the composition and structure of the family playing the most crucial role in the reversal.

In 1980, married parents headed 78% of households with children. By 2012, that had dropped nearly 20%. The researchers, again relying on hard primary data, showed why that was significant. “Married families enjoy greater economies of scale and receive more economic support from kin, and married men work harder and earn more money than their peers, all factors that give them an economic advantage over cohabiting and single-parent families.”

The economic impact on individual family units, as well as on society as a whole, cannot be overstated. Even adjusting for race, education, and other factors, had the share of married parents remained at 78% through 2012, “the rise in the overall median income of parents would have been about 22%, substantially more than the actual growth of 14%.” And adjusting as well for post-1979 immigration, which came mostly from low-income countries, “growth in median family income would have been 44% higher than 1980 levels.” They therefore conclude that the decline in the share of “married-parent families with children largely explains the stagnancy in median family incomes since the late 1970s.”
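
To see what those percentages imply in dollars, here is a minimal arithmetic sketch in Python. The 14% actual and 22% counterfactual growth rates are the study's; the baseline income figure is purely hypothetical, chosen only to make the comparison concrete:

```python
# Illustrative arithmetic only: the growth rates are the study's,
# but the baseline dollar figure below is hypothetical.
baseline_1979 = 60_000        # hypothetical 1979 median parental income (2012 dollars)

actual_growth = 0.14          # actual growth in median parental income, 1979-2012
counterfactual_growth = 0.22  # growth had the married-parent share stayed at 78%

actual_2012 = baseline_1979 * (1 + actual_growth)                  # $68,400
counterfactual_2012 = baseline_1979 * (1 + counterfactual_growth)  # $73,200

gap = counterfactual_2012 - actual_2012
print(f"Median income gap attributable to family structure: ${gap:,.0f}")  # $4,800
```

On these hypothetical numbers, the retreat from marriage costs the median family roughly $4,800 a year, which illustrates the authors' point on a household scale.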

Traditional nuclear family units, consisting of a mother, a father, and children, have proven more viable by almost every sociological measure. As the researchers explain: “Family structure appears to matter for children’s well-being because, on average, children growing up without both parents are exposed to: more instability in housing and primary caretakers, which is stressful for children; less parental affection and involvement; less consistent discipline and oversight; and fewer economic resources.”

Sociologists Sara McLanahan and Gary Sandefur, in summarizing their research on family structure, put it this way: “If we were asked to design a system for making sure that children’s basic needs were met, we would come up with something quite similar to the two-parent ideal. Such a design, in theory, would not only ensure that children had access to the time and money of two adults; it also would provide a system of checks and balances that promoted quality parenting.”

Lerman and Wilcox summarize: “The research to date leads us to hypothesize that children from intact, married families headed by biological or adoptive parents are more likely to enjoy stability, engaged parenting, and economic resources and to gain the education, life experiences, and motivation needed to flourish in the contemporary economy—and to avoid the detours that can put their adult futures at risk.”

Many of the forces negatively affecting the family are cultural and can be attributed to the gradual, yet accelerating, erosion of social mores. But many of the destructive contributors are driven by governmental policy, statute, and legal code, such as the IRS “marriage penalty” and welfare programs that facilitate the abdication of parental responsibilities. And some are couched in principles, espoused in the name of political correctness, that defy empirical data; the most egregious of these is the redefinition of marriage, the cornerstone of the family unit, which only further dilutes and weakens the building block of society.

The viability of the American family is crucial to the survival of the republic, not only sociologically but financially. Each of us either contributes to, or detracts from, the soundness of the familial units comprising our society. We must do our part not only within our own familial microcosms, but also electorally, electing and supporting those who favor governmental policy that strengthens the family unit and who don’t buckle to political correctness in redefining our societal building blocks.

Easter: A Celebration Of Hope And Rebirth

For Christians worldwide, Easter is a celebration of the resurrection of Jesus the Christ. While the theological doctrines associated with Christ’s crucifixion, death, and resurrection are a matter of faith, the attestation of primary accounts makes Jesus’ emergence from the tomb a matter of historical record. And many of the contemporary symbols associated with Easter date back centuries and represent elements of this most holy of events in the life of Jesus of Nazareth.

To a historian, primary sources are the bedrock for validating or invalidating events or individuals averred to be historical. Princeton University’s History Department defines a primary source as “a document or physical object which was written or created during the time under study. These sources were present during an experience or time period and offer an inside view of a particular event.”

Primary sources regarding the life of Jesus of Nazareth are plentiful. The eyewitness accounts of four contemporaries are recorded in the four Gospels, the first four books of the New Testament. There are also secular sources attesting that Jesus lived at the time, including the Roman historians Tacitus and Suetonius and the Jewish historian Josephus.

As a quantitative matter, there are more primary sources confirming the reality of Jesus of Nazareth than there are for the Roman leader Julius Caesar. Yet to my knowledge, no serious historian of antiquity questions whether Julius Caesar really lived. Validating this point, F. F. Bruce, Rylands Professor of Biblical Criticism and Exegesis at the University of Manchester, wrote, “The historicity of Christ is as axiomatic for an unbiased historian as the historicity of Julius Caesar.” The historian Will Durant indicated that, to the best of his knowledge, “no Jew or Gentile from the first century ever denied the existence of Jesus.”

One of the most prolific classicists of our era, Michael Grant, wrote: “In recent years, ‘no serious scholar has ventured to postulate the non-historicity of Jesus,’ or at any rate very few, and they have not succeeded in disposing of the much stronger, indeed very abundant, evidence to the contrary.” New Testament scholar Richard Burridge makes the same point: “There are those who argue that Jesus is a figment of the Church’s imagination, that there never was a Jesus at all. I have to say that I do not know any respectable critical scholar who says that any more.”

The public death of Christ by crucifixion is also broadly accepted as historical fact. New Testament scholar James D. G. Dunn wrote of that event, together with the account of Jesus’ baptism, that those “two facts in the life of Jesus command almost universal assent.” Jesus’ public crucifixion is likewise referenced by the secular historians Josephus and Tacitus.

Primary accounts of Jesus’ resurrection, however, are exclusively non-secular. Matthew, Mark, Luke, and John’s accounts of Jesus’ life, death, and subsequent resurrection were canonized. Yet they were written, and widely promulgated, at a time when contemporaries could have discredited them had they been perceived as fabricated or in error. F. F. Bruce confirms this: “Had there been any tendency to depart from the facts in any material respect, the possible presence of hostile witnesses in the audience would have served as a further corrective.”

Most of the original apostles died ignominious and horrible deaths as a direct result of their avowed faith in Jesus as Messiah. They died as martyrs for their convictions and testimony regarding the risen Christ. It is wholly unfathomable that they would have accepted a martyr’s death for a story they knew to be no more than a fable. That, according to tradition, eleven of them, twelve including Paul of Tarsus, did so attests to the veracity of their witness. They forever sealed their testimonies with their blood.

Our contemporary Easter iconography is, quite literally, colorful, starting with the Easter bunny. Rabbits are widely known as prolific procreators, and in some ancient cultures they symbolized new life and fertility. The Easter bunny first arrived in America in the 1700s, courtesy of German immigrants, who perpetuated their tradition of an egg-laying hare called “Osterhase,” or “Oschter Haws.” German children would build nests where the hare could lay its colored eggs; these later became simply decorated baskets for the multicolored eggs.

The egg itself represents new life. For Christians, Christ emerging from the tomb is symbolic of newborn life exiting an eggshell. Coloring and decorating eggs, according to some sources, dates back to the 13th century, undoubtedly with some pagan influence.

The timing of the Christian world’s Easter celebration is somewhat enigmatic to many, since it is observed anywhere from March 22 to April 25. Because the resurrection of their Lord occurred after the Passover, early Christians wanted Easter always to follow that Jewish feast, whose date is set by a lunisolar calendar. The short explanation, roughly speaking, is that Easter is celebrated on the first Sunday following the Paschal Full Moon, the first ecclesiastical full moon on or after March 21.
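
For readers curious how that rule becomes an actual date, the Gregorian calculation (the “computus”) can be expressed compactly. Here is a minimal sketch in Python of the widely published Anonymous Gregorian algorithm; it is offered as an illustration of the rule, not as an official ecclesiastical table:

```python
from datetime import date

def easter(year: int) -> date:
    """Western (Gregorian) Easter, via the Anonymous Gregorian algorithm."""
    a = year % 19                         # position in the 19-year lunar (Metonic) cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30    # roughly, days from March 21 to the Paschal Full Moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days from the full moon to the following Sunday
    m = (a + 11 * h + 22 * l) // 451      # correction for certain lunar-cycle edge cases
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

for y in (2024, 2025, 2038):
    print(y, easter(y))   # 2024-03-31, 2025-04-20, 2038-04-25 (the latest possible date)
```

The outputs always land between March 22 and April 25, exactly the window noted above.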

Whether celebrated for its theological implications or its secular treats, Easter represents new life and resurrection, as the Northern Hemisphere springs to life following the dreary, darker, and shorter days of winter. The symbolism can mean as much or as little as one desires, but it traditionally links back to rebirth and new life. How we respond to the symbolism, and to the day itself, is wholly up to each of us.

The late Cardinal Basil Hume said of Easter, “The great gift of Easter is hope.” And in a world of so much ugliness, evil, and negativity, we all need all the hope we can get.

Multiculturalism’s Denigration of Western Values

America has a rich history as a melting pot of cultures, ethnicities, and religions. Those who have come here over the past couple of hundred years have sought a better life through the freedoms and liberties assured by our Constitution and through the free enterprise system that fosters their “pursuit of happiness.” They brought their cultures, customs, and languages with them, but they became Americans: they learned English, learned our customs and conventions, and became acculturated to the American way. America is great in large part because of the diversity of our people and the cultural richness they brought here. But multiculturalism has become much more than that, and it is now more destructive than ameliorative to American culture.

If multiculturalism held to its stated goal, which was primarily to facilitate understanding of and respect for other cultures, it would add “seasoning” to our melting pot by encouraging our young people to compare and contrast, and then eclectically assimilate, the best of all cultures. Instead, it has become an assault on Western values and a vehicle for distinctly anti-American ideologies.

Thomas Sowell, Senior Fellow at the Hoover Institution at Stanford University, has said, “What ‘multiculturalism’ boils down to is that you can praise any culture in the world except Western culture – and you cannot blame any culture in the world except Western culture.”

Roger Kimball of The New Criterion has written, “Wherever the imperatives of multiculturalism have touched the curriculum, they have left broad swaths of anti-Western attitudinizing competing for attention with quite astonishing historical blindness.” Multiculturalism has led to the historical revisionism that paints Christopher Columbus as a nefarious European who initiated the transformation of a supposedly paradisiacal Western Hemisphere into the evil, corrupt America of today.

It is multiculturalism that keeps many university literature and English majors from studying Shakespeare because he was a “sexist and racist white man.” It is also the underlying principle at work in revising history, including the historical roots of our contemporary observance of Thanksgiving and the acknowledgement of the Christian principles prevalent at the time of our founding. Multiculturalism, in its extreme, is at the root of the removal of references to Christ from the public square and public schools, even at the time we celebrate His birth; for one characteristic of the movement is that it is distinctly anti-Christian.

As convoluted as it may seem, Al Gore was perhaps inadvertently correct when, in the 2000 presidential campaign, he translated E Pluribus Unum as “out of one, many,” instead of the other way around. Multiculturalism in its extreme form seeks to divide rather than unify, the opposite of what Jefferson and Franklin intended when the motto was emblazoned on the Great Seal of the United States.

A poll by the Pew Research Center a few years ago indicated that only 55% of Hispanics, living either legally or illegally in this country, consider themselves Americans. Another poll of Muslims in Los Angeles County indicated that only 10% of them consider themselves to be Americans. It seems the hyphenation of Americans is another social and cultural divider, rather than a unifier. A hyphenated American is just another symptom of political correctness.

Multiculturalism in its extreme weakens community bonds and reduces the motivation for new immigrants to participate in the common culture, the shared history, and the common language of America (English).

The American concepts of freedom of expression, religion, human rights, liberty, and democracy are distinctively Western values. As historian Arthur Schlesinger, Jr. has said, “These are European ideas, not Asian, nor African, nor Middle-Eastern ideas, except by adoption. There is surely no reason for Western civilization to have guilt trips laid on it by champions of cultures based on despotism, superstition, tribalism, and fanaticism.”

The pejorative aspects of multiculturalism have contributed alarmingly to a Balkanization of America, where differences are the focus instead of common values and ideals, and where culture and ethnicity divide us rather than seasoning our melting pot and enriching the entire culture.

President Theodore Roosevelt put the concepts of multiculturalism in perhaps the best context, although the term was of course unknown when he wrote on the subject in 1919. He declared, “In the first place, we should insist that if the immigrant who comes here in good faith becomes an American and assimilates himself to us, he shall be treated on an exact equality with everyone else, for it is an outrage to discriminate against any such man because of creed, or birthplace, or origin. But this is predicated upon the person’s becoming in every facet an American, and nothing but an American…There can be no divided allegiance here. Any man who says he is an American, but something else also, isn’t an American at all. We have room for but one flag, the American flag… We have room for but one language here, and that is the English language… and we have room for but one sole loyalty and that is a loyalty to the American people.”

As long as multiculturalism is an end in itself (or worse, a means to diminish Western values and our history, and to divide and weaken our country), we will continue to decline as a culture, losing those distinctively American traits that once made the nation unique. As it diminishes our value system, erodes our cultural strengths, and rewrites our history, what it means to be an American may be forever changed.

Be Informed And Watch Government ‘Like A Hawk!’

It’s inevitable that citizens will often feel frustrated with their elected officials. After all, it’s impossible to please all of the people all of the time; and officials who seem to are likely not doing their job. But one sentiment is likely universal among constituents of all ideological persuasions: our elected officials work for us and represent us and our interests, and they should never forget that they serve on our behalf.

Every once in a while, something in our popular culture captures such universally felt sentiments. Such was the case with the 1984 film Protocol, starring Goldie Hawn (mother of actress Kate Hudson).

In the film, Hawn plays a lovable yet somewhat ditzy Washington, D.C., waitress who happens to save the life of a visiting Middle Eastern emir. For her heroism, the State Department rewards her with a job in its Protocol Division, then initiates a scheme to marry her off to the emir whose life she saved, in exchange for a new military base to be constructed in the emir’s country.

When the plan unravels and comes to light, Sunny (Hawn’s character) is hauled before a congressional committee to answer for her involvement in the scheme, which has been affectionately dubbed “Sunnygate.” Her response is classic, and it reminds us all of some of our responsibilities as American citizens.

As the committee chairman begins the hearing, he declares his intent to find out who was responsible. Sunny responds, “I’m responsible!” She then explains why. “You want to know something? Before I worked for the government, I’d never read the Constitution. I didn’t even begin to know how things worked. I didn’t read the newspaper, except to look up my horoscope. And I never read the Declaration of Independence. But I knew they had, the ones we’re talking about, the experts, they read it. They just forgot what it was about. That it’s about ‘We, the People.’ And that’s ME. I’m ‘We, the People.’ And you’re ‘We, the People.’ And we’re all ‘We, the People,’ all of us.

“So when they sell me that ten cent diamond ring or down the river or to some guy who wears a lot of medals, then that means they’re selling ALL of us, all of ‘We the People.’ And when YOU guys spend another pile of money and when you give away or sell all those guns and tanks, and every time you invite another foreign big shot to the White House and hug and kiss him and give him presents, it has a direct effect on ‘We the People’s’ lives.

“So if we don’t, I mean if I don’t know what you’re up to, and if I don’t holler and scream when I think you’re doing it wrong, and if I just mind my own business and don’t vote or care, then I just get what I deserve. So now that I’m a private citizen again, you’re going to have to watch out for me. ‘Cause I’m going to be watching all of you. Like a hawk.”

There are some notable principles embedded in that inspiring response. First is the concept of personal responsibility. How often do we see people, whether in public life or in their personal lives, fail to take responsibility for their actions, or refuse to stand up to those who are ultimately culpable? It’s becoming as uncommon as common sense. Someone, or something, else is always to blame for poor decisions, bad plans, or ill-spoken words. And regrettably, this seems most obvious in the realm of government, where all too few feel accountable to the electorate for their actions.

Next, Sunny reminded us that it’s our responsibility as citizens to be knowledgeable and proactive. If we let our elected officials get away with things that are unconstitutional or illegal, we’re at least partly to blame. After all, collectively, we are the ones who put them in their positions of responsibility; and they are, or at least should be, accountable to us.

That’s one of the beauties of the American governance model: we hire them to protect us and our interests and our rights as citizens. If we’re not proactive, they can increasingly feel like they’re accountable to no one, least of all us. When they start feeling entitled to their perks of office, and taking us, their employers, for granted, they’ve outlived their usefulness; and it’s time to retire them.

Such proactivity will only be efficacious if we know our founding documents well enough to understand the proper role of government, and if we keep ourselves apprised of what our government attempts to do for, and to, us. Too many of us are illiterate when it comes to our founding documents and don’t bother to stay informed about what those in government are doing. I think this is what Winston Churchill was referring to when he said, “The best argument against democracy is a five-minute conversation with the average voter.”

I think FDR would have approved of Sunny’s response to the congressional panel; for FDR himself said, “Let us never forget that government is ourselves and not an alien power over us. The ultimate rulers of our democracy are not a President and senators and congressmen and government officials, but the voters of this country.”

It’s unusual to garner anything substantive from the movies, so a speech like Sunny’s before a congressional committee stands out rather starkly. Although she’s a fictional character, Sunny represents what should be the best in all of us as citizens: educating ourselves, keeping informed, and watching our elected officials “like a hawk!”

Healthiest State Economies Are Right To Work States

This past week, Wisconsin became the 25th state in the union to enact so-called “right to work” legislation. Despite the pejorative light often cast on right to work (RTW) laws, all they do is prohibit making union membership, or the payment of union dues, a condition of employment.

Unions often view laws removing compulsory union membership for work in the private sector as “anti-union,” while advocates of right to work laws maintain it’s a matter of personal liberty and economic freedom. They argue that workers in given trades or industries should have the option to choose whether to join a union or not. Arguably, if a union is doing a good job representing the interests of its members, it should not be threatened by the freedom to choose, as the benefits of union membership would be self-evident.

Even some union leadership supports such a sentiment. Gary Casteel, the Southern region director for the United Auto Workers, explains: “This is something I’ve never understood, that people think right to work hurts unions. To me, it helps them. You don’t have to belong if you don’t want to. So if I go to an organizing drive, I can tell these workers, ‘If you don’t like this arrangement, you don’t have to belong.’ Versus, ‘If we get 50 percent of you, then all of you have to belong, whether you like to or not.’ I don’t even like the way that sounds, because it’s a voluntary system, and if you don’t think the system’s earning its keep, then you don’t have to pay.”

One cannot be a student of history without recognizing the tremendous contributions unions made to the emergence of the middle class in early- to mid-20th-century America. They significantly improved working conditions, workweek hours, and compensation levels.

In today’s highly competitive economy, their focus appears to have changed: unions are now primarily political entities, with compulsory dues used mostly to amass power in the political arena and spent on candidates and causes to which some members may object. Even Bob Chanin, former top lawyer for the National Education Association, admitted as much in his farewell speech a few years ago. “It’s not about the kids…it’s about power,” he said.

According to Department of Labor statistics, only about 7% of America’s private sector workforce is unionized, down from nearly 40% in the post-World War II era. The trend is reversed for public employees: 60 years ago, less than 10% of the public workforce was unionized, while today the figure is nearly 37%. Logic leads one to surmise that maybe all those “evil corporations” have gotten it right and are providing pay and benefits at levels that satisfy employees, while the same logic suggests it is “evil government” that takes advantage of employees, who must therefore be represented through collective bargaining.

Average wages do tend to be slightly lower in right to work states, as The Wall Street Journal reported last year, but the difference may be attributable to other factors. As the Journal explained, “Many economists say when differences in cost of living are taken into account, wages are roughly the same—or even higher—in right-to-work states.” A map of non-right to work states suggests that geography and cost of living largely account for the distinction.
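
A minimal sketch of that cost-of-living adjustment, in Python; the wages and index values below are invented for illustration and are not the Journal's data:

```python
# Hypothetical example: a nominally lower RTW wage can be higher in real terms
# once a state's cost of living is taken into account.
nominal_wages = {"non_rtw_state": 52_000, "rtw_state": 48_000}   # hypothetical annual wages
col_index     = {"non_rtw_state": 1.12,   "rtw_state": 0.96}     # 1.00 = national average

for state, wage in nominal_wages.items():
    real_wage = wage / col_index[state]   # deflate the nominal wage by local prices
    print(f"{state}: nominal ${wage:,} -> adjusted ${real_wage:,.0f}")

# non_rtw_state: nominal $52,000 -> adjusted $46,429
# rtw_state:     nominal $48,000 -> adjusted $50,000
```

On these invented figures, the $4,000 nominal advantage of the non-RTW state reverses after adjustment, which is exactly the pattern the economists describe.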

Last year, the National Institute for Labor Relations Research released a detailed study of right to work vs. non-right to work states, based upon data from the Bureau of Labor Statistics, the United States Census Bureau, the United States Patent and Trademark Office, and the Bureau of Economic Analysis. Five economic factors were analyzed for right to work and non-right to work states in the Midwest, with the following statistical conclusions:

Job growth is twice as strong in RTW states. The percentage growth of non-farm private sector jobs (1995-2005) in right to work states was 12.9%, while non-right to work states came in at 6.0%.

Perhaps surprising to some, poverty is actually higher in non-right to work states. The average poverty rate, adjusted for cost of living, was 8.5% in RTW states, and 10.1% in non-right to work states. This may likewise have more to do with geography and cost of living factors, however.

New company and new product growth is significantly greater in RTW states. During that same period, annual percentage growth in patents granted was 33% in RTW states, and only 11% in non-right to work states.

Income growth rates are higher in RTW states as well. The percentage growth in real personal income was 26.0% in RTW states, while non-right to work states grew at 19.0%.

Even health insurance coverage in RTW states fared better (note that this data was gathered before the implementation of Obamacare). The percentage growth in the number of people covered by employment-based private health insurance was 8.5% for RTW states, and 0.7% for non-right to work states.

Consequently, based on the National Institute for Labor Relations Research data, right to work states create more private sector jobs, enjoy lower poverty rates, see more technology development, realize more personal income growth, and add more people to employment-based private health insurance. Clearly, when looking at the big picture, a state’s economy is more likely to be robust when the workforce has the freedom to choose.
