The story

How many Western Roman rump states were there?


During the late 5th and early 6th centuries AD, many Roman officials defended parts of the declining Empire and thereby became rulers of their own petty kingdoms and, in some cases, city states.

For example: Syagrius, who ruled the Kingdom of Soissons; Apollinaris Sidonius; Vincentius; Desiderius; Burdunellus; Peter; and Arbogast.

These local rulers established Roman rump states between the territories of the Germanic kingdoms in the late 5th and early 6th centuries AD.

One of these local rulers was Peter, who ruled the city of Dertosa in 506 AD.

My questions are: 1) How many local rulers were there at the time of the Fall of the Roman Empire in the late 5th and early 6th centuries?

2) What was the last existing Roman rump state to be conquered by one of the Germanic kingdoms?

There were several Berber/Roman rump states in North Africa.

One was the Mauro-Roman Kingdom in North Africa from about 429-578.

In one inscription King Masuna described himself as "Rex gentium Maurorum et Romanorum" ("King of the Moorish and Roman Peoples"), which indicates his realm was a sort of Roman rump state.

In another rump state, the Kingdom of the Aures, King Masties ruled from about 426-494 or 449-516; an inscription claims that he ruled for 67 years as a Dux (military leader or duke) and for 40 or 10 of those years as Emperor of "Romans and Moors", a title that reminds me of the Bulgarian title "Emperor of the Bulgarians and the Romans" or of Stefan Dushan's title "Emperor of the Serbs and the Romans".

Some of those Roman-Berber rump states may have been conquered by the Kingdom of the Vandals. The Roman Empire reconquered much of North Africa in the Vandalic War in 533.

In the 570s King Garmul of the Mauro-Roman Kingdom attacked Roman Africa and in 577-579 he was defeated and killed. Part of the Mauro-Roman Kingdom was annexed to the Roman Empire and other parts became as many as eight successor kingdoms which might be considered Roman rump states to some degree.

These states were gradually conquered and/or converted to Islam during the invasions of North Africa by the Caliphate from about 647 to 698.

A Queen Dihya (died c. 700) was a famous leader of resistance to the Muslim invaders of Africa.

And of course there were a number of Romano-British states in Britain after 411. The invaders of Britain didn't complete their conquest of those Romano-British successor states until the English conquest of Gwynedd in 1282/83.

The last such rump state to fall would appear to be the territory that became Wessex, with the defeat and death of a British king named Natanleod by the Saxons Cerdic and Cynric in 519 CE.

At some point one must draw a line and claim that all semblance of "Roman rule" has ended, and all that are left are local warlords of no significance. Where that line is drawn is always a matter of opinion. As Britannia was a single province, I chose to draw the line when its wealthiest and most significant region had fallen, so that no remaining warlords had any semblance of a claim to "Roman legitimacy".

Italy in the Late Middle Ages

By the Late Middle Ages (circa 1300 onward), Latium, the former heartland of the Roman Empire, and southern Italy were generally poorer than the north. Rome was a city of ancient ruins, and the Papal States were loosely administered and vulnerable to external interference such as that of France, and later Spain. The papacy was affronted when the Avignon Papacy was created in southern France as a consequence of pressure from King Philip the Fair of France. In the south, Sicily had for some time been under foreign domination, by the Arabs and then the Normans. Sicily had prospered for 150 years during the Emirate of Sicily, and later for two centuries during the Norman Kingdom and the Hohenstaufen Kingdom, but had declined by the late Middle Ages.

Origin of the Papal States

In the 4th century, the bishops of Rome and the Catholic Church acquired lands around the city and governed them as the Patrimony of St Peter. In the 5th century, the Western Roman Empire collapsed, and the Eastern Empire was so weakened that it could not control the entire territory. The population turned to the Catholic Church and the popes for protection and aid. Immigrants began settling on the land acquired by the church around Rome because it was much safer than other parts of the former Empire. In the 8th century, the Eastern Roman Empire could no longer protect Italy from invaders, prompting Pope Gregory II to break ranks with the empire. Pope Gregory III succeeded him and established self-governing rule over all lands owned by the Catholic Church, thus creating the Papal States.

Michael Hudson

The Delphic Oracle Was Their Davos: A Four-Part Interview With Michael Hudson: (Part 3)

By John Siman, who is also the author of Part 1 and Part 2 in this series

John Siman: It seems that unless there’s a Hammurabi-style “divine king” or some elected civic regulatory authority, oligarchies will arise and exploit their societies as much as they can, while trying to prevent the victimized economy from defending itself.

Michael Hudson: Near Eastern rulers kept credit and land ownership subordinate to the aim of maintaining overall growth and balance. They prevented creditors from turning citizens into indebted clients obliged to work off their debts instead of serving in the military, providing corvée labor and paying crop rents or other fees to the palatial sector.

JS: So looking at history going back to 2000 or 3000 BC, once we no longer have the powerful Near Eastern “divine kings,” there seems not to have been a stable and free economy. Debts kept mounting up to cause political revolts. In Rome, this started with the Secession of the Plebs in 494 BC, a century after Solon’s debt cancellation resolved a similar Athenian crisis.

MH: Near Eastern debt cancellations continued into the Neo-Assyrian and Neo-Babylonian Empires in the first millennium BC, and also into the Persian Empire. Debt amnesties and laws protecting debtors prevented the debt slavery that is found in Greece and Rome. What modern language would call the Near Eastern “economic model” recognized that economies tended to become unbalanced, largely as a result of buildup of debt and various arrears on payments. Economic survival in fact required an ethic of growth and rights for the citizenry (who manned the army) to be self-supporting without running into debt and losing their economic liberty and personal freedom. Instead of the West’s ultimate drastic solution of banning interest, rulers cancelled the buildup of personal debts to restore an idealized order “as it was in the beginning.”

This ideology has always needed to be sanctified by religion or at least by democratic ideology in order to prevent the predatory privatization of land, credit, and ultimately the government. Greek philosophy warned against monetary greed [πλεονεξία, pleonexia] and money-love [φιλοχρηματία, philochrêmatia] from Sparta's mythical lawgiver Lycurgus to Solon's poems describing his debt cancellation in 594 BC and the subsequent philosophy of Plato and Socrates, as well as the plays of Aristophanes. The Delphic Oracle warned that money-love was the only thing that could destroy Sparta [Diodorus Siculus 7.5]. That indeed happened after 404 BC when the war with Athens ended and foreign tribute poured into Sparta's almost un-monetized regulated economy.

The problem, as famously described in The Republic and handed down in Stoic philosophy, was how to prevent a wealthy class from becoming wealth-addicted, hubristic and injurious to society. The 7th-century "tyrants", followed by Solon in Athens, banned luxuries and public shows of wealth, most notoriously at funerals for one's ancestors. Socrates went barefoot [ἀνυπόδητος, anupodêtos] to show his contempt for wealth, and hence his freedom from its inherent personality defects. Yet despite this universal ideal of avoiding extremes, oligarchic rule became economically polarizing and destructive, writing laws to make its creditor claims and the loss of land by smallholders irreversible. That was the opposite of Near Eastern Clean Slates and their offshoot, Judaism's Jubilee Year.

JS: So despite the ideals of their philosophy, Greek political systems had no function like that of Hammurabi-like kings — or philosopher-kings for that matter — empowered to hold financial oligarchies in check. This state of affairs led philosophers to develop an economic tradition of lamentation instead. Socrates, Plato and Aristotle, Livy and Plutarch bemoaned the behavior of the money-loving oligarchy. But they did not develop a program to rectify matters. The best they could do was to inspire and educate individuals — most of whom were their wealthy students and readers. As you said, they bequeathed a legacy of Stoicism. Seeing that the problem was not going to be solved in their lifetimes, they produced a beautiful body of literature praising philosophical virtue.

MH: The University of Chicago, where I was an undergraduate in the 1950s, focused on Greek philosophy. We read Plato’s Republic, but they skipped over the discussion of wealth-addiction. They talked about philosopher-kings without explaining that Socrates’ point was that rulers must not own land and other wealth, so as not to have the egotistical tunnel vision that characterized creditors monopolizing control over land and labor.

JS: In Book 8 of the Republic, Socrates condemns oligarchies as being characterized by an insatiable greed [ἀπληστία, aplêstia] for money and specifically criticizes them for allowing polarization between the super-rich [ὑπέρπλουτοι, hyper-ploutoi] and the poor [πένητες, penêtes], who are made utterly resourceless [ἄποροι, aporoi].

MH: One needs to know the context of Greek economic history in order to understand The Republic’s main concern. Popular demands for land redistribution and debt cancellation were resisted with increasing violence. Yet few histories of Classical Antiquity focus on this financial dimension of the distribution of land, money and wealth.

Socrates said that if you let the wealthiest landowners and creditors become the government, they’re probably going to be wealth-addicted and turn the government into a vehicle to help them exploit the rest of society. There was no idea at Chicago of this central argument made by Socrates about rulers falling subject to wealth-addiction. The word “oligarchy” never came up in my undergraduate training, and the “free market” business school’s Ayn Rand philosophy of selfishness is as opposite from Greek philosophy as it is from Judeo-Christian religion.

JS: The word “oligarchy” comes up a lot in book 8 of Plato’s Republic. Here are 3 passages:

1. At Stephanus page 550c … “And what kind of a regime,” said he, “do you understand by oligarchy [ὀλιγαρχία]?” “That based on a property qualification,” said I, “wherein the rich [πλούσιοι] hold office [550d] and the poor man [πένης, penês] is excluded.

2. at 552a … “Consider now whether this polity [i.e. oligarchy] is not the first that admits that which is the greatest of all such evils.” “What?” “The allowing a man to sell all his possessions, which another is permitted to acquire, and after selling them to go on living in the city, but as no part of it, neither a money-maker, nor a craftsman, nor a knight, nor a foot-soldier, but classified only as a pauper [πένης, penês] and a dependent [ἄπορος, aporos].” [552b] “This is the first,” he said. “There certainly is no prohibition of that sort of thing in oligarchical states. Otherwise some of their citizens would not be excessively rich [ὑπέρπλουτοι, hyper-ploutoi], and others out and out paupers [πένητες, penêtes].”

3. At 555b: “Then,” said I, “is not the transition from oligarchy to democracy effected in some such way as this — by the insatiate greed [ἀπληστία, aplêstia] for that which oligarchy set before itself as the good, the attainment of the greatest possible wealth?”

MH: By contrast, look where Antiquity ended up by the 2nd century BC. Rome physically devastated Athens, Sparta, Corinth and the rest of Greece. By the Mithridatic Wars (88-63 BC) their temples were looted and their cities driven into unpayably high debt to Roman tax collectors and Italian moneylenders. Subsequent Western civilization developed not from the democracy in Athens but from oligarchies supported by Rome. Democratic states were physically destroyed, blocking civic regulatory power and imposing pro-creditor legal principles making foreclosures and forced land sales irreversible.

JS: It seems that Greek and Roman Antiquity could not solve the problem of economic polarization. That makes me want to ask about our own country: To what extent does America resemble Rome under the emperors?

MH: Wealthy families have always tried to break “free” from central political power — free to destroy the freedom of people they get into debt and take their land and property. Successful societies maintain balance. That requires public power to check and reverse the excesses of personal wealth seeking, especially debt secured by the debtor’s labor and land or other means of self-support. Balanced societies need the power to reverse the tendency of debts to grow faster than the ability to be paid. That tendency runs like a red thread through Greek and Roman history.

This overgrowth of debt is also destabilizing today’s U.S. and other financialized economies. Banking and financial interests have broken free of tax liability since 1980, and are enriching themselves not by helping the overall economy grow and raising living standards, but just the opposite: by getting the bulk of society into debt to themselves.

This financial class is also indebting governments and taking payment in the form of privatizing the public domain. (Greece is a conspicuous recent example.) This road to privatization, deregulation and un-taxing of wealth really took off with Margaret Thatcher and Ronald Reagan cheerleading the anti-classical philosophy of Friedrich von Hayek and the anti-classical economics of Milton Friedman and the Chicago Boys.

Something much like this happened in Rome. Arnold Toynbee described the oligarchic land grab that endowed Rome's ruling aristocracy with unprecedented wealth as "Hannibal's Revenge". That was the main legacy of Rome's Punic Wars with Carthage, which ended around 200 BC. Rome's wealthy families, who had contributed their jewelry and money to the war effort, made their power grab and said that what originally appeared to be patriotic contributions should be viewed as loans. The Roman treasury was bare, so the government (controlled by these wealthy families) gave them public land, the ager publicus, that otherwise would have been used to settle war veterans and other needy citizens.

Once you inherit wealth, you tend to think that it’s naturally yours, not part of society’s patrimony for mutual aid. You see society in terms of yourself, not yourself as part of society. You become selfish and increasingly predatory as the economy shrinks as a result of your indebting it and monopolizing its land and property. You see yourself as exceptional, and justify this by thinking of yourself as what Donald Trump would call “a winner,” not subject to the rules of “losers,” that is, the rest of society. That’s a major theme in Greek philosophy from Socrates and Plato and Aristotle through the Stoics. They saw an inherent danger posed by an increasingly wealthy landholding and creditor ruling class atop an indebted population at large. If you let such a class emerge independently of social regulation and checks on personal egotism and hubris, the economic and political system becomes predatory. Yet that has been the history of Western civilization.

Lacking a tradition of subordinating debt and of protecting smallholders from land foreclosure, the Greek and Italian states that emerged in the 7th century BC took a different political course from the Near East. Subsequent Western civilization lacked a regime of oversight to alleviate debt problems and keep the means of self-support broadly distributed.

The social democratic movements that flowered from the late 19th century until the 1980s sought to re-create such regulatory mechanisms, as in Teddy Roosevelt's trust busting, the income tax, Franklin Roosevelt's New Deal, and postwar British social democracy. But these moves to reverse economic inequality and polarization are now being rolled back, causing austerity, debt deflation and the concentration of wealth at the top of the economic pyramid. As oligarchies take over government, they lord it over the rest of society much like the feudal lords who emerged from the wreckage of the Roman Empire in the West.

The tendency is for political power to reflect wealth. Rome’s constitution weighted voting power in proportion to one’s landholdings, minimizing the voting power of the non-wealthy. Today’s private funding of political campaigns in the United States is more indirect in shifting political power to the Donor Class, away from the Voting Class. The effect is to turn governments to serve a financial and property-owning class instead of prosperity for the economy at large. We thus are in a position much like that of Rome in 509 BC, when the kings were overthrown by an oligarchy claiming to “free” their society from any power able to control the wealthy. The call for “free markets” today is for deregulation of rentier wealth, turning the economy into a free-for-all.

Classical Greece and Italy had a fatal flaw: From their inception they had no tradition of a mixed public/private economy such as characterized the Near East, whose palatial economy and temples produced the main economic surplus and infrastructure. Lacking royal overrides, the West never developed policies to prevent a creditor oligarchy from reducing the indebted population to debt bondage and foreclosing on the land of smallholders. Advocates of debt amnesties were accused of "seeking kingship" in Rome, or of aspiring to "tyranny" in Greece.

JS: It seems to me that you're saying this economic failure is Antiquity's original sin as well as its fatal flaw. We have inherited from them a great philosophic and literary tradition analyzing and lamenting this failure, but without a viable program to set it right.

MH: That insight unfortunately has been stripped out of the curriculum of classical studies, just as the economics discipline sidesteps the phenomenon of wealth addiction. If you take an economics course, the first thing you're taught in price theory is diminishing marginal utility: The more of anything you have, the less you need it or enjoy it. You can't enjoy consuming it beyond a point. But as Socrates and Aristophanes emphasized, accumulating money is not like eating bananas, chocolate or any other consumable commodity. Money is different because, as Socrates said, it is addictive, and soon becomes an insatiable desire [ἀπληστία, aplêstia].

JS: Yes, I understand! Bananas are fundamentally different from money because you can get sick of bananas, but you can never have too much money! In your forthcoming book, The Collapse of Antiquity, you quote what Aristophanes says in his play Plutus (the god of wealth and money). The old man Chremylus — his name is based on the Greek word for money, chrêmata [χρήματα] — Chremylus and his slave perform a duet in praise of Plutus as the prime cause of everything in the world, reciting a long list. The point is that money is a singular special thing: "O Money-god, people never get sick of your gifts. They get tired of everything else: they get tired of love and bread, of music and honors, of treats and military advancement, of lentil soup, etc., etc. But they never get tired of money. If a man has thirteen talents of silver — 13 million dollars, say — he wants sixteen, and if he gets sixteen, he will want forty, and so forth, and he will complain of being short of cash the whole time."

MH: Socrates's problem was to figure out how to have a government that did not serve the wealthy acting in socially destructive ways. Given that his student Plato was an aristocrat and that Plato's students in the Academy were aristocrats as well, how can you have a government run by philosopher-kings? Socrates's solution was not practical at that time: Rulers should not have money or property. But all governments were based on the property qualification, so his proposal for philosopher-kings lacking wealth was utopian. And like other Greek aristocrats, Plato and his circle disapproved of debt cancellations, accusing their advocates of being populist leaders seeking to become tyrants.

JS: Looking over the broad sweep of Roman history, your book describes how, century after century, oligarchs were whacking every energetic popular advocate whose policies threatened their monopoly of political power, and their economic power as creditors and privatizers of the public domain, Rome's ager publicus, for themselves.

I brought with me on the train Cæsar’s Gallic War. What do you think of Cæsar and how historians have interpreted his role?

MH: The late 1st century BC was a bloodbath for two generations before Cæsar was killed by oligarchic senators. I think his career exemplifies what Aristotle said of aristocracies turning into democracies: He sought to take the majority of citizens into his own camp to oppose the aristocratic monopolies of landholding, the courts and political power.

Cæsar sought to ameliorate the oligarchic Senate's worst abuses, which were stifling Rome's economy and even much of the aristocracy. Mommsen is the most famous historian to describe how rigidly and unyieldingly the Senate opposed democratic attempts to achieve a role in policy-making for the population at large, or to defend the debtors losing their land to creditors, who were running the government for their own personal benefit. He described how Sulla strengthened the oligarchy against Marius, and how Pompey backed the Senate against Caesar. But competition for the consulship and other offices was basically just a personal struggle among rival individuals, not rival concrete political programs. Roman politics was autocratic from the very start of the Republic, when the aristocracy overthrew the kings in 509 BC. Roman politics during the entire Republic was a fight by the oligarchy against democracy and the populace as a whole.

The patricians used violence to “free” themselves from any public authority able to check their own monopoly of power, money and land acquisition by expropriating smallholders and grabbing the public domain being captured from neighboring peoples. Roman history from one century to the next is a narrative of killing advocates of redistributing public land to the people instead of letting it be grabbed by the patricians, or who called for a debt cancellation or even just an amelioration of the cruel debts laws.

On the one hand, Mommsen idolized Cæsar as if he were a kind of revolutionary democrat. But given the oligarchy’s total monopoly on political power and force, Mommsen recognized that under these conditions there could not be any political solution to Rome’s economic polarization and impoverishment. There could only be anarchy or a dictatorship. So Caesar’s role was that of a Dictator — vastly outnumbered by his opposition.

A generation before Caesar, Sulla seized power militarily, bringing his army to conquer Rome and making himself Dictator in 82 BC. He drew up a list of his populist opponents to be murdered and their estates confiscated by their killers. He was followed by Pompey, who could have become a dictator but didn’t have much political sense, so Caesar emerged victorious. Unlike Sulla or Pompey, he sought a more reformist policy to check the senatorial corruption and self-dealing.

The oligarchic Senate's only "political program" was opposition to "kingship" or any such power able to check its land grabbing and corruption. The oligarchs assassinated Cæsar, as they had killed Tiberius and Gaius Gracchus in 133 and 121, the praetor Asellio, who sought to alleviate the population's debt burden in 88 by trying to enforce old anti-usury laws, and of course the populist advocates of debt cancellation such as Catiline and his supporters. Would-be reformers were assassinated from the very start of the Republic, after the aristocracy overthrew Rome's kings.

JS: If Caesar had been successful, what kind of ruler might he have been?

MH: In many ways he was like the reformer-tyrants of the 7th and 6th centuries BC in Corinth, Megara and other Greek cities. They all were members of the ruling elite. He tried to check the oligarchy's worst excesses and land grabs and, like Catiline, Marius and the Gracchi brothers before him, to ameliorate the problems faced by debtors. But by his time the poorer Romans already had lost their land, so the major debts were owed by wealthier landowners. His bankruptcy law only benefited the well-to-do who had bought land on credit and could not pay their moneylenders as Rome's long Civil War disrupted the economy. The poor already had been ground down. They supported him mainly for his moves toward democratizing politics at the expense of the Senate.

JS: After his assassination we get Caesar’s heir Octavian, who becomes Augustus. So we have the official end of the Republic and the beginning of a long line of emperors, the Principate. Yet despite the Senate’s authority being permanently diminished, there is continued widening of economic polarization. Why couldn’t the Emperors save Rome?

MH: Here's an analogy for you: Nineteenth-century industrial reformers thought that capitalism's political role was to reform the economy by stripping away the legacy of feudalism — a hereditary landed aristocracy and a predatory financial system based mainly on usury. But what occurred was not an evolution of industrial capitalism into socialism. Instead, industrial capitalism turned into finance capitalism. Likewise in Rome, the end of the senatorial oligarchy was followed not by a powerful, debt-forgiving central authority (as Mommsen believed Caesar was moving toward, and as many Romans hoped), but by an even more polarized imperial garrison state.

JS: That's indeed what happened. The emperors who ruled in the centuries after Cæsar insisted on being deified — they were officially "divine," according to their own propaganda. Didn't any of them have the potential power to reverse the Roman economy's ever-widening polarization, like the Near Eastern "divine kings" from the third millennium BC into the Neo-Assyrian, Neo-Babylonian and even the Persian Empire in the first millennium?

MH: The inertia of Rome's status quo and the vested interests of its patrician nobility were so strong that emperors didn't have that much power. Most of all, they didn't have a conceptual intellectual framework for changing the economy's basic structure as economic life became de-urbanized and shifted to self-sufficient quasi-feudal manor estates. Debt amnesties and the protection of small self-sufficient tax-paying landholders as the military base were achieved only in the Eastern Roman Empire, in Byzantium under the 9th- and 10th-century emperors (as I've described in my history of debt cancellations in …and forgive them their debts).

The Byzantine emperors were able to do what Western Roman emperors could not. They reversed the expropriation of smallholders and annulled their debts in order to keep a free tax-paying citizenry able to serve in the army and provide public labor duties. But by the 11th and 12th centuries, Byzantium's prosperity enabled its oligarchy to create private armies of its own to fight against any centralized authority able to prevent its grabbing of land and labor.

It seems that Rome's late kings did something like this. That is what attracted immigrants to Rome and fueled its takeoff. But with prosperity came the rising power of patrician families, who moved to unseat the kings. Their rule was followed by a depression and by walkouts by the bulk of the population to try to force better policy. But that could not be achieved without democratic voting power, so faith was put in personal leaders — subject to patrician violence to abort any real economic democracy.

In Byzantium’s case, the tax-avoiding oligarchy weakened the imperial economy to the point where the Crusaders were able to loot and destroy Constantinople. Islamic invaders were then able to pick up the pieces.

The most relevant point of studying history today should be how the economic conflict between creditors and debtors affected the distribution of land and money. Indeed, the tendency of a wealthy overclass to pursue self-destructive policies that impoverish society should be what economic theory is all about. We’ll discuss this in Part 4.


In pre-Christian Wales, the Druids (a special class of leaders) dominated a religion in which Celts worshipped a number of deities according to rites associated with nature (Hartman, p. 27). However, Welsh and Welsh American identities have centered on religious traditions of strictness, evangelicalism, and reform. From the breach between Welsh and Anglican churches stemmed modern Welsh nationalism itself. Also, Mormonism and scattered versions of pre-Christian paganism figure in Welsh American religion.

The patron saint of the Welsh, St. David (born circa 520), "organized a system of monastic regulations for his abbey … which became the awe of Christian Britain because of its severity of discipline" (Hartman, p. 28). St. David's Day commemorates his death. On the first day of March, Episcopalian churches such as St. David's Episcopalian Church in San Diego (the cornerstone of which comes from St. David's Cathedral in Wales) hold memorial services (Greenslade, p. 33). For all denominations of Welsh Americans, the day represents an occasion for the annual rallying of Welsh consciousness.

As Welsh churches pitted their religious fundamentalism against the English establishment, their progressivism foreshadowed the contributions of Welsh Americans to American puritanism and progress. Around the year 1700, when English rule still dominated Welsh religion, the reform movement came from within the church and received its great stimulus from the pietistic evangelism introduced by John Wesley and George Whitefield. Soon these men, and Welshmen of similar beliefs, were emphasizing the necessity of abundant preaching within the church and the need for experiencing a rebirth in religious conviction as a necessary part of the salvation of the individual.

After this evangelical Methodism spread through Wales, Welsh Methodists split from Wesley and from English Methodists and followed Whitefield into Calvinism, calling themselves Calvinist Methodists. Welsh Methodists, furthermore, withdrew from the Anglican Church and precipitated a consolidation of Welsh culture. "Within a few decades, the Calvinist-Methodists, the Congregationalists, and the Baptists had won over the great majority of the masses of Wales from the established [Anglican] church," and at Sunday Schools they taught Welsh people to read the Bible (Hartman, p. 33).

Welsh Christian nonconformists shared fundamentalism and puritanism, yet did not lack for internal controversy. Unifyingly, their shared religion demanded "rigid observance of the marriage vows, discouragement of divorce, austere observance of conduct of life generally" and the strict reservation of Sundays for religious activities; on the other hand, divisive religious differences arose "over the issues of church organization, Calvinism, and infant baptism" (Hartman, pp. 103-104). Congregations and denominations guarded their independence.

In America, as in Wales, Welsh churches pioneered Sunday Schools; children and adults attended separate classes in which teachers used Socratic methods of questioning. Welsh American churchgoers sang hymns and testified, respectively, on Tuesday and Thursday nights, and regularly held gymanvas, or preaching festivals.

The first groups of Welsh converts to Mormonism came to America in the 1840s and 1850s. Mormon founder Joseph Smith converted Captain Dan Jones to the religion, then sent him on a mission to Wales. Captain Jones in turn converted thousands, most of whom resettled in Utah and contributed much to Mormon culture. As a prime example, Welsh Americans founded the Mormon Tabernacle Choir.

Since the 1960s, versions of Celtic nature-worship have gained popularity in America and Britain. Two members of the Parent Kindred of the Old Religion in Wales brought Hereditary Welsh Paganism to the United States in the early 1960s. Today, Welsh Pagans can be found in Georgia, Wisconsin, Minnesota, Michigan, California, and West Virginia. Welsh pagans form circles with names like The Cauldron, Forever Forests, and Y Tylwyth Teg. Members take symbolic Welsh names like Lord Myrddin Pendevig, Lady Gleannon or Gwyddion, Tiron, and Siani. Welsh pagans in America also use the Welsh language in their rituals. Although the Druids, who led the pre-Christian Welsh religion, have not survived, some of their practices have.

Historically each Potawatomi village was ruled by a chief, called a wkema, or leader. The chief, a senior member of the clan and a man of good character, was selected by his village. If he were strong and wealthy enough he could rule over several villages, but this did not happen often.

The chief was assisted by a council of adult males who approved the chief’s decisions and a society of warriors called the wkec tak. A man called the pipelighter carried announcements, arranged ceremonies, and called council meetings.

Relationships among the widely scattered Potawatomi villages (they had villages in four states) were kept strong through social ties such as marriage. As the Potawatomi nation expanded, new villages were founded, but the people retained close ties to their old villages and clans. The clans, such as the Bear Clan and the Wolf Clan, were large extended family groups that originally had animal symbols.

History of Abortion

Pro-choice and pro-life demonstrators during the 2004 Washington, DC March for Women’s Lives protest.
Source: Declan McCullagh Photography (accessed Apr. 1, 2010)

The debate over whether abortion should be a legal option continues to divide Americans long after the US Supreme Court’s 7-2 decision on Roe v. Wade [49] declared the procedure a “fundamental right” on Jan. 22, 1973.

Proponents, identifying themselves as pro-choice, contend that choosing abortion is a right that should not be limited by governmental or religious authority, and which outweighs any right claimed for an embryo or fetus. They say that pregnant women will resort to unsafe illegal abortions if there is no legal option.

Opponents, identifying themselves as pro-life, contend that individual human life begins at fertilization, and therefore abortion is the immoral killing of an innocent human being. They say abortion inflicts suffering on the unborn child, and that it is unfair to allow abortion when couples who cannot biologically conceive are waiting to adopt.

Variations exist in arguments on both sides of the debate. Some pro-choice proponents believe abortion should only be used as a last resort, while others advocate unrestricted access to abortion services under any circumstance. Pro-life positions range from opposing abortion under any circumstance to accepting it for situations of rape, incest, or when a woman’s life is at risk.

Pro-Choice and Pro-Life Groups

Some prominent pro-choice organizations include Planned Parenthood, NARAL Pro-Choice America, the National Abortion Federation, the American Civil Liberties Union (ACLU), and the National Organization for Women. Although many pro-life positions derive from religious ideology, several mainstream faith groups support the pro-choice movement, such as the United Methodist Church, United Church of Christ, the Episcopal Church, Presbyterian Church (USA), and the Unitarian Universalist Association. The 2016 Democratic Party Platform endorsed the pro-choice position, stating, “We believe unequivocally, like the majority of Americans, that every woman should have access to quality reproductive health care services, including safe and legal abortion – regardless of where she lives, how much money she makes, or how she is insured. We believe that reproductive health is core to women’s, men’s, and young people’s health and wellbeing.” [169] However, 26% of Democrats consider themselves to be pro-life. [170]

Some prominent pro-life organizations include The National Right to Life Committee, Pro-Life Action League, Operation Rescue, the Catholic Church, the Eastern Orthodox Church, Americans United for Life, the National Association of Evangelicals, Family Research Council, Christian Coalition of America, and the Church of Jesus Christ of Latter-Day Saints (Mormon Church). [6] The 2016 Republican Party Platform opposed abortion, stating, “We oppose the use of public funds to perform or promote abortion or to fund organizations, like Planned Parenthood, so long as they provide or refer for elective abortions or sell fetal body parts rather than provide healthcare… We will not fund or subsidize healthcare that includes abortion coverage… We thank and encourage providers of counseling, medical services, and adoption assistance for empowering women experiencing an unintended pregnancy to choose life.” [171] However, 36% of Republicans consider themselves to be pro-choice. [170]

Bob Englehart’s 1981 political cartoon “When Does Life Begin?,” originally published by The Hartford Courant.
Source: “Cartoon Plagiarism Case Offers a Metaphor for the Abortion Debate,” Nov. 15, 2005

Public Opinion

A 2018 Marist Poll and Knights of Columbus survey found that 51% of Americans consider themselves to be pro-choice, and 44% consider themselves to be pro-life. [174]

A 2017 Pew Research survey found that 57% of Americans say abortion should be legal in all or most cases, while 40% say it should be illegal in all or most cases. [201]

Pew Research found that 69% of Americans – 84% of Democrats and 53% of Republicans – surveyed said “No, do not overturn” in response to the question “Would you like to see the Supreme Court completely overturn its Roe versus Wade decision, or not?” [175]

A 2018 PPRI poll found that 45% of women and 42% of men agreed abortions should be covered by most health insurance plans. [172]

Abortion Procedures

Surgical abortion (aka suction curettage or vacuum curettage) is the most common type of abortion procedure. It involves using a suction device to remove the contents of a pregnant woman’s uterus. Surgical abortion performed later in pregnancy (after 12-16 weeks) is called D&E (dilation and evacuation). [81][82] The second most common abortion procedure, a medical abortion (aka an “abortion pill”), involves taking medications, usually mifepristone and misoprostol (aka RU-486), within the first seven to nine weeks of pregnancy to induce an abortion. [39] The Centers for Disease Control and Prevention (CDC) found that 67% of abortions performed in 2014 were performed at or less than eight weeks’ gestation, and 91.5% were performed at or less than 13 weeks’ gestation. [176] 77.3% were performed by surgical procedure, while 22.6% were medical abortions. [176] An abortion can cost from $500 to over $1,000 depending on where it is performed and how long into the pregnancy it is. [147][177]

Early History

Abortion techniques were developed as early as 1550 BC, when the Egyptian medical text Ebers Papyrus suggested that the vaginal insertion of plant fiber covered with honey and crushed dates could induce an abortion. Abortion was an accepted practice in ancient Greece and Rome. Greek philosopher Aristotle (384–322 B.C) wrote that “when couples have children in excess, let abortion be procured before sense and life have begun.” [86] In the latter days of the Roman Empire, abortion was considered not as homicide but as a crime against a husband who would be deprived of a potential child. [87][86]

Throughout much of Western history, abortion was not considered a criminal act as long as it was performed before “quickening” (the first detectable movement of the fetus, which can occur between 13-25 weeks of pregnancy). [86][88] American states derived their initial abortion statutes from British common law, which followed this principle. [106] Until at least the early-1800s, abortion procedures and methods were legal and openly advertised throughout the United States. [89][91] Abortion was unregulated, however, and often unsafe. [90]

In 1821, Connecticut became the first state to criminalize abortion. The state banned the selling of an abortion-inducing poison to women, but it did not punish the women who took the poison. Legal consequences for women began in 1845 when New York criminalized a woman’s participation in her abortion, whether it took place before or after quickening. [41] In the mid-1800s, early pro-life advocate Dr. Horatio Robinson Storer (1830-1922) convinced the American Medical Association to join him in campaigning for the outlawing of abortion nationwide. [92][90] By the early 1900s, most states had banned abortion. By 1965, all 50 states had outlawed abortion, with some exceptions varying by state. [42]

Demonstrators holding pro-choice and pro-life signs.
Source: “New Pew Poll Shows Support for Legal Abortion Drops to Lowest Level in 15 Years,” Apr. 29, 2009

The motivation behind these early abortion laws has been disputed. Some writers argue that the laws were not aimed at preserving the lives of unborn children, but rather were intended to protect women from unsafe abortion procedures, [90] or to allow the medical profession to take over responsibility for women’s health from untrained practitioners. [86] Others say that pro-life concerns were in fact already prevalent and were a major influence behind the efforts to ban abortion. [93]

Federal action on abortion didn’t occur until Roe v. Wade, which declared most state anti-abortion laws unconstitutional. The high court’s 7-2 decision established rules based on a pregnancy trimester framework, banning legislative interference in the first trimester of pregnancy (0-12 weeks), allowing states to regulate abortion during the second trimester (weeks 13-28) “in ways that are reasonably related to maternal health,” and allowing a state to “regulate, and even proscribe” abortion during the third trimester (weeks 29-40) “in promoting its interest in the potentiality of human life,” unless an abortion is required to preserve the life or health of the mother. [49][95] The decision also allowed states to prohibit abortions performed by anyone who is not a state-licensed physician. [49]

The initial Roe v. Wade lawsuit was filed at the Dallas federal district courthouse on Mar. 3, 1970 by pregnant Texas resident Norma McCorvey, named in court documents as “Jane Roe.” Henry Wade, Dallas County District Attorney from 1951 to 1987, was the named defendant. McCorvey was seeking to end her pregnancy, but abortion was illegal in Texas except to save the mother’s life. [96][97] McCorvey said the pregnancy was the result of rape, but she later retracted that claim, admitting she lied in the hope of increasing her chances of procuring an abortion. The baby was eventually delivered and given up for adoption. [123] McCorvey later abandoned her support of abortion rights, becoming a pro-life activist and an evangelical Christian in 1995. She then converted to Catholicism and took part in silent prayer vigils outside abortion clinics. [100] In the 2020 documentary, AKA Jane Roe, McCorvey claimed anti-abortion activists paid her to support their cause. [218]

Federal Regulation

Immediately following Roe v. Wade, pro-life proponents pushed for federal legislation that would restrict abortion. In 1976, Congress passed the appropriations bill for the Departments of Labor and Health, Education, and Welfare (now the Department of Health and Human Services), which included an amendment ending Medicaid funding for abortions. Known as the “Hyde Amendment,” this provision banning federal funding for abortions has been renewed with various revisions every year since its inception.

At the Aug. 1984 United Nations International Conference on Population held in Mexico City, Mexico, President Ronald Reagan announced the Mexico City Policy, [60] which restricted all non-governmental organizations funded by the US Agency for International Development (USAID) from performing or promoting abortion services. President Bill Clinton rescinded the policy (Jan. 22, 1993); President George W. Bush reenacted it (Jan. 22, 2001); President Barack Obama again rescinded it (Jan. 23, 2009); President Donald Trump reinstated it (Jan. 23, 2017); and President Biden revoked it once again (Jan. 28, 2021). [60] [168] [221]

A coat-hanger is a frequently used symbol for abortion rights.
Source: “Celebrating 25 Years of Decriminalized Abortion in Canada,” Jan. 26, 2013

On June 29, 1992 the US Supreme Court case Planned Parenthood of Southeastern Pennsylvania v. Casey [57] (5-4) upheld the constitutional right to have an abortion, but it abandoned the “rigid trimester framework” outlined in Roe v. Wade and adopted a less restrictive standard for state regulations. The decision allowed states to impose waiting periods before a woman can obtain an abortion, allowed some legislative interference in the first trimester in the interest of a woman’s health, and permitted parental consent requirements for minors seeking abortions. [107] The Court ruled that none of these conditions imposed an “undue burden” upon women seeking abortions, but some pro-choice advocates warned that Roe v. Wade had been significantly weakened and that states would limit abortion access. [108][109]

On Nov. 5, 2003, after passing in the US House of Representatives (281-142) and the US Senate (64-34), the Partial-Birth Abortion Ban Act of 2003 [58] was signed into law by President George W. Bush. This federal legislation banned physicians from providing intact dilation and extraction (aka “partial-birth” abortion), a late-term (after 21 weeks gestation) method which accounted for 0.17% of abortion procedures in 2000. [43] The act defines a “partial-birth abortion” as “an abortion in which the provider deliberately and intentionally vaginally delivers a living fetus until… the entire fetal head is outside the body of the mother, or… any part of the fetal trunk past the navel is outside the body of the mother, for the purpose of performing an overt act that the person knows will kill the partially delivered living fetus.” Pro-choice advocates challenged the constitutionality of the Partial-Birth Abortion Ban Act of 2003; however, the Apr. 18, 2007 US Supreme Court case Gonzales v. Carhart/Gonzales v. Planned Parenthood [59] upheld the act, ruling 5-4 that it did not impose “an undue burden on a woman’s right to abortion.”

The topic of abortion was raised during the 2009-2010 US Congress health care debate. Some pro-life advocates said the Patient Protection and Affordable Care Act would allow federal funding for abortions, a claim denied by abortion rights supporters. To ensure passage of the bill, President Obama signed an executive order “to establish an adequate enforcement mechanism to ensure that Federal funds are not used for abortion services,” re-affirming Hyde Amendment restrictions and extending them to cover the newly created health insurance exchanges. [63]

In Mar. 2017, the Department of Health and Human Services announced that all federally-funded shelters housing undocumented unaccompanied minors were henceforth prohibited from taking “any action that facilitates” access to abortion. [198] The American Civil Liberties Union (ACLU) challenged this decision in Garza v. Hargan, and on Mar. 30, 2018 the US District Court for the District of Columbia issued an injunction, ruling that the federal government must not interfere with or obstruct any “unaccompanied immigrant minor children who are or will be in the legal custody of the federal government” from having an abortion while the case is being heard. [199][200]

A US district judge ruled on July 13, 2020 that requiring in-person visits for abortions was unconstitutional during the COVID-19 (coronavirus) pandemic. The ruling allows healthcare providers nationwide to mail mifepristone for the duration of the pandemic. The drug, when used in combination with misoprostol, induces an abortion, and is the only drug the FDA requires to be administered in a medical setting, according to the ACLU. [220]

State Restrictions

State restrictions on abortion access increased sharply after the 2010 midterm elections, in which Republicans gained at least 675 state legislative seats, the biggest gain made by any party in state legislatures since 1938. [162] Between 2011 and 2017, states enacted over 400 new abortion restrictions. [178] These represent 34% of the 1,193 restrictions enacted since Roe v. Wade in 1973. [178] Between Jan. and Mar. 2018, 308 new abortion restrictions were introduced in 37 states, 10 of which were enacted. [179]

Anti-abortion sign and wooden crosses placed outside the Whole Woman’s Health abortion provider in McAllen, TX.
Source: “Anti-Abortion Groups Push New Round of Abortion Rules in Texas,”, Nov. 22, 2012

Fetal pain laws or 20-week bans typically ban abortion at or after 20 weeks of gestation on the theory that a fetus can feel pain at that time. On Apr. 13, 2010, Nebraska’s Republican Governor Dave Heineman signed the first law in the United States to restrict abortions based on fetal pain. [47] After Nebraska’s law was passed, several other states enacted similar laws. [101] On Mar. 6, 2013, Idaho’s fetal pain law was the first to be struck down by a federal court. [102]

Ultrasound laws require pregnant women seeking an abortion to get an ultrasound, which is frequently accompanied by a detailed description of the fetus’ heart, limbs, and organs. While other states had passed laws requiring women to undergo an ultrasound before having an abortion, on Apr. 27, 2010, the Oklahoma legislature passed the first law requiring that the woman watch the monitor and listen to a detailed description of the fetus. [48] However, the law was struck down by the Supreme Court in 2013. [188] Many states have laws regulating the provision of ultrasound by abortion providers; three of these states (LA, TX, WI) require the abortion provider to display and describe the image to the pregnant woman. [188]

The criminalization of abortions based on the sex or race of a fetus was first enacted in Arizona on Mar. 29, 2011. The bill, signed into law by Republican Governor Jan Brewer, was opposed by Democrats, who said there was little evidence that sex- or race-selection abortions were taking place in the state. [64] As of June 2018, eight states (AZ, AR, KS, NC, ND, OK, PA, SD) ban sex-selection abortions; Arizona is the sole state to ban race-selection abortions. [186]

Fetal abnormality laws ban abortions in cases of fetal abnormality even if the fetus will die before or shortly after birth. North Dakota, whose law was enacted in 2013, is the only state with such a ban. [186]

Fetal heartbeat laws or six-week bans outlaw abortions as early as six weeks after a woman’s last menstrual period, when a fetal heartbeat can first be detected. In Mar. 2013, North Dakota enacted a fetal heartbeat law. [110] A federal appeals court struck down the law in 2015, noting that the law “violates Supreme Court precedent establishing that abortion is legal until a fetus is viable outside of the womb, usually about 24 weeks into pregnancy.” [183] In 2018, the governors of Mississippi and Iowa signed into law similarly restrictive abortion laws banning abortion at 15 weeks and 6 weeks, respectively; both laws were put on hold by federal judges pending appeals. [181][182]

Admitting privileges and surgical center standards laws require doctors who perform abortions to have admitting privileges at local hospitals, and require abortion clinics to meet the same building standards as ambulatory surgical centers. Despite an 11-hour filibuster from State Senator Wendy Davis, the Texas legislature passed a law in 2013 that added admitting privileges and surgical center requirements. The number of clinics providing abortion services fell from 42 to 19 over the next two years. On June 27, 2016, the US Supreme Court struck down the Texas law. Writing for the majority, Justice Stephen Breyer said: “neither of these provisions offers medical benefits sufficient to justify the burdens upon access that each imposes… each violates the Federal Constitution.” [111][112][167] A similar law passed in Arkansas in 2015 requires abortion providers using pills to induce abortion in the first nine weeks of pregnancy (medication abortions) to have admitting privileges at local hospitals. [185] On June 29, 2020, in a 5-4 ruling, the Supreme Court struck down a Louisiana admitting privilege law similar to the Texas law struck down in 2016. [192]

Trigger laws are abortion bans that would stop all or nearly all abortions if Roe v. Wade is overturned. During the 2018 midterm elections, voters in Alabama and West Virginia voted in favor of constitutional amendments that would restrict access to abortion if Roe v. Wade were to be overturned by the Supreme Court. [206] [207] As of Apr. 1, 2019, six states have trigger laws that would ban all or nearly all abortions and an additional five states have trigger laws that were blocked by courts but could be put in effect if Roe v. Wade were overturned. [209]

Roe v. Wade protection laws are those that codify the right to abortion within the state constitution or legal code and are meant to be a protection against Roe v. Wade being overturned by the US Supreme Court. In 2017, Oregon enacted the Reproductive Health Equity Act that keeps abortion legal even if the Supreme Court overturns Roe v. Wade. [210] As of Apr. 1, 2019, nine other states have laws that will keep abortion legal prior to fetal viability. [209]

Laws designed to challenge Roe v. Wade in court were passed by several states in 2019. [215] These laws typically combined six-week bans with other restrictive measures such as allowing no exceptions for rape or incest and including felony penalties for doctors who perform abortions. Alabama passed the most restrictive of these laws to date on May 16, 2019. Alabama State Representative Terri Collins (R) stated, “This bill is about challenging Roe v. Wade and protecting the lives of the unborn, because an unborn baby is a person who deserves love and protection.” Elizabeth Nash, MPP, Senior States Issue Manager at the Guttmacher Institute, stated, “There’s a real momentum around banning abortion at the state level and it’s stemming from the shift in the U.S. Supreme Court” with the addition of conservative Associate Justices Neil Gorsuch and Brett Kavanaugh. [211][212][213][214]

COVID-19 (coronavirus) restrictions were put in place by at least seven states by Apr. 9, 2020, including Alabama, Indiana, Iowa, Mississippi, Ohio, Oklahoma, and Texas. Each state listed abortion as a nonessential medical procedure during the COVID-19 pandemic, effectively banning it. The states contended they were freeing up medical personnel to deal with the pandemic, while abortion rights supporters argued that the states were already hostile to abortion rights and were using the pandemic as an excuse to enact a ban that could last beyond the pandemic. Federal judges blocked the bans at least in part in most of the states. [217]

Abortion Statistics

From Roe v. Wade through 2017, over 60 million legal abortions were estimated to have been performed in the United States – an average of about 1.4 million abortions per year. [189] In 2014, 19% of pregnancies (excluding miscarriages) ended in abortion, and 1.5% of women aged 15-44 had an abortion. [190] At 2014 abortion rates, one in twenty US women will have an abortion before age 20, one in five by 30, and about one in four by 45. [190] 11% of women having an abortion are teenagers, while most women having abortions are in their 20s: 32% aged 20-24 and 27% aged 25-29. [176]

The US abortion rate fell 29% between 1990 and 2005, from 27.4 to 19.4 abortions per 1,000 women of childbearing age, before leveling out from 2005-2008. [65] Between 2008 and 2011, the abortion rate dropped again by 13% to its lowest point since 1973: 17 abortions for every 1,000 women; in 2014 the rate dropped another 14% to 15 abortions per 1,000 women. [190] Pro-choice supporters credited an increased use of new birth control methods such as Mirena (an intra-uterine device that can last for several years) as one of the reasons for the decline. Pro-life groups credited an increase in anti-abortion laws at the state level, among other factors, although abortion rates dropped faster than the national average in some states that had not enacted abortion restrictions, such as Illinois, where the rate dropped by 18%. [13][85][121]

The number of abortion providers has been declining since 1984, after it reached a peak of 2,908 providers in 1982. [196] There were 1,671 abortion providers in the United States in 2014, including 272 abortion clinics, 516 non-specialized clinics, 638 hospitals, and 245 physicians’ offices. [191] 90% of US counties did not provide abortion services, with 39% of women living in those counties. [191] Between 2011 and 2017, at least 126 clinics providing abortion services closed. [124][192][193] Seven states (KY, MO, MS, ND, SD, WV, WY) have only one clinic left. [194]

Pro-choice advocates believe increased clinic violence has contributed to this downward trend in abortion providers. [99] In 2016, 6% of abortion clinics reported losing staff members as a result of anti-abortion violence or harassment. [197] According to the National Abortion Federation, a professional association of abortion practitioners, at least 229 arson attacks/bombings were committed against abortion providers between 1977 and 2017, with at least another 99 attempted arson attacks/bombings. [195] Additionally, at least 11 abortion providers were murdered during that time and there were at least 26 attempted murders of clinic staff and physicians. [195] Mainstream pro-life leaders and organizations have publicly denounced violence committed against abortion providers and clinics. [98]

In 2017, the number of abortions in the United States declined to an estimated 862,320, or 13.5 abortions per 1,000 women between the ages of 15 and 44. Those figures represent a 7% drop since 2014, according to a Sep. 2019 Guttmacher Institute report, and the lowest recorded rate since abortion was legalized in 1973. [216]

A June 2021 Gallup poll found 47% of Americans believed abortion to be morally acceptable, while 46% believed it not to be. 48% thought abortion should be legal “only under certain circumstances,” 32% “under any circumstances,” and 19% thought it should be “illegal in all circumstances.” The majority of Americans (58%) opposed overturning Roe v. Wade, while 32% were in favor of overturning the US Supreme Court decision. 56% opposed banning abortion after the 18th week of pregnancy, 58% opposed fetal heartbeat restrictions, and 57% opposed abortion bans if the fetus is found to have a genetic disease or disorder. [222] [223]

What Were the Countries of the British Empire?

The British Empire held Canada, Australia, New Zealand, Tonga, Fiji, Western Samoa, India, Burma, Papua New Guinea, Malaya, Sarawak, Brunei, Oman, Iraq, Egypt, Libya, Sudan, Kenya, Uganda, Northern and Southern Rhodesia, Tanganyika, Zanzibar, Mauritius, the Maldives, South Africa, Swaziland, Nigeria, Gold Coast, and Sierra Leone, among other countries during its reign. It also held portions of the present-day United States and China.

Over the course of Britain's existence, the country has invaded nine out of 10 of the world's countries, or all but 22 of them in total. At its peak, the British Empire was composed of about one-fifth of the entire world's population and covered about a quarter of the world's total land mass.

The empire spanned from the 16th century, when Britain began colonizing the Americas, to the present day, in which Britain retains sovereignty over 14 overseas territories. 53 states are also voluntary members of the Commonwealth of Nations and recognize the British monarch as Head of the Commonwealth. A trend of decolonization began after World War II, with many countries gradually becoming independent. Prince Charles regarded the return of Hong Kong to China in 1997 as the informal end of the British Empire.

In American History

In the 1850s, a burgeoning coalition of self-proclaimed nativists swept into office and called for radical change. During the nineteenth century, the perception of immigrants shifted from welcome to demonization, usually depending on whether the United States was going through economic expansion or stagnation.

From the start, immigration and the resulting competition, whether religious, class, or racial, between ethnic groups became a key issue in the development of the United States, and one that was frequently expressed in the rhetoric of conspiracy theory.

Historically, immigration falls into three periods: the colonial and eighteenth-century period; the “Old” immigration of the first half of the nineteenth century; and the “New” immigration starting in the 1880s. The decade from 1845 to 1854 saw the greatest proportionate influx of immigrants in U.S. history. By 1860 more than one out of every eight Americans was foreign-born, with the most numerous being Irish, German, and English immigrants.

Each period generated its own kind of nativist reaction, from Know-Nothingism (the openly nativist political party of the 1840s and 1850s), to anti-immigration laws (the first being the Chinese Exclusion Act of 1882, culminating in the closing of the gates through the National Origins Acts of 1921 and 1924).

It is important to note, however, that openness to immigration has remained the majority opinion, for in Tom Paine’s words, the United States was to be “an asylum for the persecuted lovers of civil and religious liberty” from all parts of the world.

In the colonial period, although ethnic mixture was the reality, with a majority white population living alongside Indians and a black population of African origin, the white group was very heterogeneous in its composition. The majority were of English origin, but many were Dutch, French Huguenots, German, and Scots-Irish, which created frictions.

For instance, in the Massachusetts colony, the Puritans did all they could not to admit non-English settlers. In spite of the reality of ethnic plurality, the prevailing perception was one of Englishness. Hence, after the Revolution, the 60 percent of English origin in the white community took political power and set the tone culturally.

Early nativism was marked by a belief in total assimilation, the giving up of one’s former culture, language, and behavior to be blended into a new identity, that of an American, as celebrated by Hector Saint John de Crèvecoeur, who glorified the land of limitless opportunities to all newcomers (the “melting-pot” theory).

The asylum tradition was promoted through the 1790 Naturalization Act, which made it possible for virtually anybody to be admitted and naturalized into a citizen. However, this “generous” act contained limitations: only “free white persons” who had resided in the United States for at least two years were eligible for naturalization. Hence, from the start, the reality of social and political exclusion—of blacks and Indians—paved the way for future exclusions.

The self-image of hospitality was seriously tested at the time of the 1798 Alien and Sedition Acts, which gave the president arbitrary countersubversive powers to exclude or deport any foreigner deemed to be dangerous, and to prosecute anybody publishing or writing in “a false, scandalous and malicious nature” about the president or Congress.

The government was reacting against European radicals whose political activities were considered subversive. The Naturalization Act was amended to provide for a fourteen-year residency requirement for prospective citizens. In 1802, Congress reduced the waiting period to five years, a provision that remains in effect today.

In the following decades, most immigrants entering the United States were Roman Catholics (one-third of all immigrants between 1830 and 1840 were from Catholic Ireland), and so ethnic prejudice against immigrants was also usually accompanied by conspiracy-mongering against Catholicism.

Since the colonial period, Americans had come to identify themselves as a Protestant nation, and many leading Protestant clergymen had cautioned the country against a papal plot to destroy U.S. liberty and society.

In the nineteenth century this conspiratorial tradition fed into nativism in a variety of forms: exclusive nativist clubs and fraternities such as the Order of United Americans or the United Sons of America and political parties, especially when the social and economic situation was bleak, as in the late 1830s, the early 1840s, and the mid-1850s.

These groups attracted middle-class Protestants, members of the two “traditional” parties (Democratic and Whig), and working-class voters who resented what they considered to be the job competition from immigrants, the increase in crime, public drunkenness, and pauperism, and the manipulation of immigrant voters.

More significant was the proliferation of nativist propaganda. Prompted by the news of an Austrian Catholic missionary society sending money and men to the United States, Samuel F. B. Morse, a distinguished professor of sculpture and painting at New York University, wrote A Foreign Conspiracy against the Liberties of the United States (1834) and went on to publish The Imminent Dangers to the Free Institutions of the United States (1835), both of which denounced the alleged Catholic conspiracy against the United States.

Lyman Beecher, a seventh-generation clergyman and president of Lane Theological Seminary in Cincinnati, published A Plea for the West in 1835, in which he exposed the alleged plot by the pope to build a “Vatican” in the West by sending hordes of Catholic settlers there. However, perhaps the most effective was the “Uncle Tom’s Cabin” of nativism, the Awful Disclosures of Maria Monk, which sold 300,000 copies in 1836.

Monk told of her alleged experiences with Catholicism, which involved forced sexual intercourse with priests and the murdering of nuns and children. Although her mother denied the legitimacy of her work, stating that Maria never belonged to the nunnery and that a brain injury her daughter received as a child could be the cause of her stories, the book was widely accepted as truth.

In 1841, the Vindicator was published by Rev. W. C. Brownlee, the leader of the New York Protestant Association. In the same year, there was growing concern in New York State that Catholics were gaining influence in schools because of the action of Archbishop John Hughes of New York.

He was seeking to obtain state aid for Catholic schools, which was interpreted both as a subversive plot against the First Amendment and as a refusal by Catholics to attend public schools and be assimilated. In 1842, the American Protestant Association was founded by 100 clergymen in Philadelphia to oppose Catholics.

This propaganda led to agitation, rioting, and mobbing. Although Catholics occasionally reacted to the nativist movement with violence, nativists instigated the greater part of those violent acts. In Boston, there were numerous riots in 1823, 1826, and 1829. In May 1832, these potentially explosive conditions produced a riot at a New York Protestant Association meeting.

Further, in 1834 a group of Catholics attacked a Baptist speaker who was addressing a Baltimore Baptist audience. On 10 August 1834 a mob of forty to fifty people gathered outside the Ursuline Convent School at Charlestown, near Boston, and burned it to the ground. Although eight people were arrested and tried, only one was sentenced to life imprisonment.

This rather lenient sentence, together with the lack of condemnation in moderate Protestant circles, shows how widespread hostility to Catholics had become. The violence continued into the following decade when, for example, thirty people were killed and hundreds injured during nativist riots in Philadelphia in 1844.

Anti-Catholicism gradually evolved into a political crusade. In 1844, James Harper founded the American Republican Party in order to break the deadlock between the Whig and Democratic Parties in New York State, and offer another approach to politics.

It allied with the Whigs, which resulted in the defeat of the Democratic Party. The American Republican Party demonstrated the political relevance of the nativist movement and paved the way for the entrance of the Know-Nothings into the national political scene as the only coherent organization to rest its political action on hostility to immigration and to Catholics.

The American Party had its origins in 1849 in New York. At first a secret society called the Order of the Star-Spangled Banner, it became a formal party in 1853, and its members were dubbed the Know-Nothings (after their refusal to answer questions about their involvement) by Horace Greeley, a famous newspaper editor. By the middle of the 1850s the party counted over a million members across the country.

At the local level, in the 1854 election, the Know-Nothings won six governorships and controlled legislatures in Massachusetts, New Hampshire, Connecticut, Rhode Island, Pennsylvania, Delaware, Maryland, Kentucky, and California, where they passed discriminatory laws against immigrants, including the first literacy tests for voting, in order to disenfranchise the Irish.

The party’s platform focused on voting rights, extending the residency period before naturalization from five to twenty-one years, and requiring the exclusion of foreigners and Catholics from public office. After the defeat of their candidate in the presidential election of 1856, the Know-Nothings were split by their inability to overcome the slavery issue.

They lost influence and were absorbed into the expanding Republican Party, formed in 1854. However, another important factor in their decline was that not all Americans opposed the arrival of new immigrants because they were much needed by industrialists, railroad builders, and other businessmen as unskilled labor willing to accept lower wages.

Exclusion or Americanization?

After the Civil War, “new immigrants” from southern and central Europe, even more numerous and alien, increased the demonological anxiety of the native-born, which led to numerous conflicts and a radical reexamination of the country’s immigration policy.

From 1880 to 1930 a total of 25 million newcomers entered the United States. The more numerous were Italians, Jews, and Slavs—totaling more than 9 million—who brought in new customs, manners, languages, and religions.

To this flow of immigration one should add the massive black migration to the North. All these groups were scattered throughout the country but they tended to flock together in big cities, especially in New York, New Jersey, Pennsylvania, and New England.

In this era of laissez-faire capitalism, nativism evolved into the fear that class conflict would destroy the social fabric of the United States. Mounting labor organization, and the importation of socialist and anarchist ideologies by immigrants, rekindled the conspiracy theories.

The violent strikes of the 1870s and 1880s were therefore seen as signs of forthcoming disaster. In this climate, the American Protective Association was organized as a secret society dedicated to eradicating “foreign despotism,” which included Catholics. One of its aims was to ban German-language instruction.

Nativism took on a special coloring in the West, where the fear was of Chinese immigrants, considered a threat to white workers because they accepted lower wages. The Workingmen’s Party led a movement for a new state constitution in California in 1878 and 1879 that included stringent discriminatory measures.

At the national level, riots and mobbing, especially in Wyoming and New York, led to mounting pressure by California and other western states on Congress to pass the nation’s first immigration restriction, which some commentators have viewed as the institutionalization of racial paranoia. In 1882, the Chinese Exclusion Act excluded the Chinese from naturalization and immigration.

More restrictions were introduced in 1892, and Chinese immigration was banned permanently in 1902. In 1906 the first English language requirements for naturalization were enacted. The U.S. government legislated gradually to close the doors by limiting Japanese immigration through the Gentlemen’s Agreement of 1907.

In the 1917 Immigration Law, Congress enacted a literacy requirement for all new immigrants and designated Asia as a “barred zone” (excepting Japan and the Philippines). The 1921 National Origins Act inaugurated the quota system, by which admissions from each European country were limited to 3 percent of each foreign-born nationality in the 1910 census.

It effectively favored northern Europeans at the expense of southern and eastern Europeans and Asians. The 1924 Johnson-Reed Act can be considered as a perfect application of nativist concerns for racial homogeneity since it confirmed that immigration quotas were based on the ethnic makeup of the U.S. population as a whole in 1920.

It was not until 1965 that racial criteria were removed from U.S. immigration legislation. An annual quota of 20,000 was awarded to each country, regardless of ethnicity, under a ceiling of 170,000. Up to 120,000 were allowed to immigrate from Western Hemisphere nations, not subject to quotas (until 1976).

Meanwhile, at the end of the nineteenth century in the wake of the Progressive movement, the muckrakers, social workers, and social reformers drew the public’s attention to the poverty, disease, and crime rates of immigrant ghettos.

Moreover, they sought to bridge the gap between newcomers and native-born Americans. The “new immigrants” were less skilled, less educated, more clannish, and slower to learn English. However, in order to cope with their new life, immigrants tended to organize into minority societies, trying to preserve as much of the group’s culture as possible.

But growing concern for national homogeneity urged many to think that a campaign to “Americanize”—meaning assimilation—was necessary. Thus the Bureau of Americanization was created to encourage employers to make English classes compulsory for their foreign-born workers.

For example, in the Ford Motor Company School, the first thing an immigrant was asked to learn to say was, “I am an American.” Most states banned schooling in other tongues; some even prohibited the study of foreign languages in the elementary grades, in the belief that public schools were the major tool for Americanization.

The global trend since World War II has been to diminish discrimination, at least by statute, and to reduce prejudice against immigrants and members of ethnic minorities. Hostility certainly lost much of its conspiracy-minded intensity, with the combined effects of the civil rights movement and the struggle by Hispanics and Native Americans for equal rights.

However, the end of racial quotas in 1965 led many Third World people to enter the United States, especially those coming from Central and South America, which alarmed many Americans and gave new targets to nativism, especially in the states where those immigrants tended to flock together. The question of bilingualism then became the central issue of nativists.

In the 1980s the “English only” movement was launched to restrict the language of government to English and encourage immigrants to learn English. Illegal immigration was another element that encouraged nativist anxieties, as encapsulated in President Ronald Reagan’s declaration in 1984 that “we have lost control of our own borders.”

Illegal immigrants were seen as a threat to native-born workers and an obstacle to unions, as they were enjoying all the advantages of living in America (schools, hospitals, welfare benefits) while escaping all the drawbacks, like taxes.

However, no legislation managed to curb the number of “undocumented” aliens on U.S. territory. In California, another upsurge of activism took place in the 1990s due to economic stagnation, rising racial tensions, and the widening gap between the rich and the poor.

Voters approved Proposition 187, which was meant to force public agencies (schools, police, and social and health services) to verify the immigration status of suspected undocumented aliens and report them to the immigration authorities. The initiative was judged unconstitutional. However, a direct consequence was the enactment by Congress of legislation toughening immigration enforcement laws.

The Briefest of Empires

The empire was created militarily and had to be enforced militarily. It survived the failures of Napoleon’s appointments only as long as Napoleon kept winning wars to support it. Once Napoleon failed, Europe was swiftly able to eject him and many of his puppet leaders, although the administrations often remained intact.

Historians have debated whether the empire could have lasted and whether Napoleon’s conquests, if allowed to endure, would have created the unified Europe still dreamt of by many. Some historians have concluded that Napoleon’s empire was a form of continental colonialism that could not have lasted. But in the aftermath, as Europe adapted, many of the structures Napoleon put in place survived. Of course, historians debate exactly what and how much, but new, modern administrations could be found all over Europe. The empire created, in part, more bureaucratic states, better access to the administration for the bourgeoisie, legal codes, limits on the aristocracy and church, better tax models for the state, and religious toleration with secular control over church lands and roles.