
Friday, December 19, 2014

Robot Workers Stoke Human Fears: how we can still win

by John MacBeath Watkins

Finally, a headline that made me feel like I was in the 21st century appeared on the print edition of Tuesday's New York Times:

Rise of Robot Workforce Stokes Human Fears

The e-version of the Times, not restricted by the character count, had a longer headline that did a better job of explaining the fears of human workers as artificial intelligence makes machines more capable, but the headline in the print edition is what a time traveler from 1950 might have expected to see above the fold on the Dec. 16, 2014 New York Times.

In some ways, the future has been late to arrive, but as Louis Althusser noted, "l'avenir dure longtemps" (usually translated as "the future lasts forever," the name of his memoir.) There is plenty of time for the future to happen, and it will arrive in unexpected ways.

Let me first tell you my solution to the automation problem, then I will explain what it means.

We have been managing our economy for a "natural" rate of unemployment and an inflation rate of 2%, with inflation defined by the core consumer price index number. We have no real empirical knowledge of what the natural rate of unemployment is, nor have we any reason to assume 2% is the proper level of CPI growth. We should be managing the economy by some objective metric.

Even the core CPI is subject to external shocks. The inflation metric we should be using is wage inflation, and we should be managing it to match productivity growth. In fact, since we have not done so for a long time, we should be managing for wage inflation higher than that until it catches up to the pre-1980 trend. This would ensure that wages neither exceed the structural capacity of the economy in the long run nor trail so far behind it.
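The rule proposed here can be stated as a simple formula: target wage inflation equal to productivity growth, plus a temporary catch-up term while wages remain below the pre-1980 trend. Here is a minimal sketch of that arithmetic; the function name and all the numbers are illustrative assumptions, not actual data or an official policy rule:

```python
def wage_inflation_target(productivity_growth, wage_gap, catchup_years=10):
    """Target wage inflation = productivity growth plus a catch-up term
    that closes the accumulated wage/productivity gap over a chosen
    number of years. All inputs are fractional annual rates."""
    catchup = wage_gap / catchup_years
    return productivity_growth + catchup

# Illustrative numbers only: 1.5% productivity growth, wages 20%
# below the pre-1980 trend, with the gap closed over a decade.
target = wage_inflation_target(0.015, 0.20, catchup_years=10)
print(f"{target:.1%}")
```

Once the gap is closed, the catch-up term goes to zero and the target simply tracks productivity growth, which is the long-run condition the paragraph above describes.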

One of the things that makes capitalism different from previous systems of market economies is that in most of them, the workers owned their tools, and land and labor were the keys to making a living (which is why if you wanted to be wealthy, you needed to get land or a mine.) The total amount of wealth was presumed to be finite. Capitalism is a system where the tools usually don't belong to the artisan, they belong to those who accumulate and invest capital. And investing capital in technology is often a way to increase wealth, meaning that capitalism isn't a zero-sum game.

The transition from a traditional market economy to a capitalist one is not great for those who used to be the skilled artisans, which is why the invention of mechanized weaving sparked the Luddite uprisings of the early 19th century. The Luddites recognized that while mechanized weaving made cloth cheaper, and enabled most people to buy more clothing and make it a smaller part of their budget, it also meant that the weavers, once high-skilled and well-paid artisans, were surplus to requirements.

Eventually, we found things for people to do. The key was to have enough economic activity that the increased productivity does not permanently unemploy workers whose careers have been disrupted by more productive technology.

Economists used to think that there was an inverse tradeoff between inflation and unemployment, a relationship called the Phillips Curve. This was displaced in the late 1960s and 1970s by a new concept, the natural rate of unemployment, championed by Milton Friedman and Edmund Phelps.

Friedman and Phelps argued that for there to be a permanent increase in employment, something would have to change in the real economy. Essentially, they argued, the Phillips Curve relied on an illusion, and when inflation expectations for wages and prices caught up with reality, this would leave unemployment unchanged.

The problem with managing the economy for the natural rate of unemployment is, there is no established way to know what the natural rate of unemployment is. Friedman and Phelps, it would appear, moved us from managing the economy based on an illusion to managing it based on a guess.

How's that working out for us?
United States Labor Force Participation Rate by gender 1948-2011. Men are represented in light blue, women in pink, and the total in black.

Labor force participation peaked in about 1998, which is about the last time we had much in the way of wage inflation, but really, labor force participation has been stagnant or declining since about 1990. That's because wages, like any price, are a signal, in this case, a signal to come on out and get a job. Take a look at the comparison between productivity growth and average real wages:

1990 is about the time real wages for the average worker fell below the 1970 wage level, and it's been there since. Nevertheless, female participation in the labor force has increased, and women's wages, while they have not caught up to men's, have at least been increasing.

Here is another set of data that most people have not incorporated into their analysis:


While working men's wages fell a bit from 1980 to 2012, men found it harder to get a job. As a result, the decline in income among all men has been much worse than the situation among working men.

Much of the decline in male incomes has been at the median and lower end of the distribution of education:

Data: Hamilton Project 

Dylan Matthews notes about the above chart (and I recommend following the link and reading his full essay):
High school dropouts' earnings have fallen 66 percent since 1969, and people with some college - the median level of education in the US - have seen earnings fall by a third.

Now, there's a good news/bad news situation here. Men's labor force participation has been falling since the 1950s, even though through the 1950s and 1960s their wages were increasing rapidly, so what's causing their decline in labor force participation isn't the integration of women into the work force. The bad news is that their wage gains started falling about the time women's labor force participation increased.

I'm not convinced this is anything more than post hoc, ergo propter hoc. The thing is, at about the same time, we had a major recession, then started managing the economy for the unknowable natural rate of unemployment. Median male income started declining about the time productivity increases and wage increases became de-linked. That's about the time we adopted two of Milton Friedman's big ideas, managing companies for shareholder value and managing the economy for the "natural" rate of unemployment. I strongly suspect that these two ideas played a strong role in removing the link between productivity increases and wage increases.

Since incomes have fallen most for the least educated men, the indication is that the kinds of work available no longer play to male strengths, such as upper-body strength and a willingness to take jobs involving heavy lifting, risk, and inclement weather. That would be one possible reason the male workforce participation rate has been falling since about 1950.

Teachers used to say, "You want to dig ditches for a living?" but now, to do that you have to be a heavy equipment operator, and far fewer of those are needed than ditch-digging humans were for the same size ditch. The increase in productivity is in an industry where the demand for ditches is not particularly price elastic.

While many people might say, "that's a pretty good price on shirts, I think I'll get two," few people look at a contractor's rates and say to themselves, "well, I don't really need a new ditch, but at these prices..."

More and more of those risky, physical jobs in the open air have been mechanized. Now, artificial intelligence offers the opportunity to replace humans at inside work with no heavy lifting, which will affect both genders.

This is also an opportunity for those who own capital to take more of the gains from productivity. The question is, why should they? Those weavers who became Luddites were employable, and had they had access to new skills and a hot labor market, they might not have minded so much losing their work as artisans. While the early part of the industrial revolution created great misery and inequality as the capitalists took most of the money, eventually, we figured out how to redistribute the wealth and start the great era of the middle class.

We need to give people access to new skills. Right now, higher education and the student loan program are a mess. How we expect to get a skilled labor force with any spending power out of that is one of the great mysteries of our time. The acquisition of new skills is one of the great levelers for a society, and we've made it tremendously expensive and difficult.

We need to manage for a higher level of employment. The fact that so much of our productivity gains have gone to the top instead of the working class indicates that we are managing for such low levels of employment that there is no pressure for higher real wages. Employers have been able to manage for stagnant real wages across the board, and plummeting wages for the class of people -- white males -- who had been paid best.

Managing for a slack labor market has certain advantages for the capital-owning class, which includes me to a small extent but is concentrated at the top end of American incomes. Companies can spend their money on dividends and stock buybacks instead of workers' pay. The speculative natural rate of unemployment isn't particularly scientific, but it has provided a rationale for managing the economy for sufficient slack in labor demand to suppress the rise of real wages. The NRU has continued in use not because it is economically useful or provable, but because it is politically useful.

Ah, you say, but if we manage for wage inflation related to productivity growth, what about asset bubbles? I submit that those have as much to do with changes in banking as they do with monetary policy. The real estate bubble accompanied innovations in the banking industry that allowed the sale of securitized mortgages, making far more money available than could be invested by savings and loans making mortgages based on deposits. The great age of corporate raiders was also the great age of junk bonds. The tech bubble actually happened at the same time as some wage inflation.

Asset bubbles may well have more to do with the deregulation of the banking system and the rise of the shadow banking system than with monetary policy, and should be dealt with on that basis.

Wednesday, December 17, 2014

Surge Pricing and Cindi of the Shattered Shoes

by John MacBeath Watkins

There were three parking places in front of Mad Merlin's Joke, Costume, and Magic Emporium, and one was a 30-minute load zone. That is where Cindi of the Shattered Shoes was parked, and she was not done shopping.

She had come out to see if any other spaces had opened up, knowing that she'd been parked for nearly half an hour and her car was about to turn into a pumpkin. She found Jack the Blighter fuming, because the cars parked in front of and behind his Bulgemobile were too close for him to get out.

“Don't worry,” said Cindi, wincing from the pain in her feet that had been caused by a glass footwear-related accident, “I'll move out of my spot and take yours.”

A crafty look came into Jack's eyes.

“Oh, that's just what you'd like me to do,” Jack said. “If the spot's that valuable, I'm keeping it.”

The magic sigil the parking fairy had chalked on Cindi's left rear tire was beginning to glow.

“But the spot's no use to you, because you're done shopping,” Cindi objected.

“Yeah, well, it's my spot, and I'm not moving,” the Blighter said.

Two people happened along just then. One was Prince Charlie, Earl of Studly, a region famed for its bull semen, the other was a parking fairy.

Maybe it was just the magic blowback from the parking fairy's wand as she turned Cindi's car into a pumpkin, but Studly was so charmed by Cindi that he immediately fell in love.

“Please,” Studly pleaded, “turn this young, beautiful and fecund young woman's car back into its original state, I wish to get busy with her and make some little princelings and princesslings, that my House may continue to rule, and I think it would really impress her if I got her car back.”

“Can't turn it back until the fine is paid,” the parking fairy said. “It's one magic pea.”

“I have a magic pea,” Jack the Blighter confided, confidingly. “I'll sell it to you for half your kingdom.”

“But the going rate for a magic pea is one farm,” Studly objected. “You're gouging.”

“Clearly, you wish to surge within my lady's garden of delight,” the Blighter said. “Thus, surge pricing applies.”

“I will accept your hand and other projecting parts of you in marriage if you can get my car back to its original state,” Cindi declared, declaratively. The pain in her feet was becoming unbearable, and she really just wanted to sit down in her car. Plus, the guy still had half a kingdom.

“Oh, all right,” said the smitten prince, smitingly for some reason. With a flourish, he signed over half his kingdom to Jack the Blighter.

When he took the pea, Prince Charlie, Earl of Studly, discovered it was hot. As he juggled it in his hands, wincing, Jack winked at him, and Studly remembered that Jacks tend to steal things.

He quickly dropped the hot pea into the parking fairy's proffered purse. The fairy promptly turned to the pumpkin and waved her wand, turning it into an acorn.

“It was a Honda Accord, not an acorn,” Cindi objected, objectively.

“My remit was to return the pumpkin to its original state,” the parking fairy said. “When you bought the car, did you carefully read what it said on the boot?” (Like many fairies, the parking fairy had attended an English boarding school.)

“But the lady who sold it to me was so nice!” wailed Cindi. “She said she only drove it to black mass every full moon!”

“Bent old biddy with a wart on her nose and sort of Goth taste in wardrobe?” the fairy asked, questioningly.

“Well, yes,” said Cindi, agreeably.

“You got taken,” the fairy announced. “I'm only allowed to return the vehicle to its original state.”

“I'll bet you'd turn it into a car for another magic pea,” suggested Jack, leering at the prince suggestively.

The prince didn't care to trade off the other half of his kingdom, so he decided to try and persuade the fairy instead, and started moving toward her, raising his hands pleadingly.

The fairy whipped out her wand and shouted “stop or I'll shoot,” while firing her wand. Prince Charlie, Earl of Studly, turned into a frog.

“You saw him!” the fairy said, “He came at me!”

“He had his hands up,” Cindi replied. I'm afraid that replyingly isn't a word recognized by my spellcheck, so I can't tell you how she replied.

“Change him back,” Cindi suggested, suggestively.

“He had it coming,” the fairy said. “The grand jury will clear me.”

Jack sidled up to the frog and offered to sell him a magic pea for the rest of his kingdom.

“No, you don't!” Cindi snapped, snappishly. She was a practical woman in all matters not related to the choice of footwear. “He still has half a kingdom, his offer of marriage is still valid, and under recently passed marriage equality laws, I can marry the frog I love. He's my prince charming.”

“Charlie,” the frog croaked, correctingly, if that's really a word.

Saturday, December 13, 2014

Our echoes roll from soul to soul

by John MacBeath Watkins

One of my wooden boat friends, Rick, in Australia, is dying, and doing so with dignity and fortitude. I posted this for him:

from The Princess: The Splendour Falls on Castle Walls

By Alfred, Lord Tennyson
The splendour falls on castle walls
And snowy summits old in story:
The long light shakes across the lakes,
And the wild cataract leaps in glory.
Blow, bugle, blow, set the wild echoes flying,
Blow, bugle; answer, echoes, dying, dying, dying.

O hark, O hear! how thin and clear,
And thinner, clearer, farther going!
O sweet and far from cliff and scar
The horns of Elfland faintly blowing!
Blow, let us hear the purple glens replying:
Blow, bugle; answer, echoes, dying, dying, dying.

O love, they die in yon rich sky,
They faint on hill or field or river:
Our echoes roll from soul to soul,
And grow for ever and for ever.
Blow, bugle, blow, set the wild echoes flying,
And answer, echoes, answer, dying, dying, dying.

I'm particularly fond of that line about how our echoes roll from soul to soul. Rick, you are a part of all of us now, and what we've learned from you will live on in us, and I hope will allow us to die with the dignity and fortitude you display.

Thursday, December 4, 2014

Tamir Rice and the depraved heart

by John MacBeath Watkins

When I heard that 12-year-old Tamir Rice was shot by Cleveland police after waving a realistic toy gun around, I thought, what a tragedy.  Wouldn't have happened to me as a 12-year-old white kid if I'd been doing something that stupid when I was growing up in Maine, but it's the sort of thing that happens in cities.

Then I heard that the two officers at the scene did not render first aid to the kid for four minutes, at which point an FBI agent who happened to be in the neighborhood came along and rendered assistance.

In my opinion, that's a crime.

There are two possible interpretations of this. One is the doofus theory, that the cops were so freaked out by one of them having shot a kid, they didn't know what to do. If that's the case, they were negligent, and so useless in an emergency that I'd say they have no business on a police force.

The other possibility is more sinister and I'd say less likely. Sometimes, negligence is intended to cause harm or death. These are called depraved heart crimes, or sometimes depraved indifference. If the officers at the scene were convinced that they had screwed up to the point where it could cost them their badges, it might have been to their advantage if the kid died and could not testify.

In support of the doofus theory, we have information provided by the previous employer of the cop who shot the kid. Timothy Loehmann, the 26-year-old cop who did the shooting, had previously worked for a police department in a suburb of Cleveland called Independence. The city of Independence has taken the unusual step of releasing a letter recommending the dismissal of Loehmann.


A Nov. 29, 2012 letter contained in Tim Loehmann's personnel file from the Independence Police Department says that during firearms qualification training he was "distracted" and "weepy."
"He could not follow simple directions, could not communicate clear thoughts nor recollections, and his handgun performance was dismal," according to the letter written by Deputy Chief Jim Polak of the Independence police.
The letter recommended that the department part ways with Loehmann, who went on to become a police officer with the Cleveland Division of Police.
"I do not believe time, nor training, will be able to change or correct the deficiencies," Polak said.
So, maybe the guy had issues and should never have been hired. That doesn't explain why his partner, 46-year-old Frank Garmback, also failed to render first aid. Perhaps there's more to the story, but it's hard to think of a good reason for this.

The thing that makes both men look bad, and made me think of the phrase "depraved indifference," is the story they told before they knew there was video of the incident.

From the Daily Kos:

Not knowing that a camera recorded the entire incident, the police told what appear to be at least five lies about what happened.
1. Police said that Tamir Rice was seated at a table with other people.
2. Police said that as they pulled up, they saw Tamir Rice grab the gun and put it in his waistband.
3. Police said they got out of the car and told Tamir Rice three times to put his hands up but he refused.
4. Police said that Tamir Rice then reached into his waistband and pulled out the gun, and was then shot and killed by Officer Timothy Loehmann.
5. Timothy Loehmann was described as a rookie.

1. Tamir Rice was not seated at a table with other people.
2. Tamir Rice does not appear to grab the gun and put it in his waistband.
3. Police shot and killed Tamir in less than two seconds and could not have told him to put his hands up three times.
4. Tamir Rice absolutely does not pull the air gun out of his waistband and brandish it in any way. This fact is so crucial.
5. Timothy Loehmann was not a rookie, but had been an officer for over two years.
If both officers told this story and it didn't agree with the video of the event, that makes it look to me like they colluded on a story that would exonerate Loehmann. But why would Garmback do that?

Anyone who has spent time around cops knows that one of the most important traits a cop can have to survive in the work they do and support those they work with is loyalty, and especially loyalty to your partner.

I don't know if this is a case of misplaced loyalty or if the two men talked about what happened and became convinced that their erroneous account was true. If the latter is the case, it supports the doofus theory and indicates neither man is a reliable witness to a crime, and should not, in my humble opinion, be cops. If the statements of fact that were not true were a conspiracy to clear Loehmann, that would be another reason both men should not be cops.

Even if that were the case, it would not prove that letting the kid lie there bleeding for four minutes was a depraved heart crime. I'm not a lawyer, but it seems to me that you'd have to prove that they were letting him bleed out to eliminate the only other witness to the shooting.

That's a high bar to clear, as it should be, because I find it difficult to believe anyone would do that. But then, I find it difficult to believe anyone kills people for the flimsy reasons they do.

I doubt very much these men will be held criminally liable for failing to render first aid for those four minutes.

But unless I hear a damned good explanation for that failure, I'll always wonder: Did they have depraved hearts, or were they doofuses, or is there something I'm missing here?

Wednesday, November 19, 2014

Is America really so violent?

by John MacBeath Watkins

People who compare the United States to European countries say we have an extraordinarily high murder rate. But is that the appropriate comparison?

For a country located in the Americas, the United States has a relatively low murder rate. Canada and Chile are the exceptions. I suspect the issue is cultural. One thing that has happened with colonization is that some cultural aspects of the mother country are preserved from the time of colonization. I would look to the murder rate in the mother country at the time the country was colonized to explain a high murder rate in that culture today.

The murder rate in Europe in the middle ages was extremely high, and dropped quite a bit during the time the Americas were being settled. Steven Pinker, in his book The Better Angels of Our Nature, states that murder rates were about 30 times higher in the middle ages than they are now. If my theory is correct, the earlier a country was settled, the more likely it should be to have a high murder rate.

Murder rate per 100,000 inhabitants in 2012.

This seems to go against the fact that Chile has a low murder rate, even though the conquest of Chile started in 1540. One answer to this is that the low murder rate in Chile reflects the relatively strong state there. A strong state tends to reduce the murder rate because it's not good for the state to have taxpayers killing each other, any more than it helps a farmer to have his livestock fighting.

The early Chilean state was small and homogenous, prevented from expanding northward by the desert or southward by the unconquered Mapuche Indians. The conquest of Chile was gradual, and as a consequence of failing to conquer the Mapuche, Chile relied more than most Spanish colonies on European settlers. In fact, parts of the country attracted German settlers in the mid-19th century. Much of the country's expansion occurred after it declared independence from Spain in 1818, and with many immigrants arriving after that, the country could be expected to be culturally closer to modern Europe than nations settled earlier.

One of the uses Britain made of its American colonies was as a place to transport criminals. Once transportation to America as a punishment became impossible, Australia and Canada began to absorb Britain's malcontents. And whereas the French had chosen mainly to trade with the Indians and send only people they could trust to the new world, the British sent people pushed off the land by the Inclosure Acts, criminals and pretty much anyone they felt they were well shed of. As a consequence, the British culture imported to Canada was that of the 19th century, while the British culture imported to America was that of the 17th century.

Contrast this to Venezuela, a country where Columbus actually landed, which was colonized to a great extent in the 16th century. We find that it has an intentional homicide rate of 53.7 per 100,000 annually, in contrast to the 3.1 of Chile or the 1.6 of Canada, and the United States of America turns out to be one of the least dangerous countries in the new world with a murder rate of 4.7 per 100,000 (all figures are for 2012.)
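The comparison is easy to make concrete. A quick sketch ranking the 2012 intentional-homicide rates quoted above, using Canada as the baseline (the figures are the ones given in the text, nothing more):

```python
# 2012 intentional homicide rates per 100,000 inhabitants,
# as quoted in the paragraph above.
rates_2012 = {
    "Venezuela": 53.7,
    "United States": 4.7,
    "Chile": 3.1,
    "Canada": 1.6,
}

# Rank from most to least violent and show each country's rate
# as a multiple of Canada's.
baseline = rates_2012["Canada"]
for country, rate in sorted(rates_2012.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {rate} per 100,000 ({rate / baseline:.1f}x Canada)")
```

Even within this short list the spread is enormous: Venezuela's rate works out to more than thirty times Canada's, while the United States sits much closer to the low end of the new world range.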

So, if the culture of violence in new world countries reflects the timing of their formative European colonization, what made European murder rates fall so much?

For one thing, violence became harder to get away with. As European states became more centralized, policing got better, and it became harder to walk away from a murder and start over elsewhere. In addition, as states became more centralized, warfare within a country became less practical -- dukes who might have tried to expand their duchy found that they were restrained by the increasing power of kings.

Another factor was the decline of subsistence farming and the increase in trade and industry. The key to wealth and power became less how many farms you could subjugate by the sword, and more the trade and industry you could dominate. Power moved from men with horses and armor to men with ledgers and gold.

While many a duke had risen to his post by violence (a duke was originally a war leader) few merchant princes found violence the path to influence and wealth. Because commerce is not a zero sum game, cooperation was a better path.

The shift from agrarian empires to mercantilist empires was a shift from warring tribes to warring nations, in which the violent domination of resources and trade routes led to greater national wealth. This was the great era of colonization. The shift from mercantilist empires to capitalism put further emphasis on cooperation, and undermined the colonial empires. Modern global capital creates stateless income that undermines colonial empires and makes wars less rewarding. Because the capital doesn't enrich the state that spends money on wars, but goes where it won't be taxed, much of the feedback mechanism that made empires possible is gone.

So, it's easy enough to see why violence has become less common in Europe. From the top down, it has become less rewarding and harder to get away with. The question remains, why did their colonies preserve the barbaric attitudes of an earlier age, and what can be done to move them beyond that?

Friday, November 14, 2014

Anomie and the search for meaning

by John MacBeath Watkins

The French have a word for it: Anomie. No norms. It is a condition when people find themselves so disconnected from social norms that they cannot find their place in the world. Émile Durkheim used the term in his book, Suicide, published in 1897.

His theory was that a rapid change in the values and standards of society would lead to a feeling of alienation and purposelessness. Picture the situation: society is changing rapidly, and while it may try to prepare you for your place in it, that place is no longer there by the time you are trained for it. Your entire life plan, the existence you have spent your childhood and adolescence preparing for, is nowhere to be found.

Are you a failure? No, worse. There was no path to a life of honorable labor, no place for you in the world.

You cannot even fail, because all that you have prepared for is simply not there. You were groomed to play a part in a pantomime that has been cancelled. And here you are, alone on the stage in a parody of makeup for a part no one cares to see you play. How meaningful is your life, then? If society were a dictionary, you would not even be a word, just an indecipherable squiggle in the margin.

That is anomie, diagnosed at the end of the 19th century, discussed to death in the 20th century, a wallflower at the party in the early 21st century.

As you might expect from the title of Durkheim's book, suicide was one common response to this condition. Perhaps it still is. We don't talk about anomie much anymore. People still kill themselves, people still feel disconnected from social norms, but that 19th century term is less common than it once was. It's a shame, because the term explains a lot.

Much of what makes us human is in our interaction with others. It is in the social realm that we display our sanity or madness, and our very humanity. That is why solitary confinement is such a severe punishment, one that can even produce psychological effects such as hallucinations, paranoia and obsessive thoughts. We are meant to be social creatures, incomplete without interaction with others.

Once, society changed slowly, and when we spoke of the Old Kingdom, the Middle Kingdom, and the New Kingdom, we meant social orders that differed little and lasted a thousand years each. Then it was possible for generation following generation to fall easily into their social roles, and we can suppose anomie was not a problem. Those days ended in the Axial Age, which we discussed in this post.

When the world started changing too rapidly for an entire society's structure to adapt new places for its members, individuals had to find their own places. That may seem hard enough, but when they invented their new positions, they had no norms established for the new ways of life they were inventing. They needed guidance, and they got it in a great age of prophecy. Across Europe, the Middle East, and Asia, prophets told people that they should be compassionate, that they should do unto others as they would have done unto them. And that was enough, for two or three millennia. People could think for themselves, and still think about others, with the guidance of the prophets.

And then, the world started changing faster, and faster, and faster. The feeling of disconnection from social norms, social roles, spread wider and wider. Some felt the change, and said “God is dead.” Some felt the change, and said, “God, save me!” and started churches dedicated to preventing change. Some felt the change, and the loneliness, and the pain, and became angry, and said, “God, I will kill those who caused this!” and became terrorists. And some, strangely enough, said, “God is dead. I bet we can build a better one,” and started dreaming of an all-knowing computer.

Do you want to know how they felt? Do you know who's to blame? Look in a mirror. No, seriously, that's one way to study the problem. Psychologists have people look in a mirror in order to get them to focus on themselves, in order to study one of the central problems of psychotherapy.

People come to see a psychologist very often because they are depressed. The psychologist needs to assess the problem, so has the client talk about themselves.

This self-focus causes the people talking about themselves to become sadder if they perform this self-focus in private, or to experience social anxiety if they do it in public. In essence, they experience a heightened sense of anomie, of disassociation from the warmth and comfort of human contact, because they are focused on themselves.

There is the problem, then. To be human requires participation in human society, and rapid social change can cast us adrift, maroon us in an island of the self. And as we try to understand ourselves, we focus on ourselves, and feel more isolated and alone as a consequence.

The shared hallucinations of our social constructs are meaningless if we are alone. If we are only animals, eating, sleeping, reproducing, we are only the appetites our genes have programmed us to have. If we are human, we live in a world invisible to most animals, a world of language and symbol, in which what we pass on to others may not even be physical matter, such as genes. It may be our ideas, ideals, songs and gods. It may be the world of meaning, the most human world of all.

However out of place we may feel, however useless our social skills and unattainable our aspirations, what makes us human is the people who have shaped us. We are never alone, because they are a part of us, and we are a part of those whose lives we've touched. Even the worst families teach their children to be human. What those children reject from those who have shaped them sets the boundaries of their souls; what they accept gives those souls their content.

Unlike most animals, we can cooperate with one another even without family ties. This is because in that ethereal world of symbolic thought, we can pass on a part of who we are to people genetically unrelated to us. Our thoughts are at least as fecund as our bodies, and we lust for the sort of social intercourse that will allow us to transmit our wisdom to each other and build up something greater than ourselves.

Anomie is a symptom of the failure to do this, a sign that we must find a way to reach one another and find comfortable niches for ourselves in the great body of civilization.

Friday, November 7, 2014

How democracy ends: The Sejm-Weimar problem

by John MacBeath Watkins

The Polish-Lithuanian Commonwealth was once a force to be reckoned with, a country more powerful than Russia and far bigger than most of the countries of Europe. What happened to that empire?

Well, the commonwealth was one of the few countries in Europe that had a really influential parliament. It was called the Sejm, and it operated as a legislative body starting in 1493, becoming the legislature of the Polish-Lithuanian Commonwealth when that state was founded in 1569. It was, like many republics prior to the modern era, not particularly democratic. Its members were elected indirectly, through regional bodies, by the nobility, which amounted to about 10% of the population.

For much of its existence, any member could nullify legislation that had just passed and end the session by shouting "Nie pozwalam!" (I do not allow.) This is known as a liberum veto.

Harvard political scientist Grzegorz Ekiert argued that:
The principle of the liberum veto preserved the feudal features of Poland's political system, weakened the role of the monarchy, led to anarchy in political life, and contributed to the economic and political decline of the Polish state. Such a situation made the country vulnerable to foreign invasions and ultimately led to its collapse.
For one thing, foreign regimes discovered they could bribe legislators to use their veto, thereby paralyzing the government. This led to the partition of the empire and foreign occupation.

In Germany, the Weimar Republic had a rough start, but after the hyperinflation got tamped down, there were some very good years -- until the crash of 1929. The American banks that were helping Germany pay its reparations for WW I had to call in their loans, unemployment went up just as it did in other countries, and the people responded by throwing the bums out. Unfortunately, the bums they threw in tended to be people who didn't believe in democracy, like the German National People's Party, the Communists, and the Nazis.

Unable to form a majority coalition, Heinrich Brüning formed a minority government, but was often forced to rule by emergency decree, because the Reichstag could not pass legislation. Unfortunately, his policies for dealing with the Depression were exactly wrong -- he tightened credit and rolled back wage increases, making him unpopular with the electorate and the Reichstag.

Since his decrees were actually ruining the country, Brüning opened the door for the election of populists like the Nationalist Party and the Nazis. Even business interests turned against him, though it must be admitted that some started financing Hitler long before Brüning became chancellor.

In each case, democracy failed because it could not govern. Francis Fukuyama, in Political Order and Political Decay, argues that American political order is decaying because it has become too easy for special interests to veto decisions. This, he claims, leads to a government unable to function well enough to address the nation's challenges, which undermines the people's faith in its ability to address their problems, which leads them to deny it the resources to address their problems, which leads to...well, you get the idea.

The destruction of the Polish Commonwealth and the descent of Germany into the totalitarian hell of Nazi dictatorship had this in common -- democratic, representative government ceased to function. When democracy can't address the people's problems, they will turn to a strongman or watch things get worse and worse.

So it is with real dread that I read this:
To prevent Obama from becoming the hero who fixed Washington, McConnell decided to break it. And it worked. Six years into the affair, we now take it for granted that nothing will pass on a bipartisan basis, no appointment will go through smoothly, and everything the administration tries to get done will take the form of controversial use of executive power.
Sound familiar? This is the way democracy is destroyed. As long as politicians find they can increase their clout by making sure government does not address people's problems, without taking the blame for how things turn out as a result, our democratic system is in danger.

Wednesday, October 29, 2014

How to start a dark age and what myths should do for you

by John MacBeath Watkins

The term "dark ages" is not much used anymore, but it still conjures up notions of an age of ignorance following the fall of a great civilization.

It was first applied to the entire Middle Ages in about 1330 by Petrarch. Light and darkness had symbolized good and evil, but Petrarch made them symbols of knowledge and ignorance. He saw his own time as one of darkness, and aspired to a time of greater light.

That time of light arrived as the Renaissance some time later, the dawning of a time when people admired knowledge and it became more widespread. Then came a time when archaeology started digging up the "dark ages" and found a great deal had been known and accomplished in the Middle Ages, so now we seldom use the term for anything but the early Middle Ages.

It's easy to put a starting date to the dark ages. Emperor Justinian closed the pagan and Jewish schools in 529 AD, and the dark ages began.

The decree, as translated by James Hannam, reads as follows:
We wish to widen the law once made by us and by our father of blessed memory against all remaining heresies (we call heresies those faiths which hold and believe things otherwise than the catholic and apostolic orthodox church), so that it ought to apply not only to them but also to Samaritans [Jews] and pagans. Thus, since they have had such an ill effect, they should have no influence nor enjoy any dignity, nor acting as teachers of any subjects, should they drag the minds of the simple to their errors and, in this way, turn the more ignorant of them against the pure and true orthodox faith; so we permit only those who are of the orthodox faith to teach and accept a public stipend. 
Justinian seems mainly to have aimed this at the Athenian Academy, which traced its (sometimes interrupted) existence back to its founding by Plato in the early 4th century BCE, but he also closed Jewish schools and schools run by those judged to be heretics.

In so doing, he centralized power over what was deemed to be true. The decree made it illegal to teach things that were contrary to the teachings of the "catholic and apostolic orthodox church."

There were Greek philosophers who had figured out not only that the earth was round, but had calculated pretty accurately its circumference. They knew that the rotation of the earth explained the sequence of day and night. Justinian didn't make it a crime for the great pagan scholars of his age to write and publish -- that came later -- but he shut down the Academy, leaving the scholars to make their own way.

Hannam is skeptical about the impact of this action. Many pagan documents survived, and were even taught in Christian academies.

But the schools in the Eastern Roman Empire were survivors after the fall of the Western Roman Empire in 476. Justinian was the last of the Latin-speaking emperors of the Eastern Roman Empire. He sought to reconquer the territory that had been the Western Roman Empire, but failed. As the empire's grip over Europe failed, the political institutions that had united it failed with it, and the only pan-European institution remaining was the Church. It became the dominant force in the preservation of knowledge and the maintenance of teaching institutions and traditions. And it demanded allegiance to what the Church believed.

Some scholars and some texts made their way to Persia, and with the rise of Islam, schools that remained in Alexandria and Cairo fell into Muslim hands in the 7th century AD. Thus began the path to the golden age of Muslim science and philosophy.

The golden age of Muslim science and philosophy spanned from 750 AD to about 1100 AD. What happened then?

The Incoherence of the Philosophers, that's what. The second-most influential Muslim cleric (after Muhammad) was a scholar named Abu Hamid Al-Ghazali, who wrote a book of that title, published in the late 11th century. He argued that those Muslim scholars who had based their works on Plato and Aristotle were wrong -- essentially, heretical. The spread of his thought led to religious institutions that taught that human reason by itself cannot establish truth. Although Al-Ghazali himself had nothing against science, this in effect meant that if you really wanted to establish truth, you didn't go to a scientist or a philosopher who had devoted his life and efforts to learning about the thing in question. Instead, the final arbiter of truth would be a cleric who specialized in the Koran.

This led to a decay of Muslim science and philosophy. Some would say, it led to a dark age for their civilization.

This seems to be the way to cause a dark age: You simply give religion authority over establishing what is true of the physical world.

Religion is in the business of delivering eternal verities, not of discovering new things. In fact, in such celebrated cases of the discovery of new things as Galileo's astronomy or Darwin's Origin of Species, religion has fought against new knowledge of how the universe works.

Joseph Campbell, in Myths to Live By, wrote that religion or myth (the difference seems to be that myths are religious beliefs no longer in use) serves four functions:

One, "to waken and maintain in the individual a sense of awe and gratitude in relation to the mystery dimension of the universe..."

Two, "to offer an image of the universe that will be in accord with the knowledge of the time..."

Three, "to validate, support, and imprint the norms of a given, specific moral order, that, namely, of the society in which the individual is to live."

Four, "to guide him, stage by stage, in health, strength, and harmony of spirit, through the whole foreseeable course of a useful life."

Can a religion that fails in the second function succeed in the other three? I doubt very much it can, because a failure in one area undermines faith in the truth of sacred knowledge in all the others. How could a church that taught the earth was flat have any authority after we had photographed the earth from the moon?

But the Catholic Church did not remove Galileo's books teaching heliocentrism from its Index of Forbidden Books until 1758, and in 1992 the Pope announced that the church accepted that the earth moves around the sun. I can find no indication, however, that the Inquisition's verdict against Galileo has been rescinded. The committee Pope John Paul II appointed in 1979 had, by 1992, concluded that the Inquisition had acted properly by the standards of its day, although Galileo was right about the sun and earth.

So, that's all right. Retard intellectual progress by a century or so, and it's all in good fun. In 2008, Pope Benedict XVI cancelled an appearance at La Sapienza University because some students and professors sent him a letter protesting the Pope's expressed views on Galileo. He was probably thinking, "why you talkin' 'bout old stuff?"

It was the notion that there had been a dark age that led people to call the blossoming of knowledge and science the Enlightenment.

The Counter-Enlightenment, which started not long after the Church took Galileo's books off the Index of Forbidden Books, has argued that the Enlightenment undermines religion and the political and social order. This has, in fact, been the basic stance of conservatism since at least Edmund Burke. The term "Counter-Enlightenment," as I'm using it here, does not refer to a single coherent movement with identifiable leaders, but to a wide range of groups and individuals who have argued against the goal of constant progress toward new knowledge and a more rational society espoused by the great Enlightenment thinkers.

They are probably right in arguing that the Enlightenment has undermined religion and the existing social order. After all, the Inquisition is a shadow of its former self, the church has had to repeatedly retreat on who is listed on the Index of Forbidden Books, and the most recent Pope has finally said that the beliefs of the Church do not conflict with the big bang theory about the origins of the universe or Darwin's ideas about the origin of species. It would be better if the church had not involved itself in such matters in the first place, but if it must make pronouncements about the nature of the physical world, it will have to change its tune when our knowledge changes or be undermined by new knowledge.

We are still fighting this battle. Zealots want their religion's version of the origin of species taught in public schools (they originated as God made them), and moral notions, such as whether it is better to condemn homosexuals or accept them, are being fought out as the culture changes. A church that has failed to distinguish between its core beliefs and issues that seem less religious than social must change, or fail the test of providing a world view in harmony with the knowledge of the society to which it offers spiritual guidance.

The Catholic Church is a handy way to talk about this, precisely because it is so well organized. But it is accompanied in its problems with the Enlightenment by people of many faiths. The easy way to deal with such problems used to be the one used on Galileo: tell the inconvenient person to shut up or die. But at this point in history, the world is changing too fast and the knowledge base outside the church is too big to be controlled.

Saturday, October 25, 2014

Market power, monopsony and the porn industry

by John MacBeath Watkins

In a previous post, we discussed how changes in the music industry explain a bit of the Solow paradox, the fact that new technology is being adopted, but productivity hasn't seen much increase. Now we have another example of a way in which technology is suppressing, rather than increasing, productivity growth.

It also shows how power can transfer wealth from one group to another in ways a free market wouldn't allow based on monopsony, the dominance of a buyer in the marketplace.

The porn industry, once an economically vibrant part of the economy, has been devastated by changes in the business even as it adopts new technology. Porn stars once had a decent income from their performances, but now many have to work as prostitutes on the side to support themselves. It's a bit like the musicians who used to make most of their money from recordings, and now find they must get their living from live performances.

Like the musicians, part of their problem is piracy. Computer technology allows the rapid and almost perfect copying of music and videos. As a result, many viewings of porn have been taken entirely out of  the economic sphere.

But in the case of porn, there's another problem: the market power of the main distributor. The industry is dominated by Mindgeek, formerly Manwin. The company describes itself as being founded in 2013, but that's just when it took the Mindgeek name after a period of being known as Manwin. Each name change came after its owners ran into legal trouble, resulting in the sale of the business.

Mindgeek has something like monopsony power over the porn studios. They own an array of "tubes," the Youtube-like on-line distribution channels for porn.  They also own a lot of porn producers, and are essential for the distribution of the works of other porn producers. According to a recent Slate article, Mindgeek doesn't always pay the porn producers when they put up a video on one of their sites:
Even content producers that MindGeek owns have trouble getting their movies off MindGeek’s tube sites. The result has been a vampiric ecosystem: MindGeek’s producers make porn films mostly for the sake of being uploaded on to MindGeek’s free tube sites, with lower returns for the producers but higher returns for MindGeek, which makes money off of the tube ads that does not go to anyone involved in the production side.
The result is that performers have to have sex more times to support themselves, performing for the videos and doing their "live" performances as prostitutes. But isn't more work for less money lower productivity, as we account for such things?

There was a time when one company in an industry owning most of the production and distribution would have set off alarms in the Justice Department and resulted in anti-trust action. That changed in 1980 with the election of Ronald Reagan. Word soon went out that the Justice Department would not be worrying about practices such as predatory pricing, and in fact was really only worried about monopoly power if it resulted in higher prices to consumers, essentially meaning that the department was now mainly interested in price fixing in its anti-trust enforcement. It was a legal theory advanced by Robert Bork in a book titled The Antitrust Paradox.

This radically changed the incentives for American businesses. Predatory pricing, a practice that got Safeway in trouble with the Justice Department in the 1960s, became a notorious tactic of WalMart. The key was not to use this power to raise prices, but to dominate its markets and use its market power to squeeze producers.

Mindgeek is using a similar tactic. It is distributing the product for free on ad-supported sites, while squeezing porn production companies and performers to lower its costs. It routinely violates the intellectual property rights to sexual performances, but is so essential to production companies and porn performers for distribution that many say they can't speak out about the problem.

So, why don't the production companies get together and refuse to sell to Mindgeek unless they get paid? Well, if they demand a given price for their goods, that would be price fixing, one of the few aspects of the anti-trust act that the government is still enforcing.

Production of porn films is down 75 percent from the year before Mindgeek was founded. DVD sales of porn are down 50 percent over the same time span, because who wants to pay for porn they can watch for free if they tolerate some ads?

Netflix and Amazon are starting to produce their own content (not porn, so far as I know). We can expect more ethical behavior from them than we see from Mindgeek, but the incentives will be the same. We need to re-examine how our legislation regarding market power affects people selling their wares to distributors or working for them.

The paradox referred to in Bork's book was that antitrust action to increase competition could increase, rather than decrease, prices. What he either failed to realize or didn't care about was that monopsony power, the market power of a dominant buyer, interferes with the business arrangements of people who contract to sell their wares or labor to that buyer. This represents a transfer of wealth from one group to another based on power rather than the workings of a free market, just as much as price fixing does.

Wednesday, October 15, 2014

Are we prisoners of language or the authors of our lives?

by John MacBeath Watkins

The Sapir-Whorf hypothesis tells us that language, because it gives us the categories we use to think, affects how we perceive the world. Some researchers have gone so far as to propose that people who have different color lexicons actually see colors differently.

Color me skeptical. I think it highly likely that the Sapir-Whorf hypothesis is correct on more culturally conditioned matters like our sense of fairness, but find it unlikely that it has much, if any, effect on how we see color, as opposed to how we talk about what we perceive.

But this basic insight, which has really been with us since Ferdinand de Saussure's Course in General Linguistics was published in 1916, gets at a deeper question. Are we prisoners of the languages that give our minds the categories we think with? Do we have individual agency, or are we prisoners of the structure of meaning?

Is language a prison that restricts us, or a prism through which we see new things?

Marxist political theory has insisted that the structure of meaning is a prison, that those who initiate us into it are enforcing capitalist cultural norms. Structuralist thinkers like Roland Barthes argued against what Barthes called the cult of the author, and in general, structuralists argued against the relevance of human agency and the autonomous individual.

Is this what language looks like?
Structuralism has lost ground in its original field of linguistics. Noam Chomsky, for example, proposed that while structuralism was all right for describing phonology and morphology, it was inadequate for syntax. It could not explain the generation of the infinite variety of possible sentences or deal with the ambiguity of language.

When Saussure developed structuralism, the previous movement in linguistics had been philology, which studied texts through their history, and the meanings of words as they have changed. This is a necessary process when examining classical texts, and philology has sort of calved off from the glacier of linguistics.

Saussure proposed studying language synchronically, that is, as it exists at one time, which was perhaps a good corrective to the habits of his profession. But it did mean that the method was never intended to examine where the structure came from or how it changed. I doubt Saussure anticipated his method completely displacing the earlier methods of studying language. He simply felt it would be helpful to look at language as it exists, as well.

As the understanding of the power of language spread, however, it did tend to obscure the role of the individual. Its proposal to study language as it is, rather than try to attach it to its past, fit with the modernist movement's desire to shed tradition and make the world new and rational, sweeping away the dust and sentiment of the centuries and plunging into the future. At the same time, the concept of the structure of language and thought was frightening. How could we leave the past behind when all we could think was already in the structure?

Some tried to escape the structure of meaning, by making art that represented nothing, writing that tried to trick the brain into a space not already subsumed into the structure. But in the end, you cannot escape from meaning except into meaninglessness, and why do any work that is meaningless?

We are not words in a dictionary that can never be revised. We define ourselves, in fact, we are the source of meaning. The web of meaning we call language would disappear if there were no minds to know it, no people to speak and hear. We learn by play, and it is through creative play that we expand the realm of meaning. A web without connections is just a tangle of fibers. We are the connections, and our relationships to each other are the fibers.

Barthes was wrong. Authors are important, and authorship is pervasive. We are all the authors of our acts, writing the stories of our lives. Learning language and the other structures of society enable us to do this, to create new meanings, affirm or modify traditional meanings, and to influence others.

We need not choose between being ourselves and being part of humanity, because we cannot help being both. Yes, we are in large part made up of those we've known, the books we've read, the traditions we've learned, but we are the vessels in which those things are stored and remade and passed on with our own essence included.

Saturday, October 11, 2014

The Solow paradox, public goods, and the replicator economy.

by John MacBeath Watkins

Robert Solow, a Nobel-prize-winning economist, remarked way back in 1987 that "what everyone feels to have been a technological revolution...has been accompanied by a slowdown in productivity growth."

This has become known as the Solow paradox.

The golden age of productivity growth in the U.S. was between 1939 and 2000, with a slowdown in the 1980s, an increase in the Clinton Administration, and a slowdown again since.

What happened in 1939? Well, we began preparing for war. We didn't just build tanks, guns, ships, and aircraft, we also built roads and airports, and we dredged harbors and improved port facilities. Prior to World War II, flying boats were popular for serving areas that didn't have airports. After the war, there were plenty of airports.

The infrastructure binge continued after the war, and Dwight Eisenhower thought his greatest accomplishment was the Interstate Highway Act, which knit the country together with ribbons of road. Eisenhower understood logistics. He also understood that training was important if you wished to mobilize a large enterprise, and he elevated education to a cabinet-level office.

The federal investment in roads and education set loose the potential of the people and the land. And what have we done with this legacy of supply-side investment in public goods?

We've disinvested. Our public goods are getting old, and we've pushed onto students the cost of financing their education, so that someone can very easily come out of college $100,000 in debt. Higher education keeps getting cut while more is spent on other things, like prisons and welfare. Yet providing better education is one way we should be able to spend less on prisons and welfare.

Our bridges are getting old, some of our roads are getting rough.

But why didn't our technology give us the added productivity our disinvestment in public goods was taking away?

Maybe it did. Or maybe, sometimes technology is not necessarily useful for increasing measured productivity.

You measure productivity by seeing how many widgets are produced over a period of time by a given number of people. For example, in the cottage industry of music that existed before recorded music came along, you had to either make your own or hire a musician to make the music for you. Every song required a person making music to happen.
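The metric described above can be sketched in a few lines of arithmetic. All the numbers below are hypothetical, chosen purely to illustrate how measured productivity moves when output leaves the economic sphere:

```python
# A minimal sketch of the standard labor-productivity metric:
# units of measured output per worker, per unit of time.

def labor_productivity(units_produced, workers, hours):
    """Measured output per worker-hour."""
    return units_produced / (workers * hours)

# Hypothetical: 1,000,000 records sold by 100 working musicians
# over a 2,000-hour work year.
recorded = labor_productivity(1_000_000, workers=100, hours=2_000)

# If three-quarters of listening shifts to pirated (free) copies,
# only the remaining paid sales count as measured output.
paid_only = labor_productivity(250_000, workers=100, hours=2_000)

print(recorded)   # 5.0 records per musician-hour
print(paid_only)  # 1.25 -- measured productivity falls 75%
```

The point of the sketch is that the denominator (musicians and their hours) stays the same while the numerator (countable sales) shrinks, so measured productivity falls even though the amount of music people hear hasn't.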

When recorded music came along, you no longer had to have a musician present to have a song. This meant fewer people would be employed as musicians, but also that people at the top of the profession could provide music for a larger number of people. A musician could sing a song once, and millions of people could buy that song and play it repeatedly. There was more music in our lives, it was made by the best musicians, and the cost was lower. Productivity increased.

But we don't know how much, because we weren't calculating the productivity of musicians. A few musicians at the top were more productive, but once a record had been sold, it could be played many times. Those repeat performances were taken out of the economic sphere, and not counted as performances in any accounting sense. The metric became the sale of the record, rather than the performance of the song.

But what happened with the digital revolution in music? Well, this:

Unless there was a dramatic decrease in the number of musicians, this represents a huge decrease in productivity. Far fewer songs are being sold, and if the number of musicians remains constant, their productivity, measured by the usual economic methods, has decreased dramatically.

But we know that this has not been accompanied by an increase in the cost of a song. What has happened instead is that much of the music produced has been taken out of the economic sphere altogether. People are pirating the songs, and getting music for free. There is a cost to this; it's not really as easy to steal a song as to buy it, but those who wish to sell a song are competing with the free copy that can be pirated by acquiring some skill and jettisoning some scruples.

In the realm of classified ads, most of those are free on Craigslist. Until recently, most newspapers have made their digital product free. As a result, whole swaths of the economy have come out of the economic sphere. When you produce something for a lower price, you increase productivity. When you produce it for free, in economic terms you aren't producing anything.

Thus, we have a different paradox, that of the replicator economy. On Star Trek, replicators can make anything you want for free. But if everything you need is free, how does anyone get paid? Musicians are already facing the replicator economy. Writers may face it soon.

This shows that not all technology produces increases in economic productivity, because some of it takes things out of the economic sphere.

In addition, highly-skilled artisans who were more productive than the average person found it impossible to keep making money at their craft. Take the example of the weavers and what William Blake called the "dark, satanic mills" that replaced them.

They increased the number of yards of fabric per worker, and reduced the level of skill required by the worker. Weavers, who had made a good living because they were more productive than average, were put out of work. Some became Luddites, smashing the machinery that was eclipsing their way of life, but in the end, they lost.

They were replaced by low-skilled, low-paid workers, including in many cases children. The price of fabric went down, but the way of life of the people working to make the fabric became worse. And while productivity was increased in the making of fabric, the skilled artisans found their skill no longer required.

A skilled artisan who ends up working as a laborer or a waiter is going to become less productive. And every disruptive technology must have the effect of obsoleting some skills. It takes time for people to adjust, and some never will. Society as a whole may benefit, but in the disrupted industry, there is some immiseration, and among the displaced workers, there will be a decline in productivity. In fact, the immiseration of the obsolete workers removes the incentive for other industries to become more productive, because it drives down the price of labor.

So, what does increase productivity?

Full employment. I know, I know, productivity actually climbs in a recession because you lay off your least productive workers, but in the long run, only a shortage of workers convinces companies to make capital investments to reduce the number of workers needed. If you have to bid up the price of workers to attract employees, it makes sense to increase productivity.

Right now, we have the spectacle of cash-rich companies buying back their own stock, which is great for managers who have stock options, but not great for productivity.

Disinvestment in infrastructure has been bad for productivity, and we could kill two birds with one stone by catching up on that, which would increase employment, and build improvements that would unleash some productivity. Investment in public capital goods could increase employment enough to stimulate investment in private capital goods.

But what are the chances of that? We have an entire political party dedicated to the proposition that government spending can't produce jobs. Until we get better lawmakers, we won't have better policy.

Tuesday, October 7, 2014

Undead persons, born at the crossroads of law and money

by John MacBeath Watkins

We argue about what a person is, in terms of the biology of the individual, but what if we were to apply the same standards to those undead things we call persons, the corporations?

The Citizens United decision determined that corporations are people for the purpose of free speech, in particular in spending money to influence political races. The Hobby Lobby decision granted corporations an exemption from a law because the corporation was considered to have religious views. And legislators in several states want to give a zygote the legal status of a person at the moment the sperm enters the egg.

I think these legal maneuvers reflect confusion about what a person is. A corporation has long been a person in terms of being able to sign contracts, but corporations are composite beings, made up of many biological persons. It is difficult to imagine them as persons in the sense of having faith, when they are likely made up of people of differing faiths, or of being politically engaged as citizens when they are made up of citizens with differing views. It is just as difficult to imagine a zygote having faith or political views.

This used to be a matter of religion, when philosophers argued about at what point a baby is ensouled. Aristotle argued that the baby did not have a soul until it laughed, which he said would happen about three months after birth. This allowed space for the Greek custom of exposing a child who was deformed, illegitimate, or otherwise found wanting, so that it died if it was not rescued by the gods or a passer-by. This possibility of rescue cleared the parents of the charge of murder.

When I saw Abbie Hoffman debate Jerry Rubin, he claimed his views on abortion were shaped by his religion:

"The Jewish mother does not consider the fetus a person until it finishes graduate school," he joked.

But he did have a sort of point. We may consider a newborn a person, but we don't allow it to sign a contract until it reaches its majority at 18 years of age. And yet, we allow newborn corporations to sign contracts and dodge taxes with the best of their human competitors.

This is because the corporation is not a human person, it is a gestalt being made up of human persons who are of age to sign contracts. We think it is owned by shareholders, but as a person, it cannot be owned. Shareholders buy a right to some of the corporation's future earnings, just as gangsters used to buy a piece of a fighter hoping to gain part of any purse he won (then made sure of it by paying the other guy to go in the tank.)

If you owned a piece of a fighter, you couldn't say, "I'm a bit peckish, cut off a leg for me and I'll eat it," because you can't own a person the way you can own a chicken. Nor can a shareholder demand the corporation sell off part of itself to buy out said shareholder. The shareholder must find a greater fool to buy the shares.

But what is a human person? We certainly grant them greater rights for being human, and increase their rights as they become more mature in their judgement. In short, we regard them, as Abbie Hoffman's mother did, as more of a person when they have more age and experience.

One way to explore when a person begins is to ask, at what point does personhood end? In general, our medical experts agree that human life ends when brain activity ends. Why, then, would we consider a zygote, which has no brain, to be a person?

While some who oppose abortion have claimed there is brain activity at 40 days, this does not seem to be the case. Certainly anyone with a heartbeat has some brain activity, but a person with no higher-level cognitive activity would not be considered alive. One traditional notion was that the child was alive at its quickening, when the mother first feels it kick, at about 16 or 17 weeks from conception.

But many things kick and are not human. Brain activity that includes higher-level cognition begins at about 26-27 weeks. But that doesn't mean the baby is ready to sign its first contract. Becoming human involves having a human brain, and while a baby begins to develop one at six months, it hasn't finished yet. More important, it hasn't yet been programmed.

The real distinction between human and non-human life is the strange sort of virtual reality of the world of symbolic thought. This is part of the reason we delay responsibilities of citizenship such as being able to sign a contract or vote -- it takes a while to gain wisdom. Another reason is simple biology. Our brains mature and with changes in our brains, our judgement matures.

All of this biology is lost in discussions of what sort of person a corporation is. When does brain activity begin in the corporation? Never. Servants of the corporation do the thinking. When does the life of the corporation end?

The corporation cannot be killed by driving a wooden stake through its heart, like a vampire, or with a silver bullet. It can theoretically go on forever, never living, but undead, a creature born at the crossroads of law and money, able to corrupt its servants with rewards and punishments and make them do things they would never do as individuals. The corporation is never ensouled.

A corporation can only die if certain words are inscribed on certain papers and placed in the hands of properly sanctified public servants, perhaps with a sacrifice of money.

The corporation is a locus of power with its own logic, but no soul or conscience, and in no way a mind of its own. Sometimes its servants manage to gain control of it and use it to increase their own power and wealth while sucking strength out of it, like a demon chained to serve a mage who is in turn warped by the pull of the soulless thing they have exploited.

Is it any wonder that corporations, these strange and powerful persons, continue to expand their reach and their power, even in the halls of law? They are like an alien hand in the market, a part of the body politic that can act in ways we don't associate with ourselves.

And yet, our Supreme Court has ruled that these undead things are persons who act as citizens, with the same rights of free speech as someone with a mind, and the same rights of religious conscience as someone with a conscience. The alien hand has extended its reach, and gripped our most precious institutions.

Can we find the words to limit their reach, or make the sacred documents that can confine them? Or can we find a way to ensoul them, so that they will be worthy of the responsibilities the court has thrust upon them?

Friday, October 3, 2014

Don't let your babies grow up to be booksellers

(to the tune of "Mamas, Don't Let Your Babies Grow Up to Be Cowboys," with apologies to the late Waylon Jennings)

by John MacBeath Watkins

Booksellers ain't easy to love and they're harder to hold.
They'd rather give you a book than diamonds or gold.
Thick glasses and old faded Levis,
And each book begins a new day.
If you don't understand him, an' he don't die young,
He'll prob'ly just get fat and turn gray.

Mamas, don't let your babies grow up to be booksellers.
Don't let 'em quote Dickens or drive them old trucks.
Let 'em be doctors and lawyers and such.
'Cos they'll never leave home and they'll recite obscure poems.
Even to someone they love.

Booksellers like reference rooms and gray rainy mornings,
Not little puppies and children and girls on the stairs.
Them that don't know him won't like him and them that do,
Sometimes won't know how to take him.
He ain't wrong, he's just different but his obliviousness won't let him,
Do things to make you think that he cares.

Mamas, don't let your babies grow up to be booksellers.
Don't let 'em quote Dickens or drive them old trucks.
Let 'em be doctors and lawyers and such.
Mamas don't let your babies grow up to be booksellers.
'Cos they'll never leave home and they'll recite obscure poems.
Even to someone they love.

Saturday, September 27, 2014

A friend to entropy and an anarchist at heart

by John MacBeath Watkins

S. was a tall woman, in her private life a sort of den mother for anarchists with whom she shared a house. Some time after she started working for me, she began dating a cousin of mine who I'd never previously met, and eventually she married him.

So, I suppose whatever forces shape our fate must have intended that she be part of my cohort. I thought of her recently, when I asked my business partner where something was.

"Why do men always ask women where things are?" she replied.

That was an easy one.

"Because you move them."

She had, in fact, tidied away the object in question, and knew exactly where it was in precisely the way I did not. And that is one of the many great things about Jamie. She generally knows where she puts things.

Not so with S. And this was a problem, because of the way I tend to organize things.

If I want to be able to find something, I do the obvious thing: I leave it out in plain sight. This tends to lead to a bit of clutter, with the most often-used items on top.

S. wanted a neat work environment. To her, this meant less clutter. The way she achieved less clutter was the obvious way: she put things out of view. Unfortunately, once things were out of view, she seemed to think the problem was solved; actually finding the object the next time it was needed was not a high priority for her unless it was something she used herself.

I came to view this in terms of entropy. Entropy isn't just a good idea, it's the law, and it clearly states that the universe is going from a higher state of organization to a lower state of organization.

My system of organization acknowledges this. My environment is in a state of apparently increasing disorder, and yet, for the most part, I can find things. The system S. used expended energy (itself an increase in entropy) to bring the environment to a state of greater disorder, in which information about where things were was destroyed: entropy again.

Now, it is possible for a system of putting things out of sight to preserve this information, even for it to preserve information better than my somewhat sedimentary system of piles. You would, for example, put stuff under "S" for "stuff," and other stuff under "O" for "other stuff."

This was not the method S. employed. Her method was to expend energy to destroy information, and I cannot help but think that on some level, she did so as a friend to entropy, an anarchist at heart.

Wednesday, September 24, 2014

The Self-conscious mythology of literature (The Strangeness of being human, cont'd)

by John MacBeath Watkins

There was an age of myth, when we explained the world to each other by telling stories about the gods. There was an age of fable, when we explained morality to each other by telling folk stories that belonged to the culture.

And there is the age of literature, when we know who wrote the story, and make it their property.

In the age of myth, we told each other stories that were supposed to be true, and didn't know where they came from. During the age of fable we understood them as parables. In our age of literature, we understand them as personal insight.

We regard all three as contributing to our understanding of the nature of human nature, but by stages they have become more tenuously connected with socially constructed truth, and more subject to our self-conscious understanding. We ask ourselves, is this a story we can accept as telling a truth about humanity, or do we reject it? Rejecting the myths was not optional while those religions were active; people lived in societies where the truth of the history of the gods was too widely accepted to question.

To reject the story of a fable, we would have to say that we disagree with the culture, not with the gods. To disagree with an author, we have only to disagree with one individual. The judgments of the author and the reader are those of individuals, with the social acceptance mediated by markets -- which books people talk about, and buy, or feel left out because they haven't read.

We have other ways of understanding human nature, such as the more rigorous storytelling of science, the unreliable narrators of our families and friends explaining themselves as best they understand themselves, or the frantic efforts of our news sources trying to attract our attention to fragments or figments of information or gossip they think we might like to know.

But it is literature which works the most like mythology, transporting us into stories and allowing us to experience things that have not happened in our own lives. It instructs us or subverts us in ways mere facts do not, influencing the emotional armature on which we hang our facts and shape them into our beliefs.

As our culture has changed, we've become more self-conscious of the process. We may choose to judge a book by its author. We might decide that if Ayn Rand could live off Social Security in her old age, the philosophy she pushed, which held that only the morally inferior "takers" would need a safety net, was not one even she could live by.

Or we may say to ourselves, "J.D. Salinger seemed so deep when I was so shallow, such a sallow youth, but now that I'm in the working world I have put aside that juvenile cynicism and taken up the more useful and manipulative cynicism of Dale Carnegie."

The ability to do this makes our emotional structure more malleable than we would be if the stories we based our lives on were eternal verities handed to us by the gods, as if the clay of our feet never hardens. This gives us an adaptability our ancestors never knew or needed, but what is the cost? Do we become chameleons, taking on the coloration of our social surroundings to better camouflage our true selves, or do we change our true selves at a pace never before seen in human history?

I suspect the latter. We are bombarded with stories, on television, in games, in books, even, for the dwindling few, in magazines. We grow by accepting them into ourselves, or set boundaries by rejecting them, and we are constantly reshaped, little by little, meme by meme.