Wednesday, October 29, 2014

How to start a dark age and what myths should do for you

by John MacBeath Watkins

The term "dark ages" is not much used anymore, but it still conjures up notions of an age of ignorance following the fall of a great civilization.

It was first applied to the entire Middle Ages in about 1330 by Petrarch. Light and darkness had symbolized good and evil, but Petrarch made them symbols of knowledge and ignorance. He saw his own time as one of darkness, and aspired to a time of greater light.

That time of light arrived as the Renaissance some time later, the dawning of an era when people admired knowledge and it became more widespread. Then archaeology started digging up the "dark ages" and found that a great deal had been known and accomplished in the Middle Ages, so now we seldom use the term for anything but the early Middle Ages.

It's easy to put a starting date to the dark ages. Emperor Justinian closed pagan and Jewish schools in 529 AD, and the dark ages began.

The decree, as translated by James Hannam, reads as follows:
We wish to widen the law once made by us and by our father of blessed memory against all remaining heresies (we call heresies those faiths which hold and believe things otherwise than the catholic and apostolic orthodox church), so that it ought to apply not only to them but also to Samaritans [Jews] and pagans. Thus, since they have had such an ill effect, they should have no influence nor enjoy any dignity, nor acting as teachers of any subjects, should they drag the minds of the simple to their errors and, in this way, turn the more ignorant of them against the pure and true orthodox faith; so we permit only those who are of the orthodox faith to teach and accept a public stipend. 
Justinian seems mainly to have aimed this at the Athenian Academy, which traced its (sometimes interrupted) existence back to its founding by Plato in the early 4th century BCE, but he also closed Jewish schools and schools run by those judged to be heretics.

In so doing, he centralized power over what was deemed to be true. The decree made it illegal to teach things that were contrary to the teachings of the "catholic and apostolic orthodox church."

There were Greek philosophers who had figured out not only that the earth was round, but had calculated its circumference with fair accuracy. Some understood that the rotation of the earth explained the sequence of day and night. Justinian didn't make it a crime for the great pagan scholars of his age to write and publish -- that came later -- but he shut down the Academy, leaving the scholars to make their own way.

Hannam is skeptical about the impact of this action. Many pagan documents survived, and were even taught in Christian academies.

But the schools in the Eastern Roman Empire were survivors after the fall of the Western Roman Empire in 476. Justinian was the last of the Latin-speaking emperors of the Eastern Roman Empire. He sought to reconquer the territory that had been the Western Roman Empire, but failed. As the empire's grip on Europe weakened, the political institutions that had united it collapsed, and the only pan-European institution remaining was the Church. It became the dominant force in the preservation of knowledge and the maintenance of teaching institutions and traditions. And it demanded allegiance to what the Church believed.

Some scholars and some texts made their way to Persia, and with the rise of Islam, the schools that remained in Alexandria and other centers came under Muslim rule in the 7th century AD. The stage was set for the golden age of Muslim science and philosophy.

The golden age of Muslim science and philosophy spanned from 750 AD to about 1100 AD. What happened then?

The Incoherence of the Philosophers, that's what. The second-most influential figure in Islam (after Muhammad) was a scholar named Abu Hamid al-Ghazali, who wrote a book of that title, published in the late 11th century. He argued that those Muslim scholars who had based their works on Plato and Aristotle were wrong -- essentially, heretical. The spread of his thought led to religious institutions that taught that human reason by itself cannot establish truth. Although al-Ghazali himself had nothing against science, this in effect meant that if you really wanted to establish truth, you didn't go to a scientist or a philosopher who had devoted his life and efforts to learning about the thing in question. Instead, the final arbiter of truth would be a cleric who specialized in the Koran.

This led to a decay of Muslim science and philosophy. Some would say it led to a dark age for their civilization.

This seems to be the way to cause a dark age: You simply give religion authority over establishing what is true of the physical world.

Religion is in the business of delivering eternal verities, not of discovering new things. In fact, in such celebrated cases of the discovery of new things as Galileo's astronomy or Darwin's Origin of Species, religion has fought against new knowledge of how the universe works.

Joseph Campbell, in Myths to Live By, wrote that religion or myth (the difference seems to be that myths are religious beliefs no longer in use) serves four functions:

One, "to waken and maintain in the individual a sense of awe and gratitude in relation to the mystery dimension of the universe..."

Two, "to offer an image of the universe that will be in accord with the knowledge of the time..."

Three, "to validate, support, and imprint the norms of a given, specific moral order, that, namely, of the society in which the individual is to live."

Four, "to guide him, stage by stage, in health, strength, and harmony of spirit, through the whole foreseeable course of a useful life."

Can a religion that fails in the second function succeed in the other three? I doubt very much it can, because a failure in one area undermines faith in the truth of sacred knowledge in all the others. How could a church that taught the earth was flat have any authority after we had photographed the earth from the moon?

But the Catholic Church did not remove Galileo's books teaching heliocentrism from its Index of Forbidden Books until 1758, and in 1992 the Pope announced that the church accepted that the earth moves around the sun. I can find no indication, however, of the verdict of the Inquisition against Galileo being rescinded. The committee Pope John Paul II appointed in 1979 had, by 1992, concluded that the Inquisition had acted properly by the standards of its day, although Galileo was right about the sun and earth.

So, that's all right. Retard intellectual progress by a century or so, and it's all in good fun. In 2008, Pope Benedict XVI cancelled an appearance at La Sapienza University because some students and professors sent him a letter protesting the Pope's expressed views on Galileo. He was probably thinking, "why you talkin' 'bout old stuff?"

It was the notion that there had been a dark age that led people to call the later blossoming of knowledge and science the Enlightenment.

The Counter-Enlightenment, which started not long after the Church took Galileo's books off the Index of Forbidden Books, has argued that the Enlightenment undermines religion and the political and social order. This is, in fact, the basic stance of conservatism since at least Edmund Burke. The term "Counter-Enlightenment," as I'm using it here, does not refer to a single coherent movement with identifiable leaders, but to a wide span of groups and individuals who have argued against the goal, espoused by the great Enlightenment thinkers, of constant progress toward new knowledge and a more rational society.

They are probably right in arguing that the Enlightenment has undermined religion and the existing social order. After all, the Inquisition is a shadow of its former self, the church has had to repeatedly retreat on who is listed on the Index of Forbidden Books, and the most recent Pope has finally said that the beliefs of the Church do not conflict with the big bang theory about the origins of the universe or Darwin's ideas about the origin of species. It would be better if the church had not involved itself in such matters in the first place, but if it must make pronouncements about the nature of the physical world, it will have to change its tune when our knowledge changes or be undermined by new knowledge.

We are still fighting this battle. Zealots want their religion's version of the origin of species taught in public schools (that species originated as God made them), and moral notions, such as whether it is better to condemn homosexuals or accept them, are being fought out as the culture changes. A church that has failed to distinguish between its core beliefs and issues that seem less religious than social must change or fail the test of providing a world view in harmony with the knowledge of the society to which it offers spiritual guidance.

The Catholic Church is a handy way to talk about this, precisely because it is so well organized. But it is accompanied in its problems with the Enlightenment by people of many faiths. The easy way to deal with such problems used to be the one used on Galileo: tell the inconvenient person to shut up or die. But at this point in history, the world is changing too fast and the knowledge base outside the church is too big to be controlled.


Saturday, October 25, 2014

Market power, monopsony and the porn industry

by John MacBeath Watkins

In a previous post, we discussed how changes in the music industry explain a bit of the Solow paradox, the fact that new technology is being adopted but productivity hasn't seen much increase. Now we have another example of a way in which technology is suppressing, rather than increasing, productivity growth.

It also shows how market power, in this case monopsony, the dominance of a buyer in the marketplace, can transfer wealth from one group to another in ways a free market wouldn't allow.

The porn industry, once an economically vibrant part of the economy, has been devastated by changes in the business even as it adopts new technology. Porn stars once had a decent income from their performances, but now many have to work as prostitutes on the side to support themselves. It's a bit like the musicians who used to make most of their money from recordings, and now find they must get their living from live performances.

As with the musicians, part of their problem is piracy. Computer technology allows the rapid and almost perfect copying of music and videos. As a result, many viewings of porn have been taken entirely out of the economic sphere.

But in the case of porn, there's another problem: the market power of the main distributor. The industry is dominated by Mindgeek, formerly Manwin. The company describes itself as being founded in 2013, but that's just when it took the Mindgeek name after a period of being known as Manwin. Each name change came after its owners ran into legal trouble, resulting in the sale of the business.

Mindgeek has something like monopsony power over the porn studios. It owns an array of "tubes," the YouTube-like online distribution channels for porn. It also owns a lot of porn producers, and is essential for the distribution of the works of other porn producers. According to a recent Slate article, Mindgeek doesn't always pay the porn producers when it puts up a video on one of its sites:
Even content producers that MindGeek owns have trouble getting their movies off MindGeek’s tube sites. The result has been a vampiric ecosystem: MindGeek’s producers make porn films mostly for the sake of being uploaded on to MindGeek’s free tube sites, with lower returns for the producers but higher returns for MindGeek, which makes money off of the tube ads that does not go to anyone involved in the production side.
The result is that performers have to have sex more times to support themselves, performing for the videos and doing their "live" performances as prostitutes. But isn't more work for less money lower productivity, as we account for such things?

There was a time when one company in an industry owning most of the production and distribution would have set off alarms in the Justice Department and resulted in anti-trust action. That changed in 1980 with the election of Ronald Reagan. Word soon went out that the Justice Department would not be worrying about practices such as predatory pricing and, in fact, was really only worried about monopoly power if it resulted in higher prices to consumers, essentially meaning that the Justice Department was now mainly interested in price fixing in its anti-trust enforcement. This was a legal theory advanced by Robert Bork in a book titled The Antitrust Paradox.

This radically changed the incentives for American businesses. Predatory pricing, a practice that got Safeway in trouble with the Justice Department in the 1960s, became a notorious tactic of WalMart. The key was not to use market power to raise prices, but to dominate markets and use that power to squeeze producers.

Mindgeek is using a similar tactic. It is distributing the product for free on ad-supported sites, while squeezing porn production companies and performers to lower its costs. It routinely violates the intellectual property rights to sexual performances, but is so essential to production companies and porn performers for distribution that many say they can't speak out about the problem.

So, why don't the production companies get together and refuse to sell to Mindgeek unless they get paid? Well, if they demand a given price for their goods, that would be price fixing, one of the few aspects of the anti-trust act that the government is still enforcing.

Production of porn films is down 75 percent from the year before Mindgeek was founded. DVD sales of porn are down 50 percent over the same time span, because who wants to pay for porn they can watch for free if they tolerate some ads?

Netflix and Amazon are starting to produce their own content. We can expect more ethical behavior from them than we see from Mindgeek, but the incentives will be the same. We need to re-examine how our legislation regarding market power affects people selling their wares to distributors or working for them.

The paradox referred to in Bork's book was that antitrust action to increase competition could increase, rather than decrease, prices. What he either failed to realize or didn't care about was that monopsony power, the market power of a dominant buyer, interferes with the business arrangements of people who contract to sell their wares or labor to that buyer. This represents a transfer of wealth from one group to another based on power rather than the workings of a free market, just as much as price fixing does.

Wednesday, October 15, 2014

Are we prisoners of language or the authors of our lives?

by John MacBeath Watkins

The Sapir-Whorf hypothesis tells us that language, because it gives us the categories we use to think, affects how we perceive the world. Some researchers have gone so far as to propose that people who have different color lexicons actually see colors differently.

Color me skeptical. I think it highly likely that the Sapir-Whorf hypothesis is correct on more culturally conditioned matters like our sense of fairness, but find it unlikely that it has much, if any, effect on how we see color, as opposed to how we talk about what we perceive.

But this basic insight, which has really been with us since Ferdinand de Saussure's book, A Course in General Linguistics, was published in 1916, gets at a deeper question. Are we prisoners of the languages that give our minds the categories we think with? Do we have individual agency, or are we prisoners of the structure of meaning?

Is language a prison that restricts us, or a prism through which we see new things?

Marxist political theory has insisted that the structure of meaning is a prison, that those who initiate us into it are enforcing capitalist cultural norms. Structuralist thinkers argued against the relevance of human agency and the autonomous individual; Roland Barthes, for example, argued against what he called the cult of the author.

Structuralism has lost ground in its original field of linguistics. Noam Chomsky, for example, proposed that while structuralism was all right for describing phonology and morphology, it was inadequate for syntax. It could not explain the generation of the infinite variety of possible sentences or deal with the ambiguity of language.

When Saussure developed structuralism, the previous movement in linguistics had been philology, which studied texts through their history, and the meanings of words as they have changed. This is a necessary process when examining classical texts, and philology has sort of calved off from the glacier of linguistics.

Saussure proposed studying language synchronically, that is, as it exists at one time, which was perhaps a good corrective to the habits of his profession. But it did mean that the method was never intended to examine where the structure came from or how it changed. I doubt Saussure anticipated his method completely displacing the earlier methods of studying language. He simply felt it would be helpful to look at language as it exists, as well.

As the understanding of the power of language spread, however, it did tend to obscure the role of the individual. Its proposal to study language as it is, rather than try to attach it to its past, fit with the modernist movement's desire to shed tradition and make the world new and rational, sweeping away the dust and sentiment of the centuries and plunging into the future. At the same time, the concept of the structure of language and thought was frightening. How could we leave the past behind when all we could think was already in the structure?

Some tried to escape the structure of meaning, by making art that represented nothing, writing that tried to trick the brain into a space not already subsumed into the structure. But in the end, you cannot escape from meaning except into meaninglessness, and why do any work that is meaningless?

We are not words in a dictionary that can never be revised. We define ourselves, in fact, we are the source of meaning. The web of meaning we call language would disappear if there were no minds to know it, no people to speak and hear. We learn by play, and it is through creative play that we expand the realm of meaning. A web without connections is just a tangle of fibers. We are the connections, and our relationships to each other are the fibers.

Barthes was wrong. Authors are important, and authorship is pervasive. We are all the authors of our acts, writing the stories of our lives. Learning language and the other structures of society enable us to do this, to create new meanings, affirm or modify traditional meanings, and to influence others.

We need not choose between being ourselves and being part of humanity, because we cannot help being both. Yes, we are in large part made up of those we've known, the books we've read, the traditions we've learned, but we are the vessels in which those things are stored and remade and passed on with our own essence included.





Saturday, October 11, 2014

The Solow paradox, public goods, and the replicator economy

by John MacBeath Watkins

Robert Solow, a Nobel-prize-winning economist, remarked way back in 1987 that "what everyone feels to have been a technological revolution...has been accompanied everywhere...by a slowdown in productivity growth.”

This has become known as the Solow paradox.

The golden age of productivity growth in the U.S. was between 1939 and 2000, with a slowdown in the 1980s, an increase in the Clinton Administration, and a slowdown again since.

What happened in 1939? Well, we began preparing for war. We didn't just build tanks, guns, ships, and aircraft; we also built roads and airports, and we dredged harbors and improved port facilities. Prior to World War II, flying boats were popular for serving areas that didn't have airports. After the war, there were plenty of airports.

The infrastructure binge continued after the war, and Dwight Eisenhower thought his greatest accomplishment was the Interstate Highway Act, which knit the country together with ribbons of road. Eisenhower understood logistics. He also understood that training was important if you wished to mobilize a large enterprise, and he elevated education to a cabinet-level office.

The federal investment in roads and education set loose the potential of the people and the land. And what have we done with this legacy of supply-side investment in public goods?

We've disinvested. Our public goods are getting old, and we've pushed onto students the cost of financing their education, so that someone can easily come out of college $100,000 in debt. Higher education keeps getting cut while more is spent on other things, like prisons and welfare. Yet providing better education is one way we should be able to spend less on prisons and welfare.

Our bridges are getting old, some of our roads are getting rough.

But why didn't our technology give us the added productivity our disinvestment in public goods was taking away?

Maybe it did. Or maybe, sometimes technology is not necessarily useful for increasing measured productivity.

You measure productivity by seeing how many widgets are produced over a period of time by a given number of people. For example, in the cottage industry of music that existed before recorded music came along, you had to either make your own or hire a musician to make the music for you. Every song required the presence of a person making music.

When recorded music came along, you no longer had to have a musician present to have a song. This meant fewer people would be employed as musicians, but also that people at the top of the profession could provide music for a larger number of people. A musician could sing a song once, and millions of people could buy that song and play it repeatedly. There was more music in our lives, it was made by the best musicians, and the cost was lower. Productivity increased.

But we don't know how much, because we weren't calculating the productivity of musicians. A few musicians at the top were more productive, but once a record had been sold, it could be played many times. Those repeat performances were taken out of the economic sphere, and not counted as performances in any accounting sense. The metric became the sale of the record, rather than the performance of the song.

But what happened with the digital revolution in music? Well, this:

http://www.theatlantic.com/business/archive/2013/02/think-artists-dont-make-anything-off-music-sales-these-graphs-prove-you-wrong/273571/

Unless there was a dramatic decrease in the number of musicians, this represents a huge decrease in productivity. Far fewer songs are being sold, and if the number of musicians remains constant, their productivity, measured by the usual economic methods, has decreased dramatically.
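To make the accounting concrete, here is a minimal sketch of that arithmetic in Python. The figures are hypothetical, chosen only for illustration; the point is that when sales, rather than listening, is the metric, measured output per musician collapses.

    # A minimal sketch of the standard productivity arithmetic.
    # The figures below are hypothetical, chosen only to illustrate the point.

    def labor_productivity(units_sold, musicians):
        """Measured output per worker: units sold divided by headcount."""
        return units_sold / musicians

    # Record era: sales are the measured output.
    print(labor_productivity(units_sold=1_000_000, musicians=10_000))  # 100.0

    # Digital era: pirated listens drop out of the sales figures entirely,
    # so measured productivity falls even if just as much music is heard.
    print(labor_productivity(units_sold=200_000, musicians=10_000))    # 20.0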

But we know that this has not been accompanied by an increase in the cost of a song. What has happened instead is that much of the music produced has been taken out of the economic sphere altogether. People are pirating the songs, and getting music for free. There is a cost to this; it's not really as easy to steal a song as to buy it, but those who wish to sell a song are competing with the free copy that can be pirated by acquiring some skill and jettisoning some scruples.

In the realm of classified ads, most are now free on Craigslist. Until recently, most newspapers have made their digital product free. As a result, whole swaths of the economy have dropped out of the economic sphere. When you produce something for a lower price, you increase productivity. When you produce it for free, in economic terms you aren't producing anything.

Thus, we have a different paradox, that of the replicator economy. On Star Trek, replicators can make anything you want for free. But if everything you need is free, how does anyone get paid? Musicians are already facing the replicator economy. Writers may face it soon.

This shows that not all technology produces increases in economic productivity, because some of it takes things out of the economic sphere. So, what does increase productivity?

Full employment. I know, I know, productivity actually climbs in a recession because you lay off your least productive workers, but in the long run, only a shortage of workers convinces companies to make capital investments to reduce the number of workers needed. If you have to bid up the price of workers to attract employees, it makes sense to increase productivity.

Right now, we have the spectacle of cash-rich companies buying back their own stock, which is great for managers who have stock options, but not great for productivity.

Disinvestment in infrastructure has been bad for productivity, and we could kill two birds with one stone by catching up on that, which would increase employment, and build improvements that would unleash some productivity. Investment in public capital goods could increase employment enough to stimulate investment in private capital goods.

But what are the chances of that? We have an entire political party dedicated to the proposition that government spending can't produce jobs. Until we get better lawmakers, we won't have better policy.



Tuesday, October 7, 2014

Undead persons, born at the crossroads of law and money

by John MacBeath Watkins

We argue about what a person is, in terms of the biology of the individual, but what if we were to apply the same standards to those undead things we call persons, the corporations?

The Citizens United decision determined that corporations are people for the purpose of free speech, in particular in spending money to influence political races. The Hobby Lobby decision granted corporations an exemption from a law because the corporation was considered to have religious views. And legislators in several states want to give a zygote the legal status of a person at the moment the sperm enters the egg.

I think these legal maneuvers reflect confusion about what a person is. A corporation has long been a person in terms of being able to sign contracts, but corporations are composite beings, made up of many biological persons. It is difficult to imagine them as persons in the sense of having faith, when they are likely made up of people of differing faiths, or of being politically engaged as citizens when they are made up of citizens with differing views. It is difficult to imagine a zygote having faith or political views as well.

This used to be a matter of religion, when philosophers argued about at what point a baby is ensouled. Aristotle argued that the baby did not have a soul until it laughed, which he said would happen about three months after birth. This allowed space for the Greek custom of exposing a child who was deformed, illegitimate, or otherwise found wanting, so that it died if it was not rescued by the gods or a passer-by. This possibility of rescue cleared the parents of the charge of murder.

When I saw Abbie Hoffman debate Jerry Rubin, Hoffman claimed his views on abortion were shaped by his religion:

"The Jewish mother does not consider the fetus a person until it finishes graduate school," he joked.

But he did have a sort of point. We may consider a newborn a person, but we don't allow it to sign a contract until it reaches its majority at 18 years of age. And yet, we allow newborn corporations to sign contracts and dodge taxes with the best of their human competitors.

This is because the corporation is not a human person, it is a gestalt being made up of human persons who are of age to sign contracts. We think it is owned by shareholders, but as a person, it cannot be owned. Shareholders buy a right to some of the corporation's future earnings, just as gangsters used to buy a piece of a fighter hoping to gain part of any purse he won (then made sure of it by paying the other guy to go in the tank.)

If you owned a piece of a fighter, you couldn't say, "I'm a bit peckish, cut off a leg for me and I'll eat it," because you can't own a person the way you can own a chicken. Nor can a shareholder demand the corporation sell off part of itself to buy out said shareholder. The shareholder must find a greater fool to buy the shares.

But what is a human person? We certainly grant humans greater rights for being human, and increase their rights as they become more mature in their judgement. In short, we regard them, as Abbie Hoffman's mother did, as more of a person when they have more age and experience.

One way to explore when a person begins is to ask, at what point does personhood end? In general, our medical experts agree that human life ends when brain activity ends. Why, then, would we consider a zygote, which has no brain, to be a person?

While some who oppose abortion have claimed there is brain activity at 40 days, this does not seem to be the case. Certainly anyone with a heartbeat has some brain activity, but they would not be considered alive if they had no higher-level cognitive brain activity. One traditional notion was that the child was alive at its quickening, that is, when the mother first feels it kick, at about 16 or 17 weeks from conception.

But many things kick and are not human. Brain activity that includes higher-level cognition begins at about 26-27 weeks. But that doesn't mean the baby is ready to sign its first contract. Becoming human involves having a human brain, and while a baby is beginning to develop one at six months, it hasn't finished yet. More important, it hasn't yet been programmed.

The real distinction between human and non-human life is the strange sort of virtual reality of the world of symbolic thought. This is part of the reason we delay responsibilities of citizenship such as being able to sign a contract or vote -- it takes a while to gain wisdom. Another reason is simple biology. Our brains mature and with changes in our brains, our judgement matures.

All of this biology is lost in discussions of what sort of person a corporation is. When does brain activity begin in the corporation? Never. Servants of the corporation do the thinking. When does the life of the corporation end?

The corporation cannot be killed by driving a wooden stake through its heart, like a vampire, or with a silver bullet. It can theoretically go on forever, never living, but undead, a creature born at the crossroads of law and money, able to corrupt its servants with rewards and punishments and make them do things they would never do as individuals. The corporation is never ensouled.

A corporation can only die if certain words are inscribed on certain papers and placed in the hands of properly sanctified public servants, perhaps with a sacrifice of money.

They are a locus of power that has its own logic, but not its own soul or conscience, or in any way its own mind. Sometimes their servants manage to gain control of them and use them to increase their own power and wealth while sucking strength out of the corporation, like a demon chained to serve a mage, who is in turn warped by the pull of the soulless thing they have exploited.

Is it any wonder that corporations, these strange and powerful persons, continue to expand their reach and their power, even in the halls of law? They are like an alien hand in the market, a part of the body politic that can act in ways we don't associate with ourselves.

And yet, our Supreme Court has ruled that these undead things are persons who act as citizens, with the same rights of free speech as someone with a mind, and the same rights of religious conscience as someone with a conscience. The alien hand has extended its reach, and gripped our most precious institutions.

Can we find the words to limit their reach, or make the sacred documents that can confine them? Or can we find a way to ensoul them, so that they will be worthy of the responsibilities the court has thrust upon them?




Friday, October 3, 2014

Don't let your babies grow up to be booksellers

Mamas, don't let your babies
(to the tune of Mamas, don't let your babies grow up to be cowboys, with apologies to the late Waylon Jennings.)



by John MacBeath Watkins

Booksellers ain't easy to love and they're harder to hold.
They'd rather give you a book than diamonds or gold.
Thick glasses and old faded Levis,
And each book begins a new day.
If you don't understand him, an' he don't die young,
He'll prob'ly just get fat and turn gray.

Mamas, don't let your babies grow up to be booksellers.
Don't let 'em quote Dickens or drive them old trucks.
Let 'em be doctors and lawyers and such.
'Cos they'll never leave home and they'll recite obscure poems.
Even to someone they love.

Booksellers like reference rooms and gray rainy mornings,
Not little puppies and children and girls on the stairs.
Them that don't know him won't like him and them that do,
Sometimes won't know how to take him.
He ain't wrong, he's just different but his obliviousness won't let him,
Do things to make you think that he cares.

Mamas, don't let your babies grow up to be booksellers.
Don't let 'em quote Dickens or drive them old trucks.
Let 'em be doctors and lawyers and such.
Mamas don't let your babies grow up to be booksellers.
'Cos they'll never leave home and they'll recite obscure poems.
Even to someone they love.

Saturday, September 27, 2014

A friend to entropy and an anarchist at heart

by John MacBeath Watkins

S. was a tall woman, in her private life a sort of den mother for anarchists with whom she shared a house. Some time after she started working for me, she began dating a cousin of mine who I'd never previously met, and eventually she married him.

So, I suppose whatever forces shape our fate must have Intended that she be part of my cohort. I thought of her recently, when I asked my business partner where something was.

"Why do men always ask women where things are?" she replied.

That was an easy one.

"Because you move them."

She had, in fact, tidied away the object in question, and knew exactly where it was in precisely the way I did not. And that is one of the many great things about Jamie. She generally knows where she puts things.

Not so with S. And this was a problem, because of the way I tend to organize things.

If I want to be able to find something, I do the obvious thing: I leave it out in plain sight. This tends to lead to a bit of clutter, with the most often-used items on top.

S. wanted a neat work environment. To her, this meant less clutter. The way she achieved less clutter was in the obvious way: She put things out of view. Unfortunately, once things were out of view, she seemed to think the problem was solved, and actually finding the object next time it was needed was not a high priority for her unless it was something she used.

I came to view this in terms of entropy. Entropy isn't just a good idea, it's the law, and it clearly states that the universe is going from a higher state of organization to a lower state of organization.

My system of organization acknowledges this. My environment is in a state of apparently increasing disorder, and yet, for the most part, I can find things. The system S. used involved expending energy, which generates entropy, to bring the environment to a state of greater disorder, in which information about where things were was destroyed, which is entropy again.

Now, it is possible for a system of putting things out of sight to preserve this information, even for it to preserve information better than my somewhat sedimentary system of piles. You would, for example, put stuff under "S" for "stuff," and other stuff under "O" for "other stuff."
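For the programmers in the audience, here is a toy sketch of the difference in Python; the items and hiding places are invented for illustration. The energy spent tidying is the same either way; what differs is whether the information survives.

    # Tidying that records where things went preserves information;
    # tidying that doesn't spends the same effort and destroys it.
    index = {}

    def tidy(item, place, remember=True):
        """Put the item away; optionally record where it went."""
        if remember:
            index[item] = place

    tidy("stapler", "drawer S")                # filed under "S" for "stuff"
    tidy("tape", "somewhere", remember=False)  # out of sight, out of mind

    print(index.get("stapler"))  # drawer S
    print(index.get("tape"))     # None -- good luck finding it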

This was not the method S. employed. Her method was to expend energy to destroy information, and I cannot help but think that on some level, she did so as a friend to entropy, an anarchist at heart.



Wednesday, September 24, 2014

The Self-conscious mythology of literature (The Strangeness of being human, cont'd)

by John MacBeath Watkins

There was an age of myth, when we explained the world to each other by telling stories about the gods. There was an age of fable, when we explained morality to each other by telling folk stories that belonged to the culture.

And there is the age of literature, when we know who wrote the story, and make it their property.

In the age of myth, we told each other stories that were supposed to be true, and didn't know where they came from. During the age of fable we understood them as parables. In our age of literature, we understand them as personal insight.

We regard all three as contributing to our understanding of the nature of human nature, but by stages they have become more tenuously connected with socially constructed truth, and more subject to our self-conscious understanding. We ask ourselves, is this a story we can accept as telling a truth about humanity, or do we reject it? Rejecting the myths was not an option while those religions were active; people lived in societies where the truth of the stories of the gods was too widely accepted to question.

To reject the story of a fable, we would have to say that we disagree with the culture, not with the gods. To disagree with an author, we have only to disagree with one individual. The judgments of the author and the reader are those of individuals, with the social acceptance mediated by markets -- which books people talk about, and buy, or feel left out because they haven't read.

We have other ways of understanding human nature, such as the more rigorous storytelling of science, the unreliable narrators of our families and friends explaining themselves as best they understand themselves, or the frantic efforts of our news sources trying to attract our attention to fragments or figments of information or gossip they think we might like to know.

But it is literature which works the most like mythology, transporting us into stories and allowing us to experience things that have not happened in our own lives. It instructs us or subverts us in ways mere facts do not, influencing the emotional armature on which we hang our facts and shape them into our beliefs.

As our culture has changed, we've become more self-conscious of the process. We may choose to judge a book by its author. We might decide that if Ayn Rand could live off Social Security in her old age, perhaps the philosophy she pushed, which would claim only the morally inferior "takers" would need a safety net, was not even something she could live by.

Or we may say to ourselves, "J.D. Salinger seemed so deep when I was so shallow, such a sallow youth, but now that I'm in the working world I have put aside that juvenile cynicism and taken up the more useful and manipulative cynicism of Dale Carnegie."

The ability to do this makes our emotional structure more malleable than we would be if the stories we based our lives on were eternal verities handed to us by the gods, as if the clay of our feet never hardens. This gives us an adaptability our ancestors never knew or needed, but what is the cost? Do we become chameleons, taking on the coloration of our social surroundings to better camouflage our true selves, or do we change our true selves at a pace never before seen in human history?

I suspect the latter. We are bombarded with stories, on television, in games, in books, even, for the dwindling few, in magazines. We grow by accepting them into ourselves, or set boundaries by rejecting them, and we are constantly reshaped, little by little, meme by meme.

Tuesday, September 16, 2014

On the persistence of print and absorbing information (publishing in the twilight of the printed word and the strangeness of being human)

by John MacBeath Watkins

I've been reading The Shallows, a 2011 book by Nicholas Carr about how the internet is rewiring our brains, and in the midst of this alarmist text on how much shallower we shall become because of the internet, I've found a cause for hope.

You see, the Pew Research Internet Project has found that younger people are more keenly aware of the limitations of the internet than their elders.

I am not a digital native. You might call me an internet immigrant, or even a digital alien. I've come to use the internet quite a lot, but I'm keenly aware that much of what we know isn't there. It's in books, or in peoples' heads.

But on this issue, as on so many others, I find that young folks today are in better agreement with me than my own generational cohort. From the report:

Despite their embrace of technology, 62% of Americans under age 30 agree there is “a lot of useful, important information that is not available on the internet,” compared with 53% of older Americans who believe that. At the same time, 79% of Millennials believe that people without internet access are at a real disadvantage.
I think that's a very realistic assessment. The internet makes it easy to find the information on it, but there's a lot that just isn't there.

And there is also the issue of what you want to read on the screen. In At Random, Bennett Cerf's memoir of his life in the publishing business, he noted that prior to the introduction of television, fiction outsold non-fiction about three to one. After its introduction and subsequent ubiquity, that reversed.

But when I looked at Amazon's list of top-selling e-books recently, there wasn't a non-fiction book in the top 40. The Barnes & Noble list of the top-selling hardcover and paperback books shows five of the top 10 being non-fiction.

It would appear that peoples' reading habits are adapting to the reality of reading things on a screen which can also be used to go on the internet, buy things, or roam around the infosphere. It is Carr's contention that silent reading, which invites us into the private contemplation of the information and thinking of the author better than the public performance of reading aloud, has been with us for about a thousand years. Printed books invited us into this quiet, private world, while reading on a device connected to the internet invites constant interruption. A text littered with links, GIFs, and videos invites cursory and distracted reading.

But stories were performed by a storyteller or a cast of actors long before silent reading came about. We can immerse ourselves in stories without thinking deeply, let them wash over us and sweep us away without trying to interpret or challenge their thinking. That seems to be the sort of thing we are willing to read on the screen, partly because the experience of being transported into the story makes lower demands on our intellect.

It seems odd that the newest technology is best for the sort of mythopoetic storytelling where we don't consciously absorb information, while books that demand our use of instrumental logic are best read on paper. Mr. Carr has himself noted that while e-books are now about a third of the market for new books, they are only about 12 percent of the sales of his own, somewhat intellectually demanding books.

Perhaps this is only a pause in the twilight of the printed word, until e-publishers work out the interface a little better. But I found when I was reading A Course in General Linguistics online, I wasn't getting as much out of it as I did when I got a paper copy. The online text was not interspersed with links, and the copy I got in book form had the distraction of marginalia, but I found it easier to immerse myself in a text I needed to read critically and contemplatively when it lay before me on paper.

Now, you might think that the young, more adapted to reading on the screen, would simply read more of the sort of short, punchy stories about who was showing side boob at the Oscars, and watch more cat videos and porn, but according to the Pew study, they are more likely to have read a book in the past year.

Some 43% (of millennials) report reading a book—in any format—on a daily basis, a rate similar to older adults. Overall, 88% of Americans under 30 read a book in the past year, compared with 79% of those age 30 and older. Young adults have caught up to those in their thirties and forties in e-reading, with 37% of adults ages 18-29 reporting that they have read an e-book in the past year.
Interestingly, e-books seem to have caught on with older adults first, perhaps because you can adjust the type size in an e-book.

But for now, e-books seem to have unexpectedly plateaued, and printed books persist.

Wednesday, September 10, 2014

Religion as an interface: The Strangeness of being human cont'd.

by John MacBeath Watkins

One of the most popular posts on this blog explores the roots of religion, and the need we have for a mythopoetic understanding of the world. Scott Adams, blogger and cartoonist of the Dilbert strip, says that religion is not a bad interface with reality.

And it strikes me that as we've made our machines more compatible with us, we've made them more artistic and poetic. I do not speak machine language, but I am able to communicate with my computer through my simple faith that when I reverently click an icon, the file will open.

On rare occasions, I have to use the command line to communicate in a more concrete way with my computer, and sometimes I even have to open the back and stick in more memory. But I don't really understand the machine in the way my nephew Atom Ray Powers, a network administrator, does, nor do I understand the software the way his brother, Jeremy, a programmer does. And neither has studied assembler code, which my uncle Paul learned after he was injured out of the woods as a logger.

It's as if we are replicating the way people perceive the world. The graphical user interface gives us a visual, metaphorical understanding of how to face the reality of the computer, just as religion gave us a metaphorical, poetic, and often visual way of interacting with the reality of the world. The command line gives us greater control of the computer, just as technology gives us control of nature. Science attempts to learn how the world really works, at deeper and deeper levels, similar to knowing how the transistors work and how to read machine language.
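To make the layered-interface analogy concrete, here is a small, self-contained Python sketch; the file name is invented for illustration. It shows the same act, reading a file, at descending levels of abstraction, from metaphor down toward the machine.

    import os

    # Create the file first so the example is self-contained.
    with open("notes.txt", "w") as f:
        f.write("hello, world\n")

    # GUI level: you would double-click an icon; the metaphor does the work.
    # Shell level: `cat notes.txt` gives more control, and asks more knowledge.
    # System-call level: explicit requests to the OS, with no metaphor left.
    fd = os.open("notes.txt", os.O_RDONLY)  # ask the OS for a file descriptor
    data = os.read(fd, 4096)                # read raw bytes back
    os.close(fd)
    print(data.decode("utf-8"))             # hello, world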

The fact that computer scientists, who started at the scientific end of things, felt a need to make the interface more metaphorical and even artistic tells us something about how humanity interacts with the world. The intuitive approximation is vital if we are not to be overwhelmed with detail. It is sometimes said that ontogeny recapitulates phylogeny, because every fetus goes through phases of looking like a primitive fish, then a salamander, and eventually takes on human form. It would appear that the same thing happens cognitively.

Those of us, like myself, who follow the methods of the metaphorical interface in our daily lives often seek guidance from computer gurus. And those gurus, when they are not repairing malfunctioning machines or recalcitrant code, operate their computers in the symbolic realm made possible by the GUI.

We seem to have some difficulty doing this in our world of faith and science. This is usually because each side insists that its way of understanding the world is the truth, and therefore the other's cannot be. But a model of an atom isn't what an atom really looks like, because an atom is smaller than a wavelength of visible light. All of our understanding is metaphor and artistic license at some level. In my view, we have understandings at different levels.

Now, perhaps I've offended some religious people by saying religion is metaphor. But all sacred texts were written to be understood by people, not by gods. All of our understanding is metaphor. "For now we see through a glass, darkly," a biblical passage says. We understand the world by telling stories about it, and deciding which best describe it. Sometimes, as with math, the stories can be very precise, and the grammar quite rigorous, but they are stories nonetheless.

Tuesday, August 26, 2014

On the spell of the spiritual and the mechanism of philosophy

by John MacBeath Watkins

The Guardian has an interesting article on the failure of anglophone philosophy here. In it, Roger Scruton argues that the analytic philosophy of English-speaking philosophers has taken philosophy out of realms where it might be relevant to peoples' lives.

Scruton says:
Academic philosophers in the English speaking world still regard philosophy as Locke defined it in the 17th century, as “the handmaiden of the sciences”: it doesn’t explore the world beyond science but the limits of science, with the result that philosophy doesn’t really intrude into the public world. In the early 20th century we were caught up by the movement to form analytical philosophy, based in the study of logic, the foundations of mathematics, the syntax of ordinary language, the validity of arguments, something very formal. So when people have a big question, especially now since the decline of the orthodox religions, they don’t turn to philosophy for the answer but try to formulate it in whatever technical words have been bequeathed to them, and when a scientist comes along and says “I have the answer”, or even “there is no question”, they think “this guy knows what he’s talking about, I’d better lean on him”.
The French, he notes, did not fall into this trap. Sartre was willing to address the great moral questions, even if the morality of his own actions in World War II might be a little questionable (he gained his teaching position during the war because Vichy law eliminated a Jew from that position, and he chose not to be active in the resistance).

But Scruton fails to note that many people don't look to science for their answers. Some turn to religion, some turn to New Age gurus. Both reflect a backlash against the Enlightenment ideas reflected in modern philosophy. Most modern philosophy (yes, even the French) is unwilling to deal with the spiritual feelings people have.

Part of the problem is that people tend to believe in the spiritual in an a priori manner, and will interpret any attempt to analyze it as an attempt to destroy it, to reduce it to the physical world. Any logical and analytical approach that does not treat the existence of the spiritual as an accepted fact, and as a realm not readily explained by the physical world, will be seen as the reductive destruction of the spiritual, equivalent to trying to understand the Mona Lisa by grinding it to powder and doing chemical analysis of the molecules.

Any attempt to find the part of the brain that needs to believe in god will receive this reception. My own attempts to understand the spiritual in terms of the ethereal parallel world of symbolic thought have been received this way. As an agnostic, I am open to the possibility of the existence of a spiritual world but not convinced of it. And I have to wonder, if we could understand the spiritual world, would that be tantamount to its reductive destruction?

In my series of posts on the strangeness of being human, I have stuck with trying to explain what I can, which has restricted me to the physical and analytical. I remain skeptical of those who claim a special knowledge of the spiritual world, because so many have been shown to be frauds, but I respect the impulses and the work of sincere ministers of many faiths. For many people, faith has been a support for them spiritually, psychologically, morally, and socially. Scott Adams, long a vocal atheist, said on his blog recently:
In recent years I've come to see religion as a valid user interface to reality. The so-called "truth" of the universe is irrelevant because our tiny brains aren't equipped to understand it anyway. 
As a pragmatist, I find this appealing. Were I a Christian, I might find it appalling, for the same reason the Catholic Church found Pascal's Wager appalling: it does not accept the truth of religion as its reason for practicing religion.

Yet in many ways, worrying about the truth of religion is a modern luxury. If you lived in most societies for most of the history of religion, the penalty for failing to believe in the God or gods of your people was death, ostracism, or incomprehension by your fellows. The notion that religion should have to justify itself was uncommon until recently. Socrates was charged with undermining the young's faith in the gods, and condemned to death. Society was punishing him, not for proving the gods did not exist, but for raising the question of how we might logically confront religion.

Thomas Aikenhead was executed in Scotland in 1697 for the same thing. Thomas Hobbes might have lost his life on a charge of blasphemy for claiming that God exists but is a material being, had he not had protection from the king, whom he had tutored.

Although Aikenhead was the last person in Britain executed for blasphemy, the last successful prosecution for blasphemy in the UK was in 1977. The law has since been repealed.

There are parts of the world where the law says you can still lose your life for leaving the established religion, although in the best-known cases governments have backed off.

But even for the unchurched, the spell of the spiritual has an appeal that the logical mechanisms of philosophy cannot address. This is an interesting problem, because for centuries, philosophy was taught in Europe at Christian institutions. In fact, if you wanted to be educated in Europe after the rise of Christianity, for centuries you had to take orders.

This led to exactly the sort of reductive logic chopping we now see in our more materialistic philosophy. Schoolmasters were ridiculed for arguing how many angels could dance on the head of a pin (my view is, all of them or none of them, depending on whether angels have a sense of rhythm -- after all, they are as immaterial as the question.)

So the problem of the relevance of academic philosophy is not a new one. One of the aspects of the academic environment is that to be wise, you must specialize, so that you may know more about something than anyone else. That specialization takes you away from the big questions. Another is that the trap of irrelevance is not always obvious. The question of whether angels had a material presence interested some philosophers, and the thought experiment about them dancing on the head of a pin was a thought experiment intended to illustrate it.

The real trap was in failing to understand that, in the grand sweep of things, whether angels had a material presence was irrelevant to the important questions of how we should live. The conversation became attenuated because those involved did not realize that they had lost the plot.

And if philosophy leaves the questions of how we should live our lives to the soft science of psychology or the realm of new-age gurus, it will be irrelevant to the questions they attempt to answer. Perhaps these questions are not the ones modern philosophy wishes to deal with, but if so, people will continue to ask, what is it for?

Scruton thinks the notion that philosophy is the handmaiden of the sciences makes it beholden to the sciences, but that is wrong. Philosophy is the mother of the sciences, having spun them off. There was a time when naturalists called themselves "natural philosophers." It was philosophers who first examined the basic questions of physics, math, and astronomy.

Philosophy should not now turn its back on its children, but should integrate them, and show how they affect the way we live. But it seems to me that philosophy is the child of the spiritual rather than its queen or mother. We first tried to understand the world in a poetic and mythic way, and only later brought our problem-solving logic to bear on those understandings. It is much harder for the spiritual's logical child to understand its parent, because its business has been to supplant mythic understanding with logical understanding.

But it can talk about the questions the spiritual attempts to answer. After all, the Buddha had little to say about the gods, nor did Confucius. The question is, will academic philosophy reward such efforts, or view it as an enterprise left to some other field of study?



Friday, August 22, 2014

On the illusion of the self: The Strangeness of being human #27

by John MacBeath Watkins

As we discussed in an earlier post, Julian Jaynes introduced the intriguing concept of the origins of consciousness in the bicameral mind. He supposed that brains worked differently until about 1200 BC, that the part of the brain that produces hallucinations was speaking to us with the irresistible compulsion of the voices of the gods.

This represented a different sort of mind than we now experience, a mind without the metaphorical self-narrating person in our heads.

This brings up several questions. Jaynes claims that only the mentally ill still hear voices from that part of the brain, which is not much used by modern humans. But surely the part of the brain responsible for these hallucinations existed prior to human culture. What role did it play before that, and what role does it play in the style of perception used by animals other than man? Is it part of a system of perception for a spiritual world that is real, or the source of the invention of the spiritual? 

I propose that the supposition of the breakdown of the bicameral mind is unnecessary. Psychologists refer to a healthy psyche as a well-integrated personality. This recognizes that a personality is made up of many motivations, often conflicting – the self who wants sweets and the self who wants to be slender, the self who wants children and the self who is selfish, the self who aspires to goodness and the self who cheats on its spouse. Some of us avoid conflicts by compartmentalizing. Some actually fragment into different personalities.

There was a case a few years ago in which a man was accused of raping a woman with multiple personality disorder. What had happened was that the accused had started having sex with the woman's adult personality, then asked to speak to her little-girl personality. The woman had consented to have sex in one personality, but not in the other – in fact, that personality was incapable of consenting to sex. The man was convicted, but the conviction was overturned.

That the woman had shattered into several personalities is considered pathological, but what if a single, well-integrated personality is as much an hallucination as the gods were? Does that mean that neither is real, or that both are real, or something in between?

I propose that both are ways of constructing reality. Scott Adams says that religion is a pretty good interface with the world, and I suspect that for many people it is. Think of it as a graphical user interface. The real world of computers is a world of 1s and 0s, but this is not a way of thinking about computers that enables us to work smoothly with them.

Similarly, the world we perceive is one of differing amplitudes and frequencies of light and sound, of the atoms we are composed of interacting with the atoms of other objects. Who knows, it may even be one of our spirit interacting with other spirits, though I see no particular need to suppose this. We have several levels of perception and memory, and we construct all the evidence of our senses into a narrative that "makes sense" of our lives. The product of all this is a useful interface, a sort of useful illusion of the world.

When societies became larger and needed coordination beyond the clan level, we developed institutions and patterns of behavior that made that possible, resulting in the great age of religion, which gave societies a sort of group mind.

This group mind gave us a structure that allowed stable societies of great size to develop, but it was not adaptable. As Jaynes pointed out, in the Iliad there are almost no references to individuals having motivations that are not the gods dictating their actions. The later Odyssey is all about one clever, adaptable individual making his way through changing circumstances for which his gods did not issue instructions.

About the same time, the great age of prophecy began, and for about a thousand years, new religions told people how to act as individuals. And those religions focused more on human prophets than on ethereal gods. Mohammed gave the word of God to Muslims, Jesus gave the word of God to Christians, and while Siddhartha had no brief against the Hindu gods, his followers focus on his teaching more than on worshiping those gods.

Each, in his own way, taught people not to be selfish. It may have been literally unthinkable in the age of myth to be selfish, but in a world where adaptable individuals made their way, it was an ever-present danger.

And it is a danger. Any society that relies for its survival on people having and raising children requires some level of self-sacrifice. Any society that needs to defend itself from aggressive neighbors requires it as well.

We live in a transitional era, when adherents of the prophets are worried about the relentless rise of unbelief, when prophets of the Singularity are trying to invent an entirely material god, when atheism is no longer the creed that dare not speak its name. Reason rules our world more than myth, although often, it is motivated reasoning that seeks out desired conclusions.

But what role does reason really play? Often, our reason justifies things we already want to do, but have not consciously acknowledged. What if, when we spoke to the gods to get our guidance, the same thing was happening there as happens when we talk to ourselves?

If Jaynes was right about the literary evidence pointing to a different sort of mind prior to 1200 BCE, it may be that it was a different way of integrating a personality than our current mode, rather than a completely different way of using our brains.

The strangeness of being human is a series of posts about the way language makes us human, giving us abstract categories we use to think and memes that make up much of what we are.

1. http://booksellersvsbestsellers.blogspot.com/2011/06/to-read-is-to-become-stolen-child.html
2. http://booksellersvsbestsellers.blogspot.com/2012/03/on-disenchantment-of-world.html
3. http://booksellersvsbestsellers.blogspot.com/2012/02/blue-man-speaks-of-octopus-ink-and-all.html
4. http://booksellersvsbestsellers.blogspot.com/2012/05/bicameral-mind-and-strangeness-of-being.html
5. http://booksellersvsbestsellers.blogspot.com/2012/05/structure-of-thought-and-death-of.html
6. http://booksellersvsbestsellers.blogspot.com/2011/11/ane-how-will-our-minds-be-rewired-this.html
7. http://booksellersvsbestsellers.blogspot.com/2012/07/sex-death-and-selfish-meme.html
8. http://booksellersvsbestsellers.blogspot.com/2012/10/what-is-soul-of-man_10.html
9. http://booksellersvsbestsellers.blogspot.com/2012/11/stories-language-parasites-and-recent.html
10. http://booksellersvsbestsellers.blogspot.com/2013/02/god-language-and-structure-of-society.html
11. http://booksellersvsbestsellers.blogspot.com/2013/02/be-careful-who-you-are-more-on.html
12. http://booksellersvsbestsellers.blogspot.com/2013/02/the-strangeness-of-being-weird.html
13. Night of the unread: Why do we flee from meaning?
14. http://booksellersvsbestsellers.blogspot.com/2013/03/night-of-unread-do-we-need-ethnography.html
15. http://booksellersvsbestsellers.blogspot.com/2013/03/when-books-become-part-of-you.html
16. http://booksellersvsbestsellers.blogspot.com/2013/04/drunk-on-milk-of-paradise-spell-of.html
17. http://booksellersvsbestsellers.blogspot.com/2013/04/the-power-of-forbidden-words-and.html
18. http://booksellersvsbestsellers.blogspot.com/2013/04/so-like-filler-words-you-know-they-uh.html
19. The conspiracy of god, the well-intentioned lie, and the strangeness of being human
20. Spiritual pluralism and the fall of those who would be angels
21. Judging a book by its author: "Fiction is part confession, part lie."
22. What to do when the gods fall silent, or, the axis of ethics
23. Why do we need myths?
24. Love, belief, and the truth we know alone
25. "Bohemians" -- The Journey of a Word
26. On being a ghost in a soft machine
27. On the illusion of the self

Thursday, August 21, 2014

A 4th helping of notes on a novel in 1940s noir

by John MacBeath Watkins

I woke with an aching head, and found there was a heavily-built character sitting on the bed with me. A stitched-up scar ran up his swarthy face to the missing eye, and his remaining eye was dead, completely devoid of human emotion.

But he was my teddy bear, and I loved him.
_______________________

"You're undercover?" she whispered.

"Yes," I said.

"Well, you look about as inconspicuous as Herman Cain at a Republican convention."

____________________


"The streets were dark with something more than night," Chandler said.

"Yes, you've got some of it on your shoe, and tracked it on the rug."

______________________


"Alcohol is like love," Chandler said. "The first kiss is magic, the second is intimate, the third is routine."

"And after the third kiss, I start to puke," I finished for him. "Just like our first date."

____________________

She told me every time we said goodbye, she died a little. She must have said goodbye once too often, causing blood to leak from a massive head wound.

I made a note to go with "smell you later" in the future.


Friday, August 15, 2014

Demonic males: Failure of a narrative

by John MacBeath Watkins

On July 25, 2014, ESPN host Stephen A. Smith brought an uproar down on his head with the comment that women should not "provoke" men to anger, shifting the blame for domestic violence.

And he was quite properly pilloried for the comment. There is no excuse for beating your mate. Shifting the blame from men to women is wrong, not just because it's blaming the victim, but because blame is not a useful framework for solving the problem.

And there is another issue here. The consistent narrative about domestic violence is that the problem is demonic males, and the solution is controlling those demons. This seems obvious from the fact that most people hurt in domestic violence incidents are women.

Logically, the real hell should be two men living together.

And, in fact, according to a Centers for Disease Control study, 26% of gay men report having, in their lifetimes, been subject to violence from a domestic partner. That's a shockingly high number. But it's not the highest number. For lesbians, the figure is 44%. For straight couples, the figure is 29% for men and 35% for women.

This is the opposite of the expected result. The more women are involved in a relationship, the more violent that relationship becomes. It is an astounding, disturbing result that has received far too little attention.

Now, there are several possible explanations for this. It could be that women are more likely to report having been hit. This is possible, but I submit that this is not about police reports; these people participated in a survey that allowed them anonymity. I believe the numbers. In any case, why would straight men be more eager to report domestic violence than gay men? And why would lesbians be more eager to report than straight women?

Another possibility is that men are beating up lesbians. After all, not all lesbians start out in lesbian relationships. I'm sure that happens, but a psychologist I know told me years ago that a deplorable amount of domestic violence happens in lesbian couples.

There are other possibilities. Some people fight with their mate as a prelude to sex. I've never understood that one, but I know it exists.

Maybe there is something wrong with the CDC's sampling or the wording of their questions, but I doubt it. I do think the survey opens a window on a deeply emotional issue, and may even point a way to making peoples' lives less violent.

There is a more disquieting possibility: that women are more subject to violence because they are seen as more vulnerable, even by other women. That would be a more intractable problem. It would also fail to explain why straight men report being hit more than gay men.

If this is the problem, the solution would be to decrease the perception of female vulnerability, a rather difficult bit of cultural engineering.

It is possible that what we are teaching women in our culture about conflict resolution is working badly, especially when dealing with other women.

That is an intriguing possibility, because if conflict-resolution style is the problem, teaching better techniques could benefit any couple having this problem. You'd have to teach both parties, and not everyone would be willing, but lesbian, straight, or gay, you'd be better off.

There are a number of stereotypes about this. The woman who enforces her will with a rolling pin. The woman who won't tell the guy what he's done wrong, but expects him to know, for example. I know nothing about the validity of these stereotypes, and I doubt they're the sort of thing that leads to most domestic violence, but having never been involved in domestic violence, I don't know from personal experience what problems lead to it. Is it score settling? Naked competition for power within the relationship?

I don't know, but someone must find out.

We'd have to open our minds to a new approach. I would suggest teaching kids conflict resolution before their patterns are set. You could ask them what they would do in certain circumstances, and what would likely result, and explore alternatives. If 26% of those in all-male relationships are subject to domestic violence, it's clear men need this. If 44% of women in all-female relationships are subject to domestic violence, women could use it even more.

Part of the problem with the notion of demonic males is that it focused on who was to blame, just as the problem with what Mr. Smith said was its focus on shifting blame. If we shift the focus to how to resolve domestic conflicts without violence, everyone could benefit. After all, most couples of all types manage to avoid violence.

Saturday, August 9, 2014

The proper strategy for selling ebooks (publishing in the twilight of the printed word continued)

by John MacBeath Watkins

When Amazon sells an ebook published by Hachette, the proceeds are divided as follows: 30% to Amazon, 52.5% to the publisher, and 17.5% to the author. The two companies are now at odds over the fact that Amazon wishes to discount books more heavily.

Amazon has also proposed that authors should get more -- 35% instead of 17.5%. Only, that wouldn't come out of the Amazon share; it would come out of the publisher's share. So far, the company under attack, Hachette, has had very vocal backing from its authors, who are deprived of part of their income because Amazon is refusing to sell their books. The tactic in suggesting that the publishers give authors a bigger share is an attempt to drive a wedge between authors and their publisher -- let's you and him fight. It's a free lunch for Amazon, which would not dream of giving authors more money out of its own share.
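
To make the arithmetic concrete, here is a minimal sketch; the $10.00 price and the split() helper are mine for illustration, but the percentages are the ones quoted above:

    # A sketch of the ebook revenue splits discussed above.
    # The $10.00 price is hypothetical; the percentages come from the post.
    def split(price, amazon_share, author_share):
        """Return (amazon, publisher, author) dollars for one sale, in cents-rounded form."""
        amazon = price * amazon_share
        author = price * author_share
        publisher = price - amazon - author  # the publisher keeps the remainder
        return round(amazon, 2), round(publisher, 2), round(author, 2)

    price = 10.00
    print("current: ", split(price, 0.30, 0.175))  # amazon 3.00, publisher 5.25, author 1.75
    print("proposed:", split(price, 0.30, 0.35))   # amazon 3.00, publisher 3.50, author 3.50

The author's cut doubles, but every penny of the increase comes out of the publisher's share; Amazon's $3.00 is untouched.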

And in Germany, Amazon is trying to get a 50% share of the ebook price.

My question is, why should Amazon be getting even 30%? The cost of delivering ebooks is minimal, while many of the marketing costs are borne by the publishers.

Suppose you could plug the title of a book into a search engine and pull up a variety of booksellers offering the book at a lower price than Amazon's. The publishers would have greater influence over a large group of independent booksellers than they do over Amazon. They might find themselves paying as little as 15% or even less to such competing sellers.

The reason this hasn't happened is that publishers worry about losing control over the perceived value of their products. What is needed is a true wholesale model -- sell the books to sellers who then set their own retail prices.

Ah, you say, but that has been tried. Not, I answer, in the way that I propose. The publishers tried to ally themselves with Apple and set a higher price than Amazon wanted to charge.

I say they should fully commit to ebooks, and under-price Amazon. They were tripped up by the fact that they colluded with Apple to have high prices. Well, don't collude. Set prices that cover the cost of finding, editing, and promoting the book, plus a reasonable markup, and try to sell a lot of copies. Don't negotiate what margin the seller gets; just sell them the book and let them set the retail price. The company that can keep its overhead low while effectively promoting itself and the books can make money with a lower percentage of the price. With competing companies selling the books, the one that can make money on the smallest margin will have the lowest price.
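
A rough sketch of why the smallest viable margin wins: if a seller needs to keep a fraction of the retail price to cover overhead and profit, the lowest price it can offer is wholesale / (1 - margin). The $5.25 wholesale figure and the min_retail name are mine for illustration (the figure happens to match the publisher's take in the example above):

    # Sketch: the lowest retail price a seller can offer, given the margin
    # it needs to survive. The $5.25 wholesale price is hypothetical.
    def min_retail(wholesale, margin):
        """Lowest price at which the seller still keeps `margin` of each sale."""
        return wholesale / (1.0 - margin)

    wholesale = 5.25
    for margin in (0.30, 0.15, 0.05):
        print(f"{margin:.0%} margin -> ${min_retail(wholesale, margin):.2f}")
    # 30% margin -> $7.50
    # 15% margin -> $6.18
    #  5% margin -> $5.53

The seller that can live on 5% undercuts the one that needs 30% by almost two dollars on the same book.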

No doubt a company like Google could quickly build such a marketplace, highly automated and with minimal costs. Or maybe someone wearing bunny slippers and working in their basement will find the key. The big problem is overcoming Amazon's marketing muscle, so I would expect either a well-funded startup or a fairly large existing company to take this on.

Amazon has a large and increasing overhead connected with delivering physical objects. A company with lower overhead could charge less for ebooks.

It has now become evident that not everyone wants an ebook. They seem to be best for leisure reading. For absorbing information, print books still have an edge. There is still, therefore, a place for bookstores and experts on the physical delivery of books such as Amazon.

This is not too different from the mass-market paperback revolution of the 1940s and '50s. Suddenly, news agents who had never sold books before were selling paperbacks with lurid covers. More people read more books, and publishers found that what had been a carriage trade became a mass market. The process was very well documented in one of my favorite books, Two-Bit Culture: The Paperbacking of America.

But even during the paperback revolution, the business was one of distributing books through centralized organizations. Most publishers did not own printing plants, let alone warehouses and trucks to take the books to the many independent bookstores that peppered the land, which meant more middlemen were needed. With ebooks, that lack doesn't matter, and in fact becomes an advantage, because it means lower overhead.


Tuesday, August 5, 2014

Still more notes for a novel in 1940s noir

by John MacBeath Watkins

A ricochet zinged off the rock we hid behind.

"They mean business, doll," I said.

"By asking for your help, I've put you in danger," she said in that husky voice that drove me wild.

"Don't worry, babe," I said, firing back -- Bam! Bam! Bam!. "I won't let them take you alive."
___________________________

"You told me your uncle was a humanitarian," I whispered, "but it looks more like he's a cannibal."

"Well, dear," she said,  "if a vegetarian eats vegetables..."
__________________________

"You're a dick, aren't you?"

 "That's right, I'm a dick, a shamus, a private eye," I responded. :"In polite society, which I never meet, they call me a detective."

"That's not the kind of dick I was calling you."
___________________________

The crime was monstrous, so I added Mothra to my list of suspects.
_______________________________

:"My gun is quick," Jack Hammer said.

"Next time, try thinking about baseball while we do it," she replied.
_________________________________

"Farewell, my lovely," I said, my voice choking with emotion.

"It's just a dental appointment," she replied. "you'll be done in half an hour, and I'll buy you some ice cream as a reward."
____________________________________

"You can't pin that on me!" he shouted.

"Want to bet?" I reached out to thrust the sharp point through his jacket over the heart. The pin was pink with black type, and said "world's greatest grandad."
_______________________________________

More notes here:

http://booksellersvsbestsellers.blogspot.com/2014/04/notes-for-novel-in-1940s-noir.html

and here:

http://booksellersvsbestsellers.blogspot.com/2014/06/more-notes-for-novel-in-1940s-noir.html

Sunday, August 3, 2014

Superstition and the singularity

by John MacBeath Watkins

I always figured California would be a place where religions could arise, but I had no idea smart people could come up with one so lame.

I'm talking about the Singularity. Vernor Vinge coined the term to describe the future advent of a super-intelligent, conscious being as a result of computers getting smarter. Consciousness is supposed to emerge, but Vinge used "singularity" as a metaphor drawn from black holes, from which no information can escape. His view was that we could not predict the capabilities or motives of such a being.

Which has not kept people from speculating.

Some believe a benevolent super-intelligence will be effectively all-knowing and omnipresent. Some believe they will be taken up into the cloud and given eternal life. And some believe in the devil.

I'm talking here about Roko's Basilisk. From RationalWiki:

Roko's basilisk is a proposition that says an all-powerful artificial intelligence from the future may retroactively punish those who did not assist in bringing about its existence. It resembles a futurist version of Pascal's wager; an argument used to try and suggest people should subscribe to particular singularitarian ideas, or even donate money to them, by weighing up the prospect of punishment versus reward. Furthermore, the proposition says that merely knowing about it incurs the risk of punishment. It is named after the member of the rationalist community LessWrong who most clearly described it (though he did not originate it). Despite widespread incredulity, this entire saga is about things that are actually believed by some groups of people. Though it must be noted that LessWrong itself does not, as a policy, believe in or advocate the basilisk — just in almost all of the premises that add up to it.

One of those premises is that an exact copy of you is you. It would feel what you would feel, suffer as you would suffer, and react as you would react. To a materialistic atheist, it would be no different from you.

I am a bookseller. I have recently seen a first edition of Hemingway's For Whom the Bell Tolls. I have in my store a rather nice facsimile of the same book, the only detectable difference being an entry on the copyright page. If I were to sell the facsimile as a first edition and were found out, it would ruin my reputation -- and if the publisher had not included that entry on the copyright page, the book would be not merely a facsimile, but a counterfeit.

An exact copy of you would be a counterfeit you. In fact, the super-intelligence could make endless copies of you if it were so inclined. Differences in experience would start to occur almost at once, and each copy would become a different person as time went on. If so, which one would be you? All of them? None of them?

The notion that an exact copy of you would be you is atheist theology, based on the idea that you are no more than a physical being. I consider it a claim to know more than can be known, so one might call it a superstition or a religious belief.

And short of creating a new body for you, some of those doing the theology of the singularity speculate that you could do a mind upload, which would give you a bodiless existence in the cloud. But would that be you? Again, once the copy of you is in digital form, it can be copied endlessly. None of the copies would be you. They might act like you, or they might not, depending on how badly the copy gets corrupted and how different the urges of an expert system "living" in a machine are from those of a person living in a body.

What you could create would not be you. It would be a sort of software monument to you. Theoretically, a super intelligent machine could more easily create a software version of your mind than an entirely new you, but in either case, what motivation would it have to build monuments to inferior beings?

The next problem is that the assumptions about the singularity are that it will come to evolve differently than machines have to date. Up to now, machines have evolved the way ideas evolve, being designed and built by humans. If a machine started designing better versions of itself, its motivations would have to be those designed into it. Yes, you could even program it to be motivated to build software monuments to internet billionaires, but that seems like a vainglorious use of a powerful machine. At the point where we have "conscious" machines, they will be designed to simulate consciousness, which will be a signal to start an endless controversy about what consciousness is.

But part of the theology of the singularity is that consciousness is an emergent property, which will appear when the conditions are right, such as sufficient intelligence, sense data and memory. I see no reason to assume that this is the case, and I posit that any conscious machine that we create will be designed to be conscious, with its motivations in its software.

Which brings us back to Roko's Basilisk. It can only be created if we create it, and do so in a way intended to harm ourselves. I wish I could be certain that fearful, superstitious people would not do that.