About Me

Variously, a film/video editor, programmer, author, teacher, musician, artist, wage slave

26 July 2010

Why Verizon Sucks Big-Time

Verizon is a large, heartless media monopoly designed to pump money from its captive customers into its corporate coffers while nominally providing telephone and internet service. Why does it suck? Let me count the ways...

A while after I had FIOS installed, a power shutdown over a weekend revealed that Verizon, in stealth mode, had altered my land-line telephone service. It seems that Verizon uses FIOS as bait to change telephone service to its liking, as it will not install FIOS unless you sign something that empowers them to make these changes. OK, so I signed something when they installed FIOS, which could have been a work order or any number of other things, but how was I to suspect that installing a fiber optic internet service would involve an arbitrary change to my basic telephone service that benefits no one except Verizon? (Answer: by reading pages of microscopic legalese while the hard-working installer drums his fingers impatiently.)

What they did was rip out the copper line from the telephone exchange that had powered telephone service for the past century and replace it with a crummy system that shifts the burden of powering telephone service onto the customer. When power is unavailable, as during an outage, a lead-acid battery, which they supply and will replace only once, powers the telephone for up to 6 hours. Why a simple lead-acid battery should last only 6 hours beats me, but that is Verizon's own outside estimate of the time it takes to discharge the battery. After that, you are on your own. And the battery tends to last only about a year, so after they replace it once on their dime, another year passes and I am expected to run down to a Radio Shack and happily shell out $20+ for another crappy battery. Et cetera and so forth.
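For what it's worth, the 6-hour figure is at least plausible on the back of an envelope. Assuming (my assumption, not Verizon's specification) that the backup unit holds a common 12 V, 7.2 Ah sealed lead-acid battery and that the fiber terminal draws something on the order of 12 W:

$t \approx \dfrac{0.8 \times 12\,\mathrm{V} \times 7.2\,\mathrm{Ah}}{12\,\mathrm{W}} \approx 6\ \mathrm{hours}$

where the 0.8 allows for the fraction of a lead-acid battery's capacity that can be drawn down before the voltage sags too far. None of which makes it any less absurd that the burden now falls on the customer.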

What's wrong with this picture? By taking out the copper line, Verizon, which will not provide DSL internet service in areas where its more expensive FIOS is available, not only gets to sell copper that was bought at public expense to provide telephone service, with no remuneration to the public, but also ensures that no one will ever be able to provide DSL service over that line, neither Verizon nor a competing company leasing Verizon's wire to offer a rival ISP service. So taking out the copper line is deeply anti-competitive, and the change will bind subsequent owners of a FIOS-enabled house.

Verizon couldn't care less that after 6 hours land-line service is all but guaranteed to fail, and with it any 911 emergency telephone service, because, as I have been assured by a Verizon representative, they have their corporate rears covered against legal liability just fine, thank you.

By electing to remove the copper wire, Verizon has created a whole lot of new costs, external to itself, that could easily have been avoided. The most obvious is the cost to customers of buying new batteries, which have a short lifetime. Second is the electricity each customer must now supply to keep the phone working. At least as important are the costs of diverting resources to make the batteries and of disposing of them, for they contain poisonous lead and acid and must be recycled responsibly. The fact that many batteries will not be recycled properly will put an avoidable stress on the environment wherever they ultimately land. But, hey, this is not Verizon's problem. It is so much not their problem that Verizon does not even offer to recycle the toxic trash mandated by their corporate grand designs.

This all adds up to one huge boondoggle. Verizon is in a position to dictate to anyone desiring fiber optic internet service (albeit service choked down to a fraction of its potential speed, as Verizon conditions its customers to accept virtual spoonfuls of data for real money) that they must agree to egregious changes in their land-line telephone service, costing customers and the environment alike untold billions in avoidable expense, in the aggregate.

I contacted the FCC, which I considered the only responsible thing to do, given that this idiotic change in basic telephone service makes emergency communications far less robust in an era when national security is being threatened on all sides.

Before I heard anything back from the FCC, I was first greeted at my door by two Verizon "representatives," whom I turned away, thinking they were high-pressure sales types. Then I got a call from an oily-voiced Verizon "representative" who wanted to "answer questions" I might have about my Verizon service. Only when I asked did this "representative" admit that, yes, he was calling because the FCC had contacted Verizon about my complaint. It turns out that the FCC does not at first investigate complaints on their face value, but instead has a knee-jerk policy of notifying the company in question so that it can reach some sort of understanding with its aggrieved customer directly. The "representative" did his best to put a happy face on what I would, after all, just have to accept, because these were the terms of Verizon's service...

Further contact with the FCC was even bleaker: they claimed to have no authority to do anything, but said that if I wanted to pay something like $140, I could file a so-called formal complaint. I was also advised that 911 issues were a state matter (which I find amazing, on the face of it).

Now, back in the early 1970s I had a Bell Telephone "representative" burst through my office door to see whether I had (gasp!) an answering machine hooked directly to the telephone line. At the time, Bell was trying to extort $60/month for their Codaphone answering machine (maybe $500/month in today's money), claiming that any alternative risked the lives of telephone employees. (Seriously!) Needless to say, I complained to the FCC and the Pennsylvania PUC. Back then, I got a two-page letter from the FCC explaining what they were doing about my complaint, which echoed those of many other people. From the PUC I got nothing but a number to refer to if I ever wrote about it again. Bell's egregious behavior was rewarded with its breakup. I can only hope that a similar fate awaits Verizon.

Having written this, I now think that the Department of Homeland Security and the Environmental Protection Agency might be interested in this idiotic move by Verizon even if the FCC doesn't feel they have jurisdiction. We shall see.

20 July 2009

Prosecute Bush's Criminal Cohort

To read the posts of Harper's Scott Horton in his quest to bring to justice the numerous perpetrators of crimes, particularly torture, committed during the recent Bush administration is an exercise in extreme frustration. Few can seriously argue that such prosecutions would be of merely academic import, or that they would amount to little more than a partisan witch-hunt. The United States, after all, is a signatory of the various Geneva Conventions outlawing torture, and the Constitution gives such international agreements the same force as our domestic laws. The argument that the perpetrators were "only following orders" and cannot be held responsible is without merit, as established at the Nuremberg trials at the end of World War Two. Although a certain degree of national discomfort would doubtless accompany the prosecution of torturers and of those responsible for formulating torture policy, it would be far less disruptive than friends of the potential defendants self-servingly predict. The objective is not to stage a legal circus to persecute out-of-power bureaucrats; rather, the trials are necessary to prove that the laws apply to everyone, high and low, that the illegalities of the past eight years were an aberration, and that we, as a country, must reassert our allegiance to lofty principles and show that our actions are guided by them.

14 February 2009

Paying for News on the Internet

Since the internet has become ubiquitous, traditional sources of information, such as newspapers, magazines and encyclopedias, have been plagued with declining readership and thus declining advertising revenue, while costs, as always, have trended upwards. As papers have cut back resources, fallen prey to takeovers and gone out of business in increasing numbers, there has been a lot of hand-wringing, and rightly so, about exactly how all those freeloading internet readers can be made to pay for the costly content newspapers provide. As the internet has opened a new and unprecedented universe of choices for readers of news and opinion, the mechanism for extracting some sort of compensation for this privilege has been seriously compromised.

The conventional business model for newspapers and magazines has consisted of two revenue streams, one much larger than the other. The first, and most obvious, is the subscription fee or per-paper price at the newsstand. This has traditionally been a nominal charge that hardly pays for the costs of distribution, but it avoids the impression that the paper is "free", and thus of diminished intrinsic value. Far more importantly, the nominal cost is a way of proving that so many thousands of readers are, in fact, reading the paper. The second, and far more significant, source of revenue is, of course, advertising. Given the proven (and audited) number of readers, and lore about the "page-views" commanded by ads of certain sizes and positions throughout the paper, newspapers and advertisers have worked out mutually satisfying schedules of prices for a whole raft of ad sizes and display options.

The first translation of newsprint to internet distribution followed more or less the same model. Some papers with captive audiences that have the means to pay, such as the Wall Street Journal, charge a subscriber's fee for more than first-page access, and, needless to say, others have tried to follow suit, usually with dismal results. Advertisements are presented, as in print, surrounding the news "content". However, the payment model for ads on the internet, instead of being based on page-views by a known number of print subscribers, is based, not surprisingly, on "click-throughs", i.e. the number of user clicks through to the advertiser's site. Notice that this is a higher standard than for print ads: it is almost equivalent to counting the number of readers who stop and write down the telephone number in a print ad, if such a thing were possible, since a click demands more attention than simply being on the same page as an ad.

Given the many ways to audit web traffic, it would certainly be possible to monitor the actual time readers spend in proximity to web ads, and this is certainly being done in research on web advertising. However, I would hazard that there is a hidden disincentive to do so: almost certainly, ad rates are based on an over-optimistic view of the reader's attention, and more precise research would probably shake the foundations of current ad pricing, to the detriment of publishers. What makes me think this? The fact that click-throughs have produced disappointing results, on the one hand, and that ads of all kinds are abhorrent, an opinion I think is widely shared, on the other.
If there is some truth to the last paragraph's assertions, the problems with newspaper publishing on the web have not so much produced new problems as revealed the deeply incorrect notions that rationalize advertising pricing altogether. What the internet provides, as cable and satellite TV provide over network TV, is a diversity of offerings and a freedom of choice for readers and viewers, who heretofore have been captive audiences. A dilute audience, whose attention is spread over many outlets, simply cannot generate the same revenue stream that a concentrated or captive audience can, without raising per-view prices into a stratospheric region that advertisers and their customers refuse to pay.

This is another example of the poverty-in-riches paradox that has visited many beneficiaries/victims of various applications of technology. Another example can be found in video: equipment costing perhaps $3,000 today is capable of producing shows that would have required $1M of equipment 20 years ago. Can it be said that video productions, either in quality or quantity, have increased in approximately the same proportion? No. Instead, we have a whole new type of video on YouTube and similar sites, which defies comparison to network television productions but would be utterly impossible without ongoing changes in multiple technologies. The lesson seems to be that new technologies have forever changed the old models, and with them the institutions that grew up under previous technological regimes.

That the news must be researched and written is a given, and, despite recent discouraging demographics, I think it is also a given that there will always be an important population of readers of news. The link between democracy and a free press need not be debated: each depends on the other. Given these momentous stakes, isn't it time to look beyond traditional means of privately paying for the news, toward means that tap deeply into public sources of revenue, and to ask whether such sources can, for once, be insulated from political manipulation? I think the odds of finding a workable way to fund the news in this direction are much better, now and into the foreseeable future, than those of continuing to depend on faulty and delusional rationalizations to pay for it with advertising revenue.

18 January 2009

Andy Wyeth is America's Greatest Artist

After reading obituaries such as this and this on the occasion of Mr. Wyeth's passing, 17 Jan 2009 at age 91, I thought it only right to set down my defense of his artistic integrity, which hardly needs any defense in light of his near-universal popularity here in the heart of Wyeth country in Chester County, some 35 miles west of Philadelphia. Small wonder, this, as living in Chester County, at least in February and March, is the near equivalent of living within a Wyeth painting, so accurately did he depict the light and texture of the place. If this means that we, the natives of Wyeth country, are the hardscrabble rural unfortunates the dour obituary writers say Wyeth populated his world with, then so be it: I can imagine worse company to keep.

Years ago, between the shooting and publication of Life Magazine's famous story on Wyeth, as it turned out, I talked to Andrew Wyeth to express my interest in making a film documentary about him. I had dallied over how to approach him with my nebulous ideas, and had hoped for an introduction through Lucius Crowell, also a realist artist (and one deserving a much wider audience). Lucius, a friend of my parents, said to just call Andy, as he called him, and explain what I had in mind.

With some trepidation, I did. Wyeth was friendly, as I explained that Lucius had said to call him, and I briefly pitched my idea of a film portrait of him at work, uniting him with the landscapes that I myself loved so much. He was gracious, but demurred, saying that he was a rather private person and that Life was coming out with a big story shortly, and that this would be enough exposure for the moment. I thanked him and said goodbye.

Some years later I heard that a relative, a nephew named Denny McGowan, I believe, was working on a documentary about the Wyeth clan which, if it was the same project as the show later aired on PBS (in the 1980s?), turned out to be not dissimilar to what I had in mind.

So, what might have been a career-forming experience turned out instead to be just an obscure footnote. However, Wyeth's willingness to hear me out added substance to my admiration of his art, which is nothing if not honest and sincere.

From the second obit, in the New York Times, comes the following quote of Wyeth: "I put a lot of things into my work which are very personal to me. So how can the public feel these things? I think most people get to my work through the back door. They’re attracted by the realism and they sense the emotion and the abstraction — and eventually, I hope, they get their own powerful emotion."

19 July 2008

Casting Aside Anonymity

As I have absolutely no record of anyone having read this blog, my emergence from behind the nom de blog Viktor Pyke is of absolutely no consequence whatsoever, except that, in future, a search on my actual name may cough up my association with Time's Arrow. Harrumph.

13 July 2008

What's Wrong with Google

Google has been fabulously successful, as everyone knows. My understanding of their basic search algorithm, which gave (and gives) noticeably better results than the visible competition, is that it simply uses the relative number of links to a site from other sites as 'votes' for how 'most people' would rate the target site. Thus, it is remarkably like, and for all I know probably derived from, the practice in academia of rating academics by how many citations their papers accrue. For this reason, I suspect there would be a sound basis for challenging some basic patents that Google must hold in this area, as this technique of 'rating' (which is what a search engine does) has plenty of prior use.
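A minimal sketch of that 'links as votes' idea, in Python (my own toy illustration, not Google's actual PageRank, which weights each vote by the rank of the page casting it and adds a damping factor):

    from collections import defaultdict

    # Toy web graph: each page lists the pages it links to.
    links = {
        "a.example": ["b.example", "c.example"],
        "b.example": ["c.example"],
        "c.example": ["a.example"],
        "d.example": ["c.example"],
    }

    def rank_by_inbound_links(link_graph):
        """Score each page by how many other pages link to it ('votes')."""
        votes = defaultdict(int)
        for source, targets in link_graph.items():
            for target in set(targets):   # count each linking page once
                if target != source:      # ignore self-links
                    votes[target] += 1
        # Pages with more inbound links rank higher.
        return sorted(votes.items(), key=lambda kv: kv[1], reverse=True)

    print(rank_by_inbound_links(links))   # c.example comes out on top with 3 'votes'

Counting citations to rank academic papers works in exactly the same way, which is why the prior-art question seems worth asking.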

But that is not why I write this. My intention is rather to question what the effect of Google's ubiquity has been on the quality of the web, and, the somewhat different problem of whether Google is yielding to probable temptations to alter their search results for their, rather than the searcher's, gain.

One obvious effect of Google has been to commercialize just about every square inch of leftover screen real estate on a significant proportion of sites of all types. Although the ads are presumably less obtrusive for being simply text, they are finding placements in increasingly obnoxious proximity to the 'content' text of the site, such that it is not uncommon to find yourself wondering, in some instances, which is which. As basic ad placement is the prerogative of the site owner or designer, Google can hardly be faulted for big errors of design, per se.

However, I would counter that Google can indeed be faulted for this, simply by providing the financial incentives for creating the web equivalent of thousands of miles of strip malls. This profits Google, the site owner (inasmuch as he is able to profit), and the companies paying for the ads; in short, everybody but the web surfer, whose attention has been commodified without his realizing it (and the site owner again, inasmuch as he has diluted his own message and been a party to the hijacking of the surfer's attention).

This simple analysis might be contested by a capitalist, who might vigorously opine that the surfer, too, stands to gain from his encounter with the ad, as it may satisfy a need, albeit a need that the surfer may not have realized he had. And, of overriding importance, each time such an encounter results in money changing hands, it greases the wheels of commerce, contributes to the general well-being of society, and serves to illustrate the wonderful efficiency of the marketplace.

Google is providing an unprecedented degree of 'relevance' in the placing of its ads. The idea is to provide surfers with ad options that are as semantically close to the content as possible (and to companies placing the ads, potential customers who are already 'thinking around' their product area). This narrowly targets specific ads to specific potential customers, at least insofar as the content describes the wants of the people who read it.

Google can populate leftover screen space with ads very cheaply. The ad content seems to be generated dynamically, that is, at the time each page is served, rather than in the conventional manner, as part of the page layout phase. This, plus the automated collection of click statistics (the basis for payments to the site owner), makes Google ads very cheap for a fairly high-quality pool of potential buyers, which in turn assures good fees for Google.
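As a toy sketch of that serve-time matching (entirely my own invention; the ad inventory, keywords and scoring below are made up, and the real system is proprietary and vastly more elaborate):

    # Pick the "most relevant" ad for a page by naive keyword overlap,
    # computed at the moment the page is served rather than at layout time.
    ADS = {
        "Discount garden tools": {"garden", "tools", "lawn", "spade"},
        "Cheap broadband deals": {"internet", "broadband", "dsl", "fiber"},
        "Photo printing service": {"photo", "camera", "print", "film"},
    }

    def pick_ad(page_text):
        words = set(page_text.lower().split())
        score, best = max((len(words & keywords), ad) for ad, keywords in ADS.items())
        return best if score > 0 else None   # no overlap: fall back to a generic ad

    print(pick_ad("Our review of the new fiber internet service"))   # Cheap broadband deals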

The last quality of Internet advertising Google is exploiting is the gathering of valuable statistics on all aspects of their operation, which has allowed them to fine tune it on a near-real-time basis.

So, one might ask, what does all of this mean to me? I feel it means a great deal, because it is the first time that such a large-scale operation, with its coordination of content and advertising, has blanketed the Internet (virtual) landscape to such a depth that the experience of using the web is being profoundly subverted for economic purposes. It also has the potential for changing the usefulness of Google's primary product, namely, its search engine.

Let's deal with the potential effects on Google search first. The prime utility of Google, or any search site, is whether it yields useful results in a reasonable time. This has been done amazingly well for billions of searches. A main objective of those who profit from search engine exposure is to be as close as possible to the top of the list of search results. Methods of spicing up pages to favor top listings by search engines are the stuff of hacker legend.

All of this worked great when the Web was young. Now it has grown up, and dreams of a Universal Library have fallen by the wayside as a horde of would-be billionaires seek to exploit every corner of the Web to generate fabulous profits by one means or another. This has served to all but eliminate 'Page Not Found' warnings, for instance, which are now replaced by countless 'value-added' redirection pages, dolled up to appear as neutral indices of possible interpretations of the URL that was not found and which are, you guessed it, click ads. My ISP, Verizon, is so smarmy that it intercepts 'Page not found' responses and produces its own click-ad pages instead. The effect is somewhat like looking for a book in the library and, when not finding it, being thrust into the middle of a bazaar filled with loud vendors hawking their wares. I may be overly sensitive but--no thanks.

Sites such as these, really 'meta-sites', seem to be finding their way into search results. I have found that, often, the first ten or more items of a request on a broad category yield not primary sites but bogus meta-sites that purport to be asking you to refine your search. Now, for all I know, all of these pages, which are not recognizable companies, are simply CGI fictions that are built dynamically by someone onto a spare server, hoping that the user will make a click that will generate revenue. The outcome is that I have to look further down in the original results to get the real sites that I had in mind to begin with.

I should think that the temptation to do this would be irresistible for many self-righteous profiteers determined to mine such opportunities as the Web affords to make a fortune.

It shouldn't be hard to see where I stand on the commercialization of the Web: I think it is as misguided as putting advertising on every page of every book published; as placing billboards on every inch of every highway, byway and cowpath; as running a constant crawl of ads on both top and bottom of every movie and entertainment produced; as having loudspeakers blaring ads during every musical and theatrical performance; that bad, and worse. Why? For the very reason that my examples seem ludicrous; because carving up the surfers' minds and selling them off to the highest bidder may be 'the American way' or the most perfect example of economic efficiency yet devised, but it is wrong. When the very quality of experience is up for grabs, it is devalued to arbitrarily low levels by the elites who stand to profit therefrom.

Advertising is predicated on the notion that attention, the spotlit, center stage of consciousness, can be divided into arbitrarily small temporal units. Now, even though there have been attempts to derive profit from sub-second exposures (e.g. subliminal advertising), such attempts, when recognized, have been rejected as being too blatantly manipulative, whereas ads of, say, 5 seconds are 'fair game' and ads of 10 seconds are commonplace on television. However, even after all these years of 'accepted practice', I think it is still worth contemplating what effect this Balkanization of consciousness has had on individuals and society at large.

The easiest way to get an idea of what people, on average, were able to think about in earlier times is simply to read old newspaper articles, easily done on the Internet. Without belaboring the point, it is easy to observe much longer articles, in greater depth, with larger vocabularies in papers published 50 years ago than today. The same measures can be used to compare 50- and 100-year-old papers; I say that the 100-year-old articles are even more 'difficult' than those half their age, but I would hazard the opinion that the 'dumbing down' has been much greater in the last 50 years than in the 50 previous. Why would this be?

It was roughly 100 years ago that education began a long decline, when its philosophical emphasis shifted from a 'leading out' (Latin e- + ducere) of individuals from the darkness of ignorance to the utterly different emphasis of preparing students to be functioning workers and citizens. The reasons for this momentous change were many, including the emergence of huge industrial concerns in need of trained labor, the influx of large numbers of immigrants needing to be integrated into society, and a new faith, doubtless arriving on the heels of industrialization, that informed a new ethic of placing people in their 'rightful place' in society and the workforce.

So, the noble goal that had illuminated the process of education since classical times, forming discerning minds, was tossed in favor of the 'practical education', training, really, of fodder for industry. No use preparing the masses for a life of contemplation, we can imagine the captains of industry saying among themselves, when they will be spending their lives as part of the machinery of modern life.

What was lost? Ancient Greek and Latin, which from the Renaissance until around 1900 had been a window onto realms of classical literature, philosophy, philology, history and Roman law for countless students, and were the common ground from which grew our republic. Euclidean geometry was, for twenty centuries, the basis for learning mathematics, not simply as a set of techniques to be applied, but as a mental discipline for establishing incontrovertible proofs. The fruits of mathematics can be applied by anybody in possession of a computer, but the basis at its root, mathematical thinking, if you will, is no longer a part of everyone's schooling. The inclusion of music and art as an integral and necessary part of everyone's education has been swept aside and pushed into optional 'electives'.

I propose that the first great degradation of public literacy was one result of degrading the life of the mind in general through the wholesale destruction of an educational tradition of many hundreds of years, all in the name of 'modernity' or 'efficiency'. The second, and greater, degradation, down to the 'sound bites' of today, is easily chalked up to the brain-numbing exercise of advertising, visited on everyone foolish enough to subject themselves and their children to commercial television. And, not coincidentally, the first process made the second possible, as an uneducated public is a pliant one.

And now, the Internet, a medium of supremely more promise than television, has been set upon by the same soldierly ants who commandeered television to be an instrument of plutocrats. I predict that they will not stop until every page of the Internet is laced, through and through, with mind-altering, mind-mincing ads, making sure that every surfer's attention is never far from the business at hand, namely, someone else's need to make money. Although Google may assert that their business plan is in no way 'evil', I would counter with the observation that Google and its imitators are destined to be known as the destroyers of the Web, not an enviable reputation to have, and hardly benevolent.

28 October 2007

The Undoing of James Watson

It is difficult to say what James Watson, the co-discoverer along with Francis Crick (and, some would say, Rosalind Franklin) of the structure of DNA, was thinking when he recently remarked that he felt that people of African descent were less intelligent than the rest of humanity. This resulted in the cancellation of a book tour and then his retirement, at age 79, from the Cold Spring Harbor Laboratory he had been associated with since the 1960s. A similar reception greeted the authors of a book called "The Bell Curve", which similarly made an argument for racially based differences in intelligence.

Such assertions, whether or not they have any merit whatsoever, are absolutely irresponsible in the way that lighting a match in a theater filled with people and gasoline fumes is irresponsible: there is no way that the outcome can be beneficial to anyone, which, as in a few cases such as this, moots the truth or falsity of the assertion. As real human beings are put at risk when such statements are made, their welfare trumps whatever brownie points are piled up somewhere by "furthering human knowledge", if that is purported to be the point of morally questionable science.

That said, it is still worth considering the issue as a metaphysical question: not to assert something about the intelligence of different groups of people, but to consider what moral or practical imperatives MIGHT pertain IF such a determination were made sometime in the unspecified future. In fact, findings of differences in intelligence could as well have nothing to do with "race" or place of origin, but might be tied to medical conditions or specific genetic differences (such as Down syndrome). In short: what then?

The larger issue here is whether "science" is competent to, or should be allowed to, make life-altering pronouncements based on presumably "objective" criteria asserted to have some role in predicting, say, whether some people are more likely than others to graduate from high school or college.

Whether or not you support such efforts to predict human outcomes, scientific or otherwise, such "objectification" of the decisions of admission officers at schools or recruiters for corporations or the military has been of course a fact of life for practically a century in the US and most likely many other countries as well.

A crucial part of this process of sorting wheat from chaff has been the widespread acceptance of the concept of intelligence as an objective (i.e. measurable) trait with broad predictive value. Now, this may resonate with commonly accepted notions of being "bright" or "dull", but the full-fledged operational definition of an IQ or "Intelligence Quotient" hardly came about by accident. No, it was developed as a concept by the military and large corporations as a way of justifying selectively investing resources (e.g. training dollars) in some people and not others. Never mind that the concept might itself be flawed, as forcefully asserted by Stephen Jay Gould in his well-known book on the matter, "The Mismeasure of Man"; the truth of such objections mattered little compared to the utility of being able to (arbitrarily) assert, beforehand, that some people were more likely to succeed than others, so OF COURSE it made economic and moral sense to give those lucky people the means to succeed while turning away the losers.

So, anybody who makes assertions about "intelligence" as if it were an objective trait of humans, a number that can be easily obtained by administering one or another series of questions, has, at the very least, bought into a very dubious set of ideas with very large moral and practical consequences, and so is quite likely to suffer from many other illusions as well. The undeniable fact that people who support the idea of "intelligence" and IQ testing have, at some point in their careers, doubtless benefited from IQ tests (because, after all, they went to exclusive colleges, which have given them the platform from which to make such pronouncements "with authority") should tell you something about the circular nature of the intelligence game.

Far better, from both a moral and a practical point of view, is to stop viewing opportunities as scarce resources to be handed out parsimoniously, and instead to lavish opportunities across the whole spectrum of the population, giving all the chance to make the most of education and career. Not only does this approach square with our true knowledge of future performance, it is the only way to prevent being blindsided by bias and bad judgment and wasting a large proportion of the promise inherent in any population.

18 September 2007

God as Consciousness

Regardless of the endless, unanswerable questions about the purported existence or demise of God, and the truth of the answer to this seemingly fundamental question, there still remains the real riddle: What/Who is this thing/being called "God"? Why does this concept have all but universal acceptance in human cultures? How can the philosophical or abstract 'God of Nature' also be a 'Personal God'? How is one to distinguish one from the other? How could the concept of God be entertained seriously by some of the greatest minds of humanity? How can their pious conclusions be so varied, yet, at least in the case of J. S. Bach, be accepted almost universally as an expression of—*something*? If not God, what?

An answer that appeared to me to resolve these sorts of questions is that God, in the most general sense, is another name for consciousness (or, more precisely, *self*-consciousness). Whether this might be construed as *consciousness itself* (the world and the mind apprehending it) or simply a name for the *portal of consciousness* (the Enabling Force that permits self-awareness) is difficult to determine, even now.

This confusion, which might be expected with such intangible abstractions, has given rise to the two main schools of thought about the nature of God. Those who hold that 'God' is the *content* of consciousness are saying, practically by definition, that God 'is Nature'. Those who see consciousness as a numinous mystery with portentous consequences are again saying, practically by definition, that God is a 'Personal God'.

Although these two ways of looking at the nature of God seem to be diametrically opposed, they would seem to be merely two different aspects of resolving *what* is being identified when one tries to objectify consciousness or self-awareness by naming it.

Why would one want or need to give consciousness or self-awareness another, even more abstruse, name (or Name)?

Considering the evident antiquity of the concept of God, or Gods, the answer to this must be found in the twilight before the dawn of history.

Whether or not the naming of consciousness by early humans was a 'conscious decision' is highly debatable, but considering the consequences of the rise of self-consciousness, I believe there can be no doubt that early humans understood its significance. Indeed, early humans would have understood the phase-change happening in their midst far better than anyone since.

To be suddenly aware---literally---that there were profound, yet indescribable, differences between yourself and many of your fellows is difficult to describe, even now. Whether this took many generations or only a few, the 'dawning of awareness' must have been experienced by those in whom consciousness was emergent as 'sudden,' even as we are aware, on regaining consciousness, that just a moment ago we were 'asleep' by comparison. It is equally certain that the change from 'being asleep' to being 'awake' must have happened at different rates among different, particularly isolated, groups, as mutation rates would have varied quasi-randomly. The condition of being self-conscious must have profoundly and forever sundered the first self-conscious humans from their less-self-conscious precursors. (This can easily be appreciated by those who have contemplated the differences between modern humans and Neanderthals, even though there seems to be ample evidence of interbreeding.)

The emergence of human self-consciousness must have been accompanied by a preoccupation with the nature of self-awareness. There can be little doubt that this must have been of great pragmatic interest among First Aware Peoples, because it touched upon every aspect of their intra- and inter-tribal encounters.

This event, and I think it was an event, must, like the harnessing of fire, have happened time and again in different places and times, but at one point in history it 'took fire,' caught on, and from then on was indispensable to the survival of all prehuman tribes.

This also must have created great anxiety of a sort previously unknown, as previously unthought-of existential questions suddenly became a central fact of life. This great angst must have created a collective need to identify and control consciousness itself, as the hall of mirrors of self-awareness would have been at least as terrifying as any external danger. Indeed, it may well be this that is alluded to in the Biblical story of Adam and Eve, who were cast out of their comfort zone, Eden, as a consequence of partaking of the Apple of [Self]-Knowledge.

So, the concept, and the name(s), of 'God,' or a panoply of Gods, might first have been a way of identifying one's consciousness, a sort of password for entering the new club of humanity while keeping out the dross of pre-conscious (less fit and thus less attractive) hominids. At a time when genetics itself did not preclude cross-breeding between pre-conscious and self-conscious hominids, the God-concept would have become the means by which the sapient could exclude their pre-conscious potential breeding partners from entering, and swimming in, the newly forming human gene pool.

It is easy to imagine that this landscape of changing degrees of consciousness caused a period of great strife and bloodshed, wherein the new humans sought to differentiate themselves from their recent forebears, killing those who would not or could not embrace the new modality of awareness. As these varied groups of hominids occupied more or less the same ecological niche, identical in every way except for having different recursive levels of awareness, the fight for supremacy must have been deadly. This is easy to believe, in that similar struggles over doctrinal issues have been singularly brutal throughout human history.

Thus, humans entered the historical period having long before murdered all closely related hominids, a burden that could well have been the origin of the persistent concept of Original Sin.

Self-consciousness might have caused newly conscious societies to exercise control over what must have been perceived as a potentially anarchic influence. Although self-consciousness is a sine qua non of being human, it is also fertile ground for cultivating alternative thoughts about how to live life, who should have what, and so forth. By asserting a hold over access to the Gods, a priestly class would have been seen as beneficial for retarding the anxiety-producing side-effects of being *too* self-conscious. Ultimate submission to the authority of religion, then as now, would have diminished the angst of standing, naked, before an unfeeling Universe, stripped of the comforts of a pre-conscious Eden.

These apparent contradictions have remained with us throughout history. Understanding 'God' as consciousness (or Gods as aspects of reality and personality) makes some sense of what otherwise seems to be utterly devoid of objective meaning. Religion, theology, theogony (genealogy of the Gods), and personal revelation can all be seen more clearly as an expression of and a means to cope with humanity's uneasy relationship with itself and its self-awareness.

Whether 'God,' as understood until now, has had survival value, in the relatively short term of human history, is debatable. But whether 'God', as now generally understood, will have survival value in the longer term seems very dubious at this point.

The God-concept has been used not only to exclude the nearly human from humanity but also to consign Nature Herself to a subordinate role in human affairs. But now Nature, increasingly spurned by a secular world, has been rising to reassert Her primacy in the affairs of humans and all other living beings. Even as Nature's workings have become more visible in light of the infant sciences, so has the wreckage of Nature's biosphere become more obvious, besmirched by many heavy, human footprints.

So, humanity must again undergo a revolution of consciousness, this time inverting the God-concept. Even as God, or self-awareness, has heretofore been exclusive, now it must become all-inclusive. Our Jealous God must, at long last, become a Loving God, who shall again embrace Nature's fecundity, and together they shall give birth and nurture to a renewed Eden.

Media Technology

The "media" as we know them started with the invention of photography, in June or July of 1827, by Joseph Nicephore Niepce. There had been many precursors, the camera obscura ("dark room") being perhaps the most prominent. But Niepce brought together an image focused inside a box with an image-fixing medium that reacted to light but was then treated so as to stop further exposure when the image was formed.

Countless modifications by many other pioneering photographers perfected the photographic process. What started as a many-hours ordeal to expose ended as the snap of a shutter. Largely experimental processing was replaced by clearly-defined steps with commercially-available chemicals. Transparent negatives would be used to make unlimited numbers of prints on sensitized papers. Photo-etching processes and halftone allowed widespread publication of photographs in newspapers and books.

Soon work began on motion pictures. Eadweard Muybridge and others took photographs in series, of horses galloping, people walking and so forth. When the individual photos were placed inside the zoopraxiscope, either projected in series or revealed through slits in the walls of a spinning cylinder, they appeared to move. It was not long before George Eastman's photographic film was perforated and exposed in Biograph cameras, and prints were projected, frame by frame, with a Maltese cross movement, illuminated by carbon arc lamps, filling movie theaters around the country.

The holy grail then moved to sound reproduction on movie film, achieved by two methods of creating a stripe alongside the pictures that modulated the light passing through to a photocell. Movies became the talkies.

As radio moved sound through the "airwaves", it became clear that pictures would soon follow over the air--television. The first broadcast featured Felix the Cat, whose image was scanned by a rotating disk and decomposed into a stream of brightness levels that could be transmitted and reconstructed at the receiver. More sophisticated methods appeared; the iconoscope and its successors scanned an optical image with electrons, which was recreated at the receiver by painting it onto a cathode ray tube. Television was a reality before World War II, but became a mass medium after the war.

Television began a long period of incremental improvements and new inventions. Videotape became a reality in the 1950s and, gradually, computer controllers permitted video editing. Film negative and equipment improved incrementally as well, with more portable equipment, better images and less bulky sound equipment.

Claude Shannon wrote a paper in 1948 that rationalized the process of sending a signal over a noisy channel. This was initially of most direct utility to the Bell Telephone system (he worked at Bell Labs), but it laid the mathematical basis for understanding how much information could be carried by a given medium, giving reasonable engineering goals for a host of communications equipment.
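The paper's best-known result is the channel-capacity formula, given here in its usual textbook (Shannon-Hartley) form:

$C = B \log_2\!\left(1 + \dfrac{S}{N}\right)$

where $C$ is the maximum achievable error-free data rate in bits per second, $B$ is the channel bandwidth in hertz, and $S/N$ is the signal-to-noise power ratio. It set a hard ceiling on what any channel or recording medium could carry, which is exactly the sort of reasonable engineering goal mentioned above.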

In the 1960s and beyond, television equipment was radically improved by using transistors, computer controllers and high-tolerance manufacturing. Japanese companies revolutionized video with U-Matic, BetaMax and VHS and then C-format 1/2" tape technology. These created a huge home video market and permitted acceptable video production and editing by less-highly-trained staff. Cameras began using solid-state CCD pickups, which were longer-lasting and more stable than tube pickups, as well as much easier to set up and use.

With miniaturization, film lost out to smaller-format video for news gathering and documentaries. Video editing systems for small-format tape became frame-accurate and could produce an Edit Decision List (EDL) to semi-automatically assemble edits on broadcast tape.

All of these factors lowered the cost of making broadcast-quality video while lowering the technical requirements compared with the previous generation of video equipment. This was a boon to broadcasters, which lobbied the FCC successfully in the 1980s to soften quality standards for broadcast video and loosen training requirements for the operators of video equipment at television stations. As these factors lowered the overall cost of productions, at least on the technical side, a new sort of semi-professional production house rose to the task of making VHS tapes for industry and individuals.

In the 1970s there was still enough money in industrial films that a $100,000 budget was not uncommon, which was enough to make a product with high production values and still turn a sustainable profit. By the 1990s, the expectations of industrial PR people had been lowered sufficiently that a $25,000 budget was often a tough sell for a presentation to be distributed on VHS. Expectations were conditioned by the fairly high quality of home video equipment costing under $1,000, the seeming simplicity of making home videos and the like, as well as the markedly lower cost of VHS tapes versus film prints. This "price gap" allowed successful wedding video makers to branch into the industrial video market and produce (barely) successful product, with enough cost-cutting to leave a profit margin even on curtailed budgets.

This forced many former filmmakers out of the business altogether, as they lacked the wherewithal to capitalize new video equipment on much lower margins. The wedding video people were able to work their way up because they had already started with cheaper video equipment adequate for home VHS product. This, at least from what I have observed, has led to a wholesale turnover in the industrial market, with much less sophisticated tapes "being good enough".

Herein lies the paradox of technological advances: one would think that cheaper and (for the most part) better equipment would promote better productions. However, the outcome was the opposite, largely because of the altered standards of media buyers, who could see that cheaper equipment could give (basically) the same results as older, more expensive equipment. This, the buyers surmised, was reason enough to expect that productions could be made for much less. The real problem probably wasn't the media buyers so much as their bosses and higher-ups, who generally pride themselves on snap judgments about things (such as lower-cost video equipment) that "obviously" should lower the cost of the videos the company wants to make.

The trouble with this sort of prejudice is that it afflicts every negotiation about prices, driving budgets close to the bone because of a misconception about the costs of production, which are not driven predominantly by equipment costs (though that might have been an excuse used decades before to "explain" too-high costs). This may be a classic case of being bitten by one's own earlier misinformation.

The situation described above pertained to the 1990s. Since then, video technology has moved much further, into the confusing world of multiple High Definition standards, with both amateur and professional cameras recording onto internal optical discs or hard disks, and with editing on computers that can output an HD deliverable directly. Although HDTV has raised the ante, requiring a huge up-front investment in new, high-bandwidth equipment, the battle is already joined to convince semi-pros that some of the newer "prosumer" gear is at least almost ready for prime time. No, this gear will not be good enough for a full-fledged studio production, intercut with studio cameras many times its cost. However, it may be acceptable for news gathering and local documentaries that are simply too risky to make with $50,000 cameras on a shoestring budget.

So, however much digital cameras in the $3,500 range have allowed some freelancers to crack local television markets, High Definition once again resets the mark. This benefits no one as much as the equipment manufacturers and their dealers. Once again, would-be production companies must mortgage their future just to gain entry into the uncertain market of television production.

Probably the only way to make a real difference in local television production would be to renew the FCC requirement instated in the 1960s, requiring that TV stations produce local programming to fill the slot from 7:30 to 8pm. This would create, as it did then, job opportunities for a large number of producers, cameramen and editors throughout the country, as well as a great quantity of locally-originated programming. This would restore a great system for learning production. A public-service requirement would reinstate documentaries at the local and regional level, which, in all likelihood, would be accepted by the public as a variation on the so-called "reality" shows that have become so popular in the meantime.

06 September 2007

The Utility of Life

In an opinion piece in the New York Times, Edward O. Wilson, an emeritus professor of biology at Harvard, makes a plea for an Encyclopedia of Life as a means of documenting the 90% or so of species yet to be discovered. I think it would take a serious case of ostrich-like know-nothingness to object to acquiring this knowledge. However, in this article and elsewhere, there is a serious, if understandable, flaw in the reasons usually given to support such an effort. For example, in this article, Wilson asks and answers a rhetorical question:
Why bother making such an effort? Because each species from a bacterium to a whale is a masterpiece of evolution. Each has persisted, its mix of genes slowly evolving, for thousands to millions of years. And each is exquisitely adapted to its environment and interlocks with a legion of other species to form the ecosystems upon which our own lives ultimately depend. We need to properly explore Earth’s biodiversity if we are to understand, preserve and manage it.

It is impossible to gainsay Wilson's knowledge of biology or his poetry in revealing the beauty of the biosphere. We would all do well to admire the richness of living nature as he does, and to admire Wilson himself for his impassioned call to understand nature in her every detail. However, I still feel that the above quote contains one flaw that endangers all the rest.

The flawed statement is the assertion that we humans are to manage Earth's biodiversity. The sentiment is not the problem. It would be nice to have a benevolent mankind looking out for the best interests of each and every species. It would be empowering to harvest the bounty of countless generations of evolution to apply every trick in nature's armamentarium to conquer diseases and better living conditions everywhere. I think these sentiments are unarguably good. 

However, the call for mankind to manage the biosphere is over-reaching and impractical. In fact, I feel that daydreams of managing the biosphere are categorically wrong and dangerous, simply because neither mankind nor any other agency is competent to "manage" a biosphere. A biosphere is, literally, unmanageable.

We are sustained by the synergistic workings of the biosphere, the sum of the activities of all life on Earth. From life's humble origins in a methane-dominated atmosphere, it has had an active role in shaping the physical environment. Gradually the early methane atmosphere was altered by freeing oxygen, allowing different forms of life with higher metabolic rates to grow dominant. Life has conquered nearly every environment Earth has presented, from the perpetually frozen arctic to the ever-boiling sulfurous vents astride faults. Life permeates Earth as if it were a sponge. 

Each living thing satisfies its needs for nutrients and passes its metabolites back into the life-generated soup that is the environment. Plants feed on carbon dioxide and exhale oxygen whilst animals inhale oxygen and expel carbon dioxide. Every input and output of every organism on the planet is meticulously complemented by other organisms, high and low, each of which extracts every scintilla of energy at every stage. The unequivocal process of rewarding the fit with progeny, while tolerating some variation in reproduction, has produced, through evolution, an astounding variety of life forms and behaviors that, like water, fill nearly every crack and crevice of opportunity with a life form to take advantage of it.

Yet the optimizations of life are not restricted to the designs produced by evolution; the conversion of chemical inputs to biomass is accomplished by the individuals of countless species, each living the drama of its personal existence, motivated by hunger, fear, and desire throughout its life. The whole adds up, on a grand scale, to a homeostatic system of great stability, as each player is but a tiny contributor to the whole. The wisdom of the system, as it were, resides with no individual or group; rather, it is in the countless, intricate feedback loops that include all of life; its wisdom is in the whole.

 Contrast this with mankind, a recent arrival on an ancient scene, whose connection with nature has been largely severed, or so it would seem, by a consciousness that lifts what it sees from a context unnoticed. Modern humans are singularly adept at acting on their misconceptions, and have left a terrible trail of destruction in their wake. It is for all these reasons that humanity cannot and should not be emboldened to be the "stewards of the Earth" or the managers of the biosphere, for not only is this the height of folly, it is almost certainly the death warrant for even greater numbers of species than have already perished. 

The best, if not only, way for mankind to contribute to the stability of the biosphere is to shrink in numbers and influence by at least 100-fold, and preferably 1000-fold or more, and adopt a policy of "zero-footprint" engagement with nature, wherein all resources are replenished, all human waste biodegradable, and so forth. This would be tantamount to returning to the living conditions of so-called aboriginal peoples. 

Whether this would be possible is doubtful, given the general unwillingness to forgo the comforts of modern civilization, not to mention a host of other reasons. Some sort of catastrophic collapse of civilization is all but certain, because this has always been the case. Unlike the rest of nature, humanity organizes itself into fragile hierarchies that mainly serve to amplify the power of lunatics and worse, such as Napoleon, Hitler, Stalin, Mao, Pol Pot and others.

Yet others are more or less directly responsible for large-scale environmental damage, such as Ford, Rockefeller, DuPont, Nobel, and a host of imitators, who have collectively enabled modern industrial society to reshape the world to the peril of everyone and everything. Clearly, the apparent imperatives of civilization are at odds with survival itself, which brings humanity to the grimly amusing position of having to destroy itself to save itself. This sort of paradoxical conundrum is probably best handled by religion if it is to be handled at all. 

This raises the question of whether, at this juncture on our threatened planet, science and technology can continue in their accustomed roles as enablers of human development. After all, as has been proven many times throughout history, survival trumps rationality.

01 September 2007

Transparency in News

The New York Times article talking up torture, cited here earlier, seems to be an example of the media's collusion with the interests of the administration. The mechanism for the coordination of news stories beneficial to the White House is completely opaque to the public, but is doubtless more subtle than a press release marked "Confidential" with instructions on what to say. The publication of Judith Miller's stories about Saddam's purported weapons of mass destruction helped sell the war in Iraq, but they later turned out to be devoid of truth. Relying on anonymous sources of dubious merit, they misled the public and threw the editorial judgment of the Times into question.

The appearance of the "everybody's thinking about torture" article(s), followed by the awful and momentous decision by the administration to redefine the meaning of "humane treatment" in the Geneva Conventions, does not prove there is a connection between the two. However, I feel Occam himself would think it more probable that there was a connection than not, given that the appearance of such an article in the Times would certainly be noticed by administration players who had something to do with setting up torture on a "business-as-usual" basis. This is not a bolt-out-of-the-blue conspiracy theory; given the degree to which administration officials had been able to cloud the Times' judgment through Judith Miller's handiwork, it is just as likely that the paper acceded to informal suggestions or pressure to talk up torture.

I have no proof of a back channel between the Times and the administration, so I must be careful not to declare that this was so. I think, however, that since news coverage so mysteriously helped the war effort along, and since this was a dreadful disservice to everyone involved, suspicions of any sort of collusion should be vigorously investigated.

Yet, who is to do this? Perhaps it is the job of a credible institution connected with news, such as the Columbia School of Journalism. Or, perhaps there should be a Fourth Estate Ethical Board, funded by members of the media, to act as a clearinghouse for the meta-information about news that is now utterly missing.

For instance, about six weeks ago, the Times had a few headlines about "Al Qaeda (in Iraq)" that raised a big question in my mind, namely: Were these stories simply rewritten press releases from the White House? The reason I thought so was that one of the most pernicious and false pieces of disinformation leading up to the war was that Al Qaeda was somehow allied with Saddam Hussein, an assertion based more on wishful thinking than evidence. So, why was the Times willing to sully its "objective" stance and start plugging the administration line? My guess was that it was simply laziness or editorial inattention, and rewrites of press releases.

Although the Times fields questions of this sort through its Ombudsman, this arrangement is not very satisfactory, because there is no assurance that the Ombudsman will find your issue as pressing as someone else's, so many issues go unanswered. Rather, as I mentioned in the piece about quality assurance, the best way of assuring quality is to document every stage of a process in a form that can be reviewed. In retrospect, the correlation between stories that were gung-ho about going to war and "unnamed administration sources" might have flagged such stories as just a form of propaganda, as the sketch below suggests.
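
To make the idea concrete, here is a hypothetical sketch, in Python, of the sort of flagging I have in mind. The phrases and the threshold are invented for illustration; this is not a description of any tool the Times or anyone else actually uses.

    # Hypothetical sketch: count anonymous-sourcing phrases in a story and
    # flag it for editorial review. Phrases and threshold are illustrative.
    import re

    SOURCING_PATTERNS = [
        r"unnamed administration (?:official|source)s?",
        r"senior administration official[s]?",
        r"officials? who (?:asked|spoke) on condition of anonymity",
    ]

    def flag_story(text, threshold=2):
        """Return (number of anonymous-sourcing phrases found, whether the
        story crosses the arbitrary review threshold)."""
        hits = sum(len(re.findall(p, text, flags=re.IGNORECASE))
                   for p in SOURCING_PATTERNS)
        return hits, hits >= threshold

    sample = ("A senior administration official said the tubes could only be "
              "for centrifuges; an unnamed administration source concurred.")
    print(flag_story(sample))   # (2, True)

Run across an archive, and cross-referenced with how enthusiastically each story pushed the case for war, even something this crude would at least surface the pieces that deserve a second, skeptical look.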

At first thought, subjecting news outlets to an ISO 9000 regime might seem completely unworkable and a violation of the independence of the press. It might also seem impossible to have a system for disclosing how the news was gathered in an environment of commercial entities in competition. I don't think that either of these objections would be insuperable: The real question is whether news sources can continue to work with self-defined checks on their own objectivity ("editorial judgment"). The egregious failure (or cooption?) of the press leading up to the Iraq war, when lies were repackaged as truths by a compliant media, means that news outlets should be required to document their news-gathering efforts so that the truth of what is reported can be verified.

30 August 2007

UK News Outlet Shoots Self in Head

An article in the Guardian may portend the end of TV news as we know it, at least for viewers of the UK outlet Channel Five. Apparently "TV fakery" (an oxymoron?) has become an issue there recently. (Here is a bit more about that.) Truthfulness and accuracy in news reporting is an ongoing issue, but the problems and the list of remedies mentioned in the article about Channel Five are utter nonsense, as shown below.

NB: The full story about the BBC apparently showing a clip of the Queen being asked to remove her crown is clouded at best. Here is one account. This follows on an earlier story about problems with a phone-in show. The result is a full-fledged crisis of confidence, and heads are already rolling. My interest in this remains the Channel Five response.

The remedies include: a ban on reaction shots--gratuitous shots of the reporter listening (or "listening")--whose function is to bridge what would otherwise be jarring jump cuts.

Another: "Contrived" walking shots in which people are filmed strolling towards the camera are also out. "These are ghastly," Kermode said. "They are artificial, so we should ban them."

"Cut-aways", shots that include content being talked about, again to avoid jump cuts, will be banned, along with post-facto shots of reporters asking questions, deemed to "rarely look genuine".

The article concludes by asserting that these changes "can help restore trust in our medium and make our programmes more creative too."

Both the objections cited in the article--to "untrue" material in news reports--and the proposed corrections miss being relevant by a wide margin. That is, "TV fakery" in newscasts has, paradoxically, little to do with the "truth" of individual shots, or even whether they appear staged, so the proposed "corrections", and even the question of whether there was a problem to begin with, are clouded by simplistic thinking. The truth of the matter is much more interesting.

First off, the dichotomy between visual narrative and "the truth" has been around for well over a century, even if not at the forefront of the public's concerns. It became clear early on that while single photographs and film shots have a demonstrable, one-to-one correspondence with "reality", when they are combined into a series, suddenly a new and unexpected element appears. This new element is the "reality" created and experienced in the viewer's mind from the narrative inferred from the juxtaposed shots. This implied reality, the reality experienced by the viewer of a series of shots, is not simply the summed "meaning" of the constituent shots considered in isolation. Rather, this "narrative reality" is a near-continuous series of inductive leaps, an evolving interpretation of "what's happening" as shots clash, one with the next. This continuous "jumping to conclusions" to make sense of each shot transition results in a new experience, a narrative flow, what is experienced when watching a series of shots.

The Russian filmmakers Eisenstein and Pudovkin, inflamed with revolutionary zeal and open minds, investigated the emergent meanings generated by shot juxtaposition. The fact that this seemed to be a perfect illustration of the Hegelian Dialectic didn't hurt in the fervid atmosphere of 1920s revolutionary socialism. Recall that the Hegelian Dialectic was appropriated by Karl Marx to describe history as a sequence of thesis, antithesis, and synthesis, each transforming into the next, compelling headlong change. It is not hard to see these steps being retraced in the magic of shot 1 being juxtaposed with shot 2, resulting in an idea, the synthesis.

Pudovkin reduced this to a simple experiment: He took a shot of an impassive actor and cut it together with shots of a bowl of soup, a sick child, and other things the actor was supposedly "looking at". Viewers felt that in the first case, the soup, the actor had done a good job of conveying his hunger. In the second case they felt the actor had projected sympathy, and similarly for the other shots. Now, in each case the shot of the actor was identical, not just similar, so the synthesis, the attribution of "hunger" or "sympathy", was an artifact in the viewer's mind.

Eisenstein also gave a nod to Hegel, but supplied some academic underpinnings for this theory of montage. He showed that many compound characters in Chinese illustrate the same principle. For instance, the characters for "sun" and "moon", when combined, result in a new concept, "brightness". Emboldened by this new take on how the mind works, Eisenstein put theory into practice with the multiple juxtapositions of the forces of tyranny against the people in his memorable Odessa Steps sequence, wherein the hopes of a crushed people find expression in the awful descent of a baby carriage down endless steps to the sea.

These lessons in how to see were taken to heart by the world's movie-going public, which over the years has become immeasurably more sophisticated and adept at interpreting the visual information presented to it. Recall that when movies were first shown, a shot of an approaching train was enough to cause people to run away in panic for fear of being run over, and early films were shot full-figure because of the gnawing feeling audiences had that the actors were "cut off" by the frame in closer shots. That sophistication has only grown in the subsequent century. The vocabulary of conventions available to television news is a very limited subset of theatrical film, determined largely by the constraints of shooting news, but the same skills are used to watch both media, as must be self-evident to anyone who has seen movies and TV.

Given the ubiquity of what might be called standard practice for TV news production, it is difficult to understand the flap about reaction shots, "nodding" shots and the rest. I give my thoughts on them, point by point.

Reaction shots: By dint of being restricted to one camera, any effort to improve the visual flow and remove jump cuts will have to be made with shots taken out of the time sequence of the interview. Yes, the reporter is not "reacting" to anything in particular, but such a "reaction" is understood to be a neutral placeholder that enables the substance of the (edited) statement to be absorbed more easily by avoiding obtrusive jump cuts. (Reaction shots are even used when not strictly necessary, as a form of visual relief, because the eye craves variety.)

Reaction shots "work" for a variety of reasons.

Given the mysterious workings of visual perception, a shot of A, followed by B, a shot of someone looking, is understood as "B is looking at A". Although A and B may have been separated by thousands of miles and more than a lifetime in the real world, the perceived visual narrative is still simply "B is looking at A", ceteris paribus [all things being equal]. How effective this illusion is depends on how well the shots match, the direction speaker and reactor are facing and whatnot, but the illusion of "B is looking at A" seems to reflect how people are wired to interpret a flow of imagery. Think: If this impression were not created in viewers, visual narratives would have to take a very different form.

Another function of reaction shots is to give the viewer a "role" in the presentation, albeit indirectly. If viewers observed an interview or speech in person, they would naturally look around to gauge how the speaker's ideas are being received, and how their impression of others' feelings compares with their own [and, thus, whether they are in the middle of a friendly or hostile crowd]. Since everyone does this, seeing anyone react is in some way equivalent to seeing oneself react.

Another convention is to implicitly use the reporter as a representative or avatar for the viewer. The reporter is understood as being allied with the viewer or all viewers, so the reporter's reaction is read by the viewer as being a stand-in for his own reaction. This and the previous are similar, but different.

Reaction shots are a formalism, but they are also consistent with the way the visual system works: not as a continuous stream, as with mechanical movies or TV, but as a stuttering series of rapid starts and stops of the eyes, called saccades, effectively "cuts" from one "shot" to another, for nearly everything we see. This is another physiological basis for visual narrative, and probably explains why we so readily accept that "B is looking at A": this sort of inference is drawn countless times every day in ordinary visual perception.

So, reaction shots are all but inevitable. The final question is whether they are "false" when not the literal reaction of reporter to interviewee. I think this is not important, as the differences between a "fake" reaction shot and the one "real" reaction at that point in the interview (from a certain viewpoint, etc.) are not meaningful. So, in my opinion, "true" reaction shots are rarely needed and it makes no real difference whether they are "faked". On the other hand, using an appropriate reaction can give several positive benefits even when the shot is "fake" (though it does have to be appropriate).


Flagging every edit within an interview, which might well be intolerably tendentious and confused without such edits, becomes very burdensome to viewers who wish to simply comprehend the issues the story raises, rather than engage in an epistemological study of representations by the television medium. Stripping interviews of reaction shots and reasonable illustrative material ("cut-ins") only emphasizes the gaps, making it more difficult to follow the sense of the story. Quotations in print solved the need to selectively extract portions of an interview long ago. Given the imperatives of visual narrative, TV news practice is equivalent, with relevant filler material covering bothersome jumps.

However, if "truth in reporting" is a big issue, there are innumerable effects, such as a short flash, that can be used to mark edits in an interview in a way that does not call attention to itself the way jump cuts do. Such techniques might be useful in situations where unflagged edits could otherwise be said to change the meaning of an interview. I would hesitate to present every interview this way simply because it would be overkill.

The "contrived" walking shots complained about are generally introduced to give some dynamism to what would otherwise be simply a series of talking heads. Yes, they may be contrived or arbitrary, but I think their wholesale removal would be even worse. And, besides, what would replace them?

How banning reactions, cut-aways, and out-of-sequence questions would restore trust is anyone's guess, and it certainly would not make programs "more creative". Better would be to apply all these techniques with good judgment and sensitivity, making the viewer's experience more pleasant while staying true to the spirit of reporting the truth, rather than being slavishly literal about conventions that are understood to be conventions.

29 August 2007

The Great Quality Crisis

At the root of Mattel's problems with their suppliers, various contaminated food recalls and the like are explicit failures of quality assurance (QA). Why is this happening and what can be done about it?

QA has become one of the core competencies common to a whole range of processes, a common methodology and set of objectives that encompasses just about every aspect of manufacturing and even service industries. This approach was first widely applied during the first successes of Japanese industry in the 1960s. It was recognized that by hewing to well-defined, and continuously improving, quality standards, the manufacturing process could be gradually tuned to drastically reduce component failure, improving everything from profits to consumer satisfaction with greatly enhanced reliability. Soon, in self-defense, QA became a reforming battle-cry in American industries reeling under this new-fangled sort of competition.

Another series of events, in Europe, was to consolidate QA into the wide-reaching methodology it has become. NATO, in purchasing weaponry from member countries, found that the process of developing specifications had to go much further than the obvious requirements of detailed blueprints and the like. To avoid costly errors and redundancies, the specifications themselves had to be specified, so that the provenance of all processes and sub-specifications leading to a good or service was documented in a uniform manner. Publishing every aspect of manufacturing, through specific commitments and contractual obligations, has, for the companies embracing ISO 9000 and so-called Good Manufacturing Practices, resulted in much greater success rates than "business as usual" had previously.

Although ISO 9000 compliance has a hall-of-mirrors quality, it is simply the conclusion of a painstaking attempt to be honest and explicit, so much so that one can in theory proudly document every detail of one's manufacturing or service process, and withstand audits and suggested changes from one's suppliers and customers. Although this may seem to put everyone in a position of being meddled with, the realities of manufacturing, say, of the Airbus, make this sort of dovetailing far from optional.

Given that this intellectual leap has been made by high-tech manufacturers, pharmaceutical companies and others, that ISO exists to certify compliance, and that there is a body of accepted practice, software support and so on, one might reasonably wonder why ISO or GMP has not become all but universal.

One reason is cost. Initially, the concepts and scope of ISO are difficult to appreciate and a whole internal culture of ISO-informed wisdom has to be nurtured to take hold within companies, and this may seem to have scant payoff at the start. Industries without the need for highly-coordinated specifications may feel a lack of urgency to start rationalizing their QA process to the degree required by ISO.

My impression is that, sooner or later, companies that have adopted ISO or GMP will find themselves at a significant advantage. If Mattel employs ISO, it should be far less vague about the state of its subcontractors than news reports have let on. If the news reports are accurate--that Mattel was so lax with its own manufacturing that it was satisfied making verification checks only every three months (and on a regular schedule rather than on a surprise, random basis)--then it has every expectation of being rudely awoken by back-breaking recalls on an ever more frequent basis. Given the money involved, and the acutely important issues attending the wide distribution of potentially dangerous toys to countless children, it is all but incomprehensible that Mattel hasn't been on top of safety and compliance issues no matter who its suppliers are.

For this last reason, I feel that there is much less control of the manufacturing process at Mattel than one would expect of a company making toys. Without a synoptic view of its own and its suppliers' manufacturing and safety processes, any company could be quickly brought to its knees and bankrupted by random fluctuations in supplier behavior. Ultimately, the supplier will not bear the burden of the company's failure to secure its QA: that will fall on the company, its customers and its shareholders.

Oxymoron: Fixed Prices

Anyone who has been exposed to even the least bit of the dismal science has been confronted with the core truth of economics, the supply and demand curve. This hoary abstraction lies close to the root of modern economic theory, as first explicated in the 18th century by Adam Smith and others. Note that a supply and demand curve is not a plot of some dataset, say, years of education versus lifetime income. Rather, it is a graphic depiction of qualitative relations between the quantity, price, and demand for commodities.

The statement "the supply of a commodity, the amount that is offered for sale, generally increases as the price increases" ends up on a supply and demand graph as a diagonal line trending upwards from the origin, called the supply curve. The statement "demand for a commodity generally decreases as prices increase" is depicted by a diagonal line trending down to the right, known as the demand curve. The point where the supply and demand curves cross defines the price for a particular supply of a commodity.

None of these curves depict particular values or even particular relations between values. Rather, they express qualitative relations between sets of statements about supply, demand and prices. That is, supply and demand curves may seem a lot more rigorous than they really are, wrapping rather feeble assertions in the trappings of mathematics and then using these to make extrapolations that are logically questionable and even unsupportable.
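
To see how little the formalism actually contains, here is the entire textbook picture reduced to a few lines of Python. The linear forms and the coefficients are simply made up, which is rather the point: nothing in the apparatus tells you what the numbers should be.

    # Stylized textbook model: supply q = a + b*p, demand q = c - d*p.
    # The crossing point of the two lines is the "equilibrium" price.
    def equilibrium(a, b, c, d):
        price = (c - a) / (b + d)
        quantity = a + b * price
        return price, quantity

    # Invented coefficients: supply q = 10 + 2p, demand q = 100 - 4p.
    p, q = equilibrium(10, 2, 100, 4)
    print(p, q)   # 15.0 40.0

Everything interesting--where the lines come from, whether they are lines at all, how they shift--has to be smuggled in from outside.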

But, this breathless, back-of-the-napkin approach seems to command a lot of respect in economic circles, allied as it is with business, political opportunism and Nobel-seeking academics. Economics, as a "science", seems to inhabit an alternative universe from the usual sciences, hard or soft, and its practitioners have been accorded a respect disproportionate to their real contributions, usually for self-serving, political reasons.

This hardly scratches the surface. Economics and economists should be high on everyone's list of professions to be skeptical of and I'll return to this theme again.

Now, however, I'd like to investigate the all-but-ubiquitous practice of giving products fixed prices. This is so firmly ingrained as to seem a non-issue, but, from both a theoretical and a practical point of view, prices should not and cannot be fixed without working to the continual disadvantage of consumers. Why? The instantaneous price of a commodity is a moving target defined by instantaneous supply and demand. This is illustrated at any moment by the moving prices for stocks, bonds, commodities, futures, etc. on the world's financial markets, wherein computer-assisted trading helps extract the maximum benefit of transactions for both buyers and sellers by allowing them to settle on a mutually agreeable price.
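
For the curious, here is a toy sketch, in Python, of the kind of matching an exchange automates. It is hypothetical and grossly simplified, but it shows the essential point: a trade happens only when a buyer's and a seller's prices cross, and neither side dictates the result.

    # Toy continuous double auction: buyers and sellers post limit prices;
    # when the best bid meets or exceeds the best ask, they trade at the
    # midpoint. Names and numbers are hypothetical.
    import heapq

    class ToyExchange:
        def __init__(self):
            self.bids = []   # max-heap via negated prices
            self.asks = []   # min-heap

        def buy(self, price):
            heapq.heappush(self.bids, -price)
            return self.match()

        def sell(self, price):
            heapq.heappush(self.asks, price)
            return self.match()

        def match(self):
            if self.bids and self.asks and -self.bids[0] >= self.asks[0]:
                bid, ask = -heapq.heappop(self.bids), heapq.heappop(self.asks)
                return (bid + ask) / 2   # a mutually agreeable price
            return None

    ex = ToyExchange()
    ex.sell(105); ex.sell(98)
    print(ex.buy(90))    # None: no seller will meet this buyer
    print(ex.buy(100))   # 99.0: the 100 bid crosses the 98 ask

Fixed retail pricing amounts to running only the sell() side and daring the buyers to take it or leave it.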

The world of consumer products and goods is, by comparison, sub-optimal. Prices are set by decree by the seller, and as such will always be higher than if they were directly bid on by consumers. As it is, the only recourse consumers have is the rather coarse-grained and inefficient approach of simply not buying, or shopping around for prices that are less sub-optimal (cheaper). In both cases, the seller and the buyer would be better off if the sale had simply gone through at a mutually agreeable price in a timely way. So, why doesn't this happen?

In a market where the seller is king, prices can be set arbitrarily high, up to a point. If the seller can afford to absorb the costs of stocking and display, he is in a position, with a population of compliant buyers, to skim off the easiest sales and simply allow less compliant consumers to move on to retailers willing to sell at a smaller margin. This lazy approach may be fine for upscale stores catering to a clientele always willing to pay more than necessary, but such stores would rapidly find themselves on hard times if those customers became more demanding or hard-pressed to make ends meet. It seems altogether more prudent, for both consumers and retailers, to hew closer to the prices that would result from haggling between buyers and sellers.

Buyers needn't feel guilty for offering a lower price than the one marked, because the difference in profit can be, and often is, offset by other savings that accrue to the seller, such as the opportunity cost of having capital tied up in aging inventory. In any case, it is hardly the buyer's role to second-guess the best interests of the retailer. Indeed, it is sometimes hard enough to figure out one's own.

The buyer should not be intimidated by the self-serving "prohibition" on haggling or bargaining often put up by retailers to forestall having to revise their (always-higher-than-optimal) prices. The main problem facing retailers is that most have crippled themselves with staff who lack the knowledge or authority to enter a meaningful price negotiation. Generally, however, store managers are so empowered and, when approached discreetly, will respond to sensible offers, though they are often chary of doing so if it seems this would trigger a stampede of haggling customers, a nightmare that would surely have grave effects on their career prospects. Usually, offers on cosmetically damaged products are not refused, even offers below an existing markdown, because such items are compromised and could result in much larger losses if never sold.

Prices, without the active participation of buyer and seller, are a fiction, laughably confused further with the "value" of commodities. One must understand that neither buyer nor seller alone is competent to judge what an appropriate price may be: a certain price may indeed be too low to justify the seller's parting with an item, as it involves more factors than simply the wholesale price and a "reasonable" markup. The buyer's options are not known to any seller, either, so a price offer can be a source of real-world feedback in the face of statistical guesswork.

The key is to not be shy in standing up for your interests as a consumer. It is your role in the marketplace, pure and simple, and doing so can only increase the efficiency of retail transactions and your own satisfaction.

28 August 2007

Sympathy for the Devil?

Earlier, I mentioned the "cutting-edge economic approach" trotted out by the Pentagon in late July 2003, for a few hours as it turned out. This was the Policy Analysis Market (and here), a futures exchange developed under the auspices of DARPA (the U.S. Defense Advanced Research Projects Agency) to try to amplify the information about events concerning the War on Terrorism.

Briefly, the idea of a "prediction market" is that motivated actors, buyers and sellers, can develop more accurate predictions about events than, say, opinion polls or other means. Presumably, this has been proven true in some circumstances to some people's satisfaction. In any case, it is an appealing idea on the face of it: get together a group of guileless, highly-motivated (greedy) buyers and sellers in a structured marketplace and you are bound to reap the reward of preternaturally aroused senses sniffing out accurate predictions. Intense self interest brings out the very best in people, after all.
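
Since the PAM was killed before its mechanics were ever published in detail, I can only gesture at how such a market turns trades into predictions. The sketch below, in Python, uses Hanson's logarithmic market scoring rule--a standard mechanism in later prediction markets--and should not be read as a description of what DARPA actually built.

    # Generic prediction-market sketch (logarithmic market scoring rule).
    # Prices double as the market's implied probabilities for each outcome.
    import math

    def lmsr_prices(quantities, b=100.0):
        """Implied probabilities given net contracts sold per outcome;
        b sets how much money it takes to move the price."""
        exps = [math.exp(q / b) for q in quantities]
        total = sum(exps)
        return [e / total for e in exps]

    def lmsr_cost(quantities, b=100.0):
        """Market maker's cost function; a trade costs the change in this."""
        return b * math.log(sum(math.exp(q / b) for q in quantities))

    q0 = [0.0, 0.0]                       # two-outcome market, no trades yet
    print(lmsr_prices(q0))                # [0.5, 0.5]
    cost = lmsr_cost([30.0, 0.0]) - lmsr_cost(q0)
    print(round(cost, 2))                 # what 30 "yes" contracts cost
    print(lmsr_prices([30.0, 0.0]))       # "yes" now implied more likely

The point is simply that traders' willingness to put money behind a view is what moves the number, for better or worse.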

The appeal is there, particularly if you are a believer in the magic of the marketplace. Such people, I believe, are so blinkered by their own Bear-and-Bull reality that they take the world itself to be, beneath it all, an arena of vast opportunities for the bold but prudent investor. The obvious bias, akin to the Anthropic Principle, of making pronouncements about the General Goodness of Capitalism from the improbable perspective of the Last Man Standing seems never to occur to happy capitalists, as they continue to sow their faulty wisdom among lesser, and substantially poorer, economic beings. [This is never a digression in a world of dreadful inequities of wealth.]

However, I felt an itty-bitty, ever-so-tiny degree of sympathy for Poindexter because, impossible as it may seem, the forces arrayed against him, particularly Senators Dorgan and Wyden, seemed, at least through their statements, utterly opportunistic and moralizing, seeking political brownie points by depicting a research program in cartoon-like terms (see links).

Not that the program was flawless, mind you. Poindexter, who graduated first in his class at Annapolis and worked with Nobel Laureate Rudolf Mössbauer while getting a PhD, is certainly no idiot. His career took him to the heights of National Security Adviser under Reagan, and the Iran-Contra affair resulted in multiple felony convictions in 1990, later reversed on a technicality. So, it seems that Poindexter, like many others, was tripped up by issues of ethics and loyalty, making further government service uncertain at best.

Under Bush, loyalty per se seems to be a sufficient qualification, so Poindexter, even allowing for his high-profile meltdown, must have seemed unusually qualified (particularly when compared with platoons of high-level appointees with absolutely no evident qualifications except a fanatical devotion). The Policy Analysis Market was certainly controversial, simply by dint of its being an unusual idea, but if such futures markets generate expert knowledge from the aggregate decisions of buyers and sellers, as claimed, one cannot fault DARPA for trying to implement it in some fashion. That the controversial details and Poindexter were both needlessly associated with the PAM represents a failure on the Pentagon's part that may (if the assertions about the market are to be believed) have needlessly crippled efforts to garner hard-to-come-by information.

So, the Pentagon should not be vilified for trying an unconventional technique to gain a predictive advantage. It is idiotic to blind oneself because it seems more decorous to do so. (This seems to be the argument of the Senators.)

But, given the way the PAM was introduced and Poindexter being involved with it, the public can be forgiven for rejecting it. However, considering the large number of secret programs, renditions, illegal detentions, torture, surveillance and the rest that have made their way into the twilight of public consciousness, the PAM seems rather tame, or even quaint, by comparison.

26 August 2007

Random Likes and Dislikes

It's hardly time, given the lack of track record here at Time's Arrow, to expect that anyone might be interested in what I think about other "information-providers", as it hasn't been proven that I am one myself. So, instead of being interested because I believe some reporters and commentators are good, turn it around and judge me by the company I attempt to pay attention to.

It is difficult to think of a better reporter than Robert Fisk of The Independent. I have made no special research into his career, but understand from his dispatches that he usually lives in Beirut, the erstwhile Paris of the Levant. This recent article gives a good view of his appeal. Every article I've read by him has contained valuable revelations, born of his gentle and informed perspective from many decades of reporting and a fluent knowledge of Arabic.

Another favorite of mine is Greg Palast, an American independent journalist who has found popularity on the BBC. He first became known to me by breaking the story about Katherine Harris' bogus list of felons, constructed from various sources to invalidate some 60,000+ black voters in the 2000 presidential election in Florida. More recently, he has investigated the wretched failures and malfeasance of the administration in the aftermath of Katrina in New Orleans. And, unlike most, he already has his eyes peeled for further GOP plans to try to throw the 2008 election.

America lost a real friend when Molly Ivins died on 31 Jan 2007, mere weeks after writing her last articles. Even Shrub had nice things to say about her. What else could he do? She was one of his most stalwart critics from back in his days as governor of Texas, and spared no effort to get at the truth. We would all do well to follow her example.

I have to confess I'm a long-time New Yorker fan. In fact, my parents had subscribed to the New Yorker since long before I was born (today, as fate would have it, many decades ago), and we had many stacks of neatly bundled issues to show for it. However, aside from the cartoons, it took Truman Capote's "In Cold Blood", John McPhee's factual essays and some Donald Barthelme to get me started reading the longer articles some years after I was in college, and I've never stopped. It is amazing to me that the circulation of the New Yorker is only something like 940,000, for there are certainly many times that number who would enjoy its political observations and timely articles. It may no longer be the Parnassus of Harold Ross or William Shawn, and far too many of the old-timers have passed on, but what remains, helped with new talent, still has few peers.

On a different plane, the Columbia Journalism Review and Editor & Publisher are both well worth reading for a professional perspective on the performance of newspapers and other news sources. Many who proffer their opinions on the web are not trained journalists. Although we may decry the multiple lapses of the mainstream media in wrongly aiding Bush by acts of commission or omission, this is better understood as an institutional failure due to media consolidation and business interests rather than a crisis in journalistic standards per se. So, amateur journalists have much to learn from their professional peers, even if we think the professionals haven't been performing as well as we would have liked.

It's hard not to be impressed with Canadian writer Naomi Klein, whose newest book The Shock Doctrine is coming out Sep 2007. She has written extensively on economic globalization issues, the Iraq war and corporate brands in the battle for consumer mind-share.

Sidney Blumenthal, whose work appears all over the place and often in The Guardian, is always an entertaining and informative read.

Pierre Tristram's essays on Candide's Notebooks are always worth keeping your eye on, as they cover a wide range of topics with passion and honesty. Refreshingly, Tristram writes about and features music and drama of note, aspiring, one suspects, to be something of a one-man New Yorker.

True Virtual Torture III

And now, the moment all you virtual readers have been waiting for: What would virtual torture, as implemented by the Pentagon, be like? Would it be legal? Would it work?

The imagination boggles, at least at first.

Prisoners and redacted people of all stripes, the "New Disappeared", as it were, would have to be inculcated with a sense of the near-supernatural possibilities of Virtual Torture as part of the orientation process during matriculation at [redacted]. This might include bogus Popular Science articles strategically placed in the numerous waiting rooms every bureaucracy maintains to prepare clients, give them a number, etc. But this would be mere stage dressing, for the real work of convincing clients would be to plant carefully designed clues: half-heard conversations about earlier inductees' inability to resist the awful, and strangely wonderful, power of virtual torture.

Having been suitably softened, the prisoner would be in near panic as the virtual reality goggles and other instruments of obfuscation are arrayed about his body. Then, unexpectedly, perhaps within a multimedia diorama constructed with the help of memory fragments dislodged by earlier interviews, the prisoner finds himself observing himself at a distance, surrounded by trappings worthy of the Arabian Nights. Fear begins to melt away after a time, as the prisoner struggles to reassemble his perceptions within this concocted reality.

This treatment, crafted carefully by highly-paid consultants working under legal immunity in a work-for-freedom program of their own, continues in this vein for a predetermined period. Gradually, the untoward circumstances of the prisoner's apprehension and detention are allowed to fade away, perhaps discreetly helped along with advanced pharmaceuticals. Soon, the prisoner finds himself looking forward to these periods of wish-fulfillment with an earnest fervor.

Unbeknown to prisoner and jailer alike, the whole operation is documented by well-hidden cameras and combined into an entertaining stream for none other than President George W. Bush, for whom the whole operation is a mixture of ultimate miniature railroad and reality TV show. One can imagine our Dubya, bored by the mundane business of state, slipping into the study off the Oval Office at the White House, perhaps fondling the late Saddam's pistol, and avidly watching the activities of his very own prisoners.

Now, if Dubya doesn't like what he sees, he throws the interrogation process into high gear with a call from the highest level. Emerging from behind one of the diaphanous veils at the virtual harem is Tom DeLay, wielding a not-so-metaphorical hammer, virtually striking the helpless prisoner where it counts most, instantly redistricting his brain and rendering further resistance futile. Dubya finds this irresistibly amusing as he turns away, shaking his head and smiling broadly.

Now, unlike the bad old days of stress positions, water boarding, sleep deprivation and the rest, our prisoner has been untouched, except by the gentle hands of laboratory assistants, who may not even know he was a high-value information source. It cannot be demonstrated that the prisoner even experienced pain, even though his will-power has been shattered along with his virtual body-parts.

The remaining problems are simply technical: How is data actually to be mined from the jelly that the prisoner's brain has become? Some hints appear in a recent series in the LA Times, called Chasing Memory. Although Gary Lynch at UC Irvine is still working on microtomed slices of rat brains, one can safely project that it's only a matter of time until non-invasive methods are perfected.

Hopefully, even if it's decades into the War on Terror, mind-bending techniques will have been developed to imprint upon the emptied brains of former maniacal terrorists a completely new identity, so they could be returned to civilian life and become productive wage slaves to really pay their debt to society. A fascinating corollary of malleable-brain work is that any personality at all could be impressed upon cleansed brains, so our former terrorist could be released as a happy-go-lucky but pious kibbutznik, sure to harm no one.

24 August 2007

True Virtual Torture II

Now, the Guardian Unlimited site has put up an article about out-of-body experiences, but with an upbeat interpretation, perhaps befitting the somewhat more liberal Guardian:

Scientists develop technique to induce out-of-body experiences

· Breakthrough could be used in remote surgery
· Virtual reality games may also be improved



But, let not these happier, humanistic visions conjured by the Guardian for this new experimental technology deter us from our main objective, that of giving our very own, beleaguered Pentagon, CIA and [redacted] a new lease on [redacted] for the War on Terrorism.

Since George W. Bush was caught with his pants down on 9/11, the first thought that evidently came to his mind was "how can we torture these bastards". Now, this may not be literally true, but it's quite plausible considering Bush's Ming the Merciless role vis-a-vis the Texas death row inmates, some 131 of them, who perished on his watch. In one case, Bush parroted a condemned woman's pleas to live with derision. [Note that the article linked to was written on 25 Oct 2000, and ends with the following prescient paragraph:

"Such confidence in the face of the evidence borders on the deranged. Three decades ago, a president [Johnson] refused to change course, and it cost thousands of American lives. In two weeks, the nation may elect a president [Bush] with a similar hubris. If Bush will not change course on the death penalty, there is no telling what he will not change course on if elected president."

Bush has done little to refute these notions, which were simply based on a common-sense reading of his character. Perhaps it's unfashionable to draw such conclusions (or listen to them), but this modest prediction certainly spades Bush at the roots.]

Bush was also caught with his pants down a month earlier, when it was disclosed that on 06 Aug 2001 he essentially tabled a PDB (Presidential Daily Briefing) titled Bin Laden Determined to Attack Inside the U.S., and then covered up its existence, then its title, then its content until 2004, based on an impaired notion of the public's right to know. But this PDB was merely one item in a many-months-long period of ignoring warnings, the first having been delivered on 24 Jan 2001 by his terrorism tsar du jour, Richard Clarke.

One gets the impression that this August pants-down moment was largely a product of the super-cool strategy the Bush 43 crowd had for erasing the lingering affection that over half of the electorate still held for President Clinton, as evidenced by Gore's popular-vote victory in 2000. The strategy was simple: ignore or reverse anything his predecessor had uncovered or stood for, paying attention to absolutely nothing but the baldly political gains to be had from doing so.

Now, given the President's desire to thrash the truth--or life--out of anyone who might pass for a "terrorist" in the eyes of admiring citizens scared to death by the prospect of further attacks, we can justifiably wonder how the Bush administration went about stirring up enthusiasm for these heretofore unsportsmanlike notions. I am not, of course, privy to the "Decider's" decision-making paraphernalia, but I have found an article that seems to show that The New York Times had some part in this. It was published on 05 Nov 2001 and was titled Torture Seeps Into Discussion By News Media, by Jim Rutenberg. [The link points to my saved Times Select copy, so I'm not sure it will be available on-line without a subscription. Happily, library copies of the Times are not subject to this policy of imprisoning information.]

The gist of this article/op-ed piece is that: Wow, all of a sudden, everybody in the news media was, at the start of November 2001, discussing the possibility of using torture, very much in the manner described in Chomsky's Manufacturing Consent, or at least it seemed so to me at the time. Now, even though the mass media often seem to be a craven crowd of copy-cat cowards, it seems just a bit strange that Newsweek, CNN, Fox, the Wall Street Journal, and others were all consumed with having a "serious debate" about a wholesale abandonment of the rule of law.

Now, you could argue that everybody was talking torture back then and that this justified an article, or even made it imperative to write about, in the interest of informing the readership. Yes, but. The article is little more than a group of sound-bites from people standing on one side or the other of the "torture issue". For instance, the following statement is attributed to Newsweek columnist Jonathan Alter ("considered a liberal") and couched in the following way:

''In this autumn of anger,'' he wrote, ''even a liberal can find his thoughts turning to . . . torture.'' He added that he was not necessarily advocating the use of ''cattle prods or rubber hoses'' on men detained in the investigation into the terrorist attacks. Only, ''something to jump-start the stalled investigation of the greatest crime in American history.''

It is odd, because the whole discussion Alter is having with himself makes no specific reference to any man or men who are not giving information. That is, it seems entirely hypothetical. How is this "news"? In my opinion, it's not news at all. Instead, it, and the rest of the article, is a rhetorical sleight-of-hand that describes nothing (no real thing), while at the same time making torture the subject of attention. That is, it is simply an example of "talking up" an idea, which functions to desensitize the reader, making it easier to propose the same or more drastic departures from traditional practice later on.

I believe that this sort of thing is employed all the time by news outlets that are large and respected enough to have a profound effect on the events of the day. It's not exactly propaganda, for no agenda is laid out explicitly and we're not told which "side" of the issue to take. However, bringing up the prospect of "serious doubts" by "respected journalists" about a taboo and illegal activity (torture) without some countervailing and weighty opinion is irresponsible, for the breezy tone seems to diminish the perils of altering the status quo.

This is illustrated later in the article by the following passage:

Mr. Alter said he was surprised that his column did not provoke a significant flood of e-mail messages or letters. And perhaps even more surprising, he said, was that he had been approached by ''people who might be described as being on the left whispering, 'I agree with you.' ''

So here we have the issue settled by a self-selected group of lefties, a kind of Object Lesson that implies that we readers, lefties by dint of reading the Times, shouldn't be embarrassed about our secret support of torture because others have already signed off on it, albeit in a whisper.

Surely, editors at the Times should resist the pressure or temptation to publish such subversive rubbish, and tighten up their journalistic standards when writing about such nebulous subjects. The importance of doing so is clear: The Bush Administration took the lack of a widespread outcry at articles such as these as an implicit go-ahead to pursue the matter to their liking.

Perhaps the country would have been better served by journalists wondering less about why the zeitgeist suddenly swirled around torture and instead following the scent back to the source(s) in the White House--and doing some old-fashioned reporting to expose the calumnies being perpetrated there in 2001.