About Me

Variously, a film/video editor, programmer, author, teacher, musician, artist, wage slave

18 September 2007

God as Consciousness

Regardless of the endless, unanswerable questions about the purported existence or demise of God, and regardless of the truth of the answer to this seemingly fundamental question, there still remains the real riddle: What/Who is this thing/being called "God"? Why does this concept have all but universal acceptance in human cultures? How can the philosophical or abstract 'God of Nature' also be a 'Personal God'? How is one to distinguish one from the other? How could the concept of God be entertained seriously by some of the greatest minds of humanity? How can their pious conclusions be so varied, yet, at least in the case of J. S. Bach, be accepted almost universally as an expression of—*something*? If not God, what?

An answer that appeared to me to resolve these sorts of questions is that God, in the most general sense, is another name for consciousness (or, more precisely, *self*-consciousness). Whether this might be construed as *consciousness itself* (the world and the mind apprehending it) or simply a name for the *portal of consciousness* (the Enabling Force that permits self-awareness) is difficult to determine, even now.

This confusion, which might be expected with such intangible abstractions, has given rise to the two main schools of thought about the nature of God. Those who hold that 'God' is the *content* of consciousness are saying, practically by definition, that God 'is Nature'. Those who see consciousness as a numinous mystery with portentous consequences are again saying, practically by definition, that God is a 'Personal God'.

Although these two ways of looking at the nature of God appear diametrically opposed, they would seem to be merely two aspects of resolving *what* is being identified when one tries to objectify consciousness or self-awareness by naming it.

Why would one want or need to give consciousness or self-awareness another, even more abstruse, name (or Name)?

Considering the evident antiquity of the concept of God, or Gods, the answer to this must be found in the twilight before the dawn of history.

Whether or not the naming of consciousness by early humans was a 'conscious decision' is highly debatable, but considering the consequences of the rise of self-consciousness, I believe there can be no doubt that early humans understood its significance. Indeed, early humans would have understood the phase-change happening in their midst far better than anyone since.

To be suddenly aware, literally, that there were profound, yet indescribable, differences between yourself and many of your fellows is difficult to describe, even now. Whether this took many generations or only a few, the 'dawning of awareness' must have been experienced by those in whom consciousness was emergent as 'sudden,' even as we are repeatedly aware, on regaining consciousness, that just a moment ago we were 'asleep' by comparison. It is equally certain that the change from 'being asleep' to being 'awake' must have happened at different rates among different, particularly isolated, groups, as mutation rates would have varied quasi-randomly. The condition of being self-conscious must have profoundly and forever sundered the first self-conscious humans from their less-self-conscious precursors. (This can easily be appreciated by those who have contemplated the differences between modern humans and Neanderthals, even though there seems to be ample evidence of interbreeding.)

The emergence of human self-consciousness must have been accompanied by a preoccupation with the nature of self-awareness. There can be little doubt that this must have been of great pragmatic interest among First Aware Peoples, because it touched upon every aspect of their intra- and inter-tribal encounters.

This event, and I think it was an event, must, like the harnessing of fire, have happened time and again in different places and times; but at one point in history it 'took fire,' caught on, and from then on was indispensable for the survival of all prehuman tribes.

This also must have created great anxiety of a sort previously unknown, as previously unthought-of existential questions suddenly became a central fact of life. This great angst must have created a collective need to identify and control consciousness itself, as the hall of mirrors of self-awareness would have been at least as terrifying as any external danger. Indeed, it may well be this that is alluded to in the Biblical story of Adam and Eve, who were cast out of their comfort zone, Eden, as a consequence of partaking of the Apple of [Self]-Knowledge.

So, the concept, and the name(s), of 'God,' or a panoply of Gods, might first have been a way of identifying one's consciousness, a sort of password to enter the new club of humanity, while keeping out the dross of pre-conscious (less fit and thus less attractive) hominids. At a time when genetics itself did not preclude cross-breeding between pre-conscious and self-conscious hominids, the God-concept would have become the means by which the sapient could exclude their pre-conscious potential breeding partners from entering, and swimming in, the newly forming human gene pool.

It is easy to imagine that this landscape of changing degrees of consciousness caused a period of great strife and bloodshed, wherein the new humans sought to differentiate themselves from their recent forebears, killing those who would or could not embrace the new modality of awareness. As these varied groups of hominids occupied more or less the same ecological niche, identical in every way except for having different recursive levels of awareness, the fight for supremacy must have been deadly. This is easy to believe, in that similar struggles over doctrinal issues have been singularly brutal throughout human history.

Thus, humans entered the historical period having long before murdered all closely related hominids, a burden that could well have been the origin of the persistent concept of Original Sin.

Self-consciousness might have caused newly conscious societies to exercise control over what must have been perceived as a potentially anarchic influence. Although self-consciousness is a sine qua non of being human, it is also fertile ground for cultivating alternative thoughts about how to live life, who should have what, and so forth. By asserting a hold over access to the Gods, a priestly class would have been seen as beneficial for retarding the anxiety-producing side-effects of being *too* self-conscious. Ultimate submission to the authority of religion, then as now, would have diminished the angst of standing, naked, before an unfeeling Universe, stripped of the comforts of a pre-conscious Eden.

These apparent contradictions have remained with us throughout history. Understanding 'God' as consciousness (or Gods as aspects of reality and personality) makes some sense of what otherwise seems to be utterly devoid of objective meaning. Religion, theology, theogony (genealogy of the Gods), and personal revelation can all be seen more clearly as an expression of and a means to cope with humanity's uneasy relationship with itself and its self-awareness.

Whether 'God,' as understood until now, has had survival value, in the relatively short term of human history, is debatable. But whether 'God', as now generally understood, will have survival value in the longer term seems very dubious at this point.

The God-concept has been used to exclude not only the nearly human from humanity but has also consigned Nature Herself to a subordinate role in human affairs. But now Nature, increasingly spurned by a secular world, has been rising to reassert Her primacy in the affairs of humans and all other living beings. Even as Nature's workings have become more visible in light of the infant sciences, so has the wreckage of Nature's biosphere become more obvious, besmirched by many heavy, human footprints.

So, humanity must again undergo a revolution of consciousness, this time inverting the God-concept. Even as God, or self-awareness, has heretofore been exclusive, now it must become all-inclusive. Our Jealous God must, at long last, become a Loving God, who shall again embrace Nature's fecundity and, together with Her, give birth and nurture to a renewed Eden.

Media Technology

The "media" as we know them started with the invention of photography, in June or July of 1827, by Joseph Nicephore Niepce. There had been many precursors, the camera obscura ("dark room") being perhaps the most prominent. But Niepce brought together an image focused inside a box with an image-fixing medium that reacted to light but was then treated so as to stop further exposure when the image was formed.

Countless modifications by many other pioneering photographers perfected the photographic process. What began as an exposure ordeal of many hours ended as the snap of a shutter. Largely experimental processing was replaced by clearly defined steps using commercially available chemicals. Transparent negatives could be used to make unlimited numbers of prints on sensitized papers. Photo-etching and halftone processes allowed widespread publication of photographs in newspapers and books.

Soon work began on motion pictures. Eadweard Muybridge and others took photographs in series of horses galloping, people walking, and so forth. The individual photos could be placed inside a zoopraxiscope or similar device; whether projected in series or revealed through slits in the wall of a spinning cylinder, they appeared to move. It was not long before George Eastman's photographic film was perforated and exposed in Biograph cameras, and prints were projected frame by frame with a Maltese cross movement, illuminated by carbon arc lamps, filling movie theaters around the country.

The holy grail then moved to sound reproduction on movie film, achieved by two methods (variable-density and variable-area) of creating a stripe alongside the pictures that modulated the light passing through to a photocell. Movies became the talkies.

As radio moved sound through the "airwaves," it became clear that pictures would soon follow over the air: television. The first broadcast featured Felix the Cat, whose image was scanned by a rotating disk to decompose it into a stream of brightness levels that could be transmitted and reconstructed at the receiver. More sophisticated methods appeared; the iconoscope and others scanned an optical image with electrons, and the picture was recreated at the receiver by painting it onto a cathode ray tube. Television was a reality before World War II, but it became a mass medium after the war.
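For readers who like to see the principle laid bare, here is a minimal sketch (in Python, purely illustrative and not a model of any period hardware) of what scanning amounts to: serialize a grid of brightness values into a single stream, and rebuild the picture at the other end from the agreed-upon scan geometry.

```python
# Illustrative sketch of raster scanning: a 2-D image becomes a serial
# stream of brightness values, then is rebuilt by a "receiver" that
# knows the scan geometry. Not a model of any actual TV hardware.
import numpy as np

def scan(image: np.ndarray) -> np.ndarray:
    """Read the image line by line, producing a 1-D brightness signal."""
    return image.reshape(-1)  # row-major order: one scan line after another

def reconstruct(signal: np.ndarray, height: int, width: int) -> np.ndarray:
    """Rebuild the picture, assuming sender and receiver share the geometry."""
    return signal.reshape(height, width)

# A toy 4x6 frame of brightness levels (0 = black, 255 = white).
frame = np.random.randint(0, 256, size=(4, 6))
signal = scan(frame)                 # what goes over the air, in effect
picture = reconstruct(signal, 4, 6)  # what the receiver paints back
assert np.array_equal(frame, picture)
```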

Television began a long period of incremental improvements and new inventions. Videotape became a reality in the 1950s and, gradually, computer controllers permitted video editing. Film stock and equipment improved incrementally as well, with more portable gear, better images, and less bulky sound equipment.

Claude Shannon wrote a paper in 1948 that rationalized the process of sending a signal over a noisy channel. This was initially of most direct utility to the Bell Telephone system (he worked at Bell Labs), but it laid the mathematical basis for understanding how much information could be carried by a given channel or stored in a given medium, giving reasonable engineering goals for a host of communications equipment.
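To give a flavor of the result (this is the Shannon–Hartley capacity theorem that grew out of that 1948 work, stated here only for illustration): a channel of bandwidth B hertz with signal-to-noise power ratio S/N can carry at most C = B·log₂(1 + S/N) bits per second with arbitrarily low error. Later advances in modems, tape formats, and digital video can be read, loosely, as ever-closer approaches to that limit.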

In the 1960s and beyond, television equipment was radically improved by using transistors, computer controllers, and high-tolerance manufacturing. Japanese companies revolutionized video with U-Matic, Betamax, and VHS, and then with 1-inch C-format tape technology. These created a huge home video market and permitted acceptable video production and editing by less highly trained staff. Cameras began using solid-state CCD pickups, which were longer-lasting and more stable than tube pickups, as well as much easier to set up and use.

With miniaturization, film lost out to smaller-format video for news gathering and documentaries. Video editing systems for small-format tape became frame-accurate and could produce an Edit Decision List (EDL) to semi-automatically assemble edits on broadcast tape.
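An EDL, then and now, is just a plain-text list of events, each naming a source reel, the source in/out timecodes, and where the shot lands on the master tape. As a purely illustrative sketch (field layout simplified; not a full CMX-3600 parser), reading one such event line might look like this:

```python
# A minimal, illustrative parser for one CMX-3600-style EDL event line.
# Real EDLs carry more detail (transition durations, comments,
# drop-frame markers, etc.); this only shows the basic shape of an event.
from dataclasses import dataclass

@dataclass
class EdlEvent:
    number: str        # event number, e.g. "001"
    reel: str          # source tape/reel name
    track: str         # "V" video, "A" audio, etc.
    transition: str    # "C" cut, "D" dissolve, "W" wipe, ...
    src_in: str        # source in timecode  HH:MM:SS:FF
    src_out: str       # source out timecode
    rec_in: str        # record (master) in timecode
    rec_out: str       # record out timecode

def parse_event(line: str) -> EdlEvent:
    fields = line.split()
    return EdlEvent(*fields[:4], *fields[-4:])

event = parse_event("001  TAPE1  V  C  00:02:10:00 00:02:15:00 01:00:00:00 01:00:05:00")
print(event.reel, event.src_in, "->", event.rec_in)
```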

All of these factors lowered the cost of making broadcast-quality video while lowering the technical requirements compared with the previous generation of video equipment. This was a boon to broadcasters, which lobbied the FCC successfully in the 1980s to soften quality standards for broadcast video and loosen training requirements for the operators of video equipment at television stations. As these factors lowered the overall cost of productions, at least on the technical side, a new sort of semi-professional production house rose to the task of making VHS tapes for industry and individuals.

In the 1970s there was still enough money in industrial films that a $100,000 budget was not uncommon, which was enough to make a product with high production values and still make a sustainable profit. By the 1990s, the expectations of industrial PR people had been lowered sufficiently that a $25,000 budget was often a tough sell for a presentation to be distributed on VHS. Expectations were conditioned by the reasonably high quality of home video equipment costing under $1,000, the seeming simplicity of making home videos and the like, as well as the markedly lower cost of VHS tapes versus film prints. This "price gap" allowed successful wedding video makers to branch into the industrial video market and produce (barely) successful product, with enough cost-cutting to preserve a profit margin even on curtailed budgets.

This forced many former filmmakers out of the business altogether, since they lacked the wherewithal to capitalize new video equipment on much lower margins. The wedding video people were able to move up because they had already started with cheaper video equipment adequate for home VHS product. This, at least from what I have observed, has led to a wholesale turnover in the industrial market, with much less sophisticated tapes "being good enough".

Herein lies the paradox of technological advances: one would think that cheaper and (for the most part) better equipment would promote better productions. However, the outcome was the opposite, largely because of the altered standards of media buyers, who could see that cheaper equipment could give (basically) the same results as older, more expensive equipment. This, the buyers surmised, was reason enough to expect that productions could be made for much less. The real problem probably wasn't the media buyers so much as their bosses and higher-ups, who generally pride themselves on snap judgments about things (such as lower-cost video equipment) that "obviously" would lower the cost of the videos the company wants to make.

The trouble with this sort of prejudice is that it afflicts every negotiation about prices, driving budgets close to the bone because of a misconception about the costs of production, which are not driven predominantly by equipment costs (though that might have been an excuse used decades before to "explain" too-high costs). This may be a classic case of being bitten by one's own earlier misinformation.

The situation described above pertained to the 1990s. Since then, video technology has moved much further, into the confusing world of multiple High Definition standards, with both amateur and professional cameras recording onto internal optical discs or hard disks, and with editing on computers that deliver an HD master right off the desktop. Although HDTV has raised the ante, requiring a huge up-front investment in new, high-bandwidth equipment, the battle is already joined to convince semi-pros that some newer "prosumer" gear is at least almost ready for prime time. No, this gear will not be good enough for a full-fledged studio production, intercut with studio cameras many times its cost. However, it may be acceptable for news gathering and local documentaries that are simply too risky to make with $50,000 cameras on a shoestring budget.

So, however much digital cameras in the $3,500 range have enabled some freelancers to crack local television markets, High Definition once again resets the mark. This benefits no one as much as the equipment manufacturers and their dealers. Once again, would-be production companies must mortgage their future just to gain entry into the uncertain market of television production.

Probably the only way to make a real difference in local television production would be to renew the FCC requirement, instituted in the 1960s, that TV stations produce local programming to fill the slot from 7:30 to 8 pm. This would create, as it did then, job opportunities for a large number of producers, cameramen, and editors throughout the country, as well as a great quantity of locally originated programming, and it would restore a great system for learning production. A public-service requirement would reinstate documentaries at the local and regional level, which, in all likelihood, would be accepted by the public as a variation on the so-called "reality" shows that have become so popular in the meantime.