George T. Anderson

George T. Anderson is the author of The Year of Perfect Sight (September 2015). He blogs on transhumanism, tech evolution, and the technological singularity at www.theyearofperfectsight.com (@BlogDatFuturism). He wrote the novels The Tower of Babel and A Chair Between The Rails under the pen name G. T. Anders. His essays have appeared here, in the Other Journal, and in Bedlam Magazine. He holds a degree in music composition and is married to a professional cellist. For more on his music and writing, see http://george-anderson.net or follow @GT_Anders.

True Eccentricity

In Pilates and various meditative disciplines, I’m told to look inward and focus my being—to “center” myself. While this works on a physical level, helping to tame breathing and related symptoms, it doesn’t work for my soul. When I look inward, the magnificent bastion of self I’m supposed to find simply isn’t there. I get nothing but a void.

In pursuit of the elusive self, we invoke the cross-country road trip, the career switch, the sudden taking up of painting. But what is the self? I’m not even sure it exists outside our neurochemical construction of it. Perhaps it’s nothing but an accumulation of remembered attitudes toward remembered things. Memories slip. Language, the building block of memory, stumbles under the burden of nailing down meaning. Hence, “every moment is a new and shocking valuation of all we have been,” as T. S. Eliot writes in Four Quartets. As the self seeks to know itself better, it can study nothing but its own reflexive construction of what it thinks it is. The very words it uses are, at each moment, its own words.

Our generation often shatters on the rocks of this search. Driving forward under waves of find yourself, find yourself, people of our generation lose more and more of their vitality and peace in the frantic search for vitality and peace. Isidore the Priest spoke to our generation when he supposedly said, “Of all evil suggestions, the most terrible is the prompting to follow your own heart.” And yet the endless breeding of new blogs and tumblrs and iPhone covers declares that the only way forward is increasing differentiation and uniqueness. But as the media phenomena through which our generation seeks uniqueness reveal more and more of their self-construction, the burden to make the self falls entirely on the self. Only you can build your personal brand, and only machine-enabled experiences can tell you how to do it. You inform machine processes, and they inform your self-construction.

So the self grows with the machines it creates (which also recreate it). The question of this generation is, what kind of self will emerge from an increasingly symbiotic relationship with increasingly powerful and amorphous machines? We are already beginning to see an answer: a mindset of networked simultaneity has begun to displace the sequential approach to activity which the modern era curated (and which curated the modern era). The modern self worked in discrete processes and in environments with clear limits. It orbited goals with clear courses and gravities. In other words, the modern self was centered. The hypermodern self, today’s self, works with fluid processes in unbounded environments, uncovering and continually adapting. In other words, the hypermodern self is decentered.

Like the technological relationships which engender it, the decentered self is amorphous. It is ready to switch media at a moment’s notice, following the latest notification. It does not categorize things as clearly as the centered self did because it has no need to. Relational technology performs the categorization and determines what to present and when to present it. The decentered self is totally networked, perpetually innervated, participating in this emergent common mind that is larger and louder than any one of us. The decentered self googles everything and uses apps to get its chores done.

In something like a perpetual panic attack, the decentered self lives an amped life, staying on top of the ever-evolving sum of popular knowledge, culture, and technology, predicating its sense of legitimacy on its alignment with the ever-expanding latest and greatest. The newest operating system, the newest iPhone or Apple gadget: it has to have them all. Without them, the decentered self is simply a castoff piece of debris from the mad hayride of technological advance. It fears this fall on stony ground more than anything else: to be lost and disoriented, dis-networked and dis-mediated, unseated from the careening carriage of change.

These two selves exist easily inside us because we live in a transition between two great ages of history. We are abolishing the centered self, but it is not yet abolished. Some of us are still trying to achieve the American Dream, and though the true promise of the Dream was always dead, this strain of meaning-making has hardly disappeared from the collective psyche. But the discovery of “meaning” within personal wealth and disregard for the community is fundamentally modern. Like all things modern, it values certain separations: nation from nation, public from private, middle-class castle from middle-class castle. Because this narrative relies heavily on these separations, it will not survive the transition; for the technology of the transition, our innervation by the internet, obliterates separations of this nature while introducing new separations.

The disintegration of all psychological barriers, the networking of all physical and psychological realities, means the decentered self coexists with the centered self. The decentered self emerges naturally in reaction to the failure of the narrative that created the centered self. Sensitive to the boundaries which the modern age enforced, the decentered self experiences those boundaries like a cultural punch in the face. The decentered self seeks solace from the terror of modern boundaries in endlessly branching connections across discourse spaces. Where the centered self sought a localized heaven on earth through personal advance within capitalism, the decentered self seeks heaven-across-network in a growing résumé of connective technological experiences.

In his brilliant essay “Buffered and Porous Selves,” Charles Taylor describes the pre-modern consciousness as “porous.” He means that the pre-modern self believed that its experience flowed into and out of it. This stands in stark contrast to the modern self, which Taylor calls “buffered.” In this terminology, he captures the ubiquitous sense of separation which characterizes the modern self.

His idea of the porous self relates in two ways to the decentered self. First, Taylor’s porous self—a self at one with the phenomena around it—is that which the decentered self seeks to become in its search for heaven-across-network. Yet since the decentered self seeks this experience through a fundamentally buffering phenomenon (personal connective technology), it will never achieve the return to human connective roots that it desires without eroding the meaning of connective experience in the process.

Second, I believe Taylor’s porous self is what we should strive for—a consciousness that participates in the world processes around it and even participates in the Divine. Owen Barfield addresses this idea of participation in his brilliant book Saving the Appearances. He writes, “Participation is the extra-sensory relation between man and the phenomena… actual participation is… as much a fact in our case as in that of primitive man. But we have also seen that we are unaware, whereas the primitive mind is aware of it.” [1] For Barfield, participation is what Taylor’s porous self practiced in the pre-modern era. Participation was the flowing-into-and-out-of which the pre-modern self saw occurring between itself and its world. In an essay for the Other Journal, I connected Barfield’s idea of participation to participatory technology—devices and platforms claiming to connect us with friends and family. In truth, these technological phenomena quietly buffer our experience of other humans at a financial gain for the corporation disseminating the technology. While the decentered self rightly seeks the participation which the modern era lacked, it defeats its search by seeking this participation in technology that encourages the spatial isolation of users.

Modernity has taught us that separation is hell. We have begun to long for a breakdown of separations, and now we seek that breakdown in decentered connective technology. In phenomena as diverse as Facebook, Uber, and Skype, we grope back towards a participatory, porous sense of self. But what does this return entail? And is it authentic? I believe it is not what we’re looking for. The true object of our desire is far more radical. We live in a war zone between centered and decentered value systems. In this cultural climate, the intentional construction of a porous self with many branching online connections is not radical. However, the intentional creation of a non-technologically mediated porous self is radical. This choice means seeking connectivity not in technologically networked experience, but with people, in person, and in relationship to the Divine. One can only make this move after recognizing the failure of both the centered and decentered models of self, and this double failure leaves the constructed self with a deep and overwhelming sadness. This radical connectivity is grace. It is the only solution.

The arts are always the front lines of cultural development, and grace has begun to emerge there. As Piet Mondrian put it in another context in the twentieth century, “art had to find a solution.” Art is still finding solutions. Makoto Fujimura’s paintings rest in a static, transcendent place of grace. Forest Management’s ambient drone enables peace and real downtime. Matthew Anderson’s newest record fuses sorrow and beauty relentlessly. All these artists grope for true participation. Their works lack personal rage and political agendas. In place of “words, words… led out to battle against other words,” [2] these artworks proclaim something from beyond the constructing self.

To gaze on this something other is to be truly ex-centered—truly eccentric. One is centered not inwardly on one’s private fortress, not outwardly on expanding noise, but on a singular moment of grace. The true eccentric need not flesh out a full theology of who and what this grace implies, for she understands that if it is grace, if it is extrinsically sourced by no effort of hers, it will defy her ready-made cognitive boxes anyway. Grace manifests in art that’s content with the incongruities, the uncertainties, and the dissonant implications of the numinous. These things drive so many people away from the arts and from faith, yet they suggest to the true eccentric that she has found the interwoven pain and beauty of real truth. No longer looking inward to emptiness nor outward along networked, self-augmenting noise, the true eccentric fixes her gaze, as best she can, on grace—on something wholly other—on the numinous. Where logic breaks down, grace begins. Though every one of us needs grace, none of us can demand it; we can only give it.


[1] Owen Barfield, Saving the Appearances: A Study in Idolatry (New York: Harcourt Brace Jovanovich, 1975), 40.

[2] C. S. Lewis, Till We Have Faces (New York: Harcourt Brace Jovanovich, 1985), 308.


Featured Image: Untitled, from the series Protest, Tokyo, by Shōmei Tōmatsu, 1969 (printed later), gelatin silver print.

Source: http://www.sfmoma.org/explore/collection/artwork/29652, San Francisco Museum of Modern Art


The Word Is Changing with Its World

The word is changing with its world. With it go our thought and the formation of our worlds. Our very being leaps from this notion of world-building: the quick text, the smile gifted to the passing acquaintance, the momentary valuation of time spent in petty communication. Not now, not now, I’ll text her back when I get inside. I’ll text her back at a red light. The thoughts light up and disappear like notifications we’ve attended to. Life, thought, world, word—they all orbit and spring from and return to the iPhone.

The iPhone teaches us efficiency. Offers of efficiency set off the vague notions of personal ascent which capitalist life has branded into us. Jay-Z said, “I’m a business, man.” Aren’t we all. If I can text her back hands-free while cutting someone off on the freeway, I’ve maximized the value of my time while still delivering to her, my emotional customer. This means more for my bottom line, emotionally, psychosocially. This means getting ahead.

Word makes world, and world makes word. To say that a truism is a truism is, itself, a truism. There is no escape from the loop of perceptual creation. Dom Cobb illustrated this beautifully with his little drawing for Ariadne in the dreamspace of Inception. We live that reality, and to write or read that we live it is to further strengthen and illuminate it. This is completely acceptable. It means we’re alive.

Technology has always driven the creation of word and world. Conversely, it has always been driven by them, by the best that our reference-frames could imagine into being. Writing systems, oral tradition, literacy, myth structure—the comparative rationality and scientific precision of cultural narrative are a direct function of technological advance. What Owen Barfield calls “the rational principle” is the driving force behind technological advance. It is both the motivation of advance and advance’s augmented byproduct: rationality begets technology, which amplifies rationality.

In his brilliant book Poetic Diction, Barfield discusses what he considers an original fusion between abstract categories and concrete phenomena as organized in perception and expressed by language. Then he says,

Afterwards, in the development of language and thought, these single meanings split up into contrasted pairs—abstract and concrete, particular and general, objective and subjective. And the poesy felt by us to reside in ancient language consists just in this, that, out of our later, analytic, ‘subjective’ consciousness, a consciousness which has been brought about along with, and partly because of, this splitting up of meaning, we are led back to the original unity.[1]

Writing in the early 20th century, Barfield had witnessed only the first leg of the exponential curve that describes the increase of technological complexity. He writes in static language because his experience of linguistic change in his own time was slow enough to appear static—grounds enough for a sort of chrono-centrism implicit in the above passage.

Living as we do now—at a much steeper point in the exponential curve—we see that the fragmentation of meanings into abstract and concrete divisions, and the emergence of new fused metaphors for further fragmentation, has not only accelerated with technological advance but has also begun to operate under new, non-monolithic rules. Unique terms govern every technological subculture.

For Snapchatters, “snap” as a verb has a specific, crystallized meaning divorced from its general, abstract use. iPhone users laugh at the dirty fantasies of the autocorrect algorithm—though “autocorrect” is so old in 2015 that it has already bridged its original subculture-specific context and is now widely understood. These are only two examples. Examination of any emerging personal media experience reveals more.

Where Barfield tracked changes in language and thought over centuries, today’s scholar of the word must tune herself to every new wave, every new communal mode of being which technology creates for us. Here, fragmentation occurs on a higher level, a plane of sociolinguistic divorce which Barfield couldn’t have imagined: forms of personalized media breed faster than bunnies, creating smaller and smaller subcultures of particular technological configuration. From the mainstream, homogeneous culture abolished in the twentieth century, we may be moving towards an ultimate crystallization of the individual—a unique subculture and language known only to each person and his personal software augmentation—a total breakdown in communal relations, the ultimate triumph of the divine will at Babel.

Alarmism aside, language change does imply many things about psychosocial unity. The Norman invasion of England saddled upper-class English with a glut of aristocratic French words. The prestige of the new Frenchified speech sent native Anglo-Saxon vocabulary skulking into the shadows to live only in dialectal and nautical usage.

And this is only one historical division. Today, the emergence of a technocracy, its division from the rabble who buy into its apps and data mining, and the rabble’s own splintering into camps, between Android and iPhone, Snapchat and Skype, for example, exert a powerful fragmenting force on language. These changes are so new that we have not yet begun to track them. They are happening so fast that we may never begin. Babel may arrive first.

Agreement on the terms of our arguments is the only place where argument really occurs. The abortion debate is not about abortion, but about what fetus means. All arguments are fought over meaning. All arguments are fought for the right to unilaterally occupy a meaning-space of critical ideological weight. Only definite, crystallized meaning can maintain any claim to a contested meaning-space. Argument, like science, demands the shattering of metaphor and the freezing of its fragments. A fetus is human; a fetus is not human. But just as language crystallization makes and cements our worlds, the emergence of a new metaphor—in which abstract and concrete have not yet fragmented—unmakes our worlds and breathes life into the fragments. This is the function of art, of poetry broadly defined, of the Word itself.

The Word has undergone its own terrifying fusion and fragmentation of natures. Made flesh, misunderstood, blown apart into dead body and pure abstraction, fused again in flabbergasting resurrection, the Word is both microcosm and macrocosm of the history of our words and thought-worlds. The Word displays the terrible justice of Babel, the rightness of technology’s inexorable shattering of community. The Word reminds us that our own words will never approach its concision and grace. The Word reminds us that it abolished all rational argument in its triumphant emergence as Poetry, as finally fused metaphor. The Word invites us to copy itself, to plagiarize the only artwork that transcends the psychosis of ideology. The Word is licensed under Creative Commons because it knows that copyright and originality are constructs of argument bent on attaining godhead. The Word reminds us to love, to die for others, and thus to live in new metaphor every day as words and thought-systems calcify and shatter all around us.

[1] Owen Barfield, Poetic Diction, 3rd ed. (Wesleyan University Press, 1973), 85–86.


Like a Cork Out of a Bottle

If these genre conventions had flesh and blood, I would fight them with swords. As it is, I am powerless. They win. If a book is not neat and square and laid out in rows, the market won’t have it. I still have to get my elevator pitch together. I still have to let the audience know what category I fall into, or they won’t know where to put me. If they can’t put me, they won’t even take a second look, assuming they see me at all.

I study the ways people can put me. Interesting. I’m not sure what “religious fiction” means. Is it fiction that creates living religion, or fiction that panders to the comfortably religious? Who reads this fiction? Those who seek to align themselves with reverence, or those who believe they have already figured reverence out? The former know they need stories. The latter prefer certainties in their breakfast cereal.

Sometimes I hate facts, numbers, budgets, research, categories—anything that holds me down with its insistence on certainty. Same with jamming my work into holes: it’s like trying to recork champagne. Rigid form will not hold joyous, expansive function. Once the cork is out, it’s out, and it grows in celebration. I grow weary of this logical world with its emphasis on corks that go back in nicely and just-sos and tests and gatekeepers and résumés. I just don’t fit back into the bottle.

Today I opened my music composition notebooks for the first time in four years. The transport to that creative period of my life was instantaneous. I’m there again, at the height of the construction of a style. Each successive piece that emerges is a frontier work for me. I am so alive back then, so unselfconscious, that I can still make things that are wholly new. This is because four years ago, I knew very little first-hand of the pressure towards conformity that the market puts on published art. Knowing nothing of this idiocy, I was able to write fresh literature and music with abandon. Look at me now: all of today’s work, spat from the grind of trying to make it as a writer, feels like nothing but a tired rehashing of my earliest, unpublished successes.

A remark that my music composition professor made after I had finished a large piano sonata has resonated in my mind for four years. It has come to define the neurosis that twists its way into every avenue of art I pursue. I can still hear my professor speaking behind me as I sit at the upright piano in his office: “Well, it’ll be interesting to see what you do now. Some people only have one big piece in them.” It came true in music, as I never wrote another big piece after that sonata. Will it come true now in literature, too?

How does a person with a muse get by in a world of grids and rigid logic and instant categorization? I am, like E. E. Cummings, “unfit for any kind of occupation.” That which I am most fitted to create, which flows from a prostrate fall before Beauty itself, is viewed as an entertainment commodity in the culture at large. When I publish a 70,000-word paean to the recklessness of divine love, the book will “compete” alongside subway reading about vampires and sexual deviants. And when I consider that the book will often be read on a device that lets the person instantly navigate away to email, social media, or even pornography, I lose all hope for the sanctity of art. This breakdown of categories is not good. Though we still are what we eat, we are now also what we click on; and if our clicking habits display a manic inability to focus on profundities, our minds must display the same thing.

I understand that we need content cues, a sort of lexical and visual shorthand for what we’ll find in a book or on a website. But I can’t figure out this particular communication ritual, this here branding thing. What is its language? How is it decoded, and how do I encode it? How do I sell without selling? Is anyone out there looking for something of value? Do I even have something of value? How do I put my work in a package that will communicate its value? How do I stick the needle of my work in somebody?

The clot thickens: generally a good thing, as it prevents bleeding to death—but here I am, confronted with a culture that defines itself by its favorite TV shows, its love of Call of Duty. This culture is not ready for what I have to say, for the surgery I must perform. Should I bother making the cut, knowing that this culture’s immune system may reject my work as a foreign pathogen? Should I try to get through the emotional and psychological clot that this conformity achieves, or will no one really get my work anyway?

It isn’t even that great art is being lost in the noise; rather, for those unpracticed in contemplation, great art does not even exist. To them, my hymn to divine love probably sounds like the off-key mumblings of a psychotic. They lack the software to decode my file type. How do I get people to download the codec they need to make sense of the madness of beauty?

Maybe we shouldn’t even try. Maybe we should just keep saying what must be said, painfully squeezing it into something like a conventional format, and selling it. How sneaky to weave transcendent beauty into the entertainment product! Oh, is this criminal? Then send me to prison for life.

photo by: kevygee

Internethamphetamines

Bittersweetly, I may not even see the published version of this piece. I may be leaving the Internet. I suppose it’s rather silly to throw out the baby with the bathwater, but the bathwater stinks. I need to go back to something I used to know, which I can barely remember now.

The other day, I was preparing a paper-mailing fundraiser. The inefficiency of hand-copying my address onto each return envelope (and then affixing a stamp to it) was a balm to my soul. The “print [60] copies” command was unavailable. There wasn’t even a user interface in front of me. Working in slower increments of serial time, I confronted my own smallness. What a relief, the realization that there was no better way to do this—that efficiency and volume were no longer my concerns, because I had chosen a limited medium.

Here we see the insidious nature of our technological advancement. As sharply double-edged as any sword guarding Eden, this one offers immense broadcasting capability and immense efficiency on the one edge—and the ravenous beast of infinity on the other. Each one of us, occupying nothing but a single human head with thoughts composed of nothing but all we’ve ever known and all we can think in a second, is too small a thing to interface with the infinity of all that is. There are simply too many things to read, watch and experience on the Internet; and, for a writer, there are too many rabbit-holes that promise to get you more exposure. Humanity’s greatest invention is fast becoming an insidious addiction.

Format matters. A wall of book-filled shelves does not induce an attention deficit in the way that a browser with twenty open tabs does. Sure, you can put down one book and pick up another, just as you can read an entire in-depth article on the Internet. But will you? Have you ever? I swear there is something about the old media, the old formats, that limits our ingestion capacity in a good way.

Speaking of ingestion, gastrointestinal terms apply quite nicely to our problem. If the previous generation’s paper books and vinyl records were consumed by everyday people justly hungry for good art that was highly limited by format and distribution structures, today’s digital offerings are consumed by gluttons addicted to the psychological sensation of ingestion. The ease of access that comes with digital distribution does nothing but enable this generation of addicts. In these terms, my current psychological detox is my mind’s vomit reflex kicking in. It has to, because putting something in your mind’s mouth is only the beginning of satiating your hunger to know and be known. Hand raised: my name is George, and I’m an addict.

But I can’t bring myself to wait for true digestion, for true satiation. I’m addicted to speed: the speed of email, of Amazon purchases, of social media marketing. I’m always looking for the next thing that will generate a click and a book purchase. I’m always looking for as much audience as possible, knowing full well that of those ten new readers I net, only one or two may genuinely return for more. I’m trying to build a career as an indie writer, scrape by bloody scrape. But I am beginning to fall out of love with indie. I have a new term for it: narcie. And I’m guilty.

See, an artwork cannot emerge from a talent bubble. Either the artist will die of exhaustion, or the artwork will suck, or both. I am beginning to crave the corrections of an editor, the legwork of a publicist. I wish I knew that they, the people with money and influence, believed in my work and were pushing it. I just can’t both make my work and push it. I am beginning to long for some compartmentalization. I want you to do your job, and I want to do mine. I want my own cozy compartment that I can crawl into for a spell and crawl out of again to be present, truly present, with friends and family. I’m beginning to hate my choice to self-publish and wear all the hats, because it has planted in me this awful addiction, this degenerate striving for infinity.

Of course, the aforementioned paper mailing is helping to return some sanity to my head. So is the longhand composition of this essay, and the assignment to make spaghetti sauce in the crockpot and go to the store for dish soap and toothpaste. For a boy who grew up entirely in his own head, for a man who has begun (dangerously) to transfer the same psychological disconnect to adulthood through the Internet’s promise of Power and Influence, these simple, hands-on tasks are medication. The illness is a mental imbalance, born of genuine ability twisted by latent narcissism. The balance is many-faceted: stagnant book fundraisers, a quiet inbox, no notifications on my Facebook author page, dinner with my wife, hugs from friends at church, real conversation at the grocery store and Sufjan at high volume. The last reminds me that digits can do good, too, as Mr. Stevens lives on my hard drive with my other musical friends. They are real, you know.

Perhaps, then, we need to step back and look at digits. What is digital life doing to real life? How are the products of algorithms and coding affecting our interactions with our fellow flesh-and-blood beings? The Internet hands ordinary humans more power than any previous generation ever had—yet for many of us, that power extends to precious little in the real world. In truth, it is a power to influence one’s own mind, one’s own sanity, for good or ill. Brothers and sisters, let’s use it wisely.

Too Many Cooks

The kitchen in this communal house is often trashed and smelly. I’m pretty sure I clean up after myself. Doubtless, everyone else says the same thing, but the place just won’t stay clean. The mess is everyone’s—and no one’s.

The cultural landscape is the same kind of boarding house. Crowded, vying for space and an audience, we artists brazenly put the pots and pans of our artistic creation wherever is most convenient at the moment. Our carelessness clutters up this communal place of cultural experience that is the arts. The resultant mess—of mediocre self-published novels and lackluster indie albums—is everyone’s, and no one’s.

What can we do? After a few well-meaning attempts at cleaning up after others, we find that the burden of our own survival is too heavy to allow us to bear a triple or quadruple load. We leave other people’s dirty dishes and try to focus on our own. We try to clean up after ourselves and do our part. But other people just keep on cluttering up the kitchen.

Making art is most assuredly a survival act. If the young child’s creative catharsis is not cut off by a derelict public education system hellbent on social control, if it is not cut off by parents who find strange paintings and invented worlds embarrassing, the young child will carry his personal window view on the fields of creativity with him into adulthood. He will find solace in the release of making. The act of creation will nourish him, and it will therefore become his own. It will become his personal medicine for his personal condition. This is wonderful.

However, the artist’s internal place of pleasure carries with it a great intrinsic danger. Endless indulgence of personal catharsis, with no consideration of how the community experiences that catharsis, is just as bad as leaving one’s dirty dishes in the kitchen. Everyone needs to eat, and cooking is the way to do it; but there is more than one cook in the kitchen—especially now, after the Internet and the indie revolution—and taking care of this limited space that we all share is more important than ever.

The artist must remain mindful of the size of her act. A complex meal involving three pans, some spoons and spatulas, knives and cutting boards, not to mention plates and forks, requires an even more prompt cleanup than the simple leftover lunch eaten out of a bowl. The kitchen, like the collective cultural consciousness, is a limited space. It graciously hosts great cooking projects (and great art projects), but it can only bear so much at any given time. Recognizing the kitchen’s capacity for cooking (and the culture’s capacity for digesting) is critical; and that knowledge must be coupled with restraint.

This is not to say that we shouldn’t undertake gigantic creative projects. Rather, it’s to say that we should study our surroundings. We should only broadcast our projects when the time is right—and we must check ourselves to ensure that our projects are things the culture really must see. But in times of crowding in the kitchen, we must settle for simple, economical cooking projects, or for no cooking at all.

Of course, we don’t have to cook for ourselves alone. We can make dinner for others. We can even make dinner with others. For some of us, this is not terribly natural. But why not pool our resources? I’m out of rice, and you’re out of vegetables. Why not put our fridges together and make something greater than the sum of its parts? In this kind of sharing, there is no offensive clutter, because the mess remains everyone’s, and no one can disown it. The food that results is everyone’s, too. There’s no measuring of portions to determine how much food matches a given contribution. There’s a big pot, and hungers of all sizes have equal rights to be satisfied.

While artists need to eat, just as everyone else does, the commercialization of creativity has wrought great damage to the creative process and even to our culture’s process of enjoying art. Cranking out the next novel to try to put bread on the table may or may not produce the best possible version of that novel; and while the fire of economic need gets us off our behinds to do something, it also easily moves us into a degenerate view of what art is and why we should seek an audience. It’s been said a million times, but art is not a commodity. Art does not participate in the laws of supply and demand in the same way that a can of beans does. While there may be times of famine and times of plenty in a culture’s creation and appreciation of art, artworks tend to last, when given the proper care. Indeed, the greatest works of art feed us again and again, as if that can of beans had turned bottomless.

An artist has bills to pay, same as everyone else; but you can’t put a price on that work of bottomless plenty. This is why society desperately needs great art—because it blows up the bottom line.

There’s no telling when the next great cooking project will spring up; but if it’s something that could feed our souls for generations to come, we need to keep the kitchen (and the marketplace of cultural experience) uncluttered so that we can recognize the genesis of great art and get out of its way.


photo by: Mr. T in DC

The Trueness of Beauty

Neither audience nor artist should approach art as self-expression. To do so robs art of its universal applicability. If James Joyce had written strictly to see himself on paper, A Portrait of the Artist as a Young Man would not express me; yet it does. And if it does, whatever Joyce tapped into in the book must be something from beyond the self who was James Joyce.

Recognition of the universal in other people’s art does not necessarily give the artist a similarly open conduit to universality. How easy it is to say, “Joyce wrote about a sensitive, rebellious young artist discovering himself; I could do likewise,” and thus to produce a work imitative of Joyce in form yet devoid of the function that Joyce’s book performs.

The question of form and function in art is too often overlooked. In our age, creativity divides squarely along lines of familiar genre pieces and incomprehensible highbrow art. The idea of an artwork having a “function” or a purpose is a bit of a plebeian notion to both sides. Tools have purposes; but we can’t even talk about functionality in high art (since the artist declares what art is), nor in genre art, since that’s a comfortable product for a specific audience’s consumption. The function of the one is inward-focused, while the function of the other is financially focused. Neither is truly other-focused.

Yet Joyce’s book accomplishes something in me. It meshes with something in me that was just waiting to receive it. It closes some sort of open system. It just might be a functional work of art.

Somewhere in the many stages between draft and publication, Joyce must have set himself aside and set his audience aside and just listened. Walking the minefield of that distinction is the artist’s lifelong battle. The perils gather close on either side: a focus on the self, producing audience-experience-by-force; and a focus on audience, producing product-for-consumption. Strictly followed, neither of these approaches to creativity can produce functional art.

Joyce’s protagonist Stephen Dedalus says:

I mean that the tragic emotion is static. […] The feelings excited by improper art are kinetic, desire or loathing. Desire urges us to possess, to go to something; loathing urges us to abandon, to go from something. These are kinetic emotions. The arts which excite them, pornographical or didactic, are therefore improper arts. The esthetic emotion … is therefore static. The mind is arrested and raised above desire and loathing. [1]

Joyce’s framing of the improper arts illumines the dreadful confusion of artistic form and function that we see all around us. While the oppositions of pornographical versus didactic and genre versus highbrow are not exactly analogous, each demonstrates the catastrophic divide on either side of the summit of true art; and each calls us to set aside consideration of our own taste and of audience taste so we can listen to something bigger.

If there is truth in the world, it must be waiting just around the corner. Perhaps this, then, is that universality that Joyce hit upon: the trueness of beauty. No, I’m not talking about prettiness. Prettiness is a description of form, but beauty is a description of function.

So what is the function of art? It’s to heal. And such is also the function of love.

Astoundingly, the analogy carries. In personal relationships, blind self-expression (read narcissism or didactic art) attacks the bond of love. Likewise, insecure pandering to the other person’s assumed desires (read kissing up or pornographical art) attacks the bond of love. The motivation of both approaches is the same: to prevent rejection.

So how should we talk to each other?

Of course, this whole discussion is a bit disembodied. In the real world, form and function are inseparable properties of artworks and loveworks. But this framing can help us realize what’s wrong when art and love aren’t working.

Form grounds people and gives them a sense of belonging. It’s the vehicle through which they experience the function of love. Just as we read genres in literature, we read genres in acts of love. Some people read romance. Others prefer literary fiction. Some people feel loved when you take out the trash for them. Others need tender words.

When love isn’t working, there are two things to check: actions and heart—form and function. You can’t repair a bad heart on your own, but you can at least choose the right action. You can at least write in the correct genre while you wait for your heart, for the trueness of beauty, to come back.

But you can’t stay there. Execution of familiar forms is the laziest, most dangerous place for an artist or a lover to be. If you practice a form long enough, you’ll start believing that there’s intrinsic function within the forms that are familiar to you; and your heart, your muse, will atrophy. You will lose your awestruck gaze on the trueness of beauty.

Again, Joyce’s Stephen Dedalus:

—[My mother] wishes me to make my easter duty.

—And will you?

—I will not, Stephen said.

—Why not? Cranly said.

—I will not serve, answered Stephen.

[…]

—Do as she wishes you to do. What is it to you? You disbelieve in it. It is a form: nothing else. And you will set her mind at rest. [2]

You can’t make yourself desire the trueness of beauty. As visual artist John Baldessari has said, “you have to be possessed, which you can’t will.” [3] The statement holds across art and love. But the funny thing is that in love, if you keep trying, the possession will start to come upon you; and what a victory that is, as form finally fills out with function.


Footnotes:

[1] James Joyce, A Portrait of the Artist as a Young Man (Penguin Books, 1976), 205.

[2] Ibid., 239, 241.

[3] Video: A Brief History of John Baldessari

Unstatement

Having dabbled in music composition, graphic design, and fiction writing, I found in myself a confusing network of interests and an alarming readiness to feign expertise in a new medium. I realized that I had become a jack of all trades (and the rest of the cliché)—a fool who enjoys sensual and creative stimulation but can’t bring himself to commit to any one discipline. But now, after recovering from that intense addiction to creative practice, I’ve begun to understand a larger definition of space—one that functions not only in art per se but in the creative process and in real life.

I had never thought about space until I started taking graphic design classes. When I heard the same crits over and over and studied the successful examples the teachers pointed out, I began to perceive the visual silence around the subject, around what we consider at first glance to be The Good Stuff. Comparing this to the student work around me, I realized that The Good Stuff suffers in oversaturation without a context of visual silence.

I started thinking about this in relation to music. I thought about things I had written and things I had analyzed. I questioned why I loved or hated certain pieces. I found that what I disliked about a lot of mainstream genre music was its low range of both vertical and durational space—that is, a tendency toward sameness in how big and how complex. Then I realized that that’s what I like about Oceansize and Fleet Foxes and Bach: Each gives me, in its own way, a constant interplay between The Good Stuff and some form of silence. Bach’s music especially displays this use of space. He is a master of the unstated, and his unstatement shines in his solo cello suites. He uses the leaping of a single melodic line to sketch the forms of larger harmonies, giving you the sense of a harmonic context which you aren’t actually hearing. Your ear, rather than Bach, assembles the outlines and suggestions into something larger.

While I was still a music student, I never worked toward unstatement. Instead, I pushed myself obsessively toward Good Sounds. I looked for one grandiose chord, something with the towering suggestion of infinite color. I created some good chords, but I never found The Chord of Everything. Eventually I gave up. I graduated with a bachelor’s degree in composition, a general bitterness toward academia, and a resolution not to perpetuate the cycle of get-degree-to-teach-at-university. Feeling smart and rebellious, I abandoned that track and turned to my neglected writing project, the latest novel in a long line of work that wasn’t worth sharing with people. And I resolved to become a Novelist, capital N.

I tried it for a while. I grew a lot as a writer, but then I started thinking about raising a family and being a breadwinner. My creative potential was still high, but my income potential looked like silence; so I scrambled to fill that silence. I started another degree, this one in graphic design. Rather than leaving the white space in my life alone, I tried to slather it with The Good Stuff. One and three-quarters semesters later, I dropped out, overworked and plagued with anxiety attacks. I had not yet learned the function of silence, of uncertainty.

I suppose any creative discipline, including the living of life, is a constant relearning. Before I tried graphic design, I thought I knew novelizing; I didn’t. Then, no longer cutting cardstock with a razor, I started cutting words with a razor. I relearned and relearned, but I knew I had not yet found my voice. It was not until I grappled with William Faulkner’s The Sound and the Fury that I began to realize the meaning of white space in Story. With my clinical descriptions of roads and trees, feelings and thoughts, I was shouting at my imagined reader, “This is my story! Don’t you like it?” But Faulkner showed me another way. He insisted, with his abstruseness, that I pay attention and stumble toward some form of reality that even he might not know. He never preached Story; he offered it, and he refused to hold my hand.

I resolved to do likewise in my novel. I rewrote the whole thing. I worked obsessively, editing first thing every morning and reading last thing every night to absorb another writer’s brilliance and do it all again the next day. But the anxiety attacks came back. In purging myself of other people’s demands and turning inward to my great purpose as Novelist, I wasn’t curing myself; rather, I was oversaturating my mind and emotions with The Good Stuff. Insanity was blossoming in the noise, and it was my fault. For once, I couldn’t blame the bosses or the professors, for I had become my own professor; and I was a tyrant.

At the same time, I couldn’t say no to music. I was still in a band with my brother. I was still hauling amps at the wrong hours of the night and still adoring the imago of our rich and varied material. Oh, we had certainly stumbled upon the law of silence. Our stuff was loud and then quiet, mechanical and then melodic. It was really good. But our well-composed music bore no reflection in my disorderly life.

All of this began to look like some weird analog to the concept of Signal Versus Noise—except that the meat of artwork, The Good Stuff, wasn’t the signal; it was the noise. The signal that I so desperately needed could only be found in the silence that I refused to practice.

That was when I realized that the creative process itself is an artwork, sheltering what we call art, nested within the larger artwork of life. This three-tiered fractal structure of art within art within art, of wheels within wheels, was collapsing around me. I was not balancing the outermost medium of creativity—my life itself, my mental and emotional health—with crucial white space. My head was crammed with obsession over what I wanted to accomplish and the corollary fear of failure. I may have written The Good Stuff with my pen, but with my life I was writing noise, an insidious scrambling toward infinity. And it was becoming clear that I was not meant to be infinite.

That’s where I am now. Not infinite, living sometimes in the sounds and sometimes in the silence. The other night I lay in bed at 4:30 a.m. wishing that Sudafed hadn’t made me antsy—and that, if it was going to do so, it would at least clear my nose so I could sleep. Exhausted, suffering this for many nights in a row, I asked God with sincere and childish tears where He was. I heard only silence, and I cried some more.

But a thought kept nagging: maybe this God is balancing his artwork with white space.