quarta-feira, 9 de abril de 2014

Morality is Always on the Side of the Machine: Deep Stall over the South Atlantic: The Crash of Air France 447 by John David Ebert

Airbus's murderous hubris at the bottom of the South Atlantic:
"Airbus machines DON'T just fall out of the skies."

Airbus's sidestick "fly-by-wire"

Boeing's pneumatic stick shaker stall warning

St. Elmo's fire ghostly illuminated the cockpit, which was filled with the smell of ozone, as the plane flew through a severe thunderstorm system

Anti-World: The Cabin as Scenario
“Stepping back from the glare of the wreckage in order to gain some perspective on this catastrophe, we now turn to the world of books and discourse in order to remind ourselves of Heidegger’s point that, when a tool breaks, such as a hammer, it becomes suddenly conspicuous, standing out from the contextual background of its web of referentiality. “When a thing in the world around us becomes unusable,” Heidegger states, it becomes conspicuous. The natural course of concern is brought to a halt by this unusability. The continuity of reference and thus the referential totality undergoes a distinctive disturbance which forces us to pause. When a tool is damaged and useless, its defect actually causes it to be present, conspicuous, so that it now forces itself into the foreground of the environing world in an emphatic sense.”
The Age of Catastrophe: Disaster and Humanity in Modern Times, p. 43


The Air France crash had been carefully prepared by many previous incidents in recent years, to the point of becoming the very paradigm of a systemic failure in the man/machine interface. Gérard Arnoux, the head of the French pilots' union SPAF, describes it as a "computer brain stroke". Let us begin with your Heidegger quote above, from the chapter on Tenerife, with the "broken hammer". As Heiner Müller once said, morality is always on the side of the machine. I would like you to start with the BEA final report, a political compromise between Airbus and its partners, the greatest concession the Corporation could allow itself; the key word in this report is the alternate-law regime's "Flight Director", which induced the pilots into a catastrophic chain of errors. They slavishly followed the Flight Director's pitch-up commands, ignoring 58 stall warnings in a ghostly environment, simply because "an Airbus does not just fall from the sky" in Flight Director mode. It cannot, says the industry. When you buy an Airbus ticket, this mantra should be printed on the cover. I invite you, Ebert, to reconstruct, not in a simulator but in a Pynchonesque entropic environment, these tragic three and a half minutes as an eternal return, a symphony and cacophony of death that opened a new age in aviation. Without a pneumatic stick shaker, as on all Boeing machines, the pilots never had a real chance, as the deep stall swiftly unfolded in this progressively unrealistic scenario, to feel with their own hands, in dual-input mode, the violence of the fall.


1. With Air France Flight 447, I think we have to keep in mind that two structural couplings which are normally essential for flying the plane came undone, or uncoupled: the first of those is the seamless fusion that must occur between the pilots and the new and highly automated flight systems. The pilot must nowadays form a "machinic assemblage" with the automated system, such that both pilot and computer are locked into a relationship of "structural coupling." The pilot's consciousness, that is to say, is not self-enclosed but ruptured into the feedback loop in which the plane's central nervous system must become an extension of his own. This is a kind of modern Centauric fusion, not of animal and man, but of man and machine.

But what happened in this case seems to have been the very opposite of the situation with Captain Van Zanten at Tenerife, where the problem was precisely that he was so totally fused with the machine that he could take no other data into account. In the case of Flight 447, the structural coupling of human consciousness with the plane's electronic nervous system was disrupted and the pilots were abruptly exiled from the plane's feedback systems. If this theory is correct, the freezing of the plane's external Pitot tubes caused the computer to lose the ability to calculate the plane's speed, and without that data it had a "nervous breakdown" and lost its orientation, immediately shutting down the autopilot and all the plane's automated systems.

But this had the effect of abruptly exiling the consciousness of the pilots from machinic fusion with the plane's nervous system, and they suddenly found their own consciousnesses on the "outside," as it were, of the plane's internal consciousness. They had to switch abruptly to manual flying techniques, but it does not appear that they were able to do this fast enough to avoid the plane's going into a stall.

And that's the second structural coupling that came undone: normally, as you know, planes stay in the air because of the structural coupling between the angle of their wings and the airflow that engages them. But if the plane slows down by as little as 10 knots, the effective angle of the wing changes, the engagement with the aerodynamic flow over the wings is broken, and the plane goes into a stall, which may have happened in this case. The pilots, overwhelmed with data from the shutdown of all the plane's automated systems, could not regain control of the plane quickly enough to avoid the stall, and the plane simply fell into the ocean at that point.
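The 10-knot margin mentioned above can be made concrete with a toy calculation. This is a deliberately simplified sketch, not real A330 aerodynamics: the lift model, the stall angle, and all constants below are illustrative assumptions of mine.

```python
def lift(speed_knots, angle_of_attack_deg, stall_angle_deg=15.0, k=1.0):
    """Toy lift model: lift grows with the square of airspeed and
    (linearly, below the stall angle) with the angle of attack. Past
    the stall angle the airflow separates and most lift is lost. All
    constants are arbitrary illustration values, not aircraft data."""
    if angle_of_attack_deg >= stall_angle_deg:
        return 0.3 * k * speed_knots ** 2  # post-stall: lift collapses
    return k * speed_knots ** 2 * (angle_of_attack_deg / stall_angle_deg)

cruise = lift(480.0, 3.0)
slower = lift(470.0, 3.0)  # same attitude, 10 knots slower
loss = 1.0 - slower / cruise
print(loss)  # ~0.04: roughly 4% of lift gone from a 10-knot loss
```

Because lift scales with the square of airspeed, even a small speed loss must be bought back with a higher angle of attack; pull up too far and the wing stalls, which is exactly the broken coupling described here.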


So the situation seems to be the opposite of what happened at Tenerife, in which Van Zanten got stuck in his machinic fusion with the aircraft. Here the pilots were suddenly exiled from the plane's interior consciousness and overwhelmed with too much data, coming at them too quickly to process while also maintaining the necessary pitch and power to keep the plane in the air.

Relying on automation, it seems, has become a problem for pilots these days, who wind up being poorly trained with regard to manual emergency situations. The automation is a nervous system that can act as a buffer between the pilot and the actual physical body of the plane itself, which further and further distances his consciousness from the necessary structural coupling with the aircraft to control it.

Automation tends to create electronic feedback loops that leave human decision-making out of account. We are creating a global world civilization that is always on "auto-pilot" and has therefore created the danger of rendering humans, with their old-fashioned philosophical free wills, superfluous to the automated functioning of these systems. Leaving the human being, with his Kantian-Schopenhauerian free will out of account, is creating an accident prone civilization in which free will cannot be injected quickly enough to keep up with the speed-of-light decision-making processes of the machines themselves.

Free will, according to Kant, actually injects new causal sequences into the world's otherwise predetermined cause and effect programs, but these machines are attempting to replicate those causal sequences of matter that do not leave room for the sudden injection of fresh causal sequences by the human will. We are slowly and gradually tightening up the civilization to the point that human freedom is now superfluous and there is, consequently, no room for it in the System taken as a whole. We are creating a civilization, in other words, that stands philosophically opposed to all the great antinomies of German Idealism and rejects them as incompatible with the System.
"The sharpest ideology is that reality appeals to its realistic character" (Alexander Kluge), since German Idealism, since Clausewitz's absolute concept of pure war. Let us talk about "overthinking" in the context of the thesis of Frank Schirrmacher (co-publisher of the FAZ), "Die Auswanderung des Denkens von dem Gehirn" (The Emigration of Thought from the Brain); here is the interview: http://www.carta.info/22535/schirrmacher-kluge-algorithmen-geben-niemals-auf/. Kluge begins the interview with a "thought experiment" set at the dawn of the Battle of Verdun. Hardly anyone at that moment could have supposed what a powerful and apocalyptic "idea" lay on the horizon, namely "total war" (as "Materialschlacht", "the war of materials"), as Ernst Jünger would later immortalize it in his writings. He also means that in this new digital world there are only algorithms at work in a hyperspace, as in the Matrix. Algorithms never give up! They build up what you are, what you feel and what you think. We are no longer in the realm of the classical thinking subject. Thinking itself becomes an obsolete category in the geometry of this new space, which is real time; there is no way to fix any individual historicity, any historical consciousness, only flatness. What kind of overthinking is at work in the AF 447 cabin during these three and a half minutes?

As far as your second question goes, I believe I have actually answered it inadvertently in my answer to your first question, but for the sake of thoroughness, I will reiterate: the kind of thinking that has taken over in the situation with Air France 447 is precisely the dominance of pre-programmed "algorithmic thinking" that leaves no room for the injection of Freedom of the Will in any German Idealistic sense. That Abyss of Freedom which Schelling built his whole philosophy of the subject out of does not exist in the civilization of global algorithms which simply propagate themselves like mathematical viruses and end up paving over the Abgrund, or abyss from out of which Freedom accesses singularities of Thought. The Matrix covers up the abysses of German Idealism, just as it paves over Heidegger's Being, and attempts to substitute pre-programmed thinking and decisions in advance, which leave human autonomy out of account.

Human Freedom has now become a deconstructed relic in the post-metaphysical age, simply tossed aside into the middenheap (as pictured in the robot junk heap in the movie "A.I.") as one more discarded Idea along with all the others of the metaphysical age. The Subject has been hollowed out, and the walls of the newly formed hollow are inscribed with fresh programs that substitute the behaviors and thoughts of the cyborgianized human being in place of the classical philosophical Subject.

It is no longer what "I" think that matters, since the "I" that once formed the ontological basis for the Subject is no longer there and has left behind only a semiotic vacancy; but only what the Machinic Assemblage of Global Civilization thinks FOR me and on my behalf that matters. What I would do in this particular situation (any situation) does not matter, but what I-as-embedded-in-the-Matrix would do is all that counts. Autonomy is now regarded with suspicion and autonomous individuals who question the System, such as whistleblowers like Assange or Snowden, for instance, are regarded as heretics for daring to demonstrate anything remotely like critical thinking.

Unfortunately, I think the pilots on Flight 447 found that when their Matrix-embedded-I's were suddenly expelled from the machinic consciousness, there was no autonomous "I" left there for them to rely upon because there was no protocol for it in their training. They discovered in the matter of just a few seconds that the philosophical "I" that should have served them with autonomous thinking was gone and simply not there for them to use.

Hence, the problem of the elimination of the philosophical "I" in contemporary global civilization is that it can, and often does, lead to catastrophes of this sort in which machinic consciousness, together with its programs, does not match the program of human autonomy, which it has already a priori come to regard as superfluous.

Thank you very much, John David Ebert!


In memory of all the passengers and crew of Air France flight 447

 

sábado, 29 de março de 2014

Gravitational Waves and Discoveries at the South Pole by Harry Collins

Thanks to Prof. Harry Collins and The University of Chicago Press for granting permission to publish this piece on my homepage.

On March 17, 2014, there was a huge fuss about the discovery of primordial gravitational waves that could tell us something about the Big Bang’s first tiny fraction of a second. Since I have spent most of my academic life studying the sociology of the—so far fruitless—direct search for gravitational waves, I received a lot of emails asking me about whether this was the real thing at last. I had to answer “no.” Let me take this opportunity to explain.
There’s not much sociology here: only an attempt to explain the science that provides the context for my professional studies. I have to point out that I do not represent the gravitational wave detection community, among whom there are many different opinions, including some revealing much more enthusiasm for and engagement with these findings than are expressed here.
The biggest and best-known direct detection devices are two interferometers, each with two four-kilometer arms at right angles. They are located in Washington State and Louisiana, and together comprise the American "Laser Interferometer Gravitational-Wave Observatory," or "LIGO." The 3-kilometer Italian-French device ("Virgo"), the 600-meter German-British device ("GEO"), and a few others under construction are scattered around the world. Gravitational waves are often described as ripples in space-time; they are incredibly weak. If LIGO finally "sees" a wave, its effect will be to change the relative length of its two arms. The change in length of a four-kilometer arm will be equivalent to the rise in the water level of one-square-mile Cardiff Bay caused by adding 1/100,000th of a drop. It is a hard science!
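Collins's Cardiff Bay analogy can be cross-checked with a rough back-of-envelope calculation. The strain figure of order 10⁻²¹ below is my assumption for illustration, not a number from the text.

```python
arm_length_m = 4_000.0            # one LIGO arm
strain = 1e-21                    # assumed strain h = dL / L (order of magnitude)
delta_L = strain * arm_length_m   # absolute change in arm length, in meters

proton_diameter_m = 1.7e-15       # rough figure, for scale
ratio = delta_L / proton_diameter_m
print(delta_L)  # 4e-18 m
print(ratio)    # ~0.002: a few thousandths of a proton's width
```

At that assumed sensitivity a four-kilometer arm changes length by far less than the diameter of a single proton, which is why the measurement is so extraordinarily hard.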
Since gravitational waves are so weak, their expected sources are huge events in the heavens, such as the explosion or collision of stars, or anything else that shifts stellar amounts of mass around in an asymmetrical way. The direct search community is split into four groups. The "burst group" looks for ill-defined packets of energy, such as might be emitted by a supernova or maybe an earthquake on a neutron star; the "inspiral group" looks for the well-defined waveforms emitted by binary-star systems at the very end of their life, when they "inspiral" together and coalesce; the "continuous wave group" looks for well-defined long-duration waves emitted by asymmetric pulsars or the like (these waves are especially weak, but their effect can be integrated over years); the "stochastic group" looks for random waves coming from, among other places, the Big Bang — this is the gravitational equivalent of the cosmic microwave background. So far, there has been no confirmed detection of any kind, but assuming no one has made a terrible error, there are reasons to hope that with a more sensitive generation of detectors coming on air, binary-star inspirals might begin to be detected a few years from now.

Matters get complicated because there are other ways to detect gravitational waves. Waves can be detected because of their influence on matter, such as the way they change the length of the interferometers’ arms. This is referred to as “direct” detection even though those changes have to be measured by electromagnetic means. But gravitational waves also affect the matter of stars. They have already been detected in this way by Hulse and Taylor—winners of the 1993 Nobel Prize in physics—who observed for a decade the slow decay of a widely separated binary system’s orbit, and showed it was consistent with the energy emitted by gravitational waves. Given that this observation concerns changes in the separation of lumps of matter (stars) detected by electromagnetic means, it could be argued that this detection is no more indirect than the potential detections that will be made by the interferometers. Maybe that’s a bit too philosophically cute, but maybe not; it can depend on whether you own a telescope or an interferometer (and that’s sociology). What is certain is that when (if) LIGO and the international network of interferometers start observing, they will be looking in different wavebands than did Hulse and Taylor, and they will be able to see many more of many different kinds of phenomena. The observation of a binary inspiral, or a supernova, or a neutron starquake will take seconds or less, not decades, and there should be many per year once full sensitivity is reached. The true justification for the interferometers is then gravitational astronomy—including our first look into the heart of colliding black holes—with the direct discovery of gravitational waves exciting but not so surprising as it once would have been.
Now, if it is confirmed, BICEP has observed gravitational waves in another indirect way.  The group has inferred their existence from the polarization patterns of electromagnetic waves (the microwave background). Once more there is scope for arguing that this too is no more indirect than the interferometric detections that may one day be made by the stochastic group; for some, what one calls “direct” and “indirect” seems like a matter of taste. What also seems likely is that the interferometers may one day be able to see primordial gravitational waves at different frequencies and with different kinds of resolution from those seen by BICEP—in other words, a combination of both techniques seems likely to give the best information about the first moments of the universe.
The direct detection community is excited by the BICEP result, because apart from its cosmological importance, it shows that the phenomena they are looking for are there to be found one day. In the same way, they were pleased by the Hulse-Taylor observation, given that at one time there was doubt whether gravitational waves could be detected even in principle. Speaking now purely as my unprofessional self — a citizen with a schoolboy interest in science, but one who is perhaps biased by lengthy contact with these groups — I think building mind-bogglingly fine gossamer webs that can capture exquisitely ephemeral waves is more exciting than inferring their existence from the movement of stars or from patterns in the much stronger electromagnetic spectrum. This is because it leads to more than new understanding: it demonstrates unprecedented control over nature and a heroic extension of our means to uncover its secrets.
Harry Collins is the Distinguished Research Professor of Sociology and director of the Centre for the Study of Knowledge, Expertise, and Science at Cardiff University, and a fellow of the British Academy. He is the author of numerous books, including Gravity's Ghost and Big Dog: Scientific Discovery and Social Analysis in the Twenty-First Century, Gravity's Ghost: Scientific Discovery in the Twenty-First Century, and Gravity's Shadow: The Search for Gravitational Waves.

terça-feira, 18 de março de 2014

The Inflationary Universe: Alan Guth



Inflationary theory itself is a twist on the conventional Big Bang theory. The shortcoming that inflation is intended to overcome is the basic fact that, although the Big Bang theory is called the Big Bang theory, it is in fact not really a theory of a bang at all; it never was.

ALAN GUTH, father of the inflationary theory of the universe, is Victor F. Weisskopf Professor of Physics at MIT and author of The Inflationary Universe: The Quest for a New Theory of Cosmic Origins.

[ALAN GUTH:] Paul Steinhardt did a very good job of presenting the case for the cyclic universe. I'm going to describe the conventional consensus model upon which he was trying to say that the cyclic model is an improvement. I agree with what Paul said at the end of his talk about comparing these two models; it is yet to be seen which one works. But there are two grounds for comparing them. One is that in both cases the theory needs to be better developed. This is more true for the cyclic model, where one has the issue of what happens when branes collide. The cyclic theory could die when that problem finally gets solved definitively. Secondly, there is, of course, the observational comparison of the gravitational wave predictions of the two models.
A brane is short for membrane, a term that comes out of string theories. String theories began purely as theories of strings, but when people began to study their dynamics more carefully, they discovered that for consistency it was not possible to have a theory which only discussed strings. Whereas a string is a one-dimensional object, the theory also had to include the possibility of membranes of various dimensions to make it consistent, which led to the notion of branes in general. The theory that Paul described in particular involves a four-dimensional space plus one time dimension, which he called the bulk. That four-dimensional space was sandwiched between two branes.
That's not what I'm going to talk about. I want to talk about the conventional inflationary picture, and in particular the great boost that this picture has attained over the past few years by the somewhat shocking revelation of a new form of energy that exists in the universe. This energy, for lack of a better name, is typically called "dark energy."
But let me start the story further back. Inflationary theory itself is a twist on the conventional Big Bang theory. The shortcoming that inflation is intended to overcome is the basic fact that, although the Big Bang theory is called the Big Bang it is in fact not really a theory of a bang at all; it never was. The conventional Big Bang theory, without inflation, was really only a theory of the aftermath of the Bang. It started with all of the matter in the universe already in place, already undergoing rapid expansion, already incredibly hot. There was no explanation of how it got that way. Inflation is an attempt to answer that question, to say what "banged," and what drove the universe into this period of enormous expansion. Inflation does that very wonderfully. It explains not only what caused the universe to expand, but also the origin of essentially all the matter in the universe at the same time. I qualify that with the word "essentially" because in a typical version of the theory inflation needs about a gram's worth of matter to start. So, inflation is not quite a theory of the ultimate beginning, but it is a theory of evolution that explains essentially everything that we see around us, starting from almost nothing.
The basic idea behind inflation is that a repulsive form of gravity caused the universe to expand. General relativity from its inception predicted the possibility of repulsive gravity; in the context of general relativity you basically need a material with a negative pressure to create repulsive gravity. According to general relativity it's not just matter densities or energy densities that create gravitational fields; it's also pressures. A positive pressure creates a normal attractive gravitational field of the kind that we're accustomed to, but a negative pressure would create a repulsive kind of gravity. It also turns out that according to modern particle theories, materials with a negative pressure are easy to construct out of fields which exist according to these theories. By putting together these two ideas — the fact that particle physics gives us states with negative pressures, and that general relativity tells us that those states cause a gravitational repulsion — we reach the origin of the inflationary theory.
By answering the question of what drove the universe into expansion, the inflationary theory can also answer some questions about that expansion that would otherwise be very mysterious. There are two very important properties of our observed universe that were never really explained by the Big Bang theory; they were just part of one's assumptions about the initial conditions. One of them is the uniformity of the universe — the fact that it looks the same everywhere, no matter which way you look, as long as you average over large enough volumes. It's both isotropic, meaning the same in all directions, and homogeneous, meaning the same in all places. The conventional Big Bang theory never really had an explanation for that; it just had to be assumed from the start. The problem is that, although we know that any set of objects will approach a uniform temperature if they are allowed to sit for a long time, the early universe evolved so quickly that there was not enough time for this to happen. To explain, for example, how the universe could have smoothed itself out to achieve the uniformity of temperature that we observe today in the cosmic background radiation, one finds that in the context of the standard Big Bang theory, it would be necessary for energy and information to be transmitted across the universe at about a hundred times the speed of light.
In the inflationary theory this problem goes away completely, because in contrast to the conventional theory it postulates a period of accelerated expansion while this repulsive gravity is taking place. That means that if we follow our universe backwards in time towards the beginning using inflationary theory, we see that it started from something much smaller than you ever could have imagined in the context of conventional cosmology without inflation. While the region that would evolve to become our universe was incredibly small, there was plenty of time for it to reach a uniform temperature, just like a cup of coffee sitting on the table cools down to room temperature. Once this uniformity is established on this tiny scale by normal thermal-equilibrium processes — and I'm talking now about something that's about a billion times smaller than the size of a single proton — inflation can take over, and cause this tiny region to expand rapidly, and to become large enough to encompass the entire visible universe. The inflationary theory not only allows the possibility for the universe to be uniform, but also tells us why it's uniform: It's uniform because it came from something that had time to become uniform, and was then stretched by the process of inflation.
The second peculiar feature of our universe that inflation does a wonderful job of explaining, and for which there never was a prior explanation, is the flatness of the universe — the fact that the geometry of the universe is so close to Euclidean. In the context of relativity, Euclidean geometry is not the norm; it's an oddity. With general relativity, curved space is the generic case. In the case of the universe as a whole, once we assume that the universe is homogeneous and isotropic, then this issue of flatness becomes directly related to the relationship between the mass density and the expansion rate of the universe. A large mass density would cause space to curve into a closed universe in the shape of a ball; if the mass density dominated, the universe would be a closed space with a finite volume and no edge. If a spaceship traveled in what it thought was a straight line for a long enough distance, it would end up back where it started from. In the alternative case, if the expansion dominated, the universe would be geometrically open. Geometrically open spaces have the opposite geometric properties from closed spaces. They're infinite. In a closed space two lines which are parallel will start to converge; in an open space two lines which are parallel will start to diverge. In either case what you see is very different from Euclidean geometry. However, if the mass density is right at the borderline of these two cases, then the geometry is Euclidean, just like we all learned about in high school.
In terms of the evolution of the universe, the fact that the universe is at least approximately flat today requires that the early universe was extraordinarily flat. The universe tends to evolve away from flatness, so even given what we knew ten or twenty years ago — we know much better now that the universe is extraordinarily close to flat — we could have extrapolated backwards and discovered that, for example, at one second after the Big Bang the mass density of the universe must have been equal, to an accuracy of 15 decimal places, to the critical density where it counterbalanced the expansion rate to produce a flat universe. The conventional Big Bang theory gave us no reason to believe that there was any mechanism to require that, but it has to have been that way to explain why the universe looks the way it does today. The conventional Big Bang theory without inflation really only worked if you fed into it initial conditions which were highly finely tuned to make it just right to produce a universe like the one we see. Inflationary theory gets around this flatness problem because inflation changes the way the geometry of the universe evolves with time. Even though the universe always evolves away from flatness at all other periods in the history of the universe, during the inflationary period the universe is actually driven towards flatness incredibly quickly. If you had approximately 10⁻³⁴ seconds or so of inflation at the beginning of the universe, that's all you need to be able to start out a factor of 10⁵ or 10¹⁰ away from being flat. Inflation would then have driven the universe to be flat closely enough to explain what we see today.
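Guth's 15-decimal-places figure can be sketched numerically. The scaling laws and numbers below are standard order-of-magnitude assumptions of mine, not figures from the talk: the deviation from flatness |Ω − 1| grows roughly as t in the radiation era and as t^(2/3) in the matter era.

```python
t_now = 4.3e17    # age of the universe in seconds (~13.7 billion years)
t_eq = 1.6e12     # rough time of matter-radiation equality, in seconds
dev_now = 0.01    # assume the universe is flat to ~1% today

# Run the deviation from flatness backwards in time:
dev_eq = dev_now * (t_eq / t_now) ** (2.0 / 3.0)  # back through the matter era
dev_1s = dev_eq * (1.0 / t_eq)                    # back through the radiation era

print(dev_1s)  # ~1e-18: flat to better than 15 decimal places at t = 1 s
```

Even with these crude assumptions, the extrapolation shows why the early universe's density had to match the critical density to extraordinary precision: a fine-tuning inflation explains dynamically rather than assumes.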
There are two primary predictions that come out of inflationary models that appear to be testable today. They have to do (1) with the mass density of the universe, and (2) with the properties of the density non-uniformities. I'd like to say a few words about each of them, one at a time. Let me begin with the question of flatness.
The mechanism that inflation provides that drives the universe towards flatness will in almost all cases overshoot, not giving us a universe that is just nearly flat today, but a universe that's almost exactly flat today. This can be avoided, and people have at times tried to design versions of inflation that avoided it, but these versions of inflation never looked very plausible. You have to arrange for inflation to end at just the right point, where it's almost made the universe flat but not quite. It requires a lot of delicate fine-tuning, but in the days when it looked like the universe was open some people tried to design such models. But they always looked very contrived, and never really caught on.
The generic inflationary model drives the universe to be completely flat, which means that one of the predictions is that today the mass density of the universe should be at the critical value which makes the universe geometrically flat. Until three or four years ago no astronomers believed that. They told us that if you looked at just the visible matter, you would see only about one percent of what you needed to make the universe flat. But they also said that they could offer more than that — there's also dark matter. Dark matter is matter that's inferred to exist because of the gravitational effect that it has on visible matter. It's seen, for example, in the rotation curves of galaxies. When astronomers first measured how fast galaxies rotate, they found they were spinning so fast that if the only matter present was what you saw, galaxies would just fly apart.
To understand the stability of galaxies it was necessary to assume that there was a large amount of dark matter in the galaxy — about five or ten times the amount of visible matter — which was needed just to hold the galaxy together. This problem repeats itself when one talks about the motion of galaxies within clusters of galaxies. The motion of galaxies in clusters is much more random and chaotic than the rotation of a spiral galaxy, but the same issues arise. You can ask how much mass is needed to hold those clusters of galaxies together, and the answer is that you still need significantly more matter than what you assumed was in the galaxies. Adding all of that together, astronomers came up only to about a third of the critical density. They were pretty well able to guarantee that there wasn't any more than that out there; that was all they could detect. That was bad for the inflationary model, but many of us still had faith that inflation had to be right and that sooner or later the astronomers would come up with something.
And they did, although what they came up with was something very different from the kind of matter that we were talking about previously. Starting in 1998, astronomers have been gathering evidence for the remarkable fact that the universe today appears to be accelerating, not slowing down. As I said at the beginning of this talk, the theory of general relativity allows for that. What's needed is a material with a negative pressure. We are now therefore convinced that our universe must be permeated with a material with negative pressure, which is causing the acceleration that we're now seeing. We don't know what this material is, but we're referring to it as "dark energy." Even without knowing what it is, general relativity by itself allows us to calculate how much mass has to be out there to cause the observed acceleration, and it turns out to be almost exactly equal to two-thirds of the critical density. This is exactly what was missing from the previous calculations! So, if we assume that this dark energy is real, we now have complete agreement between what the astronomers are telling us about the mass density of the universe and what inflation predicts.
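The budget described here can be tallied in units of the critical density (the cosmologists' Ω). The numbers below are the rough, illustrative values quoted in the talk, not precision measurements:

```python
# Density budget in units of the critical density (Omega = rho / rho_c)
omega_visible = 0.01      # luminous matter: about one percent
omega_dark_matter = 0.32  # inferred dynamically from galaxies and clusters
omega_dark_energy = 0.67  # inferred from the accelerating expansion

omega_total = omega_visible + omega_dark_matter + omega_dark_energy
print(f"Omega_total = {omega_total:.2f}")
# close to 1 -- the geometrically flat universe that inflation predicts
```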
The other important prediction that comes out of inflation is becoming even more persuasive than the issue of flatness: namely, the issue of density perturbations. Inflation has what in some ways is a wonderful characteristic — that by stretching everything out (and Paul's model takes advantage of the same effect) you can smooth out any non-uniformities that were present prior to this expansion. Inflation does not depend sensitively on what you assume existed before inflation; everything there just gets washed away by the enormous expansion. For a while, in the early days of developing the inflationary model, we were all very worried that this would lead to a universe that would be absolutely, completely smooth. After a while several physicists began to explore the idea that quantum fluctuations could save us. The universe is fundamentally a quantum mechanical system, so perhaps quantum theory was necessary not just to understand atoms, but also to understand galaxies. It is a rather remarkable idea that an aspect of fundamental physics like quantum theory could have such a broad sweep. The point is that a classical version of inflationary theory would predict a completely uniform density of matter at the end of inflation. According to quantum mechanics, however, everything is probabilistic. There are quantum fluctuations everywhere, which means that in some places the mass density would be slightly higher than average, and in other places it would be slightly lower than average. That's exactly the sort of thing you want to explain the structure of the universe. You can even go ahead and calculate the spectrum of these non-uniformities, which is something that Paul and I both worked on in the early days and had great fun with. The answer that we both came up with was that, in fact, quantum mechanics produces just the right spectrum of non-uniformities.
We really can't predict the overall amplitude — that is, the intensity of these ripples — unless we know more about the fundamental theory. At the present time, we have to take the overall factor that multiplies the predicted intensity of these ripples from observation. But we can predict the spectrum — that is, the complicated pattern of ripples can be viewed as ripples of many different wavelengths lying on top of each other, and we can calculate how the intensity of the ripples varies with their wavelengths. We knew how to do this back in 1982, but recently it has actually become possible for astronomers to see these non-uniformities imprinted on the cosmic background radiation. These were first observed back in 1992 by the COBE (Cosmic Background Explorer) satellite, but back then they could only see very broad features, since the angular resolution of the satellite was only about seven degrees. Now, they've gotten down to angular resolutions of about a tenth of a degree. These observations of the cosmic background radiation can be used to produce plots of the spectrum of non-uniformities, which are becoming more and more detailed.
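The jump in angular resolution mentioned here can be translated into the multipole number ℓ used in CMB power spectra via the standard rule of thumb ℓ ≈ 180°/θ:

```python
# Rough conversion from angular scale on the sky to CMB multipole number
def multipole(theta_deg):
    """l ~ 180 degrees / theta, the standard rule of thumb."""
    return 180.0 / theta_deg

print(f"COBE, ~7 degree resolution -> l ~ {multipole(7):.0f}")
print(f"modern, ~0.1 degree        -> l ~ {multipole(0.1):.0f}")
# COBE probed only multipoles of a few tens; a tenth of a degree reaches
# l ~ 1800, resolving the acoustic peaks described in the text
```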
The most recent data set was made by an experiment called the Cosmic Background Imager, which released a new set of data in May that is rather spectacular. This graph of the spectrum is rather complicated because these fluctuations are produced during the inflationary era, but then oscillate as the early universe evolves. Thus, what you see is a picture that includes the original spectrum plus all of the oscillations which depend on various properties of the universe. A remarkable thing is that these curves now show five separate peaks, and all five of the peaks show good agreement between theory and observation. You can see that the peaks are in about the right place and have about the right heights, without any ambiguity, and the leading peak is rather well-mapped-out. It's a rather remarkable fit between actual measurements made by astronomers, and a theory based on wild ideas about quantum fluctuations at 10⁻³⁵ seconds. The data is so far in beautiful agreement with the theory.
At the present time this inflationary theory, which a few years ago was in significant conflict with observation, now works perfectly with our measurements of the mass density and the fluctuations. The evidence for a theory that's either the one that I'm talking about or something very close to it is very, very strong.
I'd just like to close by saying that although I've been using the theory in the singular to talk about inflation I shouldn't, really. It's very important to remember that inflation is really a class of theories. If inflation is right it's by no means the end of our study of the origin of the universe, but still, it's really closer to the beginning. There are many different versions of inflation, and in fact the cyclic model that Paul described could be considered one version. It's a rather novel version since it puts the inflation at a completely different era of the history of the universe, but inflation is still doing many of the same things. There are many versions of inflation that are much closer to the kinds of theories that we were developing in the '80s and '90s, so saying that inflation is right is by no means the end of the story. There's still a lot of flexibility here, and a lot to be learned. And what needs to be learned will involve both the study of cosmology and the study of the underlying particle physics, which is essential to these models.

Monday, 17 March 2014

Such a Massive Expansion: Gravitational Waves 10⁻³⁴ Seconds Old

(a found poem)

Cosmologists are digging
searching
subtle twist
polarized
gravitational wave
the fabric of spacetime

the universe will look a little hotter
The photons will scatter
astronomers to uncover evidence
the big bang
universe expanded
 — inflated — by at least a factor of ...

a theoretical framework
we can’t explain

would have driven such a massive expansion
born from quantum fluctuations
something incredibly fundamental
what was happening when the universe was only 10⁻³⁴ seconds old

it’s crucial to remain skeptical
distorted by intervening clusters of galaxies

My German translation of "Black Holes and Time Warps: Einstein's Outrageous Legacy" by Kip S. Thorne. Hannover, 17.03.2014. A day to remember!

17.03.2014 The Day Has Finally Come: Primordial Gravitational Waves in CMB B-Modes. It's Happening, Now! (Eugene Shoemaker)

"A major discovery", BICEP2 and B-modes
[Added note (on Sunday): It seems highly probable that these rumours are essentially true. Although the precise details of the results aren't yet public, the BICEP2 PI, John Kovac, has sent a widely distributed email with the following information: Data and scientific papers with results from the BICEP2 experiment will go public and be viewable here at 2:45pm GMT on Monday. At the same time a technical webcast will begin at this address.

It's going to be an exciting day!]

The cosmology rumour mill exploded today. Harvard Astrophysics have issued a press release stating that, on Monday, they will announce a "major discovery".

This is the only hard evidence that anything interesting is on the way, and it could be an announcement of anything that fits under the label of "astrophysics". This is important to keep in mind. However, for one reason or another (that is hard to nail down), cosmologists are suggesting that it is going to be about cosmology. The speculation is that it will be about the BICEP2 experiment, which has been measuring the polarisation in the CMB. More specifically, the rumour is that BICEP2 has seen primordial "B-mode" polarisation.

If this speculation is true, it would be a result of immense significance.

Primordial B-modes would be a smoking gun signal of primordial gravitational waves. This, alone, makes such a discovery important. Gravitational waves have not yet been observed, but are a prediction from general relativity. Therefore, such a discovery would be on the same level of significance as the discovery of the Higgs particle. We were almost certain it would be there, but it is good to finally see it.

However, the potential significance of such a result goes further because these primordial gravitational waves would need a source. The theory of cosmological inflation would/could be such a source. Inflation is a compelling theory, not without some problems, for how the universe evolved in its very earliest stages. If it occurred when the universe had a large enough temperature, it would generate primordial gravitational waves large enough to tickle the CMB enough to make these B-modes visible in the polarisation. As of yet, inflation has passed quite a few observational tests, but nothing has been seen that could be described as smoking gun evidence. A spectrum of primordial gravitational waves would very nearly be such a smoking gun. If the spectrum was scale invariant (i.e. if the gravitational waves have the same amplitude on all distance scales) that would be a smoking gun for inflation and accolades, Nobel Prizes, etc, etc, would flow accordingly.

All of this is just speculation, but some of it does seem to be coming from reputable sources. And some of my colleagues have been talking about tip-offs from people who wish to remain anonymous, so I figured I'd collect all the speculation I know of here in a post (let me know if I've missed anything):

Richard Easther, on the rumour and its implications
Bruce Bassett, on the probability that the rumours are true
The Guardian (the first major news source to pick up on this) with comments from various prominent cosmologists
Jester/Resonaances on the context (i.e. earlier constraints on primordial B modes)
Lubos Motl, amongst other things, explains what B mode polarisation actually is
Philip Gibbs, at viXra log
Peter Coles, amongst other things, on why gravitational waves mean there should be B-modes in the CMB polarisation
Sesh Nadathur on why, amongst other things, the rumoured measurement would appear to be in tension with results from Planck and WMAP

Sean Carroll has written a very thorough overview of the implications for cosmology (if the rumours are true).

The PI of BICEP2, John Kovac, gave a talk at the annual COSMO conference last year that had some pretty ambitious claims for how sensitive BICEP2 and similar experiments were going to be, so... well... we'll know on Monday. It should also be noted that, although the existence of these gravitational waves is a prediction of inflation, their amplitude is a free parameter and an amplitude this big is potentially a little surprising (for me, lower temperature inflation models just seem more compelling, others might disagree).

Source: http://trenchesofdiscovery.blogspot.de




Twitter: @just_shaun


The Jewel of Hannover: GEO 600 - Interview with Prof. Bruce Allen (Director at Max Planck Institute for Gravitational Physics Hannover) and Prof. Harry Collins (School of Social Sciences Cardiff University)

Why do you consider GEO600 your “favourite cuisine laboratory” in the global GW ground network? What is so special here in Hannover? All detectors in the world need one another, but the competing communities in the race do not always share their scientific culture. Could you explain this paradox?

Bruce Allen: What I said in my lecture was that GEO-600 is not the most sensitive of the gravitational wave (GW) detectors, because it is smaller than the others. However it has served as an important laboratory for developing and testing technology that was then later adopted by the other GW detectors. Currently it is the only detector in the world testing "squeezed light". Probably those methods will later be adopted by the larger (LIGO and VIRGO) instruments.

I don't see any paradox. The different detectors world-wide share a common goal: to detect GWs. They have developed somewhat different technology, because the different groups have different resources, different expertise, and most important, different opinions about what methods might work the best.


Harry Collins: It is not a paradox any more than international trade is a paradox. Countries gain mutual benefit from trading with one another but that does not mean they share their entire cultures with one another.

In gravitational wave physics it is a little more complicated because each group needs the other but they have different histories and different interests. GEO600 is unlikely to contribute much directly to the detections that ought to be arriving from the late 2010s because it is so small, but the group can contribute a lot to the development of the science that will go into the larger detectors. It may well be the smallness of the group and their detector that allows for the flexibility needed to pioneer new approaches. Quite properly, GEO600 will take its place among the discoverers when the first terrestrial detection comes along. Virgo is much larger, with 3 kilometer arms, and ought to be as sensitive, or nearly as sensitive, as the 4 kilometer detectors built by the Americans due to ingenious features of the design. So far, however, it has tended to lag behind the Americans in the speed of development and in fulfilling the promise of the technology -- perhaps because they have fewer personnel. For this reason, or because Italy has been the source of what are now counted as incorrect claims, Virgo tends to be very cautious in interpreting signals. The Americans, who own the two largest and most advanced detectors -- the devices without which there could be no detection -- have shown themselves to be very generous in spirit -- always ready to share the honours with the smaller detectors. I would say the collaboration is very healthy though there is the potential for disagreement when the first detection comes along if it is marginal.

Your field is on the brink of becoming a totally new science with LISA; this outstanding quantum leap in measurement will fulfill Einstein's legacy in its plenitude. According to Prof. Simon White, the first directly detected GW will come from merging black holes in about 2016. What kind of epistemic culture will emerge from this horizon? And how will this hegemony resonate among other scientific communities?

Bruce Allen: Our field will become new science with the LIGO and VIRGO instruments later this decade, not with LISA, which is about 20 years in the future. There is another effort underway to detect very low frequency gravitational waves, using "pulsar timing arrays" or PTAs. This effort might succeed at the same time, or later, or even earlier than the LIGO and VIRGO efforts. It will be scientifically very interesting, but complementary (different sources) to the LIGO and VIRGO efforts. White was talking about PTAs.


What is the chance of Planck detecting primordial gravitational waves through the so-called B-polarization mode?

Bruce Allen:  I would really be happy if the Planck satellite could detect the B-mode signature of primordial GWs. However my understanding is that they are probably not strong enough for Planck to detect them: a follow-on mission would be needed.

Would cosmic strings lead to an observable gravitational wave background?

Bruce Allen: Cosmic strings produce GWs. So if they are formed with high enough density, and at the right time in the history of the universe, then yes, they could produce an observable background. This is a topic that I worked on a lot many years ago, and published several papers about. However, there is so far no evidence that our universe ever contained cosmic strings. So it's a long shot!


The so-called Anthropic Principle (AP), in its many forms, attempts to explain why our observations of the physical universe are compatible with the life observed in it. From the Weak AP (WAP), which in one form states that "conditions that are observed in the universe must allow the observer to exist", to the Strong AP (SAP) which in one version states that: “The Universe (and hence the fundamental parameters on which it depends) must be such as to admit the creation of observers within it at some stage,” they all try to answer the question of why there is life in the universe, or why the fundamental constants are the way they are. But, do any of these principles add anything to our understanding of the ultimate question of life and the universe?
Perhaps the best answer is embedded in Martin Gardner’s sarcastic proposal of the Completely Ridiculous Anthropic Principle (CRAP): “At the instant the Omega Point is reached, life will have gained control of all matter and forces not only in a single universe, but in all universes whose existence is logically possible; life will have spread into all spatial regions in all universes which could logically exist, and will have stored an infinite amount of information, including all bits of knowledge which it is logically possible to know. And this is the end.” Is this logical conclusion our only chance?

Bruce Allen: The Anthropic Principle, that the universe is the way that it is because otherwise we would not be here to observe it, has always seemed rather sterile to me. Of course one can't argue against it, but on the other hand there is absolutely no way to falsify it. So I consider it to be a philosophical rather than a scientific statement. The fact that it can't be falsified, and thus can't be tested, puts it for me in the category of questions like "how many angels fit on the head of a pin" or "what happens when an unstoppable force meets an immovable object?"

Thank you very much, Prof. Bruce Allen and Prof. Harry Collins, especially for sending me this quote for Urania!

"My feeling that I have seen the dark side of the Moon is not so different from my feeling that I have seen a cup and saucer ...

That we do not feel terribly disadvantaged by the indirect quality of the seeing in the case of the dark side of the Moon is quite striking, but the really odd thing about the second spatial dimension of seeing reveals itself, as might be expected, when the seeing gets stormy. In heavily disputed areas, people who are far removed in social space from the instruments of seeing are often more certain about what has been seen than those who actually peered through the instruments. A big part of the sociology of seeing concerns the way that those distant from the instruments of seeing come to learn about what has been seen, and the way that they form their often very strong opinions.

One element in the explanation of this strange phenomenon is that, as a general rule, as you move from the scientific core group the message gets simpler and more straightforward. Those in the core group are aware of every argument and every doubt as it unfolds, whereas those a little more distant have their views formed from more-digested sources. In the nature of things, digested sources must simplify. The medium of transmission has, as it were, too narrow a “bandwidth” to encapsulate the sum total of the core’s activities. Other things being equal, then, “knowledge waves,” at least weak knowledge waves, behave in the opposite way from gravitational waves. Gravitational waves weaken as they spread; knowledge waves get stronger.

Because weak knowledge waves get stronger as they spread, it is hard for most of us to remember how faint they were at their source. In a modern science there are often only a very small number of scientists “looking through” the instrument and handling the strings of numbers that emerge. The study of seeing is about how this indirectness, this faintness, and all this scope for disagreement get turned into the kind of widespread certainty that allows us to say the equivalent of “We have seen the dark side of the Moon” and one day will allow us to say, without fear of contradiction, “Gravitational waves have been directly detected.”
Gravity's Ghost: Scientific Discovery in the Twenty-first Century. University of Chicago Press, 2010. pp. 11-12.


The Wave Hunters

The Wave Hunters - 01 - Expanded and Squeezed

The Wave Hunters - 02 - Stretched and Compressed

The Wave Hunters - 03 - Extremely Sensitive

The Wave Hunters - 04 - Squeezed Light

The Wave Hunters - 05 - New York, New York

The Wave Hunters - 06 - Clean Optics

The Wave Hunters - 07 - Tailor-Made and Remote-Controlled

The Wave Hunters - 08 - High Tech in the Cornfield

The Wave Hunters - 09 - A World Record

The Wave Hunters - 10 - Needles in Space

Friday, 14 March 2014

Lyric Poetry after Auschwitz by Herbert Marcuse*


Thanks to Peter Marcuse and Peter-Erwin Jansen, who granted permission to publish this piece solely for this publication on my homepage.


The question whether, after Auschwitz, poetry is still possible can perhaps be answered: yes, if it re-presents, in uncompromising estrangement, the horror that was - and still is. Can the same be said about prose? Prose is much more committed to reality than poetry, consequently estrangement is much harder to achieve - estrangement which still is communicable, “makes sense.” It has been achieved: Kafka, Beckett, Peter Weiss (in Aesthetik des Widerstands).[1]

What is involved is more than the “tragic experience” of the world of death and destruction, cruelty and injustice. The tragic experience of suffering is also the vision of its mitigation: Fate or the Gods, or Reason may still prevail (even the Greek tragedy has its negation in the ensuing Satyr-play).

But Auschwitz is the ultimate, is the refutation of Fate, the Gods, Reason; is the demonstration of total human freedom: the freedom to order, to organize, to perform the slaughter. That human freedom can be exercised with equal efficiency to prevent the slaughter, history still has to prove.

The Ultimate cannot be re-presented, cannot become “literature” without mitigating the horror. This is the guilt of the aesthetic form which is essential to art: sublimation. And the Anti-form, the negation of form, remains literature while the slaughter continues.

How can the immediacy be attained which undoes or suspends the sublimation without ceasing to be literature? For it is the immediacy that has to be caught here - as the starting point of all mediations (perhaps, as the ultimate reality, it defies all mediations). This immediacy is in the cry, the despair, the resistance of the victims. And it is preserved only in memory. To preserve and develop the memory of those who did not have a chance (and of the many millions who have no chance) is the legitimation of literature after Auschwitz.

Memory is a potential of (human) subjectivity. The turn toward subjectivity happens in a specific political, historical context: the continued power of those who were responsible or co-responsible for Auschwitz, and the apparently continued impotence of the Left. The rediscovery of the subject, and of subjective responsibility could at last be the negation of that degenerate historical materialism which shies away from the question of subjective responsibility by stipulating the objective responsibility of capital, labor, class, production process, etc. - the human subject disappears behind these relationships reified into thing-like entities moving under their own power. But if “the conditions” are responsible, what about the human subjects who make and who suffer the conditions? They are the ones who change them: literature is an emancipatory process in the human subjects before it becomes an objective process of changing institutions and economic-political conditions. And this process involves the entire mental structure: consciousness and the unconscious, intellect and emotions, drives striving for objectification.



It is nonsense to say we’re all responsible for Auschwitz, but we are responsible for preserving the memory. We? Those who know what happened, that it [is] still happening in many areas of the globe, and that there is no historical law which would perpetuate the Ultimate. Why should we refuse to live with the horror? Because there are, in spite of the sages of Marxist orthodoxy, not only men and women who are members of their class, who are existing in class relationships, who are shaped by the mode of production, etc. - there are also men and women who are the human beings in and against these conditions. They are supposed to be liberated and to fight for their liberation - not a class, not a bureaucracy. And they are those who have to organize (themselves).


Emancipation from the given conditions of life (which in the class society are necessarily repressive), transcendence beyond them toward more freedom, joy, tranquility are the drives which constitute subjectivity. This means that subjectivity is “in itself” (an sich) “political.” At least since Aristotle’s definition of man as logos echon, the Western tradition has restricted subjectivity to its rational features, and with Descartes, concentrated it in the Ego. In the last analysis a solitary Ego in a world of things, which has great trouble in getting together with other Egos, [DK: makes it difficult] to understand intersubjectivity.[2] Hegel connects this conception in comprehending the subject as spirit, objectifying itself in nature and society. And phenomenology sees in the transcendence of the Ego the very essence of the subject as consciousness: enclosed in the domain of thought.[3] But the transcendence of (“pure”) consciousness is only the abstract, purified form of a political process in the individuals, in which the individual introjects, and confronts his and her society.

The turn to subjectivity as emancipation is never a turn to the Ego as the center of a private sphere or as “unique.” Rather, the Ego always only appears as a particular manifestation of the general, which does not merely constitute its exterior but its interior as well. This general (the “context” of the Ego, which is inseparable from it) is the social, which in turn is rooted in biology. It is the Freudian unity of Ego, Superego, and Id, which only [together - RB] constitute the individual. The Superego and a “part” of the Ego are the representative of social conditions and institutions. The general penetrates the Ego in both poles of the psyche: (1) in the Superego as society; (2) in the Id as the various realizations of the primary instincts: Eros and Thanatos (life instinct and death instinct). Subjectivity is therefore generality, and the recourse to a private sphere is at best an abstraction. This abstraction is not only a matter of thought but also of behavior. It takes on a social function. It was always ambivalent in capitalism: a necessary sphere of protection against dehumanization and the deindividualization of life in everyday relations - but also powerlessness, unable to prevent the intrusion of exchange relations into the private sphere.

Today the power of exchange relations over the private sphere is reaching completion: the identification of the individual with the roles that it must play in society. For example: the liberalization of sexual morality. This subjugates the private sphere to exchange relations. It tends to turn the other person into an exchangeable object - repressive desublimation. A genuine liberation of the sexual sphere is incompatible with the repressive society. It would [instead - RB] require a sublimation of sexual relations to eroticism and their “broadening” into a common life-world, autonomy as solidarity - community as destiny. When great literature elevates sexuality to Eros, this transformation is not only that sublimation characteristic of all art but also the rebellion against the limitation of the life instincts in society.

Today system-conformist, repressive desublimation is becoming totalitarian. In multiple forms, it generates a captive audience, which is condemned to see, hear, and feel the manifestations of immediacy. In literature, desublimation appears in the discarding of form. Aesthetic form demands that the general be preserved in the particular of a work, as a binding testimony to truth. This essential quality of the aesthetic is by no means only the imperative of a specific historical style but rather a matter of the transhistorical power of art to uncover dimensions of man and nature which have been buried or leveled. When this dimension is absent, the writing remains solely a private matter, the publication of which has the sole rationale of private therapy.

It seems to offer an escape from the horror and impotence of the individual in society. Yet the flight into immediacy, encountering the Ego, also encounters the same society, which has made it an Ego. Society appears in a work indirectly, not as what it is, but rather as the context, in which the word is written. In the regression to the immediate Ego, this context is reduced, both in quantity and quality, to the experiential sphere of the Ego. The external is centered on the internal: form does not depend on what happens but on how the Ego experiences events. This was still possible in the classical epistolary novel (Werther!): but subjectivity as the basis of aesthetic form has become questionable today. Poetry and reality make this development evident in the extreme case: Werther’s suicide was still a challenge to society, while Jean Améry’s was a matter of despair, for which there was no more tomorrow.

If literature should nonetheless maintain its particular dimension of truth and represent the breach between dominant consciousness and the unconscious, then its subject can only appear as a victim of existing society, an existence that embodies resistance and hope. The author registers what is done to the subject. This labor is not a matter of the private Ego and its immediate experiences; instead the Ego “opens itself” to the general and to reality. And reality, measured at the extreme, is Auschwitz - as reality and possibility. But then it is not representable - neither in realism nor in formalism. For image and world already conjure up the unsayable and the unimaginable.

This consciousness motivates the struggle of the avant-garde against form and against the “work.” But the production of non-works dispenses with the inherent contents and the truth of form. Such non-works therefore frequently have a playful, uncommitted and artificial character (against Adorno!): they are exactly what they say they want to oppose: abstract. They lack substance: what makes them literature are words and their ordering - in other words, style, again exactly what they do not want to be (parallel: analytic philosophy).

Perhaps the possible presence of Auschwitz can be suggested in literature only negatively: the author must forbid himself from writing or describing trivialities - and such trivialities include some things he might think, do or not do. He cannot sing about parts of his body and their activities - after what Auschwitz has done to the body. He cannot describe his own love life, or those of others, without inviting the question as to how such love can still be possible, and without eliciting hate for whoever renders this love questionable. Nor can he sprinkle poverty and labor strife as “episodes” in his narrative. Given the desperation they entail, any such treatment would be untrue.

Yet a literature respecting such taboos would not be without hope. The hopelessness of those who struggle is reflected in the power of the author to communicate through the description of horror some of the resistance to reality today. But aesthetic form refuses an immediate representation of resistance and of the forces, always alive within it and able to survive all defeats: the will to live - and the need to destroy whatever suppresses this will.

The taboos just mentioned are not brought extrinsically to literature. They are based in the mimesis function of literature: to re-present reality in the light of that negativity that preserves hope. Auschwitz cannot be excluded from this thinking or dismissed. Nor can it be represented without sublimating the unsublimable through formal construction. It can only be present in the inability of humans to speak with each other without roles, and to love and to hate without anxiety and without fear of happiness. This inability must appear as the general in the particular, the destiny of reality - not as personal bad luck, misfortune, incapacity or psychological deficit.

Only the sublimation of personal experience can insert it into the dimension in which the reality appears as the general in the particular. The immediate cannot be separated from the particular individuality; everything else is external. Horror, as personalized, becomes a private event, which, however, because it is literature, needs to be published. Indeed it is published and sold because only such looking away from the real generality, from the external reality, can provide a good conscience to existing conditions. Reading what they do in bed and how still provides unspoiled pleasure.

It appears that literature after Auschwitz may still be possible, indeed even necessary, but it can no longer provide pleasure, at least not aesthetic enjoyment (but certainly pornographic enjoyment). This does not mean that [all - RB] literature which does not provide enjoyment is therefore authentic. The pitiful epigones of the dadaists and surrealists provide no aesthetic enjoyment, nor do they want to, without invoking the horror of reality. The destruction of form, the rejection of the (“organic”) work reflect only in a very limited way the real destruction underway in the world: in a bad abstraction, with no vision of hope.


Desublimated literature remains literature, i.e., it elicits the enjoyment which is inherent in aesthetic form. The classical (organic) form (the “work”) demands the transformation of the object, the content. In desublimated literature, the content is no longer transformed by form, nor internalized by form. Form becomes independent and reduced to style. Style can be extremely accomplished and mastered in all tiers of language, from everyday jargon, dialect, and administrative German all the way to the highest high language. Style “beautifies” the description of a sex act as well as a murder, the appearance of Hitler as well as Lenin . . .

The power of style indicates the poverty, indeed the irrelevance of the content. It is not formed by style: it remains rather in its immediacy: episodes from a whole, that is imperceptible. Or that is only a personal context for a hero, without transcendence and without the real sublimation that constitutes the general. Where reality beyond the private context constitutes the work (for example, the early Soviet state in the “Stories from Production”), reality renounces the beauty of style. People speak in perfect verses, but they versify a doctrine that has already congealed into ideology as well as a horrible reality, that robs the verse of any seriousness. For example: the piece becomes a hymn to the machine that requires human sacrifice. Reification of communism.

There is evidently a reality that resists form-giving, and which therefore cannot become an object of literature, without being falsified and reduced - and this is precisely the reality which should be remembered in litera­ture. This would mean that there is an internal border in literature: not every material would be appropriate for literature or form. Where is the legitimation of this imperative?

Just as literature has its internal truth, so too does it have an internal morality. That critical transcendence which is essential to literature ties literature both to the harm that oppression does to humans and to the memory of that past and to what can return. But the reality of Auschwitz cannot be transcended, it is a point of no return. Literature can remind us of it only through breaks and evasions: in the representation of people and conditions that led to Auschwitz and the desperate struggle against them. Representation remains obligated to the transformational mimesis: the brutal facts are subjugated to form-giving; reportage and documentary become raw material for formation through creative love (the principle of hope) and creative hate (the principle of resistance). The two principles of formation constitute an (antagonistic) unity, which is the political potential of art.

This principle forbids the trivializing and privatizing of literature. It does not permit centering the work on eating or sexuality . . . Precisely the political potential of art demands the formation of a general in the particular, that surpasses the “natural sphere.”[4]

But art abdicates not only before the extreme horror but also before the extreme situation as such. A telling example is the incompatibility between art and the depiction of the extreme manifestations of the body (such as fucking, masturbating, vomiting, defecating, etc.). This taboo is not asserted in terms of a more or less puritanical and petty bourgeois morality, but in terms of the very quality of the aesthetic form, its essential beauty. The avant-garde rejoices in its liberty to violate and shock petty bourgeois prejudice and repression - it achieves only the attraction of pornography. Not that these extreme situations are disgusting or perversions or ugly (the opposite may be the case), but they are turned into what they are not: “literature,” and the author plays the role of the voyeur.

According to Lessing, the extreme horror lies outside of the domain of the visual arts because its representation violates the law of Beauty to which art is subject. This law is also binding for literature, but there the extreme horror is within the power of production in a mediated form, that is, if it appears only as transitory in the context of the work, as a moment “in the story” - aufgehoben in the whole. Only by virtue of its transitoriness does the representation of the extreme horror allow the enjoyment of the work, the feeling of pleasure in its reception.

In the case of Auschwitz, no such aesthetic sublimation seems imaginable. The whole in the context of which Auschwitz could appear as transitory is itself one of horror, and the availability of ever more efficient scientific-technological killing suggests the possibility of repetition rather than passing.

If it is the historical imperative of survival that the memory of Auschwitz must be preserved in art, and that art exists necessarily under the law of Beauty, then we must admit the idea of an art that cannot be and should not be “enjoyed” and yet appeals to the consciousness and unconscious of the recipient. Release of “mauvaise (bad) conscience”? The drive to know the things which are not revealed in scientific as well as in everyday thought and speech and which are yet

[Editor’s Note: The manuscript breaks off at this point.]


* Editor’s note: An untitled text we are titling “Lyric Poetry after Auschwitz” was found in the Marcuse archive. It consists of four pages in English, followed by eleven pages in German, some fragmentary, and two rather fragmentary pages in English. It is not clear what the origins of this article are, what Marcuse intended it for, and why he wrote first in English, then in German, reverting in the final pages to English. It is found in the Herbert Marcuse archive under the number 560.00 with the description “Entwurf La Jolla, 1978.” A German version of the text with the title “Lyrik nach Auschwitz” was published in Peter-Erwin Jansen’s edited edition Kunst und Befreiung (Lüneburg: zu Klampen, 2000), pp. 157-66. We are following Jansen’s suggested title translated into English and Russell Berman has translated the German passages. (DK)

[1] Editor’s Note: Peter Weiss, Die Ästhetik des Widerstands, appeared in German in a three-volume edition in 1975, 1978, and 1981; an English translation by Joachim Neugroschel with an introduction by Fredric Jameson has appeared, The Aesthetics of Resistance, Volume 1 (Durham, N.C. and London: Duke University Press, 2005). (DK)

[2] Editor’s Note: Marcuse’s point seems to be here that the model of a solitary Ego makes it difficult to comprehend intersubjectivity, a defect of modern philosophy that Marcuse believes is overcome in Hegel. (DK)

[3] Editor’s Note: Marcuse is referring here to the phenomenology of Edmund Husserl and perhaps the early work of Jean-Paul Sartre, The Transcendence of the Ego (New York: Hill and Wang, 1991), which lays out an interpretation and critique of Husserl. (DK)

[4] Editor’s Note: Marcuse inserts “Vernunft?” (reason) in a handwritten note at the side of the margin at this point and the rest of the text is in English, is somewhat fragmentary, and breaks off before it is concluded. We do not know why Marcuse switched from English to German and then back to English in constructing this text.

Translated by Russell Berman 

Paul Celan "Todesfuge - Death Fugue" Poem animation German

 


In: Herbert Marcuse: Art and Liberation. Collected Papers of Herbert Marcuse. Volume Four. Edited by Douglas Kellner. London, 2007, pp. 211-217.