Earthquakes, wildfires, mass extinctions, stock-market crashes, and social revolutions all display the same fractal structure in the distribution of events by scale. The frequency of events falls off as a power of their magnitude: small events occur very often, medium events less frequently, and large events rarely – yet all are manifestations of the same process. There is no principled distinction between small and large events, between normality and catastrophe. A catastrophe is a rare but lawful fluctuation of a system that resides in a critical state.
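Stated as a worked formula (the symbols are mine, not the author's): if $N(s)$ denotes the frequency of events of size $s$, the claim is that

$$N(s) \propto s^{-\alpha}, \qquad \alpha > 0,$$

so that doubling the size of an event divides its frequency by the same fixed factor $2^{\alpha}$ everywhere along the scale; the Gutenberg–Richter statistics of earthquakes are one familiar empirical instance of such a law.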

The third mechanism of fractal causality is spatial transmission of change. When something happens in a system, its effect is felt not only at the immediate locus but also in neighbouring regions. Change propagates from one part to another as a chain of causes in which the outcome of the first step becomes the beginning of the second. If a patch of ground becomes saturated, water runs downhill and alters the moisture of adjacent patches. If pressure falls in one part of the atmosphere, air moves and changes pressure elsewhere. If an infection spreads, infected individuals contact susceptibles and transmit the pathogen.

A key parameter here is the connectivity of the system. A dense, highly branched network of links facilitates propagation of a process over considerable distances from its point of origin. If connectivity is weak or sparse, the process dies out. Thus a local event can gradually grow into a global change because the perturbation is transmitted step by step through contacting elements. Epidemics spread along networks of social contacts. Financial crises propagate through chains of debt and interdependence. In all these cases the structure of connections determines the dynamics of the process.
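A minimal sketch of this threshold behaviour, assuming nothing more specific than a branching process in which connectivity is reduced to a single average number of onward links (the function and parameter names are illustrative, not taken from the text), might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def cascade_size(mean_links, max_steps=30):
    """Crude branching-process sketch: each currently active element passes
    the perturbation on to a Poisson number of neighbours whose mean,
    mean_links, stands in for the connectivity of the system."""
    active, total = 1, 1
    for _ in range(max_steps):
        if active == 0:
            break                      # the perturbation has died out
        new = int(rng.poisson(mean_links, size=active).sum())
        active, total = new, total + new
    return total

# Below the critical connectivity (mean_links < 1) cascades stay local;
# above it (mean_links > 1) a single local event can keep growing step by step.
for r in (0.8, 1.0, 1.2):
    sizes = [cascade_size(r) for _ in range(200)]
    print(f"mean_links={r}: median={int(np.median(sizes))}, max={max(sizes)}")
```

The qualitative point is the one made above: the same local rule produces either a dying ripple or a system-wide cascade, depending only on how densely the elements are connected.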

The fourth mechanism of fractal causality is historical dependence. Any change leaves consequences: it alters form, distribution, conditions, or behaviour of elements. These consequences do not vanish but enter the current conditions, becoming part of the context for subsequent processes. Therefore the same impact produces different effects at different times because the environment has already been modified by past events. Soil scorched by fire absorbs moisture differently and supports different vegetation. A society that has passed through crisis responds differently to risk and uncertainty. An organism that has recovered from infection acquires immunity or chronic damage.

The concept of path dependence, developed in economic history and evolutionary economics, describes how past decisions constrain future possibilities. For example, the QWERTY keyboard layout persisted through historical contingency11, reinforced by market mechanisms rather than by optimality: its wide adoption created an infrastructure of training and production that made switching to alternative layouts uneconomical. Technological standards, institutional structures, and cultural norms display the same inertia. Once established, they become self-sustaining, even when the original causes of their emergence have disappeared.
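A conventional toy model of this kind of lock-in, not discussed in the text itself but often used to illustrate path dependence, is the Pólya urn: each adoption of a standard makes the next adoption of the same standard more likely, so early accidents become permanent. A minimal sketch (all names illustrative):

```python
import random

def polya_urn(steps=10_000, seed=None):
    """Toy model of path dependence: two competing standards, A and B.
    Each new adopter picks a standard with probability proportional to its
    current share, so early chance events are self-reinforcing."""
    random.seed(seed)
    adopters = {"A": 1, "B": 1}        # perfectly symmetric start
    for _ in range(steps):
        total = adopters["A"] + adopters["B"]
        pick = "A" if random.random() < adopters["A"] / total else "B"
        adopters[pick] += 1
    return adopters["A"] / (adopters["A"] + adopters["B"])

# Identical rules, identical starting conditions, different histories:
# the final share of standard A is decided by the accidents of the early draws.
print([round(polya_urn(seed=s), 2) for s in range(5)])
```

The rules never change and nothing is optimal about the winner; the outcome is simply the accumulated structure of its own past.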

Path dependence differs radically from linear determination. In linear determinism the past determines the future through a continuous chain of causes. In fractal determinism the past determines the future through accumulated structure – through the context in which current processes unfold. Evolutionary biology illustrates this with special clarity. Organisms carry in their structure the traces of the entire history of life on Earth. Each adaptation is overlaid upon preceding structures, modifying them but not cancelling them. Hence evolution does not proceed toward an optimum but wanders across a landscape of possibilities, where every step is constrained not only by present conditions but by the whole prior trajectory.

The fifth and most fundamental mechanism of fractal causality is scale invariance, or self-similarity. This is the property whereby the same processes repeat across different scales. Small and large obey the same rules: change produces a response, the response alters conditions, and the cycle repeats. The difference is only in size and energy, not in the principle of organization. Thus microscopic processes, for example turbulence in a droplet, develop according to the same logic as large atmospheric vortices. In economics this is especially striking: short-term price fluctuations are structured the same way as long-term trends, as Mandelbrot demonstrated in his studies of financial time series.

Fractality means that the form of behaviour is preserved when the scale of observation is increased or decreased. Only the level at which it manifests changes, not the structure of the process. This is a deep property of natural systems that long went unnoticed because traditional science focused on characteristic scales and sought specific laws for each level of organization. The fractal approach shows that many systems lack a characteristic scale. They are self-similar across scales, from the microscopic to the macroscopic.

Power laws, which describe the distribution of events by magnitude, are the mathematical expression of scale invariance. If the probability of an event falls off as some power of its magnitude, then changing the measurement scale preserves the shape of the distribution. This contrasts with the normal distribution, which has a characteristic scale (a mean and a standard deviation) and in which the probability of large deviations from the mean falls off exponentially fast. In systems governed by power laws there is no typical event: small, medium, and large occurrences form parts of a single continuum.
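Written out with illustrative symbols, the contrast is a one-line calculation. For a power-law density $p(x) = C\,x^{-\alpha}$, rescaling the unit of measurement by a factor $\lambda$ merely multiplies the density by a constant,

$$p(\lambda x) = C\,(\lambda x)^{-\alpha} = \lambda^{-\alpha}\,p(x),$$

so the curve has the same shape at every magnification and no scale is singled out. A normal density, by contrast, carries its scale explicitly, $p(x) \propto e^{-(x-\mu)^{2}/2\sigma^{2}}$, and events many $\sigma$ away from the mean are suppressed so strongly that a “typical” size exists.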

The population distribution of cities, the distribution of incomes, the distribution of firm sizes, the distribution of citations of scientific papers, the distribution of solar flare intensities, and the distribution of earthquake magnitudes all exhibit power-law behaviour. This indicates that the processes generating these distributions are self-organizing and scale-invariant. There is no external regulator prescribing an optimal city or firm size. The system itself generates the entire hierarchy of scales through local interactions and feedback.

This assemblage of feedback, transmission of perturbations, path dependence, and self-similarity constitutes the essence of fractal determinism. It is not founded on a single originating cause as in the classical Laplacean approach, nor does it admit pure randomness as in indeterminism. Everything is subject to necessity, but that necessity is not one-dimensional: it is multilayered, recursive, and self-repeating.

A crucial consequence of fractal determinism is a reconsideration of the relation between quantum mechanics and the macroscopic world. In standard quantum mechanics the state of a particle in superposition is fundamentally probabilistic: the outcome of a measurement cannot be predicted even with full knowledge of the wave function. This appears to contradict determinism. However, the process by which an uncertain quantum state gives rise to a definite macroscopic outcome is described by the theory of decoherence, which proceeds in a fully deterministic manner.

Decoherence arises from the irreversible interaction of a quantum system with its environment. The environment consists of an enormous number of degrees of freedom – photons, air atoms, molecules of the measuring apparatus. These interactions cause a rapid suppression of the interference terms of the wave function, after which the system behaves as if it were in one of the classical states. Decoherence occurs extremely rapidly for macroscopic objects, on timescales on the order of 10⁻²⁰ seconds, which renders quantum superposition practically unobservable. Decoherence does not solve the measurement problem in quantum mechanics. It does not explain why one particular outcome is observed rather than another. That remains fundamentally random in the Copenhagen interpretation. However, decoherence explains why the macroscopic world appears classical and deterministic. Quantum uncertainty does not penetrate the macroscopic world not because quantum mechanics ceases to apply, but because interaction with the environment makes interference unobservable.
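In the simplest textbook models (the notation is mine, not the author's), decoherence shows up in the reduced density matrix of the system: the diagonal entries, which carry the classical alternatives, survive, while the interference terms decay,

$$\rho_{ij}(t) \;\approx\; \rho_{ij}(0)\,e^{-t/\tau_{D}} \qquad (i \neq j),$$

with a decoherence time $\tau_{D}$ so short for macroscopic superpositions that the suppression is effectively instantaneous on any human timescale.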

In nonlinear and chaotic systems, small quantum fluctuations at the initial stage can be exponentially amplified and lead to observable macroscopic differences in the final state. Here, however, it is necessary to distinguish predictability from determinism. A system can be deterministic – its state uniquely determined by initial conditions and laws of evolution – and yet unpredictable because of sensitivity to initial conditions. In chaotic systems the exponential divergence of trajectories makes long-term prediction impossible even with infinitely precise knowledge of initial conditions, since any finite precision will be exhausted in finite time.
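The standard way to quantify this sensitivity (again with symbols of my own choosing) is the largest Lyapunov exponent $\lambda$: an initial uncertainty $\delta_{0}$ grows as

$$\delta(t) \;\approx\; \delta_{0}\,e^{\lambda t},$$

so the time over which a forecast stays within a tolerance $\Delta$ is only

$$t_{\text{pred}} \;\approx\; \frac{1}{\lambda}\ln\frac{\Delta}{\delta_{0}},$$

and improving the precision of the initial measurement a thousandfold adds merely $\ln 10^{3}/\lambda$ to the forecast horizon.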

Thus, although the outcome of an individual quantum event – for example, the decay of an atom – is regarded as fundamentally random, its macroscopic consequences can be deterministically predicted once decoherence has turned that outcome into a classical fact. Moreover, for ensembles of quantum events the probabilistic predictions of quantum mechanics become deterministic in the thermodynamic limit. The law of large numbers guarantees that fluctuations in relative frequency diminish as the number of events grows. Therefore macroscopic observables, which average over an enormous number of microscopic events, behave deterministically with precision beyond any practical means of measurement.
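The averaging argument can be stated in one line (with illustrative symbols): for $N$ effectively independent microscopic events, the relative fluctuation of a macroscopic average scales as

$$\frac{\Delta X}{X} \;\sim\; \frac{1}{\sqrt{N}},$$

and with $N$ of the order of Avogadro's number, roughly $6 \times 10^{23}$, this is about one part in $10^{12}$, far below what any instrument could resolve.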

Fractal determinism integrates quantum randomness as one mechanism through which necessity is realised at the macroscopic level. Quantum randomness does not cancel determinism, since macroscopic processes remain fixed by decoherence and statistical averaging. Thus quantum mechanics is compatible with fractal determinism, even if it is incompatible with linear Laplacean determinism.

Weather systems exemplify fractal determinism with particular clarity. The atmosphere is a turbulent fluid governed by nonlinear hydrodynamic equations. These equations are deterministic but generate chaotic dynamics because of nonlinear interactions and feedbacks. Edward Lorenz demonstrated that even a simple model of atmospheric convection exhibits sensitivity to initial conditions, making long-term prediction impossible. The famous “butterfly effect,” according to which the flap of a butterfly’s wings can alter the weather weeks later, illustrates this sensitivity.
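A minimal numerical sketch of this sensitivity, using the classical Lorenz parameters (the code and its variable names are illustrative, not the author's), follows two trajectories that begin almost identically and end up entirely different:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz convection model."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

# Two initial conditions differing by one part in a billion.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])

for step in range(4001):
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}   separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

The separation grows by many orders of magnitude before saturating at the size of the attractor: the equations are perfectly deterministic, yet the tiny initial difference decides which weather the model ends up in.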

Unpredictability, however, does not imply absence of determinism. The atmosphere remains a deterministic system in which each state is uniquely determined by the previous one. Unpredictability arises from the impossibility of measuring initial conditions with infinite precision and from the exponential growth of errors. At the same time the atmosphere displays a fractal structure on all scales: vortices of every size interact with one another, transferring energy from large scales to small via the turbulence cascade. The distribution of energy across scales follows Kolmogorov’s power law, which is a signature of scale invariance.
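The power law referred to here is the Kolmogorov spectrum of the inertial range, conventionally written (with the standard symbols, which do not appear in the text) as

$$E(k) \;=\; C\,\varepsilon^{2/3}\,k^{-5/3},$$

where $k$ is the wavenumber (the inverse eddy size), $\varepsilon$ the rate at which energy is passed down the cascade, and $C$ a dimensionless constant; the exponent $-5/3$ is the same throughout the inertial range, which is exactly what the absence of a preferred scale means.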

Climatic patterns such as El Niño exhibit self-organized criticality. The ocean–atmosphere system accumulates heat in the western Pacific until a threshold is reached, after which heat is rapidly redistributed eastward. This event affects weather worldwide via teleconnections – atmospheric waves propagating thousands of kilometres. The frequency and intensity of El Niño events are not regular, but they conform to statistical regularities that reflect the fractal structure of the climate system.

Financial markets display the same fractal logic. Prices and volumes reflect the continual operation of feedbacks. Market participants’ actions alter liquidity and sentiment; this changes subsequent decisions, which in turn affect prices. Most trades do not alter the dynamics, but when volumes align and accumulate, a single signal can trigger a chain reaction and grow into a global change of trend. Market crashes are lawful, though rare, fluctuations of a system in a critical state. The market constantly balances on the edge between stability and chaos, where the accumulation of imperceptible changes can suddenly produce a reconfiguration and a consequent collapse or rally. Therefore markets cannot be predicted exactly, but the probability of large moves can be assessed from signs of concentration of volume and volatility.

The fractal structure of markets arises spontaneously from the interaction of many participants, each acting on limited information and private aims. The market self-organizes into a critical state without external parameter tuning. This demonstrates the universality of self-organizing mechanisms that operate in systems of different natures: physical, biological, economic, social.

All this composes the logic of fractal determinism. The world is not predetermined in the Laplacean sense. The future is not encoded in initial conditions as an explicit plan awaiting execution. Yet the world inevitably unfolds according to its own internal links, feedbacks, path-dependencies, and scale-invariances. Chance is not the opposite of necessity but a form of its manifestation. Any deviation is built into the overall web of causes as a lawful fluctuation of a self-organizing system.

The synergetic aspect appears here as continuous self-renewal and self-organization. The Cosmos, understood fractally, constantly reproduces its own differences and generates new forms from the interaction of already existing ones.

Thus everything that exists is not merely the result of external causes but an active mode of being participating in its own self-determination. Human beings, like any other systems, are links in a large recursive pattern in which inner and outer, cause and effect, subject and object cease to be rigid oppositions. A choice deemed “free” is the outcome of an immensely complex interaction of genetic, hormonal, neural, and environmental factors operating across multiple temporal scales. In sum, not only human behaviour but all processes in nature and society are the results of multilayered, recursive causality in which each event is determined by the system’s entire prior history and by the whole network of current interactions.

This perspective naturally leads to a form of cosmic fatalism. Everything happens as it can happen because other outcomes are impossible within the given configuration of causal links. Every possibility is already the realization of one of the fractal directions of a system’s development, and any deviation is a lawful consequence of internal interaction. This fatalism does not exclude novelty; on the contrary, it makes novelty a necessary effect of complex interactions. The new does not arise in spite of determinism; it is its direct manifestation, a form born of the interplay among many causal threads, folds, and irregularities of being.

Therefore the idea of freedom, if it is possible at all, can only be understood as the recognition of one’s inclusion in an endless process of determination. Freedom is not the capacity to act independently of causes; it is rather an understanding of the causes that operate through us.

Fractal determinism unites self-motion, chance, necessity, and recursion into a single ontological schema in which everything – matter, consciousness, event – turns out to be different levels of the same self-generating process. There is no external observer, no position outside or above being from which one could judge it. Everything that exists is internal to being; the external is merely a shift of scale, a transition to another level of organization that likewise remains internal. Hence every difference is an internal differentiation of being with itself – a differentiation of the unified process into many local processes, each of which reflects the whole.

All that occurs is the infinite self-diffusion of being within its internal dimensions, the unfolding of potential differences into actual forms. If one regards such determinism as fatalism, it is a fatalism not of predestination by an external will or divine design but of the internal inevitability of a self-generating process. It is the fatalism of infinite self-similarity, in which every act repeats the structure of the whole, reproduces it at its own scale, yet never coincides with it completely, always adding a new difference, a new fold in the fabric of being.

This view has profound implications for the understanding of human existence. If our actions are determined not by a linear chain of causes but by a fractal chain of interactions, then the question of responsibility must be reformulated. We bear responsibility not because we could have chosen otherwise in some metaphysical sense, but because we are part of a causal network through which further trajectories of the system are realized – a system that already possesses, in its own terms, a predetermined end12. Our actions have consequences that propagate through networks of connection and affect future states. In this sense we are responsible as active participants in the process of self-organization, not as free agents standing outside causality.

Sapolsky emphasized that understanding the biological determinism of behaviour requires a revision of systems of punishment and reward. Fractal determinism extends this conclusion. If action and behaviour are properties of a multiplicity of interacting factors operating across different spatio-temporal scales, then the point of intervention must shift from the individual to the environment. We are not dealing with a “bad person” but with a person in a bad system (a bad society). This idea is taken to its ethical limit by Dietrich Bonhoeffer, who argued that society bears guilt for crime not merely because it failed to prevent it, but because its indifference produced it.

Fractal determinism, therefore, is not merely an update of classical determinism in the light of contemporary science, but a rethinking of the very nature of causality. It shows that determinism is compatible with unpredictability, necessity with contingency, lawfulness with novelty. It integrates the findings of quantum mechanics, chaos theory, self-organization theory, and complex-systems theory into a single philosophical picture in which being is understood as a self-generating, self-organizing process possessing an internal necessity that manifests across all scales through self-similarity.

Cosmic Pessimism

The Birth of the Universe from Nothing

In The Experience of the Tragic I intentionally set aside the question of the formal origin of cosmological eventfulness. It lay outside the project of the book and threatened to expand into an unwieldy survey of specific cosmological scenarios. Now, however, we can allow ourselves to pay closer attention to this topic.

The material Universe rarely becomes the explicit subject of contemporary pessimism. And even when it does, the Universe and cosmology usually leave no real trace in such works. Of course, the Universe is studied by science, but science is not concerned with the character of the Universe as a place or with its relation to life – that is not its task. And yet it is science that inadvertently pushes philosophy toward transformation and toward a new self-definition.

Today, the philosophy of pessimism, it seems to me, faces an urgent problem: to comprehend the nihilistic character of the Universe itself. Moreover, all the conclusions of modern physics point either to scepticism or to direct pessimism regarding the Universe as such. The Universe, like the world as a whole, is not indifferent to our existence – it is actively hostile to it.

The early pessimists of the late nineteenth century, while not speaking directly of the Cosmos, simply lacked sufficient knowledge about it. What was there to say when the expansion of the Universe was only announced in 1929? With the rise of philosophical nihilism in the twentieth century and the rapid development of the sciences, the idea of “indifference” – of the nihilistic nature of the Cosmos and the meaninglessness of life and of all living things – prevailed. At least, virtually all contemporary pessimistic works and authors repeat this mantra.

Philipp Mainländer perhaps came closest to cosmic pessimism: he distinguished in the world a tendency toward destruction and self-negation. Yet, alas, his metaphysics and the spirit of his age prevented him from thinking about the Cosmos impartially – just as Schopenhauer’s notion of the “Will” hindered Schopenhauer himself.

In the twentieth century Peter Wessel Zapffe, reflecting on the over-equipment of our reason, argued that this very surplus of consciousness makes us especially vulnerable to an indifferent, “silent” nature. We are forced to invent defensive mechanisms for reason itself. Subsequent pessimists of the twentieth and twenty-first centuries – from Cioran to Ligotti, from Benatar13 to others – continue this line: the Cosmos is taciturn and indifferent.

However, there are strands of pessimism that reject the nihilistic reading of the Cosmos. Thus, in the Russian-speaking EFILism community it is asserted14:

“The Universe was not created by an intelligent Being. Rather, its arising is conditioned by the fundamental necessity of reality itself. Perhaps the state of “nothing” (the absolute absence of everything) is logically impossible. Something must always exist.

The primordial state was not a true “nothing” but a quasi-stasis. Think of it as a potentially unstable equilibrium in which events were temporarily absent, while the intrinsic properties of reality (nomological laws) already contained the potential to disturb that calm. Put simply, the Universe is not a “creation” but a breach or disruption of an original, albeit fragile, equilibrium that occurred thanks to internal, necessary laws. It is like the “division of zero by zero” – not something someone did, but something that happened because of the fundamental properties of being itself. Just as gravity simply is, so too the existence of the Universe may be equally fundamental and require no external cause. But note an important point: most of these conclusions rest on the postulate of the Universe’s indifference – on the phenomenology of human experience of that “background,” rather than on detailed cosmology or an analysis of the world’s telos.

After that “beginning” the Universe does not move toward any goal or plan. It exists and unfolds as a purposeless but strictly deterministic chaos. It is an endless process of decay, recombination, and motion of matter and forces. There is no “plan,” “purpose,” or “meaning” in this dynamics. Events simply “happen” in accordance with unchanging physical laws and inertia. Imagine billions of falling dominoes where each fall is perfectly determined by the previous one, yet the entire chain has no final aim other than simply being. Life, including consciousness, is not some special, necessary, or “desirable” part of the Universe. It is merely a temporary, local “mutant of chaos” – an immensely complex but ephemeral configuration of matter and forces.”

Here the line of thought is strictly pessimistic, and the phrase “fundamental necessity” points toward a deterministic orientation. Yet contradictions soon appear, returning us once again to natural nihilism and the supposedly indifferent orientation of the cosmos. Let us return, however, to the question: why are works devoted specifically to “cosmic pessimism” either nonexistent or given that title only in a speculative sense?15 The idea of an indifferent Universe has become so familiar, even clichéd, that no one bothers to challenge it. Attention is focused on the “indifferent” role of the cosmos, while the direction of thought itself remains oriented toward the human being and the human place in the Universe. Such works are, of course, closer and more engaging to the reader, but this does not justify their titles. There are many reasons for this: a lack of philosophical interest in the cosmos, the Wittgensteinian principle that one should remain silent about what cannot be spoken of, and so on. But one of the key “problems” behind the absence of works on cosmic pessimism – apart from the difficulty of the topic itself – is, in my view, the unexamined and erroneous assumption of the cosmos’s indifference to life, from which further premises follow.

I will now try to show that the Universe has a direction and that this direction can, with some caution, be given the anthropocentric label “meaning” – though it is more accurate to call it a “process.” That the cosmos is, in fact, not quite so indifferent. And that in philosophy, especially in existential philosophy, it should be discussed not within the framework of nihilism, but strictly within the framework of cosmic pessimism.
