Inductive Cinema

    Is cinema invented or discovered?
Induction and deduction are the two sides of the process of giving meaning to experience: we induce a general rule from a set of cases, and we deduce new possible cases from previously established rules. Both are forms of inference: the movement of thought from the premises of an argument to its conclusion.

In the sciences, induction and deduction can correspond, respectively, to discovery and invention. Is mathematics discovered - which is to say generalized from cases observed to be true - or invented - its rules established first and tautologically ratified by the results of their application?

For my purposes, I will consider cinema as a mode of making meaning from experience. As such, the methods by which cinema is produced can be said to lie on a spectrum analogous to the hypothetico-deductive method. Throughout the history of cinema, filmmakers have proceeded along the two tracks of deduction and induction, and though what I will define as inductive cinema has always existed, the story of twentieth-century cinema is one in which the deductive mode becomes the dominant form of filmmaking.

    Deductive cinema
By deductive filmmaking, I mean something much like “invention” in the sciences: the established methods of cinematic fiction generally proceed by fixing a set of rules - a script, locations, actors, costumes, storyboards, dialogue - before any of the filming actually occurs. As much of the film as possible must be made before any image is committed to a sensor of any kind.

Some structural cinema, video art of the conceptual era, and parts of Fluxus can be said to have stretched the project of deductive filmmaking to its logical conclusion. Consider the tradition of flicker films, Kubelka’s Arnulf Rainer or Tony Conrad’s The Flicker. More than merely deductive, these films are programmatic; the existence of the film itself, the material support for the inscription of light, becomes superfluous, and the algorithm that generates the patterns of inscription becomes the core of the piece.

In a way, the flicker film is industrial cinema’s horizon, methodologically speaking. The best-laid plans crumble on first contact with the enemy, and it sometimes becomes obvious during the first day on set that the inevitable gap between the plan and the reality it presupposed is so large that the entire endeavor risks collapsing upon itself. Producers who want to avoid such a fate must make sure that the film matches the plan. This logic underlies the rise of films that can be described as one long visual effect, the Marvel and Avatar theme park rides, previsualized to death, exhausted long before they are ever realized. The material apparatus of the realization itself - actors in mocap suits, slung about from cranes, their faces photoscanned to be composited over CG doubles - intimates the instrumentalization of the bodies that will appear on the screen.

    Inductive Cinema
Inductive cinema, on the other hand, is concerned with discovery among existing images - from the mass of particular pictures, patterns may arise. If deductive cinema puts its effort into pre-production, the inductive mode focuses on post-production, studying the links, the distances, the meanings that emerge unexpectedly between the images. Despite strong directives against this methodology in certain normative production environments, documentaries tend to be produced inductively - the found footage film is perhaps the limit case of inductive cinema, one that adopts an almost paranoid posture in which any image is liable to react to any other, giving rise to unpredictable new meanings.

In practice, most cinematic production traverses this spectrum, from a given intensity of pre-making to a given affordance for post-production. In the center lies production itself, which is, in a way, irrelevant to these exclusively formal considerations.
Out of a pure spirit of contradiction, and a natural suspicion towards forms that present themselves as dominant, I would like to outline the requirements for a mode of filmmaking that presents itself as an alternative to the industrial model.
   

 INDUCTIVE / DEDUCTIVE

    I. Quantity is a Quality
The parameter that causes the shift from deduction to induction is quantity.
Scarcity invites causality: the few things that are there are there for a reason; they must interact in a given way. Generally, producers attempt to keep the amount of footage generated for a film as close as possible to the amount needed to complete it.
To make a discovery among images, we need lots of them. This is especially the case when we are investigating images themselves - a discovery about images, among images.

    II. Heterogeneity
The material must be heterogeneous, so as to introduce noise into the signal. In communication theory, noise is always simultaneously that which must be removed from a signal and that which carries it to its destination. Sets that contain many similar images tend to diminish the amount of noise in the material - they carry a synthetic signal, ordained before transmission. By contrast, large datasets, synthetic and otherwise, can contain errors, false positives or negatives, unusable material, blurry bloopers where the boom is in the frame - all of which constitutes the noise of reality. The inductive filmmaker should first gather or construct a set that contains more than the thoughts inside their own head; their work consists in weaving the rules of assembly of the set so that the thoughts buried in the material may be fed back to us.

INDUCTIVE CINEMA: Database and Latent

Inductive cinema, broadly defined above, can be further subdivided. This will allow us to describe and attempt to understand the specificity of contemporary neural network technologies, in opposition to the now more established processes of database cinema.

Around the turn of the millennium, the database presented itself as the new hegemonic symbolic form for a computerized society. It stood in perfect bilateral opposition to the sequential narrative forms that had dominated the twentieth century, particularly those of cinema. The database as a cultural form, heralded by the advent of the web, offered open-ended, spatial exploration of large sets of discrete items, variably ordered and organized, breaking free from the ordained, destinal and unchanging trajectories of narrative cinema.

The current “AI boom”, ongoing in the public eye since the release of ChatGPT in late 2022, has given rise to a new symbolic form for a society we may now describe as vectorized, rather than merely computerized: the latent space. The latent space descends from the database - it is the result of its extreme compression. Perhaps surprisingly, the answer to the question “What to do with all the decaying memory rotting on the internet?” involves its pulverization by the processes of tokenization and diffusion, and these have yielded a space of possible media orders of magnitude larger than the training datasets. In other words, the latent space contains more data than the set that was used to produce it.
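A minimal sketch may make the claim concrete. The decoder below is a hypothetical stand-in for any trained generative model (a VAE or diffusion decoder), and the dimensions are assumptions, not a real implementation: the finite dataset is gone, but the latent space it produced is a continuum that can be sampled at arbitrarily many points.

```python
# A minimal sketch, assuming a hypothetical decoder standing in for any
# trained generative model; DATASET_SIZE and LATENT_DIM are illustrative.
import numpy as np

DATASET_SIZE = 10_000  # the finite, indexed training set (hypothetical)
LATENT_DIM = 512       # dimensionality of the latent space (assumed)

# Hypothetical stand-in for a trained decoder: a fixed random projection
# from latent vectors to 64x64 'images', just to make the sketch runnable.
W = np.random.default_rng(42).standard_normal((LATENT_DIM, 64 * 64))

def decode(z: np.ndarray) -> np.ndarray:
    """Map a latent vector to an image-shaped array; in a real system
    a VAE or diffusion decoder plays this role."""
    return (z @ W).reshape(64, 64)

rng = np.random.default_rng(0)
# Every draw is a new point in a continuum: no lookup, no index, no original.
for _ in range(3):
    z = rng.standard_normal(LATENT_DIM)
    print(decode(z).shape)
```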

    III. Declension (Possibility) not Indexicality
    IV. Atemporality (Infinity) not Boundedness
In theory, it’s best if the films are infinite. It is enough for them to be either too long to be watched in their entirety, or constantly recombined (much better than prerendered), so that the same material never produces the same associations.
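As a sketch of what “constantly recombined” might mean in practice (the clip pool below is hypothetical), an endless generator can draw a new juxtaposition from the material at every step, so that no fixed sequence is ever rendered:

```python
# A minimal sketch, assuming a hypothetical pool of clip files: instead of
# prerendering one fixed cut, sample a new juxtaposition at every step, so
# the same material never produces the same associations.
import itertools
import random

clips = [f"clip_{i:04d}.mov" for i in range(5000)]  # hypothetical pool

def infinite_cut(pool, seed=None):
    """Endless generator of clip pairs; each edit is drawn anew, never stored."""
    rng = random.Random(seed)
    while True:
        a, b = rng.sample(pool, 2)
        yield a, b  # meaning arises in the unpredictable meeting of a and b

# The film has no final render; here we only peek at three juxtapositions.
for a, b in itertools.islice(infinite_cut(clips), 3):
    print(a, "->", b)
```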

Things think, it is known, and the process of inductive filmmaking can be thought of as that by which the thoughts of the moving images are revealed or otherwise transmitted. To see things think, one needs lots of different things that can do lots of different things, and one must look at them for a long time.

    V. Continuity (Fuzziness) not Discreteness
    VI. Spatiality not Sequentiality
    VII. Statistic not Heuristic
    VIII. Correlation not Causation
    IX. Nebula not Hierarchy

Saussure - “Our thought - apart from its expression in words - is only a shapeless and indistinct mass. [...] Without language, thought is a vague, uncharted nebula. There are no pre-existing ideas, and nothing is distinct before the appearance of language.”

The latent space may give access to thought as such, to the vague and uncharted nebula. However, it is not the formless space that precedes language that we are given over to, but rather the one that follows language. To extend the astrophysical metaphor: nebulae can be star-forming regions or supernova remnants. Star-forming regions are clouds of interstellar matter collapsing inwards, contracting into areas of extreme density where stars can form. When stars collapse at the end of their lives, they explode and release matter, forming expanding nebulae. The latent space corresponds to this latter kind of nebula: the matter it carries is hot, enriched with elements forged in the heat of the dead star. So too is a latent space enriched, superheated. It doesn’t lack language; rather, it liquefies it, pulverizes it: it unresolves deixis (reference), configuring a territory of radical anti-indexicality. In a latent space, the particular, indexed media of the dataset are absent - it is their destruction that enables the production of a continuum of possibility, the space between the elements.
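The “space between the elements” can be pictured with a small sketch. The vectors below are random stand-ins, not the latents of a real model: interpolating between two points of a latent space traces a path of intermediate positions, none of which indexes an item of the dataset.

```python
# A minimal sketch, assuming a Gaussian latent space of dimension 512;
# z_a and z_b are random stand-ins for the latents of two real media items.
import numpy as np

def slerp(z0: np.ndarray, z1: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation: the usual way to walk the arc
    between two points of a high-dimensional latent space."""
    cos_omega = np.dot(z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(7)
z_a = rng.standard_normal(512)  # latent position of one 'memory'
z_b = rng.standard_normal(512)  # latent position of another

# Nine interior points on the arc from z_a to z_b: a continuum of
# possibility where no dataset item resides.
path = [slerp(z_a, z_b, t) for t in np.linspace(0.1, 0.9, 9)]
print(len(path), path[4][:4])
```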



Copyright / Guillaume Menguy / 2025