Movies. Film. Cinema. Motion Pictures. Some people who want to sound fancy make a distinction between these terms, but they are pretty much interchangeable. This page describes some basic terms for formal analysis of film.
Films are created by a team of people with highly specialized jobs. While the director has a lot of influence over a film, so too do the screenwriter(s), the production designer, the editor, and the producers. Scholars therefore often use the generic terms filmmaker or filmmakers in formal analysis to represent the collective authorship of a film.
Still. The still image is the most fundamental technical building block of film.
When we see images moving on a movie screen, we are not actually seeing movement. We are seeing a quick succession of still images that our brains interpret as movement.
This zoopraxiscope demonstrates the phenomenon known as apparent motion: when the disk is spun at an appropriate speed, the donkey appears to be in motion.
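To get a feel for just how many still images are involved, here is a minimal arithmetic sketch. It assumes the conventional sound-film rate of 24 frames per second and a hypothetical two-hour running time; neither figure comes from this page.

```python
# Arithmetic sketch: how many still images pass by during a feature film?
# Assumes the conventional sound-film rate of 24 frames per second and a
# hypothetical 120-minute running time (neither figure is given on this page).
FRAMES_PER_SECOND = 24
running_time_minutes = 120

total_stills = FRAMES_PER_SECOND * running_time_minutes * 60
print(f"A {running_time_minutes}-minute film shows roughly {total_stills:,} still images.")
# -> A 120-minute film shows roughly 172,800 still images.
```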
A shot is an unedited series of still images.
When a filmmaker films multiple versions of a shot, those are called takes. The film's editor will choose which take (or takes) to use in the film.
A scene is a sequence of shots edited together.
No matter a film's verisimilitude (or apparent realness), it is always a mediation of reality. The filmmaker focuses the audience's attention, manipulating the composition of individual frames and shots, as well as time and space. Form is the means by which a filmmaker focuses the audience's attention.
Film didn't just burst into the world fully formed. Over a few decades (1890s-1910s), early filmmakers created a visual language for storytelling. The conventions and patterns of this language are called cinematic language, or film grammar.
Compare the two clips below, for example.
Although film is an international medium, cinematic language is not entirely universal. As with any medium, conventions can differ by culture and time period. Compare the following clips, one from the 1990 American film Mermaids and the other from the 1959 Japanese film Good Morning.
Mise-en-scène (pronounced meez-on-SEN) is a theatre term that refers to the staging of a play; that is, all the physical elements on stage. In film, it refers to everything physical that appears on screen. A film's design can help to establish the time period, location, and character mood or emotions.
Movies can be filmed on a soundstage, on location, or a combination of both. A soundstage offers more control of the environment, particularly lighting and weather conditions. Location shooting can lend a film a heightened sense of realism (though not always).
Costumes, makeup, and hairstyling help to establish character personalities and emotions, as well as further the film's narrative.
The placement of all people, props, and lighting within the frame contributes to the shot's composition.
Framing the shot helps to focus audience attention. Just as a photographer or painter does, the filmmaker makes intentional choices about what to include in the frame and, sometimes more importantly, what not to include. Consider the following images of McCormick Hall taken by students during a Fall 2022 Introduction to Film class, for example. How does each image tell a story about McCormick Hall through framing?
Because film has a temporal quality, the composition of a shot can change. Reframing refers to the movement of the camera to change what appears on screen. Kinesis is movement within the frame. (The choreography of movement within the frame is called blocking.)
The composition of a shot can be categorized generally as open framing or closed framing. Open framing includes a fair amount of negative space and gives characters freedom to move within and outside the frame. Closed framing feels claustrophobic, and characters can seem trapped within the frame.
Action or sound in a scene can occur on-screen or off-screen.
The term cinematography refers to everything the camera is doing. The composition of a shot is created through a number of technical strategies. (Examples in this section are taken from Spike Lee's 1989 film Do the Right Thing.)
Film frames are, by convention, rectangular. A film's aspect ratio is the ratio of the frame's width to its height. There are several standard aspect ratios, but the two most common in the United States are Academy (1.375:1) and American widescreen (1.85:1).
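As a rough illustration of the arithmetic, the sketch below divides a hypothetical frame width by each of the two standard ratios to find the corresponding frame height; the pixel width is an arbitrary example, not a value from this page.

```python
# Sketch of the aspect-ratio arithmetic: height = width / ratio.
# The 1998-pixel width is a hypothetical example, not a standard from this page.
ASPECT_RATIOS = {
    "Academy": 1.375,
    "American widescreen": 1.85,
}

frame_width = 1998  # hypothetical width in pixels
for name, ratio in ASPECT_RATIOS.items():
    frame_height = frame_width / ratio
    print(f"{name} ({ratio}:1): {frame_width} px wide -> about {frame_height:.0f} px tall")
# Academy (1.375:1): 1998 px wide -> about 1453 px tall
# American widescreen (1.85:1): 1998 px wide -> about 1080 px tall
```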
The implied proximity of a shot refers to the implied distance between the camera and its subject. Common distances include the following:
Camera angles can create a variety of psychological and emotional effects. A low angle, for example, lends an air of authority or respect to the subject, while a high angle makes subjects seem smaller. Dutch (or canted) angles can suggest anger, fear, or other heightened emotions.
To give a sense of three dimensions (or depth) to a flat artistic medium, filmmakers often compose shots in multiple planes, including the foreground, the middle ground, and the background.
Camera movement adds another dimension to a shot. The term pan refers to the camera pivoting from side to side. Tilt refers to the camera pivoting up and down. In a tracking shot, the camera itself moves through space, whether on a dolly, on a crane, or handheld.
Film stock comes in several formats (or gauges): 8mm, Super 8mm, 16mm, 35mm, 65mm, and 70mm. In general, the wider the film stock, the better the quality and the more expensive. 35mm is the most common format for commercial films.
The quality of the image is influenced by a number of factors, including the speed of the film stock.
Fast film stock is highly sensitive to light and is therefore appropriate for low-light situations. Because its light-sensitive grains are larger, fast stock tends to have a grainier look, especially when exposed in bright settings. Slow film stock is not as sensitive to light and therefore needs a lot of it; it tends to capture finer detail, however.
Lighting for a shot can come from natural and artificial sources. The sun is an obvious source for natural lighting. Artificial lights include spotlights, which throw hard lighting, and floodlights, which throw soft lighting. Light from a spotlight is focused, whereas light from a floodlight is more diffused.
The most common convention for lighting in feature films is three-point lighting, which includes a key light (the main source of lighting), a fill light (which fills in shadows created by the key light), and a back light (which defines the subject against the background).
In low-key lighting, there is high contrast between light and shadows, whereas in high-key lighting subjects are lit evenly without many (or any) shadows.
Camera lenses fall into two categories: prime and zoom. Prime lenses include three basic categories of focal lengths:
Short (or wide-angle) lenses (12.5-25mm) make objects in the frame seem farther apart than humans typically perceive.
Medium (or normal) lenses (35-50mm) approximate depth as perceived by humans.
Long (or telephoto) lenses (85-500mm) make objects in the frame seem closer together than humans typically perceive.
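One way to see why shorter focal lengths are called wide-angle is to estimate the horizontal angle of view each category covers. The sketch below uses the standard formula 2 × arctan(frame width / (2 × focal length)) and assumes a frame width of roughly 25 mm as a stand-in for the 35mm camera aperture; the exact width depends on the format and is not given on this page.

```python
# Rough sketch: horizontal angle of view for each prime-lens category,
# using fov = 2 * arctan(frame_width / (2 * focal_length)).
# Assumes a frame width of roughly 25 mm (an approximation of the 35mm
# camera aperture); the exact width depends on the format.
import math

FRAME_WIDTH_MM = 25.0

def horizontal_fov_degrees(focal_length_mm: float) -> float:
    """Approximate horizontal angle of view, in degrees."""
    return math.degrees(2 * math.atan(FRAME_WIDTH_MM / (2 * focal_length_mm)))

for label, focal_length_mm in [("short / wide-angle", 25),
                               ("medium / normal", 50),
                               ("long / telephoto", 200)]:
    print(f"{label:<20} {focal_length_mm:>3} mm -> about "
          f"{horizontal_fov_degrees(focal_length_mm):.0f} degrees")
# short / wide-angle    25 mm -> about 53 degrees
# medium / normal       50 mm -> about 28 degrees
# long / telephoto     200 mm -> about 7 degrees
```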
Zoom lenses are characterized by the ability to change focal lengths. Shifting the plane of focus within a continuous shot (for example, from one subject to another) is called rack focus.
Filmmakers can combine zooming and tracking (a technique sometimes called a push-pull shot, or dolly zoom) by tracking forward at the same time the lens's focal length is shortened. This type of shot is often used to represent a character's surprise or disorientation.
Editing is the process of selecting and arranging components of a film in order. A film editor cuts and splices shots to clarify or manipulate the audience's understanding of the spatial and temporal relationships between them.
The Kuleshov effect explains how viewers make meaning from one shot to another: when two shots are juxtaposed (i.e., placed next to each other), audiences look for meaning in their pairing. The image on the left is an example. The shot of the man is the same in each pair of shots, but audiences perceive his expression differently based on the shot that follows.
Common editing techniques that establish temporal relationships between shots include flashback (which represents events in the past), flashforward (which represents events in the future), ellipsis (which indicates missing time), and montage (which compresses time).
The film editor also establishes the rhythm of a film through the duration (or length) of shots. The longer the shot, the more detail the audience is able to see; the shorter the shot, the less detail the audience can see.
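One common way to quantify this rhythm (not named on this page) is average shot length: the total running time divided by the number of shots. A minimal sketch with hypothetical figures:

```python
# Sketch of average shot length (ASL) as a measure of cutting rhythm.
# Both figures below are hypothetical, not taken from any particular film.
running_time_seconds = 100 * 60   # a 100-minute film
number_of_shots = 1200

average_shot_length = running_time_seconds / number_of_shots
print(f"Average shot length: {average_shot_length:.1f} seconds")
# -> Average shot length: 5.0 seconds
# A lower ASL means faster cutting and a quicker rhythm; a higher ASL means
# longer takes and a slower rhythm.
```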
Editing styles can be categorized generally as continuous or discontinuous.
In continuity editing (sometimes called invisible editing), the film editor tries to give the impression that all the shots in a scene are taking place in the same physical space and in continuous time. Audiences are not even supposed to notice the editing. Continuity editing is used in many narrative films and is designed to help audiences follow the action on screen. Graphic, spatial, and temporal relations are maintained from shot to shot.
In the master scene technique, the entire scene is filmed in one long shot (a master shot) to establish blocking, kinesis, and dialogue for the whole scene. The filmmakers then film a variety of shots (close ups, medium shots, etc.) that will be intercut with the master shot in editing to focus audience attention and create visual variety and rhythm.
The 180-degree rule is the most common feature of continuity editing because it ensures that action and movement on screen are consistent. By making sure that the camera stays on one side of an imaginary line called the axis of action, filmmakers ensure that audiences can keep track of the action and movement on screen. There are a few cases in which filmmakers may decide to cross the axis of action to create a specific effect, but the 180-degree rule is the norm in continuity editing.
Other common techniques in continuity editing include the following:
Shot / reverse shot is a common technique used to edit a conversation or character reactions. For audiences to understand the interaction, shots must be filmed on the same side of the imaginary axis of action, often over the shoulder of one character.
In an eyeline match, the first shot shows a character looking offscreen, while the second shot shows what the character is looking at. The shots must be filmed on the same side of the imaginary axis of action so that the audience understands where the character is looking.
A match on action shows continuous motion from shot to shot. This type of transition can be as simple as switching from a long shot to a medium shot of the same action, but it can also be used to establish a new axis of action in a scene.
In a match on graphic, a shape or graphic image (or sound) in one shot is echoed in the next shot.
A point of view (POV) shot shows the perspective of a character or group of people.
Discontinuity editing, which was pioneered in the 1920s by the Soviet montage filmmakers, does not attempt to give the impression that shots represent the same time and place. In fact, discontinuity editing brings attention to itself as editing. In this type of editing, filmmakers rely on the Kuleshov effect. Viewers must make their own meaning from shots that do not seem to take place in the same time and space.
Other common editing transition types that can be used in any type of editing include:
Sound can be defined by four primary characteristics:
The pitch of a sound refers to its frequency. In music, pitch is indicated by notes, which can be high or low.
The amplitude of a sound is also known as its volume, or loudness.
A sound's quality, or timbre, is its harmonic content. This is how listeners can tell the difference between different sources of sound, such as different musical instruments.
The fidelity of a recorded sound is its faithfulness to the original sound.
Film sound can have many sources. Sound that has a source in the world of the film and can be heard by the characters is called diegetic. Examples of diegetic sound include dialogue and sound effects. Nondiegetic sound has its source outside the world of the film and cannot be heard by the characters. Nondiegetic sound may include a musical soundtrack or voiceover narration.
When the source of a diegetic sound is visible in the frame, it is called on-screen. When the source is not visible, it is called off-screen.
Internal sound refers to the inner thoughts of a character that can be heard by the film's audience but not the other characters. External sound is audible to the characters in a scene.
Types of sound in a film may include dialogue, narration, music, and silence. Environmental sounds may include ambient sound (such as background noises), sound effects (which can be recorded from their natural sources), and Foley sounds (which are created by sound artists).
There are several good introductory textbooks on film analysis. Richard Barsam and Dave Monahan's Looking at Movies: An Introduction to Film, 5th ed., was the primary source for this page.
Want to learn more? Check out these video tutorials from Studio Binder: