I tested using Stable Diffusion to generate still images for color grading reference, and it works well. Below is a tutorial with screenshots and links:
This process uses:
1. Stable Diffusion A1111 (a self-contained version of Stable Diffusion for a Windows PC). Much more powerful than using the online version.
2. DaVinci Resolve - for the "SHOT MATCH TO THIS CLIP" function in the color grading panel. You can use a still image (from Stable Diffusion) as a clip in Resolve.
To install and use Stable Diffusion A1111, your PC should run Windows 10 or higher and have a discrete Nvidia video card (GPU) with 4 GB of VRAM or more. I've heard it can be installed on a MacBook too, but I don't have experience with that.
A very nice person made an automated installer for A1111: EmpireMediaScience/A1111-Web-UI-Installer on GitHub. (If the installer freezes, just start it again and it will probably finish.)
The online version of Stable Diffusion v1 is a viable alternative if you don't want to install A1111: https://huggingface.co/spaces/stabilityai/stable-diffusion-1
Note: the online version can only generate content from text, not from a picture. The latest online Stable Diffusion v2.1 is not recommended for generating artistic content, e.g., with artist styles or retro styles.
Here is a great beginner's guide for using the Stable Diffusion A1111 Web UI: "Stable Diffusion WebUI AUTOMATIC1111: A Beginner's Guide" on stable-diffusion-art.com.
This is how I color graded a clip from T2 using Stable Diffusion content:
First, I needed a screenshot of the T2 clip to load into A1111. I removed the teal from the clip using Resolve's color warper in the color grading panel, so that Stable Diffusion wouldn't incorporate the teal look into its output.
Resolve color warper tutorial:
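To grab the still in the first place, any screenshot tool works, but ffmpeg is a quick scriptable option. A minimal sketch (the clip here is a synthetic placeholder generated by ffmpeg itself; in real use, point the second command at your exported clip and timestamp):

```shell
# Placeholder input: generate a 1-second test clip.
# In real use, skip this and use your own exported movie clip.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=1280x544:rate=24 clip.mp4
# Grab a single frame as a PNG; -ss seeks to the timestamp before decoding,
# and -update 1 tells the image muxer to write one file, not a sequence.
ffmpeg -y -ss 0.5 -i clip.mp4 -frames:v 1 -update 1 still.png
```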
A1111 can use different model files to generate images. I did not use the basic Stable Diffusion 1.5 model; instead, I used realisticVision v1.3. You can use any model you want - see the A1111 guide above for installing additional models.
Here's the clip (teal removed) I loaded into the img2img panel of A1111 on the left and the generated output of a "retro" look on the right:
The img2img panel of A1111 lets you use a picture as a base for a new picture, keeping it very similar or changing it radically. You can use text to tweak the output. In the lower right of the screenshot you can see all the parameters I used to generate the output, including the image dimensions. It doesn't matter if the output looks a little weird or has completely different-looking people, as long as the coloring and overall picture have a retro look.
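On dimensions: Stable Diffusion's VAE works on 8-pixel latent blocks, so img2img width and height need to be multiples of 8 (A1111's sliders enforce this for you). If you prep frames outside the WebUI, here's a small sketch of the rounding; the function name and the 512 short-side default are my own choices, matching SD 1.5's training resolution:

```python
def sd_dimensions(width: int, height: int, short_side: int = 512) -> tuple[int, int]:
    """Scale a frame so its short side lands near short_side, rounding both
    sides to multiples of 8 (Stable Diffusion's latent block size).
    A sketch for prep work outside A1111; the WebUI does this for you."""
    scale = short_side / min(width, height)
    round8 = lambda v: max(8, round(v * scale / 8) * 8)
    return round8(width), round8(height)

# e.g. a 1920x816 movie frame (2.35:1) scales to 1208x512
print(sd_dimensions(1920, 816))
```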
I added the following prompt to give the base image on the left the vintage look I wanted:
dark, night, blue, scene from The Terminator 1984, photorealistic movie screenshot, (1985 movie aesthetics:1.5), shot on Eastman 250T 5293 film using Arriflex 35 III Camera
And I used the following negative prompt (stuff I didn't want in the output):
duplication, fake, cartoon, digital painting, text
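A note on the `(1985 movie aesthetics:1.5)` part of the prompt: that's A1111's attention syntax, where the number scales how strongly the phrase influences the image (1.0 is neutral). As a rough illustration of how such a prompt breaks into weighted chunks, here's a minimal parser sketch - my own simplification, not the WebUI's actual parser, which also handles nested parentheses and `[...]` de-emphasis:

```python
import re

# Matches explicit "(text:weight)" spans, e.g. "(1985 movie aesthetics:1.5)".
WEIGHTED = re.compile(r"\(([^():]+):([0-9]*\.?[0-9]+)\)")

def parse_prompt(prompt: str) -> list[tuple[str, float]]:
    """Split a prompt into (text, weight) chunks; unweighted text gets 1.0."""
    chunks, pos = [], 0
    for m in WEIGHTED.finditer(prompt):
        plain = prompt[pos:m.start()].strip(" ,")
        if plain:
            chunks.append((plain, 1.0))
        chunks.append((m.group(1), float(m.group(2))))
        pos = m.end()
    tail = prompt[pos:].strip(" ,")
    if tail:
        chunks.append((tail, 1.0))
    return chunks

print(parse_prompt("dark, night, (1985 movie aesthetics:1.5), shot on film"))
# → [('dark, night', 1.0), ('1985 movie aesthetics', 1.5), ('shot on film', 1.0)]
```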
I imported the Stable Diffusion output image into my timeline and matched an original shot below (without any teal removal) to the Stable Diffusion output image.
Shot match tutorial:
Here's the before and after (I tweaked the contrast and coloring a bit after matching the shot):
So there you have it: Stable Diffusion can be used to generate retro-style images, which can then be used for grading in Resolve.
: )