
I'm trying to replicate Photoshop's image blending with CIFilters in Swift. Say I want to blend img1 and img2 with soft light; in Swift I can do:

import CoreImage.CIFilterBuiltins

let blendSoftLight = CIFilter.softLightBlendMode()
blendSoftLight.backgroundImage = img1
blendSoftLight.inputImage = img2

This works and gives the same result as Photoshop. But Photoshop also lets you blend with opacity, so if I want to blend img2 onto img1 with soft light at 50% opacity, is that possible with CIFilter in Swift?

I have tried different approaches, such as blending img1 and img2 as above and then blending the result with the original image using CIFilter.blendWithAlphaMask() and a 50% alpha mask. But the result is of course not the same, because blendWithAlphaMask only does a "normal" (source-over) blend; it doesn't use soft light.
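For reference, the attempt described above can be sketched like this (a sketch only; `softLightThenMask` is a hypothetical helper name, and it assumes img1 and img2 are CIImages sharing the same extent, with the 50% mask built as a constant-alpha color image):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical helper illustrating the attempt: soft-light blend at
// full strength, then composite the result over the background
// through a uniform 50%-alpha mask.
func softLightThenMask(img1: CIImage, img2: CIImage) -> CIImage? {
    let blend = CIFilter.softLightBlendMode()
    blend.backgroundImage = img1
    blend.inputImage = img2
    guard let blended = blend.outputImage else { return nil }

    // Uniform mask whose alpha is 0.5 everywhere.
    let halfMask = CIImage(color: CIColor(red: 0, green: 0, blue: 0, alpha: 0.5))
        .cropped(to: img1.extent)

    let mask = CIFilter.blendWithAlphaMask()
    mask.inputImage = blended      // foreground: the soft-light result
    mask.backgroundImage = img1    // background: the original image
    mask.maskImage = halfMask
    return mask.outputImage
}
```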

1 Answer

From the docs of the CISoftLightBlendMode filter, the only parameters it takes are inputImage and inputBackgroundImage; it doesn't appear to let you vary the opacity of the two images it blends. You might be able to apply an alpha channel to one image or the other before blending in order to change its opacity.
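A sketch of that idea, assuming Core Image's premultiplied-alpha convention (so all four channels are scaled, not just alpha) and using the built-in CIColorMatrix filter:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch: reduce the foreground to the given opacity before blending.
// Core Image works with premultiplied alpha, so RGB must be scaled
// along with alpha to get a true opacity change.
func softLightWithOpacity(background img1: CIImage,
                          foreground img2: CIImage,
                          opacity: CGFloat) -> CIImage? {
    let fade = CIFilter.colorMatrix()
    fade.inputImage = img2
    fade.rVector = CIVector(x: opacity, y: 0, z: 0, w: 0)
    fade.gVector = CIVector(x: 0, y: opacity, z: 0, w: 0)
    fade.bVector = CIVector(x: 0, y: 0, z: opacity, w: 0)
    fade.aVector = CIVector(x: 0, y: 0, z: 0, w: opacity)

    let blend = CIFilter.softLightBlendMode()
    blend.backgroundImage = img1
    blend.inputImage = fade.outputImage
    return blend.outputImage
}
```

Note that fading img2 toward transparent before the blend is not the same operation as Photoshop's layer opacity (Photoshop blends at full strength and then mixes the result with the background), so the output may still differ.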

  • You might also be able to use multiple filters to simulate the effect of using opacity on the layers in PS.
    – Duncan C
    Commented Oct 23, 2023 at 13:09
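One way to read that suggestion: for most blend modes, Photoshop's layer opacity is equivalent to computing the blend at 100% and then linearly mixing the result with the background. A per-channel sketch of that math, using the W3C/PDF soft-light formula (assumed here as an approximation of Photoshop's; they are close but not guaranteed identical):

```swift
import Foundation

// W3C compositing soft-light formula (channel values in 0...1).
func softLight(background b: Double, source s: Double) -> Double {
    if s <= 0.5 {
        return b - (1 - 2 * s) * b * (1 - b)
    } else {
        let d = b <= 0.25 ? ((16 * b - 12) * b + 4) * b : sqrt(b)
        return b + (2 * s - 1) * (d - b)
    }
}

// Layer opacity as a post-blend mix: blend at full strength, then
// interpolate between the background and the blended result.
func softLightWithOpacity(background b: Double, source s: Double,
                          opacity: Double) -> Double {
    let blended = softLight(background: b, source: s)
    return b * (1 - opacity) + blended * opacity
}
```

In Core Image the same post-blend mix could be built from existing filters, e.g. CIDissolveTransition with time = 0.5 between the background and the fully blended image. Also note that Core Image may blend in a linearized working space while Photoshop blends in the document's gamma-encoded space, which is a common source of "almost but not quite" mismatches.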
