Greedy-DiM: Greedy Algorithms for Unreasonably Effective Face Morphs

Clarkson University
IJCB 2024 (Spotlight)
Morph visual comparison

We propose Greedy-DiM, a novel method that uses a greedy strategy to perform identity-guided generation. Our proposed method achieves unrivaled morphing performance and visual fidelity, fooling the studied FR systems 100% of the time. Below we compare our morphs to those produced by other DiM methods.

Abstract

Morphing attacks are an emerging threat to state-of-the-art Face Recognition (FR) systems; they aim to create a single image that contains the biometric information of multiple identities. Diffusion Morphs (DiM) are a recently proposed morphing attack that has achieved state-of-the-art performance among representation-based morphing attacks. However, none of the existing research on DiMs has leveraged the iterative nature of DiMs, instead treating the DiM model as a black box, no differently than one would a Generative Adversarial Network (GAN) or Variational AutoEncoder (VAE). We propose a greedy strategy on the iterative sampling process of DiM models that searches for an optimal step guided by an identity-based heuristic function. We compare our proposed algorithm against ten other state-of-the-art morphing algorithms on the open-source SYN-MAD 2022 competition dataset. We find that our proposed algorithm is unreasonably effective, fooling all of the tested FR systems with an MMPMR of 100% and outperforming every other morphing algorithm compared.
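The MMPMR (Mated Morph Presentation Match Rate) reported above counts a morph as a success only if it matches *all* of its contributing subjects under the FR system's decision threshold. A minimal sketch of this metric (variable names are our own, not from the paper):

```python
import numpy as np

def mmpmr(similarities: np.ndarray, threshold: float) -> float:
    """Mated Morph Presentation Match Rate.

    similarities: (M, N) array where entry [m, n] is the FR similarity
    score between morph m and a probe image of contributing subject n.
    A morph succeeds only if it is accepted for *every* subject.
    """
    accepted = (similarities > threshold).all(axis=1)
    return float(accepted.mean())

# Toy example: 3 morphs, each made from 2 identities
S = np.array([[0.8, 0.7],   # accepted for both identities
              [0.9, 0.3],   # accepted for only one identity
              [0.6, 0.6]])  # accepted for both identities
print(mmpmr(S, threshold=0.5))  # 2 of 3 morphs succeed
```

An MMPMR of 100% means every morph in the test set was falsely accepted for both contributing identities.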

A face morphing attack attempts to fool a Face Recognition (FR) system by producing a single image that registers a false accept against both original identities. Earlier face morphing attacks used landmark-based methods, which align and warp the two faces before taking a pixel-wise average of them. Such techniques are prone to considerable artefacts, especially outside the center of the face. Conversely, Generative Adversarial Network (GAN)-based morphs produce more realistic-looking faces, but their effectiveness in fooling FR systems leaves much to be desired. DiMs outperform GAN-based morphs; however, they still lag behind landmark-based morphs at fooling an FR system. We propose Greedy-DiM*, a family of algorithms that leverages a simple greedy strategy to massively improve the effectiveness of DiM while retaining its high visual fidelity.
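The final step of the classical landmark-based pipeline is just a pixel-wise cross-dissolve of the two geometry-aligned faces; a minimal sketch (the preceding landmark-driven warp, e.g. Delaunay piecewise-affine, is assumed to have already been applied and is omitted):

```python
import numpy as np

def blend_aligned_faces(img_a: np.ndarray, img_b: np.ndarray,
                        alpha: float = 0.5) -> np.ndarray:
    """Pixel-wise average of two faces already warped to a shared geometry.

    Regions outside the landmark hull (hair, background) have no controlled
    correspondence, which is why such morphs show artefacts there.
    """
    blended = (alpha * img_a.astype(np.float64)
               + (1.0 - alpha) * img_b.astype(np.float64))
    return blended.astype(np.uint8)

# Toy usage: two flat-gray "faces" blend to the midpoint value
a = np.full((2, 2, 3), 100, dtype=np.uint8)
b = np.full((2, 2, 3), 200, dtype=np.uint8)
m = blend_aligned_faces(a, b)  # every pixel becomes 150
```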

Example morphed image produced by Greedy-DiM

Proposed Face Morphing Method (Greedy-DiM*)

We propose a simple greedy strategy applied to the steps of the numerical ODE solver used in diffusion models. At each timestep, Greedy-DiM* performs a search via gradient descent to find the optimal predicted noise w.r.t. some heuristic function. This simple greedy strategy allows us to cheaply incorporate identity guidance while massively improving the effectiveness of the created morphs. Moreover, we prove that Greedy-DiM takes the correct step at each timestep, i.e., Greedy-DiM is globally optimal.
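The per-timestep search can be sketched as follows. This is an illustrative PyTorch sketch, not the paper's implementation: `heuristic` stands in for a differentiable identity loss (e.g. distance between FR embeddings of the implied clean image and the two source identities), and the update shown is a standard deterministic DDIM transition.

```python
import torch

def greedy_ddim_step(x_t, eps_init, alpha_t, alpha_prev,
                     heuristic, n_opt=10, lr=0.01):
    """One greedy step: refine the predicted noise by gradient descent
    on an identity heuristic before taking the DDIM update."""
    eps = eps_init.clone().requires_grad_(True)
    opt = torch.optim.Adam([eps], lr=lr)
    for _ in range(n_opt):
        # Clean-image estimate implied by the current noise prediction
        x0 = (x_t - (1 - alpha_t).sqrt() * eps) / alpha_t.sqrt()
        loss = heuristic(x0)  # hypothetical identity-based loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    eps = eps.detach()
    x0 = (x_t - (1 - alpha_t).sqrt() * eps) / alpha_t.sqrt()
    # Deterministic DDIM transition to the previous timestep
    return alpha_prev.sqrt() * x0 + (1 - alpha_prev).sqrt() * eps
```

Because the refinement touches only the predicted noise at the current step, identity guidance is folded into sampling at the cost of a few extra gradient evaluations per timestep.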

Overview of Greedy-DiM method

Overview of DiM Overview of a single step of the Greedy-DiM* algorithm. Proposed changes highlighted in green.

Comparison of DiM methods

Comparison of DiM Algorithms Comparison of all current DiM methods to Greedy-DiM.

Justification for the Unreasonable Performance of Greedy-DiM

The search space of Greedy-DiM* is well-posed, whereas those of previous attempts are ill-posed. We prove that the probability that the search space of Morph-PIPE contains the optimal solution is 0, whereas the probability that the search space of Greedy-DiM* contains the optimal solution is 1.
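The intuition behind this dichotomy can be sketched as follows (notation here is illustrative, not verbatim from the paper): Morph-PIPE selects among finitely many candidate blends, a finite and hence Lebesgue-null subset of the ambient space, while Greedy-DiM* optimizes the noise prediction over the full space.

```latex
% S_PIPE: finite candidate set from N blend weights; S_Greedy: full space
\mathcal{S}_{\mathrm{PIPE}} = \{\, x_0(w_k) \,\}_{k=1}^{N} \subset \mathbb{R}^n,
\qquad \mu\bigl(\mathcal{S}_{\mathrm{PIPE}}\bigr) = 0,
\qquad \mathcal{S}_{\mathrm{Greedy}} = \mathbb{R}^n
```

So if the optimal solution is distributed absolutely continuously w.r.t. Lebesgue measure, it lies in a measure-zero set with probability 0 and in the full space with probability 1.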

Illustration of DiM search spaces

Comparison of Search Spaces Illustration of the search space of different DiM algorithms. Purple denotes Morph-PIPE/Greedy-DiM-S, red denotes Greedy-DiM-S continuous, and green denotes Greedy-DiM*. Note that the search spaces of all algorithms other than Greedy-DiM* lie on a low-dimensional manifold.

Poster

BibTeX


@inproceedings{blasingame_greedy_dim,
  title={Greedy-DiM: Greedy Algorithms for Unreasonably Effective Face Morphs},
  booktitle={2024 IEEE International Joint Conference on Biometrics (IJCB)},
  author={Blasingame, Zander W. and Liu, Chen},
  year={2024},
  pages={1--10},
  url={https://arxiv.org/abs/2404.06025},
}