improve quality when there is a lot of movement

  • #7942
    deepfakeclub
    Participant

      Is there a way to improve the quality of the deepfake when there is a lot of motion?
      1. Sometimes the deepfake comes out really messy, with faces flipping upside down, etc.
      2. Sometimes the face doesn't even get picked up during extraction, so the original face just shows through when the deepfake is complete.
      3. When the data_dst face turns, the deepfake also gets really messy sometimes. I know this is because of the source material, but is there another way to keep it from getting messy and losing quality / looking less realistic?

      #8026
      deepanton
      Participant

        I can't say anything for sure, but I have a few suggestions, if I understood the problem you are having:
        – Extract at 12 fps instead of 24 to get rid of more of the blurry or bad frames, and also delete the bad src face extracts (a small culling script like the sketch after this list can help).
        – If it is motion blur, there may be a model collapse issue because you chose the wrong settings for the job, or the src set is not good or varied enough, or (small chance) some files in the internal folder got corrupted or some Python files were deleted.
        – If the original face is visible in the result video after merging, it is probably not trained enough, which means you should train more. If it still shows up, one workaround is to delete that one bad frame out of the 24 per second in Premiere Pro and stretch the neighbouring frames to cover it.
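
        If you want to automate culling the blurry src faces instead of picking them out by hand, a rough Python sketch like this can flag them by Laplacian sharpness. This is not a DeepFaceLab tool, just an illustration; the paths and the 100.0 threshold are examples you will need to adjust for your own workspace.

        import cv2
        import os
        import shutil

        SRC_DIR = "workspace/data_src/aligned"          # example path to the extracted src faces
        TRASH_DIR = "workspace/data_src/aligned_trash"  # example folder for the rejects
        THRESHOLD = 100.0                               # example cutoff, tune it by eye on your own set

        os.makedirs(TRASH_DIR, exist_ok=True)

        for name in os.listdir(SRC_DIR):
            if not name.lower().endswith((".jpg", ".png")):
                continue
            path = os.path.join(SRC_DIR, name)
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if img is None:
                continue
            # variance of the Laplacian: low values usually mean a blurry image
            sharpness = cv2.Laplacian(img, cv2.CV_64F).var()
            if sharpness < THRESHOLD:
                shutil.move(path, os.path.join(TRASH_DIR, name))
                print(f"moved blurry face {name} (score {sharpness:.1f})")

        Go through the moved files before deleting them for good; very dark or heavily zoomed faces can score low without actually being blurry.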
