Training a LoRA with Kohya on an unnamed third-party GPU platform

Plenty of problems along the way...

I recommend installing torch and xformers manually. Letting the install script pull them on a machine inside mainland China is a complete waste of time, unless your platform provides a download accelerator or mirror.
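
For reference, a manual install looks roughly like this; the CUDA tag and the mirror index are only examples and should be matched to your platform's driver, and xformers should be pinned to a build that matches your torch version:

pip install torch torchvision --index-url https://download.pytorch.org/whl/cu118
pip install xformers -i https://pypi.tuna.tsinghua.edu.cn/simple
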
The base model is Pony Realism. Civitai cannot be reached directly from mainland China, so the checkpoint has to come through a domestic mirror site; I used: https://www.liblib.art/modelinfo/d172a234d21d48e29b81ed61eb3dfd65?from=search
The same goes for clip-vit-large-patch14 and CLIP-ViT-bigG-14-laion2B-39B-b160k: they also have to be fetched through a domestic mirror, in this case ModelScope:
https://modelscope.cn/models/AI-ModelScope/clip-vit-large-patch14/files
https://modelscope.cn/models/AI-ModelScope/CLIP-ViT-bigG-14-laion2B-39B-b160k/files
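
If you would rather script the download than click through the web UI, the ModelScope SDK (pip install modelscope) can pull both repos. This is only a sketch; cache_dir is just an example location, and the printed directories are what sdxl_train_util.py gets pointed at below.

from modelscope import snapshot_download

# cache_dir is an arbitrary example; ModelScope creates one subfolder per model inside it
for model_id in [
    "AI-ModelScope/clip-vit-large-patch14",
    "AI-ModelScope/CLIP-ViT-bigG-14-laion2B-39B-b160k",
]:
    local_dir = snapshot_download(model_id, cache_dir="/root/lanyun-tmp/lora-scripts/huggingface/hub/models")
    print(model_id, "->", local_dir)
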
After downloading, the files can be placed anywhere; then edit sdxl_train_util.py to point at the local paths (for SD 1.5, edit train_util.py instead).

# argparse, os, logging and CLIPTokenizer are already imported at the top of
# sdxl_train_util.py; they are repeated here only so the snippet stands on its own.
import argparse
import logging
import os

from transformers import CLIPTokenizer

logger = logging.getLogger(__name__)


def load_tokenizers(args: argparse.Namespace):
    logger.info("Preparing tokenizers")

    # Replace the original Hugging Face repo IDs with the local directories the
    # downloads were unpacked into. SDXL uses two tokenizers: the first entry is
    # clip-vit-large-patch14, the second is CLIP-ViT-bigG-14-laion2B-39B-b160k
    # (adjust the second path to wherever you actually placed those files).
    original_paths = [
        "/root/lanyun-tmp/lora-scripts/huggingface/hub/models/clip-vit-large-patch14",
        "/root/lanyun-tmp/lora-scripts/huggingface/hub/models/CLIP-ViT-bigG-14-laion2B-39B-b160k",
    ]
    tokenizers = []

    for i, original_path in enumerate(original_paths):
        tokenizer = None

        # Check whether a cached copy of the tokenizer exists
        if args.tokenizer_cache_dir:
            local_tokenizer_path = os.path.join(args.tokenizer_cache_dir, original_path.replace("/", "_"))
            if os.path.exists(local_tokenizer_path):
                logger.info(f"Loading tokenizer from cache: {local_tokenizer_path}")
                tokenizer = CLIPTokenizer.from_pretrained(local_tokenizer_path)

        # If the tokenizer was not loaded from cache, load it from the local path above
        if tokenizer is None:
            logger.info(f"Loading tokenizer from: {original_path}")
            tokenizer = CLIPTokenizer.from_pretrained(original_path)

        # Save the tokenizer to the cache directory if requested and not already cached
        if args.tokenizer_cache_dir and not os.path.exists(local_tokenizer_path):
            logger.info(f"Saving tokenizer to cache: {local_tokenizer_path}")
            tokenizer.save_pretrained(local_tokenizer_path)

        # Fix the pad token ID of the second tokenizer to match the OpenCLIP tokenizer
        if i == 1:
            tokenizer.pad_token_id = 0

        tokenizers.append(tokenizer)

    # Log the effective max token length if one was passed on the command line
    if hasattr(args, "max_token_length") and args.max_token_length is not None:
        logger.info(f"Updating token length: {args.max_token_length}")

    return tokenizers
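
Before launching training, a quick sanity check that both local directories actually load as tokenizers saves a failed run later (same paths as above; adjust the second one to wherever the bigG files ended up):

from transformers import CLIPTokenizer

for path in [
    "/root/lanyun-tmp/lora-scripts/huggingface/hub/models/clip-vit-large-patch14",
    "/root/lanyun-tmp/lora-scripts/huggingface/hub/models/CLIP-ViT-bigG-14-laion2B-39B-b160k",
]:
    tok = CLIPTokenizer.from_pretrained(path)  # local directory, no network access needed
    print(path, "vocab size:", len(tok))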
