HuggingFace Study Notes

December 2, 2023 — from_pretrained(Langboat/bloom-1b4-zh, low_cpu_mem_usage=True). # With BitFit, only the parameters that carry a bias are trained: for name, param in model.named_parameters ...
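The snippet cuts off at the named_parameters loop. A minimal BitFit sketch under the same setup (the parameter count and print line are illustrative, not from the snippet):

```python
# BitFit: freeze every parameter except the bias terms, then train as usual.
# A minimal sketch assuming the Langboat/bloom-1b4-zh checkpoint from the snippet.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Langboat/bloom-1b4-zh", low_cpu_mem_usage=True
)

num_trainable = 0
for name, param in model.named_parameters():
    if "bias" not in name:
        param.requires_grad = False  # freeze weights; only biases stay trainable
    else:
        num_trainable += param.numel()

print(f"bias-only trainable parameters: {num_trainable}")
```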

Bloom 1b4 Zh By Langboat

https://llm.extractum.io

Details and insights about Bloom 1b4 Zh LLM by Langboat: benchmarks, internals, and performance insights. Features: 1.4b LLM, VRAM: 5.6GB, ...

Langboat/bloom-1b4-zh

https://huggingface.co

This model is based on bigscience/bloom-1b7. We pruned its vocabulary from 250880 to 46145 with Chinese corpus to reduce GPU memory usage.
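A quick way to verify the pruning the card describes, sketched with the tokenizers of the pruned and original checkpoints (the sizes in the comments are the ones quoted in these results):

```python
# Compare the pruned Chinese vocabulary against the multilingual original.
from transformers import AutoTokenizer

pruned = AutoTokenizer.from_pretrained("Langboat/bloom-1b4-zh")
original = AutoTokenizer.from_pretrained("bigscience/bloom-1b7")

print(len(pruned))    # ~46145 after pruning with a Chinese corpus
print(len(original))  # 250880 in the multilingual release
```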

Langboat/bloom-389m-zh

https://huggingface.co

This model is based on bigscience/bloom-560m. We pruned its vocabulary from 250880 to 42437 with Chinese corpus to reduce GPU memory usage.

Langboat bloom-389m-zh

http://zoo.bimant.com

This model is based on bigscience/bloom-560m. We pruned its vocabulary from 250880 to 42437 with Chinese corpus to reduce GPU memory usage.

Langboat Mengzi

https://github.com

A BLOOM model pruned from the multilingual release with a Chinese corpus, reducing GPU-memory requirements, HuggingFace. BLOOM-1b4-zh, 1400M, text-continuation tasks, a BLOOM model pruned from the multilingual release with a Chinese corpus ...
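A minimal text-continuation sketch for the BLOOM-1b4-zh entry above; the prompt and generation settings are illustrative:

```python
# Generate a short Chinese continuation with the pruned 1.4B checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-1b4-zh")
model = AutoModelForCausalLM.from_pretrained(
    "Langboat/bloom-1b4-zh", low_cpu_mem_usage=True
)

inputs = tokenizer("中国的首都是", return_tensors="pt")  # "The capital of China is"
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```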

P-Tuning and Prefix

https://blog.csdn.net

from_pretrained(Langboat/bloom-1b4-zh, low_cpu_mem_usage=True). peft_model = PeftModel.from_pretrained(model = model, model_id = /tmp_1203 ...
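The snippet truncates the adapter path (/tmp_1203 ...). A sketch of the same PEFT loading pattern, with a hypothetical adapter directory standing in for the truncated one:

```python
# Load a trained P-Tuning/Prefix adapter on top of the frozen base model.
from peft import PeftModel
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Langboat/bloom-1b4-zh", low_cpu_mem_usage=True
)
# model_id is the directory the adapter was saved to; "./adapter_checkpoint"
# is a placeholder for the truncated /tmp_1203 path in the snippet.
peft_model = PeftModel.from_pretrained(model=model, model_id="./adapter_checkpoint")
peft_model.eval()
```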

README.md

https://github.com

BLOOM-389m-zh, 389M, text-continuation tasks, a BLOOM model pruned from the multilingual release with a Chinese corpus, reducing GPU-memory requirements, HuggingFace. BLOOM-800m-zh, 800M, text-continuation tasks, based on ...

Multimodal Large Models -

https://zhuanlan.zhihu.com

/tokenize/Langboat/bloom-389m-zh)
def process_func(example):
    MAX_LENGTH = 256
    input_ids, attention_mask, labels = [], [], []
    instruction = tokenizer( ...
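The call to tokenizer() is cut off. A hedged sketch of how such a preprocessing function is commonly completed for causal-LM instruction tuning; the Human:/Assistant: prompt template and the instruction/input/output field names are assumptions, not from the snippet:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-389m-zh")

def process_func(example):
    MAX_LENGTH = 256
    # Hypothetical schema: instruction/input/output fields and a chat template.
    prompt = "Human: " + example["instruction"] + "\n" + example["input"] + "\n\nAssistant: "
    instruction = tokenizer(prompt)
    response = tokenizer(example["output"] + tokenizer.eos_token)
    input_ids = instruction["input_ids"] + response["input_ids"]
    attention_mask = instruction["attention_mask"] + response["attention_mask"]
    # Loss is computed only on the response: prompt positions are masked with -100.
    labels = [-100] * len(instruction["input_ids"]) + response["input_ids"]
    return {
        "input_ids": input_ids[:MAX_LENGTH],
        "attention_mask": attention_mask[:MAX_LENGTH],
        "labels": labels[:MAX_LENGTH],
    }
```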