Langboat bloom: HuggingFace Study Notes
Bloom 1b4 Zh By Langboat
https://llm.extractum.io
Details and insights about the Bloom 1b4 Zh LLM by Langboat: benchmarks, internals, and performance. Features: 1.4B LLM, VRAM: 5.6 GB, ...
Langboat/bloom-1b4-zh
https://huggingface.co
This model is based on bigscience/bloom-1b7. We pruned its vocabulary from 250880 to 46145 with a Chinese corpus to reduce GPU memory usage.
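As a quick check of the pruning claim, a minimal sketch (not from any of the sources above) that loads the pruned model with Transformers and prints its vocabulary size; the model id and the expected size of 46145 come from the snippet above.

```python
# Minimal sketch: load the pruned Chinese BLOOM model and inspect its vocabulary.
# Model id and expected vocabulary size (46145 vs. the original 250880) are taken
# from the Langboat/bloom-1b4-zh snippet above.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-1b4-zh")
model = AutoModelForCausalLM.from_pretrained("Langboat/bloom-1b4-zh", low_cpu_mem_usage=True)

# A smaller vocabulary shrinks the input and output embedding matrices,
# which is where the GPU memory saving comes from.
print(model.config.vocab_size)  # expected: 46145
print(len(tokenizer))           # tokenizer size after pruning
```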
Langboat/bloom-389m-zh
https://huggingface.co
This model is based on bigscience/bloom-560m. We pruned its vocabulary from 250880 to 42437 with a Chinese corpus to reduce GPU memory usage.
Langboat/bloom-389m-zh
http://zoo.bimant.com
This model is based on bigscience/bloom-560m. We pruned its vocabulary from 250880 to 42437 with a Chinese corpus to reduce GPU memory usage.
Langboat/Mengzi
https://github.com
A BLOOM model pruned from the multilingual version using a Chinese corpus, reducing GPU memory requirements, HuggingFace. BLOOM-1b4-zh, 1400M, text-continuation tasks, a BLOOM model pruned from the multilingual version using a Chinese corpus ...
P-Tuning and Prefix-Tuning
https://blog.csdn.net
from_pretrained("Langboat/bloom-1b4-zh", low_cpu_mem_usage=True). peft_model = PeftModel.from_pretrained(model=model, model_id="/tmp_1203 ...
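The CSDN snippet is truncated; below is a sketch of the same pattern, assuming a base model plus a locally saved P-Tuning/Prefix-Tuning adapter. The adapter path is a placeholder, not the article's actual /tmp_1203 checkpoint.

```python
# Sketch: attach trained P-Tuning / Prefix-Tuning weights to the base model with PEFT.
# The adapter directory below is a placeholder for whatever your own training run produced.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-1b4-zh")
model = AutoModelForCausalLM.from_pretrained("Langboat/bloom-1b4-zh", low_cpu_mem_usage=True)

# model_id points at the saved adapter checkpoint (e.g. the /tmp_1203... path in the snippet above).
peft_model = PeftModel.from_pretrained(model=model, model_id="./path/to/adapter")
peft_model.eval()
```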
README.md
https://github.com
BLOOM-389m-zh, 389M, text-continuation tasks, a BLOOM model pruned from the multilingual version using a Chinese corpus, reducing GPU memory requirements, HuggingFace. BLOOM-800m-zh, 800M, text-continuation tasks, based on ...
Multimodal Large Models -
https://zhuanlan.zhihu.com
/tokenize/Langboat/bloom-389m-zh) def process_func(example): MAX_LENGTH = 256 input_ids, attention_mask, labels = [], [], [] instruction = tokenizer( ...
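The Zhihu snippet cuts off mid-call; here is a sketch of a typical process_func for instruction tuning in that style, using the Langboat/bloom-389m-zh tokenizer. The prompt template and the "instruction"/"input"/"output" field names are assumptions, not taken from the original article.

```python
# Sketch of an instruction-tuning preprocessing function in the style of the truncated snippet.
# Dataset field names and the Human/Assistant prompt template are assumed, not from the source.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-389m-zh")

def process_func(example):
    MAX_LENGTH = 256
    # Encode the prompt and the response separately so prompt tokens can be masked out of the loss.
    instruction = tokenizer("Human: " + example["instruction"] + example["input"] + "\n\nAssistant: ")
    response = tokenizer(example["output"] + tokenizer.eos_token)
    input_ids = instruction["input_ids"] + response["input_ids"]
    attention_mask = instruction["attention_mask"] + response["attention_mask"]
    # -100 tells the loss function to ignore the prompt tokens.
    labels = [-100] * len(instruction["input_ids"]) + response["input_ids"]
    # Truncate all three sequences to the same maximum length.
    if len(input_ids) > MAX_LENGTH:
        input_ids = input_ids[:MAX_LENGTH]
        attention_mask = attention_mask[:MAX_LENGTH]
        labels = labels[:MAX_LENGTH]
    return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}
```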