GLM-4 series: Open Multilingual Multimodal Chat LMs
Updated Aug 5, 2024 · Python
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
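The "change a single line of code" claim rests on Xinference exposing an OpenAI-compatible HTTP API, so a client only needs to point its base URL at the Xinference server. A minimal stdlib-only sketch of such a request follows; the base URL, port, and model name (`glm4-chat`) are assumptions for illustration, not values confirmed by this page.

```python
import json
from urllib import request

# Assumptions: a local Xinference server on its default-style port,
# serving a model registered under the name "glm4-chat".
BASE_URL = "http://localhost:9997/v1"
MODEL = "glm4-chat"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-style /chat/completions request aimed at Xinference.

    Swapping between OpenAI and Xinference amounts to changing BASE_URL
    (and the model name); the payload shape stays the same.
    """
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a request; sending it requires a running server.
req = build_chat_request("Hello")
```

Because the request body is standard OpenAI chat-completions JSON, an existing app can usually keep its client code and override only the endpoint it talks to.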
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4
Use PEFT or Full-parameter to finetune 300+ LLMs or 50+ MLLMs. (Qwen2, GLM4v, Internlm2.5, Yi, Llama3.1, Llava-Video, Internvl2, MiniCPM-V-2.6, Deepseek, Baichuan2, Gemma2, Phi3-Vision, ...)
Access the GLM-4-translated PDF e-book in the specified language as a Markdown (MD) file using the SDK method.