GLM-130B: An open, high-performance 130B-parameter bilingual transformer for research and real-world NLP
GLM-130B is an open 130-billion-parameter bilingual (English and Chinese) transformer-based General Language Model from Tsinghua's Knowledge Engineering Group (THUDM/THUKEG), released for academic research and, under its license, certain commercial uses. Pre-trained on over 400 billion tokens with the GLM (autoregressive blank-infilling) objective, it delivers strong results across English and Chinese NLP benchmarks and ships with downloadable checkpoints and inference code for efficient multi-GPU, mixed-precision serving.
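For orientation, below is a minimal sketch of how a GLM-family checkpoint can be queried through Hugging Face transformers. Note the assumptions: the GLM-130B weights themselves are obtained by application and served with the scripts in the official THUDM/GLM-130B repository, so the smaller THUDM/glm-10b checkpoint stands in here purely for illustration, following that model's published blank-infilling usage.

```python
# Minimal sketch: prompting a GLM-family model via Hugging Face transformers.
# Assumptions: the smaller THUDM/glm-10b checkpoint stands in for GLM-130B
# (whose weights are distributed separately via the THUDM/GLM-130B repo),
# and a CUDA GPU with sufficient memory is available.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model = model.half().cuda()  # fp16 weights to roughly halve GPU memory use
model.eval()

# GLM is pre-trained with autoregressive blank infilling: [MASK] marks the
# span the model should fill in.
inputs = tokenizer(
    "GLM-130B is a bilingual language model developed by [MASK].",
    return_tensors="pt",
)
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=512)
inputs = inputs.to("cuda")
outputs = model.generate(**inputs, max_length=512, eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))
```

For the full 130B model, the official THUDM/GLM-130B repository additionally documents INT8 and INT4 quantized inference, which brings serving within reach of commodity multi-GPU servers (e.g., 4x RTX 3090 24 GB for INT4).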