We develop and align the GLM family of models, including ChatGLM2-6/12/32/66/130B, CodeGeeX2, and VisualGLM. Both our open-source ChatGLM-6B model and its successor, ChatGLM2-6B, have reached #1 on GitHub Trending and Hugging Face Trending (28 days), and together they have been downloaded more than five million times on Hugging Face within three months. We warmly invite talented people to join us in our ambitious mission of “teaching machines to think like humans”. We are currently hiring across several fields, including Natural Language Processing (NLP), Vision, Multi-Modality Research, and Data Science and Engineering, and we welcome applicants at all levels of expertise.
NLP Researcher: responsible for the pre-training, alignment, instruction tuning, RLHF, and evaluation of large language models. The ideal candidate will:
Hold either a Ph.D., or a master’s degree with a minimum of two years’ experience in a research role.
Possess strong coding skills, with proficiency in Python and C++, experience with Git, and familiarity with at least one common training framework such as PyTorch or TensorFlow.
Have a proven record of published work in the relevant domain at leading conferences such as KDD, ICML, NeurIPS, ICLR, or ACL.
Additional Preferred Qualifications:
Creator of an influential open-source project.
Achievements or awards from competitions in the areas of algorithms, machine learning, or natural language processing.
Join us as we continue to innovate and redefine the landscape of artificial intelligence.