Beijing Institute of Mathematical Sciences and Applications

BIMSA Computational Math Seminar
A Journey of Large Foundation Models in Science: Language, Multi-Modality and Agent
Organizers
Pipi Hu , Xin Liang , Zhiting Ma , Hamid Mofidi , Axel G.R. Turnquist , Li Wang , Fansheng Xiong , Shuo Yang , Wuyue Yang
Speaker
Renqian Luo
Time
Thursday, October 23, 2025 1:00 PM - 2:00 PM
Venue
Online
Zoom 518 868 7656 (BIMSA)
Abstract
Large foundation models are transforming the landscape of scientific discovery by bridging natural language understanding, multimodal perception, and autonomous reasoning. This talk presents Dr. Luo's work on large pretrained language models in science, with a focus on their applications across biomedicine, materials science, and molecular design. Beginning with BioGPT, an early domain-specific generative transformer for biomedical text mining, we introduce the development of NatureLM, a generalist foundation model covering diverse scientific domains, and analyze how scaling laws and prompt-based adaptation enable cross-domain generalization. We then turn to multimodality in science and introduce UniGenX, a unified multimodal framework that represents both symbolic and numerical scientific data as sequences, supporting high-precision generation of 3D molecular and material structures. Finally, we discuss the Science Copilot Agent, a multi-agent system that integrates large models, scientific computation, and human feedback to assist end-to-end scientific workflows, from hypothesis generation to experimental design. Together, these advances illustrate the emerging paradigm of AI-accelerated science, in which foundation models act not only as language models but as universal scientific engines.
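
As a purely illustrative aside (not part of the talk materials), the minimal sketch below shows how a domain-specific generative transformer of the BioGPT kind can be prompted for biomedical text generation. It assumes the Hugging Face transformers library and the publicly released microsoft/biogpt checkpoint; the prompt text and decoding settings are arbitrary choices for demonstration.

    # Minimal sketch: prompting a BioGPT-style biomedical language model.
    # Assumes the Hugging Face `transformers` library and the public
    # `microsoft/biogpt` checkpoint; prompt and decoding settings are illustrative.
    from transformers import BioGptForCausalLM, BioGptTokenizer

    tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
    model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

    # A biomedical prompt; the model continues it with domain-flavored text.
    inputs = tokenizer("COVID-19 is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40, num_beams=5, early_stopping=True)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same prompt-based interface carries over to the larger generalist models discussed in the talk, where the domain is selected by the prompt rather than by a separately trained checkpoint.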
Speaker Intro
Dr. Renqian Luo was a Senior Researcher at Microsoft Research. He obtained his PhD from the University of Science and Technology of China in 2021, in a joint program with Microsoft Research Asia. He has devoted himself to research and development in artificial intelligence, spanning machine learning, deep learning, natural language processing, large language models, and foundation models. He has published many papers in top conferences and journals such as NeurIPS, ICML, EMNLP, SIGKDD, ICASSP, and Briefings in Bioinformatics, with 4000+ citations. His representative works include BioGPT, MedPrompt, NAO, NAS-BERT, and LightSpeech. BioGPT and MedPrompt have been widely reported and have had a high impact on the community.