Recurrent neural networks such as long short-term memory (LSTM) networks have been used to model and predict the dynamics of complex stochastic molecular systems. Previous studies have shown that the Transformer mitigates the memory loss that LSTMs suffer on long sequences and outperforms the LSTM on many natural language processing tasks. In this seminar, we will present an implementation of the Transformer for learning molecular dynamics and compare it with the LSTM, whose performance is strongly affected by the choice of lag time.
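(For readers unfamiliar with the mechanism the talk builds on: the Transformer replaces step-by-step recurrence with self-attention, in which every time step can attend directly to every other, so distant frames are not forgotten. The sketch below is a minimal NumPy illustration of scaled dot-product self-attention over a toy trajectory; all variable names and sizes are our own illustrative choices, not material from the talk.)

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (T, d) sequence of T frames with d features each.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # (T, T) score matrix: every frame attends to every other frame,
    # regardless of how far apart they are in time.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
T, d = 16, 8  # toy trajectory: 16 time steps, 8 features per frame
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one d-dimensional output per time step
```

In contrast, an LSTM must propagate information through every intermediate step, which is where the long-sequence memory loss mentioned above arises.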

When
3 May 2021
3pm - 4pm
Where
https://hkust.zoom.com.cn/j/6218914432 (Passcode: hkust)
Speakers/Performers
Miss Wenqi ZENG
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English