Linking across Multimodal Sensors with LLM Generative Agents
In this project, we expect the student to develop an evaluation setup to examine the capability of our proposed LLM-based model to interpret and extract insights from physical activity and physiological data.
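The exact evaluation protocol is left to the student; as a minimal sketch (assuming windowed accelerometer and heart-rate data with activity labels, and a generic `query_llm` callable standing in for whatever model or API is ultimately used), an evaluation loop might look like:

```python
# Hypothetical sketch of an evaluation loop for an LLM interpreting sensor data.
# `query_llm` is a placeholder for the project's actual model or API client.
from dataclasses import dataclass
from typing import Callable, List, Sequence

@dataclass
class SensorWindow:
    accel: Sequence[float]       # accelerometer magnitude over a fixed-length window
    heart_rate: Sequence[float]  # heart rate (bpm) over the same window
    label: str                   # ground-truth activity, e.g. "walking"

def window_to_prompt(w: SensorWindow) -> str:
    """Serialise a sensor window into a text prompt the LLM can read."""
    return (
        "Accelerometer magnitude: " + ", ".join(f"{v:.2f}" for v in w.accel) + "\n"
        "Heart rate (bpm): " + ", ".join(f"{v:.0f}" for v in w.heart_rate) + "\n"
        "Which activity best describes this window? "
        "Answer with one word (e.g. walking, running, sitting)."
    )

def evaluate(windows: List[SensorWindow], query_llm: Callable[[str], str]) -> float:
    """Return the LLM's label accuracy over a set of annotated windows."""
    correct = 0
    for w in windows:
        prediction = query_llm(window_to_prompt(w)).strip().lower()
        correct += prediction == w.label.lower()
    return correct / len(windows)
```

Per-class F1, or checks on the quality of the model's free-text explanations, could be layered on top of this skeleton.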
Paper
* COCOA: Cross Modality Contrastive Learning for Sensor Data
* TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series
https://arxiv.org/abs/2308.08241
This work summarizes two strategies for completing time-series (TS) tasks with today's large language models (LLMs): LLM-for-TS, which designs and trains a fundamental large model for TS data, and TS-for-LLM, which enables a pre-trained LLM to handle TS data.
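The paper's exact architecture is not reproduced here; as a rough illustration of the TS-for-LLM direction (assuming a frozen decoder-only LLM with embedding dimension 768 and a hypothetical lightweight encoder), the idea is to map time-series windows into the LLM's embedding space and prepend them as soft prompts, training only the encoder:

```python
import torch
import torch.nn as nn

class TSToSoftPrompt(nn.Module):
    """Toy TS-for-LLM encoder: maps a univariate time-series window to a few
    'soft tokens' living in the frozen LLM's embedding space."""
    def __init__(self, window_len: int, llm_embed_dim: int, n_soft_tokens: int = 4):
        super().__init__()
        self.n_soft_tokens = n_soft_tokens
        self.llm_embed_dim = llm_embed_dim
        self.proj = nn.Sequential(
            nn.Linear(window_len, 256),
            nn.GELU(),
            nn.Linear(256, n_soft_tokens * llm_embed_dim),
        )

    def forward(self, ts: torch.Tensor) -> torch.Tensor:
        # ts: (batch, window_len) -> (batch, n_soft_tokens, llm_embed_dim)
        out = self.proj(ts)
        return out.view(ts.size(0), self.n_soft_tokens, self.llm_embed_dim)

# The soft tokens would be concatenated with the embeddings of a text prompt and
# passed through the frozen LLM; only the encoder (and, in TEST, the alignment
# with text prototypes, e.g. via a contrastive loss) is trained.
encoder = TSToSoftPrompt(window_len=128, llm_embed_dim=768)
soft_tokens = encoder(torch.randn(8, 128))  # shape: (8, 4, 768)
```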