Time Series Initiative

Advancing Time Series Analysis with Modern AI

We develop cutting-edge methods for time series classification and forecasting, leveraging multimodal language modeling, self-supervised learning, and reasoning-based approaches to push the boundaries of temporal data understanding.

21 Research Papers
🏆 Best of WSDM Paper Award
Top-tier Venues: ICML, NeurIPS, KDD, WSDM, WWW, CIKM, IJCAI, ACM TIST
2023–2026 Research Timeline

Research Projects

Time Series Classification & Forecasting

Our research spans multiple directions: multimodal language modeling for classification, reasoning-based forecasting, self-supervised representation learning, and comprehensive surveys.

Time Series Classification

InstructTime++

InstructTime++ · Preprint

Extends InstructTime with implicit feature enhancement via statistical feature extraction and vision-language-based image captioning for superior classification performance.

Paper →
InstructTime

InstructTime · WSDM 2025 (Best of WSDM)

Reformulates time series classification as a multimodal generative task, converting continuous sequences into discrete tokens and leveraging pre-trained language models.

Paper →
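A minimal sketch of the core idea behind this direction: quantize a continuous series into discrete tokens a language model can consume. The uniform binning and prompt format below are illustrative assumptions, not InstructTime's actual tokenizer.

```python
import numpy as np

def tokenize_series(x: np.ndarray, n_bins: int = 256) -> list[int]:
    """Map a 1-D series to integer tokens via uniform quantization."""
    x = (x - x.min()) / (x.max() - x.min() + 1e-8)        # scale to [0, 1]
    return np.minimum((x * n_bins).astype(int), n_bins - 1).tolist()

series = np.sin(np.linspace(0, 4 * np.pi, 64))            # toy input
tokens = tokenize_series(series)
prompt = f"Classify this series: {' '.join(map(str, tokens))}"
```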
HiTime

HiTime · ACM TIST

Hierarchical multimodal LLMs with semantic space alignment for time series classification: hierarchical sequence encoding, coarse-to-fine cross-modal alignment, and generative classification via parameter-efficient fine-tuning.

Paper →
TableTime

TableTime · CIKM 2025

Reformulates multivariate time series classification as zero-shot table understanding via large language models, minimizing information loss through tabular representation.

Paper →
TimeMAE

TimeMAE · WSDM 2026

Self-supervised representation learning with decoupled masked autoencoders, learning enriched contextual representations through bidirectional encoding over sub-series.

Paper →
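A minimal sketch of masked modeling over sub-series, assuming PyTorch: patches are embedded, a random subset is masked, a bidirectional encoder reconstructs them, and the loss covers only masked positions. TimeMAE's decoupled encoders and discrete codeword targets are omitted.

```python
import torch
import torch.nn as nn

patch_len, d_model, mask_ratio = 8, 64, 0.6
x = torch.randn(32, 128)                          # (batch, time)
patches = x.unfold(1, patch_len, patch_len)       # (batch, 16, patch_len)

embed = nn.Linear(patch_len, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
decode = nn.Linear(d_model, patch_len)

tokens = embed(patches)
mask = torch.rand(tokens.shape[:2]) < mask_ratio  # which patches to hide
tokens = tokens.masked_fill(mask.unsqueeze(-1), 0.0)
recon = decode(encoder(tokens))                   # bidirectional encoding
loss = ((recon - patches) ** 2)[mask].mean()      # loss on masked patches only
```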
CrossTimeNet

CrossTimeNet · WSDM 2025

Cross-domain self-supervised pre-training with language models for transferable time series representations, using vector-quantization tokenization and a corrupted-region recovery objective for downstream classification and forecasting; a tokenization sketch follows below.

Code →
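A minimal sketch of vector-quantization tokenization, assuming NumPy: each patch is replaced by the index of its nearest codebook entry, yielding a discrete vocabulary a language model can be pre-trained on. The random codebook below stands in for the one CrossTimeNet learns during pre-training.

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(512, 8))             # 512 codes, patch length 8

def vq_tokenize(x: np.ndarray) -> np.ndarray:
    patches = x.reshape(-1, 8)                   # non-overlapping patches
    dists = ((patches[:, None, :] - codebook[None]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)                  # one token id per patch

tokens = vq_tokenize(rng.normal(size=64))        # 64 steps -> 8 tokens
```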
ConvTimeNet

ConvTimeNet · WWW 2025

Hierarchical fully convolutional model with deformable patch layers for adaptive perception of local patterns and multi-scale dependency modeling.

Paper →
FormerTime

FormerTime · WWW 2023

Hierarchical multi-scale representation for multivariate time series classification, capturing temporal dependencies at multiple granularities.

Paper →

Time Series Forecasting

CastMind

CastMind

Interaction-driven agentic reasoning framework for cognition-inspired forecasting, organizing prediction into a multi-stage workflow with context preparation and reflective evaluation.

Paper →
OneCast

OneCast

Structured decomposition and modular generation for cross-domain time series forecasting: decomposes each series into a seasonal component (lightweight projection onto basis functions) and a trend component (semantic-aware tokenizer plus masked discrete diffusion) for generalization across domains; the seasonal branch is sketched below.

Paper →
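A minimal sketch of the seasonal branch under stated assumptions: fit the seasonal component with a small Fourier basis via least squares and extrapolate by evaluating the basis on future steps. The period and basis size are toy choices; the trend branch (tokenizer plus masked discrete diffusion) is omitted.

```python
import numpy as np

def fourier_basis(t: np.ndarray, period: float, k: int = 3) -> np.ndarray:
    """Sin/cos basis at the first k harmonics of the given period."""
    freqs = np.arange(1, k + 1)[None, :]
    angles = 2 * np.pi * freqs * t[:, None] / period
    return np.hstack([np.sin(angles), np.cos(angles)])

hist, horizon, period = 96, 24, 24.0
t = np.arange(hist + horizon)
season = np.sin(2 * np.pi * np.arange(hist) / period)    # toy seasonal input
B = fourier_basis(t, period)                             # shared basis
coef, *_ = np.linalg.lstsq(B[:hist], season, rcond=None) # fit on history
forecast = B[hist:] @ coef                               # future seasonal part
```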
MemCast

MemCast

Memory-driven forecasting with experience-conditioned reasoning, learning historical patterns, reasoning wisdom, and general laws from training data.

Code →
Time-R1

Time-R1

Slow-thinking approach with reinforced LLMs for time series forecasting, employing two-stage reinforcement fine-tuning to enhance multi-step reasoning ability.

Paper →
TokenCast

TokenCast

LLM-driven framework for context-aware forecasting via symbolic discretization, transforming continuous sequences into temporal tokens for unified representation.

Paper →
TimeReasoner

TimeReasoner · WSDM 2026

Empirical study of slow-thinking LLMs reasoning over temporal patterns, demonstrating non-trivial zero-shot forecasting capability in capturing trends and contextual shifts.

Paper →
TimeDART

TimeDART · ICML 2025

Diffusion autoregressive transformer for self-supervised time series representation, unifying causal transformer encoding with denoising diffusion for global and local patterns.

Paper →
FDF

FDF · IJCAI 2025

Flexible decoupled framework that decomposes time series into trend and seasonal components, employing conditional denoising (diffusion) for the fluctuating seasonal part and enhanced linear models for the smooth trend, trained end-to-end.

Paper →
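A minimal sketch of the decoupling step, assuming a simple moving-average decomposition: the smoothed signal is the trend and the residual is the fluctuating seasonal part, which FDF then models with conditional denoising diffusion while linear layers handle the trend.

```python
import numpy as np

def decompose(x: np.ndarray, window: int = 25):
    """Split a series into a smooth trend and a fluctuating residual."""
    pad = window // 2
    padded = np.pad(x, pad, mode="edge")
    trend = np.convolve(padded, np.ones(window) / window, mode="valid")
    return trend, x - trend                      # (trend, seasonal residual)

t = np.arange(200)
series = 0.05 * t + np.sin(2 * np.pi * t / 24)   # linear trend + daily cycle
trend, seasonal = decompose(series)              # modeled by separate branches
```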
HMF

HMF · IJCAI 2025

Hybrid multi-factor network with dynamic sequence modeling for early warning of intraoperative hypotension; decomposes physiological signals into trend and seasonal components with patch-based Transformers and symmetric normalization.

Paper →
PIR

PIR · NeurIPS 2025

Instance-aware post-hoc revision framework for improving forecasting via identification and revision of biased instances using contextual information.

Paper →
GPHT

GPHT · SIGKDD 2024

Generative pretrained hierarchical transformer for forecasting, employing mixed dataset pretraining and autoregressive generation for arbitrary horizon settings.

Paper →
SAN

SAN · NeurIPS 2023

Adaptive normalization for non-stationary time series forecasting from a temporal slice perspective, eliminating non-stationarity at local slice level.

Paper →
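A minimal sketch of the slice-level idea, assuming NumPy: standardize each local slice by its own statistics so the forecaster sees locally stationary inputs. SAN additionally learns to predict the statistics of future slices for de-normalizing the output, which is omitted here.

```python
import numpy as np

def slice_normalize(x: np.ndarray, slice_len: int = 12):
    """Standardize each non-overlapping slice by its own mean and std."""
    slices = x.reshape(-1, slice_len)
    mu = slices.mean(axis=1, keepdims=True)
    sigma = slices.std(axis=1, keepdims=True) + 1e-8
    return ((slices - mu) / sigma).ravel(), mu, sigma

x = np.cumsum(np.random.randn(96))               # non-stationary toy series
x_norm, mu, sigma = slice_normalize(x)           # keep stats to de-normalize
```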

Survey

Time Series Survey

A Comprehensive Survey of Time Series Forecasting · Preprint

Examines key entities in TSF and their characteristics, introduces a general problem formulation and challenge analysis, proposes a taxonomy classifying methodologies from preprocessing and forecasting perspectives, and highlights emerging topics like transfer learning and trustworthy forecasting.

Paper →

Timeline

Milestones in Time Series Research

Key publications from 2023 to 2026 across classification, forecasting, and representation learning.

2023

SAN · NeurIPS 2023

Introduced adaptive normalization for non-stationary time series forecasting, addressing distribution shifts through temporal slice-level normalization.

Paper →

2023

FormerTime · WWW 2023

Proposed hierarchical multi-scale representation for multivariate time series classification, establishing foundation for multi-granularity temporal modeling.

Paper →

2024

GPHT · SIGKDD 2024

Developed a generative pretrained hierarchical transformer enabling a single model to handle arbitrary forecasting horizons through mixed-dataset pretraining.

Paper →

2025

InstructTime · WSDM 2025 (Best of WSDM)

Awarded Best of WSDM for reformulating classification as a multimodal generative task, demonstrating the potential of universal foundation models for time series.

Paper →

2025

InstructTime++ · Preprint

Extends InstructTime with implicit feature enhancement for superior time series classification.

Paper →

2025

CrossTimeNet · WSDM 2025

Cross-domain pre-training with LMs for transferable time series representations via vector quantization tokenization and self-supervised recovery objective.

Code →

2025

TimeDART · ICML 2025

Unified diffusion and autoregressive paradigms for self-supervised representation learning, capturing both global trends and local patterns.

Paper →

2025

TableTime · CIKM 2025

Enabled zero-shot classification through a table-understanding reformulation, achieving natural alignment with the semantic space of LLMs.

Paper →

2025

CastMind

Interaction-driven agentic reasoning framework for cognition-inspired time series forecasting.

Paper →

2025

MemCast

Memory-driven forecasting with experience-conditioned reasoning from historical patterns and reasoning wisdom.

Code →

2025

Time-R1

Slow-thinking approach with reinforced LLMs for multi-step reasoning in time series forecasting.

Paper →

2025

TokenCast

LLM-driven context-aware forecasting via symbolic discretization and unified semantic space.

Paper →

2025

PIR · NeurIPS 2025

Model-agnostic framework for instance-aware post-hoc revision, effectively mitigating instance-level errors in forecasting.

Paper →

2025

ConvTimeNet · WWW 2025

Demonstrated viability of pure convolutional models with hierarchical architecture and deformable patch layers for time series analysis.

Paper →

2025

HiTime · ACM TIST (Accepted Nov 2025)

Hierarchical multimodal LLMs with semantic space alignment for time series classification, bridging temporal representations with generative reasoning.

Paper →

2025

FDF · IJCAI 2025

Flexible decoupled framework combining diffusion for the seasonal component with enhanced linear models for the trend, trained end-to-end.

Paper →

2025

HMF · IJCAI 2025

Hybrid multi-factor network for early warning of intraoperative hypotension via dynamic sequence modeling and trend-seasonal decomposition.

Paper →

2025

OneCast

Structured decomposition and modular generation for cross-domain time series forecasting: seasonal projection with basis functions and trend modeling via semantic-aware tokenizer and masked discrete diffusion.

Paper →

2026

TimeMAE · WSDM 2026

Advanced self-supervised learning with decoupled masked autoencoders, learning transferable representations through bidirectional encoding.

Paper →

2026

TimeReasoner · WSDM 2026

Empirical study revealing slow-thinking LLMs' capabilities in temporal reasoning, paving the way for reasoning-based forecasting paradigms.

Paper →

Comprehensive Survey of Time Series Forecasting

Concepts, challenges, taxonomy, and future directions for time series forecasting research.

Paper →

Resources

Research Directions & Contributions

Multimodal Language Modeling

Converting time series to discrete tokens and leveraging pre-trained language models for classification and forecasting tasks.

Reasoning-Based Forecasting

Slow-thinking approaches and agentic reasoning frameworks that enable multi-step reasoning over temporal patterns.

Self-Supervised Learning

Masked autoencoders, diffusion models, and hierarchical transformers for learning transferable time series representations.

Comprehensive Surveys

In-depth analysis of time series forecasting concepts, challenges, architectural trends, and future directions.

Advancing Time Series Understanding

We explore novel paradigms for time series analysis, from multimodal language modeling to reasoning-based forecasting. Our research addresses fundamental challenges in temporal data understanding, including non-stationarity, long-term dependencies, and contextual integration.

Multimodal Integration

Bridging numerical sequences with contextual features through symbolic discretization and alignment strategies for enhanced understanding.

Reasoning Capabilities

Developing slow-thinking approaches and memory-driven frameworks that enable explicit reasoning over temporal dynamics and dependencies.

Transferable Representations

Self-supervised learning methods that learn generalizable patterns across diverse time series domains and applications.