LEADER |
01000caa a22002652c 4500 |
001 |
NLM359296807 |
003 |
DE-627 |
005 |
20250305003919.0 |
007 |
cr uuu---uuuuu |
008 |
231226s2023 xx |||||o 00| ||eng c |
024 |
7 |
|
|a 10.1109/TPAMI.2023.3293516
|2 doi
|
028 |
5 |
2 |
|a pubmed25n1197.xml
|
035 |
|
|
|a (DE-627)NLM359296807
|
035 |
|
|
|a (NLM)37428671
|
040 |
|
|
|a DE-627
|b ger
|c DE-627
|e rakwb
|
041 |
|
|
|a eng
|
100 |
1 |
|
|a Li, Bing
|e verfasserin
|4 aut
|
245 |
1 |
0 |
|a DifFormer
|b Multi-Resolutional Differencing Transformer With Dynamic Ranging for Time Series Analysis
|
264 |
|
1 |
|c 2023
|
336 |
|
|
|a Text
|b txt
|2 rdacontent
|
337 |
|
|
|a Computermedien
|b c
|2 rdamedia
|
338 |
|
|
|a Online-Ressource
|b cr
|2 rdacarrier
|
500 |
|
|
|a Date Revised 04.10.2023
|
500 |
|
|
|a published: Print-Electronic
|
500 |
|
|
|a Citation Status PubMed-not-MEDLINE
|
520 |
|
|
|a Time series analysis is essential to many far-reaching applications of data science and statistics, including economic and financial forecasting, surveillance, and automated business processing. Despite the great success of the Transformer in computer vision and natural language processing, its potential as a general backbone for analyzing ubiquitous time series data has not been fully realized. Prior Transformer variants for time series rely heavily on task-dependent designs and pre-assumed "pattern biases", revealing their insufficiency in representing the nuanced seasonal, cyclic, and outlier patterns that are highly prevalent in time series. As a consequence, they cannot generalize well to different time series analysis tasks. To tackle these challenges, we propose DifFormer, an effective and efficient Transformer architecture that can serve as a workhorse for a variety of time series analysis tasks. DifFormer incorporates a novel multi-resolutional differencing mechanism that progressively and adaptively makes nuanced yet meaningful changes prominent, while periodic or cyclic patterns can be dynamically captured with flexible lagging and dynamic ranging operations. Extensive experiments demonstrate that DifFormer significantly outperforms state-of-the-art models on three essential time series analysis tasks: classification, regression, and forecasting. In addition to its superior performance, DifFormer also excels in efficiency, with linear time/memory complexity and empirically lower time consumption.
|
650 |
|
4 |
|a Journal Article
|
700 |
1 |
|
|a Cui, Wei
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Zhang, Le
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Zhu, Ce
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Wang, Wei
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Tsang, Ivor W
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Zhou, Joey Tianyi
|e verfasserin
|4 aut
|
773 |
0 |
8 |
|i Enthalten in
|t IEEE transactions on pattern analysis and machine intelligence
|d 1979
|g 45(2023), 11 vom: 02. Nov., Seite 13586-13598
|w (DE-627)NLM098212257
|x 1939-3539
|7 nnas
|
773 |
1 |
8 |
|g volume:45
|g year:2023
|g number:11
|g day:02
|g month:11
|g pages:13586-13598
|
856 |
4 |
0 |
|u http://dx.doi.org/10.1109/TPAMI.2023.3293516
|3 Volltext
|
912 |
|
|
|a GBV_USEFLAG_A
|
912 |
|
|
|a SYSFLAG_A
|
912 |
|
|
|a GBV_NLM
|
912 |
|
|
|a GBV_ILN_350
|
951 |
|
|
|a AR
|
952 |
|
|
|d 45
|j 2023
|e 11
|b 02
|c 11
|h 13586-13598
|