Page 50 - 《水资源与水工程学报》 (Journal of Water Resources and Water Engineering), 2025, No. 1
Deep learning models such as the bidirectional long short-term memory network (BiLSTM) have been widely applied to basin runoff prediction, but they struggle to capture the dependencies among the input dimensions, which limits their prediction accuracy. The Reg-Crossformer model proposed in this paper models the runoff time series while explicitly capturing these cross-dimension dependencies, and thereby improves prediction accuracy and overall model performance.

6 Conclusions

(1) The Reg-Crossformer model effectively captures the cross-dimension dependencies in the runoff time series. Compared with the baseline Crossformer model, its R and NSE improved by 7.46% and 21.63% respectively, and its RMSE decreased by 15.25%.

(2) In basin runoff prediction, Reg-Crossformer achieves higher prediction accuracy than both the classical machine learning model, the support vector machine (SVM), and the deep learning models LSTM and Informer.

(3) Ablation experiments examined the influence of the router mechanism in the cross-dimension two-stage attention on Reg-Crossformer's performance. Removing the routers degrades the model's prediction accuracy, which confirms the effectiveness of the router design and of the Reg-Crossformer model.

References:
[1] …. …[J]. …, 2021(4): 57-62. (in Chinese)
[2] …. …[J]. …, 2018, 39(5): 1-6. (in Chinese)
[3] …. … RVA …[J]. …, 2020, 38(5): 42-45+5. (in Chinese)
[4] …. … IHA-RVA …[J]. …, 2019, 40(2): 16-21+51. (in Chinese)
[5] …. …[J]. …, 2021, 44(1): 26-38. (in Chinese)
[6] …. … BP neural network …[J]. …, 2015, 35(5): 35-40+96. (in Chinese)
[7] …. …[J]. …, 2021, 37(1): 28-35+60. (in Chinese)
[8] …. …[J]. …, 2019, 47(12): 6-9. (in Chinese)
[9] …. … LSTM …[J]. …, 2021, 43(4): 1144-1156. (in Chinese)
[10] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems 30. Long Beach: Curran Associates, Inc., 2017: 5998-6008.
[11] DEVLIN J, CHANG Mingwei, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//The 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis: ACL, 2019: 4171-4186.
[12] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: transformers for image recognition at scale[C]//International Conference on Learning Representations. Virtual Conference: ICLR, 2021.
[13] DONG Linhao, XU Shuang, XU Bo. Speech-transformer: a no-recurrence sequence-to-sequence model for speech recognition[C]//2018 IEEE International Conference on Acoustics, Speech and Signal Processing. Calgary: IEEE, 2018: 5884-5888.
[14] LI Shiyang, JIN Xiaoyong, XUAN Yao, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems. Vancouver: Curran Associates, Inc., 2019: 5243-5253.
[15] ZHOU Tian, MA Ziqing, WEN Qingsong, et al. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting[C]//Proceedings of the 39th International Conference on Machine Learning. Baltimore: PMLR, 2022: 27268-27286.
[16] ZHOU Haoyi, ZHANG Shanghang, PENG Jieqi, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence. New York: AAAI, 2021: 11106-11115.
[17] WU Haixu, XU Jiehui, WANG Jianmin, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems. Curran Associates, Inc., 2021: 22419-22430.

(Continued on page 53)
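The comparisons in conclusions (1) and (2) rest on three standard hydrological evaluation metrics: the correlation coefficient R, the Nash-Sutcliffe efficiency (NSE), and the root mean square error (RMSE). A minimal sketch of how these are conventionally computed follows; the function names are illustrative, not taken from the paper's code:

```python
import math

def r_coefficient(obs, sim):
    """Pearson correlation coefficient R between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_s = sum((s - ms) ** 2 for s in sim)
    return cov / math.sqrt(var_o * var_s)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus squared error normalized
    by the variance of the observations; 1.0 means a perfect fit."""
    mo = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - mo) ** 2 for o in obs)

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
```

A perfect prediction gives R = 1, NSE = 1, and RMSE = 0; NSE drops below 0 when the model predicts worse than the mean of the observations.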
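Conclusion (3) attributes part of the gain to the router mechanism in the cross-dimension two-stage attention. The general idea is that a small set of c router vectors first gathers information from all D dimension tokens and then redistributes it back, replacing O(D^2) all-pairs attention with two O(D*c) stages. The toy sketch below is my own simplification (plain dot-product attention, no learned Q/K/V projections, hypothetical function names), not the paper's implementation:

```python
import math

def matmul(A, B):
    """Multiply matrices given as lists of rows: returns A @ B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(M):
    return [list(col) for col in zip(*M)]

def softmax_rows(M):
    """Numerically stable row-wise softmax."""
    out = []
    for row in M:
        m = max(row)
        exps = [math.exp(x - m) for x in row]
        total = sum(exps)
        out.append([e / total for e in exps])
    return out

def router_attention(tokens, routers):
    """Two-stage cross-dimension attention via routers (simplified).

    tokens:  D rows of length d -- one embedding per input dimension
    routers: c rows of length d -- small set of router vectors
    Each stage costs O(D*c) instead of O(D^2) for all-pairs attention.
    """
    d = len(tokens[0])
    sc = 1.0 / math.sqrt(d)
    # Stage 1: routers gather information from all dimension tokens.
    s1 = [[x * sc for x in row] for row in matmul(routers, transpose(tokens))]
    gathered = matmul(softmax_rows(s1), tokens)        # shape (c, d)
    # Stage 2: each dimension token attends to the routers' summaries.
    s2 = [[x * sc for x in row] for row in matmul(tokens, transpose(gathered))]
    return matmul(softmax_rows(s2), gathered)          # shape (D, d)
```

Removing the routers, as in the paper's ablation, would force either direct D-by-D attention (quadratic in the number of dimensions) or no cross-dimension exchange at all, which is consistent with the reported accuracy drop.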