Page 50 - 《水资源与水工程学报》2025年第1期

…Compared with recurrent baselines such as the bidirectional long short-term memory network (BiLSTM), the Reg-Crossformer model achieved higher runoff prediction accuracy, showing stronger robustness in the prediction task.

This is because the Reg-Crossformer model captures dependency information not only along the temporal dimension of the input series but also across the variable dimensions, so that the interactions among the input factors are exploited more fully, which improves the applicability of the model.

6 Conclusions

(1) The Reg-Crossformer model captures dependency information in both the temporal and the cross-variable dimensions of the input series. Compared with the baseline Crossformer model, its R and NSE improved by 7.46% and 21.63%, respectively, and its RMSE decreased by 15.25%.

(2) In runoff prediction, the Reg-Crossformer model outperformed both the machine learning model (SVM) and the deep learning models (LSTM and Informer), achieving higher prediction accuracy.

(3) Ablation experiments examined the effects of the cross-dimension attention and the routers mechanism on the performance of the Reg-Crossformer model. Removing either component reduced the prediction accuracy, which verifies the contribution of both designs and the effectiveness of the Reg-Crossformer model.

References:
[1] Äў, 5žƒ, ”ÒO. EJ·¸³´AB0Úc9Ó[J]. Zbƍc¸, 2021(4): 57-62.
[2] )ӆ, ¹Ô, 9$, :. *¸Z߂<BEJZ[E[MNAB[J]. ZS0,È, 2018, 39(5): 1-6.
[3] ¹Øž, ÕÖ(. %ƒ RVAµ¶Zß\¤Jç¬‚×¡4ÃZ[E[MN[J]. ZqÈ^#0, 2020, 38(5): 42-45+5.
[4] )ӆ, Šw7, #ÈM, :. %ƒ IHA-RVA{]ZJKZ[E[³€[J]. žpZbZqa00n(wx#0t), 2019, 40(2): 16-21+51.
[5] ­Øì, )‘s, 9ž, :. p€0{¨„\I¦øùðñ4{“¦Ã•[J]. aø#00n, 2021, 44(1): 26-38.
[6] zÙw, ¹˜e, Ä¼, :. BP NCµŸ¶·¦–JK‚Jij4{Õ[J]. Z[, 2015, 35(5): 35-40+96.
[7] ÄÚ, ÛnÜ, 9Ý, :. ÂZðnáÈiº¦4!1ïW1c[5{Ղ6[J]. Z]^lm, 2021, 37(1): 28-35+60.
[8] Wao. %ƒðp±²{ÿ‚Jðñiº\I R·óp0AB[J]. þP¡Zb#$, 2019, 47(12): 6-9.
[9] I», )"Þ, ², :. %ƒ LSTM{8Žé¨wè5Šº–JK‚Jij\ðñ[J]. -Ïwè, 2021, 43(4): 1144-1156.
[10] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Neural Information Processing Systems Foundation, Advances in Neural Information Processing Systems 30. Long Beach: Curran Associates, Inc., 2017: 5998-6008.
[11] DEVLIN J, CHANG Mingwei, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//The 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis: ACL, 2019: 4171-4186.
[12] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: transformers for image recognition at scale[C]//International Conference on Learning Representations. Virtual Conference: ICLR, 2021.
[13] DONG Linhao, XU Shuang, XU Bo. Speech-transformer: a no-recurrence sequence-to-sequence model for speech recognition[C]//2018 IEEE International Conference on Acoustics, Speech and Signal Processing. Calgary: IEEE, 2018: 5884-5888.
[14] LI Shiyang, JIN Xiaoyong, XUAN Yao, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems. Vancouver: Curran Associates, Inc., 2019: 5243-5253.
[15] ZHOU Tian, MA Ziqing, WEN Qingsong, et al. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting[C]//Proceedings of the 39th International Conference on Machine Learning. Baltimore: PMLR, 2022: 27268-27286.
[16] ZHOU Haoyi, ZHANG Shanghang, PENG Jieqi, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence. New York: AAAI, 2021: 11106-11115.
[17] WU Haixu, XU Jiehui, WANG Jianmin, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems. Curran Associates, Inc., 2021: 22419-22430.
(Continued on page 53)