Research Article

A GRU-Based Method for Predicting Intention of Aerial Targets

Fei Teng,1 Yafei Song,1 Gang Wang,1 Peng Zhang,2 Liuxing Wang,2 and Zongteng Zhang1

1 Air and Missile Defense College, Air Force Engineering University, Xi'an 710051, China
2 AVIC Jiangxi Hongdu Aviation Industry Group Company Ltd., Nanchang 330024, China

Correspondence should be addressed to Yafei Song; yafei_song@163.com

Received 30 July 2021; Accepted 22 October 2021; Published 2 November 2021

Academic Editor: Thippa Reddy G

Copyright © 2021 Fei Teng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Since a target's operational intention in air combat is realized by a series of tactical maneuvers, its state presents temporal and dynamic changes. Traditional combat intention recognition methods, which rely on inference from a single moment, are neither scientific nor effective enough. Building on a gated recurrent unit (GRU), the proposed aerial target combat intention recognition method introduces a bidirectional propagation mechanism and an attention mechanism. The method constructs an air combat intention characteristic set through a hierarchical approach, encodes it into numeric time-series characteristics, and encapsulates domain expert knowledge and experience in labels. It uses a bidirectional gated recurrent unit (BiGRU) network for deep learning of air combat characteristics and adaptively assigns characteristic weights with an attention mechanism to improve the accuracy of aerial target combat intention recognition. To further shorten the time needed for intention recognition and provide a certain predictive effect, an air combat characteristic prediction module is introduced before intention recognition to establish the mapping relationship between predicted characteristics and combat intention types. Simulation experiments show that the proposed model can predict enemy aerial target combat intention one sampling point ahead of time with 89.7% intention recognition accuracy, which has reference value and theoretical significance for assisting decision-making in real-time intention recognition.

1. Introduction

With the development of military and aviation technology, informationization has gradually become the focus of the modern battlefield, and information-driven operations have become the main direction of modern war. As technological development and application have led to a dramatic increase in the amount of battlefield information, it has become difficult to recognize the enemy's intention from multiple sources of battlefield data in a timely and effective manner by relying solely on the experience of domain experts. There is a need for intelligent methods to eliminate the drawbacks of manual methods [1, 2].

To meet the needs of operational decision systems, many intention recognition studies have been conducted. Research on enemy target operational intent recognition mainly includes methods such as evidence theory [3, 4], template matching [5, 6], expert systems [7], Bayesian networks [8, 9], and neural networks [2, 10–12]. Past research [2–12] has achieved enemy target operational intention recognition in different operational contexts, but there are shortcomings in temporal characteristic learning and knowledge representation. On the one hand, the target operational intention is realized through a series of tactical maneuvers, so the dynamic attributes of the target and the battlefield environment change over time. In addition, the enemy target acts with a certain degree of concealment and deception when performing combat operations. Thus, the above methods are not scientific enough, as they determine the enemy target operational intention using characteristic information at a single moment. On the other hand, the above methods require explicit organization, abstraction, and description of military experts' empirical knowledge, so their knowledge representation and engineering implementation are difficult. Aiming at the drawbacks of the above methods, Ou et al. [13] proposed an intelligent recognition model of tactical intention based on a long short-term memory (LSTM) network. The input characteristic of the model is 12 consecutive frames of time-sequence characteristics, which can effectively overcome judgment by a single moment. Moreover, the model implicitly organizes, abstracts, and describes the empirical knowledge of military experts, making its knowledge representation and engineering implementation less difficult. However, it only uses historical moment information to make inferences about current information and cannot effectively use future moment information. Since there are many characteristics related to the intention of air targets, it is necessary to highlight the influence of key characteristics and reduce the contribution of redundant characteristics. In addition, we want to further improve the real-time performance of aerial target intent recognition.

Based on the above analysis, we propose a gated recurrent unit (GRU) based intelligent prediction model for aerial target combat intention. The model has characteristic prediction and intention recognition modules. The intention recognition module introduces a bidirectional propagation mechanism, an attention mechanism, and the particle swarm optimization (PSO) algorithm on top of a GRU to build an intelligent intention recognition model. With performance similar to that of LSTM, a GRU has less structural complexity and requires less time for recognition. Compared with a GRU, a bidirectional gated recurrent unit (BiGRU) can use not only the information of historical moments but also that of future moments to make comprehensive judgments. PSO can find the optimal parameters of a BiGRU network [14], and the attention mechanism layer can further highlight the key information affecting the intention and improve the accuracy of intention recognition. To further shorten the time used for intention recognition, we build a characteristic prediction module that uses the BiGRU network to analyze the collected characteristics and predict future aerial target characteristics, which are input to the intention recognition module to establish the mapping relationship between future aerial target characteristics and the target's operational intention types. Experiments show that the proposed model can predict the enemy aerial target operational intention one sampling point in advance, and the accuracy is increased by 2.9% compared with LSTM.

The remaining sections of this paper are arranged as follows. Section 2 introduces the definition of intent recognition of aerial targets and how to select intention categories and characteristic types. In Section 3, the framework of the proposed model is described in detail, including the intention recognition module and the characteristic prediction module. Experimental results are analyzed in Section 4 to show the performance of the new method. This paper is concluded in the last section.

2. Description of Aerial Target Operational Intention Recognition Problem

Intention recognition is important for command and control in modern war. The operational intention of an aerial target can be inferred from real-time data gathered by multiple sensors in a dynamic and complex battlefield environment. To enhance the reliability of intention recognition, a priori knowledge and the experience of experts in the relevant operational field [12] should also be taken into consideration. The process of intention recognition is shown in Figure 1.

Aerial target intention recognition is a pattern recog-nition problem that can be described as a mapping of in-tention recognition characteristics to aerial target combatintention types Define the vector V(t) as the real-time aircombat characteristic information at time t andP (p1 p2 pn) as the aerial target combat intentionspace set Due to the complexity high confrontation anddeceptive nature of actual air combat environment condi-tions relying on the real-time air combat characteristicinformation detected at a single time can be somewhatdeceptive and one-sided To infer the combat intention of anenemy aircraft from air combat characteristic information atsuccessive times is far more accurate and scientific than torely on information at a single time [1] e mappingfunction from the space set P of operational intention to thetemporal characteristic set Vm is determined by defining Vmas the temporal characteristic set form consecutivemomentsfrom t1 to tm

    P = f(V_m) = f(V(t_1), V(t_2), ..., V(t_m)).    (1)

It can be seen that accurate recognition of the operational intention of aerial targets requires a combination of professional military knowledge, operational experience, and complex thinking activities such as extraction, comparison, analysis, association, and inference of key air combat information. It is difficult to establish the mapping relationship between Vm and P with a single formula [13]. We therefore implicitly establish the mapping relationship between the characteristic set and operational intention by training a bidirectional gated recurrent unit with attention mechanism (BiGRU-Attention) network on an aerial target operational intention recognition characteristic set.

2.1. Description of Space Set of Aerial Target Operational Intention. The target operational intention space set varies for different operational forms, enemy entities, and desired contexts. Therefore, the operational intention space set of enemy targets must be defined based on the corresponding operational context attributes of the enemy targets and their possible operational tasks. For example, a target intention space set was established as {avoidance, patrol, attack} based on the potential threat of underwater targets [15]; an operational intention space set was established as {retreat, cover, attack, reconnaissance} for a single group of enemy maritime ship formations [16]; and an operational intention space set of aerial targets was defined as {reconnaissance, surveillance, attack, penetration} [17]. Taking UAV close-range engagement as the research object, we establish the combat intention space set of enemy targets as seven types of intention: {feint, surveillance, electronic interference, penetration, attack, retreat, reconnaissance}.


After determining the space set of enemy operational intention, the key to applying the proposed intelligent recognition model is to convert human cognitive models into labels that can be trained by intelligent models and that correspond to the types of intention in the operational intention space set [18]. The cognitive experience of air warfare experts can thus be encapsulated in labels to train the model. A set of label values {0, 1, 2, 3, 4, 5, 6} is set for the intention types in the established combat intention space set. Figure 2 shows the corresponding combat intention type coding and model parsing mechanisms. For example, if the intention prediction result is 5, then the combat intention of the enemy target against our target is retreat. This knowledge encapsulation and model parsing can clearly and easily describe human empirical knowledge and facilitate model training.
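As a minimal illustration of this coding and parsing step (a sketch, not code from the paper), the Figure 2 correspondence can be held in a pair of Python dictionaries:

```python
# Minimal sketch of the Figure 2 label coding/parsing (illustrative, not from the paper).
INTENTION_TO_LABEL = {
    "surveillance": 0,
    "reconnaissance": 1,
    "feint": 2,
    "attack": 3,
    "penetration": 4,
    "retreat": 5,
    "electronic interference": 6,
}
LABEL_TO_INTENTION = {v: k for k, v in INTENTION_TO_LABEL.items()}

def parse_prediction(label: int) -> str:
    """Parse a predicted label back to an intention type, e.g., 5 -> 'retreat'."""
    return LABEL_TO_INTENTION[label]
```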

2.2. Selection of Aerial Target Combat Intention Characteristics. The enemy aerial target's operational intention is highly correlated with its operational mission, the mutual threat level between the two sides, and its tactical maneuvers. According to these three aspects, and under the requirement that the data be easily acquired by radar, the characteristics that are closely related to the combat intention of the air target are selected.

Analyzed from the perspective of operational tasks, when an enemy UAV performs a certain task, the enemy aircraft characteristics must meet certain conditions. For example, defense penetration tasks are divided into high-altitude penetration and low-altitude penetration, with corresponding heights of 10–11 km and 100–1000 m, and the high flight speed of a fighter aircraft under attack is generally 735–1470 km/h [12]. There is also a connection between the aerial target's radar signal status and its combat mission: a fighter usually turns on its air-to-air radar and electronic jamming in air combat, and its marine radar and air-to-air radar on a reconnaissance mission [19].

Many factors affect the threat level between the two targets. For convenient experimental data collection, we consider the speed, flight acceleration, distance, flight altitude, heading angle, and azimuth angle of both the enemy's and our warplanes [20], as shown in Figure 3.

The air combat capability factor [21] also affects the target threat level. For fighter aircraft, a single-aircraft air combat capability threat function is constructed as

    C = [ln ε1 + ln(ε2 + 1) + ln(Σ ε3 + 1)] ε4 ε5 ε6 ε7,    (2)

where ε1–ε7 are parameters of warplane maneuverability, airborne weapon performance, airborne detection capability, warplane operational performance, warplane survivability, warplane operational range, and electronic information countermeasure capability, respectively. The air combat capability factors of the various warplanes of both sides can be calculated with this formula for a certain period of time, saved in the database, and updated at any time according to current information [22].
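As an illustrative sketch of equation (2), assuming ε3 is a collection of detection-capability terms (to match the summation sign) and the remaining parameters are positive scalars (this is not the authors' implementation):

```python
import math
from typing import Iterable

def air_combat_capability(eps1: float, eps2: float, eps3: Iterable[float],
                          eps4: float, eps5: float, eps6: float, eps7: float) -> float:
    """Sketch of equation (2):
    C = [ln(eps1) + ln(eps2 + 1) + ln(sum(eps3) + 1)] * eps4 * eps5 * eps6 * eps7,
    where eps1..eps7 are maneuverability, airborne weapon performance, airborne
    detection capability (summed, per the summation in (2)), operational performance,
    survivability, operational range, and electronic countermeasure capability."""
    return (math.log(eps1) + math.log(eps2 + 1.0) + math.log(sum(eps3) + 1.0)) \
        * eps4 * eps5 * eps6 * eps7
```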

The realization of the operational intention of the aerial target is closely related to the maneuvers of the aircraft. Two kinds of maneuver libraries are in common use: the typical tactical maneuver library and the basic maneuver library. Since this paper studies intention recognition based on temporal characteristics, target combat intention recognition is carried out using 12 consecutive momentary characteristics as one sample. However, the control algorithm of a typical tactical action library is complicated to solve, and the exit and conversion time nodes of a maneuver are difficult to determine. The traditional basic maneuver library [23] includes only seven maneuvers, so the combined maneuvers are not rich enough, and they all adopt the limit maneuver, which is inconsistent with the actual air combat situation. We therefore adopt an improved basic maneuver library [24] that includes 11 maneuvers: left turn, right turn, accelerated forward flight, even-speed forward flight, decelerated forward flight, climb, left climb, right climb, dive, left dive, and right dive.

In summary, we use an aerial target combat intention characteristic set consisting of a 16-dimensional characteristic vector: enemy aircraft flight altitude, our aircraft altitude, enemy aircraft flight speed, our aircraft flight speed, enemy aircraft acceleration, our aircraft acceleration, enemy aircraft air combat capability factor, our aircraft air combat capability factor, heading angle, distance between the two sides, azimuth angle, air-to-air radar status, marine radar status, maneuver type, jamming status, and jammed status. These can be divided into numeric and nonnumeric characteristics, as shown in Figure 4.
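For reference in the later sketches, the 16 characteristics can be listed under hypothetical names, split into the numeric and nonnumeric groups of Figure 4:

```python
# Hypothetical names for the 16-dimensional characteristic vector described above.
NUMERIC_FEATURES = [
    "enemy_altitude", "own_altitude", "enemy_speed", "own_speed",
    "enemy_acceleration", "own_acceleration",
    "enemy_capability_factor", "own_capability_factor",
    "heading_angle", "azimuth_angle", "distance",
]
NON_NUMERIC_FEATURES = [
    "air_to_air_radar_status", "marine_radar_status",
    "maneuver_type", "jamming_status", "jammed_status",
]
ALL_FEATURES = NUMERIC_FEATURES + NON_NUMERIC_FEATURES   # 16 characteristics in total
```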

3. Model Framework

The proposed aerial target operational intention prediction model consists of a characteristic prediction module and an intention recognition module, as shown in Figure 5. The characteristic prediction module is based on the BiGRU network. The historical aerial target operational intention recognition characteristic set Vm is used as input, a linear (default) activation function of the fully connected layer is used to obtain the predicted characteristic set Wm, and these two sets are combined into the temporal characteristic data and input to the intention recognition module, which is constructed with the BiGRU-Attention [25] network. The probability of each intention type is calculated using the softmax function, and the label of the maximum-probability intention type is output as the aerial target combat intention recognition result. The characteristic prediction module and intention recognition module are described below.

[Figure 1: Hierarchical representation and reasoning process of intention (enemy target, combat intention, combat operations, target state, perception state, and air combat intention recognition rules, linked by representation and reasoning processes).]

3.1. Characteristic Prediction Module. The BiGRU characteristic prediction module has three parts: the aerial target combat intention characteristic set input layer, the hidden layer, and the output layer. Reference [26] confirmed that predicting each feature independently gives higher accuracy than predicting all features jointly. Thus, Input = (number of samples, 8, 1), where 8 is the time step and 1 is the characteristic dimension, and Output = 1, that is, the output characteristic dimension is 1. A detailed description is given below.

3.1.1. Input Layer. The input layer preprocesses the collected aerial target characteristic dataset into a vector form that can be accepted and processed by the BiGRU layer, as follows:

(1) Read the dataset and clean the data.

(2) Code the nonnumeric data of jamming state, jammed state, air-to-air radar state, and marine radar state as 0 (off) or 1 (on). Millier's nine-level quantization theory [27] is used to quantize the maneuver-type data.

(3) Normalize the encoded nonnumeric data together with the numeric data so as to improve network convergence speed and accuracy and prevent gradient explosion. We normalize 11 types of numeric data and five types of encoded nonnumeric data. For the i-th dimensional characteristic data Gi = [g_i1, g_i2, ..., g_ix, ..., g_in], i = 1, 2, ..., 16, where n is the total number of data points, g'_ix is the result of normalizing the x-th original value of the i-th dimensional characteristic to [0, 1], that is,

    g'_ix = (g_ix − min Gi) / (max Gi − min Gi),    (3)

where max Gi and min Gi are the maximum and minimum values, respectively, of Gi.

(4) Divide the data into training and test sets at an 8:2 ratio.

(5) Construct the training and test samples as follows (a short code sketch of this windowing is given after the list). Using the method of predicting a single characteristic in turn, take the prediction of the distance between the enemy and ourselves as an example. If the distance data of moments 1–8 are used to predict the distance D9 at the ninth moment, the function mapping relationship is

    D9 = f(d1, d2, ..., d8),    (4)

where di, i ∈ (1, 2, ..., 8), is the distance at the i-th moment. d1–d8 are selected as the first set of input data, labeled with d9; d2–d9 are selected as the second set of input data, labeled with d10; and so on. The training sample input data and training sample labels generated this way are shown below. The test data are constructed in the same way as the training sample data [28]:

    Input samples:
    [ d1   d2   ···  dm
      d2   d3   ···  dm+1
      ⋮    ⋮    ⋱    ⋮
      d8   d9   ···  dm+7 ],

    Labels: [ d9   d10  ···  dm+8 ].    (5)

[Figure 2: Schematic diagram of combat intention coding and parsing (the same color marks a corresponding pair): surveillance = 0, reconnaissance = 1, feint = 2, attack = 3, penetration = 4, retreat = 5, electronic interference = 6.]

[Figure 3: Relative geometric position in air combat. H1 and H2 are the flight altitudes of the enemy and ours, V1 and V2 are the flight speeds of the enemy and ours, D is the distance between the two parties, ψ is the heading angle, and φ is the azimuth angle.]


The collected aerial target combat intention characteristic set Vm is now in a characteristic vector form that can be directly accepted and processed by the hidden layer.

3.1.2. Hidden Layer. As a variant of the recurrent neural network (RNN), the gated recurrent unit (GRU) [29] has a similar recursive structure and a memory function for processing time-series data. A GRU can alleviate the gradient vanishing and explosion problems that may occur during RNN training, thus addressing the long-term memory problem. Another RNN variant, the long short-term memory (LSTM) network [30], performs similarly, but a GRU has a simpler structure, reducing computation and improving training efficiency.

Figure 6 shows the internal structure of the GRU. Its two inputs are the output state h_{t−1} at the previous moment and the input sequence value x_t at the current moment, and its output is the state h_t at the current moment. It updates the model state through two gates: the reset gate r_t controls the degree to which the historical state information is forgotten, so that the network can discard unimportant information, and the update gate z_t controls the weight with which the previous moment's state information is brought into the current state, helping the network remember long-term information [31]. These variables are related as follows:

    r_t = σ(W_r x_t + U_r h_{t−1}),
    z_t = σ(W_z x_t + U_z h_{t−1}),
    h̃_t = tanh(W_h̃ x_t + U_h̃ (r_t ⊙ h_{t−1})),
    h_t = (1 − z_t) ⊙ h_{t−1} + z_t ⊙ h̃_t,    (6)

where σ is the sigmoid activation function, which maps the intermediate states into the range [0, 1]; h_{t−1} and h_t are the output states at times t − 1 and t, respectively; x_t is the input sequence value at time t; h̃_t is the candidate output state; W_r, W_z, W_h̃, U_r, U_z, and U_h̃ are the weight coefficients corresponding to each component; tanh is the hyperbolic tangent function; and ⊙ is the Hadamard product.

[Figure 4: Feature set of the tactical intention of an aerial target, split into numeric data (enemy/our aircraft altitude, speed, and acceleration; enemy/our air combat capability factor; heading angle; azimuth angle; distance between the two sides) and nonnumeric data (air-to-air radar status, marine radar status, maneuver type, jamming status, jammed status).]

[Figure 5: Intention prediction model framework: the characteristic prediction module (input layer, BiGRU hidden layer, and dense output layer) produces the predicted feature set Wm from the collected time-series feature set Vm; Vm + Wm is fed to the intention recognition module (BiGRU layers, attention layer, and softmax) to output the intention type.]

The traditional GRU structure propagates unidirectionally along the sequence transmission direction, so it acquires only the historical information before the current moment and ignores future information. BiGRU, shown in Figure 7, includes a forward and a backward GRU and can therefore capture the characteristics of the information both before and after the current moment [32]. In Figure 7, GRU1 is the forward GRU and GRU2 is the backward GRU. The output state h_t of BiGRU at moment t is obtained by combining the forward output state, determined by the input x_t at moment t and the forward GRU output state at moment t − 1, with the backward output state, determined by the input x_t at moment t and the backward GRU output state at moment t + 1.

3.1.3. Output Layer. The output h_t of the BiGRU network in the hidden layer is fed to the fully connected layer in the output layer, and the final predicted characteristic values are output using a linear activation function.
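A minimal Keras sketch of this characteristic prediction module, assuming the structure selected in Section 4.2.1 (time step 8, a single BiGRU hidden layer of 18 units, a linear dense output, and the Adam optimizer with learning rate 0.001); this is an illustrative reconstruction, not the authors' code:

```python
from tensorflow.keras import layers, models, optimizers

def build_prediction_module(time_step: int = 8, feature_dim: int = 1, units: int = 18):
    """BiGRU characteristic prediction module: one BiGRU hidden layer, linear dense output."""
    model = models.Sequential([
        layers.Bidirectional(layers.GRU(units), input_shape=(time_step, feature_dim)),
        layers.Dense(1, activation="linear"),
    ])
    model.compile(optimizer=optimizers.Adam(learning_rate=0.001), loss="mse")
    return model

# Hypothetical training call with the windowed samples from Section 3.1.1:
# model = build_prediction_module()
# model.fit(x_train, y_train, epochs=100, batch_size=512)
```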

3.2. Intention Recognition Module. The BiGRU-Attention intention recognition module has input, hidden, and output layers. The hidden layer consists of BiGRU and attention mechanism layers. In this network, Input = (number of samples, 12, 16) and Output = 7, where 12 denotes the time step, 16 denotes the number of characteristic dimensions, and 7 denotes the total number of intention types. A detailed description is given below.

3.2.1. Input Layer. The input layer of the characteristic prediction module has already cleaned and normalized the collected aerial target operational intention characteristics, so the input layer of the intention recognition module mainly constructs the module's sample data. If the characteristic data of moments 1–12 are used to predict the intention during that time, the function mapping relationship is

    Q1 = f(v1, v2, ..., v11, w12),    (7)

where Q1 denotes the predicted intention type for time periods 1–12, vi, i ∈ (1, 2, ..., 11), denotes the historical characteristic data at moment i, and w12 denotes the characteristic data predicted by the characteristic prediction module for moment 12. (v1, v2, ..., v11, v12) is the first set of input data, labeled with the intention type q1 corresponding to time periods 1–12; (v2, v3, ..., v12, v13) is the second set of input data, labeled with the intention type q2 corresponding to time periods 2–13; and so on. The training sample input data and labels are composed as shown below. The test samples are constructed similarly, except that the characteristic vt at the last moment of each sample is replaced with the characteristic wt predicted by the characteristic prediction module; that is, the input data are (vi, vi+1, ..., vi+10, wi+11) and the label is the intention type qi corresponding to the time period i to i + 11:

    Input samples:
    [ v1    v2    ···  vm
      v2    v3    ···  vm+1
      ⋮     ⋮     ⋱    ⋮
      v11   v12   ···  vm+10
      v12   v13   ···  vm+11 ],

    Labels: [ q1   q2   ···  qm ].    (8)

After one-hot encoding of the intention labels, they are sent to the hidden layer together with the constructed sample data.
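A sketch of this sample construction and one-hot encoding, assuming the characteristics are available as a (T, 16) array and that labels[i] is the expert-derived intention label for the window starting at moment i (a hypothetical layout):

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

def build_recognition_samples(features: np.ndarray, labels: np.ndarray, window: int = 12):
    """Slice a (T, 16) characteristic sequence into (window, 16) samples as in equation (8);
    labels[i] is taken as the intention label of the window starting at moment i."""
    n = features.shape[0] - window + 1
    x = np.stack([features[i:i + window] for i in range(n)])   # shape (n, 12, 16)
    y = to_categorical(labels[:n], num_classes=7)              # one-hot intention labels
    return x, y
```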

3.2.2. Hidden Layer. The hidden layer contains the BiGRU network layer and the attention mechanism layer. The BiGRU layer has been described above; the attention mechanism layer is described here. The attention mechanism [33–35] operates similarly to the human brain by focusing on the local content of an object according to its purpose. It highlights the characteristics that contribute most to the prediction result by calculating weights for the characteristic vectors output by the BiGRU network at different moments. In aerial target combat intention recognition, the neural network assigns weight coefficients through the attention mechanism so as to focus on key characteristics during training. Its implementation learns the importance of each characteristic and then assigns the corresponding weight coefficient according to that importance. For example, if an enemy aircraft executes a penetration, its flight altitude and heading angle will be assigned higher weights. The structure of the attention mechanism model is shown in Figure 8.

The characteristic vector h_t output by the BiGRU network at moment t is input to the attention mechanism layer to obtain the initial state vector S_t. The initialization vector e_t of the attention mechanism is learned by equation (9), the attention weights are converted to probabilities by equation (10), i.e., the softmax function, to obtain the weight probability vector α_t, and the final state vector Y is obtained by equation (11) [36]:

    e_t = tanh(W_w S_t + b_w),    (9)

    α_t = exp(e_t u_w) / Σ_{i=1}^{t} exp(e_i u_w),    (10)

    Y = Σ_{t=1}^{n} α_t S_t,    (11)

where W_w is the weight coefficient matrix, b_w is the bias coefficient matrix, and u_w is a randomly initialized vector that is continuously learned during training.

[Figure 6: GRU internal structure, with reset gate r_t, update gate z_t, candidate state h̃_t, previous state h_{t−1}, input x_t, and output h_t.]
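A NumPy forward pass of equations (9)–(11) for one sequence of BiGRU state vectors (the matrix and vector shapes are assumptions chosen for illustration):

```python
import numpy as np

def attention_pool(S, Ww, bw, uw):
    """Equations (9)-(11) for one sequence: S is (T, d), Ww is (k, d), bw is (k,), uw is (k,)."""
    e = np.tanh(S @ Ww.T + bw)                       # (T, k), equation (9)
    scores = e @ uw                                  # (T,)
    alpha = np.exp(scores) / np.exp(scores).sum()    # equation (10), softmax weights
    Y = (alpha[:, None] * S).sum(axis=0)             # (d,), equation (11)
    return Y, alpha
```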

3.2.3. Output Layer. The output of the attention mechanism layer is fed into the multiclass softmax activation function, which outputs the aerial target combat intention label with the highest probability. The enemy aerial target combat intention can then be recognized by parsing the label as in Figure 2. The output prediction label is

    y_k = softmax(W Y + b),    (12)

where W is the weight coefficient matrix learned during training, b is the corresponding bias matrix, and y_k is the predicted label of the output.
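Putting the pieces together, a Keras sketch of the BiGRU-Attention recognition module with the softmax output of equation (12); the single 128-unit BiGRU layer stands in for the PSO-selected stack of Table 2, while the loss, optimizer, and learning rate follow Table 2 (an illustrative reconstruction, not the authors' code):

```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

class AttentionPool(layers.Layer):
    """Attention pooling over BiGRU states, following equations (9)-(11)."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.Ww = self.add_weight(name="Ww", shape=(d, d), initializer="glorot_uniform")
        self.bw = self.add_weight(name="bw", shape=(d,), initializer="zeros")
        self.uw = self.add_weight(name="uw", shape=(d, 1), initializer="glorot_uniform")

    def call(self, s):                                           # s: (batch, time, d)
        e = tf.tanh(tf.matmul(s, self.Ww) + self.bw)             # equation (9)
        alpha = tf.nn.softmax(tf.matmul(e, self.uw), axis=1)     # equation (10)
        return tf.reduce_sum(alpha * s, axis=1)                  # equation (11)

def build_recognition_module(time_step=12, feature_dim=16, units=128, n_intentions=7):
    model = models.Sequential([
        layers.Bidirectional(layers.GRU(units, return_sequences=True),
                             input_shape=(time_step, feature_dim)),
        AttentionPool(),
        layers.Dense(n_intentions, activation="softmax"),        # equation (12)
    ])
    model.compile(optimizer=optimizers.Adam(learning_rate=0.0014),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```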

4. Experimental Analysis

4.1. Experimental Dataset and Environment. This experiment took a UAV close-range engagement in an airspace as the research background, and the experimental data were extracted from a combat simulation system. Using the characteristic processing and coding described in this paper, the state characteristics of 12 consecutive frames of the enemy aerial target were collected for each sample [13]. In total, 10000 samples (8000 training and 2000 testing) were constructed, each containing 16 pieces of information, such as flight speed, flight altitude, azimuth, and jamming status, as characteristics. Because of the large amount of data in the sample set, intention-type labels were generated by computer according to rules, and experts corrected data with ambiguous intention classification. The dataset labels covered the seven intention types, and the combat intention data consisted of 11.8% surveillance, 14.65% reconnaissance, 18.15% feint, 18% attack, 13.9% penetration, 12.3% retreat, and 11.2% electronic jamming.

We used Python 3.8 with the Keras 2.4.3 deep learning framework on an x86-64 CentOS 7 system with an Intel Xeon Silver 4110 CPU @ 2.10 GHz, 64 GB RAM, and a Quadro RTX 5000 (PCIe/SSE2) GPU accelerated with CUDA 11.0.

4.2. Experimental Analysis of Characteristic Prediction Module. The task of the characteristic prediction module is to predict the future characteristics of enemy aerial targets, which are later input to the intention recognition module to predict enemy aerial target intent. The average of the mean square errors (hereafter simply "error") of the 16-dimensional features was used as the evaluation index.

4.2.1. Network Structure Selection. The network structure is mainly determined by the time step, the number of hidden layers, and the number of hidden layer nodes. Since additional hidden layers rapidly increase the time cost, and air warfare places high demands on timeliness, only single and double hidden layer structures were considered. The results of the time step selection and hidden layer node number selection experiments are shown in Figure 9.

From Figure 9, the mean prediction mean square error was smallest when the time step was 8 and when the number of nodes in a single hidden layer was 18. Therefore, a network structure with a time step of 8, a single hidden layer, and 18 network nodes was chosen. The optimizer was Adam [37], the initial learning rate was 0.001, the decay rate was 0.9, the number of training epochs was 100, and the batch size was 512.

4.2.2. Comparison Experiment. To verify that the proposed characteristic prediction module is effective and efficient, it was compared with the RNN [38], LSTM [39], GRU [40], and BiLSTM networks in terms of both running time and mean square error, and a segment of the predicted trajectory for the distance characteristic between the two sides was selected for comparison with the actual trajectory. The results are shown in Table 1 and Figure 10.

[Figure 7: BiGRU structure, with a forward GRU layer (GRU1) and a backward GRU layer (GRU2) whose states are combined at each moment.]

[Figure 8: Attention mechanism structure: state vectors S_1, ..., S_n are weighted by α_1, ..., α_n and summed to obtain the final state vector Y.]

From Table 1, BiLSTM had the longest single-step prediction time, 0.311 ms, and RNN had the shortest single-step prediction time, but its prediction error was the largest, reaching 8.14 × 10⁻⁵. BiGRU had the smallest prediction error, and its single-step prediction time was about one-third shorter than that of BiLSTM, so BiGRU achieves similar or even better performance than BiLSTM with less internal structural complexity [41]. The prediction error of BiGRU was half that of GRU, indicating that the bidirectional propagation mechanism uses the information of future moments more effectively than one-way propagation. Although BiGRU had a longer single-step prediction time than GRU, its single-step prediction time of 0.202 ms is sufficient to provide timely prediction information when the sampling interval is 0.5 s. Figure 10 shows that the characteristic trajectories predicted by RNN had a low fit to the actual characteristic trajectories, while those predicted by the other four methods had a high fit, which is consistent with the error results in Table 1.

4.3. Experimental Analysis of Intention Recognition Module. The data used in this experiment did not contain future characteristics predicted by the characteristic prediction module; that is, the 12 frames of temporal characteristics in each sample were all historical, with no prediction characteristics added. The purpose of the experiment was to compare our method with those proposed in other literature. The intention prediction experiment is described in Section 4.4.

4.3.1. Network Structure Selection. The optimal network structure of BiGRU was selected using PSO [42] over four parameters: the number of hidden layers, the number of hidden layer nodes, the batch size, and the learning rate. The upper and lower limits were set to [4, 500, 1000, 0.005] and [1, 10, 100, 0.0001], respectively. The intention recognition error rate was used as the fitness function, the maximum number of iterations was 20, and the population size was 30. The experimental results and the settings of the other key hyperparameters are shown in Table 2.
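A bare-bones PSO sketch over these four hyperparameters, using the limits as reconstructed above; the inertia and acceleration coefficients are assumptions, and `error_rate_of_bigru` stands for a user-supplied function that trains a BiGRU with the candidate parameters and returns its recognition error rate (hypothetical):

```python
import numpy as np

def pso_search(fitness, lower, upper, n_particles=30, n_iters=20, w=0.7, c1=1.5, c2=1.5):
    """Bare-bones PSO; fitness(x) returns the intention recognition error rate for a
    candidate x = (hidden layers, nodes per layer, batch size, learning rate)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    rng = np.random.default_rng(0)
    x = rng.uniform(lower, upper, size=(n_particles, len(lower)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lower, upper)
        vals = np.array([fitness(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# best = pso_search(error_rate_of_bigru, lower=[1, 10, 100, 0.0001], upper=[4, 500, 1000, 0.005])
```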

4.3.2. Analysis of Intention Recognition Results. The BiGRU-Attention intention recognition module was trained, and the test samples were then input into the module. The experiment showed that the proposed intention recognition network model achieved high accuracy. To further examine the relationships among the recognized intentions, a confusion matrix of the intention recognition results on the test samples was produced, in which the diagonal entries indicate the numbers of correctly recognized samples; the results are shown in Figure 11.
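Such a confusion matrix can be computed, for example, with scikit-learn (a sketch; `model`, `x_test`, and `y_test_onehot` are assumed to come from the training pipeline sketched above):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def intention_confusion_matrix(model, x_test, y_test_onehot):
    """Rows are true intentions, columns are recognized intentions;
    the diagonal holds the correctly recognized samples, as in Figure 11."""
    y_pred = np.argmax(model.predict(x_test), axis=1)
    y_true = np.argmax(y_test_onehot, axis=1)
    return confusion_matrix(y_true, y_pred, labels=np.arange(7))
```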

It can be seen from Figure 11 that the BiGRU-Attention intention recognition model had high recognition precision and recall for all seven intentions. In particular, the precision and recall for the electronic interference intention reached 100% and 96%, respectively. A few mutual recognition errors occurred between the surveillance and reconnaissance intentions and between the feint and attack intentions, which can be attributed to the high characteristic similarity and deception within each intention pair: the BiGRU-Attention model learned similar weights for the two intentions in each pair, so the attention layer could not accurately sense the weight difference between them, leading to a small number of incorrectly recognized intentions. This accords with the actual situation.

4.3.3. Comparison Experiments. In the experiment, the highest test-set accuracy during 200 iterations was taken as the accuracy of the intention recognition model, and the corresponding loss value was taken as its loss. The BiGRU-Attention intention recognition model was compared with the LSTM-based tactical intention recognition model for enemy battlefield targets [13], the aerial target combat intention recognition model using a DBP neural network optimized with the Adam algorithm and ReLU function [12], and the stacked autoencoder (SAE) tactical intention intelligent recognition model [15]. The parameters of the comparison experiments were set as shown in Table 3, and the experimental results are shown in Table 4.

As can be seen from Table 4, the BiGRU-Attention intention recognition model was superior to the other three models in terms of both accuracy and loss, with a 2.9% improvement in accuracy over LSTM and nearly 10% over SAE and DBP, thus verifying its effectiveness for aerial target combat intention recognition. Further analysis shows that BiGRU-Attention and LSTM, as temporal characteristic networks based on recurrent neural networks, were more applicable to aerial target combat intention recognition than the other two models, further indicating that it is more scientific to infer aerial target combat intention from temporal characteristic changes.

4.3.4. Ablation Experiments. Although the comparison experiments validated the effectiveness of the BiGRU-Attention intention recognition model for operational intention recognition of aerial targets, the compared methods are not hybrid models of the same family. Results of ablation experiments on the same dataset are therefore shown in Table 5 and Figure 12.

From Table 5 the BiGRU-Attention intention recog-nition model had an accuracy 27 19 and 16 per-centages higher than the accuracy of the GRU BiGRU andGRU-attention intention recognition models respectivelye BiGRU-Attention model also had lower loss values thanthe other models BiGRU and GRU-Attention models havesimilar accuracy and loss values and are better than GRUmodels From Figure 12 we can see that the accuracy of thefour models increased and the loss value decreased with thenumber of training epochs the accuracy and loss value of theBiGRU-Attention and BiGRU models converged at around50 rounds and the other two models at about 70 rounds so

8 Computational Intelligence and Neuroscience

the introduction of bidirectional propagation seems to haveeffectively improved model convergence and acceleratedlearning e curves of the BiGRU-Attention model weresignificantly better than those of the other three e ac-curacy and loss curve of the BiGRU and GRU-Attentionmodels after convergence are similar and they are betterthan the GRU model which shows that the basic GRUmodel was significantly improved by introducing the at-tention mechanism and bidirectional propagationmechanism

4.4. Experimental Analysis of Intention Prediction. This experiment combined the future characteristic states predicted by the characteristic prediction module with historical characteristic states into 12 frames of temporal characteristics; that is, the first 11 frames were historical characteristics and the 12th frame was the predicted future characteristics, and the sample data were constructed as described in Section 3.2.1.

[Figure 9: (a) Effect of the time step (3–10) on the mean prediction error; (b) effect of the number of hidden layer nodes (4–20, one layer vs. two layers) on the mean prediction error.]

Table 1: Experimental results of five models.

Method  | Error (10⁻⁵) | Prediction time (ms)
RNN     | 8.14         | 0.089
LSTM    | 2.75         | 0.133
GRU     | 2.46         | 0.110
BiLSTM  | 1.29         | 0.311
BiGRU   | 1.16         | 0.202

[Figure 10: Characteristic prediction trajectories (distance vs. moment) of the true values and the BiGRU, BiLSTM, GRU, LSTM, and RNN predictions.]

Table 2: Experimental parameters.

Parameter      | Value
Loss function  | Categorical_crossentropy
Optimizer      | Adam
Dropout        | 0.5
Hidden layers  | 3
Hidden nodes   | 334, 10, 338
Batch size     | 100
Learning rate  | 0.0014
Epochs         | 200

[Figure 11: Confusion matrix of intention recognition on the test samples over the seven intention types (surveillance, reconnaissance, feint, attack, penetration, retreat, electronic interference); the diagonal entries are the correctly recognized samples.]


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition models of Section 4.3, which have no predictive effect, and the model evaluation indicators of precision, recall, and F1-score were used, with results as shown in Table 6 and Figure 13.
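These per-intention indicators can be computed, for example, with scikit-learn's classification report (a sketch; `y_true` and `y_pred` are assumed integer intention labels):

```python
import numpy as np
from sklearn.metrics import classification_report

INTENTIONS = ["surveillance", "reconnaissance", "feint", "attack",
              "penetration", "retreat", "electronic jamming"]

def evaluate_intentions(y_true, y_pred):
    """Per-intention precision, recall, and F1-score, the indicators reported in Table 6."""
    return classification_report(y_true, y_pred, labels=np.arange(7),
                                 target_names=INTENTIONS, digits=3)
```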

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for the surveillance and attack intentions. On analysis, the air combat characteristics of the first two intentions are more distinctive, whereas the air combat characteristics of the latter two intentions are more similar to those of the reconnaissance and feint intentions, which causes mutual prediction errors and results in relatively low prediction accuracy for those intentions. Overall, the accuracy of the proposed intention prediction method reached 89.7%, a significant improvement over LSTM, DBP, and SAE, and the prediction is produced one sampling interval (0.5 s) earlier.

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results, with an accuracy of only 70%.

Table 3: Comparison of model parameter settings.

Model | Number of hidden layers | Hidden nodes       | Learning rate | Optimizer
SAE   | 3                       | 256, 128, 128      | 0.02          | SGD
LSTM  | 3                       | 256, 128, 128      | 0.001         | Adam
DBP   | 4                       | 256, 512, 512, 256 | 0.01          | Adam

Table 4: Comparison of different intention recognition models.

Model           | Accuracy (%) | Loss
BiGRU-Attention | 90.5         | 0.257
LSTM            | 87.6         | 0.346
SAE             | 81.3         | 0.473
DBP             | 79.3         | 0.492

Table 5: Results of ablation experiment.

Model composition (Bidirectional / GRU / Attention) | Accuracy (%) | Loss
✓ ✓ ✓ (BiGRU-Attention)                             | 90.5         | 0.257
✓ ✓ –  (BiGRU)                                      | 88.6         | 0.305
– ✓ ✓  (GRU-Attention)                              | 88.9         | 0.289
– ✓ –  (GRU)                                        | 87.4         | 0.337

[Figure 12: (a) Accuracy and (b) loss curves over 200 training epochs in the ablation experiment for the BiGRU-Attention, GRU-Attention, BiGRU, and GRU models.]


Compared with single-step prediction, two-step prediction by the characteristic prediction module accumulates too much error and its goodness of fit is low, which leads to low intention prediction accuracy. However, with the continued development and improvement of multistep prediction methods, it is believed that the proposed aerial target combat intention prediction method will have better application prospects.

5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy's combat mission, the threat level between the two sides, and tactical maneuvers. The sample vectors were constructed by preprocessing the intention characteristic set data of the aerial target and encapsulating domain expert knowledge and experience in labels. We improved on the LSTM-based air target intention recognition method by proposing a GRU-based aerial target operational intention recognition model that introduces a bidirectional propagation mechanism and an attention mechanism, significantly improving accuracy compared with the LSTM, SAE, and DBP intention recognition models. To further shorten the time for air target intention recognition, we proposed a BiGRU-based air combat characteristic prediction method, and the experimental results showed that it can effectively perform single-step characteristic prediction. Combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict enemy aerial target operational intention one sampling point in advance with 89.7% accuracy. How to more accurately distinguish confusable intentions will be our next research direction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grants nos. 61703426 and 72001214), the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China (Grant no. 2019038), and the Innovation Capability Support Plan of Shaanxi, China (Grant no. 2020KJXX-065).

Table 6: Intention prediction performance measurement (I, II, III, and IV denote the BiGRU-Attention, LSTM, SAE, and DBP aerial target tactical intention recognition models, respectively).

                   | Precision (%)          | Recall (%)             | F1-score
Intention type     | I     II    III   IV   | I     II    III   IV   | I      II     III    IV
Surveillance       | 83.0  79.2  70.5  68.7 | 90.0  85.6  77.1  75.4 | 0.859  0.823  0.737  0.719
Reconnaissance     | 89.9  86.8  81.0  76.3 | 85.3  78.8  75.4  72.4 | 0.876  0.826  0.781  0.743
Feint              | 90.6  89.1  82.5  78.1 | 85.4  82.9  73.8  75.5 | 0.879  0.859  0.779  0.768
Attack             | 82.3  79.6  71.2  70.2 | 90.6  89.7  81.4  73.3 | 0.862  0.843  0.759  0.717
Penetration        | 90.3  90.0  83.4  80.8 | 89.9  89.6  84.9  83.1 | 0.901  0.897  0.841  0.819
Retreat            | 97.5  98.3  93.4  95.3 | 94.7  93.9  91.5  91.1 | 0.961  0.960  0.924  0.931
Electronic jamming | 99.5  96.4  96.6  94.4 | 95.5  96.0  89.7  90.6 | 0.975  0.962  0.931  0.925

[Figure 13: (a) Accuracy and (b) loss curves over 200 training epochs for the proposed prediction model and the LSTM, DBP, and SAE models.]



References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.
[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.
[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battlefield based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.
[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.
[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, School of National University of Defense Technology, Changsha, China, 2006.
[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.
[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.
[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.
[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.
[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 8, pp. 9–13, 2015.
[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.
[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.
[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.
[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.
[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.
[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.
[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 8, pp. 1847–1853, 2016.
[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.
[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.
[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.
[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.
[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.
[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 4, pp. 50–52, 2001.
[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 3, pp. 42–45, 2012.
[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.
[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on Adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.
[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.
[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.
[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.
[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.
[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.
[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.
[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.
[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.
[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.
[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.
[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.
[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.
[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.
[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.
[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.
[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.

Computational Intelligence and Neuroscience 13

the model is 12 consecutive frames of time-sequence characteristics, which can effectively overcome the limitation of judging by a single moment. Moreover, the model implicitly organizes, abstracts, and describes the empirical knowledge of military experts, making its knowledge representation and engineering implementation less difficult. However, it only uses historical moment information to make inferences about current information and cannot effectively use future moment information. Since there are many characteristics related to the intention of aerial targets, it is necessary to highlight the influence of key characteristics and reduce the contribution of redundant characteristics. In addition, we want to further improve the real-time performance of aerial target intention recognition.

Based on the above analysis, we propose a gated recurrent unit (GRU) based intelligent prediction model for aerial target combat intention. The model has characteristic prediction and intention recognition modules. The intention recognition module introduces a bidirectional propagation mechanism, attention mechanism, and particle swarm optimization (PSO) algorithm based on a GRU to build an intelligent intention recognition model. With similar performance to that of LSTM, a GRU has less structural complexity and requires less time for recognition. Compared with a GRU, a bidirectional gated recurrent unit (BiGRU) can use not only the information of historical moments but also that of future moments to make comprehensive judgments. PSO can find the optimal parameters of a BiGRU network [14], and the attention mechanism layer can further highlight the key information affecting the intention and improve the accuracy of intention recognition. To further shorten the time used for intention recognition, we build a characteristic prediction module that uses the BiGRU network to analyze the collected characteristics and predict future aerial target characteristics, which are input to the intention recognition module to establish the mapping relationship between future aerial target characteristics and the target's operational intention types. Experiments show that the proposed model can predict the enemy aerial target operational intention one sampling point in advance, and the accuracy is increased by 2.9% compared with LSTM.

The remaining sections of this paper are arranged as follows. Section 2 introduces the definition of intention recognition of aerial targets and how to select intention categories and characteristic types. In Section 3, the framework of the proposed model is described in detail, including the intention recognition module and the characteristic prediction module. Experimental results are analyzed in Section 4 to show the performance of the new method. This paper is concluded in the last section.

2. Description of Aerial Target Operational Intention Recognition Problem

Intention recognition is important for command and control in modern war. The operational intention of an aerial target can be inferred based on real-time data from multiple sensors in a dynamic and complex battlefield environment. To enhance the reliability of intention recognition, a priori knowledge and the experience of experts in the relevant operational field [12] should also be taken into consideration. The process of intention recognition is shown in Figure 1.

Aerial target intention recognition is a pattern recognition problem that can be described as a mapping of intention recognition characteristics to aerial target combat intention types. Define the vector V(t) as the real-time air combat characteristic information at time t and P = {p1, p2, ..., pn} as the aerial target combat intention space set. Due to the complexity, high confrontation, and deceptive nature of the actual air combat environment, relying on the real-time air combat characteristic information detected at a single time can be somewhat deceptive and one-sided. Inferring the combat intention of an enemy aircraft from air combat characteristic information at successive times is far more accurate and scientific than relying on information at a single time [1]. The mapping from the temporal characteristic set Vm to the operational intention space set P is determined by defining Vm as the temporal characteristic set for m consecutive moments from t1 to tm:

$$P = f(V_m) = f\big(V(t_1), V(t_2), \ldots, V(t_m)\big). \qquad (1)$$

It can be seen that accurate recognition of the operational intention of aerial targets requires a combination of professional military knowledge and operational experience with complex thinking activities such as extraction, comparison, analysis, association, and inference of key air warfare information. It is difficult to establish the mapping relationship between Vm and P with a single formula [13]. We implicitly establish the mapping relationship between the characteristic set and the operational intention by training a bidirectional gated recurrent units with attention mechanism (BiGRU-Attention) network structure using an aerial target operational intention recognition characteristic set.

2.1. Description of Space Set of Aerial Target Operational Intention. The target operational intention space set varies for different operational forms, enemy entities, and desired contexts. Therefore, the operational intention space set of enemy targets must be defined based on the corresponding operational context attributes of the enemy targets and their possible operational tasks. For example, a target intention space set was established as {avoidance, patrol, attack} based on the potential threat of underwater targets [15]; an operational intention space set was established as {retreat, cover, attack, reconnaissance} for a single group of enemy maritime ship formations [16]; and an operational intention space set of aerial targets was defined as {reconnaissance, surveillance, attack, penetration} [17]. Taking UAV close-range engagement as the research object, we establish the combat intention space set of enemy targets as seven types of intention: {feint, surveillance, electronic interference, penetration, attack, retreat, reconnaissance}.


After determining the space set of enemy operational intention, the key to applying the proposed intelligent recognition model is to convert human cognitive models into labels that can be trained by intelligent models and which correspond to the types of intention in the operational intention space set [18]. The cognitive experience of experts in air warfare can be encapsulated in labels to train the model. A set of label values {0, 1, 2, 3, 4, 5, 6} is set for the intention types in the established combat intention space set. Figure 2 shows the corresponding combat intention type coding and model resolution mechanisms. For example, if the intention prediction result is 5, then the combat intention of the enemy target against our target is retreat. Therefore, this knowledge encapsulation and model parsing can clearly and easily describe human empirical knowledge and facilitate model training.
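As a small illustration of this coding and parsing scheme (the label assignment follows Figure 2; the dictionary and function names below are ours, not the authors' implementation), the conversion between intention types and trainable labels could be sketched as:

```python
# Intention-type coding/parsing sketch following the label assignment of Figure 2.
INTENTION_TO_LABEL = {
    "surveillance": 0,
    "reconnaissance": 1,
    "feint": 2,
    "attack": 3,
    "penetration": 4,
    "retreat": 5,
    "electronic interference": 6,
}
LABEL_TO_INTENTION = {v: k for k, v in INTENTION_TO_LABEL.items()}

def encode(intention: str) -> int:
    """Encode an expert-annotated intention type as a trainable label."""
    return INTENTION_TO_LABEL[intention]

def parse(label: int) -> str:
    """Parse a model output label back into a combat intention type."""
    return LABEL_TO_INTENTION[label]

assert parse(5) == "retreat"  # the example given in the text
```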

2.2. Selection of Aerial Target Combat Intention Characteristics. The enemy aerial target's operational intention is highly correlated with its operational mission, the mutual threat level between the two sides, and its tactical maneuvers. According to these three aspects and the requirement of easy acquisition by radar, the characteristics that are closely related to the combat intention of the air target are selected.

Analyzed from the perspective of operational tasks, when an enemy UAV performs a certain task, the enemy aircraft characteristics must meet certain conditions. For example, defense penetration tasks are divided into high-altitude penetration and low-altitude penetration, with corresponding altitudes of 10–11 km and 100–1000 m, respectively, and the flight speed of a fighter aircraft when receiving attacks is generally 735–1470 km/h [12]. There is also a connection between the aerial target's radar signal status and its combat mission: a fighter usually turns on air-to-air radar and electronic jamming in air combat, and marine radar and air-to-air radar on a reconnaissance mission [19].

Many factors affect the threat level between the two targets. For convenient experimental data collection, we consider the speed, flight acceleration, distance, flight altitude, heading angle, and azimuth angle of both the enemy's and our warplanes [20], as shown in Figure 3.

The air combat capability factor [21] also affects the target threat level. For fighter aircraft, a single-aircraft air combat capability threat function is constructed as

$$C = \left[\ln \varepsilon_1 + \ln(\varepsilon_2 + 1) + \ln\!\left(\sum \varepsilon_3 + 1\right)\right]\varepsilon_4\,\varepsilon_5\,\varepsilon_6\,\varepsilon_7, \qquad (2)$$

where ε1–ε7 are parameters of warplane maneuverability, airborne weapon performance, airborne detection capability, warplane operational performance, warplane survivability, warplane operational range, and electronic information countermeasure capability, respectively. The air combat capability factors of the various warplanes of both sides can be calculated through this formula for a certain period of time, saved in the database, and updated at any time according to current information [22].
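For concreteness, a small sketch of equation (2) follows; the seven ε values are assumed to be pre-quantified scores drawn from the capability database, and treating ε3 as a single aggregated detection score (so the summation reduces to one term) is our simplification:

```python
import math

def air_combat_capability(eps: list[float]) -> float:
    """Air combat capability factor of a single aircraft, per equation (2).

    eps = [e1, ..., e7]: maneuverability, airborne weapon performance,
    airborne detection capability, operational performance, survivability,
    operational range, and electronic countermeasure capability.
    e3 is assumed here to be a single aggregated detection score.
    """
    e1, e2, e3, e4, e5, e6, e7 = eps
    return (math.log(e1) + math.log(e2 + 1) + math.log(e3 + 1)) * e4 * e5 * e6 * e7

# Example with arbitrary illustrative scores:
print(air_combat_capability([1.2, 0.9, 1.5, 1.0, 0.8, 0.7, 0.9]))
```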

The realization of the operational intention of an aerial target is closely related to the maneuvers of the aircraft. There are two kinds of maneuver libraries in common use: the typical tactical maneuver library and the basic maneuver library. Since this paper studies intention recognition based on temporal characteristics, target combat intention recognition is carried out using 12 consecutive momentary characteristics as one sample. However, the control algorithm of a typical tactical action library is complicated to solve, and the exit and conversion time nodes of a maneuver are difficult to determine. The traditional basic maneuver library [23] includes only seven maneuvers, and the combined maneuvers are not rich enough; they all adopt the limit maneuver, which is inconsistent with the actual air combat situation. We adopt an improved basic maneuver library [24] that includes 11 maneuvers: left turn, right turn, accelerated forward flight, even-speed forward flight, decelerated forward flight, climb, left climb, right climb, dive, left dive, and right dive.

In summary, we use an aerial target combat intention characteristic set consisting of a 16-dimensional characteristic vector: enemy aircraft flight altitude, our aircraft altitude, enemy aircraft flight speed, our aircraft flight speed, enemy aircraft acceleration, our aircraft acceleration, enemy aircraft air combat capability factor, our aircraft air combat capability factor, heading angle, the distance between the two sides, azimuth angle, air-to-air radar status, marine radar status, maneuver type, jamming status, and jammed status, which can be divided into numeric and nonnumeric characteristics, as shown in Figure 4.

3. Model Framework

The proposed aerial target operational intention prediction model consists of a characteristic prediction module and an intention recognition module, as shown in Figure 5. The characteristic prediction module is based on the BiGRU network. The historical aerial target operational intention recognition characteristic set Vm is used as input; a linear (default) activation function of the fully connected layer is used to obtain the prediction characteristic set Wm; and these two sets are formed into the temporal characteristic data and input to the intention recognition module constructed by the BiGRU-Attention [25] network. The probability of each intention type is calculated using the softmax function, and the maximum-probability intention type label is output as the aerial target combat intention recognition result. The characteristic prediction module and intention recognition module are described below.

Figure 1: Hierarchical representation and reasoning process of intention.

3.1. Characteristic Prediction Module. The BiGRU characteristic prediction module has three parts: the aerial target combat intention characteristic set input layer, the hidden layer, and the output layer. Reference [26] has confirmed that predicting each feature independently gives higher accuracy than predicting all features jointly. Thus, Input = (Number of samples, 8, 1), where 8 is the time step and 1 is the characteristic dimension, and Output = 1, that is, the output characteristic dimension is 1. A detailed description is given below.
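A minimal Keras sketch of this module under these input/output shapes is shown below; the single 18-node BiGRU hidden layer and the Adam learning rate anticipate the network-structure selection reported in Section 4.2.1, and the exact layer arrangement is our assumption rather than the authors' code:

```python
# Sketch of the BiGRU characteristic prediction module (one feature at a time).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, GRU, Dense
from tensorflow.keras.optimizers import Adam

def build_feature_predictor(time_steps: int = 8) -> Sequential:
    model = Sequential([
        # Input: (batch, 8, 1) -- 8 past samples of a single characteristic.
        Bidirectional(GRU(18), input_shape=(time_steps, 1)),
        # Output layer: linear activation, one predicted characteristic value.
        Dense(1, activation="linear"),
    ])
    model.compile(optimizer=Adam(learning_rate=0.001), loss="mse")
    return model

model = build_feature_predictor()
model.summary()
```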

3.1.1. Input Layer. The input layer preprocesses the collected aerial target characteristic dataset into a vector form that can be accepted and processed by the BiGRU layer, as follows:

(1) Read the dataset and clean the data.
(2) Code the nonnumeric data of jamming state, jammed state, air-to-air radar state, and marine radar state as 0 (off) or 1 (on). Miller's nine-level quantization theory [27] is used to quantize the maneuver-type data.

(3) Normalize the encoded nonnumeric data together with the numeric data so as to improve network convergence speed and accuracy and prevent gradient explosion. We normalize 11 types of numeric data and five types of encoded nonnumeric data. For the ith dimensional characteristic data Gi = [g_i1, g_i2, ..., g_ix, ..., g_in], i = 1, 2, ..., 16, where n is the total number of data points, g'_ix is the result of normalizing the xth original value of the ith dimensional characteristic to [0, 1], that is,

$$g'_{ix} = \frac{g_{ix} - \min G_i}{\max G_i - \min G_i}, \qquad (3)$$

where max Gi and min Gi are the maximum and minimum values, respectively, of Gi.

(4) Divide the data into training and test sets at an 8:2 ratio.

(5) Construct training and test samples as follows. Using the method of predicting a single characteristic in turn, take the prediction of the distance between the enemy and ourselves as an example. If the distance data of moments 1–8 are used to predict the distance D9 at the ninth moment, the function mapping relationship is

$$D_9 = f(d_1, d_2, \ldots, d_8), \qquad (4)$$

where d_i, i ∈ (1, 2, ..., 8), is the distance at the ith moment. d1–d8 are selected as the first set of input data, labeled as d9; d2–d9 are selected as the second set of input data, labeled as d10; and so on. The training sample input data and training sample labels generated this way are shown below; the test data are constructed in the same way as the training sample data [28]:

$$\begin{bmatrix} d_1 & d_2 & \cdots & d_m \\ d_2 & d_3 & \cdots & d_{m+1} \\ \vdots & \vdots & \ddots & \vdots \\ d_8 & d_9 & \cdots & d_{m+7} \end{bmatrix}, \qquad \left[\, d_9 \;\; d_{10} \;\; \cdots \;\; d_{m+8} \,\right]. \qquad (5)$$
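A short sketch of this preprocessing, combining the min-max normalization of equation (3) with the sliding-window construction of equations (4) and (5); the helper names are ours and the input series is purely illustrative:

```python
import numpy as np

def min_max_normalize(g: np.ndarray) -> np.ndarray:
    """Equation (3): scale one characteristic series to [0, 1]."""
    return (g - g.min()) / (g.max() - g.min())

def sliding_windows(series: np.ndarray, time_step: int = 8):
    """Equations (4)-(5): 8-step input windows, next value as label."""
    x, y = [], []
    for i in range(len(series) - time_step):
        x.append(series[i:i + time_step])
        y.append(series[i + time_step])
    # Shape (num_samples, 8, 1) matches the module's input layer.
    return np.array(x)[..., np.newaxis], np.array(y)

distance = min_max_normalize(np.random.rand(100))  # illustrative data
x_train, y_train = sliding_windows(distance)
print(x_train.shape, y_train.shape)  # (92, 8, 1) (92,)
```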

Figure 2: Schematic diagram of combat intention coding and parsing (surveillance = 0, reconnaissance = 1, feint = 2, attack = 3, penetration = 4, retreat = 5, electronic interference = 6).

Figure 3: Relative geometric position in air combat. H1 and H2 are the flight altitudes of the enemy and ours; V1 and V2 are the flight speeds of the enemy and ours; D is the distance between the two parties; ψ is the heading angle; and φ is the azimuth angle.


The collected aerial target combat intention characteristic set Vm is now in a characteristic vector form that can be directly accepted and processed by the hidden layer.

3.1.2. Hidden Layer. As a variant of the recurrent neural network (RNN), a gated recurrent unit (GRU) [29] has a similar recursive structure and a memory function for processing time-series data. A GRU can alleviate the gradient vanishing and explosion problems that may occur during RNN training, thus addressing the long-term memory problem. Another RNN variant, the long short-term memory (LSTM) network [30], performs similarly, but a GRU has a simpler structure and can reduce computation and improve training efficiency.

Figure 6 shows the internal structure of the GRU. Its two inputs are the output state h_{t-1} at the previous moment and the input sequence value x_t at the current moment, and its output is the state h_t at the current moment. It updates the model state through two gates: the reset gate r_t controls the degree of forgetting of the historical state information, so that the network can discard unimportant information, and the update gate z_t controls the weight of the previous moment's state information brought into the current state, helping the network to remember long-term information [31]. These variables are related as follows:

$$\begin{cases} r_t = \sigma(W_r x_t + U_r h_{t-1}), \\ z_t = \sigma(W_z x_t + U_z h_{t-1}), \\ \tilde{h}_t = \tanh\!\big(W_{\tilde{h}} x_t + U_{\tilde{h}}\,(r_t \odot h_{t-1})\big), \\ h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t, \end{cases} \qquad (6)$$

where σ is the sigmoid activation function, which transforms the intermediate states into the range [0, 1]; h_{t-1} and h_t are the output states at times t−1 and t, respectively; x_t is the input sequence value at time t; h̃_t is the candidate output state; W_r, W_z, W_h̃, U_r, U_z, and U_h̃ are the weight coefficients corresponding to each component; tanh is the hyperbolic tangent function; and ⊙ is the Hadamard product.

Figure 4: Feature set of the tactical intention of an aerial target (numeric data: enemy/our aircraft altitude, speed, acceleration, and air combat capability factor; heading angle; azimuth angle; distance between the two sides. Nonnumeric data: air-to-air radar status, marine radar status, jamming status, jammed status, maneuver type).

Figure 5: Intention prediction model framework.
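As a concrete illustration of equation (6), a minimal NumPy sketch of one GRU step is given below; the weights are randomly initialized purely for illustration (in the network they are learned), and bias terms are omitted as in equation (6):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU update implementing equation (6)."""
    Wr, Ur, Wz, Uz, Wh, Uh = params
    r_t = sigmoid(Wr @ x_t + Ur @ h_prev)             # reset gate
    z_t = sigmoid(Wz @ x_t + Uz @ h_prev)             # update gate
    h_cand = np.tanh(Wh @ x_t + Uh @ (r_t * h_prev))  # candidate state
    return (1.0 - z_t) * h_prev + z_t * h_cand        # new hidden state

# Illustrative dimensions: 1-D input (a single characteristic), 18 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 1, 18
params = [rng.standard_normal(s) * 0.1
          for s in [(n_hid, n_in), (n_hid, n_hid)] * 3]
h = np.zeros(n_hid)
for x in rng.standard_normal((8, n_in)):  # an 8-step input window
    h = gru_step(x, h, params)
print(h.shape)  # (18,)
```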

The traditional GRU structure propagates unidirectionally along the sequence transmission direction, so it acquires only historical information before the current moment and ignores future information. BiGRU, as shown in Figure 7, includes a forward and a backward GRU, which can capture the characteristics of the information both before and after the current moment [32]. In Figure 7, GRU1 is the forward GRU and GRU2 is the backward GRU. The output state h_t of BiGRU at moment t is obtained by combining the forward output state, determined by the input x_t at moment t and the output state of the forward GRU at moment t−1, with the backward output state, determined by the input x_t at moment t and the output state of the backward GRU at moment t+1.

3.1.3. Output Layer. The output h_t of the BiGRU network in the hidden layer is fed to the fully connected layer in the output layer, and the final predicted characteristic values are output using a linear activation function.

3.2. Intention Recognition Module. The BiGRU-Attention intention recognition module has input, hidden, and output layers. The hidden layer contains BiGRU and attention mechanism layers. In the network, Input = (Number of samples, 12, 16) and Output = 7, where 12 denotes the time step, 16 denotes the number of characteristic dimensions, and 7 denotes the total number of intention types. A detailed description is given below.

3.2.1. Input Layer. The input layer of the characteristic prediction module has already cleaned and normalized the collected aerial target operational intention characteristics, so the input layer of the intention recognition module is mainly responsible for constructing this module's sample data. If the characteristic data of moments 1–12 are used to predict the intention during that time, the function mapping relationship is

$$Q_1 = f(v_1, v_2, \ldots, v_{11}, w_{12}), \qquad (7)$$

where Q1 denotes the predicted intention type for time periods 1–12, v_i, i ∈ (1, 2, ..., 11), denotes the historical characteristic data at moment i, and w12 denotes the characteristic data predicted by the characteristic prediction module for moment 12. (v1, v2, ..., v11, v12) is the first set of input data, labeled as the intention type q1 corresponding to time periods 1–12; (v2, v3, ..., v12, v13) is the second set of input data, labeled as the intention type q2 corresponding to time periods 2–13; and so on. The training sample input data and labels are composed as shown below. The test sample data are constructed similarly, except that the test set replaces the characteristic v_t at the last moment of each sample with the characteristic w_t predicted by the characteristic prediction module; that is, the input data are (v_i, v_{i+1}, ..., v_{i+10}, w_{i+11}) and the label is the intention type q_i corresponding to the time period i to i+11:

$$\begin{bmatrix} v_1 & v_2 & \cdots & v_m \\ v_2 & v_3 & \cdots & v_{m+1} \\ \vdots & \vdots & \ddots & \vdots \\ v_{11} & v_{12} & \cdots & v_{m+10} \\ v_{12} & v_{13} & \cdots & v_{m+11} \end{bmatrix}, \qquad \left[\, q_1 \;\; q_2 \;\; \cdots \;\; q_m \,\right]. \qquad (8)$$

After one-hot encoding of the intention labels, they are sent to the hidden layer together with the constructed sample data.

3.2.2. Hidden Layer. The hidden layer contains the BiGRU network layer and the attention mechanism layer. The BiGRU layer has already been described; the attention mechanism layer is described below. The attention mechanism [33–35] operates similarly to the human brain by focusing on the local content of an object according to its purpose. It highlights the characteristics that account for a greater proportion of the prediction results by calculating the weights of the characteristic vectors output from the BiGRU network at different moments. In aerial target combat intention recognition, the neural network uses the attention mechanism to assign weight coefficients during training so as to focus on key characteristics: it learns the importance of each characteristic and then assigns the corresponding weight coefficient according to that importance. For example, if an enemy aircraft executes a penetration, its flight altitude and heading angle will be assigned higher weights. The structure of the attention mechanism model is shown in Figure 8.

The characteristic vector h_t output by the BiGRU network at moment t is input to the attention mechanism layer to obtain the initial state vector S_t. The intermediate attention vector e_t is learned by equation (9); the attention weights are converted into probabilities by equation (10) (i.e., the softmax function) to obtain the weight probability vector α_t; and the final state vector Y is obtained by equation (11) [36]. The formulas are as follows:

Figure 6: GRU structure (reset gate r_t, update gate z_t, candidate state h̃_t).


$$e_t = \tanh(W_w S_t + b_w), \qquad (9)$$

$$\alpha_t = \frac{\exp(e_t u_w)}{\sum_{i=1}^{t} \exp(e_i u_w)}, \qquad (10)$$

$$Y = \sum_{t=1}^{n} \alpha_t S_t, \qquad (11)$$

where W_w is the weight coefficient matrix, b_w is the bias coefficient matrix, and u_w is a vector that is initialized randomly and continuously learned during training.

3.2.3. Output Layer. The output of the attention mechanism layer is fed into the multiclass softmax activation function, which outputs the aerial target combat intention label with the highest probability. The enemy aerial target combat intention can then be recognized by parsing the label as in Figure 2. The output prediction label is

$$y_k = \operatorname{softmax}(WY + b), \qquad (12)$$

where W is the weight coefficient matrix learned during training, b is the corresponding bias matrix, and y_k is the predicted label of the output.
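To make the module concrete, a minimal Keras sketch is shown below; the custom layer implements equations (9)–(11), the three hidden BiGRU layer sizes, dropout, and learning rate follow our reading of the PSO-selected values in Table 2, and all class and function names are illustrative rather than the authors' implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

class AttentionLayer(layers.Layer):
    """Temporal attention over the BiGRU output states, per equations (9)-(11)."""
    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W_w = self.add_weight(name="W_w", shape=(d, d), initializer="glorot_uniform")
        self.b_w = self.add_weight(name="b_w", shape=(d,), initializer="zeros")
        self.u_w = self.add_weight(name="u_w", shape=(d, 1), initializer="glorot_uniform")

    def call(self, s):  # s: (batch, time, d) state vectors S_t
        e = tf.tanh(tf.einsum("btd,de->bte", s, self.W_w) + self.b_w)          # eq. (9)
        alpha = tf.nn.softmax(tf.einsum("btd,dk->btk", e, self.u_w), axis=1)   # eq. (10)
        return tf.reduce_sum(alpha * s, axis=1)                                # eq. (11)

def build_intention_recognizer(time_steps=12, n_features=16, n_intentions=7):
    inputs = layers.Input(shape=(time_steps, n_features))
    x = inputs
    for units in (334, 10, 338):  # three BiGRU hidden layers (our reading of Table 2)
        x = layers.Bidirectional(layers.GRU(units, return_sequences=True))(x)
    x = AttentionLayer()(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(n_intentions, activation="softmax")(x)              # eq. (12)
    return models.Model(inputs, outputs)

model = build_intention_recognizer()
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1.4e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```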

4. Experimental Analysis

4.1. Experimental Dataset and Environment. This experiment took an airspace UAV close-range engagement as the research background, and the experimental data were extracted from a combat simulation system. Using the characteristic processing and coding described in this paper, the state characteristics of 12 consecutive frames of the enemy aerial target were collected for each sample [13], and 10,000 samples (8,000 training and 2,000 testing) were constructed, each including 16 pieces of information such as flight speed, flight altitude, azimuth, and jamming status as characteristics. Due to the large amount of data in the sample set, intention-type labels were generated by computer according to rules, and experts corrected samples with ambiguous intention classification. The dataset labels included the seven intention types, and the combat intention data consisted of 11.8% surveillance, 14.65% reconnaissance, 18.15% feint, 18% attack, 13.9% penetration, 12.3% retreat, and 11.2% electronic jamming.

We used Python 3.8 with the Keras 2.4.3 deep learning framework, accelerated on a Quadro RTX 5000 (PCIe/SSE2) GPU with CUDA 11.0, on an x86-64 CentOS 7 system with an Intel Xeon Silver 4110 CPU at 2.10 GHz and 64 GB RAM.

4.2. Experimental Analysis of the Characteristic Prediction Module. The task of the characteristic prediction module is to predict the future characteristics of enemy aerial targets, which are later input to the intention recognition module to predict enemy aerial target intention. The average of the mean square errors (hereafter referred to as "error") over the 16-dimensional features was used as the evaluation index.

4.2.1. Network Structure Selection. The network structure mainly involves setting the time step, the number of hidden layers, and the number of hidden layer nodes. Since increasing the number of hidden layers rapidly increases the time cost, and considering the strict timeliness requirement of air warfare, only single and double hidden layer structures were considered. The results of the time step and hidden layer node number selection experiments are shown in Figure 9.

From Figure 9, it can be seen that the mean prediction mean square error was smallest when the time step was 8 and when the number of nodes in the single hidden layer was 18. Therefore, a network structure with a time step of 8, a single hidden layer, and 18 network nodes was chosen. The optimizer was Adam [37], the initial learning rate was 0.001, the decay rate was 0.9, the number of training epochs was 100, and the batch size was 512.
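Under these selections, training the characteristic prediction module might look roughly as follows; the exponential-decay schedule is our interpretation of the stated decay rate of 0.9 (the decay interval is not given), and the data arrays are placeholders with the stated shapes:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, GRU, Dense
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

# Chosen structure: time step 8, one hidden BiGRU layer with 18 nodes.
model = Sequential([Bidirectional(GRU(18), input_shape=(8, 1)),
                    Dense(1, activation="linear")])

# Adam with initial learning rate 0.001 and decay rate 0.9; the decay
# interval (decay_steps) is our assumption, as the paper does not state it.
schedule = ExponentialDecay(initial_learning_rate=0.001,
                            decay_steps=1000, decay_rate=0.9)
model.compile(optimizer=Adam(learning_rate=schedule), loss="mse")

x_train = np.random.rand(8000, 8, 1)  # placeholder data with the stated shapes
y_train = np.random.rand(8000, 1)
model.fit(x_train, y_train, epochs=100, batch_size=512, verbose=0)
```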

4.2.2. Comparison Experiment. To verify that the proposed characteristic prediction module is effective and efficient, it was compared with the RNN [38], LSTM [39], GRU [40], and BiLSTM networks in terms of both real-time performance and mean square error, and a segment of the predicted trajectory of the distance characteristic between the two sides was selected for comparison with the actual trajectory. The results are shown in Table 1 and Figure 10.

Figure 7: BiGRU structure.

Figure 8: Attention mechanism structure.

From Table 1, we can see that BiLSTM had the longest single-step prediction time, 0.311 ms, and RNN had the shortest single-step prediction time but the largest prediction error, reaching 8.14 × 10⁻⁵. BiGRU had the smallest prediction error, and its single-step prediction time decreased by about one-third compared with BiLSTM, so it can be concluded that BiGRU has less internal structural complexity [41] with similar or even better performance than BiLSTM. The prediction error of BiGRU was about half that of GRU, from which it can be concluded that the bidirectional propagation mechanism utilizes the information of future moments more effectively than one-way propagation. Although BiGRU had a longer single-step prediction time than GRU, a single-step prediction time of 0.202 ms is sufficient to provide timely prediction information when the sampling interval is 0.5 s. It can be seen from Figure 10 that the characteristic trajectory predicted by RNN had a low fit to the actual trajectory, while those predicted by the other four methods had a high fit, which is consistent with the error results in Table 1.

4.3. Experimental Analysis of the Intention Recognition Module. The data used in this experiment did not contain future characteristics predicted by the characteristic prediction module; that is, the 12 frames of temporal characteristics in each sample were all historical, with no added prediction characteristics. The purpose of the experiment was to compare our method with those proposed in other literature. The intention prediction experiment is described in Section 4.4.

4.3.1. Network Structure Selection. The optimal network structure of the BiGRU was selected using PSO [42] over four parameters: the number of hidden layers, the number of hidden layer nodes, the batch size, and the learning rate. The upper and lower limits were set as [4, 500, 1000, 0.005] and [1, 10, 100, 0.0001], respectively. The intention recognition error rate was set as the fitness function, the maximum number of iterations was 20, and the population size was 30. The experimental results and the settings of other key hyperparameters are shown in Table 2.
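A generic PSO loop of the kind used for this search might look as follows; the bounds, swarm size, and iteration count follow the text, while the inertia and acceleration coefficients and the recognition_error_rate stand-in (which in practice would train a candidate BiGRU-Attention model and return its validation error rate) are our assumptions:

```python
import numpy as np

# Bounds: [hidden layers, hidden nodes, batch size, learning rate]
LOWER = np.array([1, 10, 100, 0.0001])
UPPER = np.array([4, 500, 1000, 0.005])

def recognition_error_rate(params: np.ndarray) -> float:
    """Hypothetical fitness: train a BiGRU-Attention model with these
    hyperparameters and return 1 - validation accuracy."""
    layers, nodes, batch, lr = params
    return float(np.random.rand())  # placeholder for the real training run

def pso(n_particles=30, n_iters=20, w=0.7, c1=1.5, c2=1.5):
    # w, c1, c2 (inertia and acceleration coefficients) are assumed values.
    rng = np.random.default_rng(0)
    pos = rng.uniform(LOWER, UPPER, size=(n_particles, 4))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([recognition_error_rate(p) for p in pos])
    gbest = pbest[pbest_fit.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 4))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, LOWER, UPPER)
        fit = np.array([recognition_error_rate(p) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmin()].copy()
    return gbest

print(pso())  # e.g. [n_layers, n_nodes, batch_size, learning_rate]
```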

4.3.2. Analysis of Intention Recognition Results. The BiGRU-Attention intention recognition module was trained, and the test samples were then input into the module to obtain the recognition accuracy of the proposed intention recognition network model. To further observe the relationships between the recognized intentions, a confusion matrix of the intention recognition results on the test samples was produced; the diagonal indicates the number of correctly recognized samples. The results are shown in Figure 11.

It can be seen from Figure 11 that the BiGRU-Attention intention recognition model had high recognition precision and recall for all seven intentions. In particular, the electronic interference intention recognition precision and recall reached 100% and 96%, respectively. A few cases of mutual recognition errors occurred between the surveillance and reconnaissance intentions and between the feint and attack intentions, which should be attributed to the high characteristic similarity and deception between these intention pairs, causing the trained BiGRU-Attention model to assign similar weights to the two intentions in each pair. As a result, the attention layer failed to accurately sense the weight difference between the two intentions, leading to a small number of incorrectly recognized intentions, which accords with the actual situation.

4.3.3. Comparison Experiments. In this experiment, the highest test-set accuracy during 200 training epochs was taken as the accuracy of each intention recognition model, and the corresponding loss value as its loss. The BiGRU-Attention intention recognition model was compared with the LSTM-based tactical intention recognition model for battlefield enemy targets [13], the aerial target combat intention recognition model using a DBP neural network optimized with the Adam algorithm and ReLU function [12], and the stacked autoencoder (SAE) tactical intention intelligent recognition model [15]. The parameters of the comparison experiments were set as shown in Table 3, and the experimental results are shown in Table 4.

As can be seen from Table 4, the BiGRU-Attention intention recognition model was superior to the other three models in terms of both accuracy and loss value, with an improvement in accuracy of 2.9 percentage points over LSTM and roughly 10 percentage points over SAE and DBP, thus verifying its effectiveness for aerial target combat intention recognition. Further analysis shows that BiGRU-Attention and LSTM, as temporal networks based on recurrent neural networks, were more applicable to aerial target combat intention recognition than the other two models, further indicating that it is more scientific to infer aerial target combat intention from temporal characteristic changes.

4.3.4. Ablation Experiments. Although the effectiveness of the BiGRU-Attention intention recognition model for operational intention recognition of aerial targets has been validated in the comparison experiments, those comparisons are not against hybrid models of the same type. The results of ablation experiments on the same dataset are therefore shown in Table 5 and Figure 12.

From Table 5, the BiGRU-Attention intention recognition model had an accuracy 2.7, 1.9, and 1.6 percentage points higher than that of the GRU, BiGRU, and GRU-Attention intention recognition models, respectively. The BiGRU-Attention model also had lower loss values than the other models. The BiGRU and GRU-Attention models have similar accuracy and loss values, and both are better than the GRU model. From Figure 12, we can see that the accuracy of the four models increased and the loss decreased with the number of training epochs; the accuracy and loss of the BiGRU-Attention and BiGRU models converged at around 50 epochs and those of the other two models at about 70 epochs, so the introduction of bidirectional propagation seems to have effectively improved model convergence and accelerated learning. The curves of the BiGRU-Attention model were significantly better than those of the other three. The accuracy and loss curves of the BiGRU and GRU-Attention models after convergence are similar, and both are better than those of the GRU model, which shows that the basic GRU model was significantly improved by introducing the attention mechanism and the bidirectional propagation mechanism.

4.4. Experimental Analysis of Intention Prediction. This experiment combined future characteristic states predicted by the characteristic prediction module with historical characteristic states into 12 frames of temporal characteristics; that is, the first 11 frames were historical characteristics and the 12th was the predicted future characteristics, and the sample data were constructed as described in Section 3.1.
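A small sketch of how such a 12-frame prediction sample could be assembled; predict_next_value is a hypothetical wrapper around the trained per-characteristic BiGRU predictor, shown here as a simple persistence placeholder:

```python
import numpy as np

def predict_next_value(history_8: np.ndarray) -> float:
    """Hypothetical wrapper: feed the last 8 values of one characteristic
    to the trained BiGRU predictor and return the next value."""
    return float(history_8[-1])  # persistence placeholder for illustration

def build_prediction_sample(history: np.ndarray) -> np.ndarray:
    """history: (>=11, 16) recorded frames; returns a (12, 16) sample whose
    12th frame is predicted, as used in the intention prediction experiment."""
    last_11 = history[-11:]                                 # historical frames
    predicted = np.array([predict_next_value(history[-8:, j])
                          for j in range(history.shape[1])])
    return np.vstack([last_11, predicted[np.newaxis, :]])   # (12, 16)

sample = build_prediction_sample(np.random.rand(20, 16))
print(sample.shape)  # (12, 16) -- ready for the BiGRU-Attention recognizer
```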

Figure 9: (a) The effect of different time steps on the prediction error. (b) The effect of the number of hidden layer nodes (one or two hidden layers) on the prediction error.

Table 1: Experimental results of five models.

Method    Error (10⁻⁵)    Prediction time (ms)
RNN       8.14            0.089
LSTM      2.75            0.133
GRU       2.46            0.110
BiLSTM    1.29            0.311
BiGRU     1.16            0.202

Figure 10: Characteristic prediction trajectories (distance versus moment) of BiGRU, BiLSTM, GRU, LSTM, and RNN compared with the true trajectory.

Table 2: Experimental parameters.

Parameter        Value
Loss function    Categorical_crossentropy
Optimizer        Adam
Dropout          0.5
Hidden layers    3
Hidden nodes     334, 10, 338
Batch size       100
Learning rate    0.0014
Epochs           200

Figure 11: Confusion matrix of intention recognition on the test set (rows/columns: surveillance, reconnaissance, feint, attack, penetration, retreat, electronic interference; the diagonal gives the number of correctly recognized samples).


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition method of Section 4.3, which has no prediction effect, and the model evaluation indicators of precision, recall, and F1-score were used to evaluate the model, with results as shown in Table 6 and Figure 13.

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for surveillance and attack intention recognition. On analysis, the air combat characteristics of the first two intentions are more distinctive, whereas the characteristics of the latter two intentions are more similar to those of the reconnaissance and feint intentions, which causes mutual prediction errors and results in relatively low prediction accuracy for those intentions. Overall, the accuracy of the proposed intention prediction method could reach 89.7%, which is a significant improvement over LSTM, DBP, and SAE, and the prediction could be produced one sampling interval (0.5 s) earlier.

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results, with an accuracy of only about 70%. Compared with single-step prediction, two-step prediction by the characteristic prediction module accumulates too much error and has a low goodness of fit, which leads to low intention prediction accuracy. However, with the continued development and improvement of multistep prediction methods, it is believed that the proposed aerial target combat intention prediction method can have better application prospects.

Table 3: Comparison of model parameter settings.

Model   Number of hidden layers   Hidden nodes          Learning rate   Optimizer
SAE     3                         256, 128, 128         0.02            SGD
LSTM    3                         256, 128, 128         0.001           Adam
DBP     4                         256, 512, 512, 256    0.01            Adam

Table 4: Comparison of different intention recognition models.

Model             Accuracy (%)   Loss
BiGRU-Attention   90.5           0.257
LSTM              87.6           0.346
SAE               81.3           0.473
DBP               79.3           0.492

Table 5: Results of the ablation experiment.

Bidirectional   GRU   Attention   Accuracy (%)   Loss
✓               ✓     ✓           90.5           0.257
✓               ✓     –           88.6           0.305
–               ✓     ✓           88.9           0.289
–               ✓     –           87.4           0.337

Figure 12: (a) Changes in accuracy in the ablation experiment. (b) Changes in loss in the ablation experiment (BiGRU-Attention, GRU-Attention, BiGRU, GRU).

5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy's combat mission, the threat level between the two sides, and tactical maneuvers. The sample vectors were constructed by preprocessing the intention feature set data of the aerial target and encapsulating domain expert knowledge and experience into labels. We improved on the LSTM-based air target intention recognition method, proposed a GRU-based aerial target operational intention recognition model, and introduced a bidirectional propagation mechanism and attention mechanism to significantly improve accuracy compared with the LSTM, SAE, and DBP intention recognition models. To further shorten the time of air target intention recognition, we proposed a BiGRU-based air combat characteristic prediction method, and experimental results showed that it can effectively perform single-step characteristic prediction. Combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict enemy aerial target operational intention one sampling point in advance with 89.7% accuracy. How to more accurately distinguish confusable intentions will be our next research direction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grants nos. 61703426 and 72001214), the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China, under Grant no. 2019038, and the Innovation Capability Support Plan of Shaanxi, China, under Grant no. 2020KJXX-065.

Table 6: Intention prediction performance (precision, recall, and F1-score). I, II, III, and IV denote the BiGRU-Attention, LSTM, SAE, and DBP aerial target tactical intention recognition models, respectively.

Intention type        Precision (%)             Recall (%)                F1-score
                      I     II    III   IV      I     II    III   IV      I      II     III    IV
Surveillance          83.0  79.2  70.5  68.7    90.0  85.6  77.1  75.4    0.859  0.823  0.737  0.719
Reconnaissance        89.9  86.8  81.0  76.3    85.3  78.8  75.4  72.4    0.876  0.826  0.781  0.743
Feint                 90.6  89.1  82.5  78.1    85.4  82.9  73.8  75.5    0.879  0.859  0.779  0.768
Attack                82.3  79.6  71.2  70.2    90.6  89.7  81.4  73.3    0.862  0.843  0.759  0.717
Penetration           90.3  90.0  83.4  80.8    89.9  89.6  84.9  83.1    0.901  0.897  0.841  0.819
Retreat               97.5  98.3  93.4  95.3    94.7  93.9  91.5  91.1    0.961  0.960  0.924  0.931
Electronic jamming    99.5  96.4  96.6  94.4    95.5  96.0  89.7  90.6    0.975  0.962  0.931  0.925

Figure 13: (a) Changes in accuracy of the four models (Prediction, LSTM, DBP, SAE). (b) Changes in loss of the four models.

References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.
[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.
[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battle field based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.
[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.
[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, National University of Defense Technology, Changsha, China, 2006.
[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.
[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.
[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.
[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.
[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 08, pp. 9–13, 2015.
[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.
[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.
[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.
[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.
[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.
[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.
[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 08, pp. 1847–1853, 2016.
[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.
[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.
[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.
[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.
[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.
[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 04, pp. 50–52, 2001.
[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 03, pp. 42–45, 2012.
[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.
[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.
[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.
[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.
[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.
[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.
[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.
[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.
[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.
[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.
[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.
[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.
[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.
[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.
[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.
[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.
[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.
[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.

Computational Intelligence and Neuroscience 13

After determining the space set of enemy operationalintention the key to applying the proposed intelligentrecognition model is to convert human cognitive models tolabels that can be trained by intelligent models and whichcorrespond to the types of intention in the operationalintention space set [18] e cognitive experience of expertsat air warfare can be encapsulated in labels to train themodel A set of label values 0 1 2 3 4 5 6 is set for theintention types in the established combat intention space setFigure 2 shows the corresponding combat intention typecoding and model resolution mechanisms For example ifthe intention prediction result is 5 then the combat in-tention of the enemy target against our target is retreaterefore this knowledge encapsulation and model parsingcan clearly and easily describe human empirical knowledgeand facilitate model training

22 SelectionofAerialTargetCombat IntentionCharacteristice enemy aerial targetrsquos operational intention is highlycorrelated with its operational mission the mutual threatlevel between the two sides and the tactical maneuversAccording to the abovementioned three aspects and therequirement of easy acquisition by radar the characteristicsthat are closely related to the combat intention of the airtarget are selected

Analyzed from the perspective of operational taskswhen an enemy UAV performs a certain task enemy aircraftcharacteristics must meet certain conditions For examplewhen performing defense penetration tasks it is divided intohigh-altitude penetration and low-altitude penetration andthe corresponding heights are 10 sim 11 km and100m sim 1000m a high flight speed of fighter aircraft whenreceiving attacks is generally 735 sim 1470 kmh [12] ere isalso a connection between the aerial target radar signal statusand the combat mission A fighter usually turns on air-to-airradar and electronic jamming in air combat and marineradar and air-to-air radar on a reconnaissance mission [19]

Many factors affect the threat level between the twotargets For convenient experimental data collection weconsider the speed flight acceleration distance flight

altitude heading angle and azimuth angle of both theenemyrsquos and our warplanes [20] as shown in Figure 3

e air combat capability factor [21] affects the targetthreat level For fighter aircraft a single aircraft air combatcapability threat function is constructed as

C ln ε1 + ln ε2 + 1( 1113857 + ln 1113944 ε3 + 11113872 11138731113960 1113961ε4ε5ε6ε7 (2)

where ε1 sim ε7 are parameters of warplane maneuverabilityairborne weapon performance airborne detection capabil-ity warplane operational performance warplane surviv-ability warplane operational range and electronicinformation countermeasure capability respectively e aircombat capability factors of various warplanes of both sidescan be calculated through this formula for a certain period oftime and are saved in the database and updated at any timeaccording to current information [22]

e realization of the operational intention of the aerialtarget is closely related to the maneuvers of the aircraftere are two kinds of maneuver libraries in common usethe typical tactical maneuver library and the basic maneuverlibrary Since this paper studies intention recognition basedon temporal characteristics target combat intention rec-ognition is carried out using 12 consecutive momentarycharacteristics as a sample However the control algorithmof a typical tactical action library is complicated to solve andthe exit and conversion time nodes of a maneuver aredifficult to determinee traditional basic maneuver library[23] includes only seven maneuvers and the combinedmaneuvers are not rich enough ey all adopt the limitmaneuver which is inconsistent with the actual air combatsituation We adopt an improved basic maneuver library[24] that includes 11 maneuvers left turn right turnaccelerated forward flight even-speed forward flight de-celerated forward flight climb left climb right climb diveleft dive right dive

In summary we use an aerial target combat intentioncharacteristic consisting of a 16-dimensional characteristicvector of enemy aircraft flight altitude our aircraft altitudeenemy aircraft flight speed our aircraft flight speed enemyaircraft acceleration our aircraft acceleration enemy aircraftair combat capability factor our aircraft air combat capa-bility factor heading angle the distance between the twosides azimuth angle air-to-air radar status marine radarstatus maneuver type jamming status jammed statuswhich can be divided into numeric and nonnumeric char-acteristics as shown in Figure 4

3 Model Framework

e proposed aerial target operational intention predictionmodel consists of a characteristic prediction module and anintention recognition module as shown in Figure 5 echaracteristic prediction module is based on the BiGRUnetwork e historical aerial target operational intentionrecognition characteristic set Vm is used as input a lineardefault activation function of the fully connected layer isused to obtain the prediction characteristic set Wm andthese two sets are formed into the temporal characteristic

Combat intention

Combat operations

Target state Perception state

Identify intentIon

Reasoning processRepresentation process

Enemy target

Air combat intention

recognition rules

Figure 1 Hierarchical representation and reasoning process ofintention

Computational Intelligence and Neuroscience 3

data and input to the intention recognition module con-structed by the BiGRU-Attention [25] network e prob-ability of each intention type is calculated using the softmaxfunction and the maximum probability intention type labelis output as the aerial target combat intention recognitionresult e characteristic prediction module and intentionrecognition module are described below

31 Characteristic Prediction Module e BiGRU charac-teristic prediction module has three parts the aerial targetcombat intention characteristic set input layer hidden layerand output layer Reference [26] has confirmed that theprediction accuracy of each feature independently is higherthan the overall prediction accuracy us Input (Numberof samples81) where 8 is the time step and 1 is thecharacteristic dimension and Output 1 that is the outputcharacteristic dimension is 1e detailed description will begiven below

311 Input Layer e input layer preprocesses the collectedaerial target characteristic dataset into a vector form that canbe accepted and processed by the BiGRU layer as follows

(1) Read the dataset and clean the data(2) Code nonnumeric data of jamming state jammed

state air-to-air radar state and marine radar state as0 (off) or 1 (on) Millierrsquos nine-level quantization

theory [27] is used to quantize numeric data ofmaneuver types

(3) Normalize encoded nonnumeric data with numericdata so as to improve network convergence speedand accuracy and prevent model gradient explosionWe normalize 11 types of numeric data and five typesof encoded nonnumeric data For the ith dimen-sional characteristic data Gi [gi1 gi2 gix

gin] i 1 2 middot middot middot16 where n is the total number ofdata points grsquo

ix is the result of normalizing the xthoriginal data of the ith dimensional characteristic to[0 1] that is

gixprime

gix minus minGi

maxGi minus minGi

(3)

where maxGi and minGi are the maximum andminimum values respectively of Gi

(4) Divide the data into training and test sets at an 8 2ratio

(5) Construct training and test samples as follows, using the method of predicting a single characteristic in turn; take the prediction of the distance characteristic between the enemy and ourselves as an example (a minimal code sketch of steps (3) and (5) is given after this list). If the distance data of moments 1–8 are used to predict the distance $D_9$ at the ninth moment, the function mapping relationship is

$$ D_9 = f\left(d_1, d_2, \ldots, d_8\right), \tag{4} $$

where $d_i$, $i \in (1, 2, \cdots, 8)$, is the distance at the ith moment. $d_1 \sim d_8$ are selected as the first set of input data, labeled as $d_9$; $d_2 \sim d_9$ are selected as the second set of input data, labeled as $d_{10}$; and so on. The training sample input data and training sample labels generated in this way are shown below. The test data are constructed in the same way as the training sample data [28]:

$$ \begin{bmatrix} d_1 & d_2 & \cdots & d_m \\ d_2 & d_3 & \cdots & d_{m+1} \\ \vdots & \vdots & \ddots & \vdots \\ d_8 & d_9 & \cdots & d_{m+7} \end{bmatrix}, \qquad \left[\, d_9 \;\; d_{10} \;\; \cdots \;\; d_{m+8} \,\right]. \tag{5} $$
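As a concrete illustration of steps (3) and (5), the following is a minimal NumPy sketch of min-max normalization and sliding-window sample construction with a time step of 8; the function names and the synthetic data are illustrative only, not part of the original experiments.

```python
import numpy as np

def min_max_normalize(g):
    """Normalize one characteristic dimension to [0, 1], as in equation (3)."""
    g = np.asarray(g, dtype=float)
    return (g - g.min()) / (g.max() - g.min())

def make_windows(series, time_step=8):
    """Build sliding-window samples: inputs of shape (N, time_step, 1) and
    labels that are the value at the following moment, as in equations (4)-(5)."""
    x, y = [], []
    for i in range(len(series) - time_step):
        x.append(series[i:i + time_step])
        y.append(series[i + time_step])
    x = np.array(x)[..., np.newaxis]   # (N, 8, 1) for the BiGRU input layer
    y = np.array(y)                    # (N,)
    return x, y

# Illustrative use on a synthetic, already normalized distance sequence.
distance = min_max_normalize(np.random.rand(1000))
x, y = make_windows(distance, time_step=8)
split = int(0.8 * len(x))              # 8 : 2 train/test split, step (4)
x_train, y_train, x_test, y_test = x[:split], y[:split], x[split:], y[split:]
```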

Figure 2: Schematic diagram of combat intention coding and parsing (the intention space maps to category labels as follows: surveillance = 0, reconnaissance = 1, feint = 2, attack = 3, penetration = 4, retreat = 5, electronic interference = 6).

Figure 3: Relative geometric position in air combat. H1 and H2 are the flight altitudes of the enemy and ours, V1 and V2 are the flight speeds of the enemy and ours, D is the distance between the two parties, ψ is the heading angle, and φ is the azimuth angle.


The collected aerial target combat intention characteristic set Vm is now in a characteristic vector form that can be directly accepted and processed by the hidden layer.

3.1.2. Hidden Layer. As a variant of the recurrent neural network (RNN), the gated recurrent unit (GRU) [29] has a similar recursive structure and a memory function for processing time-series data. A GRU can alleviate the gradient vanishing and explosion problems that may occur during RNN training, thus addressing the long-term memory problem. Another RNN variant, the long short-term memory (LSTM) network [30], performs similarly, but a GRU has a simpler structure, which reduces computation and improves training efficiency.

Figure 6 shows the internal structure of the GRU. Its two inputs are the output state $h_{t-1}$ at the previous moment and the input sequence value $x_t$ at the current moment, and its output is the state $h_t$ at the current moment. It updates the model state through two gates: the reset gate $r_t$ controls the degree to which historical state information is forgotten, so that the network can discard unimportant information, and the update gate $z_t$ controls the weight with which the previous moment's state information is brought into the current state, helping the network remember long-term information [31]. These variables are related as follows:

$$ \begin{cases} r_t = \sigma\left(W_r x_t + U_r h_{t-1}\right), \\ z_t = \sigma\left(W_z x_t + U_z h_{t-1}\right), \\ \tilde{h}_t = \tanh\left(W_{\tilde{h}} x_t + U_{\tilde{h}}\left(r_t \odot h_{t-1}\right)\right), \\ h_t = \left(1 - z_t\right) \odot h_{t-1} + z_t \odot \tilde{h}_t, \end{cases} \tag{6} $$

where $\sigma$ is the sigmoid activation function, which maps the intermediate states into the range [0, 1]; $h_{t-1}$ and $h_t$ are the output states at times $t-1$ and $t$, respectively; $x_t$ is the input sequence value at time $t$; $\tilde{h}_t$ is the candidate output state; $W_r$, $W_z$, $W_{\tilde{h}}$, $U_r$, $U_z$, and $U_{\tilde{h}}$ are the weight coefficients corresponding to each component; $\tanh$ is the hyperbolic tangent function; and $\odot$ is the Hadamard product.

Figure 4: Feature set of the tactical intention of an aerial target. Numeric data: enemy aircraft acceleration, altitude, speed, and air combat capability factor; our aircraft acceleration, altitude, speed, and air combat capability factor; heading angle; azimuth angle; and distance between the two sides. Nonnumeric data: air-to-air radar status, marine radar status, jamming status, jammed status, and maneuver type.

Figure 5: Intention prediction model framework (data preprocessing, BiGRU characteristic prediction producing the predictive feature set Wm, and BiGRU-Attention intention recognition with a softmax output over intention types).
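To make equation (6) concrete, the following is a small NumPy sketch of a single GRU step; the weight shapes, the random initialization, and the choice of 18 hidden units (the size later selected in Section 4.2.1) are illustrative assumptions, not the trained weights.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, W_r, U_r, W_z, U_z, W_h, U_h):
    """One GRU update following equation (6)."""
    r_t = sigmoid(W_r @ x_t + U_r @ h_prev)              # reset gate
    z_t = sigmoid(W_z @ x_t + U_z @ h_prev)              # update gate
    h_cand = np.tanh(W_h @ x_t + U_h @ (r_t * h_prev))   # candidate state
    return (1.0 - z_t) * h_prev + z_t * h_cand           # new hidden state

# Illustrative shapes: 1-dimensional input, 18 hidden units (assumed).
d_in, d_h = 1, 18
rng = np.random.default_rng(0)
W = [rng.normal(size=(d_h, d_in)) for _ in range(3)]
U = [rng.normal(size=(d_h, d_h)) for _ in range(3)]
h = np.zeros(d_h)
for x_t in rng.normal(size=(8, d_in)):                   # a sequence of 8 time steps
    h = gru_step(x_t, h, W[0], U[0], W[1], U[1], W[2], U[2])
```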

The traditional GRU structure propagates unidirectionally along the sequence transmission direction; it acquires only the historical information before the current moment and ignores future information. BiGRU, shown in Figure 7, includes a forward and a backward GRU and can therefore capture information from both before and after the current moment [32]. In Figure 7, GRU1 is the forward GRU and GRU2 is the backward GRU. The output state $h_t$ of BiGRU at moment $t$ is obtained from the forward output state $\overrightarrow{h}_t$, determined by the input $x_t$ at moment $t$ and the output state $\overrightarrow{h}_{t-1}$ of the forward GRU at moment $t-1$, and the backward output state $\overleftarrow{h}_t$, determined by the input $x_t$ at moment $t$ and the output state $\overleftarrow{h}_{t+1}$ of the backward GRU at moment $t+1$.

3.1.3. Output Layer. The output $h_t$ of the BiGRU network in the hidden layer is fed to the fully connected layer in the output layer, and the final predicted characteristic values are output using a linear activation function.
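Putting Sections 3.1.1–3.1.3 together, the characteristic prediction module might look like the following minimal Keras sketch (the experiments later report using Keras 2.4.3); the single 18-unit hidden layer reflects the structure selected in Section 4.2.1, and the function name is illustrative.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, GRU, Dense

def build_prediction_module(time_step=8, n_features=1, units=18):
    """BiGRU characteristic prediction module: input (time_step, 1) -> one scalar output."""
    model = Sequential([
        Bidirectional(GRU(units), input_shape=(time_step, n_features)),  # hidden BiGRU layer
        Dense(1, activation="linear"),                                   # linear fully connected output
    ])
    return model
```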

3.2. Intention Recognition Module. The BiGRU-Attention intention recognition module has input, hidden, and output layers; the hidden layer comprises a BiGRU layer and an attention mechanism layer. In this network, Input = (Number of samples, 12, 16) and Output = 7, where 12 denotes the time step, 16 denotes the number of characteristic dimensions, and 7 denotes the total number of intention types. A detailed description is given below.

3.2.1. Input Layer. The input layer of the characteristic prediction module has already cleaned and normalized the collected aerial target operational intention characteristics, so the input layer of the intention recognition module is mainly responsible for constructing that module's sample data. If the characteristic data of moments 1–12 are used to predict the intention during that period, the function mapping relationship is

$$ Q_1 = f\left(v_1, v_2, \ldots, v_{11}, w_{12}\right), \tag{7} $$

where $Q_1$ denotes the predicted intention type in time periods 1–12, $v_i$, $i \in (1, 2, \ldots, 11)$, denotes the historical characteristic data at moment $i$, and $w_{12}$ denotes the characteristic data predicted by the characteristic prediction module for moment 12. $(v_1, v_2, \ldots, v_{11}, v_{12})$ is the first set of input data, labeled with the intention type $q_1$ corresponding to time periods 1–12; $(v_2, v_3, \ldots, v_{12}, v_{13})$ is the second set of input data, labeled with the intention type $q_2$ corresponding to time periods 2–13; and so on. The training sample input data and labels are composed as shown below. The test sample data are constructed similarly, except that the characteristic $v_t$ at the last moment of each sample is replaced with the characteristic $w_t$ predicted by the characteristic prediction module; that is, the input data are $(v_i, v_{i+1}, \ldots, v_{i+10}, w_{i+11})$ and the label is the intention type $q_i$ corresponding to the time period $i \sim i+11$.

$$ \begin{bmatrix} v_1 & v_2 & \cdots & v_m \\ v_2 & v_3 & \cdots & v_{m+1} \\ \vdots & \vdots & \ddots & \vdots \\ v_{11} & v_{12} & \cdots & v_{m+10} \\ v_{12} & v_{13} & \cdots & v_{m+11} \end{bmatrix}, \qquad \left[\, q_1 \;\; q_2 \;\; \cdots \;\; q_m \,\right]. \tag{8} $$

After one-hot encoding of the intention labels, they are sent to the hidden layer together with the constructed sample data.
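The following is a minimal sketch, under the above construction, of how the 12-frame, 16-dimension recognition samples and their one-hot labels might be built; the placeholder arrays and the choice to take the label at the window's last moment are illustrative assumptions.

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

def make_intention_samples(features, labels, time_step=12):
    """features: (T, 16) time-ordered characteristic vectors;
    labels: (T,) integer intention codes (0-6, Figure 2).
    Returns inputs of shape (N, 12, 16) and one-hot labels of shape (N, 7)."""
    x, y = [], []
    for i in range(len(features) - time_step + 1):
        x.append(features[i:i + time_step])
        y.append(labels[i + time_step - 1])   # intention label for moments i ... i+11 (assumed)
    return np.array(x), to_categorical(y, num_classes=7)

# Illustrative use with placeholder data.
feats = np.random.rand(500, 16)
codes = np.random.randint(0, 7, size=500)
x, y = make_intention_samples(feats, codes)
```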

3.2.2. Hidden Layer. The hidden layer contains the BiGRU network layer and the attention mechanism layer. The BiGRU layer has already been described; the attention mechanism layer is described below. The attention mechanism [33–35] operates similarly to the human brain by focusing on the local content of an object according to its purpose. It highlights the characteristics that account for a greater proportion of the prediction result by calculating the weights of the characteristic vectors output by the BiGRU network at different moments. In aerial target combat intention recognition, the neural network uses the attention mechanism to assign weight coefficients so as to focus on key characteristics during training: it learns the importance of each characteristic and then assigns the corresponding weight coefficient according to that importance. For example, if an enemy aircraft executes a penetration, its flight altitude and heading angle will be assigned higher weights. The structure of the attention mechanism model is shown in Figure 8.

Figure 6: GRU structure.

The characteristic vector $h_t$ output by the BiGRU network at moment $t$ is input to the attention mechanism layer to obtain the initial state vector $S_t$. The initialization vector $e_t$ of the attention mechanism is learned by equation (9), the attention weights are converted into probabilities by equation (10), i.e., the softmax function, to obtain the weight probability vector $\alpha_t$, and the final state vector $Y$ is obtained by equation (11) [36]. The formulas are as follows:


$$ e_t = \tanh\left(W_w S_t + b_w\right), \tag{9} $$

$$ \alpha_t = \frac{\exp\left(e_t u_w\right)}{\sum_{i=1}^{t} \exp\left(e_i u_w\right)}, \tag{10} $$

$$ Y = \sum_{t=1}^{n} \alpha_t S_t, \tag{11} $$

where $W_w$ is the weight coefficient matrix, $b_w$ is the bias coefficient matrix, and $u_w$ is a randomly initialized vector that is continuously learned during training.
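A custom Keras layer implementing equations (9)–(11) might look like the following minimal sketch; treating the BiGRU output sequence directly as the state vectors $S_t$ and the specific weight initializers are assumptions.

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class AttentionLayer(Layer):
    """Attention over BiGRU outputs: e_t = tanh(W_w S_t + b_w),
    alpha_t = softmax(e_t u_w), Y = sum_t alpha_t S_t (equations (9)-(11))."""

    def build(self, input_shape):
        d = int(input_shape[-1])
        self.W_w = self.add_weight(name="W_w", shape=(d, d), initializer="glorot_uniform")
        self.b_w = self.add_weight(name="b_w", shape=(d,), initializer="zeros")
        self.u_w = self.add_weight(name="u_w", shape=(d, 1), initializer="glorot_uniform")
        super().build(input_shape)

    def call(self, s):                                               # s: (batch, time, d)
        e = tf.tanh(tf.tensordot(s, self.W_w, axes=1) + self.b_w)    # (batch, time, d)
        score = tf.tensordot(e, self.u_w, axes=1)                    # (batch, time, 1)
        alpha = tf.nn.softmax(score, axis=1)                         # attention weights over time
        return tf.reduce_sum(alpha * s, axis=1)                      # weighted sum Y: (batch, d)
```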

3.2.3. Output Layer. The output of the attention mechanism layer is fed into the multiclass softmax activation function, which outputs the aerial target combat intention label with the highest probability. The enemy aerial target combat intention can then be recognized by parsing the label as in Figure 2. The output prediction label is

$$ y_k = \operatorname{softmax}(W Y + b), \tag{12} $$

where $W$ is the weight coefficient matrix learned during training, $b$ is the corresponding bias matrix, and $y_k$ is the predicted label of the output.
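Assembling Sections 3.2.1–3.2.3, a sketch of the BiGRU-Attention recognition module could look as follows; it reuses the AttentionLayer sketch above, the three hidden BiGRU layers with 334, 10, and 338 units, dropout 0.5, and learning rate 0.0014 are taken from Table 2, and the exact layer ordering is an assumption.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, GRU, Dense, Dropout
from tensorflow.keras.optimizers import Adam

def build_recognition_module(time_step=12, n_features=16, n_classes=7):
    """BiGRU-Attention intention recognition module (hyperparameters per Table 2)."""
    model = Sequential([
        Bidirectional(GRU(334, return_sequences=True), input_shape=(time_step, n_features)),
        Bidirectional(GRU(10, return_sequences=True)),
        Bidirectional(GRU(338, return_sequences=True)),
        Dropout(0.5),
        AttentionLayer(),                          # custom layer from the sketch above
        Dense(n_classes, activation="softmax"),    # equation (12)
    ])
    model.compile(optimizer=Adam(learning_rate=0.0014),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```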

4. Experimental Analysis

4.1. Experimental Dataset and Environment. This experiment took an airspace UAV close-range engagement as the research background, and the experimental data were extracted from a combat simulation system. Using the characteristic processing and coding described in this paper, the state characteristics of 12 consecutive frames of the enemy aerial target were collected for each sample [13], and 10,000 samples (8,000 training and 2,000 testing) were constructed, each including 16 pieces of information such as flight speed, flight altitude, azimuth, and jamming status as characteristics. Due to the large amount of data in the sample set, intention-type labels were generated by computer according to the rules, and experts corrected data with ambiguous intention classification. The dataset labels covered seven intention types, and the combat intention data consisted of 11.8% surveillance, 14.65% reconnaissance, 18.15% feint, 18% attack, 13.9% penetration, 12.3% retreat, and 11.2% electronic jamming.

We used Python 3.8 on a Quadro RTX 5000/PCIe/SSE2 GPU with a CUDA 11.0 acceleration environment, the Keras 2.4.3 deep learning framework, and an x86-64 CentOS 7 PC system with an Intel® Xeon® Silver 4110 CPU @ 2.10 GHz and 64 GB RAM.

4.2. Experimental Analysis of the Characteristic Prediction Module. The task of the characteristic prediction module is to predict the future characteristics of enemy aerial targets, which are later input to the intention recognition module to predict enemy aerial target intention. The average of the mean square errors (hereafter referred to as "error") of the 16-dimensional features was used as the evaluation index.

4.2.1. Network Structure Selection. The network structure mainly involves setting the time step, the number of hidden layers, and the number of hidden layer nodes. Since increasing the number of hidden layers rapidly increases the time cost, and considering the high timeliness requirement of air warfare, the structure was limited to one or two hidden layers. The results of the time step selection and hidden layer node number selection experiments are shown in Figure 9.

From Figure 9, it can be seen that the mean prediction mean square error was smallest when the time step was 8 and when the number of nodes in the single hidden layer was 18. Therefore, a network structure with a time step of 8, a single hidden layer, and 18 network nodes was chosen. The optimizer was Adam [37], the initial learning rate was 0.001, the decay rate was 0.9, the number of training epochs was 100, and the batch size was 512.
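As a hedged illustration of this training configuration, the module from Section 3.1 might be compiled and trained as follows; interpreting the stated decay rate of 0.9 as an exponential learning-rate schedule (and its decay step) is an assumption, and x_train/y_train refer to the sliding-window sketch in Section 3.1.1.

```python
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

# Assumed interpretation of "decay rate 0.9": an exponential schedule on the 0.001
# initial learning rate; the decay step is not stated in the paper.
lr_schedule = ExponentialDecay(initial_learning_rate=0.001, decay_steps=1000, decay_rate=0.9)

model = build_prediction_module(time_step=8, n_features=1, units=18)  # sketch from Section 3.1.3
model.compile(optimizer=Adam(learning_rate=lr_schedule), loss="mse")
model.fit(x_train, y_train, epochs=100, batch_size=512,
          validation_data=(x_test, y_test), verbose=0)
```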

4.2.2. Comparison Experiment. To verify that the proposed characteristic prediction module is effective and efficient, it was compared with the RNN [38], LSTM [39], GRU [40], and BiLSTM networks in terms of both run time and mean square error, and a segment of the predicted trajectory of the distance characteristic between the two sides was selected for comparison with the actual trajectory. The results are shown in Table 1 and Figure 10.

Figure 7: BiGRU structure.

Figure 8: Attention mechanism structure.

From Table 1, we can see that BiLSTM had the longest single-step prediction time, 0.311 ms, and RNN had the shortest single-step prediction time, but its prediction error was the largest, reaching 8.14 × 10⁻⁵. BiGRU had the smallest prediction error, and its single-step prediction time was about one-third lower than that of BiLSTM, so it can be concluded that BiGRU has lower internal structural complexity [41] with similar or even better performance than BiLSTM. The prediction error of BiGRU was half that of GRU, indicating that the bidirectional propagation mechanism exploits information from future moments more effectively than one-way propagation. Although BiGRU had a longer single-step prediction time than GRU, its single-step prediction time of 0.202 ms is sufficient to provide timely prediction information when the sampling interval is 0.5 s. It can be seen from Figure 10 that the characteristic trajectory predicted by RNN had a low fit to the actual characteristic trajectory, while those predicted by the other four methods had a high fit, which is consistent with the error results in Table 1.

4.3. Experimental Analysis of the Intention Recognition Module. The data used in this experiment did not contain future characteristics predicted by the characteristic prediction module; that is, the 12 frames of temporal characteristics in each sample were all historical, with no added prediction characteristics. The purpose of the experiment was to compare our method with those proposed in other literature. The intention prediction experiment is described in Section 4.4.

4.3.1. Network Structure Selection. The optimal network structure of BiGRU was selected using PSO [42] over four parameters: the number of hidden layers, the number of hidden layer nodes, the batch size, and the learning rate. The upper and lower limits were set as [4, 500, 1000, 0.0005] and [1, 10, 100, 0.0001], respectively. The intention recognition error rate was used as the fitness function, the maximum number of iterations was 20, and the population size was 30. The experimental results and the settings of the other key hyperparameters are shown in Table 2.
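The following is a minimal PSO sketch over these four hyperparameters with the stated bounds, 30 particles, and 20 iterations; the inertia and acceleration coefficients are assumptions, and the fitness function is only a placeholder standing in for training a BiGRU-Attention model and returning its intention recognition error rate.

```python
import numpy as np

LOWER = np.array([1, 10, 100, 0.0001])     # [hidden layers, nodes, batch size, learning rate]
UPPER = np.array([4, 500, 1000, 0.0005])

def fitness(params):
    """Placeholder: build and train a BiGRU-Attention model with these
    hyperparameters and return its intention recognition error rate."""
    layers, nodes, batch = int(round(params[0])), int(round(params[1])), int(round(params[2]))
    lr = params[3]
    return np.random.rand()                 # stand-in value for illustration only

def pso(n_particles=30, n_iter=20, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    pos = rng.uniform(LOWER, UPPER, size=(n_particles, 4))
    vel = np.zeros_like(pos)
    p_best, p_val = pos.copy(), np.array([fitness(p) for p in pos])
    g_best = p_best[p_val.argmin()]
    for _ in range(n_iter):
        r1, r2 = rng.random((n_particles, 4)), rng.random((n_particles, 4))
        vel = w * vel + c1 * r1 * (p_best - pos) + c2 * r2 * (g_best - pos)
        pos = np.clip(pos + vel, LOWER, UPPER)
        val = np.array([fitness(p) for p in pos])
        better = val < p_val
        p_best[better], p_val[better] = pos[better], val[better]
        g_best = p_best[p_val.argmin()]
    return g_best

best = pso()   # approximate best [hidden layers, nodes, batch size, learning rate]
```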

4.3.2. Analysis of Intention Recognition Results. The BiGRU-Attention intention recognition module was trained, and the test samples were then input into the module. The experiment confirmed the recognition accuracy of the proposed network model reported in Table 4. To further observe the relationships among the recognized intentions, a confusion matrix of the intention recognition results on the test samples was produced; the diagonal indicates the number of correctly recognized samples, and the results are shown in Figure 11.

It can be seen from Figure 11 that the BiGRU-Attention intention recognition model had high recognition precision and recall for all seven intentions. In particular, the precision and recall of electronic interference intention recognition reached 100% and 96%, respectively. A few mutual recognition errors occurred between the surveillance and reconnaissance intentions and between the feint and attack intentions, which should be attributed to the high characteristic similarity and deception within these intention pairs, causing the trained BiGRU-Attention model to assign similar weights to the two intentions in each pair. As a result, the attention layer failed to accurately sense the weight difference between the two intentions, leading to a small number of incorrectly recognized intentions, which accords with the actual situation.

4.3.3. Comparison Experiments. In this experiment, the highest test-set accuracy over 200 iterations was taken as the accuracy of each intention recognition model, and the corresponding loss value as its loss. The BiGRU-Attention intention recognition model was compared with the LSTM-based tactical intention recognition model for battlefield enemy targets [13], the aerial target combat intention recognition model using a DBP neural network optimized with the Adam algorithm and ReLU function [12], and the stacked autoencoder (SAE) tactical intention intelligent recognition model [15]. The parameters of the comparison experiments were set as shown in Table 3, and the experimental results are shown in Table 4.

As can be seen from Table 4, the BiGRU-Attention intention recognition model was superior to the other three models in terms of both accuracy and loss, with a 2.5% improvement in accuracy over LSTM and nearly 10% over SAE and DBP, thus verifying its effectiveness for aerial target combat intention recognition. Further analysis shows that BiGRU-Attention and LSTM, as temporal networks based on recurrent neural networks, were more suitable for aerial target combat intention recognition than the other two models, further indicating that it is more scientific to infer aerial target combat intention from temporal characteristic changes.

4.3.4. Ablation Experiments. Although the effectiveness of the BiGRU-Attention intention recognition model for operational intention recognition of aerial targets has been validated in the comparison experiments, those comparisons are not against hybrid models of the same type. The results of ablation experiments on the same dataset are shown in Table 5 and Figure 12.

From Table 5, the BiGRU-Attention intention recognition model had an accuracy 2.7, 1.9, and 1.6 percentage points higher than that of the GRU, BiGRU, and GRU-Attention intention recognition models, respectively, and it also had lower loss values than the other models. The BiGRU and GRU-Attention models had similar accuracy and loss values, and both were better than the GRU model. From Figure 12, we can see that the accuracy of the four models increased and the loss decreased with the number of training epochs; the accuracy and loss of the BiGRU-Attention and BiGRU models converged at around 50 epochs and those of the other two models at about 70 epochs, so the introduction of bidirectional propagation seems to have effectively improved model convergence and accelerated learning. The curves of the BiGRU-Attention model were significantly better than those of the other three. The accuracy and loss curves of the BiGRU and GRU-Attention models after convergence are similar and better than those of the GRU model, which shows that the basic GRU model was significantly improved by introducing the attention mechanism and the bidirectional propagation mechanism.

4.4. Experimental Analysis of Intention Prediction. This experiment combined the future characteristic states predicted by the characteristic prediction module with historical characteristic states into 12 frames of temporal characteristics; that is, the first 11 frames were historical characteristics and the 12th frame was the predicted future characteristics, and the sample data were constructed as described in Section 3.1.
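As a hedged illustration of this sample assembly, the sketch below builds one 12-frame recognition input from the last 11 historical frames plus one frame predicted characteristic by characteristic; the function and argument names are illustrative, and it assumes one single-characteristic BiGRU predictor per dimension as described in Section 3.1.

```python
import numpy as np

def assemble_prediction_sample(history, prediction_models, time_step=8):
    """history: (T, 16) historical characteristic vectors, T >= max(11, time_step).
    prediction_models: one single-characteristic BiGRU predictor per dimension (assumed).
    Returns one recognition-module input of shape (1, 12, 16)."""
    last11 = history[-11:]                                   # frames 1-11 of the sample
    predicted = np.empty(history.shape[1])
    for j, m in enumerate(prediction_models):                # each characteristic predicted independently
        window = history[-time_step:, j].reshape(1, time_step, 1)
        predicted[j] = m.predict(window, verbose=0)[0, 0]
    sample = np.vstack([last11, predicted])                  # the 12th frame is the predicted frame
    return sample[np.newaxis, ...]                           # (1, 12, 16)
```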

Figure 9: (a) The effect of different time steps on the prediction error. (b) The effect of different numbers of hidden layer nodes (one or two layers) on the prediction error.

Table 1: Experimental results of the five models.

Method    Error (10⁻⁵)    Prediction time (ms)
RNN       8.14            0.089
LSTM      2.75            0.133
GRU       2.46            0.110
BiLSTM    1.29            0.311
BiGRU     1.16            0.202

Figure 10: Characteristic (distance) prediction trajectories of the five models compared with the true trajectory.

Table 2: Experimental parameters.

Parameter        Value
Loss function    Categorical cross-entropy
Optimizer        Adam
Dropout          0.5
Hidden layers    3
Hidden nodes     334, 10, 338
Batch size       100
Learning rate    0.0014
Epochs           200

Figure 11: Confusion matrix of intention recognition over the seven classes (surveillance, reconnaissance, feint, attack, penetration, retreat, and electronic interference); the diagonal gives the number of correctly recognized test samples in each class.


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition method of Section 4.3, which has no predictive effect, and the model evaluation indicators of precision, recall, and F1-score were used, with results as shown in Table 6 and Figure 13.

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for the surveillance and attack intentions. On analysis, the air combat characteristics of the first two intentions are more distinctive, while those of the latter two are more similar to the characteristics of the reconnaissance and feint intentions, which causes mutual prediction errors and results in relatively low intention prediction accuracy. Overall, the accuracy of the proposed intention prediction method reached 89.7%, a significant improvement over LSTM, DBP, and SAE, and the prediction could be produced one sampling interval (0.5 s) earlier.
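A hedged sketch of computing per-class precision, recall, and F1-score of the kind reported in Table 6 is shown below using scikit-learn; the arrays are placeholders standing in for the test-set labels and softmax outputs.

```python
import numpy as np
from sklearn.metrics import classification_report

# Placeholder ground-truth codes and softmax outputs for a 2000-sample test set.
y_true = np.random.randint(0, 7, size=2000)
y_prob = np.random.rand(2000, 7)
y_pred = y_prob.argmax(axis=1)                     # maximum-probability intention label

names = ["surveillance", "reconnaissance", "feint", "attack",
         "penetration", "retreat", "electronic jamming"]
print(classification_report(y_true, y_pred, target_names=names, digits=3))
```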

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results; the accuracy reached only 70%. Compared with single-step prediction, the two-step prediction of the characteristic prediction module suffers from too much error accumulation and a low goodness of fit, which leads to low intention prediction accuracy.

Table 3: Comparison of model parameter settings.

Model    Number of hidden layers    Hidden nodes          Learning rate    Optimizer
SAE      3                          256, 128, 128         0.02             SGD
LSTM     3                          256, 128, 128         0.001            Adam
DBP      4                          256, 512, 512, 256    0.01             Adam

Table 4: Comparison of different intention recognition models.

Model              Accuracy (%)    Loss
BiGRU-Attention    90.5            0.257
LSTM               87.6            0.346
SAE                81.3            0.473
DBP                79.3            0.492

Table 5: Results of the ablation experiment.

Model composition (Bidirectional / GRU / Attention)    Accuracy (%)    Loss
✓ / ✓ / ✓  (BiGRU-Attention)                           90.5            0.257
✓ / ✓ / –  (BiGRU)                                     88.6            0.305
– / ✓ / ✓  (GRU-Attention)                             88.9            0.289
– / ✓ / –  (GRU)                                       87.4            0.337

Figure 12: (a) Changes in accuracy during the ablation experiment. (b) Changes in loss during the ablation experiment.


However, with the continued development and improvement of multistep prediction methods, it is believed that the proposed aerial target combat intention prediction method will have better application prospects.

5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy's combat mission, the threat level between the two sides, and tactical maneuvers. The sample vectors are constructed by preprocessing the intention characteristic set data of the aerial target and encapsulating domain expert knowledge and experience into labels. We improved on the LSTM-based air target intention recognition method, proposed a GRU-based aerial target operational intention recognition model, and introduced a bidirectional propagation mechanism and an attention mechanism, which significantly improved accuracy compared with the LSTM, SAE, and DBP intention recognition models. To further shorten the time needed for air target intention recognition, we proposed a BiGRU-based air combat characteristic prediction method, and experimental results showed that it can effectively perform single-step characteristic prediction. Combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict enemy aerial target operational intention one sampling point in advance with 89.7% accuracy. How to more accurately distinguish confusable intentions will be our next research direction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grants nos. 61703426 and 72001214), the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China, under Grant no. 2019038, and the Innovation Capability Support Plan of Shaanxi, China, under Grant no. 2020KJXX-065.

Table 6: Intention prediction performance measurement.

Intent type           Precision (%)              Recall (%)                 F1-score
                      I     II    III   IV       I     II    III   IV       I      II     III    IV
Surveillance          83.0  79.2  70.5  68.7     90.0  85.6  77.1  75.4     0.859  0.823  0.737  0.719
Reconnaissance        89.9  86.8  81.0  76.3     85.3  78.8  75.4  72.4     0.876  0.826  0.781  0.743
Feint                 90.6  89.1  82.5  78.1     85.4  82.9  73.8  75.5     0.879  0.859  0.779  0.768
Attack                82.3  79.6  71.2  70.2     90.6  89.7  81.4  73.3     0.862  0.843  0.759  0.717
Penetration           90.3  90.0  83.4  80.8     89.9  89.6  84.9  83.1     0.901  0.897  0.841  0.819
Retreat               97.5  98.3  93.4  95.3     94.7  93.9  91.5  91.1     0.961  0.960  0.924  0.931
Electronic jamming    99.5  96.4  96.6  94.4     95.5  96.0  89.7  90.6     0.975  0.962  0.931  0.925

I, II, III, and IV respectively represent the BiGRU-Attention, LSTM, SAE, and DBP aerial target tactical intention recognition models.

Figure 13: (a) Changes in accuracy of the four models (proposed prediction method, LSTM, DBP, SAE). (b) Changes in loss of the four models.


References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.
[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.
[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battlefield based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.
[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.
[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, School of National University of Defense Technology, Changsha, China, 2006.
[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.
[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.
[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.
[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.
[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 8, pp. 9–13, 2015.
[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.
[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.
[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.
[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.
[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.
[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.
[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 8, pp. 1847–1853, 2016.
[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-Attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.
[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.
[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.
[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.
[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.
[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 4, pp. 50–52, 2001.
[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 3, pp. 42–45, 2012.
[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.
[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on Adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.
[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.
[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.
[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.
[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.
[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.
[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.
[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.
[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.
[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.
[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.
[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.
[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.
[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.
[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.
[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.
[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.


data and input to the intention recognition module con-structed by the BiGRU-Attention [25] network e prob-ability of each intention type is calculated using the softmaxfunction and the maximum probability intention type labelis output as the aerial target combat intention recognitionresult e characteristic prediction module and intentionrecognition module are described below

31 Characteristic Prediction Module e BiGRU charac-teristic prediction module has three parts the aerial targetcombat intention characteristic set input layer hidden layerand output layer Reference [26] has confirmed that theprediction accuracy of each feature independently is higherthan the overall prediction accuracy us Input (Numberof samples81) where 8 is the time step and 1 is thecharacteristic dimension and Output 1 that is the outputcharacteristic dimension is 1e detailed description will begiven below

311 Input Layer e input layer preprocesses the collectedaerial target characteristic dataset into a vector form that canbe accepted and processed by the BiGRU layer as follows

(1) Read the dataset and clean the data(2) Code nonnumeric data of jamming state jammed

state air-to-air radar state and marine radar state as0 (off) or 1 (on) Millierrsquos nine-level quantization

theory [27] is used to quantize numeric data ofmaneuver types

(3) Normalize encoded nonnumeric data with numericdata so as to improve network convergence speedand accuracy and prevent model gradient explosionWe normalize 11 types of numeric data and five typesof encoded nonnumeric data For the ith dimen-sional characteristic data Gi [gi1 gi2 gix

gin] i 1 2 middot middot middot16 where n is the total number ofdata points grsquo

ix is the result of normalizing the xthoriginal data of the ith dimensional characteristic to[0 1] that is

gixprime

gix minus minGi

maxGi minus minGi

(3)

where maxGi and minGi are the maximum andminimum values respectively of Gi

(4) Divide the data into training and test sets at an 8 2ratio

(5) Construct training and test samples as follows Usingthe method of predicting a single characteristic inturn take the distance characteristic prediction ofthe enemy and ourselves as an example If the dis-tance data of the 1sim8 moments are used to predictthe distance D9 at moment ninth the functionmapping relationship is

D9 f d1 d2 d8( 1113857 (4)

where di i isin (1 2 middot middot middot 8) is the distance at the ithmoment d1 sim d8 are selected as the first set of inputdata labeled as d9 d2 sim d9 are selected as the inputdata labeled as d10 and so on e training sampleinput data and training sample labels are generated thisway and shown belowe test data are constructed inthe same way as the training sample data [28]

d1 d2 middot middot middot dm

d2 d3 middot middot middot dm+1

⋮ ⋮ ⋱ ⋮

d8 d9 middot middot middot dm+7

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

d9 d10 middot middot middot dm+81113858 1113859

(5)

Intention space

Category label

Pars

ing

Cod

ing

attack

3

penetration

4

surveillance

0

reconnaissance

1

feint

2

retreat

5

electronicinterference

6

Figure 2 Schematic diagram of combat intention coding and analysis (the same color is a set of correspondences)

D

V1

H2

H1

V2Enemy aircra

Our plane

φ

ψ

Figure 3 Relative geometric position of air combat H1 H2 are theflight altitudes of the enemy and oursV1V2 are the flight speeds ofthe enemy and ours D is the distance between the two parties ψ isthe heading angle and φ is the azimuth angle

4 Computational Intelligence and Neuroscience

e collected aerial target combat intention character-istic set Vm is now in a characteristic vector form that can bedirectly accepted and processed by the hidden layer

312 Hidden Layer As a variant of the recurrent neuralnetwork (RNN) a gated recurrent unit (GRU) [29] has asimilar recursive structure and a memory function to processtime-series data A GRU can alleviate the problems of gradientdisappearance and explosion that may occur during RNNtraining thus solving the long-termmemory problem AnotherRNN variant the long short-term memory (LSTM) network[30] performs similarly but a GRU has a simpler structure andcan reduce computation and improve training efficiency

Figure 6 shows the internal structure of the GRU Its twoinputs are the output state htminus1 at the previous moment andthe input sequence value xt at the current moment and theoutput is the state ht at the current moment It updates the

model state through two gates e reset gate rt controls thedegree of forgetting the historical state information so thatthe network can discard unimportant information and theupdate gate zt controls the weight of the previous momentrsquosstate information being brought into the current statehelping the network to remember long-time information[31] ese variables are related as follows

rt σ Wrxt + Urhtminus1( 1113857

zt σ Wzxt + Uzhtminus1( 1113857

1113957ht tan h W1113957hxt + U1113957h

rt ⊙htminus1( 11138571113874 1113875

ht 1 minus zt( 1113857⊙htminus1 + zt ⊙ 1113957ht

⎧⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎩

(6)

where σ is the sigmoid activation function which transformsthe intermediate states into the range [01] htminus1 and ht areoutput states at time t minus 1 and t respectively xt is the input

Major factors of targetintention in air combat

Numeric data

enemyaircraft

acceleration

enemyaircraftaltitude

enemyaircraftspeed

our aircraftacceleration

our aircraftaltitude

our aircraftspeed

enemy aircraftair combat

capability factor

our aircraft aircombat

capability factor

headingangle

azimuthangle

distancebetween the

two sides

Non-numericdata

air-to-airradar status

marineradar status

jammingstatus

maneuvertype

jammedstatus

Figure 4 Feature set of the tactical intention of aerial target

Characteristic prediction

Softmax

Intention recognition

Intention type

BiGRU1 BiGRU2

BiGRU2

BiGRU2

BiGRU1

BiGRU1

Attention

Data preprocessing

Collection of time series feature set Vm

Predictive feature set Wm

DenseOut

put

Laye

rIn

put

Laye

rH

idde

nLa

yer

Out

put

Laye

rIn

put

Laye

rH

idde

nLa

yer

BiGRUn

BiGRUn

BiGRUn

BiGRUn

BiGRUn

BiGRUn

BiGRU1

BiGRU1

BiGRU1

BiGRU2

BiGRU2

BiGRU2

Vm+Wm

Figure 5 Intention prediction model framework

Computational Intelligence and Neuroscience 5

sequence value at time t 1113957ht is the candidate output stateWrWz W1113957h

Ur Uz and U1113957hare weight coefficients corre-

sponding to each component tanh is the hyperbolic tangentfunction and ⊙ is the Hadamard product

e traditional GRU structure propagates unidirec-tionally along the sequence transmission direction and itacquires historical information before the current momentignoring future information BiGRU as shown in Figure 7includes a forward and backward GRU which can capturethe characteristics of the information before and after thecurrent moment [32] In Figure 7 GRU1 is a forward GRUand GRU2 is a backward GRUe output state ht of BiGRUat moment t can be obtained from the forward output statehrarr

t determined by the input xt at moment t and output statehrarr

tminus1 of the forward GRU at moment tminus1 and the backwardoutput state h

larr

t determined by the input xt at moment t andoutput state h

rarrt+1 of the backward GRU at moment t+ 1

313 Output Layer e output ht of the BiGRU network inthe hidden layer is fed to the fully connected layer in theoutput layer and the final prediction characteristic valuesare output using a linear activation function

32 Intention Recognition Module e BiGRU-Attentionintention recognition module has input hidden and outputlayers e hidden layer has BiGRU and Attention mech-anism layers In the network the Input (Number ofsamples1216) and the Output 7 where 12 denotes thetime step 16 denotes the number of characteristic dimen-sions and 7 denotes the total number of intention typesedetailed description will be given below

321 Input Layer e input layer of the characteristicprediction module has cleaned and normalized the collectedaerial target operational intention characteristics so theintention recognize module input layer is mainly for theconstruction of the sample data of the intention recognitionmodule If the characteristic data of the 1sim12 moments are

used to predict the intention during that time the functionmapping relationship is

Q1 f v1 v2 v11w12( 1113857 (7)

where Q1 denotes the prediction intention types in timeperiods 1sim12 vi i isin (1 2 11) denotes the historicalcharacteristic data at moment i and w12 denotes thecharacteristic data predicted by the characteristic predictionmodule at moment 12 (v1 v2 v11 v12) is the first set ofinput data labeled as intention type q1 corresponding totime periods 1sim12 (v2 v3 v12 v13) is the second set ofinput data labeled as the intention type q2 corresponding totime periods 2sim13 and so one training sample input dataand labels are composed as shown below e test andtraining sample data are constructed similarly except theformer replaces the characteristic vt at the last moment ofeach sample with the characteristic wt predicted by thecharacteristic prediction module that is the input data are(vi vi+1 vi+10wi+11) and the label is the intention typeqi corresponding to the time period isim i+ 11

v1 v2 middot middot middot vm

v2 v3 middot middot middot vm+1

⋮ ⋮ ⋱ ⋮

v11 v12 middot middot middot vm+10

v12 v13 middot middot middot vm+11

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

q1 q2 middot middot middot qm1113858 1113859

(8)

After one-hot encoding of the intention labels it is sentto the hidden layer together with the constructed sampledata

322 Hidden Layer e hidden layer contains the BiGRUnetwork layer and attention mechanism layer e BiGRUlayer has been described e attention mechanism layer isdescribed below e attention mechanism [33ndash35] operatessimilarly to the human brain by focusing on the local contentof an object according to its purpose e attentionmechanism highlights characteristics that account for agreater proportion of prediction results by calculating theweights of characteristic vectors output from the BiGRUnetwork at different moments In aerial target combat in-tention recognition the neural network assigns weight co-efficients so as to focus on some key characteristics duringthe training process through the attention mechanism Itsimplementation is to learn the importance of each char-acteristic and then assign the corresponding weight coeffi-cient according to its importance For example if an enemyaircraft executes a penetration its flight altitude and headingangle will be assigned higher weights e structure of theattention mechanism model is shown in Figure 8

e characteristic vector ht output by the BiGRU net-work at moment t is input to the attention mechanism layerto obtain the initial state vector St Learn the initializationvector et of the attention mechanism by equation (9) andthe attention weights are probability by equation (10) iethe softmax function to obtain the weight probability vectorαt e final state vector Y is obtained by equation (11) [36]e formula is as follows

times

times +

1-

times

tanhσσ

ht-1

xt

ht

rt zt ht~

Resetgate

Updategate

Figure 6 GRU structure

6 Computational Intelligence and Neuroscience

et tan h WwSt + bw( 1113857 (9)

αt exp etuw( 1113857

1113936ti1 etuw( 1113857

(10)

Y 1113944n

t1αtSt (11)

where Ww is the weight coefficient matrix bw is the biascoefficient matrix and uw is a matrix vector initializedrandomly and continuously learning with training

323 Output Layer e output of the attention mechanismlayer is fed into the multiclassification activation softmaxfunction which outputs the label with the highest proba-bility of aerial target combat intention e enemy aerialtarget combat intention can be recognized by parsing thelabel as in Figure 2 e output prediction label is

yk softmax(WY + b) (12)

where W is the weight coefficient matrix with training b isthe corresponding bias matrix and yk is the predicted labelof the output

4 Experimental Analysis

41 Experimental Dataset and Environment is experi-ment took an airspace UAV close-range engagement as theresearch background and the experimental data wereextracted from a combat simulation system Using thecharacteristic processing and coding described in this paperthe state characteristics of 12 consecutive frames of theenemy aerial target were collected for each sample [13]where 10000 samples (8000 training and 2000 testing) wereconstructed including 16 pieces of information such asflight speed flight altitude azimuth and jamming status ascharacteristics Due to a large amount of data in the sampleset intention-type labels were generated by computeraccording to the rules and experts corrected data with in-tention classification ambiguity Dataset labels includedseven intention types and the combat intention data

consisted of 118 surveillance 1465 reconnaissance1815 feint 18 attack 139 surprise 123 retreat and112 electronic jamming

We used Python 38 on a Quadro RTX 5000PCleSSE2GPU with CUDA 110 acceleration environment with aKeras 243 deep learning framework and an x86-64 Cen-tOS7 PC system Intelreg Xeonreg Sliver 4110 CPU 210GHzand 64GB RAM

42 Experimental Analysis of Characteristic PredictionModule e task of the characteristic prediction module isto predict the future characteristics of enemy aerial targetswhich are later input to the intent recognition module topredict enemy aerial target intent e average of the meansquare error (which we will refer to as ldquoerrorrdquo) of 16-di-mensional features was used as the evaluation index

421 Network Structure Selection e network structuremainly sets the time step number of hidden layers andnumber of hidden layer nodes Since an increased number ofhidden layers will rapidly increase the time cost consideringthe high requirement of air warfare on timeliness it was setto a single or double hidden layer structure without con-sidering more hidden layers e results of the time stepselection and hidden layer node number selection experi-ments are shown in Figure 9

From Figure 9 it can be seen that the mean value of theprediction mean square error was smallest when the timestep was 8 and themean value of the predictionmean squareerror was smallest when the number of nodes in the singlehidden layer was 18 erefore a network structure with atime step of 8 a single hidden layer and 18 network nodeswas chosen e optimizer was Adam [37] the initiallearning rate was 0001 the decay rate was 09 the trainingepochs were 100 and the batch size was 512

422 Comparison Experiment To verify that the proposedcharacteristic prediction module was effective and efficientit was compared with the RNN [38] LSTM [39] GRU [40]and BiLSTM networks in terms of both real time and meansquare error and a segment of the predicted trajectory withthe characteristic of the distance between the two enemy

GRU2

xt-1 xt

ht-1

ht-1 ht+1

ht+1ht

ht

ht-1α

htα

ht+1α

Θ Θ Θ

xt+1

GRU2

GRU1 GRU1 GRU1

GRU2

Figure 7 BiGRU structure

Y

α1αt-1 αt αn

h1 ht hn

ht-1

St-1 St SnS1

+

Figure 8 Attention mechanism structure

Computational Intelligence and Neuroscience 7

sides was selected for comparison with the actual trajectorye results are shown in Table 1 and Figure 10

From Table 1 we can see that BiLSTM had the longestsingle-step prediction time 0311ms and RNN had theshortest single-step prediction time but its prediction errorwas larger reaching 814 times 10minus 5 BiGRU had the smallestprediction error and its single-step prediction time de-creased by about one-third compared with BiLSTM so it canbe concluded that BiGRU had less internal structure com-plexity [41] with similar or even better performance thanBiLSTM e prediction error of BiGRU was half that ofGRU and it can be concluded that the bidirectionalpropagation mechanism could more effectively utilize theinformation of future moments than the one-way propa-gation Although BiGRU had a longer single-step predictiontime than GRU the single-step prediction time of 0202ms issufficient to provide timely prediction information when thesampling interval is 05s It can be seen from Figure 10 thatthe characteristic trajectories predicted by RNN had a low fitto the actual characteristic trajectories and those predictedby the other four methods had a high fit which is consistentwith the error results in Table 1

43 Experimental Analysis of Intention Recognition Modulee data used in this experiment did not contain futurecharacteristics predicted by the characteristic predictionmodule that is the 12 frames of temporal characteristics ineach sample were all historical with no added predictioncharacteristics e purpose of the experiment was tocompare our methods with those proposed in other liter-atures e intention prediction experiment is described inSection 44

431 Network Structure Selection e optimal networkstructure of BiGRU was selected using PSO [42] with fourparameters number of hidden layers number of hiddenlayer nodes batch size and learning rate e upper andlower limits were set as [450010000005] and[11010000001] e intention recognition error rate wasset as the fitness function the maximum number of itera-tions was 20 and the population size was 30 e experi-mental results and settings of other key hyperparameters []are shown in Table 2

432 Analysis of Intention Recognition Results e BiGRU-Attention intention recognition module was trained andthen the test samples were input into the module e ex-periment showed that the accuracy of the proposed inten-tion recognition network model proposed was achieved Tofurther observe the relationship between the recognitionintentions a confusion matrix of the intention recognitionresults of the test samples was produced e diagonal lineindicates the number of correctly recognized samples andthe results are shown in Figure 11

It can be seen from Figure 11 that the BiGRU-Attentionintention recognitionmodel had a high recognition accuracyand recall rate for all seven intentions In particular the

electronic interference intention recognition precision andrecall rate could reach 100 and 96 respectively A fewcases of mutual recognition errors occurred between sur-veillance and reconnaissance intentions and between feintand attack intentions which should be attributed to the highcharacteristic similarity and deception between the intentionpairs causing the model trained by BiGRU-Attention modelto have similar weights for the two intentions in each pair Asa result the attention layer failed to accurately sense theweight difference between the two intentions leading to asmall number of incorrectly recognized intentions whichaccords with the actual situation

433 Comparison Experiments In the experiment thehighest accuracy of the test set during 200 iterations wasselected as the accuracy of the intention recognition modeland the corresponding loss value was the loss value eBiGRU-Attention intention recognition model was com-pared with the LSTM-based tactical intention recognitionmodel for the battlefield against enemy targets [13] the aerialtarget combat intention recognition model using the Adamalgorithm and ReLU function optimized DBP neural net-work [12] and the stacked self-encoder tactical intentionintelligent recognition model [15] e parameters of thecomparison experiments were set as shown in Table 3 andthe experimental results are shown in Table 4

As can be seen from Table 4 the BiGRU-Attentionintention recognition model was superior to the other threemodels in terms of both accuracy and loss value with 25improvement in accuracy over LSTM and nearly 10 overSAE and DBP thus verifying its effectiveness for aerial targetcombat intention recognition Further analysis shows thatBiGRU-Attention and LSTM as temporal characteristicnetworks based on recurrent neural networks were moreapplicable to aerial target combat intention recognition thanthe other two models further indicating that it is morescientific to make inferences about aerial target combatintention based on temporal characteristic changes

434 Ablation Experiments Although the BiGRU-Atten-tion intention recognition model has been validated incomparison experiments for its effectiveness in operationalintention recognition of aerial targets the comparisonmethod is not a comparison of hybrid experimental modelsof the same type Results of ablation experiments on thesame dataset are shown in Table 5 and Figure 12

From Table 5 the BiGRU-Attention intention recog-nition model had an accuracy 27 19 and 16 per-centages higher than the accuracy of the GRU BiGRU andGRU-attention intention recognition models respectivelye BiGRU-Attention model also had lower loss values thanthe other models BiGRU and GRU-Attention models havesimilar accuracy and loss values and are better than GRUmodels From Figure 12 we can see that the accuracy of thefour models increased and the loss value decreased with thenumber of training epochs the accuracy and loss value of theBiGRU-Attention and BiGRU models converged at around50 rounds and the other two models at about 70 rounds so

8 Computational Intelligence and Neuroscience

the introduction of bidirectional propagation seems to haveeffectively improved model convergence and acceleratedlearning e curves of the BiGRU-Attention model weresignificantly better than those of the other three e ac-curacy and loss curve of the BiGRU and GRU-Attentionmodels after convergence are similar and they are betterthan the GRU model which shows that the basic GRUmodel was significantly improved by introducing the at-tention mechanism and bidirectional propagationmechanism

44 Experimental Analysis of Intention Prediction is ex-periment combined future characteristic states predicted bythe characteristic prediction module and historical charac-teristic states into 12 frames of temporal characteristics thatis the first 11 frames were historical characteristics the 12thwas predicted future characteristics and sample data wereconstructed as described in Section 31

0

1

2

3

4

5

6

7

3 4 5 6 7 8 9 10

Erro

r (10

-5)

Time Step

(a)

012345678

4 6 8 10 12 14 16 18 20Number of Nodes

One LayerTwo Layer

Erro

r (10

-5)

(b)

Figure 9 (a) e effect of different time step on the results (b) e effect of the number of different notes on the results

Table 1 Experimental results of five models

Method Error (10minus5) Prediction time (ms)RNN 814 0089LSTM 275 0133GRU 246 0110BiLSTM 129 0311BiGRU 116 0202

05

04

03

02

010 10 20 30

Moment

Dist

ance

TrueBiGRUBiLSTM

GRULSTMRNN

Figure 10 Characteristic prediction trajectory

Table 2 Experimental parameters

Parameter ValueLoss function Categorical_crossentropyOptimizer AdamDropout 05Hidden layer 3Hidden nodes 334 10 338Batch size 100Learning rate 00014Epoch 200

Figure 11: Confusion matrix of intention recognition on the test set for the seven intention types (surveillance, reconnaissance, feint, attack, penetration, retreat, and electronic interference); the diagonal entries give the number of correctly recognized samples for each class.


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition method of Section 4.3, which has no predictive effect. Precision, recall, and F1-score were used as the model evaluation indicators, with results as shown in Table 6 and Figure 13.
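The per-intent indicators reported in Table 6 can be computed from true and predicted labels as in the sketch below, which uses scikit-learn's precision_recall_fscore_support; the intent ordering and the random toy labels are illustrative.

```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

INTENTS = ["surveillance", "reconnaissance", "feint", "attack",
           "penetration", "retreat", "electronic jamming"]

def per_intent_report(y_true, y_pred):
    """Per-intent precision, recall, and F1-score (labels are class indices 0-6), as in Table 6."""
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(INTENTS))), zero_division=0)
    for name, pi, ri, fi in zip(INTENTS, p, r, f1):
        print(f"{name:>20s}  precision={pi:6.1%}  recall={ri:6.1%}  F1={fi:.3f}")

# Toy usage with random labels in place of real model outputs.
rng = np.random.default_rng(0)
per_intent_report(rng.integers(0, 7, 200), rng.integers(0, 7, 200))
```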

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for surveillance and attack intention recognition. On analysis, the air combat characteristics of the first two intentions are more distinctive, whereas those of the latter two are more similar to the reconnaissance and feint intentions, which causes mutual prediction errors and results in relatively low prediction accuracy for those intentions. Overall, the accuracy of the proposed intention prediction method reached 89.7%, a significant improvement over LSTM, DBP, and SAE, and the prediction is produced one sampling interval (0.5 s) earlier.

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results, with an accuracy of only 70%. Compared with single-step prediction, the two-step prediction of the characteristic prediction module suffers from much greater error.

Table 3: Parameter settings of the comparison models.

Model  Hidden layers  Hidden nodes         Learning rate  Optimizer
SAE    3              256, 128, 128        0.02           SGD
LSTM   3              256, 128, 128        0.001          Adam
DBP    4              256, 512, 512, 256   0.01           Adam

Table 4: Comparison of different intention recognition models.

Model            Accuracy (%)  Loss
BiGRU-Attention  90.5          0.257
LSTM             87.6          0.346
SAE              81.3          0.473
DBP              79.3          0.492

Table 5: Results of the ablation experiment.

Model            Bidirectional  GRU  Attention  Accuracy (%)  Loss
BiGRU-Attention  √              √    √          90.5          0.257
BiGRU            √              √               88.6          0.305
GRU-Attention                   √    √          88.9          0.289
GRU                             √               87.4          0.337

Figure 12: (a) Changes in accuracy during the ablation experiment (BiGRU-Attention, GRU-Attention, BiGRU, and GRU). (b) Changes in loss during the ablation experiment for the same four models.


This error accumulates and the goodness of fit is low, which leads to the low accuracy of two-step intention prediction. However, with the continued development and improvement of multistep prediction methods, we believe the proposed aerial target combat intention prediction method can have better application prospects.
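The two-step case corresponds to recursive (rolled-forward) prediction, sketched below: the first forecast is fed back as the newest input frame before the second forecast is made, so single-step error is carried into the next step. The 8-frame window follows the time step selected for the prediction module, while the predictor here is only a stand-in.

```python
import numpy as np

def recursive_forecast(predict_one_step, window, steps=2):
    """Roll a one-step characteristic predictor forward for multistep prediction.

    The first forecast is appended to the input window before the second forecast is
    made, so any single-step error is fed back and amplified at the next step.
    """
    window = np.array(window, copy=True)        # (time_steps, 16) most recent frames
    outputs = []
    for _ in range(steps):
        nxt = predict_one_step(window)          # one-step forecast, shape (16,)
        outputs.append(nxt)
        window = np.vstack([window[1:], nxt])   # slide the window, reusing the forecast
    return np.stack(outputs)                    # (steps, 16)

# Toy usage: a stand-in predictor that simply repeats the last frame.
dummy_predictor = lambda w: w[-1]
print(recursive_forecast(dummy_predictor, np.random.rand(8, 16)).shape)   # (2, 16)
```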

5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy's combat mission, the threat level between the two sides, and tactical maneuvers. Sample vectors were constructed by preprocessing the aerial target intention characteristic set and encapsulating domain expert knowledge and experience into labels. We improved on the LSTM-based air target intention recognition method by proposing a GRU-based aerial target operational intention recognition model with a bidirectional propagation mechanism and an attention mechanism, which significantly improved accuracy compared with the LSTM, SAE, and DBP intention recognition models.

To further shorten the time needed for air target intention recognition, we also proposed a BiGRU-based air combat characteristic prediction method, and experimental results showed that it can effectively perform single-step characteristic prediction. By combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict the enemy aerial target's operational intention one sampling point in advance with 89.7% accuracy. How to distinguish easily confused intentions more accurately will be our next research direction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grant nos. 61703426 and 72001214).

Table 6: Intention prediction performance measurement (I = BiGRU-Attention, II = LSTM, III = SAE, IV = DBP aerial target tactical intention recognition models).

Precision (%)          I     II    III   IV
Surveillance           83.0  79.2  70.5  68.7
Reconnaissance         89.9  86.8  81.0  76.3
Feint                  90.6  89.1  82.5  78.1
Attack                 82.3  79.6  71.2  70.2
Penetration            90.3  90.0  83.4  80.8
Retreat                97.5  98.3  93.4  95.3
Electronic jamming     99.5  96.4  96.6  94.4

Recall (%)             I     II    III   IV
Surveillance           90.0  85.6  77.1  75.4
Reconnaissance         85.3  78.8  75.4  72.4
Feint                  85.4  82.9  73.8  75.5
Attack                 90.6  89.7  81.4  73.3
Penetration            89.9  89.6  84.9  83.1
Retreat                94.7  93.9  91.5  91.1
Electronic jamming     95.5  96.0  89.7  90.6

F1-score               I     II    III   IV
Surveillance           0.859 0.823 0.737 0.719
Reconnaissance         0.876 0.826 0.781 0.743
Feint                  0.879 0.859 0.779 0.768
Attack                 0.862 0.843 0.759 0.717
Penetration            0.901 0.897 0.841 0.819
Retreat                0.961 0.960 0.924 0.931
Electronic jamming     0.975 0.962 0.931 0.925

Figure 13: (a) Changes in accuracy of the four models (proposed prediction method, LSTM, DBP, and SAE). (b) Changes in loss of the four models.


This work was also supported by the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China (Grant no. 2019038), and the Innovation Capability Support Plan of Shaanxi, China (Grant no. 2020KJXX-065).

References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.

[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.

[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battlefield based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.

[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.

[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, School of National University of Defense Technology, Changsha, China, 2006.

[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.

[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.

[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.

[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.

[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 8, pp. 9–13, 2015.

[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.

[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.

[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.

[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.

[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.

[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.

[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 8, pp. 1847–1853, 2016.

[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.

[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.

[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.

[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.

[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.

[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 4, pp. 50–52, 2001.

[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 3, pp. 42–45, 2012.

[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.

[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on Adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.

[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.

[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.

[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.

[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.

[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.

[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.

[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.


[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.

[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.

[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.

[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.

[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.

[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.

[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.

[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.

[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.


433 Comparison Experiments In the experiment thehighest accuracy of the test set during 200 iterations wasselected as the accuracy of the intention recognition modeland the corresponding loss value was the loss value eBiGRU-Attention intention recognition model was com-pared with the LSTM-based tactical intention recognitionmodel for the battlefield against enemy targets [13] the aerialtarget combat intention recognition model using the Adamalgorithm and ReLU function optimized DBP neural net-work [12] and the stacked self-encoder tactical intentionintelligent recognition model [15] e parameters of thecomparison experiments were set as shown in Table 3 andthe experimental results are shown in Table 4

As can be seen from Table 4 the BiGRU-Attentionintention recognition model was superior to the other threemodels in terms of both accuracy and loss value with 25improvement in accuracy over LSTM and nearly 10 overSAE and DBP thus verifying its effectiveness for aerial targetcombat intention recognition Further analysis shows thatBiGRU-Attention and LSTM as temporal characteristicnetworks based on recurrent neural networks were moreapplicable to aerial target combat intention recognition thanthe other two models further indicating that it is morescientific to make inferences about aerial target combatintention based on temporal characteristic changes

434 Ablation Experiments Although the BiGRU-Atten-tion intention recognition model has been validated incomparison experiments for its effectiveness in operationalintention recognition of aerial targets the comparisonmethod is not a comparison of hybrid experimental modelsof the same type Results of ablation experiments on thesame dataset are shown in Table 5 and Figure 12

From Table 5 the BiGRU-Attention intention recog-nition model had an accuracy 27 19 and 16 per-centages higher than the accuracy of the GRU BiGRU andGRU-attention intention recognition models respectivelye BiGRU-Attention model also had lower loss values thanthe other models BiGRU and GRU-Attention models havesimilar accuracy and loss values and are better than GRUmodels From Figure 12 we can see that the accuracy of thefour models increased and the loss value decreased with thenumber of training epochs the accuracy and loss value of theBiGRU-Attention and BiGRU models converged at around50 rounds and the other two models at about 70 rounds so

8 Computational Intelligence and Neuroscience

the introduction of bidirectional propagation seems to haveeffectively improved model convergence and acceleratedlearning e curves of the BiGRU-Attention model weresignificantly better than those of the other three e ac-curacy and loss curve of the BiGRU and GRU-Attentionmodels after convergence are similar and they are betterthan the GRU model which shows that the basic GRUmodel was significantly improved by introducing the at-tention mechanism and bidirectional propagationmechanism

44 Experimental Analysis of Intention Prediction is ex-periment combined future characteristic states predicted bythe characteristic prediction module and historical charac-teristic states into 12 frames of temporal characteristics thatis the first 11 frames were historical characteristics the 12thwas predicted future characteristics and sample data wereconstructed as described in Section 31

0

1

2

3

4

5

6

7

3 4 5 6 7 8 9 10

Erro

r (10

-5)

Time Step

(a)

012345678

4 6 8 10 12 14 16 18 20Number of Nodes

One LayerTwo Layer

Erro

r (10

-5)

(b)

Figure 9 (a) e effect of different time step on the results (b) e effect of the number of different notes on the results

Table 1 Experimental results of five models

Method Error (10minus5) Prediction time (ms)RNN 814 0089LSTM 275 0133GRU 246 0110BiLSTM 129 0311BiGRU 116 0202

05

04

03

02

010 10 20 30

Moment

Dist

ance

TrueBiGRUBiLSTM

GRULSTMRNN

Figure 10 Characteristic prediction trajectory

Table 2 Experimental parameters

Parameter ValueLoss function Categorical_crossentropyOptimizer AdamDropout 05Hidden layer 3Hidden nodes 334 10 338Batch size 100Learning rate 00014Epoch 200

211 211

31

0

0

0

0

8

25

256

0

0

0

0

0

0

0

314

28

0

4

0

0

38

326

17

0

1

0

0

0

6

255

9

0

0

0

0

0

6

233

0

0

0

0

0

0

0

215

6

300

250

200

150

100

50

0

surveillance

reconnaissance

surv

eilla

nce

reco

nnai

ssan

ce

feint

fein

t

attack

atta

ck

penetration

pene

trat

ion

retreat

retre

at

electronicinterference

elec

troni

cin

terfe

renc

e

Figure 11 Confusion matrix of intention recognition

Computational Intelligence and Neuroscience 9

To verify that the proposed intention prediction methodcan effectively recognize the enemyrsquos intention in advance itwas compared with the intention recognition method inSection 5 which has no prediction effect and the modelevaluation indicators of precision rate recall rate and F1-score were used to verify the model with results as shown inTable 6 and Figure 13

From Table 6 and Figure 13 the proposed intentionprediction method had high prediction accuracy for retreatand electronic jamming intentions but relatively low ac-curacy for surveillance and attack intention recognitionAfter analysis the air combat characteristics of the first twointentions were more obvious and the air combat

characteristics of the latter two intentions were more similarto those of the reconnaissance and feint intentions whichcauses mutual prediction error that results in relatively lowaccuracy of intention prediction Overall the accuracy of theproposed intention prediction method could reach 897which is a significant improvement in accuracy comparedwith LSTM DBP and SAE and could produce the pre-diction one sampling interval (05 s) earlier

In addition the attempt to predict the enemyrsquos intentionin advance of two sampling points did not yield satisfactoryresults and the accuracy could only reach 70 Comparedwith single-step prediction the two-step prediction of thecharacteristic prediction module has too much error

Table 3 Comparison of model parameter settings

Model Number of hidden layers Hidden notes Learning rate OptimizerSAE 3 256 128 128 002 SGDLSTM 3 256 128 128 0001 AdamDBP 4 256 512 512 256 001 Adam

Table 4 Comparison of different intention recognition models

Model Accuracy () LossBiLSTM-Attention 905 0257LSTM 876 0346SAE 813 0473DBP 793 0492

Table 5 Results of ablation experiment

Model composition structureAccuracy () Loss

Bidirectional GRU Attentionradic radic radic 905 0257

radic radic 886 0305radic radic 889 0289

radic 874 0337

80

60

40

20

0 50 100 200Epoch

Accu

racy

()

150

BiGRU-AttentionGRU-Attention

BiGRUGRU

(a)

20

15

10Loss

05

0 50 100 150 200Epoch

BiGRU-AttentionGRU-Attention

BiGRUGRU

(b)

Figure 12 (a) Changes in accuracy of ablation experiment (b) Changes in loss of ablation experiment

10 Computational Intelligence and Neuroscience

accumulation and the goodness of fit is low which leads tothe low accuracy of intention prediction However with thecontinued development and improvement of multistepprediction methods it is believed that the proposed aerialtarget combat intention prediction method can have betterapplication prospects

5 Conclusions

For the problem of aerial target combat intention recog-nition we adopted a hierarchical strategy to select 16-di-mensional air combat characteristics from threeperspectives enemy combat mission threat level betweentwo sides and tactical maneuvers e sample vector isconstructed by preprocessing the intent feature set data ofthe aerial target and encapsulating the domain expertknowledge and experience into labels We improved the airtarget intention recognition method based on LSTM pro-posed a GRU-based aerial target operational intentionrecognition model and introduced a bidirectional propa-gation mechanism and attention mechanism to significantlyimprove accuracy compared to the LSTM SAE and DBPintention recognitionmodels In order to further shorten thetime of air target intention recognition we proposed the

BiGRU-based air combat characteristic prediction methodand experimental results showed that it can effectivelyperform single-step characteristic prediction Combiningthe BiGRU-Attention intention recognitionmodule with theBiGRU characteristic prediction module we were able topredict enemy aerial target operational intention onesampling point in advance with 897 accuracy How tomore accurately distinguish confusing intentions will be ournext research direction

Data Availability

e data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

e authors declare that there are no conflicts of interestregarding the publication of this study

Acknowledgments

is research was funded by the National Natural ScienceFoundation of China (Grants nos 61703426 and 72001214)Young Talent fund of University Association for Science and

Table 6 Intention prediction performance measurement

Evaluation Index Precision() Recall () F1 scoreI II III IV I II III IV I II III IV

Intent type

Surveillance 830 792 705 687 900 856 771 754 0859 0823 0737 0719Reconnaissance 899 868 810 763 853 788 754 724 0876 0826 0781 0743

Feint 906 891 825 781 854 829 738 755 0879 0859 0779 0768Attack 823 796 712 702 906 897 814 733 0862 0843 0759 0717

Penetration 903 900 834 808 899 896 849 831 0901 0897 0841 0819Retreat 975 983 934 953 947 939 915 911 0961 0960 0924 0931

Electronic jamming 995 964 966 944 955 960 897 906 0975 0962 0931 0925I II III and IV respectively represent the BiGRU-Attention LSTM SAE and DBP aerial target air tactical intention recognition models

80

60

40

20

0 50 100 200Epoch

150

Accu

racy

()

PredictionLSTM

DBPSAE

(a)

20

15

10Loss

05

0 50 100 150 200Epoch

PredictionLSTM

DBPSAE

(b)

Figure 13 (a) Changes in accuracy of four models (b) Changes in loss of four models

Computational Intelligence and Neuroscience 11

Technology in Shaanxi China under Grant no 2019038and the Innovation Capability Support Plan of ShaanxiChina under Grant no 2020KJXX-065

References

[1] Z Liu Q Wu S Chen and M Chen ldquoPrediction of un-manned aerial vehicle target intention under incompleteinformationrdquo SCIENTIA SINICA Informationis vol 50 no 5pp 704ndash717 2020

[2] T Zhou M Chen Y Wang J He and C Yang ldquoInformationentropy-based intention prediction of aerial targets underuncertain and incomplete informationrdquo Entropy vol 22no 3 p 279 2020

[3] Y L Sun and L Bao ldquoStudy on recognition tecnique oftargetsrsquo tactical intentions in sea battle field based on D-Sevidence theoryrdquo Ship Electronic Engineering vol 32 no 5pp 48ndash51 2012

[4] F J Zhao Z J Zhou and C H Hu ldquoAerial target intentionrecognition approach based on belief-rule-base and evidentialreasoningrdquo Electronics Optics and Control vol 24 no 8pp 15ndash19+50 2017

[5] X Xiae Study of Target Intent Assessment Method Based onthe Template-Matching School of National University ofDefense Technology Changsha China 2006

[6] X T Li e Research and Implementation of Situation As-sessment in the Target Intention Recognition North Universityof China Taiyuan China 2012

[7] X Yin M Zhang and M Q Chen ldquoCombat intentionrecognition of the target in the air based on discri-minantanalysisrdquo Journal of Projectiles Rockets Missiles and Guid-ance vol 38 no 3 pp 46ndash50 2018

[8] Q Jin X Gou and W Jin ldquoIntention recognition of aerialtargets based on bayesian optimization algorithmrdquo in Pro-ceedings of the 2017 2nd IEEE International Conference onIntelligent Transportation Engineering (ICITE) IEEE Singa-pore September 2017

[9] Y Song X H Zhang and Z K Wang ldquoTarget intentioninference model based on variable structure bayesian net-workrdquo in Proceedings of the CiSE 2009 pp 333ndash340 WuhanChina December 2009

[10] Y J Liu G H Kou and J H Song ldquoTarget recognition basedon RBF neural networkrdquo Fire Control and Command Controlvol 40 no 08 pp 9ndash13 2015

[11] X Y Zhai F B Yang and L N Ji ldquoAir combat targets threatassessment based on standardized fully connected networkand residual networkrdquo Fire Control and Command Controlvol 45 no 6 pp 39ndash44 2020

[12] W W Zhou P Y Yao and J Y Zhang ldquoCombat intentionrecognition foraerial targets based on deep neural networkrdquoActa Aeronautica et Astronautica Sinca vol 39 no 11 ArticleID 322468 2018

[13] W Ou S J Liu and X Y He ldquoStudy on intelligent recog-nition model of enemy targetrsquos tactical intention on battle-fieldrdquo Computer Simulation vol 34 no 9 pp 10ndash14+192017

[14] S Bhattacharya P K R Maddikunta R Kaluri et al ldquoA novelPCA-firefly based XGBoost classification model for intrusiondetection in networks using GPUrdquo Electronics vol 9 no 2p 219 2020

[15] W Ou S J Liu and X Y He ldquoTactical intention recognitionalgorithm based on encoded temporal featuresrdquo CommandControl amp Simulation vol 38 no 6 pp 36ndash41 2016

[16] G Y Lu and Y Y Ding ldquoStudy on intention recognition tofoe of underwater platformrdquo Command Control amp Simula-tion vol 34 no 6 pp 100ndash102 2012

[17] H Chen Q L Ren and Y Hua ldquoFuzzy neural network basedtactical intention recognition for sea targetsrdquo Systems Engi-neering and Electronics vol 38 no 08 pp 1847ndash1853 2016

[18] F Teng S Liu and Y F Song ldquoBiLSTM-attention an tacticalintention recognition modelrdquo Aero Weaponry vol 50 2020

[19] J Xue J Zhu J Xiao S Tong and L Huang ldquoPanoramicconvolutional long short-term memory networks for combatintension recognition of aerial targetsrdquo IEEE Access vol 8pp 183312ndash183323 2020

[20] H Guo H J Xu and L Liu ldquoTarget threat assessment of aircombat based on support vector machines for regressionrdquoJournal of Beijing University of Aeronautics and Astronauticsvol 36 no 1 pp 123ndash126 2010

[21] I Kojadinovic and J-L Marichal ldquoEntropy of bi-capacitiesrdquoEuropean Journal of Operational Research vol 178 no 1pp 168ndash184 2007

[22] Z F Xi A Xu and Y X Kou ldquoTarget threat assessment in aircombat based on PCA-MPSO-ELM algo-rithmrdquo Acta Aero-nautica et Astronautica Sinica vol 41 no 9 p 323895 2020

[23] K Q Zhu and Y F Dong ldquoStudy on the design of air combatmaneuver libraryrdquo Aeronautical Computing Technologyno 04 pp 50ndash52 2001

[24] S Y Zhou W H Wu and X Li ldquoAnalysis of air combatmaneuver decision set modelrdquo Aircraft Design vol 32 no 03pp 42ndash45 2012

[25] G YatingWWu L Qiongbin C Fenghuang and C QinqinldquoFault diagnosis for power converters based on optimizedtemporal convolutional networkrdquo IEEE Transactions on In-strumentation and Measurement vol 70 pp 1ndash10 2021

[26] L Xie D L Ding and Z L Wei ldquoReal time prediction ofmaneuver trajectory based on adaboost-PSO-LSTM net-workrdquo Systems Engineering and Electronics vol 43 no 6pp 1651ndash1658 2021

[27] J S Li W Liang and X M Liu ldquoe multi-attribute eval-uation of menace of targets in midcourse of ballistic missilebased on maximal windage methodrdquo Systems Engineeringeory amp Practice no 5 pp 164ndash167 2007

[28] X Wang R N Yang and J L Zuo ldquoTrajectory prediction oftarget aircraft based on HPSO-TPFENN neural networkrdquoJournal of Northwestern Polytechnical University vol 37no 3 pp 612ndash620 2019

[29] X Wei L Zhang H-Q Yang L Zhang and Y-P YaoldquoMachine learning for pore-water pressure time-series pre-diction application of recurrent neural networksrdquo GeoscienceFrontiers vol 12 no 1 pp 453ndash467 2021

[30] Z Xu W Zeng X Chu and P Cao ldquoMulti-aircraft trajectorycollaborative prediction based on social long short-termmemory networkrdquo Aerospace vol 8 no 4 p 115 2021

[31] Y C Sun R L Tian and X F Wang ldquoEmitter signal rec-ognition based on improved CLDNNrdquo Systems Engineeringand Electronics vol 43 no 1 pp 42ndash47 2021

[32] J X Chen D M Jiang and Y N Zhang ldquoA hierarchicalbidirectional GRU model with attention for EEG-basedemotion classificationrdquo IEEE Access vol 7 pp 118530ndash118540 2019

[33] Y Song S Gao Y Li L Jia Q Li and F Pang ldquoDistributedattention-based temporal convolutional network forremaining useful life predictionrdquo IEEE Internet of ingsJournal vol 8 no 12 pp 9594ndash9602 2021

12 Computational Intelligence and Neuroscience

[34] P Anderson X He C Buehler et al ldquoBottom-up and top-down attention for image captioning and visual questionansweringrdquo 2017 httpsarxivorgabs170707998

[35] W Wang Y X Sun and Q J Qi ldquoText sentiment classifi-cation model based on BiGRU-attention neural networkrdquoApplication Research of Computers vol 36 no 12pp 3558ndash3564 2019

[36] Z Yang D Yang C Dyer X He A J Smola and E H HovyldquoHierarchical attention networks for document classifica-tionrdquo in Proceedings of the HLT-NAACL pp 1480ndash1489 SanDiego CA USA June 2016

[37] D Kingma and J Ba ldquoAdam a method for stochastic opti-mizationrdquo Computer Science vol 1 2014

[38] Y Zhang Y Li and W Xian ldquoA recurrent neural networkbased method for predicting the state of aircraft air condi-tioning systemrdquo in Proceedings of the 2017 IEEE SymposiumSeries on Computational Intelligence (SSCI) December 2017

[39] W Zeng Z Quan Z Zhao C Xie and X Lu ldquoA deeplearning approach for aircraft trajectory prediction in ter-minal airspacerdquo IEEE Access vol 8 pp 151250ndash151266 2020

[40] J Cui Y Cheng and X Cui ldquoState change trend prediction ofaircraft pump source system based on GRU networkrdquo inProceedings of the 2020 39th Chinese Control Conference(CCC) Shenyang China July 2020

[41] R Adelia S Suyanto and U N Wisesty ldquoIndonesian ab-stractive text summarization using bidirectional gated re-current unitrdquo Procedia Computer Science vol 157pp 581ndash588 2019

[42] P C Fourie and A A Groenwold ldquoe particle swarmoptimization algorithm in size and shape optimizationrdquoStructural and Multidisciplinary Optimization vol 23 no 4pp 259ndash267 2002

Computational Intelligence and Neuroscience 13

sequence value at time t 1113957ht is the candidate output stateWrWz W1113957h

Ur Uz and U1113957hare weight coefficients corre-

sponding to each component tanh is the hyperbolic tangentfunction and ⊙ is the Hadamard product

e traditional GRU structure propagates unidirec-tionally along the sequence transmission direction and itacquires historical information before the current momentignoring future information BiGRU as shown in Figure 7includes a forward and backward GRU which can capturethe characteristics of the information before and after thecurrent moment [32] In Figure 7 GRU1 is a forward GRUand GRU2 is a backward GRUe output state ht of BiGRUat moment t can be obtained from the forward output statehrarr

t determined by the input xt at moment t and output statehrarr

tminus1 of the forward GRU at moment tminus1 and the backwardoutput state h

larr

t determined by the input xt at moment t andoutput state h

rarrt+1 of the backward GRU at moment t+ 1

313 Output Layer e output ht of the BiGRU network inthe hidden layer is fed to the fully connected layer in theoutput layer and the final prediction characteristic valuesare output using a linear activation function

32 Intention Recognition Module e BiGRU-Attentionintention recognition module has input hidden and outputlayers e hidden layer has BiGRU and Attention mech-anism layers In the network the Input (Number ofsamples1216) and the Output 7 where 12 denotes thetime step 16 denotes the number of characteristic dimen-sions and 7 denotes the total number of intention typesedetailed description will be given below

321 Input Layer e input layer of the characteristicprediction module has cleaned and normalized the collectedaerial target operational intention characteristics so theintention recognize module input layer is mainly for theconstruction of the sample data of the intention recognitionmodule If the characteristic data of the 1sim12 moments are

used to predict the intention during that time the functionmapping relationship is

Q1 f v1 v2 v11w12( 1113857 (7)

where Q1 denotes the prediction intention types in timeperiods 1sim12 vi i isin (1 2 11) denotes the historicalcharacteristic data at moment i and w12 denotes thecharacteristic data predicted by the characteristic predictionmodule at moment 12 (v1 v2 v11 v12) is the first set ofinput data labeled as intention type q1 corresponding totime periods 1sim12 (v2 v3 v12 v13) is the second set ofinput data labeled as the intention type q2 corresponding totime periods 2sim13 and so one training sample input dataand labels are composed as shown below e test andtraining sample data are constructed similarly except theformer replaces the characteristic vt at the last moment ofeach sample with the characteristic wt predicted by thecharacteristic prediction module that is the input data are(vi vi+1 vi+10wi+11) and the label is the intention typeqi corresponding to the time period isim i+ 11

v1 v2 middot middot middot vm

v2 v3 middot middot middot vm+1

⋮ ⋮ ⋱ ⋮

v11 v12 middot middot middot vm+10

v12 v13 middot middot middot vm+11

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

q1 q2 middot middot middot qm1113858 1113859

(8)

After one-hot encoding of the intention labels it is sentto the hidden layer together with the constructed sampledata

322 Hidden Layer e hidden layer contains the BiGRUnetwork layer and attention mechanism layer e BiGRUlayer has been described e attention mechanism layer isdescribed below e attention mechanism [33ndash35] operatessimilarly to the human brain by focusing on the local contentof an object according to its purpose e attentionmechanism highlights characteristics that account for agreater proportion of prediction results by calculating theweights of characteristic vectors output from the BiGRUnetwork at different moments In aerial target combat in-tention recognition the neural network assigns weight co-efficients so as to focus on some key characteristics duringthe training process through the attention mechanism Itsimplementation is to learn the importance of each char-acteristic and then assign the corresponding weight coeffi-cient according to its importance For example if an enemyaircraft executes a penetration its flight altitude and headingangle will be assigned higher weights e structure of theattention mechanism model is shown in Figure 8

e characteristic vector ht output by the BiGRU net-work at moment t is input to the attention mechanism layerto obtain the initial state vector St Learn the initializationvector et of the attention mechanism by equation (9) andthe attention weights are probability by equation (10) iethe softmax function to obtain the weight probability vectorαt e final state vector Y is obtained by equation (11) [36]e formula is as follows

times

times +

1-

times

tanhσσ

ht-1

xt

ht

rt zt ht~

Resetgate

Updategate

Figure 6 GRU structure

6 Computational Intelligence and Neuroscience

et tan h WwSt + bw( 1113857 (9)

αt exp etuw( 1113857

1113936ti1 etuw( 1113857

(10)

Y 1113944n

t1αtSt (11)

where Ww is the weight coefficient matrix bw is the biascoefficient matrix and uw is a matrix vector initializedrandomly and continuously learning with training

323 Output Layer e output of the attention mechanismlayer is fed into the multiclassification activation softmaxfunction which outputs the label with the highest proba-bility of aerial target combat intention e enemy aerialtarget combat intention can be recognized by parsing thelabel as in Figure 2 e output prediction label is

yk softmax(WY + b) (12)

where W is the weight coefficient matrix with training b isthe corresponding bias matrix and yk is the predicted labelof the output

4 Experimental Analysis

41 Experimental Dataset and Environment is experi-ment took an airspace UAV close-range engagement as theresearch background and the experimental data wereextracted from a combat simulation system Using thecharacteristic processing and coding described in this paperthe state characteristics of 12 consecutive frames of theenemy aerial target were collected for each sample [13]where 10000 samples (8000 training and 2000 testing) wereconstructed including 16 pieces of information such asflight speed flight altitude azimuth and jamming status ascharacteristics Due to a large amount of data in the sampleset intention-type labels were generated by computeraccording to the rules and experts corrected data with in-tention classification ambiguity Dataset labels includedseven intention types and the combat intention data

consisted of 118 surveillance 1465 reconnaissance1815 feint 18 attack 139 surprise 123 retreat and112 electronic jamming

We used Python 38 on a Quadro RTX 5000PCleSSE2GPU with CUDA 110 acceleration environment with aKeras 243 deep learning framework and an x86-64 Cen-tOS7 PC system Intelreg Xeonreg Sliver 4110 CPU 210GHzand 64GB RAM

42 Experimental Analysis of Characteristic PredictionModule e task of the characteristic prediction module isto predict the future characteristics of enemy aerial targetswhich are later input to the intent recognition module topredict enemy aerial target intent e average of the meansquare error (which we will refer to as ldquoerrorrdquo) of 16-di-mensional features was used as the evaluation index

421 Network Structure Selection e network structuremainly sets the time step number of hidden layers andnumber of hidden layer nodes Since an increased number ofhidden layers will rapidly increase the time cost consideringthe high requirement of air warfare on timeliness it was setto a single or double hidden layer structure without con-sidering more hidden layers e results of the time stepselection and hidden layer node number selection experi-ments are shown in Figure 9

From Figure 9 it can be seen that the mean value of theprediction mean square error was smallest when the timestep was 8 and themean value of the predictionmean squareerror was smallest when the number of nodes in the singlehidden layer was 18 erefore a network structure with atime step of 8 a single hidden layer and 18 network nodeswas chosen e optimizer was Adam [37] the initiallearning rate was 0001 the decay rate was 09 the trainingepochs were 100 and the batch size was 512

422 Comparison Experiment To verify that the proposedcharacteristic prediction module was effective and efficientit was compared with the RNN [38] LSTM [39] GRU [40]and BiLSTM networks in terms of both real time and meansquare error and a segment of the predicted trajectory withthe characteristic of the distance between the two enemy

GRU2

xt-1 xt

ht-1

ht-1 ht+1

ht+1ht

ht

ht-1α

htα

ht+1α

Θ Θ Θ

xt+1

GRU2

GRU1 GRU1 GRU1

GRU2

Figure 7 BiGRU structure

Y

α1αt-1 αt αn

h1 ht hn

ht-1

St-1 St SnS1

+

Figure 8 Attention mechanism structure

Computational Intelligence and Neuroscience 7

sides was selected for comparison with the actual trajectorye results are shown in Table 1 and Figure 10

From Table 1 we can see that BiLSTM had the longestsingle-step prediction time 0311ms and RNN had theshortest single-step prediction time but its prediction errorwas larger reaching 814 times 10minus 5 BiGRU had the smallestprediction error and its single-step prediction time de-creased by about one-third compared with BiLSTM so it canbe concluded that BiGRU had less internal structure com-plexity [41] with similar or even better performance thanBiLSTM e prediction error of BiGRU was half that ofGRU and it can be concluded that the bidirectionalpropagation mechanism could more effectively utilize theinformation of future moments than the one-way propa-gation Although BiGRU had a longer single-step predictiontime than GRU the single-step prediction time of 0202ms issufficient to provide timely prediction information when thesampling interval is 05s It can be seen from Figure 10 thatthe characteristic trajectories predicted by RNN had a low fitto the actual characteristic trajectories and those predictedby the other four methods had a high fit which is consistentwith the error results in Table 1

4.3. Experimental Analysis of Intention Recognition Module. The data used in this experiment did not contain future characteristics predicted by the characteristic prediction module; that is, the 12 frames of temporal characteristics in each sample were all historical, with no added prediction characteristics. The purpose of the experiment was to compare our method with those proposed in other literature. The intention prediction experiment is described in Section 4.4.

4.3.1. Network Structure Selection. The optimal network structure of BiGRU was selected using PSO [42] over four parameters: the number of hidden layers, the number of hidden layer nodes, the batch size, and the learning rate. The upper and lower limits were set as [4, 500, 1000, 0.005] and [1, 10, 100, 0.0001], respectively. The intention recognition error rate was used as the fitness function, the maximum number of iterations was 20, and the population size was 30. The experimental results and the settings of other key hyperparameters are shown in Table 2.
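The sketch below illustrates, under stated assumptions, how such a search could be set up: a BiGRU-Attention classifier builder (with one node count shared across layers and a simple additive attention pooling, both simplifications of the paper's model) and a plain global-best PSO over the four hyperparameters, with the recognition error rate as the fitness. The bounds, budgets, and data here are illustrative, not the paper's code.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

N_FRAMES, N_FEATURES, N_CLASSES = 12, 16, 7

def build_bigru_attention(n_layers: int, n_nodes: int, learning_rate: float) -> tf.keras.Model:
    """Stacked BiGRU + simple additive attention pooling + softmax over the 7 intentions."""
    inputs = layers.Input(shape=(N_FRAMES, N_FEATURES))
    h = inputs
    for _ in range(n_layers):
        h = layers.Bidirectional(layers.GRU(n_nodes, return_sequences=True))(h)
        h = layers.Dropout(0.5)(h)                      # dropout 0.5, as in Table 2
    scores = layers.Dense(1, activation="tanh")(h)      # score each time step
    weights = layers.Softmax(axis=1)(scores)            # attention coefficients over time
    context = layers.Lambda(lambda z: tf.reduce_sum(z[0] * z[1], axis=1))([weights, h])
    outputs = layers.Dense(N_CLASSES, activation="softmax")(context)
    model = models.Model(inputs, outputs)
    model.compile(optimizer=optimizers.Adam(learning_rate),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model

def error_rate(params, data) -> float:
    """PSO fitness: intention recognition error rate on a validation split."""
    (x_tr, y_tr), (x_val, y_val) = data
    n_layers, n_nodes, batch = (int(round(params[0])), int(round(params[1])), int(round(params[2])))
    model = build_bigru_attention(n_layers, n_nodes, params[3])
    model.fit(x_tr, y_tr, epochs=3, batch_size=batch, verbose=0)   # short budget for the search
    _, acc = model.evaluate(x_val, y_val, verbose=0)
    return 1.0 - acc

def pso_search(fit, lower, upper, n_particles=30, n_iter=20, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Plain global-best PSO over the box [lower, upper]."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lower, upper, size=(n_particles, len(lower)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fit(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, pos.shape[1]))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lower, upper)
        vals = np.array([fit(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

# Tiny dummy data so the sketch runs end to end (use the real training/test split in practice);
# small particle/iteration counts keep the demo cheap (the paper uses 30 particles, 20 iterations).
rng = np.random.default_rng(6)
x = rng.random((256, N_FRAMES, N_FEATURES)).astype("float32")
y = tf.keras.utils.to_categorical(rng.integers(0, N_CLASSES, 256), N_CLASSES)
data = ((x[:200], y[:200]), (x[200:], y[200:]))
lower, upper = np.array([1, 10, 100, 1e-4]), np.array([4, 500, 1000, 5e-3])
best = pso_search(lambda p: error_rate(p, data), lower, upper, n_particles=3, n_iter=2)
print("layers, nodes, batch, learning rate:", best)
```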

4.3.2. Analysis of Intention Recognition Results. The BiGRU-Attention intention recognition module was trained, and the test samples were then input into the module. The experiment showed that the proposed intention recognition network model achieved an accuracy of 90.5% (Table 4). To further examine the relationships among the recognized intentions, a confusion matrix of the intention recognition results on the test samples was produced, in which the diagonal entries indicate the numbers of correctly recognized samples; the results are shown in Figure 11.

It can be seen from Figure 11 that the BiGRU-Attention intention recognition model had high recognition precision and recall for all seven intentions. In particular, the electronic interference intention recognition precision and recall could reach 100% and 96%, respectively. A few mutual recognition errors occurred between the surveillance and reconnaissance intentions and between the feint and attack intentions, which should be attributed to the high characteristic similarity and the deception within each intention pair, causing the trained BiGRU-Attention model to assign similar weights to the two intentions in each pair. As a result, the attention layer failed to accurately sense the weight difference between the two intentions, leading to a small number of incorrectly recognized intentions, which accords with the actual situation.
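A confusion matrix such as the one in Figure 11 can be produced from a classifier's test-set predictions with scikit-learn, as in the brief sketch below (dummy labels are used so the snippet runs on its own; in practice the probabilities would come from the trained model).

```python
import numpy as np
from sklearn.metrics import confusion_matrix

INTENTIONS = ["surveillance", "reconnaissance", "feint", "attack",
              "penetration", "retreat", "electronic interference"]

def intention_confusion(y_true_onehot: np.ndarray, y_prob: np.ndarray) -> np.ndarray:
    """Rows: true intention, columns: predicted intention (argmax of the softmax output)."""
    y_true = np.argmax(y_true_onehot, axis=1)
    y_pred = np.argmax(y_prob, axis=1)
    return confusion_matrix(y_true, y_pred, labels=range(len(INTENTIONS)))

# Dummy example (in practice y_prob would be model.predict(X_test)):
rng = np.random.default_rng(4)
true = np.eye(len(INTENTIONS))[rng.integers(0, 7, size=200)]
prob = rng.random((200, 7))
cm = intention_confusion(true, prob)
print(cm, np.trace(cm))  # diagonal entries are correctly recognized samples
```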

4.3.3. Comparison Experiments. In the experiment, the highest test-set accuracy over 200 training epochs was selected as the accuracy of the intention recognition model, with the corresponding loss taken as its loss value. The BiGRU-Attention intention recognition model was compared with the LSTM-based battlefield tactical intention recognition model for enemy targets [13], the aerial target combat intention recognition model using a DBP neural network optimized with the Adam algorithm and ReLU function [12], and the stacked autoencoder (SAE) tactical intention intelligent recognition model [15]. The parameter settings of the comparison experiments are shown in Table 3, and the experimental results are shown in Table 4.

As can be seen from Table 4, the BiGRU-Attention intention recognition model was superior to the other three models in terms of both accuracy and loss, with about a 2.5% improvement in accuracy over LSTM and nearly 10% over SAE and DBP, thus verifying its effectiveness for aerial target combat intention recognition. Further analysis shows that BiGRU-Attention and LSTM, as temporal characteristic networks based on recurrent neural networks, were more applicable to aerial target combat intention recognition than the other two models, further indicating that it is more scientific to infer aerial target combat intention from temporal characteristic changes.

4.3.4. Ablation Experiments. Although the effectiveness of the BiGRU-Attention intention recognition model for operational intention recognition of aerial targets has been validated in the comparison experiments, the compared methods are not hybrid models of the same family. Results of ablation experiments on the same dataset are therefore shown in Table 5 and Figure 12.

From Table 5, the BiGRU-Attention intention recognition model had an accuracy 2.7, 1.9, and 1.6 percentage points higher than that of the GRU, BiGRU, and GRU-Attention intention recognition models, respectively. The BiGRU-Attention model also had a lower loss value than the other models. The BiGRU and GRU-Attention models have similar accuracy and loss values, and both are better than the GRU model. From Figure 12, we can see that the accuracy of the four models increased and the loss decreased with the number of training epochs; the accuracy and loss of the BiGRU-Attention and BiGRU models converged at around 50 epochs and those of the other two models at about 70 epochs, so the introduction of bidirectional propagation appears to have effectively improved model convergence and accelerated learning. The curves of the BiGRU-Attention model were significantly better than those of the other three. The accuracy and loss curves of the BiGRU and GRU-Attention models after convergence are similar, and both are better than those of the GRU model, which shows that the basic GRU model was significantly improved by introducing the attention mechanism and the bidirectional propagation mechanism.
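The four ablation variants can be generated from a single builder by toggling the bidirectional and attention components, as in the hypothetical sketch below (layer sizes are placeholders; the attention pooling is the same simplified form used in the earlier sketch).

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_variant(bidirectional: bool, attention: bool,
                  n_frames: int = 12, n_features: int = 16,
                  units: int = 64, n_classes: int = 7) -> tf.keras.Model:
    """GRU / BiGRU / GRU-Attention / BiGRU-Attention ablation variants."""
    inputs = layers.Input(shape=(n_frames, n_features))
    gru = layers.GRU(units, return_sequences=attention)
    h = layers.Bidirectional(gru)(inputs) if bidirectional else gru(inputs)
    if attention:
        scores = layers.Dense(1, activation="tanh")(h)
        weights = layers.Softmax(axis=1)(scores)
        h = layers.Lambda(lambda z: tf.reduce_sum(z[0] * z[1], axis=1))([weights, h])
    outputs = layers.Dense(n_classes, activation="softmax")(h)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    return model

# The four rows of Table 5:
variants = {
    "BiGRU-Attention": build_variant(True, True),
    "GRU-Attention":   build_variant(False, True),
    "BiGRU":           build_variant(True, False),
    "GRU":             build_variant(False, False),
}
```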

4.4. Experimental Analysis of Intention Prediction. This experiment combined the future characteristic states predicted by the characteristic prediction module with historical characteristic states into 12 frames of temporal characteristics; that is, the first 11 frames were historical characteristics and the 12th frame was the predicted future characteristics, and the sample data were constructed as described in Section 3.1.
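The combination of the two modules can be sketched as follows: the characteristic predictor rolls the window one step forward, and the recognizer classifies the window in which the last frame is a prediction rather than an observation (the function and variable names here are illustrative, not taken from the paper).

```python
import numpy as np

def predict_intention_one_step_ahead(history: np.ndarray, predictor, recognizer) -> int:
    """history: the last 11 observed frames, shape (11, 16).

    predictor maps an 8-frame window to the next 16-dimensional frame;
    recognizer maps a 12-frame window to softmax scores over the 7 intentions.
    """
    last_window = history[-8:][np.newaxis, ...]              # (1, 8, 16) for the predictor
    next_frame = predictor.predict(last_window, verbose=0)   # (1, 16) predicted 12th frame
    combined = np.concatenate([history, next_frame], axis=0) # (12, 16): 11 observed + 1 predicted
    scores = recognizer.predict(combined[np.newaxis, ...], verbose=0)
    return int(np.argmax(scores, axis=1)[0])                 # predicted intention label
```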

Figure 9: (a) The effect of different time steps (3 to 10) on the prediction error (10⁻⁵); (b) the effect of the number of hidden-layer nodes (4 to 20, one layer vs. two layers) on the prediction error (10⁻⁵).

Table 1: Experimental results of five models.

Method   Error (10⁻⁵)   Prediction time (ms)
RNN      8.14           0.089
LSTM     2.75           0.133
GRU      2.46           0.110
BiLSTM   1.29           0.311
BiGRU    1.16           0.202

Figure 10: Characteristic prediction trajectory (distance versus moment) for the true trajectory and the RNN, LSTM, GRU, BiLSTM, and BiGRU predictions.

Table 2: Experimental parameters.

Parameter        Value
Loss function    Categorical cross-entropy
Optimizer        Adam
Dropout          0.5
Hidden layers    3
Hidden nodes     334, 10, 338
Batch size       100
Learning rate    0.0014
Epochs           200

Figure 11: Confusion matrix of intention recognition on the test samples (classes: surveillance, reconnaissance, feint, attack, penetration, retreat, electronic interference); diagonal entries are the numbers of correctly recognized samples.


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition method of Section 4.3, which has no prediction capability, and the model evaluation indicators of precision, recall, and F1-score were used to assess the models, with the results shown in Table 6 and Figure 13.
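Per-intention precision, recall, and F1-score of the kind reported in Table 6 can be obtained directly from scikit-learn, as in this brief sketch (dummy labels are used so the snippet runs on its own; in practice the labels would come from the test set and the model's argmax output).

```python
import numpy as np
from sklearn.metrics import classification_report

INTENTIONS = ["surveillance", "reconnaissance", "feint", "attack",
              "penetration", "retreat", "electronic jamming"]

rng = np.random.default_rng(5)
y_true = rng.integers(0, 7, size=2000)
y_pred = np.where(rng.random(2000) < 0.9, y_true, rng.integers(0, 7, size=2000))

print(classification_report(y_true, y_pred, target_names=INTENTIONS, digits=3))
```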

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for the surveillance and attack intentions. On analysis, the air combat characteristics of the first two intentions are more distinctive, whereas the characteristics of the latter two intentions are more similar to those of the reconnaissance and feint intentions, which causes mutual prediction errors and results in relatively low intention prediction accuracy. Overall, the accuracy of the proposed intention prediction method could reach 89.7%, a significant improvement over LSTM, DBP, and SAE, and the prediction could be produced one sampling interval (0.5 s) earlier.

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results, and the accuracy could only reach 70%. Compared with single-step prediction, the two-step prediction of the characteristic prediction module suffers from too much error accumulation, and its goodness of fit is low, which leads to low intention prediction accuracy.

Table 3: Comparison of model parameter settings.

Model   Hidden layers   Hidden nodes         Learning rate   Optimizer
SAE     3               256, 128, 128        0.02            SGD
LSTM    3               256, 128, 128        0.001           Adam
DBP     4               256, 512, 512, 256   0.01            Adam

Table 4: Comparison of different intention recognition models.

Model             Accuracy (%)   Loss
BiGRU-Attention   90.5           0.257
LSTM              87.6           0.346
SAE               81.3           0.473
DBP               79.3           0.492

Table 5: Results of ablation experiment.

Model composition structure           Accuracy (%)   Loss
Bidirectional   GRU   Attention
✓               ✓     ✓               90.5           0.257
✓               ✓                     88.6           0.305
                ✓     ✓               88.9           0.289
                ✓                     87.4           0.337

Figure 12: (a) Changes in accuracy and (b) changes in loss over 200 training epochs in the ablation experiment (BiGRU-Attention, GRU-Attention, BiGRU, and GRU).


However, with the continued development and improvement of multistep prediction methods, it is believed that the proposed aerial target combat intention prediction method can have better application prospects.
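The two-step case can be sketched as recursive single-step prediction: the first predicted frame is fed back into the window to predict the second, so any error in the first prediction propagates into the second, which illustrates the error accumulation noted above (names here are placeholders, not the paper's code).

```python
import numpy as np

def predict_frames_recursively(history: np.ndarray, predictor, n_steps: int = 2) -> np.ndarray:
    """Roll the 8-frame window forward n_steps times, feeding predictions back in.

    history: observed frames, shape (>=8, 16). Returns the n_steps predicted frames.
    """
    window = history[-8:].copy()                                # (8, 16)
    predicted = []
    for _ in range(n_steps):
        next_frame = predictor.predict(window[np.newaxis, ...], verbose=0)[0]  # (16,)
        predicted.append(next_frame)
        window = np.vstack([window[1:], next_frame])            # slide window; prediction replaces observation
    return np.stack(predicted)                                  # (n_steps, 16)
```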

5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy's combat mission, the threat level between the two sides, and tactical maneuvers. The sample vectors were constructed by preprocessing the intention characteristic set data of the aerial target and encapsulating domain expert knowledge and experience into labels. We improved on the LSTM-based air target intention recognition method, proposed a GRU-based aerial target operational intention recognition model, and introduced a bidirectional propagation mechanism and an attention mechanism to significantly improve accuracy compared with the LSTM, SAE, and DBP intention recognition models. To further shorten the time of air target intention recognition, we proposed a BiGRU-based air combat characteristic prediction method, and experimental results showed that it can effectively perform single-step characteristic prediction. Combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict enemy aerial target operational intention one sampling point in advance with 89.7% accuracy. How to more accurately distinguish confusing intentions will be our next research direction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grants nos. 61703426 and 72001214), the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China, under Grant no. 2019038, and the Innovation Capability Support Plan of Shaanxi, China, under Grant no. 2020KJXX-065.

Table 6: Intention prediction performance measurement.

                     Precision (%)            Recall (%)               F1 score
Intent type          I     II    III   IV     I     II    III   IV     I      II     III    IV
Surveillance         83.0  79.2  70.5  68.7   90.0  85.6  77.1  75.4   0.859  0.823  0.737  0.719
Reconnaissance       89.9  86.8  81.0  76.3   85.3  78.8  75.4  72.4   0.876  0.826  0.781  0.743
Feint                90.6  89.1  82.5  78.1   85.4  82.9  73.8  75.5   0.879  0.859  0.779  0.768
Attack               82.3  79.6  71.2  70.2   90.6  89.7  81.4  73.3   0.862  0.843  0.759  0.717
Penetration          90.3  90.0  83.4  80.8   89.9  89.6  84.9  83.1   0.901  0.897  0.841  0.819
Retreat              97.5  98.3  93.4  95.3   94.7  93.9  91.5  91.1   0.961  0.960  0.924  0.931
Electronic jamming   99.5  96.4  96.6  94.4   95.5  96.0  89.7  90.6   0.975  0.962  0.931  0.925

I, II, III, and IV respectively represent the BiGRU-Attention, LSTM, SAE, and DBP aerial target tactical intention recognition models.

Figure 13: (a) Changes in accuracy and (b) changes in loss over 200 training epochs for the four models (Prediction, LSTM, DBP, and SAE).


References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.
[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.
[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battle field based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.
[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.
[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, School of National University of Defense Technology, Changsha, China, 2006.
[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.
[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.
[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.
[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.
[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 8, pp. 9–13, 2015.
[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.
[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.
[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.
[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.
[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.
[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.
[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 8, pp. 1847–1853, 2016.
[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.
[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.
[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.
[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.
[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.
[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 4, pp. 50–52, 2001.
[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 3, pp. 42–45, 2012.
[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.
[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on Adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.
[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.
[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.
[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.
[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.
[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.
[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.
[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.
[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.
[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.
[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.
[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.
[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.
[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.
[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.
[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.
[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.


[9] Y Song X H Zhang and Z K Wang ldquoTarget intentioninference model based on variable structure bayesian net-workrdquo in Proceedings of the CiSE 2009 pp 333ndash340 WuhanChina December 2009

[10] Y J Liu G H Kou and J H Song ldquoTarget recognition basedon RBF neural networkrdquo Fire Control and Command Controlvol 40 no 08 pp 9ndash13 2015

[11] X Y Zhai F B Yang and L N Ji ldquoAir combat targets threatassessment based on standardized fully connected networkand residual networkrdquo Fire Control and Command Controlvol 45 no 6 pp 39ndash44 2020

[12] W W Zhou P Y Yao and J Y Zhang ldquoCombat intentionrecognition foraerial targets based on deep neural networkrdquoActa Aeronautica et Astronautica Sinca vol 39 no 11 ArticleID 322468 2018

[13] W Ou S J Liu and X Y He ldquoStudy on intelligent recog-nition model of enemy targetrsquos tactical intention on battle-fieldrdquo Computer Simulation vol 34 no 9 pp 10ndash14+192017

[14] S Bhattacharya P K R Maddikunta R Kaluri et al ldquoA novelPCA-firefly based XGBoost classification model for intrusiondetection in networks using GPUrdquo Electronics vol 9 no 2p 219 2020

[15] W Ou S J Liu and X Y He ldquoTactical intention recognitionalgorithm based on encoded temporal featuresrdquo CommandControl amp Simulation vol 38 no 6 pp 36ndash41 2016

[16] G Y Lu and Y Y Ding ldquoStudy on intention recognition tofoe of underwater platformrdquo Command Control amp Simula-tion vol 34 no 6 pp 100ndash102 2012

[17] H Chen Q L Ren and Y Hua ldquoFuzzy neural network basedtactical intention recognition for sea targetsrdquo Systems Engi-neering and Electronics vol 38 no 08 pp 1847ndash1853 2016

[18] F Teng S Liu and Y F Song ldquoBiLSTM-attention an tacticalintention recognition modelrdquo Aero Weaponry vol 50 2020

[19] J Xue J Zhu J Xiao S Tong and L Huang ldquoPanoramicconvolutional long short-term memory networks for combatintension recognition of aerial targetsrdquo IEEE Access vol 8pp 183312ndash183323 2020

[20] H Guo H J Xu and L Liu ldquoTarget threat assessment of aircombat based on support vector machines for regressionrdquoJournal of Beijing University of Aeronautics and Astronauticsvol 36 no 1 pp 123ndash126 2010

[21] I Kojadinovic and J-L Marichal ldquoEntropy of bi-capacitiesrdquoEuropean Journal of Operational Research vol 178 no 1pp 168ndash184 2007

[22] Z F Xi A Xu and Y X Kou ldquoTarget threat assessment in aircombat based on PCA-MPSO-ELM algo-rithmrdquo Acta Aero-nautica et Astronautica Sinica vol 41 no 9 p 323895 2020

[23] K Q Zhu and Y F Dong ldquoStudy on the design of air combatmaneuver libraryrdquo Aeronautical Computing Technologyno 04 pp 50ndash52 2001

[24] S Y Zhou W H Wu and X Li ldquoAnalysis of air combatmaneuver decision set modelrdquo Aircraft Design vol 32 no 03pp 42ndash45 2012

[25] G YatingWWu L Qiongbin C Fenghuang and C QinqinldquoFault diagnosis for power converters based on optimizedtemporal convolutional networkrdquo IEEE Transactions on In-strumentation and Measurement vol 70 pp 1ndash10 2021

[26] L Xie D L Ding and Z L Wei ldquoReal time prediction ofmaneuver trajectory based on adaboost-PSO-LSTM net-workrdquo Systems Engineering and Electronics vol 43 no 6pp 1651ndash1658 2021

[27] J S Li W Liang and X M Liu ldquoe multi-attribute eval-uation of menace of targets in midcourse of ballistic missilebased on maximal windage methodrdquo Systems Engineeringeory amp Practice no 5 pp 164ndash167 2007

[28] X Wang R N Yang and J L Zuo ldquoTrajectory prediction oftarget aircraft based on HPSO-TPFENN neural networkrdquoJournal of Northwestern Polytechnical University vol 37no 3 pp 612ndash620 2019

[29] X Wei L Zhang H-Q Yang L Zhang and Y-P YaoldquoMachine learning for pore-water pressure time-series pre-diction application of recurrent neural networksrdquo GeoscienceFrontiers vol 12 no 1 pp 453ndash467 2021

[30] Z Xu W Zeng X Chu and P Cao ldquoMulti-aircraft trajectorycollaborative prediction based on social long short-termmemory networkrdquo Aerospace vol 8 no 4 p 115 2021

[31] Y C Sun R L Tian and X F Wang ldquoEmitter signal rec-ognition based on improved CLDNNrdquo Systems Engineeringand Electronics vol 43 no 1 pp 42ndash47 2021

[32] J X Chen D M Jiang and Y N Zhang ldquoA hierarchicalbidirectional GRU model with attention for EEG-basedemotion classificationrdquo IEEE Access vol 7 pp 118530ndash118540 2019

[33] Y Song S Gao Y Li L Jia Q Li and F Pang ldquoDistributedattention-based temporal convolutional network forremaining useful life predictionrdquo IEEE Internet of ingsJournal vol 8 no 12 pp 9594ndash9602 2021

12 Computational Intelligence and Neuroscience

[34] P Anderson X He C Buehler et al ldquoBottom-up and top-down attention for image captioning and visual questionansweringrdquo 2017 httpsarxivorgabs170707998

[35] W Wang Y X Sun and Q J Qi ldquoText sentiment classifi-cation model based on BiGRU-attention neural networkrdquoApplication Research of Computers vol 36 no 12pp 3558ndash3564 2019

[36] Z Yang D Yang C Dyer X He A J Smola and E H HovyldquoHierarchical attention networks for document classifica-tionrdquo in Proceedings of the HLT-NAACL pp 1480ndash1489 SanDiego CA USA June 2016

[37] D Kingma and J Ba ldquoAdam a method for stochastic opti-mizationrdquo Computer Science vol 1 2014

[38] Y Zhang Y Li and W Xian ldquoA recurrent neural networkbased method for predicting the state of aircraft air condi-tioning systemrdquo in Proceedings of the 2017 IEEE SymposiumSeries on Computational Intelligence (SSCI) December 2017

[39] W Zeng Z Quan Z Zhao C Xie and X Lu ldquoA deeplearning approach for aircraft trajectory prediction in ter-minal airspacerdquo IEEE Access vol 8 pp 151250ndash151266 2020

[40] J Cui Y Cheng and X Cui ldquoState change trend prediction ofaircraft pump source system based on GRU networkrdquo inProceedings of the 2020 39th Chinese Control Conference(CCC) Shenyang China July 2020

[41] R Adelia S Suyanto and U N Wisesty ldquoIndonesian ab-stractive text summarization using bidirectional gated re-current unitrdquo Procedia Computer Science vol 157pp 581ndash588 2019

[42] P C Fourie and A A Groenwold ldquoe particle swarmoptimization algorithm in size and shape optimizationrdquoStructural and Multidisciplinary Optimization vol 23 no 4pp 259ndash267 2002

Computational Intelligence and Neuroscience 13

the introduction of bidirectional propagation seems to have effectively improved model convergence and accelerated learning. The curves of the BiGRU-Attention model were significantly better than those of the other three. The accuracy and loss curves of the BiGRU and GRU-Attention models were similar after convergence, and both were better than those of the GRU model, which shows that the basic GRU model was significantly improved by introducing the attention mechanism and the bidirectional propagation mechanism.
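As an illustrative sketch only (not the authors' code), the four ablation variants compared in Table 5 and Figure 12 could be assembled from two switches, one for bidirectional propagation and one for attention. The GRU width (128) and the simple additive attention used here are assumptions; the 12 × 16 input shape and the 7 intention classes follow the paper's setup.

```python
# Sketch (assumptions): building the four ablation variants from two flags.
# Layer width and the attention form are illustrative, not values from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_variant(bidirectional: bool, attention: bool,
                  timesteps: int = 12, features: int = 16, n_classes: int = 7):
    inputs = layers.Input(shape=(timesteps, features))
    gru = layers.GRU(128, return_sequences=True)  # width is an assumption
    x = layers.Bidirectional(gru)(inputs) if bidirectional else gru(inputs)
    if attention:
        # Score each time step, normalize over time, and take the weighted sum.
        scores = layers.Dense(1, activation="tanh")(x)
        weights = layers.Softmax(axis=1)(scores)
        x = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])
    else:
        x = layers.Lambda(lambda t: t[:, -1, :])(x)  # use the last hidden state
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

ablation_variants = {
    "BiGRU-Attention": build_variant(True, True),
    "BiGRU": build_variant(True, False),
    "GRU-Attention": build_variant(False, True),
    "GRU": build_variant(False, False),
}
```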

4.4. Experimental Analysis of Intention Prediction. This experiment combined the future characteristic states predicted by the characteristic prediction module and the historical characteristic states into 12 frames of temporal characteristics; that is, the first 11 frames were historical characteristics and the 12th frame was the predicted future characteristic. Sample data were then constructed as described in Section 3.1.
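A minimal sketch of this sample construction is given below, assuming NumPy arrays and a stand-in `predict_next_frame` callable in place of the BiGRU characteristic prediction module; how the 12-frame window is re-assembled in the two-step variant is likewise an assumption.

```python
# Sketch (assumptions): building a 12-frame recognition sample from the 11 most
# recent observed frames plus one predicted frame; frames are 16-dimensional.
import numpy as np

def build_one_step_sample(history, predict_next_frame):
    """history: (11, 16) array of observed air combat characteristic frames."""
    next_frame = predict_next_frame(history)                # shape (16,)
    return np.vstack([history, next_frame[None, :]])        # shape (12, 16)

def build_two_step_sample(history, predict_next_frame):
    """Two-sampling-points-ahead variant: the first predicted frame is fed back
    to the predictor, which is where the error accumulation discussed later in
    this section arises."""
    step1 = predict_next_frame(history)
    step2 = predict_next_frame(np.vstack([history[1:], step1[None, :]]))
    return np.vstack([history[1:], step1[None, :], step2[None, :]])  # (12, 16)
```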

Figure 9: (a) The effect of different time steps on the results (prediction error, ×10^-5, for time steps from 3 to 10). (b) The effect of different numbers of nodes on the results (prediction error, ×10^-5, for 4 to 20 nodes with one and two layers).

Table 1: Experimental results of the five models.
Method    Error (×10^-5)    Prediction time (ms)
RNN       8.14              0.089
LSTM      2.75              0.133
GRU       2.46              0.110
BiLSTM    1.29              0.311
BiGRU     1.16              0.202

Figure 10: Characteristic prediction trajectory (distance characteristic over time for the true value and the RNN, LSTM, GRU, BiLSTM, and BiGRU predictions).

Table 2: Experimental parameters.
Parameter        Value
Loss function    Categorical cross-entropy
Optimizer        Adam
Dropout          0.5
Hidden layers    3
Hidden nodes     334, 10, 338
Batch size       100
Learning rate    0.0014
Epochs           200

Figure 11: Confusion matrix of intention recognition across the seven intention classes (surveillance, reconnaissance, feint, attack, penetration, retreat, and electronic interference).


To verify that the proposed intention prediction method can effectively recognize the enemy's intention in advance, it was compared with the intention recognition method in Section 5, which has no prediction effect, and the model evaluation indicators of precision, recall, and F1-score were used to assess the model, with the results shown in Table 6 and Figure 13.

From Table 6 and Figure 13, the proposed intention prediction method had high prediction accuracy for the retreat and electronic jamming intentions but relatively low accuracy for surveillance and attack intention recognition. After analysis, the air combat characteristics of the first two intentions were more distinctive, whereas the characteristics of the latter two intentions were more similar to those of the reconnaissance and feint intentions, which caused mutual prediction errors and resulted in relatively low intention prediction accuracy. Overall, the accuracy of the proposed intention prediction method reached 89.7%, a significant improvement over LSTM, DBP, and SAE, and the prediction was produced one sampling interval (0.5 s) earlier.

In addition, an attempt to predict the enemy's intention two sampling points in advance did not yield satisfactory results; the accuracy reached only 70%. Compared with single-step prediction, two-step prediction by the characteristic prediction module suffers too much error accumulation and its goodness of fit is low, which leads to low intention prediction accuracy. However, with the continued development and improvement of multistep prediction methods, it is believed that the proposed aerial target combat intention prediction method can have better application prospects.
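Per-intention precision, recall, and F1 values of the kind reported in Table 6 can be computed directly from the true and predicted labels. The following is a minimal sketch using scikit-learn; the label arrays are random placeholders standing in for the test labels and the argmax of the recognition model's output.

```python
# Sketch (assumptions): per-class precision, recall, and F1 as used for Table 6.
import numpy as np
from sklearn.metrics import classification_report

INTENTIONS = ["surveillance", "reconnaissance", "feint", "attack",
              "penetration", "retreat", "electronic jamming"]

rng = np.random.default_rng(0)
y_true = rng.integers(0, 7, size=500)   # stand-in for test-set labels
y_pred = rng.integers(0, 7, size=500)   # stand-in for model predictions

print(classification_report(y_true, y_pred,
                            labels=list(range(7)),
                            target_names=INTENTIONS,
                            digits=3))
```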

Table 3: Comparison of model parameter settings.
Model    Number of hidden layers    Hidden nodes          Learning rate    Optimizer
SAE      3                          256, 128, 128         0.02             SGD
LSTM     3                          256, 128, 128         0.001            Adam
DBP      4                          256, 512, 512, 256    0.01             Adam

Table 4: Comparison of different intention recognition models.
Model              Accuracy (%)    Loss
BiGRU-Attention    90.5            0.257
LSTM               87.6            0.346
SAE                81.3            0.473
DBP                79.3            0.492

Table 5: Results of the ablation experiment.
Model composition structure                    Accuracy (%)    Loss
Bidirectional    GRU    Attention
√                √      √                      90.5            0.257
√                √                             88.6            0.305
                 √      √                      88.9            0.289
                 √                             87.4            0.337

Figure 12: (a) Changes in accuracy of the ablation experiment. (b) Changes in loss of the ablation experiment (accuracy (%) and loss versus epoch, 0 to 200, for the BiGRU-Attention, GRU-Attention, BiGRU, and GRU models).



5. Conclusions

For the problem of aerial target combat intention recognition, we adopted a hierarchical strategy to select 16-dimensional air combat characteristics from three perspectives: the enemy combat mission, the threat level between the two sides, and tactical maneuvers. The sample vectors were constructed by preprocessing the intention characteristic set data of the aerial target and encapsulating domain expert knowledge and experience into labels. We improved the LSTM-based air target intention recognition method, proposed a GRU-based aerial target operational intention recognition model, and introduced a bidirectional propagation mechanism and an attention mechanism to significantly improve accuracy compared with the LSTM, SAE, and DBP intention recognition models. To further shorten the time of air target intention recognition, we proposed the BiGRU-based air combat characteristic prediction method, and experimental results showed that it can effectively perform single-step characteristic prediction. Combining the BiGRU-Attention intention recognition module with the BiGRU characteristic prediction module, we were able to predict enemy aerial target operational intention one sampling point in advance with 89.7% accuracy. How to more accurately distinguish confusable intentions will be our next research direction.
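Read end to end, the combination described above amounts to a two-stage inference flow: the characteristic prediction module extends the observed window by one frame, and the recognition module classifies the extended 12-frame window. A hedged sketch of that flow follows; `predictor` and `classifier` are placeholders for the trained prediction and recognition modules (assumed here to expose Keras-style `predict` methods).

```python
# Sketch (assumptions): combined one-step-ahead intention prediction pipeline.
import numpy as np

INTENTIONS = ["surveillance", "reconnaissance", "feint", "attack",
              "penetration", "retreat", "electronic jamming"]

def predict_intention_one_step_ahead(history, predictor, classifier):
    """history: (11, 16) array of the most recent observed characteristic frames."""
    next_frame = predictor.predict(history[None, ...], verbose=0)[0]   # (16,)
    window = np.vstack([history, next_frame[None, :]])                 # (12, 16)
    probs = classifier.predict(window[None, ...], verbose=0)[0]        # (7,)
    return INTENTIONS[int(np.argmax(probs))]
```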

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this study.

Acknowledgments

This research was funded by the National Natural Science Foundation of China (Grant nos. 61703426 and 72001214), the Young Talent Fund of the University Association for Science and Technology in Shaanxi, China (Grant no. 2019038), and the Innovation Capability Support Plan of Shaanxi, China (Grant no. 2020KJXX-065).

Table 6: Intention prediction performance measurement.
                      Precision (%)              Recall (%)                 F1-score
Intent type           I     II    III   IV       I     II    III   IV      I      II     III    IV
Surveillance          83.0  79.2  70.5  68.7     90.0  85.6  77.1  75.4    0.859  0.823  0.737  0.719
Reconnaissance        89.9  86.8  81.0  76.3     85.3  78.8  75.4  72.4    0.876  0.826  0.781  0.743
Feint                 90.6  89.1  82.5  78.1     85.4  82.9  73.8  75.5    0.879  0.859  0.779  0.768
Attack                82.3  79.6  71.2  70.2     90.6  89.7  81.4  73.3    0.862  0.843  0.759  0.717
Penetration           90.3  90.0  83.4  80.8     89.9  89.6  84.9  83.1    0.901  0.897  0.841  0.819
Retreat               97.5  98.3  93.4  95.3     94.7  93.9  91.5  91.1    0.961  0.960  0.924  0.931
Electronic jamming    99.5  96.4  96.6  94.4     95.5  96.0  89.7  90.6    0.975  0.962  0.931  0.925
I, II, III, and IV respectively denote the BiGRU-Attention, LSTM, SAE, and DBP aerial target tactical intention recognition models.

Figure 13: (a) Changes in accuracy of the four models. (b) Changes in loss of the four models (accuracy (%) and loss versus epoch, 0 to 200, for the proposed prediction model, LSTM, DBP, and SAE).



References

[1] Z. Liu, Q. Wu, S. Chen, and M. Chen, "Prediction of unmanned aerial vehicle target intention under incomplete information," SCIENTIA SINICA Informationis, vol. 50, no. 5, pp. 704–717, 2020.

[2] T. Zhou, M. Chen, Y. Wang, J. He, and C. Yang, "Information entropy-based intention prediction of aerial targets under uncertain and incomplete information," Entropy, vol. 22, no. 3, p. 279, 2020.

[3] Y. L. Sun and L. Bao, "Study on recognition technique of targets' tactical intentions in sea battlefield based on D-S evidence theory," Ship Electronic Engineering, vol. 32, no. 5, pp. 48–51, 2012.

[4] F. J. Zhao, Z. J. Zhou, and C. H. Hu, "Aerial target intention recognition approach based on belief-rule-base and evidential reasoning," Electronics Optics and Control, vol. 24, no. 8, pp. 15–19+50, 2017.

[5] X. Xia, The Study of Target Intent Assessment Method Based on the Template-Matching, School of National University of Defense Technology, Changsha, China, 2006.

[6] X. T. Li, The Research and Implementation of Situation Assessment in the Target Intention Recognition, North University of China, Taiyuan, China, 2012.

[7] X. Yin, M. Zhang, and M. Q. Chen, "Combat intention recognition of the target in the air based on discriminant analysis," Journal of Projectiles, Rockets, Missiles and Guidance, vol. 38, no. 3, pp. 46–50, 2018.

[8] Q. Jin, X. Gou, and W. Jin, "Intention recognition of aerial targets based on Bayesian optimization algorithm," in Proceedings of the 2017 2nd IEEE International Conference on Intelligent Transportation Engineering (ICITE), IEEE, Singapore, September 2017.

[9] Y. Song, X. H. Zhang, and Z. K. Wang, "Target intention inference model based on variable structure Bayesian network," in Proceedings of the CiSE 2009, pp. 333–340, Wuhan, China, December 2009.

[10] Y. J. Liu, G. H. Kou, and J. H. Song, "Target recognition based on RBF neural network," Fire Control and Command Control, vol. 40, no. 8, pp. 9–13, 2015.

[11] X. Y. Zhai, F. B. Yang, and L. N. Ji, "Air combat targets threat assessment based on standardized fully connected network and residual network," Fire Control and Command Control, vol. 45, no. 6, pp. 39–44, 2020.

[12] W. W. Zhou, P. Y. Yao, and J. Y. Zhang, "Combat intention recognition for aerial targets based on deep neural network," Acta Aeronautica et Astronautica Sinica, vol. 39, no. 11, Article ID 322468, 2018.

[13] W. Ou, S. J. Liu, and X. Y. He, "Study on intelligent recognition model of enemy target's tactical intention on battlefield," Computer Simulation, vol. 34, no. 9, pp. 10–14+19, 2017.

[14] S. Bhattacharya, P. K. R. Maddikunta, R. Kaluri et al., "A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU," Electronics, vol. 9, no. 2, p. 219, 2020.

[15] W. Ou, S. J. Liu, and X. Y. He, "Tactical intention recognition algorithm based on encoded temporal features," Command Control & Simulation, vol. 38, no. 6, pp. 36–41, 2016.

[16] G. Y. Lu and Y. Y. Ding, "Study on intention recognition to foe of underwater platform," Command Control & Simulation, vol. 34, no. 6, pp. 100–102, 2012.

[17] H. Chen, Q. L. Ren, and Y. Hua, "Fuzzy neural network based tactical intention recognition for sea targets," Systems Engineering and Electronics, vol. 38, no. 8, pp. 1847–1853, 2016.

[18] F. Teng, S. Liu, and Y. F. Song, "BiLSTM-attention: a tactical intention recognition model," Aero Weaponry, vol. 50, 2020.

[19] J. Xue, J. Zhu, J. Xiao, S. Tong, and L. Huang, "Panoramic convolutional long short-term memory networks for combat intension recognition of aerial targets," IEEE Access, vol. 8, pp. 183312–183323, 2020.

[20] H. Guo, H. J. Xu, and L. Liu, "Target threat assessment of air combat based on support vector machines for regression," Journal of Beijing University of Aeronautics and Astronautics, vol. 36, no. 1, pp. 123–126, 2010.

[21] I. Kojadinovic and J.-L. Marichal, "Entropy of bi-capacities," European Journal of Operational Research, vol. 178, no. 1, pp. 168–184, 2007.

[22] Z. F. Xi, A. Xu, and Y. X. Kou, "Target threat assessment in air combat based on PCA-MPSO-ELM algorithm," Acta Aeronautica et Astronautica Sinica, vol. 41, no. 9, p. 323895, 2020.

[23] K. Q. Zhu and Y. F. Dong, "Study on the design of air combat maneuver library," Aeronautical Computing Technology, no. 4, pp. 50–52, 2001.

[24] S. Y. Zhou, W. H. Wu, and X. Li, "Analysis of air combat maneuver decision set model," Aircraft Design, vol. 32, no. 3, pp. 42–45, 2012.

[25] G. Yating, W. Wu, L. Qiongbin, C. Fenghuang, and C. Qinqin, "Fault diagnosis for power converters based on optimized temporal convolutional network," IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 1–10, 2021.

[26] L. Xie, D. L. Ding, and Z. L. Wei, "Real time prediction of maneuver trajectory based on Adaboost-PSO-LSTM network," Systems Engineering and Electronics, vol. 43, no. 6, pp. 1651–1658, 2021.

[27] J. S. Li, W. Liang, and X. M. Liu, "The multi-attribute evaluation of menace of targets in midcourse of ballistic missile based on maximal windage method," Systems Engineering Theory & Practice, no. 5, pp. 164–167, 2007.

[28] X. Wang, R. N. Yang, and J. L. Zuo, "Trajectory prediction of target aircraft based on HPSO-TPFENN neural network," Journal of Northwestern Polytechnical University, vol. 37, no. 3, pp. 612–620, 2019.

[29] X. Wei, L. Zhang, H.-Q. Yang, L. Zhang, and Y.-P. Yao, "Machine learning for pore-water pressure time-series prediction: application of recurrent neural networks," Geoscience Frontiers, vol. 12, no. 1, pp. 453–467, 2021.

[30] Z. Xu, W. Zeng, X. Chu, and P. Cao, "Multi-aircraft trajectory collaborative prediction based on social long short-term memory network," Aerospace, vol. 8, no. 4, p. 115, 2021.

[31] Y. C. Sun, R. L. Tian, and X. F. Wang, "Emitter signal recognition based on improved CLDNN," Systems Engineering and Electronics, vol. 43, no. 1, pp. 42–47, 2021.

[32] J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, vol. 7, pp. 118530–118540, 2019.

[33] Y. Song, S. Gao, Y. Li, L. Jia, Q. Li, and F. Pang, "Distributed attention-based temporal convolutional network for remaining useful life prediction," IEEE Internet of Things Journal, vol. 8, no. 12, pp. 9594–9602, 2021.

[34] P. Anderson, X. He, C. Buehler et al., "Bottom-up and top-down attention for image captioning and visual question answering," 2017, https://arxiv.org/abs/1707.07998.

[35] W. Wang, Y. X. Sun, and Q. J. Qi, "Text sentiment classification model based on BiGRU-attention neural network," Application Research of Computers, vol. 36, no. 12, pp. 3558–3564, 2019.

[36] Z. Yang, D. Yang, C. Dyer, X. He, A. J. Smola, and E. H. Hovy, "Hierarchical attention networks for document classification," in Proceedings of the HLT-NAACL, pp. 1480–1489, San Diego, CA, USA, June 2016.

[37] D. Kingma and J. Ba, "Adam: a method for stochastic optimization," Computer Science, vol. 1, 2014.

[38] Y. Zhang, Y. Li, and W. Xian, "A recurrent neural network based method for predicting the state of aircraft air conditioning system," in Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), December 2017.

[39] W. Zeng, Z. Quan, Z. Zhao, C. Xie, and X. Lu, "A deep learning approach for aircraft trajectory prediction in terminal airspace," IEEE Access, vol. 8, pp. 151250–151266, 2020.

[40] J. Cui, Y. Cheng, and X. Cui, "State change trend prediction of aircraft pump source system based on GRU network," in Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, July 2020.

[41] R. Adelia, S. Suyanto, and U. N. Wisesty, "Indonesian abstractive text summarization using bidirectional gated recurrent unit," Procedia Computer Science, vol. 157, pp. 581–588, 2019.

[42] P. C. Fourie and A. A. Groenwold, "The particle swarm optimization algorithm in size and shape optimization," Structural and Multidisciplinary Optimization, vol. 23, no. 4, pp. 259–267, 2002.
