Quaternion Gated Recurrent Unit Neural Network

Main Authors: Uche Onyekpe, Stratis Kanarachos, Stavros Christopoulos
Format: publication (preprint)
Published: 2019
Online Access: https://zenodo.org/record/3355557
Abstract:
  • Recurrent neural networks (RNNs) are distinguishable from other classes of artificial neural networks by their ability to make nodal connections along temporal sequences. The Gated Recurrent Unit (GRU) proposed by Cho et al. has found use in several time-dependent applications, such as natural language processing (NLP), financial analysis and sensor fusion, due to its resistance to the vanishing gradient problem. GRUs are also more computationally efficient than their variant, the Long Short-Term Memory (LSTM) neural network, owing to their less complex structure, and are therefore better suited to applications requiring efficient management of computational resources. Many such applications require a stronger mapping of their features to further enhance prediction accuracy. A novel Quaternion Gated Recurrent Unit (QGRU) is proposed, which leverages the internal and external dependencies within the quaternion algebra to map correlations within and across multidimensional features. Unlike the GRU, which captures only the dependencies within the sequence, the QGRU can efficiently capture both the inter- and intra-dependencies within multidimensional features. Furthermore, the performance of the algorithm is evaluated on a sensor fusion problem involving navigation with INS sensors in GPS-deprived environments.
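
The core idea described above, replacing the GRU's real-valued matrix multiplications with quaternion (Hamilton) products so that the four components of each feature interact, can be sketched as follows. This is a minimal illustrative NumPy implementation, not the authors' code: the function names (`hamilton`, `qgru_step`), the split component-wise activations, and the random weight shapes are all assumptions made for clarity.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def hamilton(W, q):
    # Hamilton product W (*) q: W is a tuple of four real matrices and
    # q an array of shape (4, m) holding the r, i, j, k components.
    # The cross terms are what couple the components of each feature.
    Wr, Wi, Wj, Wk = W
    qr, qi, qj, qk = q
    return np.stack([
        Wr @ qr - Wi @ qi - Wj @ qj - Wk @ qk,   # real part
        Wr @ qi + Wi @ qr + Wj @ qk - Wk @ qj,   # i part
        Wr @ qj - Wi @ qk + Wj @ qr + Wk @ qi,   # j part
        Wr @ qk + Wi @ qj - Wj @ qi + Wk @ qr,   # k part
    ])

def qgru_step(x, h, Wz, Uz, Wr_, Ur, Wh, Uh):
    # One QGRU time step, following the standard GRU equations of
    # Cho et al. but with Hamilton products in place of matmuls and
    # activations applied component-wise (a common "split" simplification).
    z = sigmoid(hamilton(Wz, x) + hamilton(Uz, h))        # update gate
    r = sigmoid(hamilton(Wr_, x) + hamilton(Ur, h))       # reset gate
    h_tilde = np.tanh(hamilton(Wh, x) + hamilton(Uh, r * h))
    return (1.0 - z) * h + z * h_tilde                    # new hidden state

# Tiny usage example with random weights (sizes are arbitrary).
rng = np.random.default_rng(0)
n, m = 3, 2  # hidden and input size per quaternion component
W = lambda: tuple(0.1 * rng.standard_normal((n, m)) for _ in range(4))
U = lambda: tuple(0.1 * rng.standard_normal((n, n)) for _ in range(4))
x = rng.standard_normal((4, m))   # one quaternion-valued input vector
h = np.zeros((4, n))              # initial quaternion-valued hidden state
h = qgru_step(x, h, W(), U(), W(), U(), W(), U())
```

Because every output component of `hamilton` mixes all four input components, a single weight tuple models both the intra-feature (within a quaternion) and inter-feature (across the hidden vector) correlations that the abstract refers to, with a quarter of the free parameters of an equivalent real-valued layer.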