A Session-Based Customer Preference Learning Method by Using the Gated Recurrent Units with Attention Function

  • Jenhui Chen*
  • Ashu Abdul
  • *Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

15 Scopus citations

Abstract

In this paper, we investigate an attention function combined with gated recurrent units (GRUs), named GRUA, to improve the accuracy of customer preference prediction. The attention function extracts important product features using a time-bias parameter and a term frequency-inverse document frequency (TF-IDF) parameter to recommend products to a customer in the ongoing session. We show that the attention function with GRUs learns the customer's intention in the ongoing session more precisely than existing session-based recommendation (SBR) methods. The experimental results show that GRUA outperforms two SBR methods, the stacked denoising autoencoders with collaborative filtering (SDAE/CF) and the GRUs with collaborative filtering (GRU/CF), on the precision and recall evaluation metrics. Data from three publicly available datasets, the Amazon Product Review dataset, the Xing dataset, and the YooChoose Click dataset, are used to compare the performance of GRUA against SDAE/CF and GRU/CF. This paper shows that adopting an attention function into GRUs can dramatically increase the accuracy of product recommendation in SBR.
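The abstract describes attention weights over GRU hidden states, scored with a time-bias and a TF-IDF term, but gives no equations here. The following is a minimal Python sketch under stated assumptions: a scalar (hidden-size-1) GRU cell in the Cho et al. convention, an exponential recency term standing in for the time-bias parameter, and pre-computed TF-IDF weights per clicked item. The names `gru_step` and `attend`, the toy weights, and the decay rate `lam` are all hypothetical illustrations, not the paper's actual GRUA formulation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, W):
    """One step of a scalar (hidden-size-1) GRU cell, Cho et al. convention."""
    z = sigmoid(W["zx"] * x + W["zh"] * h)          # update gate
    r = sigmoid(W["rx"] * x + W["rh"] * h)          # reset gate
    n = math.tanh(W["nx"] * x + W["nh"] * (r * h))  # candidate state
    return (1.0 - z) * h + z * n                    # interpolate old/new state

def attend(hidden, tfidf, times, lam=0.5):
    """Attention over hidden states; score = tf-idf weight * exp(-lam * age)."""
    T = max(times)                                   # current (latest) click time
    scores = [w * math.exp(-lam * (T - t)) for w, t in zip(tfidf, times)]
    total = sum(math.exp(s) for s in scores)
    alphas = [math.exp(s) / total for s in scores]   # softmax normalization
    session_repr = sum(a * h for a, h in zip(alphas, hidden))
    return alphas, session_repr

# Toy session: three clicked items (1-D embeddings), tf-idf weights, timestamps.
W = {"zx": 0.8, "zh": -0.3, "rx": 0.5, "rh": 0.2, "nx": 1.1, "nh": 0.4}
items = [0.9, -0.4, 0.7]
tfidf = [0.2, 0.1, 0.9]
times = [1.0, 2.0, 3.0]

h, hidden = 0.0, []
for x in items:
    h = gru_step(x, h, W)
    hidden.append(h)

alphas, session_repr = attend(hidden, tfidf, times)
```

In this sketch, the most recent click with the highest TF-IDF weight receives the largest attention weight, so the session representation leans toward the customer's current intent rather than a plain average of the hidden states.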

Original language: English
Article number: 8628957
Pages (from-to): 17750-17759
Number of pages: 10
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Bibliographical note

Publisher Copyright:
© 2019 IEEE.

Keywords

  • Attention function
  • collaborative filtering
  • gated recurrent units
  • prediction
  • recommendation
