Hierarchical Embedding Model (HEM)

Overview


This is an implementation of the Hierarchical Embedding Model (HEM) for personalized product search.

HEM is a deep neural network model that jointly learns latent representations for queries, products, and users. It is designed as a generative model, and the embedding representations for queries, users, and items are learned by optimizing the log likelihood of observed user-query-item purchase records.

The probability of an item being purchased by a user issuing a query (which also serves as the item's rank score) can be computed from their corresponding latent representations.
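As a rough illustration of how such a score could be computed, the sketch below combines the query and user embeddings into a personalized search vector and takes its inner product with each item embedding, turning the scores into purchase probabilities with a softmax. The function names, the mixture weight `lam`, and the exact way the query and user vectors are combined are illustrative assumptions, not the definitive implementation; consult the paper and README.txt for the actual model details.

```python
import numpy as np

def hem_rank_score(user_emb, query_emb, item_emb, lam=0.5):
    """Hypothetical rank score: inner product between the item embedding
    and a convex combination of the query and user embeddings.
    `lam` (the query/user trade-off) is an assumed hyperparameter."""
    m_uq = lam * query_emb + (1.0 - lam) * user_emb
    return float(np.dot(m_uq, item_emb))

def hem_purchase_prob(user_emb, query_emb, item_embs, lam=0.5):
    """Turn rank scores over a candidate item set into a probability
    distribution with a numerically stable softmax."""
    scores = np.array([hem_rank_score(user_emb, query_emb, e, lam)
                       for e in item_embs])
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()
```

For example, ranking a candidate set then amounts to sorting items by `hem_rank_score`, while `hem_purchase_prob` gives the normalized purchase likelihoods used in the generative objective.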

Details on HEM and its use can be found in the paper listed below. Please cite this paper if you plan to use it in your project.


Additional references relating to this methodology are listed in the README.txt file in this release.

Email Qingyao Ai for questions or comments concerning this software or methodology.


Requirements



Procedures

See the README.txt file for details.


Download


The README.txt file is included in both archives, but it is also available here individually so that one may obtain an overview of the dataset's characteristics and content.

Uncompress the zip archive using unzip, or 7-Zip on Windows machines. Both of these utilities can also be installed on Unix machines.
     unzip HEM.zip
     7z x HEM.zip

On Unix machines, untar the gzipped tar archive using tar.
     tar xvzf HEM.tar.gz


File Name               Compressed Size   Uncompressed Size
README.txt              ---               9K
HEM zip archive         29M               32M
HEM gzip tar archive    29M               32M


Acknowledgements


This work was supported in part by the Center for Intelligent Information Retrieval and in part by the National Science Foundation grant #IIS-1160894. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the sponsor.