This is a demo of the Relation Extraction and Named Entity Recognition model developed in FashionBrain work package 6.
Given a full-text query such as "Tom is obsessed with Kimono jackets", our approach extracts the type of relation expressed in that sentence, "Wants/Likes" in this case, together with the entities that make up the relation. The underlying model uses a Hierarchical Reinforcement Learning approach: an outer policy predicts the relation type sequentially for each token, while an inner policy does the same for the entities. The embedding network for full-text queries is a bidirectional Long Short-Term Memory (LSTM) unit operating on GloVe embeddings. For each relation type, a separate linear network is learned that predicts the entities.
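The architecture described above could be sketched in PyTorch roughly as follows. This is a minimal illustration, not the FashionBrain implementation: class names, dimensions, and the simple argmax-based relation selection are assumptions, and the real model additionally involves the reinforcement-learning policies and pretrained GloVe weights.

```python
# Hypothetical sketch: a bidirectional LSTM encoder over token embeddings,
# a relation head over the encoder states, and one separate linear
# entity-prediction head per relation type. All names and sizes are
# illustrative, not taken from the actual code base.
import torch
import torch.nn as nn

class RelationEntityModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 num_relations=5, num_entity_tags=9):
        super().__init__()
        # In the real model these weights would be initialised from GloVe.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Outer step: per-token relation-type scores.
        self.relation_head = nn.Linear(2 * hidden_dim, num_relations)
        # Inner step: a separate linear head per relation type,
        # predicting an entity tag for each token.
        self.entity_heads = nn.ModuleList(
            nn.Linear(2 * hidden_dim, num_entity_tags)
            for _ in range(num_relations)
        )

    def forward(self, token_ids):
        states, _ = self.encoder(self.embedding(token_ids))
        relation_logits = self.relation_head(states)           # (B, T, R)
        relation = relation_logits.mean(dim=1).argmax(dim=-1)  # (B,)
        # Route each sentence through the head of its predicted relation.
        entity_logits = torch.stack(
            [self.entity_heads[r](states[i]) for i, r in enumerate(relation)]
        )                                                      # (B, T, E)
        return relation, entity_logits

model = RelationEntityModel(vocab_size=1000)
# A batch of one six-token sentence, e.g. "Tom is obsessed with Kimono jackets".
tokens = torch.randint(0, 1000, (1, 6))
relation, entity_logits = model(tokens)
```

The key design point mirrored here is that the entity predictor is conditioned on the predicted relation type: each relation owns its own linear head over the shared BiLSTM states.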
The entire architecture is trained end-to-end with supervised learning, minimizing a similarity cost function over the crowd-sourced fashion corpus created by the University of Sheffield. The model itself is implemented in PyTorch and Python.