Description
In recent years, the resurgence of deep learning has greatly advanced information retrieval (IR), giving rise to the active research area of neural information retrieval (NeuIR) and, in particular, the paradigm of pre-training methods (PTMs). Owing to sophisticated pre-training objectives and large model sizes, pre-trained models can learn universal language representations from massive textual data that benefit the ranking tasks of IR. Given the rapid progress in this direction, this survey provides a systematic review of PTMs in IR. The authors present an overview of PTMs applied in different components of an IR system, including the retrieval component and the re-ranking component. They also introduce PTMs specifically designed for IR and summarize available datasets as well as benchmark leaderboards. Finally, they discuss open challenges and highlight several promising directions, with the hope of inspiring and facilitating further research on these topics.
Book Information
ISBN 9781638280620
Author Yixing Fan
Format Paperback
Page Count 156
Imprint now publishers Inc
Publisher now publishers Inc
Weight 230g