NLP

[Dialog Response Selection] An Effective Domain Adaptive Post-Training Method for BERT in Response Selection

코딩무민 2022. 5. 24. 17:16

1. Key Summary

  • Multi-turn response selection in a retrieval-based dialog system
  • Post-train BERT on a domain-specific corpus before fine-tuning (see the sketch after this list)
  • Achieves state-of-the-art results on two response selection benchmarks
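To make the "post-training on a domain-specific corpus" step concrete, below is a minimal sketch of continued masked-language-model training of BERT on in-domain dialog text with Hugging Face Transformers. The corpus lines, model name, file paths, and hyperparameters are placeholders rather than the paper's actual setup, and the paper's post-training also pairs MLM with an NSP-style objective, which this sketch omits.

```python
# Minimal sketch: domain-adaptive post-training of BERT via continued MLM.
# Everything marked "placeholder" is an assumption for illustration only.

from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Placeholder in-domain utterances; replace with the real dialog corpus.
corpus = [
    "how do i install the latest nvidia driver on ubuntu ?",
    "try adding the graphics-drivers ppa and then run apt upgrade .",
]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = Dataset.from_dict({"text": corpus}).map(
    tokenize, batched=True, remove_columns=["text"]
)

# Dynamic 15% masking, as in standard BERT pre-training.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-post-trained",
    num_train_epochs=1,             # the paper post-trains for many more steps
    per_device_train_batch_size=2,  # placeholder batch size
    learning_rate=3e-5,             # assumed value, not from the paper
)

Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=collator,
).train()

model.save_pretrained("bert-post-trained")
tokenizer.save_pretrained("bert-post-trained")
```

The resulting checkpoint is then fine-tuned on the target response selection benchmark in place of vanilla BERT.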

2. Paper Link

https://arxiv.org/abs/1908.04812

 


3. Paper Explanation Link

https://coding-moomin.notion.site/An-Effective-Domain-Adaptive-Post-Training-Method-for-BERT-in-Response-Selection-b8a03286980c418fa06c9d55d33e3b53

 


 
