LMOps/adaptllm at main · microsoft/LMOps
Adapting Large Language Models via Reading Comprehension
This repo contains the model, code, and data for our paper Adapting Large Language Models via Reading Comprehension.

We explore continued pre-training on domain-specific corpora for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to transform large-scale pre-training corpora into reading comprehension texts, consistently improving prompting performance across tasks in the biomedicine, finance, and law domains. Our 7B model competes with much larger domain-specific models such as BloombergGPT-50B. Moreover, our domain-specific reading comprehension texts enhance model performance even on general benchmarks, indicating their potential for developing a general LLM across more domains.
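The core idea, converting a raw domain text into a reading-comprehension text by appending tasks derived from its content, can be sketched in a toy way. The helper below (`to_reading_comprehension` is a hypothetical name, and the two task templates are illustrative only, not the actual patterns used in this repo):

```python
import re

def to_reading_comprehension(passage: str) -> str:
    """Toy sketch: append simple comprehension tasks to a raw passage.
    The real pipeline in this repo mines a richer set of task types
    from the text itself; these templates are placeholders."""
    tasks = []
    # Topic-style task: prompt the model to state what the passage is about.
    tasks.append("Question: What is the passage above mainly about?\nAnswer:")
    # Cloze-style task: blank out a word from the first sentence, if long enough.
    sentences = re.split(r"(?<=[.!?])\s+", passage.strip())
    words = sentences[0].split()
    if len(words) > 5:
        blanked = words.copy()
        blanked[3] = "____"
        tasks.append(
            "Fill in the blank: " + " ".join(blanked) + "\nAnswer: " + words[3]
        )
    # Reading-comprehension text = raw passage followed by its tasks.
    return passage.strip() + "\n\n" + "\n\n".join(tasks)

doc = ("Aspirin irreversibly inhibits the enzyme cyclooxygenase, "
       "reducing prostaglandin synthesis. It is widely used to treat pain.")
print(to_reading_comprehension(doc))
```

The transformed text keeps the original passage intact and places the constructed tasks after it, so continued pre-training on such data still exposes the model to the raw domain knowledge while also exercising its question-answering format.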