Juan






Institute of Data Science and Artificial Intelligence (DATAI), University of Navarra (UNAV)

University Campus, Pamplona 31009 Navarra Spain



A Few-Shot Approach for Relation Extraction Domain Adaptation using Large Language Models


Conference


V. Zavarella, J.C. Gamero-Salinas, S. Consoli
CEUR-WS, KiL 2024 - Proceedings of the 4th International Workshop on Knowledge-Infused Learning: Towards Consistent, Reliable, Explainable, and Safe LLMs, co-located with 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024, vol. 3894, 2024

URL: https://ceur-ws.org/Vol-3894/dl4kg_paper3

Cite

APA
Zavarella, V., Gamero-Salinas, J. C., & Consoli, S. (2024). A Few-Shot Approach for Relation Extraction Domain Adaptation using Large Language Models. In CEUR-WS (Ed.), KiL 2024 - Proceedings of the 4th International Workshop on Knowledge-Infused Learning: Towards Consistent, Reliable, Explainable, and Safe LLMs, co-located with 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024 (Vol. 3894). https://ceur-ws.org/Vol-3894/dl4kg_paper3


Chicago/Turabian
Zavarella, V., J.C. Gamero-Salinas, and S. Consoli. “A Few-Shot Approach for Relation Extraction Domain Adaptation Using Large Language Models.” In KiL 2024 - Proceedings of the 4th International Workshop on Knowledge-Infused Learning: Towards Consistent, Reliable, Explainable, and Safe LLMs, Co-Located with 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024, edited by CEUR-WS. Vol. 3894, 2024.


MLA
Zavarella, V., et al. “A Few-Shot Approach for Relation Extraction Domain Adaptation Using Large Language Models.” KiL 2024 - Proceedings of the 4th International Workshop on Knowledge-Infused Learning: Towards Consistent, Reliable, Explainable, and Safe LLMs, Co-Located with 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024, edited by CEUR-WS, vol. 3894, 2024, https://ceur-ws.org/Vol-3894/dl4kg_paper3.


BibTeX

@conference{zavarella2024a,
  title = {A Few-Shot Approach for Relation Extraction Domain Adaptation using Large Language Models},
  year = {2024},
  volume = {3894},
  url = {https://ceur-ws.org/Vol-3894/dl4kg_paper3},
  author = {Zavarella, V. and Gamero-Salinas, J.C. and Consoli, S.},
  editor = {CEUR-WS},
  booktitle = {KiL 2024 - Proceedings of the 4th International Workshop on Knowledge-Infused Learning: Towards Consistent, Reliable, Explainable, and Safe LLMs, co-located with 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024}
}

Abstract
Knowledge graphs (KGs) have been successfully applied to the analysis of complex scientific and technological domains, with automatic KG generation methods typically building upon relation extraction models capturing fine-grained relations between domain entities in text. While these relations are fully applicable across scientific areas, existing models are trained on few domain-specific datasets such as SciERC and do not perform well on new target domains. In this paper, we experiment with leveraging in-context learning capabilities of Large Language Models to perform schema-constrained data annotation, collecting in-domain training instances for a Transformer-based relation extraction model deployed on titles and abstracts of research papers in the Architecture, Construction, Engineering and Operations (AECO) domain. By assessing the performance gain with respect to a baseline Deep Learning architecture trained on off-domain data, we show that by using a few-shot learning strategy with structured prompts and only minimal expert annotation the presented approach can potentially support domain adaptation of a science KG generation model. 
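The core idea of the abstract — constraining an LLM's in-context annotations to a fixed relation schema via structured few-shot prompts — can be sketched as follows. This is an illustrative sketch only, not the paper's actual prompts or code: the relation types are SciERC-style labels mentioned in the abstract, while the example sentence, prompt wording, and function names are hypothetical.

```python
# Illustrative sketch of schema-constrained few-shot annotation prompting.
# All names, examples, and prompt wording are assumptions, not from the paper.

RELATION_SCHEMA = ["USED-FOR", "PART-OF", "FEATURE-OF", "HYPONYM-OF",
                   "EVALUATE-FOR", "COMPARE", "CONJUNCTION"]  # SciERC-style types

# A handful of expert-annotated in-domain demonstrations (here: a made-up
# AECO-flavored example) serve as the few-shot context.
FEW_SHOT_EXAMPLES = [
    {
        "text": "We apply BIM models to automate clash detection in construction.",
        "relations": [("BIM models", "USED-FOR", "clash detection")],
    },
]

def build_prompt(target_text: str) -> str:
    """Assemble a structured prompt that restricts the LLM to the fixed
    relation schema when annotating a new title or abstract."""
    lines = [
        "Extract entity relations from the text.",
        "Use ONLY these relation types: " + ", ".join(RELATION_SCHEMA) + ".",
        "Answer as lines of the form: head | RELATION | tail.",
        "",
    ]
    for ex in FEW_SHOT_EXAMPLES:
        lines.append("Text: " + ex["text"])
        for head, rel, tail in ex["relations"]:
            lines.append(f"{head} | {rel} | {tail}")
        lines.append("")
    lines.append("Text: " + target_text)
    return "\n".join(lines)

def parse_response(response: str) -> list[tuple[str, str, str]]:
    """Keep only well-formed triples whose relation type is in the schema,
    so malformed LLM output never enters the training set."""
    triples = []
    for line in response.splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) == 3 and parts[1] in RELATION_SCHEMA:
            triples.append((parts[0], parts[1], parts[2]))
    return triples
```

The schema check in `parse_response` is what makes the annotation schema-constrained in practice: triples produced with off-schema relation labels are discarded rather than passed on to the downstream Transformer-based relation extraction model.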

