Evaluating the effectiveness of large language models in meeting summarization with transcript segmentation techniques

How well does gpt-3.5-turbo perform on meeting summarization with topic and context-length window segmentation?

Bachelor Thesis (2023)
Author(s)

K.A. Sándor (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Morita Tarvirdians – Mentor (TU Delft - Interactive Intelligence)

C.M. Jonker – Mentor (TU Delft - Interactive Intelligence)

M. Molenaar – Graduation committee member (TU Delft - Computer Graphics and Visualisation)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2023 Kristóf Sándor
Publication Year
2023
Language
English
Graduation Date
03-07-2023
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Large Language Models (LLMs) have brought significant performance increases on many Natural Language Processing tasks, but they have not yet been tested for meeting summarization. This research paper examines the effectiveness of the gpt-3.5-turbo model in the meeting summarization domain. Due to input length limitations, the model cannot be applied to this task directly, so the paper investigates two segmentation methods: a simple context-length window approach and topic segmentation using Latent Dirichlet Allocation (LDA). The context-length window approach performs close to the Pointer Generator framework, while topic segmentation gives worse results. Overall, gpt-3.5-turbo performs worse with both approaches than state-of-the-art models that use a transformer architecture adapted for long documents.
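The context-length window approach splits the transcript into consecutive chunks that fit within the model's input limit, summarizes each chunk, and merges the partial summaries. A minimal sketch of this idea follows, assuming the pre-1.0 openai Python client and tiktoken for token counting; the chunk size, prompt wording, and merge step are illustrative assumptions, not the thesis's exact configuration.

import openai
import tiktoken

MODEL = "gpt-3.5-turbo"
CHUNK_TOKENS = 3000  # leave headroom in the ~4k-token context window

def split_into_windows(transcript: str, chunk_tokens: int = CHUNK_TOKENS) -> list[str]:
    # Split the transcript into consecutive windows of at most chunk_tokens tokens.
    enc = tiktoken.encoding_for_model(MODEL)
    tokens = enc.encode(transcript)
    return [enc.decode(tokens[i:i + chunk_tokens])
            for i in range(0, len(tokens), chunk_tokens)]

def summarize(text: str) -> str:
    # Ask gpt-3.5-turbo to summarize one segment.
    response = openai.ChatCompletion.create(
        model=MODEL,
        messages=[{"role": "user",
                   "content": f"Summarize this meeting segment:\n\n{text}"}],
    )
    return response["choices"][0]["message"]["content"]

def summarize_meeting(transcript: str) -> str:
    # Summarize each window, then merge the partial summaries in a final pass.
    partial = [summarize(window) for window in split_into_windows(transcript)]
    return summarize("\n".join(partial))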

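Topic segmentation instead groups utterances by subject before summarization. A sketch using gensim's LDA implementation is shown below; the tokenization, number of topics, and per-utterance topic-assignment strategy are assumptions for illustration rather than the thesis's exact setup. Each resulting segment would then be summarized as in the windowing sketch above.

from gensim.corpora import Dictionary
from gensim.models import LdaModel

def segment_by_topic(utterances: list[str], num_topics: int = 5) -> list[str]:
    # Group consecutive utterances that share a dominant LDA topic.
    tokenized = [u.lower().split() for u in utterances]
    dictionary = Dictionary(tokenized)
    corpus = [dictionary.doc2bow(doc) for doc in tokenized]
    lda = LdaModel(corpus, num_topics=num_topics, id2word=dictionary)

    def dominant_topic(bow, fallback=0):
        # Most probable topic for one utterance; fall back if LDA returns nothing.
        topics = lda.get_document_topics(bow)
        return max(topics, key=lambda t: t[1])[0] if topics else fallback

    segments, current, current_topic = [], [], None
    for utterance, bow in zip(utterances, corpus):
        topic = dominant_topic(bow)
        if current and topic != current_topic:
            segments.append(" ".join(current))  # topic changed: close the segment
            current = []
        current.append(utterance)
        current_topic = topic
    if current:
        segments.append(" ".join(current))
    return segments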