Show simple item record

dc.contributor.author: Sun, Yujia
dc.contributor.author: Platoš, Jan
dc.date.accessioned: 2024-10-21T09:10:00Z
dc.date.available: 2024-10-21T09:10:00Z
dc.date.issued: 2024
dc.identifier.citation: Expert Systems with Applications. 2024, vol. 248, art. no. 123356.
dc.identifier.issn: 0957-4174
dc.identifier.issn: 1873-6793
dc.identifier.uri: http://hdl.handle.net/10084/155183
dc.description.abstract: Text summarization research is significant and challenging in the domain of natural language processing. Abstractive text summarization mainly uses the encoder-decoder framework, wherein the encoder component does not have a sufficient semantic comprehension of the input text, and there are exposure biases and semantic inconsistencies between the reference and generated summaries during the training process. We propose an improved encoder-decoder model that incorporates a hierarchical attention mechanism and multiobjective reinforcement learning. The encoder introduces a multihead self-attention mechanism to allow for the acquisition of more comprehensive semantic information from multiple angles and dimensions, while the decoder introduces a pointer-generator network to solve the out-of-vocabulary problem. Multiobjective reinforcement learning methods are constructed throughout the training process to optimize the model in terms of addressing exposure bias, maintaining semantic consistency, and enhancing readability. The results of the comparative experiments demonstrate that the proposed model significantly improved in terms of the ROUGE evaluation metric, and the generated summaries were semantically similar to the reference summaries.
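The multihead self-attention mechanism mentioned in the abstract can be sketched as follows. This is an illustrative NumPy implementation of standard scaled dot-product attention split across heads, not the authors' code; the function and parameter names (`multihead_self_attention`, `Wq`, `Wk`, `Wv`, `Wo`) are assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multihead_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Standard multihead scaled dot-product self-attention (illustrative).

    X: (seq_len, d_model) input embeddings.
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices.
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads

    # Linear projections to queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    # Split each projection into heads: (num_heads, seq_len, d_head).
    def split(M):
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)

    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    heads = attn @ Vh  # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo
```

Each head attends over the full sequence in its own subspace, which is what lets the encoder gather semantic information "from multiple angles and dimensions" as the abstract puts it.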
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartofseries: Expert Systems with Applications
dc.relation.uri: https://doi.org/10.1016/j.eswa.2024.123356
dc.rights: © 2024 Elsevier Ltd. All rights reserved.
dc.subject: deep neural network
dc.subject: abstractive text summarization
dc.subject: multihead self-attention mechanism
dc.subject: reinforcement learning
dc.subject: semantic consistency
dc.title: Abstractive text summarization model combining a hierarchical attention mechanism and multiobjective reinforcement learning
dc.type: article
dc.identifier.doi: 10.1016/j.eswa.2024.123356
dc.type.status: Peer-reviewed
dc.description.source: Web of Science
dc.description.volume: 248
dc.description.firstpage: art. no. 123356
dc.identifier.wos: 001186959800001


Files in this item


There are no files associated with this item.

This item appears in the following collection(s)
