Show simple item record

dc.contributor.author: Prauzek, Michal
dc.contributor.author: Konečný, Jaromír
dc.contributor.author: Paterová, Tereza
dc.date.accessioned: 2024-04-25T06:54:52Z
dc.date.available: 2024-04-25T06:54:52Z
dc.date.issued: 2023
dc.identifier.citation: IEEE Internet of Things Journal. 2023, vol. 10, issue 21, p. 18919-18929.
dc.identifier.issn: 2327-4662
dc.identifier.uri: http://hdl.handle.net/10084/152575
dc.description.abstract: The study presents a self-learning controller for managing the energy in an Internet of Things (IoT) device powered by energy harvested from a thermoelectric generator (TEG). The device’s controller is based on a double Q-learning (DQL) method; the hardware incorporates a TEG energy harvesting subsystem with a dc/dc converter, a load module with a microcontroller, and a LoRaWAN communications interface. The model is controlled according to adaptive measurement and transmission periods. The controller’s reward policy evaluates the level of charge available to the device. The controller applies and evaluates various learning parameters and reduces the learning rate over time. Using four years of historical soil temperature data in an experimental simulation of several controller configurations, the DQL controller demonstrated correct operation, a low learning rate, and high cumulative rewards. The best energy management controller operated with a completed cycle and missed cycle ratio of 98.5%. The novelty of the presented approach is discussed in relation to state-of-the-art methods in adaptive ability, learning processes, and practical applications of the device.
dc.language.iso: en
dc.publisher: IEEE
dc.relation.ispartofseries: IEEE Internet of Things Journal
dc.relation.uri: https://doi.org/10.1109/JIOT.2023.3283599
dc.rights: © 2023 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License.
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: energy harvesting
dc.subject: energy management
dc.subject: Internet of Things (IoT)
dc.subject: reinforcement learning
dc.subject: thermoelectric generator (TEG)
dc.title: An analysis of double Q-learning-based energy management strategies for TEG-powered IoT devices
dc.type: article
dc.identifier.doi: 10.1109/JIOT.2023.3283599
dc.rights.access: openAccess
dc.type.version: publishedVersion
dc.type.status: Peer-reviewed
dc.description.source: Web of Science
dc.description.volume: 10
dc.description.issue: 21
dc.description.lastpage: 18929
dc.description.firstpage: 18919
dc.identifier.wos: 001098109800046
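The double Q-learning scheme summarized in the abstract can be sketched minimally as below. This is an illustrative sketch only, not the authors' implementation: the charge-level states, period actions, reward shape, and parameter values are all assumptions made for the example.

```python
import random

random.seed(0)  # deterministic toy run

# Hypothetical discretisation (assumption, not from the paper):
# states = energy-buffer charge levels, actions = duty-cycle period settings.
STATES = ["low", "medium", "high"]
ACTIONS = ["short_period", "long_period"]

def make_table():
    return {s: {a: 0.0 for a in ACTIONS} for s in STATES}

def double_q_update(qa, qb, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One double Q-learning step: pick a table at random, select the
    greedy action with it, but evaluate that action with the OTHER table.
    This decoupling reduces the maximisation bias of plain Q-learning."""
    if random.random() < 0.5:
        best = max(qa[s_next], key=qa[s_next].get)
        qa[s][a] += alpha * (r + gamma * qb[s_next][best] - qa[s][a])
    else:
        best = max(qb[s_next], key=qb[s_next].get)
        qb[s][a] += alpha * (r + gamma * qa[s_next][best] - qb[s][a])

qa, qb = make_table(), make_table()

# Toy environment: a long period conserves harvested energy (charge stays
# high, reward +1); a short period drains the buffer (charge low, reward -1).
for _ in range(1000):
    s = random.choice(STATES)
    a = random.choice(ACTIONS)
    s_next = "high" if a == "long_period" else "low"
    r = 1.0 if s_next == "high" else -1.0
    double_q_update(qa, qb, s, a, r, s_next)

policy = {s: max(qa[s], key=qa[s].get) for s in STATES}
```

Under these toy dynamics the learned policy favours the energy-conserving action in every state; the paper's controller additionally decays the learning rate over time and derives its reward from the measured charge level.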


Files in this item

This item appears in the following Collection(s)


Except where otherwise noted, this item's license is described as © 2023 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License.