Open Access System for Information Sharing

Article
Cited 1 time in Web of Science. Cited 1 time in Scopus.
Full metadata record
Files in This Item:
There are no files associated with this item.
DC Field: Value
dc.contributor.author: Sehwan Rho
dc.contributor.author: Seong min Park
dc.contributor.author: Juhyun Pyo
dc.contributor.author: Meungsuk Lee
dc.contributor.author: Maolin Jin
dc.contributor.author: Yu, Son-Cheol
dc.date.accessioned: 2023-02-24T05:41:21Z
dc.date.available: 2023-02-24T05:41:21Z
dc.date.created: 2023-02-17
dc.date.issued: 2023-04
dc.identifier.issn: 1530-437X
dc.identifier.uri: https://oasis.postech.ac.kr/handle/2014.oak/115614
dc.description.abstract: This article proposes a method of point cloud generation for indoor low-visibility disaster environments. Recently, robots have been developed to perform several missions in such environments, which are potentially harmful to humans. However, an indoor disaster environment is often filled with dense fog, which makes robot navigation challenging because widely used sensors [e.g., optical cameras and light detection and ranging (LiDAR)] cannot be used due to low visibility. Several methods have been used to address this problem. In this article, we propose a sensor-fusion method that can generate point clouds of uneven, foggy indoor environments using LiDAR and stereo thermal infrared cameras. We generate point clouds using stereo depth estimation and process them to have the same angular resolution as a LiDAR point cloud. We then approximate them based on thermal edge information and finally integrate them with the LiDAR point cloud, from which fog points have been removed. Furthermore, we performed an indoor experiment, and the results showed that the proposed method generates point clouds to which conventional LiDAR odometry and mapping algorithms can be applied.
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.isPartOf: IEEE Sensors Journal
dc.title: LiDAR-Stereo thermal sensor fusion for indoor disaster environment
dc.type: Article
dc.identifier.doi: 10.1109/JSEN.2023.3245619
dc.type.rims: ART
dc.identifier.bibliographicCitation: IEEE Sensors Journal, v.23, no.7, pp.7816 - 7827
dc.identifier.wosid: 001011420400142
dc.citation.endPage: 7827
dc.citation.number: 7
dc.citation.startPage: 7816
dc.citation.title: IEEE Sensors Journal
dc.citation.volume: 23
dc.contributor.affiliatedAuthor: Sehwan Rho
dc.contributor.affiliatedAuthor: Seong min Park
dc.contributor.affiliatedAuthor: Yu, Son-Cheol
dc.identifier.scopusid: 2-s2.0-85149391019
dc.description.journalClass: 1
dc.description.isOpenAccess: N
dc.type.docType: Article
dc.subject.keywordAuthor: Indoor low-visibility disaster environment
dc.subject.keywordAuthor: light detection and ranging (LiDAR)
dc.subject.keywordAuthor: point cloud generation
dc.subject.keywordAuthor: sensor fusion
dc.subject.keywordAuthor: stereo thermal infrared cameras
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Instruments & Instrumentation
dc.relation.journalWebOfScienceCategory: Physics, Applied
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Instruments & Instrumentation
dc.relation.journalResearchArea: Physics
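
The fusion pipeline described in the abstract above can be illustrated with a minimal Python/NumPy sketch. This code is not from the paper: it assumes the stereo-thermal and LiDAR point clouds are already expressed in a common sensor frame, uses placeholder angular resolutions (0.2 degrees azimuth, 2.0 degrees elevation) for the LiDAR-like grid, substitutes a simple near-range cutoff for the paper's fog-point removal, and omits the thermal-edge-based approximation step entirely.

    import numpy as np

    def to_spherical(points):
        """Convert an Nx3 XYZ array to (range, azimuth, elevation)."""
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        r = np.sqrt(x**2 + y**2 + z**2)
        az = np.arctan2(y, x)
        el = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))
        return r, az, el

    def resample_to_lidar_grid(stereo_points, az_res_deg=0.2, el_res_deg=2.0):
        """Bin stereo-depth points onto a LiDAR-like angular grid,
        keeping the nearest return per (azimuth, elevation) cell.
        The resolutions are placeholders, not the paper's values."""
        r, az, el = to_spherical(stereo_points)
        az_bin = np.round(np.degrees(az) / az_res_deg).astype(int)
        el_bin = np.round(np.degrees(el) / el_res_deg).astype(int)
        keys = az_bin * 100000 + el_bin        # combined cell index
        order = np.argsort(r)                  # nearest returns first
        _, first = np.unique(keys[order], return_index=True)
        return stereo_points[order[first]]

    def remove_fog_points(lidar_points, min_range=1.0):
        """Drop LiDAR returns closer than min_range; a crude stand-in
        for fog-return rejection, not the paper's criterion."""
        r, _, _ = to_spherical(lidar_points)
        return lidar_points[r > min_range]

    def fuse(lidar_points, stereo_points):
        """Merge the fog-filtered LiDAR cloud with the resampled
        stereo-thermal cloud."""
        lidar_clean = remove_fog_points(lidar_points)
        stereo_grid = resample_to_lidar_grid(stereo_points)
        return np.vstack([lidar_clean, stereo_grid])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        lidar = rng.uniform(-5, 5, size=(2000, 3))   # synthetic stand-in data
        stereo = rng.uniform(-5, 5, size=(5000, 3))
        print(fuse(lidar, stereo).shape)

In the paper, the stereo cloud comes from depth estimation on thermal image pairs and the fused cloud is then fed to conventional LiDAR odometry and mapping; the sketch only shows the resampling-and-merging structure under the stated assumptions.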

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

YU, SON-CHEOL (유선철)
Division of Advanced Nuclear Engineering
