Enhanced Pedestrian Detection Model Transfer-Trained on YOLOv8 Using DenseFused RGB and FIR Images
http://hdl.handle.net/10061/0002000780
| Name / File | License | Action |
|---|---|---|
| (embargoed full text) | | Download is available from 2026/7/24. |
| Item type | Conference Paper |
|---|---|
| Release date | 2025-02-18 |
| Title | Enhanced Pedestrian Detection Model Transfer-Trained on YOLOv8 Using DenseFused RGB and FIR Images |
| Language | eng |
| Keywords (scheme: Other) | pedestrian tracking; smart city; image processing; RGB image; FIR image |
| Resource type | conference paper |
| Access rights | embargoed access |
| Authors | Yoshihara, Arata; 新井, イスマイル; 遠藤, 新; 垣内, 正年; 藤川, 和利 |
| Abstract | There are broad benefits to developing pedestrian spaces in terms of environment, culture, and economy. Given this context, accurately measuring pedestrian traffic is considered a critical indicator of sidewalk usage. Currently, the mainstream method for detecting pedestrians utilizes RGB camera footage installed along sidewalks. However, especially during nighttime and adverse weather conditions, insufficient lighting hampers detection accuracy. In contrast, Far-Infrared (FIR) imaging does not require a light source, as it measures radiated heat. This study proposes a pedestrian detection and tracking model that integrates the strengths of both RGB and FIR cameras through image fusion processing. Specifically, pedestrians are detected from fused images using a pedestrian detector and then tracked using a tracking system to measure pedestrian counts. Additionally, a version of the pedestrian detector trained on the fused images through transfer learning is developed, and its detection results are compared with those from a non-transfer-learning model. The experimental results demonstrate that using fused images for detection and tracking is effective in specific data scenarios, confirming the utility of the image fusion model under varied conditions. |
| Bibliographic information | 2024 IEEE International Conference on Smart Computing (SMARTCOMP), pp. 246-248, published 2024-07-24 |
| Conference name | 2024 IEEE International Conference on Smart Computing (SMARTCOMP) |
| Conference dates | 2024-06-29 to 2024-07-02 |
| Conference location | Osaka, Japan |
| Publisher | IEEE |
| EISSN | 2693-8340 |
| Publisher version DOI (isVersionOf) | https://doi.org/10.1109/SMARTCOMP61445.2024.00057 |
| Publisher version URI (isVersionOf) | https://ieeexplore.ieee.org/abstract/document/10595647 |
| Rights | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. Per the publisher's permission conditions, the full text will be made public on or after July 24, 2026. |
| Publication type | AM (accepted manuscript) |
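The abstract describes detecting pedestrians in fused RGB/FIR frames. As a rough, hypothetical illustration of pixel-level fusion only (the paper uses the learned DenseFuse network for fusion and YOLOv8 for detection; neither is reproduced here), a naive weighted blend of an RGB frame with a single-channel FIR frame in NumPy might look like:

```python
import numpy as np

def naive_fuse(rgb: np.ndarray, fir: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend an (H, W, 3) uint8 RGB frame with an (H, W) uint8 FIR frame.

    alpha weights the RGB source. This is only a stand-in for the learned
    DenseFuse encoder-decoder used in the paper.
    """
    # Broadcast the single FIR channel to three channels so shapes match.
    fir3 = np.repeat(fir[:, :, None], 3, axis=2)
    fused = alpha * rgb.astype(np.float32) + (1.0 - alpha) * fir3.astype(np.float32)
    return np.clip(fused, 0, 255).astype(np.uint8)

# Toy frames: a dark RGB scene (all zeros) and a warm FIR signature (all 200).
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
fir = np.full((4, 4), 200, dtype=np.uint8)
fused = naive_fuse(rgb, fir)
# Every fused pixel is 0.5 * 0 + 0.5 * 200 = 100.
```

The fused frame would then be passed to a detector and tracker as the paper describes; at night the FIR term keeps pedestrians visible even when the RGB term is near zero.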