WEKO3

Item: SIG Technical Reports > System Architecture (ARC) > 2022 > 2022-ARC-250

Implementation of Edge-Cloud Cooperative CNN Inference on an IoT Platform and Its Performance Analysis

https://ipsj.ixsq.nii.ac.jp/records/220323
01fd8631-0858-4a0a-86c1-3899f8c68ba6
File IPSJ-ARC22250010.pdf (1.2 MB)
License Copyright (c) 2022 by the Information Processing Society of Japan
Open Access
Item type SIG Technical Reports (1)
Publication date 2022-10-04
Title Implementation of Edge-Cloud Cooperative CNN Inference on an IoT Platform and Its Performance Analysis
Language eng
Keywords
Subject scheme Other
Subject IoT / Edge AI
Resource type
Resource type identifier http://purl.org/coar/resource_type/c_18gh
Resource type technical report
Author affiliation (all authors) Kyushu University
Authors Yuan, Wang
Hidetomo, Shibamura
Koji, Inoue
Abstract
Description type Other
Description Three mainstream computing paradigms can be considered to meet the demands of real-time, low-energy AI execution in the IoT era: edge computing, cloud computing, and edge-cloud cooperative computing. Given the severe energy restrictions on edge devices, offloading part of the workload to the cloud is a promising solution, but it tends to add overhead to the overall execution latency. This paper focuses on evaluating the performance overhead (computing latency) of edge-cloud cooperative CNN inference, toward high-performance edge-cloud AI execution on an IoT platform. We design and implement an edge-cloud cooperative CNN inference framework targeting TensorFlow Lite. Through verification, we confirm the feasibility and accuracy of the provided implementation. For platform optimization, we identify the current bottlenecks of the target platform and its lack of support for edge-cloud CNN inference. This paper also offers suggestions for enhancing the targeted IoT platform.
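The cooperative scheme described in the abstract splits a CNN at a partition point: the edge device computes the early layers and ships the intermediate activation to the cloud, which computes the remainder. A minimal sketch of this idea, using plain NumPy with toy dense layers standing in for conv blocks (not the authors' TensorFlow Lite framework; all shapes and the serialization step are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "CNN": a stack of dense + ReLU layers standing in for conv blocks.
layers = [rng.standard_normal((64, 32)),
          rng.standard_normal((32, 16)),
          rng.standard_normal((16, 10))]

def run_layers(x, ws):
    for w in ws:
        x = np.maximum(x @ w, 0.0)  # linear transform + ReLU
    return x

def cooperative_inference(x, split):
    """Run layers[:split] on the 'edge', ship the intermediate activation,
    run layers[split:] on the 'cloud'. Returns (output, bytes transferred)."""
    edge_out = run_layers(x, layers[:split])
    payload = edge_out.tobytes()  # what would cross the network
    cloud_in = np.frombuffer(payload, dtype=edge_out.dtype).reshape(edge_out.shape)
    return run_layers(cloud_in, layers[split:]), len(payload)

x = rng.standard_normal((1, 64))
full = run_layers(x, layers)  # edge-only baseline for comparison
for split in range(len(layers) + 1):
    out, nbytes = cooperative_inference(x, split)
    assert np.allclose(out, full)  # cooperation must not change the result
    print(f"split after layer {split}: {nbytes} bytes transferred")
```

Choosing the split point trades edge-side computation against the size of the intermediate tensor sent over the network, which is the source of the latency overhead the paper evaluates on a real IoT platform.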
Bibliographic record ID
Source identifier type NCID
Source identifier AN10096105
Bibliographic information IPSJ SIG Technical Reports: System Architecture (ARC)

Vol. 2022-ARC-250, No. 10, pp. 1-7, issued 2022-10-04
ISSN
Source identifier type ISSN
Source identifier 2188-8574
Notice
SIG Technical Reports are non-refereed and may therefore later appear in journals, conference proceedings, symposia, etc.
Publisher
Language ja
Publisher 情報処理学会 (Information Processing Society of Japan)
Powered by WEKO3

