| Title |
-
en
Principal Component Analysis for Gaussian Process Posteriors
|
| Creator |
-
-
en
Ishibashi, Hideaki
ja
石橋, 英朗
ja-Kana
イシバシ, ヒデアキ
-
e-Rad 30838389
-
|
| Rights |
-
This is the author’s final version; the article has been accepted for publication in Neural Computation.
|
| Subject |
-
Other
Gaussian process
-
Other
Information geometry
-
Other
Multi-task learning
-
Other
Meta-learning
-
Other
Functional data analysis
|
| Description |
-
Abstract
en
This letter proposes an extension of principal component analysis (PCA) for Gaussian process (GP) posteriors, denoted by GP-PCA. Since GP-PCA estimates a low-dimensional space of GP posteriors, it can be used for meta-learning, a framework for improving the performance of target tasks by estimating a structure of a set of tasks. The issue is how to define a structure of a set of GPs with an infinite-dimensional parameter, such as a coordinate system and a divergence. In this study, we reduce the infinite-dimensionality of GPs to the finite-dimensional case under the information-geometric framework by considering a space of GP posteriors that share the same prior. In addition, we propose an approximation method for GP-PCA based on variational inference and demonstrate the effectiveness of GP-PCA for meta-learning through experiments.
|
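As context for the abstract above, here is a minimal, hypothetical sketch of the general idea of PCA over a set of GP posteriors: fit a GP posterior per task and apply ordinary PCA to the posterior mean functions evaluated on a shared grid. This is only a crude NumPy illustration; the paper's actual GP-PCA uses an information-geometric formulation with a shared prior and a variational-inference approximation, neither of which is reproduced here, and all names below (`gp_posterior_mean`, the simulated shifted-sine tasks) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)  # shared evaluation grid for all tasks

def gp_posterior_mean(x_train, y_train, x_test, length=0.2, noise=1e-2):
    """Posterior mean of a zero-mean GP with an RBF kernel (standard GP regression)."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    return k(x_test, x_train) @ np.linalg.solve(K, y_train)

# Simulate a set of related tasks: phase-shifted sine functions with noise.
means = []
for shift in np.linspace(0.0, np.pi, 8):
    x = rng.uniform(0.0, 1.0, 20)
    y = np.sin(2 * np.pi * x + shift) + 0.1 * rng.standard_normal(20)
    means.append(gp_posterior_mean(x, y, grid))
M = np.stack(means)  # shape (tasks, grid points)

# Ordinary PCA on the centered posterior mean functions via SVD.
centered = M - M.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)  # variance ratio per principal component
print(explained[:3])
```

Because phase-shifted sines span a two-dimensional function space (sine and cosine components), the first two principal components should capture almost all of the variance, illustrating how a family of GP posteriors can admit a low-dimensional structure.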
| Publisher |
Massachusetts Institute of Technology Press
|
| Date |
|
| Language |
|
| Resource Type |
journal article |
| Publication Type |
AM |
| Identifier |
HDL
http://hdl.handle.net/10228/0002000098
,
URI
https://kyutech.repo.nii.ac.jp/records/2000098
|
| Relation |
-
isVersionOf
DOI
https://doi.org/10.1162/neco_a_01489
|
| Journal Information |
-
-
PISSN
0899-7667
-
EISSN
1530-888X
-
en
Neural Computation
-
Volume 34
Issue 5
Start page 1189
End page 1219
|
| File |
|
| Content Updated |
2025-07-14 |