Building Adaptable Generalist Robots.
Record type: Language material, manuscript : Monograph/item
Title/Author: Building Adaptable Generalist Robots.
Author: Xu, Mengdi.
Description: 1 online resource (249 pages)
Notes: Source: Dissertations Abstracts International, Volume: 85-11, Section: B.
Contained by: Dissertations Abstracts International 85-11B.
Subject: Computer science.
Electronic resource: click for full text (PQDT)
ISBN: 9798382611907
LDR    03148ntm a22003857 4500
001    1151276
005    20241104055844.5
006    m o d
007    cr bn ---uuuuu
008    250605s2024 xx obm 000 0 eng d
020    $a 9798382611907
035    $a (MiAaPQ)AAI31294401
035    $a AAI31294401
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Xu, Mengdi. $3 1478010
245 10 $a Building Adaptable Generalist Robots.
264  0 $c 2024
300    $a 1 online resource (249 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertations Abstracts International, Volume: 85-11, Section: B.
500    $a Advisor: Zhao, Ding.
502    $a Thesis (Ph.D.)--Carnegie Mellon University, 2024.
504    $a Includes bibliographical references
520    $a Over the past decade, advancements in deep robot learning have enabled robots to acquire remarkable capabilities. However, these robots often struggle to generalize to new, unseen tasks, highlighting the need for the development of generalist robots. While existing research primarily focuses on enhancing generalization through large-scale pre-training (providing robots with vast datasets and extensive parameters and treating generalization as a naturally emerging trait), this approach does not fully address the complexities of the real world. The real world presents an infinite array of tasks, many of which extend beyond the training scenarios previously encountered by these robots. For example, in healthcare, robots must manage the partial observability resulting from the diverse latent intents of patients, which may not be covered in the dataset. Similarly, autonomous vehicles must navigate unpredictable traffic, weather, and road conditions, which may go beyond the training data. This thesis contends that, alongside scalability, a strong adaptation capability is crucial for improving generalization in real-world applications. It explores strategies for building robots that can adapt effectively at the time of deployment, with a focus on data efficiency, parameter efficiency, and robustness. The study delves into various adaptive learning methods, including in-context robot learning that conditions on a limited number of demonstrations, unsupervised continual reinforcement learning that uncovers the structure of robot tasks, and the use of large foundation models for building embodied agents. These methodologies demonstrate significant potential, enabling robots to acquire new motor skills across diverse applications and to solve complex, long-horizon physical puzzles through creative use of tools.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2024
538    $a Mode of access: World Wide Web
650  4 $a Computer science. $3 573171
650  4 $a Robotics. $3 561941
650  4 $a Mechanical engineering. $3 557493
653    $a Machine learning
653    $a Deep robot learning
653    $a Adaptive learning methods
653    $a Reinforcement learning
655  7 $a Electronic books. $2 local $3 554714
690    $a 0548
690    $a 0771
690    $a 0984
710 2  $a Carnegie Mellon University. $b Mechanical Engineering. $3 1148639
710 2  $a ProQuest Information and Learning Co. $3 1178819
773 0  $t Dissertations Abstracts International $g 85-11B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31294401 $z click for full text (PQDT)
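
For readers less familiar with MARC tagging, the minimal sketch below (plain Python, written for this page rather than taken from any catalogue software; the record dict is hand-copied from the fields above and the helper name format_brief_display is hypothetical) illustrates how a few of the tagged fields map onto the labeled brief display at the top of this record.

# Minimal illustrative sketch: map a few MARC fields from this record to the
# labeled brief display shown above. Field selection and helper name are
# illustrative only; the values are copied verbatim from the record.

record = {
    "020": {"a": "9798382611907"},                                          # ISBN
    "100": {"a": "Xu, Mengdi."},                                            # main entry (author)
    "245": {"a": "Building Adaptable Generalist Robots."},                  # title statement
    "300": {"a": "1 online resource (249 pages)"},                          # physical description
    "773": {"t": "Dissertations Abstracts International", "g": "85-11B."},  # host item
    "856": {"z": "click for full text (PQDT)"},                             # electronic location note
}

def format_brief_display(rec):
    """Render the label/value lines of the brief display from MARC fields."""
    return "\n".join([
        f"Title/Author: {rec['245']['a']}",
        f"Author: {rec['100']['a']}",
        f"Description: {rec['300']['a']}",
        f"Contained by: {rec['773']['t']} {rec['773']['g']}",
        f"Electronic resource: {rec['856']['z']}",
        f"ISBN: {rec['020']['a']}",
    ])

if __name__ == "__main__":
    print(format_brief_display(record))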