Yan, Zhisheng.
Sustained Mobile Visual Computing : A Human-Centered Perspective.
Record type: Bibliographic - Language material, manuscript : Monograph/item
Title proper/Author: Sustained Mobile Visual Computing :
Other title: A Human-Centered Perspective.
Author: Yan, Zhisheng.
Description: 1 online resource (164 pages)
Notes: Source: Dissertation Abstracts International, Volume: 78-11(E), Section: B.
Contained By: Dissertation Abstracts International, 78-11B(E).
Subject: Computer science.
Electronic resource: click for full text (PQDT)
ISBN: 9780355047349
LDR      05072ntm a2200373Ki 4500
001      910773
005      20180517112610.5
006      m o u
007      cr mn||||a|a||
008      190606s2017 xx obm 000 0 eng d
020      $a 9780355047349
035      $a (MiAaPQ)AAI10283803
035      $a (MiAaPQ)buffalo:15143
035      $a AAI10283803
040      $a MiAaPQ $b eng $c MiAaPQ
099      $a TUL $f hyy $c available through World Wide Web
100 1    $a Yan, Zhisheng. $3 1182227
245 1 0  $a Sustained Mobile Visual Computing : $b A Human-Centered Perspective.
264 0    $c 2017
300      $a 1 online resource (164 pages)
336      $a text $b txt $2 rdacontent
337      $a computer $b c $2 rdamedia
338      $a online resource $b cr $2 rdacarrier
500      $a Source: Dissertation Abstracts International, Volume: 78-11(E), Section: B.
500      $a Adviser: Chang Wen Chen.
502      $a Thesis (Ph.D.) $c State University of New York at Buffalo $d 2017.
504      $a Includes bibliographical references
520      $a Visual content is the most natural abstraction of the real world. Interacting with various types of visual content via mobile devices, anytime and anywhere, promises the future of mobile computing. Such sustained mobile visual computing will revolutionize cyber-physical systems, especially cyber-human interaction, in numerous applications from education and entertainment to infrastructure and healthcare monitoring. However, today's systems, highly optimized for simple graphical user interfaces or stationary devices, fail to achieve the energy and bandwidth efficiency required by this long-term vision. In this dissertation, we propose a human-centered perspective to bridge this gap by understanding human perception of dynamic visual content under the specific mobile context.
520      $a In particular, this dissertation makes a simple yet fundamental switch in system design: exposing the subjective human perception of dynamic content in mobile contexts to the decision modules of mobile visual computing, instead of relying purely on objective performance metrics. The system can then perform intelligent adaptation and control to boost resource efficiency. We have redesigned four typical mobile visual computing systems to reveal the benefits of the human-centered perspective. First, ShutPix leverages the unnecessarily high pixel density of smartphones and the limited visual acuity of human eyes, selectively shutting off redundant subpixels to save image display power without impacting the mobile viewing experience. Second, CrowdDBS shows that dynamic brightness scaling during mobile video playback can remain visually acceptable across a range of scaling frequencies, magnitudes, and temporal consistency levels, and presents a crowdsourced brightness scaling scheme to minimize mobile video display energy. Third, RnB pushes brightness scaling into video streaming and introduces a joint rate and brightness adaptation framework for mobile video streaming, which shifts the classic Rate-Distortion tradeoff to a fresh Rate-Distortion-Energy tradeoff tailored for mobile devices. Fourth, Prius targets multi-client mobile video streaming and develops a hybrid adaptation framework that overlays a layer of adaptation intelligence at the edge cloud to finalize the rate adaptation decisions initialized by the clients, thereby overcoming playback unfairness and bandwidth inefficiency.
520      $a The high-level contribution of this dissertation lies in building a strong connection between human vision theory and mobile system design. First, this work is a significant step toward showing that human vision characteristics can be accurately modeled and cleanly integrated into commercial off-the-shelf smartphones to deliver practical and measurable gains. Second, this dissertation presents novel mobile visual computing algorithms that enrich the theory of the human vision system, extending it to operate over subpixel shutoff, dynamic brightness scaling, joint bitrate and brightness adaptation, and multi-client video adaptation. Third, this dissertation makes a clear departure by blurring the border between applications and lower-layer/hardware support, allowing visual computing applications, the lower layers, and the hardware to collaborate on the common objective of enhancing user experience and resource efficiency. Finally, this work validates the feasibility and performance of the proposed designs using extensive analysis, simulation, and testbed implementation. The results show that human-centered mobile visual computing can achieve substantial efficiency improvements, from a few percent to several-fold, depending on the visual content, mobile device, and network environment.
533      $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538      $a Mode of access: World Wide Web
650 4    $a Computer science. $3 573171
650 4    $a Computer engineering. $3 569006
655 7    $a Electronic books. $2 local $3 554714
690      $a 0984
690      $a 0464
710 2    $a ProQuest Information and Learning Co. $3 1178819
710 2    $a State University of New York at Buffalo. $b Computer Science and Engineering. $3 1180201
773 0    $t Dissertation Abstracts International $g 78-11B(E).
856 4 0  $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10283803 $z click for full text (PQDT)
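
The MARC fields above are straightforward to consume programmatically. As a minimal sketch only (assuming the record has been exported as binary MARC to a hypothetical file named record.mrc and that the pymarc library is available), the following pulls the title (245), the abstract paragraphs (520), and the full-text link (856 $u) out of a record shaped like this one:

    # Minimal sketch: reading a record like the one above with pymarc.
    # Assumptions: pymarc is installed, and the record was exported as
    # binary MARC to "record.mrc" (a hypothetical file name).
    from pymarc import MARCReader

    with open("record.mrc", "rb") as fh:
        for record in MARCReader(fh):
            title = record["245"]                      # 245 $a / $b: title and subtitle
            print(title["a"], title["b"])
            for abstract in record.get_fields("520"):  # 520 $a: abstract (repeatable)
                print(abstract["a"])
            print(record["856"]["u"])                  # 856 $u: full-text URL (PQDT)

get_fields and the ["a"]-style subfield access are standard pymarc accessors; any equivalent MARC library would read this record the same way.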