Thomas, Titus Pallithottathu.
The Emotional Impact of Audio - Visual Stimuli.
Record type: Bibliographic - Language material, manuscript : Monograph/item
Title/Author: The Emotional Impact of Audio - Visual Stimuli.
Author: Thomas, Titus Pallithottathu.
Physical description: 1 online resource (77 pages)
Notes: Source: Masters Abstracts International, Volume: 56-06.
Subject: Computer engineering.
Electronic resource: click for full text (PQDT)
ISBN: 9780355169379
Thomas, Titus Pallithottathu.
The Emotional Impact of Audio - Visual Stimuli. - 1 online resource (77 pages)
Source: Masters Abstracts International, Volume: 56-06.
Thesis (M.S.)--Rochester Institute of Technology, 2017.
Includes bibliographical references.
Induced affect is the emotional effect of an object on an individual. It can be quantified through two metrics: valence and arousal. Valence quantifies how positive or negative something is, while arousal quantifies the intensity, from calm to exciting. These metrics enable researchers to study how people opine on various topics. Affective content analysis of visual media is a challenging problem due to differences in perceived reactions. Industry-standard machine learning classifiers such as Support Vector Machines can be used to help determine user affect. The best affect-annotated video datasets are often analyzed by feeding large amounts of visual and audio features through machine-learning algorithms. The goal is to maximize accuracy, with the hope that each feature will bring useful information to the table.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2018.
Mode of access: World Wide Web.
ISBN: 9780355169379
Subjects--Topical Terms: Computer engineering.
Index Terms--Genre/Form: Electronic books.
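The abstract above frames affect prediction as two classification targets, valence and arousal, handled by classifiers such as support vector machines over audio-visual features. As a minimal, hypothetical sketch of that setup (scikit-learn with random placeholder features and labels, not the thesis's actual code or data), one classifier per metric might look like:

# Hypothetical sketch: two independent SVM classifiers, one for valence and one
# for arousal, trained on pre-extracted audio-visual feature vectors.
# All features and labels below are random placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))            # placeholder audio-visual feature vectors
y_valence = rng.integers(0, 2, size=200)   # 0 = negative, 1 = positive
y_arousal = rng.integers(0, 2, size=200)   # 0 = calm, 1 = exciting

X_tr, X_te, yv_tr, yv_te, ya_tr, ya_te = train_test_split(
    X, y_valence, y_arousal, test_size=0.25, random_state=0)

valence_clf = SVC(kernel="rbf").fit(X_tr, yv_tr)   # one classifier per metric
arousal_clf = SVC(kernel="rbf").fit(X_tr, ya_tr)

print("valence accuracy:", valence_clf.score(X_te, yv_te))
print("arousal accuracy:", arousal_clf.score(X_te, ya_te))

Treating valence and arousal as separate binary problems mirrors how the two metrics are defined independently in the abstract; with placeholder data the accuracies are only illustrative.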
LDR  03037ntm a2200361K 4500
001  914375
005  20180703084808.5
006  m o u
007  cr mn||||a|a||
008  190606s2017 xx obm 000 0 eng d
020  $a 9780355169379
035  $a (MiAaPQ)AAI10619818
035  $a (MiAaPQ)rit:12747
035  $a AAI10619818
040  $a MiAaPQ $b eng $c MiAaPQ
100 1 $a Thomas, Titus Pallithottathu. $3 1187612
245 1 4 $a The Emotional Impact of Audio - Visual Stimuli.
264 0 $c 2017
300  $a 1 online resource (77 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Masters Abstracts International, Volume: 56-06.
500  $a Adviser: Raymond Ptucha.
502  $a Thesis (M.S.)--Rochester Institute of Technology, 2017.
504  $a Includes bibliographical references
520  $a Induced affect is the emotional effect of an object on an individual. It can be quantified through two metrics: valence and arousal. Valence quantifies how positive or negative something is, while arousal quantifies the intensity, from calm to exciting. These metrics enable researchers to study how people opine on various topics. Affective content analysis of visual media is a challenging problem due to differences in perceived reactions. Industry-standard machine learning classifiers such as Support Vector Machines can be used to help determine user affect. The best affect-annotated video datasets are often analyzed by feeding large amounts of visual and audio features through machine-learning algorithms. The goal is to maximize accuracy, with the hope that each feature will bring useful information to the table.
520  $a We depart from this approach to quantify how different modalities such as visual, audio, and text description information can aid in understanding affect. To that end, we train independent models for visual, audio, and text description. Each is a convolutional neural network paired with a support vector machine to classify valence and arousal. We also train various ensemble models that combine multi-modal information, with the hope that the information from the independent modalities will be mutually beneficial.
520  $a We find that our visual network alone achieves state-of-the-art valence classification accuracy and that our audio network, when paired with our visual network, achieves competitive results on arousal classification. Each network is much stronger on one metric than the other. This may lead to more sophisticated multimodal approaches to accurately identifying affect in video data.
520  $a This work also contributes to induced emotion classification by augmenting existing sizable media datasets and providing a robust framework for classifying the same.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2018
538  $a Mode of access: World Wide Web
650  4 $a Computer engineering. $3 569006
650  4 $a Computer science. $3 573171
655  7 $a Electronic books. $2 local $3 554714
690  $a 0464
690  $a 0984
710 2 $a ProQuest Information and Learning Co. $3 1178819
710 2 $a Rochester Institute of Technology. $b Computer Engineering. $3 1184443
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10619818 $z click for full text (PQDT)
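The 520 notes above describe training an independent convolutional network per modality (visual, audio, text description), pairing each with a support vector machine, and then combining the modalities in an ensemble. The sketch below illustrates only the fusion idea in a hypothetical form: per-modality SVM decision scores averaged into a single prediction. The feature matrices, dimensions, and split are placeholders, not the thesis's actual pipeline or data.

# Hypothetical late-fusion sketch: one SVM per modality, decision scores averaged.
# Modality features and labels are random placeholders standing in for the
# per-modality CNN features mentioned in the abstract.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 300
modalities = {
    "visual": rng.normal(size=(n, 256)),
    "audio": rng.normal(size=(n, 64)),
    "text": rng.normal(size=(n, 32)),
}
y = rng.integers(0, 2, size=n)  # e.g. a valence label

train, test = slice(0, 225), slice(225, n)
scores = []
for name, X in modalities.items():
    clf = SVC(kernel="rbf").fit(X[train], y[train])
    scores.append(clf.decision_function(X[test]))   # per-modality decision score

fused = np.mean(scores, axis=0)                      # simple score-level fusion
pred = (fused > 0).astype(int)
print("fused accuracy:", (pred == y[test]).mean())

Score-level (late) fusion like this keeps each modality's model independent, which loosely matches the abstract's observation that each network is much stronger on one metric than the other; with random placeholder data the printed accuracy is only illustrative.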