Saliency Map for Visual Attention Region Prediction: A Comparison of Two Methods
- DOI: 10.2991/icmii-15.2015.159
- Keywords: Human-Computer Interaction, Saliency Map, Fuzzy Inference, Fuzzy Neural Network, Intention Recognition
Visual attention region prediction has received much attention from researchers in intelligent systems in recent years because it can make interaction between humans and intelligent agents more convenient. Saliency measures the capability of an image detail to attract visual attention and thus guide eye movements in a bottom-up way. Many saliency-map models combine color, intensity, and orientation feature maps by simple normalization and linear summation, which cannot adequately reflect the importance of each feature in the saliency map. Therefore, in this paper, two methods for predicting the visual attention region, one based on fuzzy inference and the other on a fuzzy neural network (FNN), are proposed and compared; both infer the attention region after extracting and computing image feature maps and saliency maps. A method for training the FNN is also proposed. A user experiment was conducted to evaluate and compare the prediction performance of the proposed methods by surveying participants on the prediction results. Furthermore, t-test results show a significant difference between the results obtained by the two methods, indicating that the proposed FNN-based method better predicts the positions of attention regions across different images.
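The baseline fusion the abstract criticizes, and the weighted alternative it argues for, can be sketched in a few lines. This is a minimal illustration assuming 2-D NumPy arrays as feature maps; the function names and the weight source are illustrative, not taken from the paper (in the paper, per-feature importance would come from fuzzy inference or a trained FNN rather than being passed in directly).

```python
import numpy as np

def normalize(fmap):
    # Scale a feature map to [0, 1]; flat maps normalize to all zeros.
    span = fmap.max() - fmap.min()
    return (fmap - fmap.min()) / span if span > 0 else np.zeros_like(fmap)

def saliency_linear(color, intensity, orientation):
    # Baseline fusion: equal-weight linear summation of normalized maps,
    # which treats every feature as equally important.
    maps = [normalize(m) for m in (color, intensity, orientation)]
    return sum(maps) / len(maps)

def saliency_weighted(color, intensity, orientation, weights):
    # Weighted fusion: per-feature importance weights (e.g., produced by
    # fuzzy inference or an FNN) are normalized to sum to 1 and applied.
    maps = [normalize(m) for m in (color, intensity, orientation)]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * m for wi, m in zip(w, maps))
```

With equal weights the two fusions coincide; the point of the learned weights is to let salient features dominate on a per-image basis.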
- © 2015, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Mao Wang
PY  - 2015/10
DA  - 2015/10
TI  - Saliency Map for Visual Attention Region Prediction: A Comparison of Two Methods
BT  - Proceedings of the 3rd International Conference on Mechatronics and Industrial Informatics
PB  - Atlantis Press
SP  - 906
EP  - 914
SN  - 2352-538X
UR  - https://doi.org/10.2991/icmii-15.2015.159
DO  - 10.2991/icmii-15.2015.159
ID  - Wang2015/10
ER  -