Multimodal Image Search System on Mobile Device
International Journal of Computer & Organization Trends (IJCOT)
© 2014 by IJCOT Journal
Volume - 4 Issue - 2
Year of Publication : 2014
Authors : Anushma C R
DOI : 10.14445/22492593/IJCOT-V7P305
Citation
Anushma C R. "Multimodal Image Search System on Mobile Device", International Journal of Computer & Organization Trends (IJCOT), V4(2):65-68 Mar - Apr 2014, ISSN:2249-2593, www.ijcotjournal.org. Published by Seventh Sense Research Group.
Abstract
Mobile phones have evolved into powerful image and video processing devices equipped with built-in cameras, color displays, and hardware-accelerated graphics. These features allow users to issue multimodal queries when searching for information on the go from the World Wide Web. In this paper, we propose a multimodal image search system that fully utilizes the multimodal and multi-touch functionalities of smartphones. The system allows searching for images on the web using either an existing image query or a speech query, with the help of an existing image search engine. If users do not have an existing image or captured photo to use as a query, they can input a speech query that clearly describes the picture they have in mind. The proposed system enhances the mobile search experience and increases the relevance of search results, offering a natural interactive process through which users can express their search intent precisely.
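As a rough illustration of the query-dispatch flow the abstract describes, the Python sketch below routes a multimodal query to a text-capable image search backend: an image query is uploaded directly (query by example), while a speech query is first transcribed into a textual picture description and then searched by text. This is a minimal sketch under stated assumptions, not the paper's implementation; transcribe_speech, SEARCH_ENDPOINT, the request format, and the response shape are all hypothetical placeholders, since the underlying search engine's API is not specified in the paper.

# Minimal sketch of the multimodal query dispatch described above.
# NOTE: SEARCH_ENDPOINT, transcribe_speech(), and the assumed JSON
# response shape are hypothetical; the paper builds on an existing
# image search engine whose API is not given here.

from dataclasses import dataclass
from typing import List, Optional

import requests

SEARCH_ENDPOINT = "https://example.com/image-search"  # hypothetical URL


@dataclass
class Query:
    image_path: Optional[str] = None      # existing image or captured photo
    speech_audio: Optional[bytes] = None  # recorded spoken picture description


def transcribe_speech(audio: bytes) -> str:
    """Hypothetical ASR step: turn the spoken description into query text."""
    raise NotImplementedError("plug in a real speech recognizer here")


def search_images(query: Query) -> List[str]:
    """Dispatch the query to the underlying image search engine."""
    if query.image_path is not None:
        # Query by example: upload the image itself.
        with open(query.image_path, "rb") as f:
            resp = requests.post(SEARCH_ENDPOINT, files={"image": f})
    elif query.speech_audio is not None:
        # Speech query: transcribe first, then search by text.
        text = transcribe_speech(query.speech_audio)
        resp = requests.get(SEARCH_ENDPOINT, params={"q": text})
    else:
        raise ValueError("provide either an image or a speech query")
    resp.raise_for_status()
    return resp.json()["image_urls"]  # assumed response field

Keeping the dispatcher independent of any particular recognizer or search backend mirrors the paper's point that the system leverages an existing image search engine rather than building retrieval from scratch.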
References
[1] Google Goggles. http://www.google.com/mobile/goggles/.
[2] SnapTell. http://www.snaptell.com/.
[3] Philip R. Cohen, Michael Johnston, David McGee, Sharon Oviatt, Jay Pittman, Ira Smith, Liang Chen and Josh Clow. QuickSet: multimodal interaction for distributed applications. In MULTIMEDIA ’97: Proceedings of the fifth ACM international conference on Multimedia, pages 31–40, New York, NY, USA, 1997. ACM.
[4] P. Ehlen and M. Johnston. Speak4it: Multimodal Interaction for Local Search. 2010.
[5] M. Johnston and P. Ehlen. Speak4it: Multimodal Interaction in the Wild. AT&T Labs Research, 2010, pp. 147–148.
[6] X. Xie, L. Lu, M. Jia, H. Li, F. Seide, and W.-Y. Ma. Mobile Search with Multimodal Queries. Proceedings of the IEEE, 96(4):589–601, April 2008.
[7] X. Fan, X. Xie, Z. Li, M. Li, and W.-Y. Ma. Photo-to-Search: Using Multimodal Queries to Search the Web from Mobile Devices. 2005, pp. 143–150.
[8] H. Wang and C. Wang. MindFinder: Interactive Sketch-based Image Search. pp. 1605–1608.
[9] M. Flickner, H. Sawhney, W. Niblack, J. Ashley, Q. Huang, B. Dom, M. Gorkani, J. Hafner, D. Lee, D. Petkovic, and P. Yanker, "Query by image and video content: The QBIC system," IEEE Computer, vol. 28, no. 9, pp. 23-32, Sep. 1995
[10] A. Axenopoulos, P. Daras, S. Malassiotis, and V. Croce. I-SEARCH: A Unified Framework for Multimodal Search and Retrieval. pp. 130–141.
[11] T. Zahariadis, P. Daras, J. Bouwen, N. Niebert, D. Griffin, F. Alvarez, and G. Camarillo. "Towards a Content-Centric Internet", Towards the Future Internet - Emerging Trends from European Research, IOS Press, ISBN 978-1-60750-539-6, pp. 227-236, Apr 2010.
[12] N. Zhang, T. Mei, X.-S. Hua, L. Guan, and S. Li. Tap-to-Search: Interactive and Contextual Visual Search on Mobile Devices. 2011 IEEE 13th International Workshop on Multimedia Signal Processing, pp. 1–5. doi:10.1109/MMSP.2011.6093802.
Keywords
multimodal search, visual search, mobile phone, interactive search, information retrieval.