Emotion detection technology has carved out a place for itself across a growing range of applications, from medicine and e-learning to entertainment and law. While the field spans many important areas, today's news is about entertainment, and how emotion recognition can make listening simpler and more interactive.
It is often said that a song can reflect your mood, lifestyle and preferences. Until now, that was a psychological intuition at best. A research team from the University of Strathclyde, Glasgow, has taken a step towards a data-oriented tool that could select music clips to suit a listener's mood and taste. The work draws on a catalogue of 150 YouTube videos created by U2 fans, specially curated and analysed with a range of methods, both visual and musical; the results are also broken down by location.
Dr Diane Pennington, a Lecturer in Strathclyde's Department of Computer and Information Sciences and an associate on the project, explained that although music cannot be neatly classified by emotion from its tone alone, it clearly evokes emotion when it reaches the human ear. The search software currently running at the back-end of music services lacks the sophistication to retrieve music by the feelings it conveys. Dr Pennington added that she chose to work with U2 material because of the high emotional content of the band's concerts.
The catalogue revealed how fans responded to the music: some produced customised versions of songs set to colourful pictures of their loved ones or layered with background sounds, while others posted original clips featuring U2 posters, T-shirts and photos. Emotion, being intangible, is notoriously difficult to quantify, but the research points to a procedure that could help commercial music services design individual playlists for their customers. It could also be developed further into a system for delivering music therapy. The full paper was published in the Journal of Documentation.
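The paper does not publish an algorithm, but the playlist idea can be sketched in a few lines. Assuming (hypothetically, for illustration only) that each clip carries emotion labels with strength scores, a service could filter and rank clips that match a listener's stated mood. All clip names, emotion labels and scores below are invented for the example.

```python
# Illustrative sketch only: the study itself does not publish this method.
# Clips, emotion labels, and scores are hypothetical example data.
from dataclasses import dataclass


@dataclass
class Clip:
    title: str
    emotions: dict  # emotion label -> strength score in [0.0, 1.0]


def build_playlist(clips, mood, limit=3):
    """Rank clips by how strongly they express the requested mood."""
    scored = [(clip.emotions.get(mood, 0.0), clip) for clip in clips]
    # Keep only clips that express the mood at all, strongest first.
    scored = [(score, clip) for score, clip in scored if score > 0]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [clip.title for _, clip in scored[:limit]]


catalogue = [
    Clip("Fan video A", {"joy": 0.9, "nostalgia": 0.4}),
    Clip("Fan video B", {"sadness": 0.8}),
    Clip("Fan video C", {"joy": 0.5, "hope": 0.7}),
]

print(build_playlist(catalogue, "joy"))  # ['Fan video A', 'Fan video C']
```

A real system would need to infer those emotion scores automatically, which is exactly the retrieval problem the Strathclyde research addresses.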
Source: University of Strathclyde | Journal of Documentation