Tag Expression: Tagging with Feeling. Jesse Vig, Matthew Soukup, Shilad Sen, John Riedl. UIST 2010.
Tag Expression: Tagging with Feeling
Jesse Vig, Matthew Soukup, Shilad Sen, John Riedl
UIST 2010

February 23, 2011, Hyewon Lim
Outline
– Introduction
– Design of Tag Expression
– Experimental Methods
– Empirical Evaluation
– Impact on Tagging System
– Conclusions
Introduction
Existing tagging systems do not explicitly capture user preference
– Web users express preference in many ways: rating movies on Netflix, digging articles on Digg, writing book reviews on Amazon, …
– Rather than simply indicating how much they like something, users explain why they like or dislike something
Rating systems
– Typically accept preferences that are narrow and explicit
– Easily machine-readable
Free-form text
– Rich expression, but requires more effort from users
– Implicit rating
Introduction
Tag expression
– Bridges the gap between traditional narrow-but-explicit rating systems and broad-but-implicit text review systems
– Users explain their preferences for an item by choosing tags & associating each with one of three affects
Affect measures a user's pleasure or displeasure with the item w.r.t. the tag
E.g., for the movie "Speed": like action, dislike Keanu Reeves, neutral on Sandra Bullock
– Enables users to share their tags and the associated affect with the community
Design of Tag Expression
MovieLens (www.movielens.org)
– Primary purpose is movie recommendation: users rate movies on a scale of ½ to 5
– Launched in 1997; tagging was introduced in January 2006
– Users may apply tags on the movie details page and search results page; 86.2% of tagging activity occurred on the movie details page
Design of tag expression focused on three elements
– Preference dimensions
– Affect expression
– Display of community affect
Design of Tag Expression: Preference dimensions
How to choose the appropriate set of preference dimensions
– Expert-based approach: domain experts or system designers hand-pick the dimensions; experts cost money & may not be able to anticipate the varied and changing interests of the user community
– User-based approach*: users define the space of preferences; tagging is well-suited for articulating preferences (tags are atomic; tagging supports collaboration between users)
Tagging systems face many challenges
– Redundant tags (dark comedy, black comedy)
– Low quality tags (bah)
– Personal tags (Erlend's DVDs)
Design of Tag Expression: Expressing affect
How users associate affect with each of the preference dimensions
1. Rating scale
– Range of values that a user may associate with a preference dimension (tag)
– Use a ternary scale with positive, negative, and neutral options
2. Granularity
– Per-tag approach: user associates affect with a tag in isolation, independent of any particular item (e.g., like or dislike Arnold Schwarzenegger)
– Per-item-tag approach*: user associates affect with a (movie, tag) pair (e.g., like Arnold Schwarzenegger for "Terminator" but dislike Arnold Schwarzenegger for "Kindergarten Cop")
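The two granularity models above can be sketched as plain data (a minimal sketch with hypothetical user and movie names; affect is encoded as +1/0/-1):

```python
# Minimal sketch of the two granularity models (hypothetical names).
# Affect is ternary: +1 (like), 0 (neutral), -1 (dislike).

# Per-tag: one affect value per (user, tag), independent of any item.
per_tag = {
    ("alice", "arnold schwarzenegger"): -1,
}

# Per-item-tag: affect attaches to a (user, movie, tag) triple, so the
# same tag can carry different affect on different movies.
per_item_tag = {
    ("alice", "Terminator", "arnold schwarzenegger"): +1,
    ("alice", "Kindergarten Cop", "arnold schwarzenegger"): -1,
}
```

The per-item-tag keying is what lets one user like and dislike the same tag on different movies, which a per-tag table cannot represent.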
Design of Tag Expression: Expressing affect
How users associate affect with each of the preference dimensions
3. Interface
– Displays an affect container for each of the three affect values
Design of Tag Expression: Displaying community affect
1. Aggregation function
– Plurality voting: single-winner voting system; using this approach, simply choose the affect value applied by the most users to the (movie, tag) pair
– Proportional representation*: multi-winner voting system; using this model, display a histogram showing the distinct affect values applied and their relative frequencies
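The two aggregation functions can be sketched as follows (a minimal sketch, not the authors' code; function names are hypothetical and affect is encoded as +1/0/-1):

```python
from collections import Counter

def plurality(affects):
    """Single-winner: the affect value applied by the most users."""
    return Counter(affects).most_common(1)[0][0]

def proportional(affects):
    """Multi-winner: each distinct affect value with its relative
    frequency, i.e. the data behind a histogram display."""
    counts = Counter(affects)
    return {a: n / len(affects) for a, n in counts.items()}

# Four users' affect for one (movie, tag) pair.
votes = [+1, +1, 0, -1]
winner = plurality(votes)        # the single most common affect
histogram = proportional(votes)  # relative frequency of each affect
```

Plurality discards minority opinions entirely, while the proportional model preserves them, which is why the deck marks proportional representation as the chosen design.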
Design of Tag Expression: Displaying community affect
2. Visualization
– Tag clouds* and tag lists
– Augment the tag cloud with affect information via manipulation of a tag's color
Five color bins over mean affect: -1.0 ~ -0.6, -0.6 ~ -0.2, -0.2 ~ 0.2, 0.2 ~ 0.6, 0.6 ~ 1.0
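The color mapping can be sketched as below (bin boundaries are from the slide; the label names and the handling of scores exactly on a boundary are assumptions):

```python
def affect_color_bin(score):
    """Map a mean affect score in [-1, 1] to one of the five color
    bins from the slide. Labels are hypothetical stand-ins for the
    actual tag-cloud colors."""
    boundaries = [-0.6, -0.2, 0.2, 0.6]
    labels = ["strongly negative", "negative", "neutral",
              "positive", "strongly positive"]
    for upper, label in zip(boundaries, labels):
        if score < upper:
            return label
    return labels[-1]  # score >= 0.6
```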
Design of Tag Expression: Other considerations
Incorporated tag expression into the search results page
Created two additional temporary pages
– Introduction page & affect migration page
Experimental Methods
Analyses were based on activity logs
– May 27, 2009 ~ August 27, 2009
– Tracked the tags and associated affect
– Tag expression was introduced on April 27, 2009 (the first month was excluded)
For comparing tagging activity under tag expression with the previous tagging system
– February 2006 ~ April 2009
– Tagging activity and user activity were fairly consistent over time
Experimental Methods
For analyzing tagging behavior
– Online survey to measure user satisfaction and explore motivations for using tag expression
Compared the tag expression interface (A) to the previous tagging interface (B)
– Based on overall preference and with respect to specific tasks
– Selecting one of the responses: A is much better, A is better, both are about the same, B is better, B is much better
Rated several reasons for using tag expression on a 5-point Likert scale
– Reasons: contribute to the community, self-expression, improve recommendations, organize movies, fun, curiosity
– 5 points: strongly agree, agree, neutral, disagree, strongly disagree
Empirical Evaluation
1. Preference dimensions
– Users expressed positive or negative affect for the majority (61.8%) of tag applications
– Survey result: users largely tagged in order to express preference
Empirical Evaluation
2. Expressing affect
– Affect distribution: positive affect 53.4%, neutral affect 38.2%, negative affect 8.4%
– Relative benefits of negative vs. positive affect depend on how tag expression is used (recommendation systems, predicting a user's preference, …)
Empirical Evaluation
2. Expressing affect
– Affect distribution: certain tags elicited a wider range of affect values than others
– Grouped the affects applied to a given tag and measured the entropy of the affect distribution
High-entropy tags (entropy in parentheses):
– Users have differing preferences: bleak (0.982), gore (0.974), boxing (0.970)
– Neutral tags that typically require a qualifier: acting (0.980)
– Specific actors: keanu reeves (0.972), julia roberts (0.969), kevin costner (0.968), eddie murphy (0.968)
– Culturally sensitive topics: incest (0.966), torture (0.964)
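The entropy measure can be sketched as below (assuming base-2 Shannon entropy over the affect values applied with one tag; the paper's exact formulation is not given on the slide):

```python
from collections import Counter
from math import log2

def affect_entropy(affects):
    """Shannon entropy (bits) of the affect values applied with one
    tag; higher entropy means wider disagreement among taggers."""
    counts = Counter(affects)
    total = len(affects)
    return -sum((n / total) * log2(n / total) for n in counts.values())
```

A tag everyone marks the same way has entropy 0; an even like/dislike split has entropy 1 bit, in the range of the high-disagreement tags listed above.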
Empirical Evaluation
2. Expressing affect
– Granularity of affect: how users chose affect when applying the same tag to different movies
– In 75.9% of cases, users chose uniform affect when applying a tag to multiple movies: the per-tag model may have been sufficiently expressive to capture user preference, but per-item-tag was needed for a non-trivial percentage (24.1%) of cases
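The uniform-affect statistic can be computed as follows (a sketch over a hypothetical (user, movie, tag, affect) log, not the authors' code):

```python
def uniform_affect_share(applications):
    """Fraction of (user, tag) pairs, among those applied to two or
    more movies, where the user chose the same affect every time.
    applications: iterable of (user, movie, tag, affect) tuples
    (hypothetical schema)."""
    by_user_tag = {}
    for user, movie, tag, affect in applications:
        by_user_tag.setdefault((user, tag), []).append(affect)
    multi = [v for v in by_user_tag.values() if len(v) > 1]
    if not multi:
        return 0.0
    return sum(len(set(v)) == 1 for v in multi) / len(multi)
```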
3. Displaying affect
– Breadth of opinion: counted the number of distinct (movie, tag) pairs of each color across all movies
– 48.4% positive affect, 44.3% neutral affect, 7.3% negative affect
Empirical Evaluation
3. Displaying affect
– Breadth of opinion (cont'd)
– Scarcity of negative affect: users overwhelmingly express positive affect
– Relationship between affect expressed and aggregate affect displayed
Empirical Evaluation
3. Displaying affect
– Overall community preference: computed the average rating of each movie among just its taggers as well as over the entire user population
– Mean movie rating among taggers was 3.86, compared to 3.63 among all users → positive bias
Empirical Evaluation
4. User satisfaction
– Survey results measuring user satisfaction:
“I like A because it allows me to mark tags as like, dislike, and neutral.”
“A is far better than B; it shows more information and allows much clearer ideas of what the community thinks about a film.”
Impact on Tagging System
1. Tagging volume
– Tags: 206% increase (3,031 → 9,273)
– More users (+44%) tagged under tag expression, and they applied significantly more tags (+68%) to each movie they tagged
– Tag reuse: 26.7% → 69.2%
– New tags: 2,220 → 2,869 per month
2. Tag diversity
– Measured diversity based on (1) the number of distinct tags and (2) the Shannon entropy of tag applications over distinct tags
– The number of distinct tags per month: +119.5%; entropy: +12.4%
Impact on Tagging System
3. Tag quality
– Used tag "searchability" as a measure
– Searchability of a tag: the number of distinct users who have searched for items using that tag
– +48% mean searchability & +168% median searchability
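The searchability measure, as defined above, can be sketched as follows (the (user, tag) search-event schema is an assumption):

```python
def searchability(search_log):
    """Searchability of each tag: the number of distinct users who
    have searched for items using that tag. search_log is an iterable
    of (user, tag) search events (hypothetical schema)."""
    users_per_tag = {}
    for user, tag in search_log:
        users_per_tag.setdefault(tag, set()).add(user)
    return {tag: len(users) for tag, users in users_per_tag.items()}
```

Counting distinct users rather than raw searches means one user repeatedly searching a tag does not inflate its quality score.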
4. Types of tags
– Earlier work divided tags into three classes: factual, subjective, personal
– Users generally preferred factual tags to the others
Conclusions
Preferences are shared
– Users can see in aggregate how the overall community feels about a movie
Tag expression encouraged more community-oriented tagging behavior
– Reused others' tags more frequently
– The number of active taggers per month: +44%
– 79% of users preferred tag expression to a traditional system
Future work
– Explore recommender algorithms that utilize tag expression data