The following is a validation report for the data the seller chose to provide:
Data Description
Content
|  | ModCloth | Electronics |
|---|---|---|
| #review | 99,893 | 1,292,954 |
| #item | 1,020 | 9,560 |
| #user | 44,783 | 1,157,633 |
| time span | 2010-2019 | 1999-2018 |
| bias type | body shape | gender |
| product image | Small (838), Small&Large (182) | Female (4,090), Female&Male (2,466), Male (3,004) |
| user identity | Large (9,395), Small (30,140), N/A (5,248) | Female (71,043), Male (61,350), N/A (1,025,240) |
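As a quick sanity check, a short pandas sketch like the one below can reproduce these counts. Note that the file name (`df_modcloth.csv`) and the column names (`item_id`, `user_id`, `model_attr`, `user_attr`) are assumptions about how the data is laid out, not guaranteed by this card; adjust them to the files actually shipped with the dataset.

```python
import pandas as pd

# Hypothetical file and column names; adjust to the actual dataset files.
df = pd.read_csv("df_modcloth.csv")

# Overall counts, which should line up with the ModCloth column of the table above.
print("#review:", len(df))
print("#item:", df["item_id"].nunique())
print("#user:", df["user_id"].nunique())

# Distribution of product-image type and self-reported user identity,
# corresponding to the last two rows of the table (N/A kept visible).
print(df["model_attr"].value_counts(dropna=False))
print(df["user_attr"].value_counts(dropna=False))
```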
Context
Feedback interactions in recommender systems are significantly affected by biases of various kinds, such as selection bias and popularity bias.
The paper “Addressing Marketing Bias in Product Recommendations” focuses on highlighting marketing bias in consumer-product interaction data and its effect on recommendation algorithms.
When bias creeps into recommender systems, the ramifications can include:
- product retailers losing potential consumers
- users struggling to find relevant products
- ethical and social concerns
A figure in the paper illustrates how the same product can be marketed with different human images (different body shapes, different genders) and thereby affect consumers’ behaviour, resulting in a biased interaction dataset, which is commonly used as the input for modern recommender systems.
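One simple way to look for this bias in the raw interactions is a contingency table of user identity against the product-image type of the items each user reviewed. The sketch below is illustrative only and reuses the same hypothetical file and column names as the previous snippet.

```python
import pandas as pd

# Hypothetical file/column names; see the note above the previous sketch.
df = pd.read_csv("df_modcloth.csv")

# Cross-tabulate user identity against product-image type, normalised per
# user segment. A skew toward "matching" segments would be a first hint of
# the marketing bias the paper studies.
crosstab = pd.crosstab(df["user_attr"], df["model_attr"], normalize="index")
print(crosstab.round(3))
```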
Acknowledgements
The Kaggle course Intro to AI Ethics sparked my interest in understanding the moral design of AI systems. 👀
Paper Citation: Mengting Wan, Jianmo Ni, Rishabh Misra, Julian McAuley. “Addressing Marketing Bias in Product Recommendations.” WSDM, 2020.
License: Apache License 2.0
