What's in Your Face? Discrimination in Facial Recognition Technology
| Content Provider | Semantic Scholar |
|---|---|
| Author | Wang, J. |
| Copyright Year | 2018 |
| Abstract | This paper examines discrimination in facial recognition technology (FRT) and how to mitigate it in the contexts of academia, product development, and industrial research. FRT is the automated processing of human faces. In recent years, given the rapid development of machine learning techniques, FRT has gained considerable momentum. FRT is increasingly trained on extraordinarily large datasets with sophisticated algorithms, and its accuracy has improved to the point that it surpasses human performance. Applications of FRT have emerged in a variety of fields, such as surveillance, military, security, and e-commerce. At the same time, many ethical issues have been raised. In this paper, two types of FRT applications are distinguished—identification and classification. The former searches for and matches a captured face against a target database to pinpoint a person's identity, while the latter classifies people into groups according to properties drawn from their facial features, for example, gender, race, age, and sexual orientation. The latter type raises serious discrimination issues, because the training data is inherently biased, and it could easily be used to develop discriminatory applications and increase the number of people who suffer from discrimination. In order to mitigate the discrimination issue, three types of FRT design practices are identified—product development, academic research, and industrial research. |
| File Format | PDF, HTM/HTML |
| Alternate Webpage(s) | https://repository.library.georgetown.edu/bitstream/handle/10822/1050752/Wang_georgetown_0076M_14043.pdf?isAllowed=y&sequence=1 |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |