Please use this identifier to cite or link to this item:
https://anrows.intersearch.com.au/anrowsjspui/handle/1/11383
Record ID: 5fcb7bba-b6ad-4d77-8da3-b504636eaf12
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Clough, Jonathan | en |
dc.contributor.author | Cooke, Talani | en |
dc.contributor.author | Powell, Anastasia | en |
dc.contributor.author | Flynn, Asher | en |
dc.contributor.author | Sugiura, Lisa | en |
dc.coverage.spatial | National | en |
dc.date.accessioned | 2022-06-30T22:47:15Z | - |
dc.date.available | 2022-06-30T22:47:15Z | - |
dc.date.issued | 2021 | en |
dc.identifier.isbn | 9783030837334 (hardcover) | en |
dc.identifier.isbn | 3030837335 (hardcover) | en |
dc.identifier.uri | https://anrows.intersearch.com.au/anrowsjspui/handle/1/11383 | - |
dc.format | xxxi, 722 pages ; 22 cm | en |
dc.language | en | en |
dc.publisher | Palgrave Macmillan | en |
dc.subject | Technology - Social aspects | en |
dc.subject | Women - Abuse of | en |
dc.subject | Women - Crimes against | en |
dc.subject | Women - Violence against | en |
dc.title | The Palgrave Handbook of Gendered Violence and Technology | en |
dc.title.alternative | The Palgrave handbook of gendered violence and technology | en |
dc.type | Book Chapter | en |
dc.identifier.doi | https://doi.org/10.1007/978-3-030-83734-1_29 | en |
dc.identifier.catalogid | 17219 | en |
dc.subject.readinglist | National | en |
dc.subject.readinglist | Policing and legal responses | en |
dc.subject.readinglist | Technology-facilitated abuse | en |
dc.subject.readinglist | ANROWS Notepad 2022 February 16 | en |
dc.description.notes | Artificial Intelligence (AI) is transforming the landscape of technology-facilitated abuse. In late 2017, a Reddit user uploaded a series of ‘fake’ pornographic videos transposing female celebrities’ faces onto the bodies of pornography actors. This was the first documented example of amateur deepfakes appearing in the mainstream. Since then, the commercialisation of AI-technologies has meant anyone with a social media or online profile—or indeed, who has had an image or video taken of them—is at potential risk of being ‘deepfaked’. AI-technologies have essentially eliminated the need for victims and abusers to have any kind of personal relationship or interaction, which substantially expands the pool of potential deepfake abusers and targets. As a result, new demands exist on the types of interventions needed to prevent, disrupt and respond to this form of abuse. In this chapter, drawing from an analysis of Australian criminal law, we consider whether legal responses are keeping pace with these ever-changing tools to abuse. We conclude by providing recommendations for future, multifaceted responses to deepfake abuse and the need for further research in this space. | en |
dc.identifier.source | The Palgrave handbook of gendered violence and technology | en |
dc.date.entered | 2022-02-15 | en |
dc.subject.list | ANROWS Notepad 2022 February 16 | en |
dc.subject.anratopic | Policing and legal responses | en |
dc.subject.anratopic | Technology-facilitated abuse | en |
dc.publisher.place | Cham, Switzerland | en |
dc.description.physicaldescription | xxxi, 722 pages ; 22 cm | en |
Appears in Collections: Book Chapters
Files in This Item:
There are no files associated with this item.
Items in ANROWS library are protected by copyright, with all rights reserved, unless otherwise indicated.