Deepfakes & legal evidence – how will the quest for authenticity be navigated?

In a 1745 English judgment, Lord Hardwicke stated, “The judges and sages of the law have laid it down that there is but one general rule of evidence, the best that the nature of the case will allow”. This has come to be known as the best evidence rule.

We have moved a long way since the 1849 English case where a tombstone was produced as a document proving that a person was deceased.

It used to be said that a document cannot be proved except by production of the document itself. The drive towards a paperless society means that “the document itself” has little remaining meaning. Some legislation defines “writing” as including any communication by an appropriate electronic medium that is accurately and readily reducible to a written or printed form. That recognises that there may be no such thing as an original of a document or of an electronic communication.

What, therefore, does it mean to say that the best evidence that can be produced in the circumstances must be produced? The court will always exercise a discretion whether to accept secondary evidence as accurate evidence of the authenticity of a document or other form of writing, such as a text message. These issues are not new and go back to the days before faxes and photocopying machines.

The peril of deepfake technology

The emergence of deepfake technology casts a long shadow over the sanctity of evidence, a cornerstone upon which the edifice of justice is built. As these advanced digital creations become indistinguishable from reality, they threaten to undermine the fairness of trials and the justice system at its core.

A deepfake is synthetic media in which a person’s likeness, including their face, voice, and mannerisms, is superimposed onto or substituted for another individual in a video or audio recording, using advanced artificial intelligence and machine learning techniques. This technology leverages algorithms known as neural networks to analyse and replicate the details of human expressions, movements, and speech patterns with high precision. Initially developed for legitimate applications, such as de-aging actors in the film industry, deepfakes have gained notoriety for their potential misuse in creating misleading or harmful content, such as fake news, political disinformation, or non-consensual explicit material.
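By way of illustration, the original face-swap deepfakes were built from a pair of autoencoder neural networks sharing a single encoder: the encoder learns a person-independent representation of a face, and each decoder learns to render that representation as one specific person. The minimal PyTorch sketch below illustrates only the shape of that design; it is untrained, and the image size and layer widths are arbitrary choices made for brevity.

import torch
import torch.nn as nn

# Shared encoder: compresses any face image into a person-independent code.
encoder = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64 * 3, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
)

def make_decoder():
    # Each decoder is trained on images of one person only, so it learns
    # to render any face code as that particular person.
    return nn.Sequential(
        nn.Linear(128, 512),
        nn.ReLU(),
        nn.Linear(512, 64 * 64 * 3),
        nn.Sigmoid(),
        nn.Unflatten(1, (3, 64, 64)),
    )

decoder_a = make_decoder()  # would be trained only on person A's faces
decoder_b = make_decoder()  # would be trained only on person B's faces

# The swap: encode a frame of person A, then decode it as person B,
# producing person B's face with person A's expression and pose.
frame_of_a = torch.rand(1, 3, 64, 64)        # stand-in for a real video frame
fake_frame = decoder_b(encoder(frame_of_a))
print(fake_frame.shape)                      # torch.Size([1, 3, 64, 64])

Once trained on enough footage of both people, a swap of this kind is what makes the resulting video so difficult to distinguish from a genuine recording.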

Technological improvements have made it possible to generate realistic synthetic media with less data, lower computational costs, and minimal expertise. These technologies can now produce high-quality fakes by learning from vast amounts of available digital content, making it increasingly difficult to distinguish between real and synthetic media. This democratisation of AI tools has raised concerns over the potential for their use in misinformation campaigns, personal attacks, and the erosion of trust in digital content.

Admissibility challenges

The potential for deepfakes to prejudice judicial processes is significant. They can be used to fabricate evidence that appears highly convincing, potentially leading to wrongful judicial findings based on falsified evidence. The South African legal system’s principle of fairness and the right to a fair trial, enshrined in the Constitution, could be undermined by the use of such deceptive evidence.

The question of faked documents

The problem in relation to faked documents (not deepfakes) was illustrated in


Patrick Bracher | Director | Norton Rose Fulbright

Tristan Marot | Associate | Norton Rose Fulbright


The full article by Patrick Bracher and Tristan Marot, together with a host of other topical management articles written by professionals, consultants and academics, appears in the April/May 2024 edition of BusinessBrief.




 


