Can blockchain and artificial intelligence (AI) be used to combat deepfakes and restore public confidence in the system? We believe they can.
What is a "deepfake"? The term combines 'deep learning' and 'fake'. According to Wikipedia, it's a technique for human image synthesis based on AI, used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network.
Because of these capabilities, deepfakes have been used to create fake X-rated videos, fake news, malicious hoaxes and financial fraud. The combination of the existing and source videos results in a fake video showing a person or persons performing an action at an event that never occurred in reality.
Deepfakes can influence public opinion and election results, trigger ethnic violence, or escalate situations into armed conflict. On a personal level, deepfakes have been used to create fake X-rated videos of celebrities, videos which damaged those people's reputations once posted online, even though they weren't real.
The deepfake mechanism
According to John Villasenor, a professor of electrical engineering at the University of California, anybody who has a computer and access to the Internet can technically produce deepfake content.
Villasenor told CNBC that the technology “can be used to undermine the reputation of a political candidate by making the candidate appear to say or do things that never actually occurred. Deepfakes are a powerful new tool for those who might want to (use) misinformation to influence an election.” Paul Barrett, adjunct professor of law at New York University, seconds that, explaining that deepfakes are falsified videos made by means of deep learning.
Deep learning is a subset of AI and refers to arrangements of algorithms that can learn and make intelligent decisions on their own. Peter Singer, cybersecurity and defense-focused strategist and senior fellow at New America think tank, believes that the danger of deepfakes comes from the fact that this technology can be used to make people believe something is real when it is not.
How deepfakes are created
A deep-learning system can produce a persuasive counterfeit by studying photographs and videos of a target person from multiple angles, and then mimicking that person's behavior and speech patterns. After multiple rounds of detection and improvement, the deepfake video is completed, industry experts say.
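The "multiple rounds of detection and improvement" described above are the core of the adversarial idea behind deepfake generation: a forger keeps adjusting its output until a detector no longer flags it. A deliberately toy sketch (the "data" is a single number, and both forger and detector are trivial; a real GAN trains two neural networks against each other):

```python
# Toy sketch of the iterative "detect and improve" loop behind deepfake
# generation. All numbers are illustrative: "real" data sits near 5.0,
# the forger starts far away, and the detector flags anything too far
# from the real value.

REAL_MEAN = 5.0

def detector(sample: float) -> bool:
    """Return True if the sample is flagged as fake (too far from real data)."""
    return abs(sample - REAL_MEAN) > 0.5

forgery = 0.0   # the forger's current attempt
step = 0.25     # how much the forger adjusts per round
rounds = 0

# Each round: the detector critiques the forgery, the forger nudges it closer.
while detector(forgery):
    forgery += step if forgery < REAL_MEAN else -step
    rounds += 1

print(rounds, forgery)  # the forgery now passes the detector
```

In a real generative adversarial network the detector also improves each round, which is exactly why, as noted later in this article, detection keeps getting harder as generation gets better.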
According to MIT Technology Review, the technology that enables deepfakes can be “a perfect weapon for purveyors of fake news who want to influence everything from stock prices to elections. AI tools are already being used to put pictures of other people’s faces on the bodies of porn stars and put words in the mouths of politicians", wrote Martin Giles, San Francisco bureau chief of MIT Technology Review.
Fighting deepfakes with blockchain technology
Blockchain expert Kevin Gannon, PwC blockchain tech lead and solutions architect: "When it comes to the area of deepfake, emerging technologies like blockchain can come to the fore to provide some levels of security, approval and validation. Blockchain has typically been touted as a visibility and transparency play, where once something is done, the "who" and "when" becomes apparent. But it can go further! When a user with a digital identity wants to do something, they could be prompted for proof of their identity before access to something (like funds) can be granted."
Gannon continues: "From another angle, the actual authenticity of video, audio files can be proven via a blockchain application where the hash of certain files can be compared against the originals. Though, it is not a silver bullet, and as always, the adoption and applicability of the technology in the right way is key.
From a security perspective, more open data mechanisms (like a public ledger) have an increased attack surface, so inherent protection can't just be assumed. But enhancing security protocols around the approvals process, where smart contracts could also come into play, can strengthen such processes. In addition, at a more technical level, applying multiple-signature transactions in the process can mean that even if one identity is compromised, more than one identity is needed to provide ultimate approval."
Detecting deepfakes through Artificial Intelligence
While artificial intelligence (AI) can be used to make deepfakes, it can also be used to detect them, Villasenor wrote. Since this technology has become more and more accessible to computer users, an increasing number of researchers are focusing on deepfake detection, looking at efficient ways to combat and regulate it. Even large corporations such as Facebook and Microsoft have taken initiatives to detect and remove deepfake videos.
According to Reuters, the two companies announced earlier this year that they would collaborate with top universities across the U.S. to create a large database of fake videos for research. “Presently, there are slight visual aspects that are off if you look closer, anything from the ears or eyes not matching to fuzzy borders of the face or too smooth skin to lighting and shadows,” said Singer, who added that detecting these “tells” is getting harder as deepfake technology becomes more advanced and the videos look more realistic.
Dr. Alexander Adam shares more insights on how deepfakes can be counteracted with the help of AI. “Machine learning algorithms are great at recognising patterns in large amounts of data. ML can provide a way to detect fake audio from real audio by using classification techniques that work by showing an algorithm large amounts of deepfake and real audio and teaching it to distinguish the difference in the frequency composition between the two.
For example, by using image classification on the audio spectrograms you can teach an ML model to ‘spot the difference’. However, as far as I am aware no out-of-the-box solution exists yet.”
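Adam's "spot the difference" approach can be sketched with a toy, stdlib-only example. The assumption here (purely illustrative) is that real audio carries high-frequency content that a synthetic voice attenuates; the feature is high-band energy from a naive DFT, and the "model" is a simple midpoint threshold learned from one real and one fake example. A production system would use spectrogram images and a trained neural classifier instead.

```python
import cmath
import math

N = 64  # samples per clip

def clip(high_freq_amp: float) -> list:
    """Synthesize a clip: a low tone plus a variable high-frequency tone."""
    return [math.sin(2 * math.pi * 3 * t / N)
            + high_freq_amp * math.sin(2 * math.pi * 20 * t / N)
            for t in range(N)]

def high_band_energy(samples: list) -> float:
    """Sum of DFT magnitudes above bin 10: the 'frequency composition' feature."""
    spectrum = [abs(sum(s * cmath.exp(-2j * math.pi * k * t / N)
                        for t, s in enumerate(samples)))
                for k in range(N // 2)]
    return sum(spectrum[10:])

# "Training": measure the feature on labelled examples, pick a threshold.
real_energy = high_band_energy(clip(high_freq_amp=1.0))
fake_energy = high_band_energy(clip(high_freq_amp=0.1))
threshold = (real_energy + fake_energy) / 2

def looks_fake(samples: list) -> bool:
    return high_band_energy(samples) < threshold

print(looks_fake(clip(0.05)))  # True: too little high-frequency content
print(looks_fake(clip(0.9)))   # False
```

The design choice mirrors Adam's description: show the algorithm labelled real and fake audio, let it learn a boundary in frequency space, then classify unseen clips against that boundary.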
Adam continues: “In part, this may be because audio deepfakes haven’t been regarded as as much of a threat as video deepfakes. Audio deepfakes are not pitch-perfect, and you should be able to tell the difference if it’s tailored to a specific person that you know.
That said, interference across phone lines or staging general ‘outside’ background noise could probably be used to mask a lot of this. And as there has been so much high-profile media attention on deepfake videos, the public is perhaps less aware of the potential risks of audio deepfakes. So, if you have a reason to be suspicious, you should always validate that it’s who you think it is.
How is the future shaping up in this regard? “We expect that the creation and use of audio deepfakes for malicious purposes will increase in the coming years and become more sophisticated.
This is because there is a better understanding of machine learning models and of how to transfer a model trained on one person to another and train it quickly. But it’s worth noting that as the generation of deepfake content gets better, typically so do the detection methods”, concludes Dr. Alexander Adam.
Modex BCDB, a new take on blockchain technology
Currently, the majority of blockchain solutions on the market are oriented towards blockchain as a service, limiting themselves to a rigid view and application of the technology. A company, or its CTO (Chief Technology Officer), may realize after some research that implementing blockchain could solve several issues for the business and streamline back-end processes.
The problem is that in order for a company to implement blockchain technology through its own tech team alone, it needs to invest a significant amount of time and resources to study what type of blockchain is best suited to its needs, commence a lengthy process of learning the development specifics of the respective blockchain, and scout for developers proficient in the technology.
Modex BCDB is a new take on blockchain technology which removes the need to invest resources in blockchain training and facilitates fast adoption of the technology in businesses. The solution proposed by Modex is a middleware that fuses a blockchain with a database to create a structure which is easy to use and understand by developers with no prior knowledge in blockchain development.
As a result, any developer who knows how to work with a database system can operate our solution without needing to change their programming style or learn blockchain development. Through its blockchain component, Modex BCDB can transform, with minimal changes, any type of database into a decentralized database which holds the valuable characteristics inherent to blockchain technology: transparency, increased security, data immutability, and integrity.
Modex BCDB doesn’t work by deleting the existing database or data entries. The database is maintained intact throughout the process; data integrity is ensured by calculating the metadata of the records and storing it on the blockchain.
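A minimal sketch of this integrity mechanism, under the assumption that the "metadata" stored on-chain is a digest of each record (the source does not specify the exact scheme, and the blockchain is mocked here as an append-only list): the record stays in the ordinary database, only its hash is anchored, and reading the record back lets you recompute the hash and detect tampering.

```python
import hashlib
import json

def record_digest(record: dict) -> str:
    """Hash a record via canonical JSON (sorted keys) so equal records hash equally."""
    canonical = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# The ordinary database keeps the full records, untouched.
database = {1: {"owner": "alice", "balance": 100}}

# Mock blockchain: an append-only list of anchored digests.
chain = [record_digest(database[1])]

def verify(record_id: int) -> bool:
    """Recompute the stored record's digest and check it against the chain."""
    return record_digest(database[record_id]) in chain

print(verify(1))                # True: record untouched

database[1]["balance"] = 999    # tamper with the database directly
print(verify(1))                # False: digest no longer matches the chain
```

Because only digests go on-chain, the approach keeps the original data where it is while still giving the immutability check described above.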
The system does not restrict access to the blockchain or to the database, so when developers need to run reports or ETL transformations, they can perform warehouse analytics by accessing the database directly. This is because Modex BCDB has been purposely designed to be agnostic. With our solution, clients are able to set up a network regardless of the type of database employed. In a consortium, each company can maintain whatever type of database it prefers and connect them through a blockchain-powered network, ensuring cohesion while protecting corporate interests.