Combating the Rising Criminal Use of AI with Digital Intelligence – HS Today

While society has seen vast and varied benefits from Artificial Intelligence (AI), the technology has also enabled criminals to expand their reach into the digital realm. Perpetrators of child and elder abuse use AI to change their appearance and voice to target and manipulate these vulnerable populations. Such AI-generated images and data have been linked to increased fraud that is only expected to worsen.
The increasing criminal use of AI poses a growing number of challenges to agencies as they look to capture and convict bad actors. Most agencies do not have the resources required to handle the increased volume of cases involving AI, and those that do face an uphill battle with differing regulations across jurisdictions. Additionally, federal legislation on AI lags behind the pace of its criminal use, leaving agencies largely on their own to navigate these evolving challenges.
As these trends continue, law enforcement agencies must have access to digital intelligence capabilities that can identify AI-generated or enabled data in evidence.
Deepfakes and Fraudulent Money Transfers
The Federal Bureau of Investigation (FBI) recently published an alert warning of the increasing use of AI to generate phony videos for use in sextortion schemes that attempt to coerce minors and non-consenting adults into paying ransoms or complying with other demands. Criminals are using the technology with such sophistication that a single image of a person’s face can be used to create realistic, fake videos in that person’s likeness, otherwise known as “deepfakes.”
Sextortion can start on any site, app, messaging platform or game where people meet and communicate. In many cases, the perpetrator will contact a minor and claim to already have content depicting obscene activity that they will share if the victim does not fulfill their requests. Often, these crimes involve young people being led to believe they are communicating with someone of similar age who is interested in a relationship.
Scammers can easily obtain images from social media accounts and alter them with AI deepfakes to create images that appear true to life. These altered images of victims, usually children, are copied and circulated, with the victim receiving threatening and harassing messages alongside the altered images.
Agencies have also reported an increase in cryptocurrency and fraud cases utilizing AI. Criminals use AI chatbots and assistants to mimic human conversations with investors, leading them to believe they are speaking with a real person. They will often leverage these platforms to promote fake tokens and offer fraudulent investment opportunities. Criminals also utilize AI-generated social media content to inflate the value of tokens while making a profit by selling their holdings, a scheme known as a “pump and dump.” Beyond investing, perpetrators use AI to modify their voices to mimic someone a victim trusts and then deceptively request instant money transfers.
Utilizing Digital Intelligence to Secure Convictions
Collaboration between law enforcement agencies and technology providers is critical to expediting justice in cases involving AI. Digital intelligence solutions allow agencies to identify AI-generated images and audio files quickly and accurately, ultimately helping law enforcement personnel process massive volumes of data without manually searching thousands of digital artifacts per case.
Digital intelligence technology can also identify patterns and evidence that human analysts might miss. For example, deep learning algorithms are valuable for analyzing digital images and videos and identifying objects, people, and patterns relevant to an investigation, especially when they are not immediately apparent. This is especially valuable in cases involving sextortion using deepfakes.
Traditionally, investigators and officers have closely scrutinized every image on a confiscated device. In cases like those exploiting children, these images are traumatic for those reviewing the evidence. By automatically identifying, labeling, and bucketing relevant content, digital forensics technology can reduce trauma for investigators as they can refrain from manually sifting through thousands of obscene assets.
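The automatic identification and bucketing described above can be illustrated with a common forensic triage technique: matching file hashes against a curated list of content already confirmed relevant in prior cases, so investigators only review what the tool cannot classify. The sketch below is a minimal, hypothetical illustration (the hash list, file names, and bucket labels are invented for the example), not a depiction of any specific vendor's product:

```python
import hashlib

# Hypothetical hash list of content confirmed relevant in prior cases.
# (This value is the SHA-256 digest of the bytes b"test", used here
# purely as a stand-in for a real curated hash set.)
KNOWN_CONTENT_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def bucket_files(files):
    """Sort raw file blobs into 'flagged' and 'needs_review' buckets.

    files: dict mapping filename -> raw bytes from a seized device.
    Files whose digest matches a known hash are labeled automatically,
    reducing how much material an investigator must view manually.
    """
    buckets = {"flagged": [], "needs_review": []}
    for name, blob in files.items():
        digest = hashlib.sha256(blob).hexdigest()
        if digest in KNOWN_CONTENT_HASHES:
            buckets["flagged"].append(name)
        else:
            buckets["needs_review"].append(name)
    return buckets
```

In practice, production tools typically combine exact hash matching like this with perceptual hashing and machine-learning classifiers, so that altered or previously unseen content can also be surfaced without manual review of every asset.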
As criminals expand their capabilities with AI, law enforcement and government agencies must remain one step ahead. By investing in digital intelligence capabilities, agencies can not only expedite justice but also improve the experience for their hardworking investigators.
