Not only are police using AI to catch criminals; criminal gangs are also using the technology to create child sexual abuse material and to commit financial fraud.
According to Constable Phillipe Gravel, an investigator with the RCMP's National Child Exploitation Coordination Centre (NCECC), this is the first case the police force has documented of AI being used to create child sexual abuse material.
The RCMP says it is troubling that criminals are using the technology to produce abusive content involving children in exchange for payment or other benefits, and it expects such cases to grow as offenders become more adept with new tools.
The consequences of AI-enabled crime extend beyond child victims. In today’s digital culture, posting images and videos of nearly every aspect of life has become second nature, but the practice has also inadvertently aided criminals: AI tools can extract personal details, locations, and even images of the children of people who post online.
Gravel advised people to think carefully before sharing any information online, especially images involving children. “The best way to avoid exposing yourself or your children is to not post,” he added.
The use of artificial intelligence in both crime and policing is on the rise, and the RCMP itself is turning to the technology to improve its operational efficiency.
The Canadian Anti-Fraud Centre (CAFC) has also noted the use of AI in scams and fraud, according to Jeff Horncastle, a communications officer at the centre, with evidence of the technology being used in scam calls and fake investment opportunities.
Voice spoofing and deepfake technology are becoming increasingly sophisticated, making elaborate scams more convincing than ever.
To avoid falling victim to scams, Horncastle advises people to be cautious, not to react hastily, and to always verify the authenticity of suspicious communications. “Scammers often create a sense of urgency, panic or fear to prey on victims,” he said.
AI in the fight against crime
AI is also being used actively in police investigations. The NCECC uses the technology to speed up investigations into online child sexual abuse, helping to identify children and remove them from abusive environments. It also lets investigators flag illegal images quickly, reducing manual review and protecting the mental health of staff.
In this context, the RCMP’s National Technology Onboarding Program (NTOP) ensures that the use of operational technologies such as AI complies with privacy and transparency laws. “Our team is motivated to help all areas of the RCMP gather the information they need to ensure they are using the tools appropriately and effectively,” said Michael Billinger, NTOP’s head of transparency and access.
NTOP was established in 2021, following privacy concerns over the RCMP’s use of Clearview AI facial recognition technology, to ensure responsible use of such tools and to increase transparency with the public. It recently released its “Transparency Blueprint: Operational Technology Overview.”
The RCMP is changing its approach to privacy and is committed to increased transparency to build public trust.
The development of AI is opening up opportunities as well as challenges in security and finance, requiring authorities and the public alike to keep adapting to ensure safety and effectiveness.