A majority of the United States population owns devices such as laptops, smartphones, and PCs. Nowadays, it is not uncommon for children as young as 10 to own a smartphone, and more people gain access to the internet every year. The internet, however, is not safe; many dangers lurk behind our screens. One such threat is the presence of online child predators.
The process of catching online predators often depends on human officers communicating directly with suspects. Officers from the Internet Crimes Against Children (ICAC) task forces are trained to develop personas of minors as young as 12 and then carry out conversations with potential predators online, building toward an arrest. This work can take a toll on mental health, which is why wellness programs such as the one run by the Innocent Justice Foundation provide aid and support to individuals exposed to child sexual abuse through their work.
With the advancement of machine learning research and applications, it is high time that law enforcement put these tools to work against online crime, starting with online child sex crimes. One way to apply artificial intelligence in this field is to create AI chatbots and deploy them across online platforms such as social media. The goal of these chatbots is to expose online predators and any potential signs of online child sex trafficking. Each chatbot would be tailored to the platform it runs on: every social media platform has its own culture and attracts certain types of users, so a chatbot that runs on Facebook will be different from one that runs on Instagram.
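As a rough illustration of what "tailored to the platform" could mean in practice, the sketch below shows one hypothetical way to represent platform-specific personas in Python. The field names, ages, interests, and platform values are all assumptions made for the example, not details from any real deployment.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PersonaConfig:
    """Platform-specific persona for a decoy chatbot (illustrative only)."""
    platform: str          # e.g. "facebook" or "instagram"
    stated_age: int        # age the persona presents as
    interests: List[str]   # hobbies the persona brings up in conversation
    slang_style: str       # rough label for the platform's texting culture
    posting_habits: str    # what kind of content the persona appears to post

def build_persona(platform: str) -> PersonaConfig:
    """Return a persona tuned to the culture of the given platform.
    The specific values are placeholders chosen for illustration."""
    if platform == "facebook":
        return PersonaConfig(platform, 13, ["gaming", "soccer"],
                             slang_style="full sentences, occasional emojis",
                             posting_habits="shares memes in groups")
    if platform == "instagram":
        return PersonaConfig(platform, 13, ["dance trends", "photography"],
                             slang_style="short captions, heavy emoji use",
                             posting_habits="posts stories daily")
    raise ValueError(f"No persona template for platform: {platform}")
```

A setup like this would let each platform's decoy differ in the details that matter (age presentation, interests, texting style) while sharing the same underlying conversation logic.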
These chatbots must be able to mimic how a minor communicates and uses social media. To do so convincingly, they must keep up with new trends among minors, such as hobbies and texting slang, and they must be able to carry on conversations with potential predators with the goal of exposing them. A chatbot should never push the person it is talking with toward criminal acts; the potential predator must be the one to lead the conversation and steer it toward illicit activity. To carry out these procedures, each chatbot must also randomize its behavior, making it difficult for anyone to identify it as a bot.
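The sketch below illustrates, under heavy simplification, the two behaviors described above: a purely reactive reply policy in which the bot never introduces or encourages illicit topics, and randomization of wording and timing so replies do not look machine-generated. The keyword list and canned replies are hypothetical placeholders; a real system would need a trained classifier, human review, and careful legal oversight to avoid anything resembling entrapment.

```python
import random
import time

# Hypothetical cue phrases; a real system would use a trained classifier
# and route flagged exchanges to a human investigator.
ESCALATION_CUES = ["meet up", "send a pic", "keep this secret"]

def is_escalation(message: str) -> bool:
    """Rough placeholder check for whether the other party is steering
    the conversation toward illicit territory."""
    lowered = message.lower()
    return any(cue in lowered for cue in ESCALATION_CUES)

def choose_reply(message: str, persona_interests: list) -> str:
    """Reactive reply policy: the bot never suggests or encourages
    illicit acts; it only deflects and lets the other party lead."""
    if is_escalation(message):
        # Deflect without encouraging; the exchange would be logged
        # for human review in a real deployment.
        return random.choice(["idk...", "im not sure about that",
                              "maybe, what else is up"])
    # Otherwise keep small talk going around the persona's interests.
    topic = random.choice(persona_interests)
    return random.choice([f"been really into {topic} lately",
                          f"do u like {topic} too?"])

def humanized_delay() -> None:
    """Randomized typing delay so replies don't arrive at machine speed."""
    time.sleep(random.uniform(2.0, 15.0))
```

The key design point is that the bot's only proactive behavior is small talk; anything incriminating has to come from the other side of the conversation.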
With the emergence of this kind of AI chatbot, the internet could become a safer place for children. This technology could also be valuable to law enforcement task forces such as ICAC: it would eliminate the cost of training officers to develop a minor's persona, and it would reduce the risk of trauma to officers who would otherwise carry out these conversations with potential predators themselves.
While technology of this kind for catching online child predators is not yet common, this is not the first time software has been used to detect online sex crimes. One example is Sweetie, an internet avatar modeled on a Filipina child that was used to lure child predators online. Sweetie specialized in using AI motion capture to catch predators who use webcams. For more information on Sweetie, see https://en.wikipedia.org/wiki/Sweetie_(internet_avatar)