AI (Artificial Intelligence) is defined, simply, as the use of computers to perform cognitive tasks typically performed by humans. These technologies and methods have been developed so that computers and machines can simulate what previously only the human brain could do (Zawacki-Richter et al., 2019).
AI essentially refers to computing technologies inspired by the ways people use their brains and nervous systems to reason and make decisions, though these technologies typically operate quite differently (Mehta et al., 2020).
According to Davenport and Ronanki (2018), AI applications fall into three categories: Business Process Automation — robotic processes dedicated to automating “back office” or administrative needs; Cognitive Insight — detecting patterns and themes in a set of data points; and Customer Engagement — chatbots using natural language, the category most widely used across industries.
“With the right planning and development, cognitive technology could usher in a golden age of productivity, work satisfaction, and prosperity” (Davenport & Ronanki, 2018).
The term “Artificial Intelligence” or AI was first introduced by John McCarthy in 1956 during a conference at Dartmouth College.
McCarthy proposed that “every aspect of learning [should be] so precisely described that a machine can be made to simulate it.”
AI has historically focused on linguistic, mathematical, and logical tasks. Within the next generation of AI, more emphasis is being placed on emotional intelligence. In the education industry, there are efforts to use AI to provide customized learning programs for each student (Mehta et al., 2020).