British science fiction writer and futurist Arthur C. Clarke once said, “any sufficiently advanced technology is indistinguishable from magic”. Artificial Intelligence (AI) is bringing to life a host of real-world applications that had earlier been merely the subject of science fiction novels and movies.

AI-powered cars are already undergoing rigorous testing and are quite likely to ply the roads soon. The social humanoid robot Sophia became a citizen of Saudi Arabia in 2017.

Apple’s intelligent personal assistant, Siri, can receive instructions and interact with human beings in natural language. Autonomous weapons can execute military missions on their own, identifying and engaging targets without any human intervention.

In the words of John McCarthy, AI is the “science and engineering of making intelligent machines, especially intelligent computer programs”. As a burgeoning discipline of computer science, AI enables intelligent machines to execute functions resembling human abilities such as speech, facial, object and gesture recognition, learning, problem solving, reasoning, perception and response.

The term AI was coined in 1956, and early research in the 1950s was confined to problem solving and symbolic methods. The interest of the US Department of Defense steered the field towards mimicking basic human reasoning during the 1960s.1 The use of ‘neural networks’ dominated the period from the 1950s to the 1970s.
