Counterterrorism Procedures and Face Recognition Technology: A Matter of Scientific Progress

In honour of ‘Azadi ka Amrit Mahotsav,’ the commemoration of 75 years of India’s independence, the first-ever ‘Artificial Intelligence in Defence (AIDef)’ conference and exhibition was held on 11 July 2022, featuring 75 AI-based innovations. The armed forces, academic institutions, industry, and defence-technology start-ups participated, and the event showcased four years of work on AI and AI-based technologies. The objectives are to speed up decision-making, improve cybersecurity, reinforce perimeter security, enable predictive maintenance, and employ natural language processing (NLP) algorithms to provide real-time translation for soldiers encountering opponents along disputed borders.

In February 2018, the Ministry of Defence (MoD) launched an AI Task Force, which published its findings and recommendations that June. Earlier, on 19 January 2018, Professor V. Kamakoti’s Task Force on AI for India’s Economic Transformation had highlighted ten areas where AI may be employed. The national-security-related themes include autonomous surveillance and warfare, adaptive communication, cyber-attack mitigation and counterattack, and multi-sensor data fusion.

In response to these proposals, the Raksha Mantri-led Defence AI Council (DAIC) and the Defence AI Project Agency (DAIPA) were created in February 2019. The DRDO, academic institutions, and private-sector firms will execute the DAIC’s policy recommendations. The defence public sector undertakings (DPSUs) produced a roadmap for applying AI in defence in August 2019 and built 40 AI products by March 2. Of the 75 products presented at AIDef, 15 were C4ISR-based, 10 were autonomous and unmanned robotic systems, and 10 were intelligent monitoring systems; the rest covered manufacturing and maintenance, process-flow automation, natural language processing, AI platform automation, perimeter security, the internet of things, operational data analytics, and other artificial intelligence applications. Among them, iSentinel and Silent Sentry employ Facial Recognition Technology (FRT).

The iSentinel can “historically track persons” and “detect emotions, facial expressions, and body language for conflict, restlessness, and sweat” to identify possible threats. Silent Sentry incorporates “person detection” and “facial recognition.” Both products may employ FRT, a form of artificial intelligence that measures a person’s unique facial traits (80 in total, according to one study), such as the distance between the eyes, forehead size, and chin breadth. These measurements are matched against a database to identify the person in front of the camera.
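The measure-and-match step described above can be sketched in a few lines. This is a minimal illustration only: the database names, the 128-dimensional embeddings, the cosine-similarity metric, and the threshold value are all hypothetical stand-ins for what a real FRT pipeline would compute from facial measurements.

```python
import numpy as np

# Hypothetical face embeddings; real systems derive such vectors from facial
# measurements (eye distance, forehead size, chin breadth, etc.) via a
# trained model. Names and dimensions here are invented for illustration.
rng = np.random.default_rng(0)
database = {
    "resident_A": rng.normal(size=128),
    "resident_B": rng.normal(size=128),
}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, db, threshold=0.5):
    """Return (best-matching identity, score), or (None, score) if no
    candidate clears the similarity threshold."""
    best_name, best_score = None, -1.0
    for name, emb in db.items():
        score = cosine_similarity(probe, emb)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# A probe close to an enrolled embedding should match it.
probe = database["resident_A"] + rng.normal(scale=0.1, size=128)
print(identify(probe, database))
```

The threshold is the critical design choice: set it too low and strangers are matched to residents; set it too high and genuine residents go unrecognised.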

The Indian Army might use FRT in counter-insurgency/counter-terrorism (CI/CT) operations in Kashmir and the North East, though there are hurdles. FRT could be used to identify possible threats near a company operating base (COB). AI solutions currently on the market, however, are designed for permanently installed systems and work mainly as early-warning systems for defensive actions.


An effective FRT system requires a digital library of resident or terrorist face data, a high-quality camera to photograph persons around the camp, reliable and fast connectivity, and a strong processor to run the mapping and matching algorithms and produce real-time results. Keeping all of this running on a dependable power supply in a CI/CT environment is difficult but feasible.

Using FRT software in CI/CT requires a strong data set and trustworthy software, but there are other challenges. Before anything else, the mode of identification must be decided. If the purpose is to “negatively identify,” then everyone who does not match the resident database is treated as a militant and handled accordingly. Urbanisation and the need for jobs are driving more people to cities, and because individuals move between rural and semi-urban areas, the picture library must be accessible across the Union Territory (UT) and perhaps beyond.
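The “negative identification” logic, and why it is so risky, can be shown in a short sketch. The database contents and function names are hypothetical; the point is the failure mode, not the implementation.

```python
# Hypothetical "negative identification" screen: anyone whose face does not
# match the resident database is flagged. The failure mode is built in: a
# newly arrived migrant worker who is simply absent from the database is
# flagged exactly like a genuine threat.
RESIDENT_DB = {"resident_1", "resident_2", "resident_3"}

def screen(identity, resident_db):
    """identity: the name returned by an FRT matcher, or None if no match."""
    if identity in resident_db:
        return "pass"            # matched a known resident
    return "flag_for_review"     # unmatched -- could be a militant OR a newcomer

print(screen("resident_2", RESIDENT_DB))      # enrolled resident passes
print(screen("migrant_worker", RESIDENT_DB))  # lawful newcomer is flagged too
print(screen(None, RESIDENT_DB))              # no match at all is flagged
```

This is why the text stresses that the picture library must track population movement: under negative identification, an incomplete database converts ordinary mobility into false alarms.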

To “positively identify” a terrorist, militant, or over-ground worker (OGW), a database of such people is needed, and a lack of updated images of these individuals may hinder the effort. The Army must also consider the legal implications of creating and maintaining such a database. In Jammu and Kashmir (J&K), only two FRT projects are operational. The first is an FRT installation project run jointly by the Srinagar Municipal Corporation and the Jammu and Kashmir Police. The second is run by HUD to check new residents’ paperwork.

Activists question the legality of municipal, state, and federal governments’ use and maintenance of FRT. Poor visibility, fluctuating conditions, varying image quality, and the complexity of the learning approach complicate matters further. AI-based systems produce false positives and false negatives, and in CI/CT the outcomes may be life-or-death. FRT-based systems must therefore be tested, analysed, and adjusted before real-world deployment.
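The false-positive/false-negative trade-off mentioned above is governed by the match threshold. A toy sketch makes the tension concrete; every score below is synthetic and invented for illustration, and a deployed system would have to be calibrated on real field data.

```python
# Synthetic similarity scores: "genuine" are same-person comparisons,
# "impostor" are different-person comparisons. All values are invented.
genuine  = [0.92, 0.88, 0.58, 0.81, 0.67]
impostor = [0.40, 0.71, 0.30, 0.62, 0.21]

def error_rates(threshold):
    """Return (false-positive rate, false-negative rate) at a threshold."""
    fp = sum(s >= threshold for s in impostor) / len(impostor)  # wrong matches
    fn = sum(s < threshold for s in genuine) / len(genuine)     # missed matches
    return fp, fn

for t in (0.5, 0.65, 0.8):
    fp, fn = error_rates(t)
    print(f"threshold={t}: false positives={fp:.2f}, false negatives={fn:.2f}")
```

Raising the threshold suppresses false positives (strangers matched as residents) but increases false negatives (residents treated as unknowns), and vice versa; in a CI/CT setting either error can be fatal, which is why field testing and tuning must precede deployment.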


FRT may be classified into two categories of individual recognition. The first is facial recognition proper, which matches a face against a database of identities. The second, “affect recognition,” seeks to infer someone’s emotions and intentions. Many FRT firms offer affect recognition, but it rests on shaky science and may not reflect a person’s real emotional state. Applying such poor scientific standards in CI/CT might lead to mistaken identifications. The recorded data must also be protected against hacking, falsification, and alteration. All of these challenges make FRT implementation risky, and the Indian Army must overcome them before deploying AI-based CI/CT systems.