
Do AI agents foretell the next wave of autonomy or liability? | Analysis Premium
The Hindu
Assistants based on artificial intelligence (AI), such as Apple’s Siri and Amazon’s Alexa, have been around for more than a decade. An AI assistant can be defined in many ways. In an April 2024 report, Google DeepMind defined “an AI assistant … as an artificial agent with a natural language interface, the function of which is to plan and execute sequences of actions on the user’s behalf across one or more domains and in line with the user’s expectations”.
The next-generation AI assistants are called AI agents (AIA) and are set to surpass their predecessors in ability as well as efficiency. AIAs can be broadly classified into three categories.
Reactive agents are first-generation AI agents developed to respond to specific inputs or commands. They follow predefined rules and perform tasks limited in scope as they can’t learn anything new and lack the ability to adapt. Learning agents emerged with machine learning, which allowed them to learn from experience. They have better abilities, such as pattern detection and data analysis, and can improve their performance over time. Finally, cognitive agents can reason, analyse, and plan. They have cognitive skills because they can learn from their environment, and adapt and make decisions based on algorithms and their own ‘knowledge’. These agents use techniques including natural language processing, computer vision, and deep learning to perform tasks.

The present generation of AIAs are cognitive agents. AIAs can perform multiple functions as users’ agents or autonomously (that is, without instructions or user intervention). They can be integrated with the ‘internet of things’, allowing them to connect with multiple devices and their sensors, and to collect and analyse data in real time.
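The distinction between the first two categories can be made concrete with a minimal sketch. This is an illustrative toy, not from the article: the class names, commands, and feedback scheme are all invented for the example. A reactive agent maps fixed inputs to fixed outputs, while a learning agent adjusts its behaviour based on feedback it accumulates over time.

```python
class ReactiveAgent:
    """Follows predefined rules only; it cannot learn or adapt."""
    RULES = {
        "lights on": "turning lights on",
        "lights off": "turning lights off",
    }

    def act(self, command):
        # Unknown commands fall through to a fixed fallback response.
        return self.RULES.get(command, "command not recognised")


class LearningAgent:
    """Improves from experience: remembers which responses earned
    positive feedback and prefers them the next time."""

    def __init__(self):
        # Maps (command, response) pairs to cumulative feedback scores.
        self.scores = {}

    def act(self, command, options):
        # Choose the historically best-rated response for this command;
        # ties are broken by the order the options are listed in.
        return max(options, key=lambda r: self.scores.get((command, r), 0))

    def give_feedback(self, command, response, reward):
        key = (command, response)
        self.scores[key] = self.scores.get(key, 0) + reward
```

The reactive agent’s behaviour is frozen at design time, whereas the learning agent’s preferences shift as feedback accumulates — the same gap, scaled up enormously, separates rule-based assistants from the machine-learning systems the article describes.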
Cognitive AIAs can also ‘understand’ human speech and language, a skill that lets them perform tasks requiring multiple proficiencies. For example, they can plan a trip after listening in on a user’s phone calls and reading their emails, understanding their preferences, and parsing their previous travel experiences.
Recently, a Bengaluru-based startup launched an AIA that could autonomously handle items in a warehouse. It receives inputs as voice commands and responds with real-time decisions.
Companies and research facilities have also deployed AIAs to drive autonomous vehicles and to guide financial investments and treatment plans. A tool called Orby AI automates repetitive tasks, while 4149 AI collaborates with humans inside apps like Slack and Notion to improve their productivity.
In sum, cognitive AIAs are not limited to their training data, are able to acquire new knowledge without human intervention, and can integrate with other systems. In turn, they enable personalisation by tailoring their responses to users’ preferences and needs. But in doing so, cognitive AIAs also pose many risks.