Gangtok, March 28 -- As AI is democratized across enterprises, it is imperative to make the inner mechanisms of the AI ecosystem more comprehensible to those who are not on the front lines. The need arises because AI is often treated as a "black box": attention centres on inputs and outputs, and data scientists invest little effort in making the systems explainable. This is exacerbated by the fact that deep learning models are highly abstract even to skilled data scientists. Some popular large language models have over 100 billion parameters, and it is a herculean task to shed light on how such models fundamentally operate.

Why simplify AI

The need for simplifying AI to make it m...