AI explainability, the art of explaining why an AI model makes the predictions it does, is all the buzz these days. Unlike traditional algorithms, we don't really understand what goes on inside AI systems: we can peer into these models as they operate, or log their actions, but in many cases we cannot explain exactly why they make the decisions they do.
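To make the contrast concrete, here is a minimal sketch (pure Python, with entirely hypothetical rules and weights) of why a traditional rule-based algorithm is easy to explain while even a tiny learned model is not: the rule-based classifier can report exactly which rule fired, whereas the learned model can only point at a weighted score.

```python
# Contrast: a rule-based classifier whose decision is fully traceable
# versus a tiny learned model whose "reason" is just opaque weights.
# All rules, features, and weights below are hypothetical illustrations.

def rule_based_spam(text):
    # Traditional algorithm: every decision maps to an explicit rule.
    rules = [("free money", "matched phrase 'free money'"),
             ("click here", "matched phrase 'click here'")]
    for phrase, reason in rules:
        if phrase in text.lower():
            return True, reason           # we can say exactly why
    return False, "no spam rule matched"

def learned_spam(features, weights, bias=-1.0):
    # Learned model: the decision is a weighted sum; the weights were
    # fit from data and carry no individually human-readable meaning.
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return score > 0, f"score={score:.2f} (why? the weights say so)"

print(rule_based_spam("Get FREE MONEY now"))
print(learned_spam([1, 0, 3], [0.7, -0.2, 0.4]))
```

The first classifier's "explanation" is a human-readable rule; the second's is a number, which is the gap explainability research tries to close.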