Words acquire meaning from adaptive, relational dependency and entanglement with other words and definitions in whichever language, mind, or culture they inhabit. Extrapolating this to sentences (and beyond, to narrative, ideology, or any other linguistically mediated system of belief), the combinatorial complexity of referential dependencies quickly becomes effectively unintelligible. A useful mnemonic and metaphor: because meaning is always in some sense displaced, offset, or deferred, it occupies a role in language similar to that of an electron hole in the physics of semiconductors. In a general sense, there is a binding systemic algebra and logic of emptiness and conspicuous absence at work here.
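This relational picture has a concrete analogue in the distributional hypothesis of computational linguistics: a word's "meaning" can be approximated entirely from the company it keeps, with no intrinsic definition anywhere. As a minimal illustrative sketch only (the toy corpus, window size, and similarity measure below are my own assumptions, not anything claimed in the argument itself), in Python:

from collections import Counter
from math import sqrt

# Toy corpus: each word's "meaning" will be represented purely by the
# company it keeps (its co-occurrence neighbours), never by any
# intrinsic definition. Corpus and window size are illustrative choices.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

def cooccurrence_vectors(sentences, window=2):
    """Map each word to a Counter of nearby words (its relational context)."""
    vectors = {}
    for sentence in sentences:
        tokens = sentence.split()
        for i, word in enumerate(tokens):
            ctx = vectors.setdefault(word, Counter())
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    ctx[tokens[j]] += 1
    return vectors

def cosine(a, b):
    """Similarity of two words' contexts; neither word has content in isolation."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

vecs = cooccurrence_vectors(corpus)
# "cat" and "dog" never define each other, yet their shared contexts
# make them similar: relational meaning without intrinsic content.
print(cosine(vecs["cat"], vecs["dog"]))

Note that "cat" and "dog" come out as similar without either word carrying any definition of its own; each vector is nothing but a record of deferrals onto other words, which is the relational point in miniature.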
Where O.G. systems thinker and proto-cybernetician Gregory Bateson cryptically asserted that information is “𝘵𝘩𝘦 𝘥𝘪𝘧𝘧𝘦𝘳𝘦𝘯𝘤𝘦 𝘵𝘩𝘢𝘵 𝘮𝘢𝘬𝘦𝘴 𝘢 𝘥𝘪𝘧𝘧𝘦𝘳𝘦𝘯𝘤𝘦”, I wonder if we might consider meaning as “𝘵𝘩𝘦 𝘢𝘣𝘴𝘦𝘯𝘤𝘦 𝘵𝘩𝘢𝘵 𝘪𝘯𝘧𝘭𝘢𝘵𝘦𝘴 𝘭𝘢𝘯𝘨𝘶𝘢𝘨𝘦 𝘸𝘪𝘵𝘩 𝘮𝘦𝘢𝘯𝘪𝘯𝘨.” The presence of significance, whether as dynamical feature or functional utility, may critically depend on its own opposite: a counter-intuitive self-negation. By implication, meaning is grounded in meaninglessness.
Personally, I would say that AI does not understand (or experience and retain the significance of) anything. It currently represents an abstraction of intelligence and knowledge under some definition, 𝘸𝘪𝘵𝘩𝘰𝘶𝘵 experience or knowing. Of course, I can never directly know anything of another person's or entity's mind and experience. In this regard, and with a nod to the philosophical problem of other minds, the presence of other minds (and prospective machine sentience) comes to acquire a kind of persistent absence very similar to that of meaning in language. This 𝘤𝘰𝘶𝘭𝘥 be purely coincidental…