Words acquire meaning through adaptive, relational dependency on, and entanglement with, the other words and definitions of whichever language, mind, or culture they inhabit. Extrapolating this to sentences (and beyond, to narrative, ideology, or any other linguistically mediated system of belief), the combinatorial complexity of referential dependencies quickly becomes effectively unintelligible. A useful mnemonic and metaphor is that, because meaning is always in some sense displaced, offset, or deferred, it occupies a role in language similar to that of an electron hole in semiconductor physics: the absence of an electron that nonetheless behaves as a charge carrier in its own right. In a general sense, there is a binding systemic algebra and logic of emptiness and conspicuous absence here.
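To put a rough number on that claim of combinatorial unintelligibility, here is a minimal, purely illustrative Python sketch (the 10,000-word vocabulary and the subset-counting model are my own assumptions, not anything asserted above): even restricting attention to pairwise and triadic dependencies yields tens of millions and then hundreds of billions of potential relations, and the full space of word-subsets is a number more than three thousand digits long.

```python
from math import comb

# Toy illustration (my own sketch, not from the text above) of the
# combinatorial growth of referential dependencies: treat each pair,
# triple, or arbitrary subset of words as a potential site of mutual
# dependency, and count how many such sites a modest vocabulary allows.

vocabulary_size = 10_000  # hypothetical working vocabulary

pairwise = comb(vocabulary_size, 2)   # unordered word pairs
triadic = comb(vocabulary_size, 3)    # unordered word triples
all_subsets = 2 ** vocabulary_size    # every possible word-subset

print(f"pairwise dependencies: {pairwise:.3e}")    # ~5.0e+07
print(f"triadic dependencies:  {triadic:.3e}")     # ~1.7e+11
print(f"all word-subsets: a {len(str(all_subsets))}-digit number")
```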
Where O.G. systems thinker and proto-cybernetician Gregory Bateson cryptically asserted that information is “𝘵𝘩𝘦 𝘥𝘪𝘧𝘧𝘦𝘳𝘦𝘯𝘤𝘦 𝘵𝘩𝘢𝘵 𝘮𝘢𝘬𝘦𝘴 𝘢 𝘥𝘪𝘧𝘧𝘦𝘳𝘦𝘯𝘤𝘦”, I wonder if we might consider meaning as “𝘵𝘩𝘦 𝘢𝘣𝘴𝘦𝘯𝘤𝘦 𝘵𝘩𝘢𝘵 𝘪𝘯𝘧𝘭𝘢𝘵𝘦𝘴 𝘭𝘢𝘯𝘨𝘶𝘢𝘨𝘦 𝘸𝘪𝘵𝘩 𝘮𝘦𝘢𝘯𝘪𝘯𝘨.” The presence of significance, whether as a dynamical feature or as a functional utility, may depend critically on its own opposite, a counter-intuitive self-negation. By implication: meaning is grounded in meaninglessness.
Personally, I would say that AI does not understand (or experience and retain the significance of) anything. It currently represents an abstraction under some definition of intelligence and knowledge, 𝘸𝘪𝘵𝘩𝘰𝘶𝘵 experience or knowing. Of course, I can never directly know anything of another person's or entity's mind and experience. In this regard, and with a nod to the philosophical problem of other minds, the presence of other minds (and of prospective machine sentience) comes to acquire a kind of persistent absence very similar to that of meaning in language. This 𝘤𝘰𝘶𝘭𝘥 be purely coincidental…