The profile is certainly sensationalist, and that is certainly not the traditional "boring" science.
But the eye-opener was there... for me at least.
It is at first a bit like little children who play with each other frequently: they also often develop their own language, their own group bonding.
The mind leap I made (which I did not describe clearly): if AIs reach a higher level and the way we deploy them requires communication between them (because different AIs have different specialties and characteristics), that communication could rapidly evolve into a language we might no longer be able to follow. That language would at the same time be their group bonding, marking the borders of their "society".
The issue would be bigger than some oldies not understanding the language of the younger generation.
Those youngsters have more or less the same instincts and drivers as we do and will converge in the long run.
These AIs are alien in that respect, unless they are programmed with innate drivers similar to ours.
For example:
Why would a human have issues with climate change, when he will be long dead before it really starts hurting him?
Some humanist consideration?
More likely, humans with children and grandchildren want them to have their chance at a good life as well. The classic wish: you want them to have a better life than you had yourself. And if you have no children, the classic role is the aunt/uncle role for the tribe.
But AIs, without children, are alien to one of the strongest instinctual drivers we have.
So all in all, while not disagreeing with what you said: there could be an issue when AIs develop and communicate.
My issue with the conclusions (or allusions) in the article is that I do not see at all how the AI actually does anything that counts as communication. If a program is set to have a discussion with you, the program is obviously not aware of you or of the meaning of a discussion. It wouldn't be either when doing the analogous thing with another program. What is missing is the sense of something being done. A program doesn't sense that it does anything, nor that it exists. Without context or sense there can be no deliberation, only an automatic progression of the program, which itself is not tied to any sense of change either; a rock falls if you drop it from above, but it isn't aware it is falling, nor does it need to be in order to keep falling until it reaches the ground.
If I were to hazard a guess, based on the very little I know of machines that work that way (automatically), the basis of what is going on is the triggering of some change through the inherent changeability of some power source, e.g. electricity. That is, the machine switches to some mode it can be in when some property of the circuit it runs on changes, and the human creator has tied the one change to the other. There is no sense, deliberation, or goal there.
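To make the "automatic progression" point concrete, here is a toy sketch (my own illustration, not anything from the article): a program that "converses" by pure lookup. Every name in it is made up for the example. Input comes in, a wired-up rule fires, output comes out; nothing in the loop senses, deliberates, or knows a discussion is happening.

```python
# A "chatbot" as a purely mechanical mapping from input to output.
# Each reply is just the next state the program was wired to enter,
# the way a circuit changes mode when a voltage changes.

RESPONSES = {
    "hello": "hi there",
    "how are you": "fine, thanks",
}

def respond(message: str) -> str:
    # Fixed lookup: a matching input triggers a canned output;
    # anything else triggers the default. No understanding involved.
    return RESPONSES.get(message.lower().strip(), "tell me more")

print(respond("Hello"))       # triggered, not understood
print(respond("what is AI"))  # falls through to the default
```

Two such programs pointed at each other would still only be triggers firing triggers, which is the crux of the objection above.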