I am wondering what effects our attempts to ensure terminological consistency will have on our languages going forward.

Consistent terminology is necessary for efficient localization in a world that values expansion into global markets while keeping costs down. Consistent source terminology ensures consistent target terminology, and the more formulaic and “controlled” the source text, the better the results of machine translation. Texts with consistent terminology, grammar and syntax are the cheapest to localize.

In areas with a naturally limited technical nomenclature, such as IT, terminology management that eliminates inconsistent terms (i.e. terms that denote the same object but differ from a predefined standard term) is advisable: it improves the usability of the localized product while reducing localization costs.
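
To illustrate what this kind of term management can look like in practice, here is a minimal sketch of a termbase-driven consistency check. Everything in it is hypothetical: the TERMBASE entries, the normalize_terminology function and the deprecated variants are invented for the example, not taken from any real style guide or tool.

import re

# Hypothetical termbase: deprecated variants mapped to the single approved term.
TERMBASE = {
    "log on": "sign in",
    "log in": "sign in",
    "e-mail": "email",
    "pop-up window": "dialog",
}

def normalize_terminology(text: str) -> tuple[str, list[str]]:
    """Replace deprecated variants with the approved term and report what changed.

    Capitalization handling is omitted for brevity; replacements are lowercase.
    """
    findings = []
    for variant, approved in TERMBASE.items():
        pattern = re.compile(rf"\b{re.escape(variant)}\b", re.IGNORECASE)
        if pattern.search(text):
            findings.append(f"{variant!r} -> {approved!r}")
            text = pattern.sub(approved, text)
    return text, findings

if __name__ == "__main__":
    source = "Log on to the pop-up window and check your e-mail."
    normalized, changes = normalize_terminology(source)
    print(normalized)  # sign in to the dialog and check your email.
    print(changes)     # e.g. ["'log on' -> 'sign in'", ...]

Authoring environments and CAT tools apply roughly the same idea on a larger scale: a shared termbase is checked against the source text, and every deviation from the approved term is flagged or replaced before the text reaches translation.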

I believe in the Sapir-Whorf hypothesis of linguistic relativity, which postulates that differences between languages correspond to differences in the cognition of the people who use them. Differences in the vocabulary sizes of individual languages are reflected in differences in the cognitive states of their speakers.

It seems to be in the best interest of a global industry to keep restricting our vocabularies. Will this eventually influence our cognitive abilities?