denature

[diːˈneɪtʃə], (Verb)

Definitions:
- take away or alter the natural qualities of
(e.g. this system denatures education)


Origin:
late 17th century: from French dénaturer, from dé- (expressing reversal) + nature ‘nature’

Definition by Oxford Dictionaries