denature
[diːˈneɪtʃə], (Verb)
Definitions:
- take away or alter the natural qualities of
(e.g. this system denatures education)
Origin:
late 17th century: from French dénaturer, from dé- (expressing reversal) + nature ‘nature’
definition by Oxford Dictionaries