Managing Dry Winter Skin
Jamie Mull, MD, Deaconess Clinic Dermatology
The arrival of winter usually signals the arrival of dry skin. Anyone can develop dry skin, but the cold outdoor air of the winter months, the dry heat indoors, and the more frequent hand washing brought on by the pandemic can all aggravate the condition.