In addition to longer days and balmy weather, summer also brings with it the risk of overexposure to the sun’s damaging rays. While most spa-goers know the importance of sunscreen, many still neglect to apply it. In fact, according to research published in the Journal of the American Academy of Dermatology, the majority of Americans do not regularly use sunscreen to protect themselves from the sun. Unfortunately, even those who do wear sunscreen may find themselves inadequately protected. That’s because, for years, Americans have relied primarily on sunscreens that protect them only from ultraviolet B (UVB) rays.
“The global sunscreen market is still dominated by UVB-based sunscreens, particularly in North America,” says Denis Dudley, M.D., a double board-certified endocrinologist and co-founder of Cyberderm, The Sunscreen Company. “They prevent UVB effects like sunburn to some degree but offer little or no protection against skin cancer or photoaging, where UVA plays a major role. We believe there is a compelling evidence-based argument that this incomplete protection contributes to skin cancer rates, which are rising by two to three percent annually in North America.” New evidence suggests that we should also be concerned about infrared rays. Here, we tackle this complicated subject with a look at the misconceptions surrounding sunscreen, the challenges the industry faces, and some of the promising developments on the horizon.