The human body has always been an integral part of European art, reflecting the broader anthropocentrism of Western culture. Throughout the history of humankind and its art, the approaches to and aims of depicting the human figure have shifted with the prevailing mood and outlook of each era. Our present is no exception: even a person far removed from art can observe daily what the body has become in contemporary art and mass culture.