In a 1971 interview with Playboy, actor John Wayne was quoted as saying, "I believe in white supremacy." What was the public reaction to this? When did it stop being acceptable for celebrities and public figures to openly express racist and misogynistic views like this?