Does the average American still consider their country to be democratic?
Given recent events, from an outside perspective it looks a lot like authoritarianism now, especially after your president's posts about the demonstrations. What are your thoughts on all this? Has there been a shift?