Katy Perry Discovers America

Singing sensation Katy Perry had a wake-up call when she left Hollywood and relocated her family to Kentucky. After settling in the Bluegrass State, Perry realized Hollywood is not America.

Pop star Katy Perry is confessing something few of her elite liberal Hollywood peers would, saying moving to Kentucky has been an “amazing experience” because “it reminds you that Hollywood is not America.”

“I’m living in Kentucky, and I have for almost a month now, and that’s quite an amazing experience, because it reminds you that Hollywood is not America,” Perry told comedian Chelsea Handler on the Dear Chelsea podcast. “And you need to remember that. Because I think you can understand people better.”

Of course, Perry was never going to ditch her terrible political choices, but whatever.

Still, Perry has not abandoned her leftist political campaigning. In 2020, she joined her fellow celebrities in teaming up with Planned Parenthood for an abortion-themed ad campaign to push voter turnout in battleground states like Florida, Pennsylvania, and Michigan.

For the record, Katy Perry attended the Democratic National Convention in 2016, and a lot of my coworkers were downtown during the convention. To a person, every detective there said Perry was extraordinarily nice and gladly took pictures with police officers and detectives.

No matter her politics, I will always respect her for taking time out to say hello.