Every day, at some point, I encounter some sort of anti-American feeling.
My parents taught me to approach the world critically, but also to approach it with a sense of responsibility.
I love the right words. I think economy and precision of language are important.
My parents were very firm about me always getting my homework done.
For most young Americans I know, 'serving' in the broadest sense now seems like the only thing to do.
That's not what I want my children to hear. That's not representative of the country that I want my children to grow up in. And so, as a mom, as a woman, as an American, and even as my mother's daughter, I actually found that far more upsetting than anything they said about my mom.