Separating children from their parents is something Americans did during slavery. It was also done to erase the culture of Native Americans. As a teacher, as both an African American and a Native American, and simply as a decent human being, I cannot be silent on this issue. Families belong together. Tearing children from their mothers' arms is a monstrous thing to do. Children have no say in where they live or what they do. Do not punish them. In textbooks, I read that we are "a nation of immigrants." Is this how we treat our nation? Fix this. What a disgrace.