If it wasn't, then why would the American school system teach American students that it is?
For Pete's sake, in many states it's not even legal to cover slavery, the Native American genocide, or Japanese internment in a way that actually holds the US responsible.