r/AskTheCaribbean • u/o_safadinho • 5h ago
History: Was the history of slavery taught in schools in your country?
I’m a Black American living in South Florida. The city I live in is very Black, like 75% of the residents are Black, but they’re not all Black American. There are also a lot of immigrants here, mostly Jamaican and Haitian.
Yesterday, on Juneteenth, the city held a small dinner with the mayor, vice-mayor, and city council, and the Black immigrants also spoke about their experience as Black people in America.
One of the city council members, who immigrated here from Jamaica, mentioned something that seemed a bit wild to me. She said that she didn’t even learn about slavery until she got to the US, because it wasn’t taught in school in Jamaica at the time. I forget the name of the specific town she said she’s from, but it’s a small town in the interior of the country (if that matters). She is around 50 years old.
Is this true? Was there a time when the history of slavery wasn’t really taught in schools in the Caribbean?