Florida
Trivia
Florida is the only state in the contiguous United States with a tropical climate: its southern tip (roughly south of Lake Okeechobee, including the Florida Keys) is tropical, while the rest of the state is humid subtropical.