I am getting an error when trying to pull Great Lakes data with the rnaturalearth
package. I've been using this same code for a while and only started getting the error recently; it has been failing for a couple of weeks now, so I'm not sure whether the site is down. I posted on the package's GitHub but haven't received a response. Can anyone else replicate this error, or suggest a fix or a workaround to get the same data?
install.packages(c('rnaturalearth', 'sf', 'dplyr'))

library(dplyr)

lakes <- rnaturalearth::ne_download(scale = 110,
                                    type = 'lakes',
                                    category = 'physical') %>%
  sf::st_as_sf(crs = 4269) %>%
  filter(name_alt == "Great Lakes")
Error Message:
trying URL 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/110m/physical/ne_110m_lakes.zip'
Error in utils::download.file(file.path(address), zip_file <- tempfile()) :
cannot open URL 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/110m/physical/ne_110m_lakes.zip'
In addition: Warning message:
In utils::download.file(file.path(address), zip_file <- tempfile()) :
cannot open URL 'https://www.naturalearthdata.com/http/www.naturalearthdata.com/download/110m/physical/ne_110m_lakes.zip': HTTP status was '404 Not Found'
I had the same problem recently and used the following code to obtain a high-resolution shapefile of all the world's lakes:
library(sf)

# Note the unusual "http//" segment: it is part of the actual download
# path on naturalearthdata.com, not a typo.
url <- paste0("https://www.naturalearthdata.com/",
              "http//www.naturalearthdata.com/download/10m/physical/",
              "ne_10m_lakes.zip")
path <- tempdir()
download.file(url, file.path(path, "lakes.zip"), mode = "wb")
unzip(file.path(path, "lakes.zip"), exdir = path)
lakes <- read_sf(file.path(path, "ne_10m_lakes.shp"))
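To subset just the Great Lakes, as the original code attempted, you can filter on the name_alt attribute (assuming, per the question's own snippet, that the layer carries a name_alt column; the toy data frame below is a hypothetical stand-in for the downloaded layer):

```r
library(dplyr)

# Hypothetical stand-in for the lakes layer; the real shapefile has
# many more columns, but filtering works the same way.
lakes_df <- data.frame(name     = c("Lake Superior", "Lake Baikal"),
                       name_alt = c("Great Lakes", NA))

# Rows where name_alt is NA are dropped automatically by filter().
great_lakes <- lakes_df %>% filter(name_alt == "Great Lakes")
nrow(great_lakes)  # 1
```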
For example, here is how that data can be used to plot the Great Lakes:
library(ggplot2)
ggplot(lakes) +
geom_sf(fill = "lightblue") +
coord_sf(xlim = c(-100, -75), ylim = c(40, 50)) +
theme(panel.background = element_rect(fill = '#d0d890'),
panel.grid = element_line(color = '#00000010'))
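If the naturalearthdata.com URL keeps returning 404s, a possible workaround is the Amazon S3 bucket that newer rnaturalearth releases download from. This is a sketch under the assumption that the bucket and its "<scale>_physical/<file>.zip" path pattern remain available:

```r
library(sf)

# Assumption: Natural Earth files are mirrored on this S3 bucket,
# following the "<scale>_physical/<file>.zip" naming pattern.
url <- "https://naturalearth.s3.amazonaws.com/110m_physical/ne_110m_lakes.zip"
path <- tempdir()
download.file(url, file.path(path, "lakes110.zip"), mode = "wb")
unzip(file.path(path, "lakes110.zip"), exdir = path)
lakes110 <- read_sf(file.path(path, "ne_110m_lakes.shp"))
```

The 110m file is much smaller than the 10m one, so it is a quicker test of whether the mirror works for you.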