My goal is to automate fetching Google Trends data and extracting the "interest_over_time" element from the resulting lists.
I consider it a two-step problem:
1. Automating the fetch of data for a group of keywords
2. Extracting the elements automatically in a methodical manner
I am not able to complete step 2. Any ideas on how to get this done?
Step 1 - Automating the fetch of Google Trends data
library(gtrendsR)

a <- c("sony", "apple")
for (i in a) {
  # create an object named after the keyword, e.g. `sony`, `apple`
  assign(i, gtrends(keyword = i, time = "now 1-H"))
  print(i)
}
Step 2 - Extracting elements
sony[["interest_over_time"]]
Instead of extracting each element manually like this, can I use a for loop or some other function to automate it?
Here are a couple of ways that I think will give you what you want.
library(gtrendsR)
a <- c("sony", "apple")
First Way: using base R
# helper: fetch one keyword and keep only the interest_over_time element
gtrends_interest <- function(keyword) {
  gtrends(keyword = keyword, time = "now 1-H")[["interest_over_time"]]
}

# fetch all keywords and row-bind the results into a single data.frame
trends_data <- do.call("rbind", lapply(a, gtrends_interest))
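One caveat, as a hedged side note rather than part of the original approach: if a keyword happens to have no data in the "now 1-H" window, or a request fails, the call may error or return NULL for interest_over_time. A minimal defensive sketch (the _safe suffix is just an illustrative name) could look like this; rbind() simply skips NULL elements.

gtrends_interest_safe <- function(keyword) {
  # return NULL instead of stopping the whole run if one keyword fails
  tryCatch(
    gtrends(keyword = keyword, time = "now 1-H")[["interest_over_time"]],
    error = function(e) NULL
  )
}
trends_data <- do.call("rbind", lapply(a, gtrends_interest_safe))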
Second Way: using purrr
library(purrr)
# map over the keywords and row-bind each interest_over_time into one data.frame
trends_data2 <- a %>%
  map_df(~ gtrends(keyword = ., time = "now 1-H")[["interest_over_time"]])
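If you would rather keep one data frame per keyword (closer to the assign() objects from the question) instead of one stacked table, a named list is a convenient structure. A quick sketch using purrr's set_names() and map():

trends_list <- a %>%
  set_names() %>%   # names the vector by its own values: "sony", "apple"
  map(~ gtrends(keyword = ., time = "now 1-H")[["interest_over_time"]])

trends_list[["sony"]]   # same result as sony[["interest_over_time"]] in the question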
Both methods will return a data.frame with the interest_over_time from each element of a stacked on top of one another.
I prefer the second, since map() becomes very powerful once you get the hang of it.
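To answer the literal "can I use for or some other function" part: if you have already run the assign() loop from Step 1 and want to reuse those objects rather than re-fetching, one possible sketch (assuming the objects live in your global environment) uses mget() to pull them back into a list:

# collect the existing objects named in `a` into a named list
trend_objects <- mget(a)                              # list(sony = ..., apple = ...)
# pull interest_over_time out of each and stack the results
interest_list <- lapply(trend_objects, `[[`, "interest_over_time")
trends_data3  <- do.call("rbind", interest_list)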