I've created a dynamic fileInput in Shiny using lapply. To read the uploaded files I also use lapply, inside an observer.
The problem with using lapply here is that the observer is triggered every time a new file is uploaded, so all files are read again and again whenever any one of them changes.
Here is a Hello World app. In my real app the lapply call depends on an input parameter, which I've abstracted away for simplicity.
library(shiny)

ui <- fluidPage(
  titlePanel("Hello World"),
  sidebarLayout(
    sidebarPanel(),
    mainPanel(
      lapply(1:2, function(i) {
        fileInput(
          paste0("file", i),
          label = NULL,
          multiple = FALSE,
          accept = c(
            "text/csv",
            "text/comma-separated-values,text/plain",
            ".csv"
          ),
          buttonLabel = paste("File", i)
        )
      }),
      verbatimTextOutput("list")
    )
  )
)

server <- function(input, output) {
  r <- reactiveValues()
  observe({
    lapply(1:2, function(i) {
      file <- input[[paste0("file", i)]]
      if (is.null(file)) return()
      isolate({
        r$file[[paste(i)]] <- readr::read_csv2(file = file$datapath)
      })
    })
  })
  output$list <- renderPrint(reactiveValuesToList(r))
}

shinyApp(ui = ui, server = server)
How can I replace the loop, or add a requirement to the lapply, so that only the file that changed is read?
While I started down the road of cache invalidation in the comments, I think something else may work better for you since you have a fixed number of fileInput fields: swap the lapply and observe lines in your code (plus a couple of other tweaks).
server <- function(input, output) {
  lapply(paste0("file", 1:2), function(nm) {
    observeEvent(input[[nm]], {
      req(input[[nm]], file.exists(input[[nm]]$datapath))
      readr::read_csv2(file = input[[nm]]$datapath)
    })
  })
}
Explanation:

- Because each observeEvent listens to a single input, "file1" won't react to changes in "file2" (and vice versa), so only the file that was just uploaded is re-read.
- I put the paste0(...) in the data of the lapply instead of in the function, though it'd be just as easy to do

  lapply(1:2, function(i) {
    nm <- paste0("file", i)
    # ...
  })

- It matters that nm is defined outside of the observeEvent, and it has to do with delayed evaluation and namespace search order. I fell prey to this a few years ago and was straightened out by Joe Cheng: you can't use a for loop, it must be some environment-preserving operation like this.
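To illustrate that last point, here is a minimal sketch (not from the original answer; it only reuses the file1/file2 input names from above) of why a plain for loop binds every observer to the last value of the loop variable, and how an environment-preserving construct avoids it:

# With a for loop, nm lives in one shared environment, and observeEvent's
# expressions are evaluated lazily; by the time they run, nm == "file2",
# so both observers end up watching (and reporting) the last input.
for (nm in paste0("file", 1:2)) {
  observeEvent(input[[nm]], {
    message("triggered by: ", nm)   # always prints "file2"
  })
}

# Wrapping the body in local() (or using lapply, as above) gives each
# iteration its own environment, so each observer keeps its own copy of nm.
for (nm in paste0("file", 1:2)) {
  local({
    this_nm <- nm                   # snapshot the current value
    observeEvent(input[[this_nm]], {
      message("triggered by: ", this_nm)
    })
  })
}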
N.B.: this is a stub of code, and it is far from complete: having an observe or observeEvent read the data and then discard it is wrong ... it's missing something. Ideally, this should really be a reactive or eventReactive block, or the processed data should be stored in a reactiveValues or reactiveVal. For example:
server <- function(input, output) {
  mydata <- setNames(lapply(paste0("file", 1:2), function(nm) {
    eventReactive(input[[nm]], {
      req(input[[nm]], file.exists(input[[nm]]$datapath))
      readr::read_csv2(file = input[[nm]]$datapath)
    })
  }), paste0("file", 1:2))
  observe({
    # the following are identical, the latter more declarative
    mydata[[1]]()
    mydata[["file1"]]()
  })
}
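Alternatively, here is a sketch of the reactiveValues route mentioned above (the r$file list and output$list come from the question; the rest is an assumption): keep the observeEvent, but store the result instead of discarding it.

server <- function(input, output) {
  r <- reactiveValues(file = list())
  lapply(paste0("file", 1:2), function(nm) {
    observeEvent(input[[nm]], {
      req(input[[nm]], file.exists(input[[nm]]$datapath))
      # store under the input's name, so only this entry is updated
      r$file[[nm]] <- readr::read_csv2(file = input[[nm]]$datapath)
    })
  })
  output$list <- renderPrint(reactiveValuesToList(r))
}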
(And another note about defensive programming: you cannot control perfectly how readr::read_csv2 reacts to that file ... it may error out for some reason. One further step would be to wrap it in tryCatch(..., error = function(e) { errfun(e); NULL; }), where errfun(e) does something meaningful with the error message (logs it and/or shows it to the user in a modal popup) and then returns NULL, so that reactive blocks downstream can use req(mydata[[1]]()) and will not try to process the NULL.)
server <- function(input, output) {
  mydata <- setNames(lapply(paste0("file", 1:2), function(nm) {
    eventReactive(input[[nm]], {
      req(input[[nm]])
      tryCatch(
        readr::read_csv2(file = input[[nm]]$datapath),
        error = function(e) { errfun(e); NULL })
    })
  }), paste0("file", 1:2))
  observe({
    # the following are identical, the latter more declarative
    mydata[[1]]()
    mydata[["file1"]]()
  })
}
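errfun is not defined anywhere above; a hypothetical implementation along the lines described (logging plus a modal popup; the function name and message text are assumptions) could look like:

errfun <- function(e) {
  msg <- conditionMessage(e)
  message("read_csv2 failed: ", msg)   # server-side log
  showModal(modalDialog(
    title = "Problem reading file",
    paste("The uploaded file could not be parsed:", msg),
    easyClose = TRUE
  ))
}

Because the tryCatch handler returns NULL after calling errfun, downstream consumers can then guard with req(mydata[["file1"]]()) as described above.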