I am a Stata user trying to learn R.
I have a couple of lengthy folder paths which, in my Stata code, I stored as local macros. I have multiple files in both those folders to use in my analysis.
I know that in R I can change the working directory each time I want to refer to a file in one of the folders, but that is clearly not a good way to do it. Even if I store the folder paths as strings in R, I can't figure out how to refer to them. For example, in Stata I would use `folder1'.
I am also starting to wonder whether trying to rewrite Stata code line by line is the best way to learn R.
Can someone please help?
First, as a former Stata user, let me recommend R for Stata Users. There is also this article on Macros in R. I think @Nick Cox is right that you need to learn to do things differently. But like you (at least in this case), I often find myself starting a new task with my prior knowledge of how to do it in Stata and going from there. Sometimes I find the approaches are similar. Sometimes I can make R act like Stata when a different approach would be better (e.g., loops vs. vectorization).
I'm not sure if I will capture your question with the following, but let me try.
In Stata, it would be common to write:
global mydata "path to my data directory/"
To import the data, I would just type:
insheet using "${mydata}myfile.csv"
As a former Stata user, I want to do something similar in R. Here is what I do:
mydata <- "path to my data directory/"
To import a csv file located in this directory and create a data frame called myfile, I would use:
myfile <- read.csv(paste(mydata, "myfile.csv", sep=""))
or, more concisely:
myfile <- read.csv(paste0(mydata, "myfile.csv"))
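One refinement worth mentioning: base R also has file.path(), which inserts the path separator for you, so the stored directory doesn't need a trailing slash. A minimal sketch (the directory name here is just a placeholder, as above):

```r
mydata <- "path to my data directory"    # note: no trailing slash needed

# file.path() joins the pieces with "/" for you
path <- file.path(mydata, "myfile.csv")  # "path to my data directory/myfile.csv"
# myfile <- read.csv(path)               # then read it as before

# It is also vectorized, handy when you have several files in the folder:
files <- file.path(mydata, c("a.csv", "b.csv"))
```

This way you never have to remember whether the stored path ends in a slash.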
I'm not a very efficient R user yet, so maybe others will see some flaws in this approach.