Tags: r, jupyter-notebook, conda, jupyter-irkernel

conda build r-ldavis/ not working


I am new to Jupyter, and I am looking to install an R package (tseries) that is available on CRAN.

I was trying to follow an existing question, but I think I am running into a different problem.

I was originally following this link:

conda - How to install R packages that are not available in "R-essentials"?

The answer there directs me to another link:

https://www.continuum.io/content/conda-data-science

under the "Building a conda R package" section.

It says to run

conda skeleton cran ldavis

and then I got the following

C:\Users\Rami>conda skeleton cran ldavis
Tip: install CacheControl (conda package) to cache the CRAN metadata
Fetching metadata from http://cran.r-project.org/
Tip: install CacheControl (conda package) to cache the CRAN metadata
Traceback (most recent call last):
  File "d:\Users\Rami\Anaconda3\Scripts\conda-skeleton-script.py", line 5, in <module>
    sys.exit(conda_build.cli.main_skeleton.main())
  File "d:\Users\Rami\Anaconda3\lib\site-packages\conda_build\cli\main_skeleton.py", line 65, in main
    return execute(sys.argv[1:])
  File "d:\Users\Rami\Anaconda3\lib\site-packages\conda_build\cli\main_skeleton.py", line 61, in execute
    api.skeletonize(package, args.repo, config=config)
  File "d:\Users\Rami\Anaconda3\lib\site-packages\conda_build\api.py", line 192, in skeletonize
    recursive=recursive, config=config, **kwargs)
  File "d:\Users\Rami\Anaconda3\lib\site-packages\conda_build\skeletons\cran.py", line 527, in skeletonize
    raise RuntimeError("directory already exists: %s" % dir_path)
RuntimeError: directory already exists: .\r-ldavis
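The traceback itself points at the cause: a previous `conda skeleton` run (even a failed one) left an `r-ldavis` directory in the current folder, and `conda skeleton` refuses to overwrite it. A minimal sketch of the fix, assuming the leftover directory sits in your working directory as the traceback's `.\r-ldavis` suggests, is to delete it and rerun the command (or point `conda skeleton` at a fresh location with `--output-dir`):

```shell
# Simulate the stale output directory left behind by an earlier run
mkdir -p r-ldavis

# Remove it so conda skeleton can recreate it cleanly
# (Windows cmd equivalent: rmdir /s /q r-ldavis)
rm -rf r-ldavis

# Now rerunning the skeleton command should no longer raise
# "RuntimeError: directory already exists":
#   conda skeleton cran ldavis
```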

Please walk me through the steps, as I am really new to this on Windows 10.


Solution

  • Thank you all for your remarks, but the easiest way I found to avoid dealing with the R package directory at all is to install the package directly from Jupyter with the following command:

    install.packages('tseries', repos='http://cran.us.r-project.org')
    

    This worked fine for me.
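Assuming the install above succeeded, a quick way to confirm the package is usable from the Jupyter R kernel is to load it and call one of its functions, e.g. the augmented Dickey-Fuller test on some random data:

```r
library(tseries)

# Quick smoke test: run an ADF test on 100 random normal values.
# The exact statistic will vary with the random data; the point is
# simply that the call runs without error.
adf.test(rnorm(100))
```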