Motivated by some of the places I land, I decided to play with using TF and Keras from R. I've already walked through the Python tutorials but want to stretch a bit. This brought me to the land of the Python–R interface, virtualenvs, and reticulate. The TF and Keras packages in R seem to work by accessing Python either through an assumed Anaconda installation or through a virtualenv. I don't want to install Anaconda at this point, so I went the virtualenv route, which was interesting. There is a problem, however: the TensorFlow R package always creates its own virtualenv; I've had no success getting it to use one I set up beforehand. It also always installs the latest version of pip, so that *it* can control which Python modules are brought into the virtualenv. However, it wants to directly invoke pip's internals, something like "from pip import __init__" (I think that was the function), and that entry point is no longer exposed in the latest pip release. Bummer, for the moment. I'd like to dig into the R code and see if I can stop it from always installing the latest pip; that could solve it.
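Since the breakage hinges on pip's import layout changing between releases, here's a small sketch (my own, not anything from the R package) of how you could probe whether the old-style pip entry point is still importable before tooling tries to use it. The helper name `pip_has_legacy_main` is made up for illustration:

```python
# Hedged sketch: newer pip releases moved their machinery under
# pip._internal, so code relying on the old top-level entry points
# (the style the R tooling seems to assume) can fail to import them.
import importlib


def pip_has_legacy_main():
    """Return True if the old-style entry point (pip.main) is importable."""
    try:
        pip = importlib.import_module("pip")
    except ImportError:
        return False  # pip not installed at all
    return hasattr(pip, "main")


if __name__ == "__main__":
    print("legacy pip API available:", pip_has_legacy_main())
```

A check like this only diagnoses the problem; the actual fix would be pinning an older pip inside the virtualenv (or patching the R code, as above).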
I also want to go back to two unfinished projects: I've built a pair of web-scraping apps in Python to collect volcano and earthquake data, thinking they'd make an interesting dataset to test some deep-learning software on. Both are very unfinished, but I did put them up in Git as well.
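The scrapers themselves live in those repos, but to give a flavor of the post-processing they feed, here's a minimal sketch. The record shape loosely mimics USGS-style GeoJSON earthquake records ("properties" holding "mag" and "place"); the function name `summarize_quakes` and the sample data are my own illustration, not code from the apps:

```python
# Hedged sketch of post-processing for scraped earthquake records.
# Each feature is a dict with a "properties" dict, as in GeoJSON feeds.

def summarize_quakes(features, min_mag=0.0):
    """Return (place, magnitude) pairs for events at or above min_mag,
    sorted by magnitude, strongest first. Skips records without a magnitude."""
    out = []
    for f in features:
        props = f.get("properties", {})
        mag, place = props.get("mag"), props.get("place")
        if mag is not None and mag >= min_mag:
            out.append((place, mag))
    return sorted(out, key=lambda pm: pm[1], reverse=True)


sample = [
    {"properties": {"mag": 4.6, "place": "south of the Fiji Islands"}},
    {"properties": {"mag": 1.2, "place": "central Alaska"}},
    {"properties": {"mag": None, "place": "unreviewed event"}},
]
print(summarize_quakes(sample, min_mag=2.0))
# → [('south of the Fiji Islands', 4.6)]
```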
Another project came via a Kaggle competition to classify geological (salt) content in rasterized images. I have some ideas there; I won't be in time to actually enter the competition, but I feel it's a good playground.
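As one tiny example of the playground kind of work, here's a sketch that computes what fraction of a rasterized binary mask is salt, which is a handy statistic for eyeballing a dataset. The mask representation (a list of 0/1 rows) and the function name are my assumptions, not anything from the competition:

```python
# Hedged sketch: measure salt coverage in a 2-D binary mask,
# represented as a list of rows of 0/1 pixel values.

def salt_coverage(mask):
    """Fraction of pixels flagged as salt; 0.0 for an empty mask."""
    total = sum(len(row) for row in mask)
    salt = sum(sum(row) for row in mask)
    return salt / total if total else 0.0


mask = [
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 0, 1],
]
print(salt_coverage(mask))  # 6 salt pixels out of 12 → 0.5
```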
Then I want to get back to installing Hadoop and Spark on the local machines here.
