How to import packages?

Hi,


I’d like to use either Keras or TensorFlow but have been struggling to import them properly into my environment.



Dear Chiekh - GPU usage on Alphien is still experimental and may be subject to bugs and issues. TensorFlow requires a special set-up on the machine, so a simple R package installation does not work for TensorFlow, or for Keras using TensorFlow as its back-end engine.


If you use https://studio-gpu.alphien.com/ with your username and password (you may have to re-enter your details), you will be able to run TensorFlow and Keras.
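
Once you are in the GPU studio, a quick way to check that the libraries are visible to your R session is the short sketch below (tf_version() and is_keras_available() are standard helpers from the tensorflow and keras R packages, not Alphien-specific):

library(keras)

library(tensorflow)

# Should print the installed TensorFlow version

tf_version()

# Should return TRUE if Keras can find its TensorFlow back-end

is_keras_available()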


I suggest you try the following code, which solves the basic and classical MNIST problem (https://en.wikipedia.org/wiki/MNIST_database). If this code runs successfully, that confirms you can use both the Keras and TensorFlow libraries with all the power of the Alphien Dataset.


For any questions related to the Lyxor Alphathon, you can contact us on https://alphachat.alphien.com/#/room/#lyxoretf:chat.alphien.com.


# Source the Alphien Qlib helpers

.sourceQlib()

library(keras)

library(tensorflow)


# Configure GPU memory allocation so the session does not take all available memory

gpu_options <- tf$GPUOptions(per_process_gpu_memory_fraction = 0.1)

config <- tf$ConfigProto(gpu_options = gpu_options)

k_set_session(tf$Session(config = config))


# Preparing the Data

mnist <- dataset_mnist()

x_train <- mnist$train$x

y_train <- mnist$train$y

x_test <- mnist$test$x

y_test <- mnist$test$y


# reshape

x_train <- array_reshape(x_train, c(nrow(x_train), 784))

x_test <- array_reshape(x_test, c(nrow(x_test), 784))

# rescale

x_train <- x_train / 255

x_test <- x_test / 255


y_train <- to_categorical(y_train, 10)

y_test <- to_categorical(y_test, 10)


# Defining the Model

model <- keras_model_sequential() 

model %>% 

 layer_dense(units = 256, activation = 'relu', input_shape = c(784)) %>%

 layer_dropout(rate = 0.4) %>%

 layer_dense(units = 128, activation = 'relu') %>%

 layer_dropout(rate = 0.3) %>%

 layer_dense(units = 10, activation = 'softmax')

# Show model summary

summary(model)



# compile the model with appropriate loss function, optimizer, and metrics:

model %>% compile(

 loss = 'categorical_crossentropy',

 optimizer = optimizer_rmsprop(),

 metrics = c('accuracy')

)


# Make sure that verbose is off so the code also works in the scheduler.

# Training and Evaluation

history <- model %>% fit(

 x_train, y_train, 

 verbose = 0,

 epochs = 30, batch_size = 128, 

 validation_split = 0.2

)


# Plot the training history (no plot when run in the scheduler)

plot(history)


# Evaluate the model’s performance on the test data:

model %>% evaluate(x_test, 

          y_test, 

          verbose = 0)


# Generate predictions on new data:

model %>% predict_classes(x_test)
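
As a quick sanity check of the predictions, the minimal sketch below compares the predicted classes with the raw test labels loaded earlier (it reuses the mnist$test$y vector, which was left untouched by the one-hot encoding above):

pred <- model %>% predict_classes(x_test)

# Fraction of test digits classified correctly; this should roughly match

# the accuracy reported by evaluate() above

mean(pred == mnist$test$y)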