Cannot allocate vector in R of size 11.8 Gb


The function below helps free up the workspace by showing you the large objects you already have in it, so you can remove them. This is not a direct solution to your problem, but it can still help.

.ls.objects <- function (pos = 1, pattern, order.by,
                        decreasing=FALSE, head=FALSE, n=5) {
    # Apply fn to every object named in `names`, looked up in environment `pos`
    napply <- function(names, fn) sapply(names, function(x)
                                         fn(get(x, pos = pos)))
    names <- ls(pos = pos, pattern = pattern)
    obj.class <- napply(names, function(x) as.character(class(x))[1])
    obj.mode <- napply(names, mode)
    obj.type <- ifelse(is.na(obj.class), obj.mode, obj.class)
    # Human-readable size, e.g. "11.8 Gb"
    obj.prettysize <- napply(names, function(x) {
                           capture.output(print(object.size(x), units = "auto")) })
    obj.size <- napply(names, object.size)
    obj.dim <- t(napply(names, function(x)
                        as.numeric(dim(x))[1:2]))
    # For objects without dim() (vectors, lists), report length() as the row count
    vec <- is.na(obj.dim)[, 1] & (obj.type != "function")
    obj.dim[vec, 1] <- napply(names, length)[vec]
    out <- data.frame(obj.type, obj.size, obj.prettysize, obj.dim)
    names(out) <- c("Type", "Size", "PrettySize", "Rows", "Columns")
    if (!missing(order.by))
        out <- out[order(out[[order.by]], decreasing=decreasing), ]
    if (head)
        out <- head(out, n)
    out
}

lsos <- function(..., n=10) {
    .ls.objects(..., order.by="Size", decreasing=TRUE, head=TRUE, n=n)
}

lsos()

This shows you a list of the objects in your workspace, sorted by size, so you can occasionally rm() the largest ones.
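For example, once lsos() shows which objects are largest, you can remove the ones you no longer need and then run garbage collection (the object names below are only hypothetical placeholders):

lsos()                  # list the biggest objects first
rm(bigMatrix, oldFit)   # hypothetical names: remove objects you no longer need
gc()                    # trigger garbage collection to reclaim the freed memory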


Comments

  • Khuyagbaatar Batsuren, almost 2 years ago

    I got the error "Cannot allocate vector of size 11.8 Gb" because my desktop has 8 GB of RAM and the matrix I use is 3000x52128.

    Is there any solution to avoid this memory error? Even when I reduced the matrix to 1500x52128, I still got the same error with the same 11.8 Gb size.

    So what should I do? Is the only solution to work on a computer with 16 GB of RAM?

    Additional comment: I get this error when running the following commands.

    library(e1071)   # naiveBayes() comes from the e1071 package
    svmDS <- read.csv("TrainDataSet_ver1.2.csv")
    model <- naiveBayes(as.factor(class) ~ ., data = svmDS)
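    A quick sanity check (just a sketch, assuming the CSV columns are numeric) is to measure how much memory the data frame itself takes before fitting; 3000 x 52128 doubles alone need roughly 3000 * 52128 * 8 bytes, about 1.2 GB, and fitting the model then allocates further large intermediate objects on top of that:

    svmDS <- read.csv("TrainDataSet_ver1.2.csv")
    print(object.size(svmDS), units = "auto")   # RAM used by the raw data frame itself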