Hi there,
I'm running full matching on a large data set (about 40,000 observations) and got this error:

In data.frame(control = factor(controls[idx]), treated = factor(treatments[idx]), :
  Reached total allocation of 3965Mb: see help(memory.size)

I believe this happens because the data set is too big and uses up too much memory. Do you know how to deal with this problem, or which R packages can be used with MatchIt to handle large data sets?
Thank you in advance.
Best,
Xin
Hello Everyone,
I am trying to use the nuclearplants data from the optmatch package to do matching.
library(optmatch)  # for the nuclearplants data
library(MatchIt)   # for matching
data(nuclearplants)
zz <- matchit(pr ~ t1 + t2, data = nuclearplants, method = "nearest",
              distance = "mahalanobis", replace = TRUE)
zz.out <- zz$match.matrix
head(zz.out)
1
A "I"
B "N"
C "M"
D "V"
E "X"
F "
zz$distance
[1] NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA NA
I was wondering why the output distances are all NA.
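One likely explanation: as I understand MatchIt's behavior, when `distance = "mahalanobis"` no propensity score is estimated, so the `$distance` component is left as NA by design while the matching itself (in `match.matrix`) still works. If you want the Mahalanobis distances themselves, you can compute them with base R's `stats::mahalanobis()`. A minimal, self-contained sketch using a toy matrix (all values here are illustrative, not from the nuclearplants data):

```r
# Toy data: 3 units, 2 covariates (purely illustrative values).
X <- matrix(c(1, 2, 3,
              2, 4, 7), ncol = 2)

mu <- colMeans(X)  # covariate means
S  <- cov(X)       # sample covariance matrix

# Squared Mahalanobis distance of each row from the mean.
d2 <- mahalanobis(X, center = mu, cov = S)
d2
```

A handy sanity check: with the sample mean and sample covariance, the squared distances always sum to (n - 1) * p, here 2 * 2 = 4.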
Best regards,
Nayan