Dear Colleagues:
In the process of simulating the sex ratio at birth using Zelig, I
encountered a memory problem. Here is my program:
-------------------
z.out <- zelig(...)
x.0 <- setx(z.out, abort = 0, fn = NULL)
x.1 <- setx(z.out, abort = 1, fn = NULL)
s.out <- sim(z.out, x=x.0, x1=x.1)
summary(s.out)
-------------------
The first three statements ran fine, but the fourth gave me an error
message saying something like "cannot allocate vector of size 380 MB".
If I drop the "fn = NULL" option, the problem goes away. Is this
because my data set is too big? I am running Ubuntu Linux on a Core 2
Duo laptop with 2 GB of memory, and my data file contains about 50,000
observations.
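For what it's worth, the size of that error roughly matches what I'd expect if sim() is keeping one simulated quantity (a double, 8 bytes) per observation per draw, assuming the default of 1000 simulations (this is my back-of-the-envelope guess, not something I've verified in the Zelig source):

```r
obs  <- 50000   # observations in my data file
sims <- 1000    # sim()'s default number of simulations (assumed)
bytes <- obs * sims * 8          # one 8-byte double per obs per draw
bytes / 1024^2                   # about 381 MB -- close to the error
```

If that guess is right, the allocation scales linearly with the number of observations, which would explain why subsampling avoids the error.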
The workaround (other than buying a new computer with a 64-bit
processor) is to draw a random sample from the data file and run the
counterfactual simulation on that smaller data set, something like:
------------------------
z.out <- zelig(...)
## random subsample of 2000 rows; "samp" avoids masking base R's sd()
samp <- d[sample(1:nrow(d), 2000, replace = FALSE), ]
x.0 <- setx(z.out, abort = 0, fn = NULL, data = samp)
x.1 <- setx(z.out, abort = 1, fn = NULL, data = samp)
s.out <- sim(z.out, x=x.0, x1=x.1)
summary(s.out)
-----------------------
Is this a viable option? Are there other options?
Thanks.
Best,
Shige
-
Zelig Mailing List, served by Harvard-MIT Data Center
Send messages: zelig(a)lists.gking.harvard.edu
[un]subscribe Options:
http://lists.gking.harvard.edu/?info=zelig
Zelig program information:
http://gking.harvard.edu/zelig/