Dear Sean,
Could you tell us where you get stuck (at zelig() or sim() or somewhere
else)? In particular, could you send us the commands you tried to run
when you got the error?
Thanks,
Kosuke
---------------------------------------------------------
Kosuke Imai Office: Corwin Hall 041
Assistant Professor Phone: 609-258-6601
Department of Politics eFax: 973-556-1929
Princeton University Email: kimai(a)Princeton.Edu
Princeton, NJ 08544-1012
---------------------------------------------------------
On Fri, 23 Jul 2004, Richey, Sean wrote:
Hi,
I have similar memory problems. I am using 5 multiply imputed data sets.
I have tried both fixes suggested in the How-to link you provided. I do not
have access to a Unix system. What else can people with large or multiple
data sets do on a Windows system?
Thanks,
Sean Richey
-----Original Message-----
From: owner-zelig(a)latte.harvard.edu
[mailto:owner-zelig@latte.harvard.edu]On Behalf Of Kosuke Imai
Sent: Thursday, July 22, 2004 10:54 PM
To: David Mermelstein
Cc: zelig(a)latte.harvard.edu
Subject: Re: [zelig] Memory problems using Zelig-Bprobit
Hi David,
Have you tried everything on the following page?
http://gking.harvard.edu/zelig/docs/How_do_I2.html
If you have access to a Unix server, it would be best to run your model
on that server rather than on Windows.
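For instance, on Windows you can try raising R's memory limit and trimming the data down to only the variables the model uses before calling zelig(). A minimal sketch — the variable names, the 2 GB figure, and the data frame name are illustrative, not from this thread:

```r
## Illustrative only: raise the Windows memory limit (R on Windows),
## keep just the model's variables, and free the full copy before zelig().
memory.limit(size = 2000)            # request up to ~2 GB, if the OS allows it

vars  <- c("y1", "y2", "x1", "x2")   # hypothetical variable names
small <- mydata[, vars]              # mydata = your full data frame
rm(mydata)                           # drop the full copy
gc()                                 # return the freed memory

## Schematic bprobit call (one formula per equation)
z.out <- zelig(list(mu1 = y1 ~ x1 + x2,
                    mu2 = y2 ~ x1 + x2),
               model = "bprobit", data = small)
```

Subsetting before estimation matters because R makes several internal copies of the model matrix during fitting, so unused columns multiply the memory cost.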
Kosuke
---------------------------------------------------------
Kosuke Imai Office: Corwin Hall 041
Assistant Professor Phone: 609-258-6601
Department of Politics eFax: 973-556-1929
Princeton University Email: kimai(a)Princeton.Edu
Princeton, NJ 08544-1012
http://www.princeton.edu/~kimai
---------------------------------------------------------
On Thu, 22 Jul 2004, David Mermelstein wrote:
Hi all,
I'm trying to find some help with some memory problems.
I've been reading the FAQ archive on this topic, and I can see that I'm
not the first one with this issue. The solutions included in previous
posts did not work with my model and dataset.
I am trying to estimate a bivariate probit model. My model includes 27
variables (counting the explanatory variables in both equations) and
about 60,000 records (I also tried a sub-sample of about 20,000).
The message I obtain is the following:
Error: cannot allocate vector of size 287850 Kb
In addition: Warning message:
Reached total allocation of 904Mb: see help(memory.size)
My PC has 512 MB of RAM and runs a P-IV with W2K (Professional).
Does anyone know the upper limit on the dataset size that I would be
able to manage? And, more importantly, given my model and dataset, and
having tried the alternatives suggested in the FAQ, does anyone have
another suggestion?
Many thanks in advance.
David Mermelstein
_________________________________________________________________
-
Zelig Mailing List, served by Harvard-MIT Data Center
Send messages: zelig(a)latte.harvard.edu
[un]subscribe Options:
http://lists.hmdc.harvard.edu/?info=zelig
Zelig program information:
http://gking.harvard.edu/zelig/