[protege-owl] OutOfMemory Error While trying to store instances into an OWL


[protege-owl] OutOfMemory Error While trying to store instances into an OWL

slkjso lskjfl
Hello,
 
I have a text file with about 1750 rows, each containing 6 numbers, and I am trying to import all of these numbers into an ontology I have. First, I declare the model:
 OWLModel owlModel = ProtegeOWL.createJenaOWLModelFromInputStream(new FileInputStream(fileName));
 
and created 6 OWLNamedClasses. Then a big while loop reads the input file line by line and, for each line, creates 6 RDFIndividuals and sets their properties.
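For reference, the per-line reading and parsing step can be sketched as below. This uses only the standard library; the places where the Protege-OWL calls would go are marked as comments, and any names there (classes, properties) are assumptions, not the poster's actual code:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class RowParser {

    // Parse one whitespace-separated row of numbers into a double[].
    public static double[] parseRow(String line) {
        String[] parts = line.trim().split("\\s+");
        double[] values = new double[parts.length];
        for (int i = 0; i < parts.length; i++) {
            values[i] = Double.parseDouble(parts[i]);
        }
        return values;
    }

    // Read every row of the input. For each row, the real program would
    // create 6 RDFIndividuals and set their property values here
    // (e.g. cls.createOWLIndividual(...) and
    // individual.setPropertyValue(prop, value) -- hypothetical names).
    public static List<double[]> parseAll(BufferedReader in) throws IOException {
        List<double[]> rows = new ArrayList<>();
        String line;
        while ((line = in.readLine()) != null) {
            if (line.trim().isEmpty()) continue;
            rows.add(parseRow(line));
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(
                new StringReader("1 2 3 4 5 6\n7 8 9 10 11 12\n"));
        List<double[]> rows = parseAll(in);
        System.out.println(rows.size() + " rows, "
                + rows.get(0).length + " numbers each");
    }
}
```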
 
After the while loop, I have this:
JenaOWLModel modelToWrite = (JenaOWLModel) owlModel;
modelToWrite.save(new File(fileName).toURI(), FileUtils.langXMLAbbrev, errors);  
 
 
The problem I have is that the program runs fast at first, but as the number of rows read increases it gets slower and slower, until it almost stops at row 1642. I tried reducing the number of rows to only 1500, but then, when the program reaches the save line above, it throws an OutOfMemory error (the Java heap size is not enough)!
I tried increasing the heap size to -Xmx512M, but it had no effect at all.
 
Has anybody else had a similar problem, or does anyone know what's causing these errors? Any help is very much appreciated.
 
Regards,
Abdu

[protege-owl] Re: OutOfMemory Error While trying to store instances into an OWL

Massimo Coletti
slkjso lskjfl wrote:

> Hello,
>  
> I have a text file with about 1750 rows where each row contains 6
> numbers. What I am trying to do is import all these numbers into an
> ontology I have. At first, I declared owlModel
>  OWLModel owlModel = ProtegeOWL.createJenaOWLModelFromInputStream(new
> FileInputStream(fileName));
>  
> and created 6 OwlNamedClasses. Then, I have a big while loop that
> reads the input file line by line, and for each line 6
> RDFIndividuals are created and their properties are set as well.
>  
> After the while loop, I have this:
> JenaOWLModel modelToWrite = (JenaOWLModel) owlModel;
> modelToWrite.save(new File(fileName).toURI(), FileUtils.langXMLAbbrev,
> errors);  
>  
>  
> The problem I have is, the program is fast when it runs at first, but
> as the number of read rows increases, it gets slower and slower till
> it eventually almost stops at row number 1642. I tried decreasing the
> number of  rows to be read to only 1500, but when the program reaches
> the save line (mentioned above), it gives me an outOfMemory error
> (Java heap size is not enough)!
> I tried increasing the heap size to -Xmx512M, but it had no effect at all.
Try higher values (I have needed as much as 1G in the worst cases), while checking that the memory used does not exceed the free memory on your system, in order to avoid abrupt performance degradation due to paging.
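For example, the heap can be raised when launching the importer (the jar names, classpath, and main class below are illustrative, not from the original post):

```
java -Xmx1024M -cp protege.jar:protege-owl.jar:. MyImporter input.txt
```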
>  
> Has anybody else had a similar problem, or does anyone know what's
> causing these errors? Any help is very much appreciated.
>  
> Regards,
> Abdu
Have you tried to use a database backend?

bye

Massimo


This e-mail and any attachments may contain confidential and
privileged information. If you are not the intended recipient,
please notify the sender immediately by return e-mail, delete this
e-mail and destroy any copies. Any dissemination or use of this
information by a person other than the intended recipient is
unauthorized and may be illegal.
-------------------------------------------------------------------------
To unsubscribe go to http://protege.stanford.edu/community/subscribe.html


[protege-owl] Re: OutOfMemory Error While trying to store instances into an OWL

slkjso lskjfl
I solved the problem by using the Jena APIs directly. With the Jena APIs, there was no need to increase the heap size or anything.
However, the reasoner is now running very slowly. I am using RacerPro, and the ontology contains over 12000 individuals. Is it normal for the reasoner to take 15 to 20 minutes to answer one simple nRQL query?
 
Also, can anybody provide more information on how to use a database as a backend for the individuals, and how to reason over them using RacerPro?
 
Regards,
Abdu

 
On 7/21/06, Massimo Coletti <[hidden email]> wrote: