Re: Huge .owl files

7 messages
Re: Huge .owl files

Matt Williams-9
Try the nciCancer Ontology: http://swserver.cs.vu.nl/partitioning/NCI/ 
(32.8 MB)

OpenCyc: http://www.cyc.com/2004/06/04/cyc (700 MB)

Matt

Cláudio Fernandes wrote:
> Hi,
>
> Can someone point me to some huge OWL/RDF files? I'm writing an OWL parser
> with different tools, and I'd like to benchmark them all with some really big files.
>
> Thanks in advance,
-------------------------------------------------------------------------
To unsubscribe go to http://protege.stanford.edu/community/subscribe.html


Re: Huge .owl files

Raj M Verma
Hi list,

When I tried to load this file into my Protege 3.1, it gave an out-of-memory error after loading triple 170000:

java.lang.OutOfMemoryError: Java heap space

Is this because Protege can't handle such big files, or is there some other reason? I have 2 GB of RAM in my Fujitsu/Siemens Celsius with a Pentium(R) D CPU at 2.80 GHz, so I don't think my system resources should be the problem.

On the other hand, I'm using FMA, a 108 MB file, without any problem. I guess Protege has no trouble with FMA because it uses the MySQL database engine; is that right?

So is there a way to use this nciCancer ontology with Protege 3.1 without any memory problem?

Thanks,
Raj.

On 25/01/06, Matt Williams <[hidden email]> wrote:
> Try the nciCancer Ontology: http://swserver.cs.vu.nl/partitioning/NCI/ (32.8 MB)
>
> OpenCyc: http://www.cyc.com/2004/06/04/cyc (700 MB)


Re: Huge .owl files

Matthew Horridge
Hi Raja,

> When I tried to load this file into my Protege 3.1, after loading triple 170000 it gives an out of memory error...
>
> java.lang.OutOfMemoryError: Java heap space

You just need to allocate more memory when starting Protege. The Java VM option is -Xmx1500M for a max heap size of 1.5 GB. By the way, you should really get the latest beta.
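For example, a launch command with that option looks like the sketch below (the jar name "protege.jar" is a placeholder; the actual jar or launcher varies by install and platform):

```shell
# Build a Protege launch command with a 1.5 GB max heap.
# "protege.jar" is a placeholder for your install's jar file.
HEAP=1500M
echo "java -Xmx${HEAP} -jar protege.jar"
```

On Windows installs that use the InstallAnywhere launcher, the equivalent setting lives in the Protege.lax file instead of the command line.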

Cheers,

Matthew





Re: Huge .owl files

Matt Williams-9
Raja,

You can certainly load the nciCancer.owl file in Protege 3.2 beta.
I set the heap size to ~700 MB and it loaded fine; it takes about 90 seconds.

Incidentally (Matthew), is there an approximate mapping between .owl
file size / number of triples and memory requirement?

HTH,

Matt


--
Dr. M. Williams MRCP(UK)
Clinical Research Fellow,
Cancer Research UK
+44 (0)7834 899570


Re: Huge .owl files

Ronnie Valkky
In reply to this post by Raj M Verma
Hi Raj,

Try this after exiting Protege:

1. Find the file %ProtegeHome%\Protege.lax, where %ProtegeHome% is the folder where you have Protege installed; for example, I have it in F:\sw\Protege3.2beta.

2. Change the heap size:

#   LAX.NL.JAVA.OPTION.JAVA.HEAP.SIZE.INITIAL
#   -----------------------------------------
#   initial heap size 10000000
#   increased to     100000000
lax.nl.java.option.java.heap.size.initial=100000000   <- just an example; works for me

3. Restart Protege.

This might also help with your Protege version; tune the value for your setup.

Good luck,
Ronnie
----- Original Message -----
Sent: Thursday, January 26, 2006 12:30 PM
Subject: [protege-owl] Re: Huge .owl files



Re: Huge .owl files

Raj M Verma
Thanks, Ronnie and others; it works now.

Re: Huge .owl files

Matthew Horridge
In reply to this post by Matt Williams-9
Hi Matt,

> Raja,
>
> You can certainly load the nciCancer.owl file in Protege 3.2b.
> I set the heap size to ~700Mb, and it went fine. Takes about 90 secs.
>
> Incidentally, (Matthew) is there (an approximate) mapping between .owl
> text file size / number of triples and memory requirement?

Interesting question, but I'm not sure of the answer (apart from "it
will vary" :) ). I know that it will depend on the type of triples:
for example, rdfs:subClassOf triples get represented in Protege-Core
by adding :DIRECT-SUBCLASSES and :DIRECT-SUPERCLASSES slots, so there
are three "triples" for every rdfs:subClassOf triple (the same goes
for rdf:type). Other triples, e.g. owl:disjointWith, only get
represented in their native OWL form.
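As a back-of-the-envelope sketch of that expansion (the 3x factor comes from the explanation above, but counting triples by grepping lines of RDF/XML is my own crude approximation, not Protege's actual accounting), one could estimate the internal entry count like so:

```shell
# Rough estimate of Protege-Core internal entries for an RDF/XML file:
# rdfs:subClassOf and rdf:type triples expand to ~3 entries each,
# other triples stay as 1. Line-counting with grep is a crude proxy
# for real triple counting; a proper RDF parser would be more accurate.
FILE=sample.owl
SUB=$(grep -c 'rdfs:subClassOf' "$FILE")
TYP=$(grep -c 'rdf:type' "$FILE")
ALL=$(grep -c '<' "$FILE")          # lines with markup, proxy for triples
OTHER=$((ALL - SUB - TYP))
echo $((3 * (SUB + TYP) + OTHER))
```

Running this over a file and multiplying by a per-entry byte cost would give only a very loose lower bound on the heap needed, which matches the "it will vary" caveat.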

Cheers,

Matthew

