R: Re: trouble in testing consistency


R: Re: trouble in testing consistency

emanu.storti@tiscali.it
Thank you for your reply, I'm trying to get used to thinking in terms of
the Open World Assumption.

I'm wondering if it is possible to "close" an instance, that is, whether
an instance can be constrained to have properties with a specific
cardinality (I mean at the instance level, rather than at the class level).
An example: m3 could be described with a datatype property
maxNumberOfTask = "2": so it should have only 2 tasks connected (that is,
the cardinality of the "specifies" property must be exactly 2).

It's different from saying that its class "specifies exactly 2 tasks",
because I'd like to specify this from instance to instance.

My class "Classification_Method" should be "the set of all instances
that "specifies" the task "classification" AND that have a "specifies"
property with a cardinality defined by the instances themselves, as the
value of maxNumberOfTask".

Classification_Method =
(specifies_task HAS "classification") AND (specifies_task EXACTLY (the
value of the property maxNumberOfTask, different from instance to
instance))

My question is: is it feasible to restrict a class using the value of
its instances' properties? It would mean that an instance must be
semantically "self-coherent" to join that class (or, in other words,
that everything about that instance has been said).

The last question is: I would like to define concepts and relations
using a formalism, in order to have a high-level representation of the
ontology and also to describe it in my thesis with a human-readable
syntax (unlike RDF/XML).

Is there a standard logic language to do this? Maybe I should use the
standard DL representation?
Where can I find resources to learn how to use the syntax?

Thank you again,
Emanuele Storti

>
>> Then, to test the inconsistency, I added this
>> connection:
>> "c1" uses "m3".
>> This should not be correct, because an
>> instance of classification_algorithm, according to the restriction,
>> should "use" an instance of a method which "specifies" the
>> "classification" task, and this is not the case, because c1 uses m3, a
>> method which "specifies" the "feature_selection" task.
>
> This is an example of the open world assumption.  You have actually
> said very little about c1 at this point.  Pellet correctly figures
> that a1 could be a member of
>
>     "uses SOME (Method AND (specifies HAS classification))"
>
> There are two obvious ways that this could happen.  The simplest is
> that perhaps m3 also specifies classification.  Nowhere did you rule
> this out, so it remains a possibility.  Also, perhaps c1 uses some
> other method than m3, and that other method specifies classification.
> Finally, there is another possibility that might not occur to you
> immediately.  It is possible that classification and feature_selection
> are actually two different names for the same individual.
>
> One way to get an inconsistency would be to add the following assertions:
>
>     uses is functional
>     specifies is functional
>     the individuals classification and feature_selection are distinct
>
> If you do this, then Pellet will indeed generate an inconsistency.
>
>> In addition, if I delete the connection between c1 and m3 altogether
>> (so c1 has no connection to any Method), Protégé shows a red square
>> around the "uses" property when I focus on the c1 instance (pointing
>> out that there is an error).
>
> To my way of thinking, this is a problem with Protégé.  What it is
> trying to suggest to you is that - since you did have an assertion
> implying that c1 would have a "uses" value - perhaps you should
> give that value a name and include the assertion.  But the reality is
> that it is perfectly acceptable not to include a "uses" value for c1.
> Indeed, Pellet will simply figure out that there must be an individual
> there but not be able to determine whether that individual is the same
> as any other individual in your ontology.
>
> -Timothy
>
> On Oct 1, 2008, at 3:25 AM, [hidden email] wrote:
>
>> Hello, I'm a university student in Computer Engineering.  I'm building
>> an OWL ontology about algorithms with Protégé 3.3.1.
>> My problem is to verify whether my ontology is consistent or not.
>>
>> I have this simple class structure:
>>
>> owl:Thing
>> -Algorithm
>> ---classification_algorithm (with 1 instance, named "a1")
>> ---clustering_algorithm
>>
>> -Method
>> ---classification_method (with 1 instance: "m1")
>> ---clustering_method (with 1 instance: "m2")
>> ---da_method (with 1 instance: "m3")
>>
>> -Task
>> (with 2 instances: "classification" and "feature_selection")
>>
>> Then, I defined 2 object properties in this way:
>> uses (domain: Algorithm, range: Method) == an algorithm uses a method
>> specifies (domain: Method, range: Task) == a method specifies a task
>>
>> And the connections are:
>>
>> "m1" specifies "classification"
>> "m3" specifies "feature_selection"
>>
>> Well, now I'd like to restrict the property "uses" for the class
>> "classification_algorithm", saying that a classification algorithm
>> MUST use at least 1 method which specifies "classification".
>> Written in a formal way, in the "asserted conditions" tab of the
>> classification_algorithm class I wrote:
>>
>> "uses SOME (Method AND (specifies HAS classification))"
>>
>> as a NECESSARY condition.
>>
>> 1st question: is it syntactically wrong?
>> 2nd question: does it express the correct meaning?
>>
>> Then, to test the inconsistency, I added this connection:
>> "c1" uses "m3".
>> This should not be correct, because an
>> instance of classification_algorithm, according to the restriction,
>> should "use" an instance of a method which "specifies" the
>> "classification" task, and this is not the case, because c1 uses m3, a
>> method which "specifies" the "feature_selection" task.
>>
>> Anyway, Pellet says that the ontology is consistent.
>> In addition, if I delete the connection between c1 and m3 altogether
>> (so c1 has no connection to any Method), Protégé shows a red square
>> around the "uses" property when I focus on the c1 instance (pointing
>> out that there is an error).
>> Even in this case, the reasoner says that the ontology is consistent.
>>
>> 3rd question: What's wrong?  Doesn't consistency cover this kind of
>> mistake?
>> 4th question: What kind of ontology test can show me that there is an
>> error?
>>
>> Thank you in advance,
>> Emanuele S.
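
Timothy's recipe above translates into OWL roughly as the following Turtle
(a sketch with illustrative prefixes and IRIs, not the poster's actual file):

```
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix :    <http://example.org/algo#> .

# Each algorithm uses at most one method; each method specifies at most one task.
:uses      a owl:FunctionalProperty .
:specifies a owl:FunctionalProperty .

# The two tasks are distinct individuals (otherwise they could be the same).
:classification owl:differentFrom :feature_selection .
```

With these axioms, c1's single "uses" filler must be m3, m3's single
"specifies" filler must be feature_selection, and the restriction's demand
for a classification-specifying method can no longer be satisfied, so
Pellet reports an inconsistency.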




_______________________________________________
protege-owl mailing list
[hidden email]
https://mailman.stanford.edu/mailman/listinfo/protege-owl

Instructions for unsubscribing: http://protege.stanford.edu/doc/faq.html#01a.03 

Re: R: Re: trouble in testing consistency

Thomas Russ

On Oct 3, 2008, at 4:05 AM, [hidden email] wrote:

> Thank you for your reply, I'm trying to get used to thinking in terms of
> the Open World Assumption.
>
> I'm wondering if it is possible to "close" an
> instance, that is, whether an instance can be constrained to have
> properties with a specific cardinality (I mean at the instance level,
> rather than at the class level).
> An example: m3 could be described with a
> datatype property maxNumberOfTask = "2": so it should have only 2 tasks
> connected (that is, the cardinality of the "specifies" property must be
> exactly 2).
>
> It's different from saying that its class "specifies
> exactly 2 tasks", because I'd like to specify this from instance to
> instance.

You would do that by making the instance belong to a type with that  
restriction.  You can create an anonymous type for this purpose if you  
like.  That gives you the restriction at the instance level:

   m3  type  (specifies exactly 2)
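
In RDF/Turtle, such an instance-level restriction can be asserted by typing
the individual with an anonymous restriction class (a sketch; the prefixes
and IRIs are illustrative, not taken from the poster's ontology):

```
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix :    <http://example.org/algo#> .

# m3 belongs to the anonymous class of things related by "specifies"
# to exactly 2 individuals.
:m3 a [
    a owl:Restriction ;
    owl:onProperty :specifies ;
    owl:cardinality "2"^^xsd:nonNegativeInteger
] .
```

Even so, under the open world assumption this does not "close" m3: a
reasoner will infer that unnamed fillers exist rather than flag missing
ones.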

> My class "Classification_Method" should be "the set of all
> instances that "specifies"  the task "classification" AND that have
> "specifies" property with a cardinality defined by the instances
> themselves, as the value of maxNumberOfTask.

Well, you can't use the value of a property as part of the restriction  
definition in OWL.  This was possible in the Loom language, using some  
of the first-order escape forms and also relying on closed world  
reasoning.  But in OWL, you can't do it.

> Classification_Method=
> (specifies_task HAS "classification") AND (specifies_task EXACTLY (the
> value of the property maxNumberOfTask, different from instance to
> instance))

This is not expressible using OWL.

>
>
> My question is: is it feasible to restrict a class using
> the value of its instances properties? It would mean that an instance
> must be semantically "self-coherent" to join that class (or, in other
> words, everything about that instance  has been said).

Well, that seems to rely on closed-world reasoning, which is also not  
supported in OWL.

> The last
> question is: I would like to define concept and relations using a
> formalism, in order to have a high level representation of the  
> ontology
> and also to describe it in my thesis with a human-readable syntax
> (unlike RDF/XML).
>
> Is there a standard logic language to do this? Maybe
> should I use DL standard representation?
> Where can I find any
> resources to learn how to use the syntax?

Well, there are various alternative syntaxes for OWL, such as the  
Manchester Syntax, Turtle Syntax, Compact OWL, etc.

Whether that will do what you want is an open question.
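
As a concrete illustration, the necessary condition from earlier in this
thread would read roughly as follows in Manchester syntax (an approximate
rendering, not generated by any tool):

```
Class: classification_algorithm
    SubClassOf:
        uses some (Method and (specifies value classification))
```

Manchester syntax is also what Protégé 4 uses in its class-expression
editor, which makes it a reasonable human-readable choice for a thesis.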




Closed world/Open world

JMiller

Hello everyone,

This is a follow-up to a statement (from Thomas Russ)  in an earlier post:

"Well, that seems to rely on closed-world reasoning, which is also not  
supported in OWL."

I have seen this statement for several years now, but I am curious--what is it about OWL in particular that would prevent a reasoner from treating the given ontology and data in a closed-world fashion?  I don't understand how OWL itself enforces open world reasoning; it seems that a reasoner could implement a mode that treats the data as complete (closed).  Why is this not possible?


Jim


Re: Closed world/Open world

Timothy Redmond

> I have seen this statement for several years now, but I am curious--
> what is it about OWL in particular that would prevent a reasoner  
> from treating the given ontology and data in a closed-world  
> fashion?  I don't understand how OWL itself enforces open world  
> reasoning; it seems that a reasoner could implement a mode that  
> treats the data as complete (closed).  Why is this not possible?

Two things.  First, the job of a reasoner is determined by the meaning
of the language over which it is reasoning.  So for OWL, much of the
work has been in defining the semantics ([1] for OWL 1.0 and [2] for
OWL 2.0).  This existing work gives a semantics that has an open world
behavior.  If you want to create a reasoner that does closed world
reasoning, then you would have to introduce another semantics for the
language.  There are some ideas on how to introduce closed world
capabilities to the OWL language, but as far as I know we are not close
to adding these capabilities to the language.

Second, I suspect that doing a good job of a closed world semantics is
more difficult than the open world assumption.  The semantics of OWL,
for example, is pretty simple.  (Though there is a great deal of
attention to detail that is needed to meet all the different
requirements on the language.)  But the semantics of a logic like
frame logic [3], which attempts to capture closed world assumptions
and non-monotonic reasoning, is much more complicated.

-Timothy


[1] http://www.w3.org/TR/owl-semantics/
[2] http://www.w3.org/TR/owl2-semantics/
[3] http://www.cs.umbc.edu/771/papers/flogic.pdf

On Oct 5, 2008, at 6:11 PM, James A Miller wrote:

>
> Hello everyone,
>
> This is a follow-up to a statement (from Thomas Russ)  in an earlier  
> post:
>
> "Well, that seems to rely on closed-world reasoning, which is also not
> supported in OWL."
>
> I have seen this statement for several years now, but I am curious--
> what is it about OWL in particular that would prevent a reasoner  
> from treating the given ontology and data in a closed-world  
> fashion?  I don't understand how OWL itself enforces open world  
> reasoning; it seems that a reasoner could implement a mode that  
> treats the data as complete (closed).  Why is this not possible?
>
> Jim


Re: Closed world/Open world

Thomas Russ
In reply to this post by JMiller

On Oct 5, 2008, at 6:11 PM, James A Miller wrote:

>
> Hello everyone,
>
> This is a follow-up to a statement (from Thomas Russ)  in an earlier  
> post:
>
> "Well, that seems to rely on closed-world reasoning, which is also not
> supported in OWL."
>
> I have seen this statement for several years now, but I am curious--
> what is it about OWL in particular that would prevent a reasoner  
> from treating the given ontology and data in a closed-world  
> fashion?  I don't understand how OWL itself enforces open world  
> reasoning; it seems that a reasoner could implement a mode that  
> treats the data as complete (closed).  Why is this not possible?

It is certainly possible to write reasoners like that.

Other description logics (*) have had reasoners that support closed
world assumptions.  But that does take you into a different semantic
arena (as Timothy Redmond indicates).  One aspect is that closed world
is generally treated as an ASSUMPTION rather than a hard fact, so you
have to introduce a notion of defeasibility into the language.  It
also requires that you have semantics for non-monotonic reasoning,
since adding new information can cause previous conclusions to have to
be revised.

For example, if you had the following classes:

    2-door-car <=>  car and exactly 2 has-door
    4-door-car <=>  car and exactly 4 has-door

and then asserted

    car-1 has-door door-1
    car-1 has-door door-2

you could satisfy 2-door-car.  Adding additional assertions

    car-1 has-door door-3
    car-1 has-door door-4

would require retracting the previous type classification (2-door-car)  
and adding 4-door-car in its place.  (Assuming, of course that all of  
the doors are different from each other).
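
The retraction step can be mimicked in a few lines of ordinary Python (a
toy illustration of closed-world classification only; the names are made
up, and real DL reasoners operate on axioms rather than counters):

```python
# Closed-world toy: a car's type is judged from the doors currently on
# record (the record is assumed complete), so adding facts can overturn
# an earlier conclusion - the non-monotonicity described above.
has_door = {"car-1": {"door-1", "door-2"}}

def classify(car):
    """Classify from the recorded doors, read as a complete list."""
    n = len(has_door.get(car, set()))
    if n == 2:
        return "2-door-car"
    if n == 4:
        return "4-door-car"
    return "unclassified"

print(classify("car-1"))  # 2-door-car

# New assertions arrive; the old classification must be retracted.
has_door["car-1"] |= {"door-3", "door-4"}
print(classify("car-1"))  # 4-door-car
```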

-Tom Russ

(*) Loom http://www.isi.edu/isd/LOOM is one such system that I have  
been involved in the development of.  It has closed-world and other  
practical reasoning strategies, but lacks the clearly articulated  
semantics of OWL.


Re: Closed world/Open world

JMiller

Timothy, Tom,

Thanks for your answers.  They are very helpful.

Jim




