Re: Object validation - Question

From: Andrus Adamchik (andru..bjectstyle.org)
Date: Sat Sep 20 2003 - 00:55:37 EDT

    On Tuesday, September 16, 2003, at 12:25 PM, Fabricio Voznika wrote:

    > Hi all,
    >
    > As I was implementing the auto validation thing, I ran into a
    > problem that I would like your opinion on. Doing the validation for
    > mandatory fields is OK, but when it comes to length, precision and
    > dates it becomes tricky. That is because one can customize the
    > Java->DB and DB->Java mappings. So using DbAttribute.getType()
    > doesn't do it, because a char could map to either a Boolean or a
    > String (or anything else). Using ObjAttribute.getType() doesn't
    > work either, because one may want to map a Date to a varchar(2000),
    > which has no upper or lower limit (sorry, couldn't come up with a
    > better example :-) ). Also, different databases may (and do) have
    > different boundary values (I'm not sure whether this is mapped
    > correctly to max length and precision in the Modeler?).
    > One way of doing this is to add a validate(ValidationResult) method
    > to the ExtendedType interface. Then every mapping could check
    > whether the property value can be correctly mapped to the DB. How
    > does that sound?

    Sorry, it took a while to reply...

    Yes, this is a non-trivial problem given the conversion between
    various data types. ExtendedType seems like a good place to take care
    of this. A quick look at the existing types shows that of the
    currently implemented type converters, only CharType and
    ByteArrayType need to check max length. I don't think we need any
    checks for the rest of the ExtendedTypes. So the API would probably
    be something like:

    ExtendedType.validate(ValidationResult result, Object value,
    DbAttribute attribute);
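
    For instance, CharType could then do something along these lines.
    This is just a rough, untested sketch - it assumes ValidationResult
    has an addFailure(..) style method and uses a SimpleValidationFailure
    class as a placeholder for whatever failure object we end up using;
    getMaxLength() and getName() are the existing DbAttribute accessors:

        public void validate(ValidationResult result, Object value,
                DbAttribute attribute) {

            // null/mandatory checks are handled separately
            if (value == null) {
                return;
            }

            String string = (String) value;
            int maxLength = attribute.getMaxLength();

            // only complain if the column actually declares a length
            if (maxLength > 0 && string.length() > maxLength) {
                result.addFailure(new SimpleValidationFailure(value,
                        "Value is longer than the " + maxLength
                                + " characters allowed by \""
                                + attribute.getName() + "\""));
            }
        }

    ByteArrayType would be essentially the same, just comparing the
    byte[] length against getMaxLength().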

    Thanks.
    Andrus


