CVS difference for ai12s/ai12-0366-1.txt
--- ai12s/ai12-0366-1.txt 2020/02/11 05:41:26 1.1
+++ ai12s/ai12-0366-1.txt 2020/03/07 03:03:15 1.2
@@ -662,3 +662,311 @@
with Pre => (if Is_Valid (L) then raise Program_Error);
****************************************************************
+
+From: Jean-Pierre Rosen
+Sent: Monday, February 10, 2020 11:58 PM
+
+> function "+" (L : Big_Integer) return Big_Integer
+> with Pre => (if Is_Valid (L) then raise Program_Error);
+
+ with Pre => (if {not} Is_Valid (L) then raise Program_Error);
+Presumably ;-)
+
+****************************************************************
+
+From: Randy Brukardt
+Sent: Tuesday, February 11, 2020 2:01 AM
+
+> with Pre => (if {not} Is_Valid (L) then raise Program_Error);
+> Presumably ;-)
+
+Just checking if anyone is reading carefully. ;-)
+
+****************************************************************
+
+From: Tucker Taft
+Sent: Tuesday, February 11, 2020 8:41 AM
+
+> I'm trying to organize this for an AI, and it seems underspecified and
+> underjustified...
+
+Steve and I can help fill in some of the missing rationale. We had long
+discussions on this topic internally. I can try to augment the !discussion
+section if you want to hand it off to me.
+
+****************************************************************
+
+From: Claire Dross
+Sent: Tuesday, February 11, 2020 8:46 AM
+
+>> For SPARK users: having two different types makes generating
+>> provable formulas much harder and not direct. Since we
+>> anticipate SPARK users to be "big users" of "big integers",
+>> not having a straightforward and optimal mapping is really
+>> not suitable.
+
+> This I don't understand, could you explain further? I would expect that a
+> subtype would make analyzing a value of a type easier, since it necessarily
+> restricts the set of values that are possible. And it has no other effect so
+> it shouldn't make anything harder.
+
+The point here is that in SPARK, uninitialized values are not modelled. Put
+another way, we check that objects are initialized by other means, without
+having to represent the invalid "uninitialized" value. So here, having to
+account for this value, which is nearly always an error, makes things more
+complicated for no gain.
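+
+(For illustration, a hedged sketch, not from the proposal: a SPARK tool's
+flow analysis rejects a read of an uninitialized object before the program
+runs, so no run-time "invalid" value needs to be modelled.)
+
+   procedure Use_Before_Init (Result : out Integer)
+     with SPARK_Mode
+   is
+      X : Integer;       --  never assigned
+   begin
+      Result := X + 1;   --  flow analysis error: "X" is not initialized
+   end Use_Before_Init;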
+
+I hope it is clearer now,
+
+****************************************************************
+
+From: Tucker Taft
+Sent: Tuesday, February 11, 2020 8:54 AM
+
+> The point here is that in SPARK, uninitialized values are not modelled.
+> Put another way, we check that objects are initialized by other means,
+> without having to represent the invalid "uninitialized" value. So here,
+> having to account for this value, which is nearly always an error, makes
+> things more complicated for no gain.
+
+Is it fair to say that SPARK doesn't model the uninitialized *value* (at
+run time), but it does model the uninitialized "state" (at compile time)?
+
+> I hope it is clearer now,
+
+This issue is pretty subtle! ;-)
+
+****************************************************************
+
+From: Claire Dross
+Sent: Tuesday, February 11, 2020 9:07 AM
+
+That is a good way to put it. Currently, SPARK does a simple flow analysis
+to check that objects are initialized, something that could be done at
+compile time.
+
+****************************************************************
+
+From: Tucker Taft
+Sent: Tuesday, February 11, 2020 9:49 AM
+
+I was including SPARK's flow analysis and proof within the notion of "compile
+time" ;-)
+
+****************************************************************
+
+From: Randy Brukardt
+Sent: Tuesday, February 11, 2020 4:49 PM
+
+>>> For SPARK users: having two different types makes generating
+>>> provable formulas much harder and not direct. Since we anticipate
+>>> SPARK users to be "big users" of "big integers", not having a
+>>> straightforward and optimal mapping is really not suitable.
+
+>>This I don't understand, could you explain further? I would expect
+>>that a subtype would make analyzing a value of a type easier, since it
+>>necessarily restricts the set of values that are possible. And it has no
+>>other effect so it shouldn't make anything harder.
+
+>The point here is that in SPARK, uninitialized values are not modelled.
+>Put another way, we check that objects are initialized by other means,
+>without having to represent the invalid "uninitialized" value. So here,
+>having to account for this value, which is nearly always an error, makes
+>things more complicated for no gain.
+
+OK, but there aren't any "uninitialized" values here: this is a private type,
+so the values are default initialized, and Ada has nothing to say as to
+whether that is OK or not. For many private types (think File_Type, for
+example), a default-initialized value is the normal case.
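+
+(A minimal sketch of that pattern, with hypothetical names: a private type
+whose default-initialized value is perfectly normal.)
+
+   package Counters is
+      type Counter is private;
+      function Value (C : Counter) return Natural;
+   private
+      type Counter is record
+         Count : Natural := 0;  --  default-initialized objects are valid
+      end record;
+      function Value (C : Counter) return Natural is (C.Count);
+   end Counters;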
+
+The only way to bring "uninitialized value" checking to bear on these private
+types (Big_Integer and Big_Real), or on any private type, is to apply some
+sort of non-Ada machinery ("magic" from an Ada perspective) to the types.
+
+And that's not a good thing, since people can and often do use the packages in
+the Standard as patterns for their own packages. But those programmers don't
+have the same "magic" available, and they will get burned.
+
+I suppose we could try to bring that "magic" to Ada in a defined way, but that
+has its own problems. (How much flow analysis is enough? There have been
+strong objections in the past to leaving this sort of thing
+implementation-defined.)
+
+It also occurs to me that not mandating the initial state means that an
+implementation which initializes everything to a known-good value (such as
+zero) would avoid needing to do any validity checking at all. That clearly
+would be the best implementation that meets the spec (the less work the
+better :-). I think we considered requiring that, but some people (Bob in
+particular, if my memory is correct) had strong objections to that sort of
+implementation.
+
+What we can't do is make the SPARK implementation easier at the cost of making
+the Ada implementation unsafe or unportable. One expects plenty of non-SPARK
+Ada users to use these packages once they're generally available.
+
+****************************************************************
+
+From: Claire Dross
+Sent: Wednesday, February 12, 2020 2:13 AM
+
+> What we can't do is make the SPARK implementation easier at the cost
+> of making the Ada implementation unsafe or unportable. One expects
+> plenty of non-SPARK Ada users to use these packages once they're generally
+> available.
+
+I don't agree that this makes the Ada implementation unsafe or unportable.
+These big scalars would really be handled like normal scalars, which I think
+is the easiest for everybody to understand. Most people don't have a
+Valid_Natural type with a predicate, right? If they care about
+initialization, they simply initialize all Natural objects at declaration,
+and I think it will be the same for these big scalar types.
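+
+(For illustration only, a hypothetical sketch of the kind of type meant
+above; these declarations are invented, not part of the proposal:)
+
+   --  A "validity via predicate" subtype, which most people never declare:
+   subtype Valid_Natural is Integer
+     with Dynamic_Predicate => Valid_Natural in Natural;
+
+   --  What people actually do when they care about initialization:
+   N : Natural := 0;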
+
+****************************************************************
+
+From: Tucker Taft
+Sent: Wednesday, February 12, 2020 4:36 PM
+
+The difference is that we have a run-time representation for "uninitialized"
+for such big integers, but not necessarily for normal integers. I agree with
+the notion that we don't want to "reify" this representation by providing,
+for example, a declared "Invalid_Value" constant. But we do want to allow Ada
+run-time checks to detect use of uninitialized big integers, even if a SPARK
+tool might have eliminated the possibility before run-time. It is somewhat of
+a delicate balance, and I don't think we should presume either answer is
+"obviously" correct. In any case, I have been handed the AI to add more
+rationale based on our internal AdaCore discussions. I may need some help in
+doing so!
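+
+(The run-time check in question is the corrected precondition from the top
+of this thread:)
+
+   function "+" (L : Big_Integer) return Big_Integer
+     with Pre => (if not Is_Valid (L) then raise Program_Error);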
+
+****************************************************************
+
+From: Claire Dross
+Sent: Thursday, February 13, 2020 3:25 AM
+
+> The difference is that we have a run-time representation for "uninitialized"
+> for such big integers, but not necessarily for normal integers. I agree
+> with the notion that we don't want to "reify" this representation by
+> providing, for example, a declared "Invalid_Value" constant. But we do
+> want to allow Ada run-time checks to detect use of uninitialized big
+> integers, even if a SPARK tool might have eliminated the possibility before
+> run-time. It is somewhat of a delicate balance, and I don't think we should
+> presume either answer is "obviously" correct.
+
+I agree with that, and I think it is nice that it is possible for the Ada
+run-time to detect uses of uninitialized values. I wouldn't want to prevent
+that. But I don't think this is the case for the new proposal.
+
+> In any case, I have been handed the AI to add more rationale based on our
+> internal AdaCore discussions. I may need some help in doing so!
+
+Sure! I would be happy to help, and probably Arnaud too.
+
+****************************************************************
+
+From: Randy Brukardt
+Sent: Thursday, February 13, 2020 5:53 PM
+
+> I agree with that, and I think it is nice that it is possible for Ada
+> run-time to detect uses of uninitialized values. I wouldn't want to
+> prevent that. But I don't think this is the case for the new proposal.
+
+It needs to be *required* to detect these sorts of abuses, because no Ada user
+ought to be dependent on the charity of an implementer to get safe behavior.
+The fact that Ada does not do such detection for scalar types is already a
+black eye for Ada (one that we can't fix because of compatibility issues).
+Making it worse isn't helping anything (perhaps other than the sales of static
+analysis tools).
+
+****************************************************************
+
+From: Claire Dross
+Sent: Friday, February 14, 2020 3:57 AM
+
+>It needs to be *required* to detect these sorts of abuses, because no Ada
+>user ought to be dependent on the charity of an implementer to get safe
+>behavior. The fact that Ada does not do such detection for scalar types is
+>already a black eye for Ada (one that we can't fix because of compatibility
+>issues). Making it worse isn't helping anything (perhaps other than the
+>sales of static analysis tools).
+
+You mean that if there is no default initial condition, then it is unsafe? I
+find that a bit categorical; it must be possible to enforce that the default
+value of Big_Integer is invalid without using this construct. After all,
+everybody has done without this aspect for years.
+
+I won't fight for this any more; all this back and forth is tiring. But
+honestly, we should not be surprised to get compatibility issues with SPARK
+when Ada takes features of SPARK and adds them, slightly modified, to Ada...
+Here the correct annotation for SPARK is Default_Initial_Condition => null,
+meaning that the type does not provide safe default initialization, but I
+know this is not Ada.
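+
+(A sketch of that SPARK annotation; the package name is hypothetical:)
+
+   package Big_Ints with SPARK_Mode is
+      type Big_Integer is private
+        with Default_Initial_Condition => null;
+      --  "null" tells SPARK that default initialization does not provide
+      --  a usable value, so flow analysis rejects any read of a
+      --  default-initialized Big_Integer before it is assigned.
+   private
+      type Big_Integer is record
+         Value : Integer;  --  deliberately left without a default
+      end record;
+   end Big_Ints;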
+
+****************************************************************
+
+From: Randy Brukardt
+Sent: Friday, February 14, 2020 10:21 PM
+
+...
+
+>>It needs to be *required* to detect these sorts of abuses, because no
+>>Ada user ought to be dependent on the charity of an implementer to get
+>>safe behavior. The fact that Ada does not do such detection for scalar
+>>types is already a black eye for Ada (one that we can't fix because of
+>>compatibility issues). Making it worse isn't helping anything (perhaps
+>>other than the sales of static analysis tools).
+
+>You mean that if there is no default initial condition, then it is unsafe?
+>I find that a bit categorical; it must be possible to enforce that the
+>default value of Big_Integer is invalid without using this construct.
+>After all, everybody has done without this aspect for years.
+
+Every abstraction (containers, files, etc.) in Ada to date has a defined
+initial state (written in English text for older abstractions). It seems
+bizarre to eliminate that principle at the same time that we finally have a
+way to write that state within the language.
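+
+(For instance, one could now write the state in the spec itself; a
+hypothetical sketch, not the actual wording of the proposal:)
+
+   type Big_Integer is private
+     with Default_Initial_Condition => Is_Valid (Big_Integer);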
+
+Whenever I've talked about Ada being designed to be safer than other
+languages (especially that the defaults are generally safe), the first
+objection is often that uninitialized objects aren't detected. And I don't
+have a very good rebuttal for that. In Ada 2012, we did add some mitigation
+with the Default_Value aspect, but it remains the case that the default is
+unsafe. Ada needs to be able to deal with uninitialized objects in order to
+handle interfacing (both hardware and software) and I/O safely, but there is
+no reason that needs to be the default for other code. We can't change that
+because of compatibility concerns. But I don't want to compound the problem
+by repeating the mistake for new features.
+
+>I won't fight for this any more; all this back and forth is tiring.
+
+Welcome to the ARG. :-) :-)
+
+I've tried to let this discussion go, but I'm trying to understand your
+position here, since to me it doesn't seem to make any sense for Ada.
+
+>But honestly, we should not be surprised to get compatibility issues
+>with SPARK when Ada takes features of SPARK and adds them, slightly
+>modified, to Ada... Here the correct annotation for SPARK is
+>Default_Initial_Condition => null, meaning that the type does not
+>provide safe default initialization, but I know this is not Ada.
+
+I agree that the slightly different goals of SPARK and Ada almost always are
+going to cause some issues if one tries to keep them completely "separate but
+equal".
+
+But the real question here is why anyone would want to define a private type
+with "unsafe default initialization". If you want to *prevent* default
+initialization, you can use unknown discriminants on the partial view (in
+that case, a default-initialized object is illegal). And if you do want
+default initialization, then surely it needs to have a defined meaning which
+equates to a defined state. The only reason to do that before Ada 2012 was
+that it was impossible to set a default for a scalar type, so if the
+completion of the private type was scalar, then there was no choice. But we
+fixed that in Ada 2012, so there is no longer any reason that one cannot
+initialize something by default. I can see having a way to model existing
+code, but that doesn't apply in this case.
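+
+(Hedged sketches of those two techniques; the names here are invented for
+illustration, not taken from the proposal:)
+
+   --  (1) Unknown discriminants: default initialization is impossible.
+   package Handles is
+      type Handle (<>) is private;
+      function Make return Handle;
+   private
+      type Handle is record
+         Id : Natural := 0;
+      end record;
+      function Make return Handle is (Id => 0);
+   end Handles;
+   --  "H : Handle;" is illegal; clients must write
+   --  "H : Handle := Handles.Make;"
+
+   --  (2) Ada 2012's Default_Value: a scalar completion can now have a
+   --  safe default.
+   type Count is range 0 .. 1_000 with Default_Value => 0;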
+
+If Ada could detect this sort of error (especially at compile-time, like SPARK
+does), then I wouldn't care as much, but it can't (and as noted above, it's
+impractical to do so now for scalars).
+
+****************************************************************