CVS difference for ai12s/ai12-0336-1.txt

Differences between 1.4 and version 1.5
Log of other versions for file ai12s/ai12-0336-1.txt

--- ai12s/ai12-0336-1.txt	2019/07/03 22:01:25	1.4
+++ ai12s/ai12-0336-1.txt	2019/07/19 04:34:42	1.5
@@ -1113,4 +1113,110 @@
 primarily in UTC time internally; I suspect that is the cause of this
 confusion.)
 
+****************************************************************
+
+From: Joey Fish
+Sent: Friday, June 7, 2019  8:37 PM
+
+>>>This is annoying in that the intent was that the default for Image would be
+>>>the local time (not UTC). Short of adding additional routines, though, I
+>>>don't see a way to do that and also change the RM meaning of Time_Offset.
+
+>>I think the image should be in local time, isn't that [messing around with 
+>>UTC and such] what the formatting/time_zones/conversion children of 
+>>Ada.Calendar are about?
+
+>True, but we also have an obligation to standardize existing practice when
+>possible. Do we want to break everyone's code in this area?
+
+Given the ARG's stance on backwards-compatibility, to the point of not 
+eliminating features which were absolutely mistakes, yes.
+
+One of the big failings of the SQL standard is that it made so many features 
+optional (in the name of accommodating existing implementations) that the 
+standard is useless with respect to portability for any non-trivial 
+construction.
+
+Given the ARG's goal of correctness, yes.
+
+That implementations made a mistake is regrettable, and something that should 
+be clarified to avoid in the future, but that is not the fault of the 
+standard, is it?
+
+(I mean, if some number of people misread "you shall not paint your truck 
+red", dropping the 'not' and painting the truck red, is that the fault of 
+the instructions?)
+
+>I'm suggesting that we split the baby by defining Local_Image to give the
+>Calendar result and leave the existing Image to do what it does on all of
+>the existing compilers except mine. (And note that Image is already in the
+>"formatting/time_zones/conversion children of Ada.Calendar"; we're not
+>talking about any change to Ada.Calendar.) It's not perfect but it is the
+>least disruptive option. (It also would allow Ada runtimes to represent Time
+>primarily in UTC time internally; I suspect that is the cause of this
+>confusion.)
+
+Is there anything preventing that now?
+
+I mean it could be represented internally as a monotonic time-stamp, with 
+adjustments done in the image function, right?
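+
+A minimal sketch of the scheme being suggested (hypothetical names, not from 
+any AI text or existing run-time): Time held as a monotonic tick count from a 
+fixed UTC epoch, with the zone adjustment applied only when an image is built.
+
+```ada
+package Sketch_Calendar is
+   type Time is private;
+   function Clock return Time;
+   --  The offset is consulted here, and only here:
+   function Image
+     (Date : Time; Offset_Minutes : Integer := 0) return String;
+private
+   --  Nanoseconds since some fixed UTC epoch; with this representation,
+   --  comparison and subtraction are single integer operations.
+   type Time is range -2**63 .. 2**63 - 1;
+end Sketch_Calendar;
+```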
+
+***************************************************************
+
+From: Tucker Taft
+Sent: Saturday, June 8, 2019  7:58 AM
+
+>>... (It also would allow Ada runtimes to represent Time
+>>primarily in UTC time internally; I suspect that is the cause of this
+>>confusion.)
+
+>Is there anything preventing that now?
+
+No, in fact most Ada run-times on Unix-like systems already do exactly that.
+
+>I mean it could be represented internally as a monotonic time-stamp, with 
+>adjustments done in the image function, right?
+
+Yes, that is how it is done in most Ada run-times that I have ever worked on.
+
+***************************************************************
+
+From: Randy Brukardt
+Sent: Monday, June 10, 2019  11:40 PM
+
+>Yes, that is how it is done in most Ada run-times that I have ever 
+>worked on.
+
+That doesn't work well on Windows, however, as Windows gives 
+Year:Month:Day:Hr:Min:Sec, and converting that into a "monotonic" value is 
+rather complicated because of leap-years. Doing that on every call to Clock
+is potentially too expensive. (There's a trade-off between the cost of 
+Clock/Split and the cost of ordering operators; which makes the most sense 
+depends on how often Clock is called vs. the operators.)
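+
+For what it's worth, the Year/Month/Day-to-count direction is fixed-cost 
+arithmetic rather than a loop over years; a sketch (hypothetical helper, not 
+from any actual compiler) using the usual 400-year-era formulation, where the 
+leap-year rules fall out of the /4 and /100 terms:
+
+```ada
+--  Days from 1970-01-01 for a proleptic Gregorian date.  Ada's "/"
+--  truncates toward zero, which is what the Era adjustment assumes.
+function Days_Since_Epoch (Year, Month, Day : Integer) return Integer is
+   Y   : constant Integer := (if Month <= 2 then Year - 1 else Year);
+   Era : constant Integer := (if Y >= 0 then Y else Y - 399) / 400;
+   YoE : constant Integer := Y - Era * 400;            --  [0 .. 399]
+   DoY : constant Integer :=
+     (153 * (Month + (if Month > 2 then -3 else 9)) + 2) / 5 + Day - 1;
+   DoE : constant Integer := YoE * 365 + YoE / 4 - YoE / 100 + DoY;
+begin
+   return Era * 146_097 + DoE - 719_468;
+end Days_Since_Epoch;
+```
+
+The multiplications and divisions are by constants, so the per-call cost is 
+small and constant; the real trade-off Randy describes remains, since even a 
+cheap conversion is paid on every Clock call.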
+
+***************************************************************
+
+From: Joey Fish
+Sent: Tuesday, June 11, 2019  10:24 AM
+
+> That doesn't work well on Windows, however, as Windows gives
+> Year:Month:Day:Hr:Min:Sec, and converting that into a "monotonic" value is
+> rather complicated because of leap-years. Doing that on every call to Clock
+> is potentially too expensive.
+
+Indeed, which is why having monotonic time as the default/internal time 
+format makes more sense. (Though I suppose you could argue that the PC 
+real-time clock, which has been part of the PC since its inception, is 
+dedicated hardware with an internal wall-clock format that handles all the 
+messy stuff like leap years and daylight saving time, and thus that it 
+provides a natural internal/default for wall-clock date/time handling.) 
+That overlooks, though, that (a) formatting changes depending on usage 
+[e.g. "0800" vs. "8:00 am"], and (b) things like daylight saving time are 
+put in place by legislation, and thus can be altered far more easily than 
+the silicon.
+ 
+>(There's a trade-off between the cost of
+>Clock/Split and the cost of ordering operators; which makes the most sense
+>depends on how often Clock is called vs. the operators.)
+
+I wonder if anyone's done any real analysis of clock vs. operators usage rates.
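+
+(For reference, a local-time image is already expressible with the 
+language-defined Ada.Calendar.Formatting and Ada.Calendar.Time_Zones 
+subprograms by passing the offset explicitly; a sketch:)
+
+```ada
+with Ada.Text_IO;
+with Ada.Calendar;            use Ada.Calendar;
+with Ada.Calendar.Formatting;
+with Ada.Calendar.Time_Zones; use Ada.Calendar.Time_Zones;
+procedure Show_Local_Image is
+   Now : constant Time := Clock;
+begin
+   --  Image's Time_Zone parameter defaults to 0, which is the bone of
+   --  contention in this thread; passing the current UTC offset gives
+   --  the local-time image.
+   Ada.Text_IO.Put_Line
+     (Ada.Calendar.Formatting.Image
+        (Now, Time_Zone => UTC_Time_Offset (Now)));
+end Show_Local_Image;
+```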
+
 ***************************************************************

Questions? Ask the ACAA Technical Agent