All Activity

  2. Dear Colleagues, Can anyone explain the difference between NOD and NWD as they are defined in the DCS? NOD - (numeric;) Refracted object distance at near viewing point (meters). NWD - (numeric;) Working object distance at near viewing point (meters). We would like to use whichever best describes the reading distance measured by the ECP. Thank you in advance for your support. Stay safe, Haim S. - Shamir Optical Ind.
  3. No. The ddf file (which can be opened with a text editor, e.g. Notepad++) only contains GoNoGo results. Note that the content of the ddf file can also be uploaded via TCP/IP. This upload string can contain additional labels such as JOB, OPC, MBASE, OPERID, TIME...
  4. According to the standard, design deviation datasets are contained in files which have the extension "DDF". My question: can the DDF file contain additional labels aside from the DDFMT dataset? For example, labels such as JOB, OPC, MBASE, OPERID, TIME...
  5. Hi Robert, It seems that my comment about TOL* vs TOLV* has not been adopted; could you double-check please? The tolerance records TOLADD, TOLCYL, TOLDIA, ... in Table A.1a should IMHO be integer values (0;1;9) and not (min|max;). The min|max; values are for TOLVADD, TOLVCYL, TOLVDIA, ... There is also a typo for INSMOD: CCF appears twice as a possible INSMOD value; I think that for a convex (anterior) surface, infinity on axis, it should be CXI instead of CCF.
  6. Hi, Yaron, I apologize for being so tardy in replying. I'll go through your suggestions one by one. 1. I'm indifferent to the ordering of the revision history, but since it has been done newest first, it would take some work to revise it. Do you want to volunteer to do it, if the group approves of doing it that way? 2. a) I also like "enumerated" better than "specified"; although it might be possible for only one literal value to be specified for a particular record's field value(s), in which case "enumerated" would be inappropriate. "Specified" is the more general of the two terms. However, we're splitting hairs here. "Enumerated" would be OK with me. b) We have always tried to enforce a measure of "terseness" in various text features. Just because a particular requirement can be relaxed when necessary doesn't mean that it's "effectively" a nullity. 3. Limited is there for a reason (namely, to limit the lengths of certain elements). My preference is to allow the constraint to be relaxed on an exception basis - if that's actually necessary. I don't recall why we added the "unless otherwise noted in the record definition." However, as I wrote earlier, I don't think that allowing an "override" eliminates its utility when it's not explicitly relaxed. 4. I'll ask Paul to fix the reference errors (he's still on board for that). 5. Same as (4), Paul will fix the revision history. 6. It was not my intention (or anyone else's, as far as I know) that this revision would change the location of an engraving. I agree that that is unacceptable.
  7. I am a Staff Author at a marketplace for an on-demand telecom workforce, extending from field engineers to high-level network engineers, project managers, and network architects in 146 nations.

  8. Unfortunately, that is just the nature of things. Typically, title pages and tables of contents are not numbered, so you will end up with discrepancies between the PDF page numbering and the document and TOC page numbers. As long as the TOC matches the page numbers in the document, I hope people can find stuff. I don't really think that is something we can fix.
  9. @SJO - ENGLOC seems not to have been changed from previous versions; it's still type Literal rather than changed to Literal[;] as you suggested. Are you looking at the same document that has been posted here?
  10. Hi, A small 'flaw' in my point of view is that the indicated page numbers do not correspond with the PDF page numbers. For example, the page where "1" is printed (FOREWORD) is page 4 of the PDF document. This is sometimes confusing when referring a colleague or customer to a page. Point 5.12 POWER MAP DATASETS: We recently added the labels M/O/H for the fourth field. We may need to add them in the DCS 3.13 Standard => @Thomas Zangerle? I'm glad to see that my remarks about ENGLOC and TOLx/TOLVx have been accepted 🙂
  11. A few comments on draft ID 30. None I see as extremely critical, except for the last (which is therefore very much an objection rather than a comment/suggestion/question). I'm basing this on the Revision History list, assuming anything changed in 3.13 so far is listed there; I did not read through the whole document. Regarding the Revision History itself - I think the revision order (newest on top) is the wrong choice here. This ordering makes sense for things like blogs and on-page web updates, but in a part of a closed document, older to newer would make more sense to me. Literal data - The updated definition (in 3.3 and ) is a substantial improvement, and much clearer. That said: - 1 - There is some difference between the more general definition in 3.3.2 and the more specific one in . The one significant thing that was done in , and that I think would be an important addition to 3.3.2 as well, is that all possible values are to be enumerated in the record definition. Maybe change 3.3.2 from "...having permissible values specified in the standard" to something like "...having permissible values enumerated in the standard"? - 2 - Since the standard is now clearer that all values are enumerated, is there a point in specifying a maximum length? Especially when it both specifies a maximum length of 12 and allows the definition of each usage to go beyond 12? This is effectively no length limit when defining a Limited record field (since it is explicitly allowed to override 12), and there is no point in treating maximum length as an issue for anyone using a Limited record field (if all values are enumerated, then specifying a length is meaningless; the maximum length is always the length of the longest enumerated value). The maximum length in the definition of Limited should either be waived completely, or be specified in a way that doesn't allow a record definition to override it.
(I'd prefer the former, but either option is better, and internally consistent, compared to the current, effectively "there is a maximum length and it can be ignored by everyone".) Reference Coordinate System for Backside Engraved Lenses - Previous issues with terminology and usage aside, just a quick note that has an internal reference error, I assume to the related Figure 2 below the section, stating "Error! Reference source not found." in the middle of the paragraph. Removed new DEFAULT label - The Revision History lists having added a new label DEFAULT, which is not actually added anywhere in the document. I see that it was there in an earlier draft, so I assume it was decided to drop the label, and it's just the Revision History that needs to be adjusted to match. ENGMARK coordinate system - We talked about this in the past; the changed definition of the ENGMARK coordinate system origin is highly problematic. From the previous revision I see that the definition changed from trying to define it on "black center" to trying to define it on "block center"; this is irrelevant to the practical objections and has the exact same problem. Again, there are plenty of labs who, for quite a lot of years now, rely on the fact that the coordinate source for engraving (using ENGMARK or ENGMASK records) is ER, not SB (the reference point for an Engraving operation being the Engraving Reference). Engraving is being decentered/offset from the block center in plenty of labs by using SBBC__ + BCER__ records. Using the exact same set of job records, a change from 3.12 to 3.13 should absolutely not cause the engraving to move. It even more certainly should not cause the engraving to move to where the lab does not expect, or want, it to be. This definition expansion isn't clarifying things, it's changing things, and in a way that will have clear, significant, and unwanted impacts on labs. I absolutely don't see any benefit whatsoever to doing it, just many downsides.
Why change the origin of an already widely-in-use label?
  12. Hello, everyone. I hope you are safe and well. With any luck, you'll find a link on this message to a draft of the proposed version 3.13 of the Data Communications Standard. The first draft posted has a draft ID of 30; if any changes are made, and posted here, the draft ID will be incremented. Please download it, review it, and post any questions, comments, or objections on this forum, in this topic. I do not expect task force meetings to take place at the next Vision Expo West. We should try to discuss any issues that you may have with the draft on this forum; if it becomes necessary, we can arrange Zoom or GoToMeetings. I would like to be able to approve this version prior to our next meeting, which I expect to occur at Vision Expo East. DCS v3.13_DRAFT-030.pdf
  13. Hi All, I was informed today that due to financial challenges related to COVID-19 I'm no longer going to be employed by The Vision Council. Tomorrow 5/6/20 will be my last day. I wanted to say that it's been a pleasure working on these committees with everyone, not just during my time at TVC but even when I was at Signet Armorlite. Going forward please direct inquiries related to the DCS or LPDS Committees to Michael Vitale ( Regards, Paul
  15. Paul, this looks good. Section 5.2 will need to have the latest list pulled.
  16. Dear Colleagues, I hope all is well with everyone and their families during this worldwide crisis. I've spoken with the Steves (Steve Nedomansky and Steve Shanbaum) and they'd like to try and make some progress with the draft review despite losing our face-to-face meeting at Vision Expo East. Attached you will find the most recent draft for the committee to review. Please post your feedback and comments in this thread so we can try and keep things organized. Please submit your feedback by April 30th, 2020. Thanks- Paul TVC Lens Product Description Standard 0.81-DRAFT.pdf
  17. This will be included on the agenda for the next DCS meeting.
  18. In theory you're correct. In practice, is this worth making a change? It doesn't serve any purpose: in the current/new DCS version, ENGLOC indicates on which side the fiducial (±17mm) marks are located, for anything that tries to look for them after engraving. And HASENG indicates whether there even are such marks. So the only case where ENGLOC needs to be chiral is if it's F for one lens and B for the other, which I'd think would never be the case (right?). Otherwise, there is no confusion or ambiguity. In your sample it would be ENGLOC=B, meaning that for this job the marks are supposed to be on the back side, and that only the left lens has those marks (which are on the back side). "Which eye has marks?" and "Where are the marks located?" can be (and currently are) independent questions. There is an N value for ENGLOC only for historical reasons, I assume, from before its purpose/meaning was changed (in 3.11, I think?), before there was HASENG. Maybe before there was DOENG (I don't have older DCS documents at hand to verify)? And just as a convenience factor, to have a "nicer" value where no lenses have marks.
  19. Hi, Could you double-check the Data Type of the ENGLOC field for the next DCS? It should be "literal;" instead of "literal". Some of our customers do have jobs containing 2 different lens types in a single job (one lens SV, the other one PR FF), for example if the wearer has only one eye that needs an Rx lens. In that case, LTYPE=SV;PR FF HASENG=0;1 and ENGLOC needs to be ENGLOC=N;B
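As an illustration of why the data type matters here, a minimal sketch (not taken from the DCS text; the record values are those from the example above) of splitting chiral values, where ";" separates the right-eye and left-eye values:

```python
# Illustrative sketch only: splitting chiral DCS record values.
# Record values are taken from the example in this thread.
job = {
    "LTYPE": "SV;PR FF",
    "HASENG": "0;1",
    "ENGLOC": "N;B",
}

def per_eye(value: str) -> tuple[str, str]:
    """Split a chiral field value into (right_eye, left_eye).

    When no ";" is present, this sketch assumes the single value
    applies to both eyes (an assumption, not a statement of the spec).
    """
    right, sep, left = value.partition(";")
    return (right, left) if sep else (right, right)

for label, value in job.items():
    right, left = per_eye(value)
    print(f"{label}: R={right}  L={left}")
```

With a non-chiral literal type, a two-valued entry like N;B would not be permissible at all, which is the point of the request above.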
  20. Hi Haim, Yes, this will be handled in the standard in the way you show. The current description of the base curve chart line reads as follows (there are minor naming changes from your posted example): The formatting of the compressed base curve chart within the Standard is as follows, using the example chart fragment from 6.1. The Base Curve Chart object has an ID, and an array named 'Adds'. Adds is an array of objects, where each object contains the base curve chart values for a single add power. The add power that applies is within the range specified between the 'MinAdd' and 'MaxAdd' values, inclusive. For a single vision lens, the add power range will be 0.00 to 0.00. The lines for that add power are in the 'Lines' array of strings, where each string corresponds to one record in the compressed base curve chart. The sphere, cylinder, and base curve values are all represented as *100. If the fourth and fifth values are not present in a given line (the base curve min and max recommended values), they are presumed to equal the third value.
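For concreteness, here is a hedged sketch of reading a chart in the shape described above. The field names (ID, Adds, MinAdd, MaxAdd, Lines) follow this description; the chart values themselves are invented for illustration:

```python
import json

# Illustrative sketch of parsing the compressed base curve chart
# described above. The chart data below is invented for illustration.
chart = json.loads("""
{
  "ID": "EXAMPLE-CHART",
  "Adds": [
    {
      "MinAdd": 0.00,
      "MaxAdd": 0.00,
      "Lines": ["-900, 0, 150", "-500, 0, 300, 150, 450"]
    }
  ]
}
""")

def parse_line(line: str):
    """One chart record: sphere, cylinder, base, and optional min/max base.

    All values are stored *100; the fourth and fifth values (min and max
    recommended base) default to the third value when absent.
    """
    vals = [int(v) for v in line.split(",")]
    sph, cyl, base = (v / 100 for v in vals[:3])
    base_min = vals[3] / 100 if len(vals) > 3 else base
    base_max = vals[4] / 100 if len(vals) > 4 else base
    return sph, cyl, base, base_min, base_max

# Select the add-power object covering a given add (0.00 = single vision).
for add in chart["Adds"]:
    if add["MinAdd"] <= 0.00 <= add["MaxAdd"]:
        for line in add["Lines"]:
            print(parse_line(line))
```

Note how the fourth and fifth values fall back to the third (the recommended base) when absent, as the description specifies.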
  21. Hi Haim, Those 3 numbers define the range of possible values for a variable reference point. The first number is the minimum, the second is the increments, and the third is the maximum. In your example for "ERNRUP": "-8:-2:-12", it means that the possible values for ERNRUP are -8, -10, and -12. Regards
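That interpretation can be sketched in a few lines (the helper name is invented; this is illustrative only, not taken from the LPDS text):

```python
# Illustrative sketch: expanding a "minimum:increment:maximum" range
# string, such as the ERNRUP example "-8:-2:-12", into its possible values.

def expand_range(spec: str) -> list[float]:
    """Expand 'min:step:max' into the list of permitted values."""
    lo, step, hi = (float(p) for p in spec.split(":"))
    values = []
    v = lo
    # Walk from the minimum toward the maximum in the given increments.
    while (step > 0 and v <= hi) or (step < 0 and v >= hi):
        values.append(v)
        v += step
    return values

print(expand_range("-8:-2:-12"))  # [-8.0, -10.0, -12.0]
```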
  22. Dear Colleagues, As you might already know from our previous conversations, Shamir in some cases uses more than one base per Rx range, where one is considered the primary and the rest are secondary options. Can we use the Min/Max values in the JSON format to describe the span of bases that can be used for the specific Rx range, where the first column is the primary, Min is the minimum secondary base, and Max is the maximum secondary base? "AddMin": 0.75, "AddMax": 4.00, "Lines": [ "-1200, 0, null", "-900, 0, 150, 50, 150", "-500, 0, 300, 150, 450", "-100, 0, 500, 300, 650", ...... ] Thanks for your reply. Haim S.
  23. Dear Colleagues, Can someone explain the meaning of the 3 numbers under ERNRIN/UP described in the LPDS V0.78 example for layouts (p. 30)? What are they describing? Thanks, Haim S.
  24. We're aware of the missing reference errors and those are being corrected in 3.13 (I hope... it's a Word glitch that seems to keep popping up). I do believe you are correct that the data type is wrong for the tolerance records so we'll work that into the revision as well. Thanks for pointing these errors out.
  25. Hi, Is there any possibility to correct the following topics in the next DCS version? - There are 30 "Error! Reference source not found." messages in the current v3.12 PDF document. - The tolerance records TOLADD, TOLCYL, TOLDIA, ... in Table A.1a (page 143) should IMHO be integer values (0;1;9) and not (min|max;). The min|max; values are for TOLVADD, TOLVCYL, TOLVDIA, ... This tolerance table should be like the attached pic...
  26. A web-based API would be nice, and there is nothing to say that the transmission method for that API couldn't be converting the XML data to JSON (which works well when XML is the starting data and you have a nice schema). However, the reality is that such a service would be at least a few years away from a published standard, and even if we had that tool, many of us who directly support the folks creating these files are going to be dealing with physical files and not web services. You seem to be under the impression that all lens companies use databases to track this stuff. As I've tried to explain during the meetings, that is simply not reality. Have some conversations with Tania or Dave Wedwick, et al., about the type of people they refer to me for help with their lens data. I promise you we will be handling, reading, and troubleshooting lots and lots of text files. Also, not every company is going to want to submit their files to a centralized DB. We have already proposed this in the past for LDS data, which is much less proprietary, and some larger manufacturers were not interested. So, in the end, we're going to be sharing files via email and such more than pulling things automatically from some web service which hasn't even been defined yet. As for complexity, with a well-defined XML schema such complexity is far more easily explained and documented (not to mention the schema will prevent mistakes). The schema itself can be self-documenting. And again, the XML Schema standard is well established, while the JSON Schema standard has been languishing in draft stage for many years. There are also several tools that can take an XML schema and generate a nice document showing all the links between the elements and attributes that can be directly used in the standard. Essilor has done this with RxML. I'm not aware of any comparable tools for JSON. As for DCS, terseness is the only factor that gives JSON an edge in DCS, as I wrote.
I wouldn't object to XML either, for the flexibility and, again, the benefits of using XSD. However, if there were a proposal to use JSON for DCS I would not object, because it makes sense in that context (real-time data exchanges, such as happen in web services), and in reality 95% of the records are simple enough that complex types and other XML types aren't really necessary. JSON was never intended for the type of data we're defining here; hence the reason the schema definition is taking so long, IMO. They are trying to make it as functional as XML when it was never intended to be a replacement for XML, but rather a supplement to XML specifically for the purpose of web services. After all, the "JS" stands for JavaScript. Again, you can force that square peg into the hole if you want, but it seems like a poor way forward compared to using the correct tool. Most of your counterpoints seem to be "Yes, XML does that, but so does JSON". I would argue that in every case other than terseness XML does it better, and there are still no factors that make JSON a better choice, only a preferential one, e.g. being more familiar with JSON, which to me should not be a factor when writing standards we expect to be adopted internationally. I've worked with both JSON and XML extensively, and I would never pick JSON for a project like this because it is simply not the best tool for the job, and this is backed up by plenty of information in white papers and tech articles. In any event, as you said, it is apparent we are not going to reach a consensus betwixt ourselves, so I'll post the poll with a link to this thread. The poll automatically allows people to post comments so they can state their reasons.