The Vision Council


Posts posted by The Vision Council

  1. Hi All,

    I was informed today that due to financial challenges related to COVID-19 I'm no longer going to be employed by The Vision Council.  Tomorrow 5/6/20 will be my last day.  I wanted to say that it's been a pleasure working on these committees with everyone, not just during my time at TVC but even when I was at Signet Armorlite.  Going forward please direct inquiries related to the DCS or LPDS Committees to Michael Vitale (



  3. Dear Colleagues,

    I hope all is well with everyone and their families during this worldwide crisis.  I've spoken with the Steves (Steve Nedomansky and Steve Shanbaum) and they'd like to try to make some progress with the draft review despite losing our face-to-face meeting at Vision Expo East.  Attached you will find the most recent draft for the committee to review.  Please post your feedback and comments in this thread so we can keep things organized.

    Please submit your feedback by April 30th, 2020.

    Thanks- Paul


    TVC Lens Product Description Standard 0.81-DRAFT.pdf

  4. We're aware of the missing reference errors and those are being corrected in 3.13 (I hope... it's a Word glitch that seems to keep popping up).  I do believe you are correct that the data type is wrong for the tolerance records so we'll work that into the revision as well.  Thanks for pointing these errors out.

    A web-based API would be nice, and there is nothing to say that the transmission method for that API couldn't be converting the XML data to JSON (which works well when XML is the starting data and you have a nice schema).  However, the reality is that such a service would be at least a few years away from a published standard, and even if we had that tool, many of us who directly support the folks creating these files are going to be dealing with physical files and not web services.  You seem to be under the impression that all lens companies use databases to track this stuff.  As I've tried to explain during the meetings, that is simply not reality.  Have some conversations with Tania or Dave Wedwick, et al., about the type of people they refer to me for help with their lens data.  I promise you we will be handling, reading, and troubleshooting lots and lots of text files.
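
    The XML-to-JSON conversion mentioned above is straightforward in practice.  A minimal stdlib-only sketch (the element and attribute names below are invented for illustration, not taken from the draft standard):

```python
# Minimal sketch: converting a hypothetical XML lens-catalog fragment to JSON
# for an API response. Element/attribute names are invented for illustration;
# they are not taken from the draft standard.
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an Element into a JSON-friendly dict."""
    node = dict(elem.attrib)                       # attributes become keys
    for child in elem:
        node.setdefault(child.tag, []).append(element_to_dict(child))
    text = (elem.text or "").strip()
    if text:
        node["#text"] = text                       # keep any element text
    return node

xml_src = """<catalog>
  <lens name="ExampleLens" index="1.50">
    <curve base="4.25"/>
  </lens>
</catalog>"""

root = ET.fromstring(xml_src)
data = {root.tag: element_to_dict(root)}
print(json.dumps(data, indent=2))
```

    This direction (XML as the source of truth, JSON derived from it) preserves the schema benefits while still serving terse payloads.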

    Also, not every company is going to want to submit their files to a centralized DB.  We have already proposed this in the past for LDS data, which is much less proprietary, and some larger manufacturers were not interested.  So, in the end, we're going to be sharing files via email and such more than pulling things automatically from some web service which hasn't even been defined yet.

    As for complexity, with a well-defined XML schema such complexity is far more easily explained and documented (not to mention the schema will prevent mistakes).  The schema itself can be self-documenting.  And again, the XML Schema standard is well established while the JSON Schema standard has been languishing in draft stage for many years.  There are also several tools that can take an XML schema and generate a nice document showing all the links between the elements and attributes, which can be directly used in the standard.  Essilor has done this with RxML.  I'm not aware of any comparable tools for JSON.

    As for DCS, terseness is the only factor that gives JSON an edge there, as I wrote.  I wouldn't object to XML either, for the flexibility and, again, the benefits of using XSD.  However, if there were a proposal to use JSON for DCS I would not object, because it makes sense in that context (real-time data exchanges, such as happen in web services) and in reality 95% of the records are simple enough that complex types and other XML features aren't really necessary.  JSON was never intended for the type of data we're defining here.  Hence the reason the schema definition is taking so long, IMO: they are trying to make it as functional as XML when it was never intended to be a replacement for XML, but rather a supplement, specifically for the purpose of web services.  After all, the "JS" stands for JavaScript.  Again, you can force that square peg into the hole if you want, but it seems like a poor way forward compared to using the correct tool.

    Most of your counterpoints seem to be "Yes, XML does that but so does JSON".  I would argue that in every case other than terseness XML does it better and there are still no factors that make JSON a better choice, only a preferential one, e.g. being more familiar with JSON which to me should not be a factor when writing standards we expect to be adopted internationally.  I've worked with both JSON and XML extensively and I would never pick JSON for a project like this because it is simply not the best tool for the job and this is backed up by plenty of information in white papers and tech articles.

    In any event, as you said, it is apparent we are not going to reach a consensus betwixt ourselves so I'll post the poll with a link to this thread.  The poll automatically allows people to post comments so they can state their reasons.

    I concur that we will eventually need to put this to a vote.  How would you like it to be presented?  Simply "Do you prefer XML or JSON?" or with talking points?

    As for my position, and what I took from the various articles and white papers I've read on the subject, it's really simple.  If you're exchanging data between disparate systems, especially non-persisted data over a network interface (such as web services) where terseness is important, and you're using name/value pairs or simple data structures, then JSON is the clear and obvious choice.  On the other hand, if you're exchanging data between database systems using complex data structures in large, persisted data sets, then XML is the clear and obvious choice.  While in both cases one could use the alternative language, doing so is like pounding in a nail with the ball side of a ball peen hammer.  You may eventually get the job done but striking the nail cleanly and consistently is more a matter of luck than intention.  I'm merely advocating for using the best tool for the job, and to me that is clearly XML.  When we start talking about DCS, however, then I'd be more in favor of JSON because terseness is a factor there and we are already dealing with simple data structures and name/value pairs.
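
    The distinction above can be seen with a flat name/value record of the kind DCS exchanges.  A small sketch (the labels and values are illustrative, not actual DCS records) showing the same data in both notations:

```python
# Sketch of the "right tool" contrast: a flat name/value record (DCS-style)
# sits naturally in JSON, while wrapping it in XML adds markup overhead.
# The record labels here are illustrative, not taken from the standard.
import json

record = {"JOB": "12345", "LNAM": "ExampleLens", "SVAL": "+2.00"}

as_json = json.dumps(record)
as_xml = "<rec>" + "".join(f"<{k}>{v}</{k}>" for k, v in record.items()) + "</rec>"

print(as_json)  # compact, terse
print(as_xml)   # same data, noticeably longer
```

    For flat records like this the JSON form is shorter and just as clear; the trade-off flips once the data becomes deeply nested and schema validation matters.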

    In any event, let me know how you'd like to present the options and I will create a poll.

    That is a very simplistic example.  I agree that for short data sets there is value in brevity, and this is exactly the type of data where I would go with JSON.  The catalogs will not be small data sets.  Reading a large mass of data and trying to figure out where the closing tag is in JSON is much more difficult than in XML.  Additionally, almost all text editors that aren't Notepad can automatically prettify XML and will do syntax highlighting, making viewing the data much easier than your black-on-white example.  There are also many more tools already available for dealing with large XML datasets by hand than I've been able to find for JSON.  I'm still not convinced, especially by such a contrived example, that JSON is better for this project.
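
    On the prettifying point, even Python's standard library can reformat compact XML for reading, which illustrates how common this tooling is (the catalog markup is invented for the example):

```python
# Sketch: reformatting compact XML for human reading with only the Python
# standard library. The catalog markup is invented for illustration.
import xml.dom.minidom

compact = '<catalog><lens name="ExampleLens"><curve base="4.25"/></lens></catalog>'
pretty = xml.dom.minidom.parseString(compact).toprettyxml(indent="  ")
print(pretty)  # one element per line, indented
```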

    Also, the JSON Schema standard is still in draft stage.  The XML Schema standard is well established.

    I would further argue that as a committee we have more expertise available in XML than in JSON.

    And what about code generation from XML schemas?  That is a huge benefit, IMO.

    We seem to be trying to force JSON to fit, again, because it's newer and sexier, not because it offers any true benefit.

    JSON is awesome, just not for this.

    You might find this interesting.  It's a fair assessment despite the biased-sounding title.

    My point was that I didn't think anyone would object to changing the nomenclature if we could come up with something better.  However, no one had proposed any alternatives.  I don't like using "Find" in the abbreviation, but it might be a starting point for rethinking the descriptors.  Your description seems clearer to me.

    Adopting this change to the narrative will also require completely revising the existing documentation in 3.12.  It's a large change and will require rewriting most of that material as well I think.  Someone from the committee will need to volunteer to undertake the primary rewrite.  I suppose this may require another vote.

  9. On 10/16/2019 at 5:34 AM, Michael Vitale said:

    3.1 - JSON format - Is JSON the proper format to use for this standard, and do we believe all manufacturers will have the technical expertise to write this in JSON?  They have historically used a CSV format for the LDS and this may be very cumbersome for them.


    I have always been more in favor of using XML for this standard.  There are three reasons, in order of increasing importance:

    1.  XML does not have the idiosyncrasy of inconsistent array ordering, which affects JSON.  This is currently constraining our design options for base curve chart data and could have the same effect on future development.

    2.  XML is more human-readable than JSON, especially for large, complex documents such as these catalogs.  Although we will end up parsing these files with software, people are still going to need to read them for troubleshooting and support purposes.  XML is better suited for that.

    3.  Using XML will allow us to encode all the rules of the standard in a single XML Schema Document (XSD).  This XSD can then be used in most development platforms to automatically generate libraries of code for parsing, writing, and validating XML files.  There are numerous online document validators that, if fed an XSD and a sample instance XML file, can validate that the instance document is properly formatted and follows all the necessary rules.  This will enable much faster development and adoption of the standard.
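
    As a stdlib-only illustration of the idea of schema-driven validation (a real workflow would feed the actual XSD to an off-the-shelf validator such as lxml or the xmlschema package; the element names and rules here are invented):

```python
# Toy illustration of schema-driven validation. A real workflow would hand
# the XSD to an off-the-shelf validator; here the "rules" are hard-coded to
# show the principle with stdlib only. Names are invented for the example.
import xml.etree.ElementTree as ET

RULES = {
    "lens": {"required_attrs": {"name", "index"}},
    "curve": {"required_attrs": {"base"}},
}

def validate(xml_text):
    """Return a list of rule violations (an empty list means valid)."""
    errors = []
    for elem in ET.fromstring(xml_text).iter():
        rule = RULES.get(elem.tag)
        if rule:
            missing = rule["required_attrs"] - elem.attrib.keys()
            if missing:
                errors.append(f"<{elem.tag}> missing attribute(s): {sorted(missing)}")
    return errors

good = '<catalog><lens name="L1" index="1.50"><curve base="4.25"/></lens></catalog>'
bad  = '<catalog><lens name="L1"><curve/></lens></catalog>'

print(validate(good))  # []
print(validate(bad))   # two violations reported
```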

    I have not heard any good arguments for using JSON.  It mostly seems to be "XML is old and stodgy, JSON is new and fun".  I think XML is a much better language for this type of endeavor.  We can also always convert from XML to JSON for transmission purposes, but defining things in XML gives us many advantages.

    My understanding was that the decision on the language used would not be made until the standard had been fleshed out.  I had been pushing my points about XML back when Daniel was still in charge and I recall that we specifically tabled that conversation until we had defined all of the fields.  I was not expecting it to be dictated in the standard yet.

  10. Dave Wedwick's "no" vote response:



    I'm voting that if anyone has an issue with it, it should be pushed to a later version.  I actually don't care myself - I don't remember who needs it or how soon.  My initial thought was that there are many machine manufacturers out there that may have an issue with it as they may not be attending meetings and paying attention.

    But, now that I think about it, this mainly affects the host programs - if a machine starts wanting to send longer values, the host either takes it or not.  So, your main audience may be the host programs, and I as a host program would transparently handle any length (there's never been a length restriction in our software) - that's why I don't care either way.

    So, that's it - if anyone has a problem with it, maybe it should be postponed.  But, the host programs are probably the target audience.





  11. You need to make a concise recommendation then.  I don't think anyone is going to be able to follow all of the various points the way they are spread out.  Keep them brief and on topic and I'll include them in the poll.  You seem to be proposing one of the following:

    1.  If we are going to increase the length without a restriction on which records it can apply to, then it belongs in 4.0

    2.  If we are going to limit it to specific records, such as XSTATUS, then you are fine with it being included in 3.13

    Is this correct?  Again, please be concise.

    Edit:  May be moot as at least two other members have already voted to postpone until 4.0 so this might get completely tabled until our next meeting.

  12. 18 hours ago, Yaron [LaserOp] said:

    OK, Paul, since you keep insisting, I listened (well, mostly did quick skips throughout, to locate the relevant section/s) to the recording of the VEW meeting.

    Glad you finally listened to a meeting.  Probably the wrong one though.  We discussed that change first at VEE.  It was just the details we were discussing at VEW after having agreed in principle to the change at VEE.  I'm pretty sure the impact was discussed at that first meeting so it wasn't necessary to rehash it at VEW.  At least, that's my recollection.  In any event, thanks for your feedback.  Your objection is noted.  I disagree with your opinion and I have clearly stated my reasons why I believe this is a perfectly valid minor revision change.  No one else has agreed with your position thus far so, once again, we will be going with the consensus which is to include this change in 3.13.

    Edit:  To confirm the committee's position, I have created a poll.  We'll go with those results since at this point I don't believe there are any further arguments to make.

    Not true.  Some of the things being discussed for 4.0 truly "break" the current paradigm, like changing everything to XML or JSON.  Again, I strongly urge you to start listening to the meetings if you can't actually attend.  If you do that shortly after they are posted, and they are always posted within a few days of the meeting, you can raise your objections sooner while everything is still fresh in everyone's mind.  You can be more informed about the objections already raised and about the direction we are going for 4.0, and in general you will be able to provide more effective feedback.  Quite often, objections similar to the ones you bring up are raised and discussed, and sometimes your specific objections are raised (by me) and discussed, so you could hear those responses in detail.  I mentioned you several times during the last meeting.

    In any event, at this point we need to wait and see what, if anything, the other committee members say.  At this stage I still have to go with the consensus established at the last meeting.  There are a couple of other items delaying publication of the next review draft anyway so we'll have until sometime next week.


  14. 29 minutes ago, Yaron [LaserOp] said:

    So, again, what am I supposed to do with those D labels?  It's valid for me now to send them longer, and it's now an invalid record for the LMS so it should ignore them.  (Because, again, the idea was to ignore invalid new labels, not to invalidate active and working ones.)

    This is straight to the Q&A forum once the new version is out. I'm correctly sending valid data which the LMS then correctly and validly ignores, my machine no longer works after a correctly performed upgrade that fits all the requirements of the standard, what should I do?

    I didn't say "invalid record label", I said invalid "record".  So, if for any reason a system thinks a record is invalid, it should be ignored.  That includes cases where the field value(s) are invalid, which can include being too long.  With that understood, this change is no different than any other new change in a new version.  If you are 3.13 compatible but the system you're speaking to is not, you have to work that out with them.  If they are 3.13 compatible, then it won't be an issue because they should have adopted the change.  I still don't see how this is a protocol "breaking" change.  It only applies if you are 3.13 compatible.
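
    The "ignore invalid records" behavior described here can be sketched as follows (the label names, packet format, and length limit are hypothetical, not taken from the standard):

```python
# Sketch of the "ignore invalid records" behavior described above: a record
# whose field value exceeds the receiver's supported length is treated as
# invalid and skipped, rather than failing the whole packet. The labels and
# the length limit are hypothetical, not taken from the standard.
MAX_FIELD_LEN = 80  # hypothetical receiver-side limit

def parse_packet(lines, max_len=MAX_FIELD_LEN):
    """Parse LABEL=value lines, ignoring records with over-long values."""
    accepted = {}
    for line in lines:
        label, _, value = line.partition("=")
        if len(value) > max_len:
            continue  # invalid record: ignore it, keep processing the rest
        accepted[label] = value
    return accepted

packet = ["JOB=12345", "XSTATUS=" + "X" * 200]
print(parse_packet(packet))  # only the JOB record survives
```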

    Edit: And again, even if we call this 4.0, this problem will still exist.  I still do not agree that this is a big enough change to require a delay to 4.0 which may take years to develop as it will be a dramatic departure from the current protocol.

  15. I understand what you're saying but I disagree.  I don't feel this is a breaking change.  A record that is invalid should be ignored.  In any event, I'll give it a day or two to see if anyone else responds to this topic in favor of postponing this until 4.0.  Otherwise the draft will proceed as described above and you'll get a chance to vote against it when the time comes.

  16. 1 minute ago, Yaron [LaserOp] said:

    That's almost 8 hours, mostly not relevant for any one individual topic (like this), with no index or rough timestamps of what was discussed when.


    You have also posted questions on the engraving reference system, which makes up the vast majority of the discussion in both meetings.  Most of the easier topics are covered within the first hour or two of each meeting because we save the best for last (Christian's topics).  The committee members expend effort just getting to the meetings to participate in the discussions.  If you're going to object to their decisions, I think it only makes sense to listen to those discussions first.