This is a weekly series for The Regulatory Reporting Data Model Working Group. The RRDMWG is a collaborative group of insurers, regulators and other insurance industry innovators dedicated to the development of data models that will support regulatory reporting through an openIDL node. The data models to be developed will reflect a greater synchronization of data for insurer statistical and financial data and a consistent methodology that insurers and regulators can leverage to modernize the data reporting environment. The models developed will be reported to the Regulatory Reporting Steering Committee for approval for publication as an open-source data model.

openIDL Community is inviting you to a scheduled Zoom meeting.

Join Zoom Meeting

Meeting ID: 989 0880 4279
Passcode: 740215

One tap mobile
+16699006833,,98908804279# US (San Jose)
+12532158782,,98908804279# US (Tacoma)
Dial by your location
        +1 669 900 6833 US (San Jose)
        +1 253 215 8782 US (Tacoma)
        +1 346 248 7799 US (Houston)
        +1 929 205 6099 US (New York)
        +1 301 715 8592 US (Washington DC)
        +1 312 626 6799 US (Chicago)
        888 788 0099 US Toll-free
        877 853 5247 US Toll-free
Meeting ID: 989 0880 4279
Find your local number:



Meeting Minutes

I. Intro by Mr. Antley

 A. Greetings to All

 B. Reading of Linux Foundation Anti-Trust Statement

II. Agenda/Content

A. Discussion/Recap on Earned Premium as Defined from Previous Weeks 

B. This week - Mr. Antley worked on incurred loss (the sum of paid and outstanding losses); this was the primary emphasis of the meeting.

  1. Sum of paid losses - very straightforward
    1. First query gives all paid losses in the timeline - a Mongo query with multiple clauses separated by commas. Searching for transaction code 2s; 2 is a paid loss. Applying parameters to coverage code, hard-coding liability (coverage code 1).
    2. Using accident date - greater than start and less than end 
    3. Produced result: object record
    4. Mr. Antley summed the records and rounded to 2 decimal places.
  2. Outstanding losses - more complex. The most recent record is always the correct one, so they can't all be summed.
    1. Implementation of this: 2 different queries (Mr. Antley presented on the screen and shared corresponding link).
    2. Outstanding losses - transaction code 3
  3. Mr. Antley systematically ran through his calculations - paid loss (w/dates) and outstanding loss combined = incurred loss 
  4. Group noted we only provide outstanding losses once per quarter, not with every transaction. Outstanding is always a point-in-time number. Case reserves = outstanding losses. Paid + case can equal incurred, but incurred losses have a different actuarial definition (Ms. Tornquist). Statistically, the outstanding numbers that are sent are just a point in time.
  5. Mr. Antley discussed a prior conversation with Mr. Harris - asked Mr. Harris to outline a procedure to run a test for this: look at the claim level (claim ID) plus paid and incurred losses; then whoever provided the info can verify that the number calculated as incurred is the same as the incurred on the claim file. It may be necessary to go back to the carrier for sample validation. Mr. Antley agreed that this would be further worked on when a better data set is available, allowing more granular testing.
  6. Mr. Antley: plan for next week is to get another categorization done.
  7. Mr. Antley: incurred counts = claim counts. Claim counts include any known claims - a claim could have an outstanding reserve but no payment made yet. Every claim should only be counted once. Ms. Tornquist: if there is no payment and no case reserve, but a claim arrives, do we not count it? Answer: by definition, today stat reporting is activity-only, so someone calling and making a notification of a claim doesn't send anything downstream to stat reporting (i.e., notifications are not counted).
  8. Mr. Antley: already has incurred count records loaded - should be very simple and straightforward
  9. Mr. Reale: clarification about going from Mongo to JSON. Mr. Sayers: each row turns into a JSON object. Mr. Reale: we're extracting from the databases, making a list of JSON entries (an array), and using local memory to process answers. Mr. Sayers: Mongo has an aggregation pipeline, but JSON is critical; Mongo may not be as scalable as we would like, hence the move to JSON. Not necessary to get it 100% correct; we just need to get it close.
  10. Mr. Antley: for Basic Limit Loss (BLL) - Mr. Harris noted that Travelers doesn't do BLL; it's done just for statistical reporting. BLL is based on the limits of the policy relative to the minimum financial requirements of the state; 100% of the losses are under the basic limit.
  11. Mr. Antley: will get back to car years around the second half of next week - car years are a bit trickier but doable.
  12. Mr. Reale pointed out that much of this can be run against a relational database (Apache Drill, etc.). If we are worried about extensibility, we can pursue a decoupled option.
  13. Mr. Antley noted nothing more on agenda
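
The incurred-loss procedure discussed above (items 1-3: sum paid losses with code 2, take only the most recent outstanding record with code 3, then add the two) can be sketched in Python over an in-memory list of JSON-style records, matching the extract-to-array approach described in item 9. The field names (`claimId`, `transactionCode`, `coverageCode`, `accidentDate`, `asOf`, `amount`) and sample values are assumptions for illustration, not the actual openIDL schema.

```python
from datetime import date

# Illustrative records; field names are assumed, not the openIDL schema.
# The equivalent Mongo filter for paid losses would be clauses separated
# by commas, e.g. {"transactionCode": 2, "coverageCode": 1,
#                  "accidentDate": {"$gt": start, "$lt": end}}
records = [
    {"claimId": "C1", "transactionCode": 2, "coverageCode": 1,
     "accidentDate": date(2021, 3, 1), "amount": 1000.00},
    {"claimId": "C1", "transactionCode": 3, "coverageCode": 1,
     "accidentDate": date(2021, 3, 1), "asOf": date(2021, 3, 31), "amount": 500.00},
    {"claimId": "C1", "transactionCode": 3, "coverageCode": 1,
     "accidentDate": date(2021, 3, 1), "asOf": date(2021, 6, 30), "amount": 200.00},
]

start, end = date(2021, 1, 1), date(2021, 12, 31)

def paid_losses(records, start, end):
    """Sum paid losses (transaction code 2) for liability (coverage code 1),
    accident date greater than start and less than end, rounded to 2 places."""
    total = sum(r["amount"] for r in records
                if r["transactionCode"] == 2
                and r["coverageCode"] == 1
                and start < r["accidentDate"] < end)
    return round(total, 2)

def outstanding_losses(records, start, end):
    """Outstanding losses (transaction code 3) are point-in-time numbers:
    keep only the most recent record per claim -- never sum them all."""
    latest = {}
    for r in records:
        if (r["transactionCode"] == 3 and r["coverageCode"] == 1
                and start < r["accidentDate"] < end):
            prev = latest.get(r["claimId"])
            if prev is None or r["asOf"] > prev["asOf"]:
                latest[r["claimId"]] = r
    return round(sum(r["amount"] for r in latest.values()), 2)

# Paid loss plus outstanding loss = incurred loss (the statistical sense
# discussed in item 4, not the separate actuarial definition).
incurred = paid_losses(records, start, end) + outstanding_losses(records, start, end)
```

With the sample data, only the most recent outstanding record (200.00 as of 2021-06-30) counts, so incurred is 1000.00 + 200.00 = 1200.00.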
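The claim-count rule in item 7 - every claim counted once, and only claims with stat-reportable activity (a payment or a case reserve), so bare notifications are excluded - could be sketched as follows; field names are again assumed for illustration.

```python
def claim_count(records):
    """Count each claim ID once, and only if it has activity:
    a paid loss (code 2) or an outstanding/case reserve (code 3).
    A notification with no activity sends nothing downstream."""
    return len({r["claimId"] for r in records if r["transactionCode"] in (2, 3)})

sample = [
    {"claimId": "C1", "transactionCode": 2},  # paid loss
    {"claimId": "C1", "transactionCode": 3},  # reserve on the same claim
    {"claimId": "C2", "transactionCode": 3},  # reserve only, no payment yet
]
# C1 and C2 are each counted once, even though C1 has two records.
```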
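One reading of the basic-limits idea in item 10 - reported losses are capped so that 100% of them fall at or under the basic limit - is sketched below. The helper name and the limit value are hypothetical, and whether openIDL caps at the limit this way is an assumption, not confirmed in the minutes.

```python
def basic_limit_loss(amount, basic_limit=25000.00):
    """Cap a loss at the assumed basic limit for statistical reporting,
    so 100% of the reported losses are at or under the basic limit."""
    return min(amount, basic_limit)

# A 40,000 loss reports as 25,000 under an assumed 25,000 basic limit;
# a 10,000 loss is already under the limit and reports unchanged.
```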

III. Adjournment - Meeting adjourned at 1:37pm EST.


Discussion items



Action items