Submission Make-overs

Underwriters increasingly rely on automated systems for submission processing, risk analysis, exposure tracking, policy administration, and other internal processes.
Submissions presented with the highest-quality data, in formats that are easiest to import, receive priority in processing.

If critical data is missing:
• submission processing may be suspended pending receipt of the missing data,
• the underwriter may fill in the missing information based on their own assumptions (which will be conservative), or
• the submission may simply be declined.

Underwriters increasingly rely on information gathered from the internet to fill in missing data elements.
This information may be outdated or inaccurate.
• Underwriters may choose not to tell the brokers that third-party data has been gathered.
• The risk manager may never have the opportunity to validate or explain third-party information.

Exposure values presented to underwriters commonly run 30-40% below actual replacement cost (a worked example follows this list).
• Many underwriters systematically compensate for underinsurance, either by inflating the values used in their internal analyses or by relying on actuaries to build an adjustment factor into the rating.
• A risk manager who is doing a good job of keeping values current needs to communicate this to the market so that underwriters can use the values as presented and/or advocate internally for more discretionary credit.
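For illustration, consider a schedule reporting $100 million in total insured values that an underwriter believes is 35% below replacement cost (the 35% figure is hypothetical; actual adjustment factors vary by underwriter and market):

    adjusted rating basis = $100M ÷ (1 − 0.35) ≈ $154M

The insured effectively pays for the assumed shortfall whether or not it is ever acknowledged, and a schedule that is genuinely current may be penalized by the same blanket adjustment.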

If the underwriter lacks confidence in the quality of the information presented, they will compensate for the increased uncertainty by declining to quote or by withholding their best available terms and conditions.
• An underwriter's willingness to accept the data as submitted depends on the current market environment, and market conditions can change very rapidly after a large earthquake, hurricane, or other catastrophe.
• Underwriters may already be compensating for poor-quality data and undervaluation in their current pricing.
• Improving the data quality and correcting the values will not necessarily result in higher premiums!

The same data that feeds the underwriters’ evaluation also feeds the risk manager’s decision-making process.
• Undervaluation and incomplete or inaccurate data lead the risk manager to underestimate limit requirements and percentage-deductible exposures (see the illustration after this list).
• Poor data also prevents the broker from creating optimal program layering.
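As a hypothetical illustration, assume a building reported at $10 million that would actually cost $15 million to replace, insured under a wording that applies a 5% earthquake deductible to the value at the time of loss:

    expected retention: 5% × $10M = $500,000
    actual retention:   5% × $15M = $750,000

The retained exposure is understated by a third, and a total loss would leave $5 million of replacement cost uninsured.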

Catastrophe models drive pricing for coastal and earthquake-exposed risks.
• These models are sensitive to exposure data quality and undervaluation.
• The modeled loss calculation can be materially improved with more complete data (a rough illustration follows).
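As a rough, hypothetical illustration: modeled losses scale approximately with the exposed values fed into the model, so a schedule understated by 30% can understate modeled average annual loss and probable-maximum-loss figures by a comparable proportion. Missing characteristics such as construction class, year built, or occupancy force the model onto conservative "unknown" defaults, widening the range of results. Neither distortion serves the insured when limits and pricing are negotiated.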


The Asperta Process
1. We thoroughly review the existing data, checking for completeness, internal consistency, and format, and perform a high-level valuation analysis.
2. We review the current policy wording and structure; this can highlight policy restrictions driven by perceived data-quality issues and helps prioritize the work in later phases.
3. We can interview underwriters (incumbent and prospective) to discuss their specific data needs, which can change over time.
4. We prepare a proposed action plan for supplemental data collection and reformatting.
5. We establish agreement on the action plan between the risk manager, broker, underwriters, and any other involved parties.

Asperta will document the scope of work in a detailed work order attached to a master consulting agreement.
6. We gather supplemental data as needed:
• public record searches,
• hazard look-ups,
• satellite/aerial imagery,
• on-site visits, and
• the risk manager’s file data (loss control reports, diagrams, etc.).

7. We generate current estimates of values.

8. We prepare updated submission spreadsheet(s) and supplemental data, including a narrative description of the process.

9. We review the draft submission with the client and other involved parties as a final quality check.

10. We deliver high-quality documents ready for submission to the market.