FHCF Share Financial Modelling Step-By-Step¶
This step-by-step guide illustrates how an FHCF contract type is modelled in Graphene.
Objective¶
For this guide, the objective is to model a conventional FHCF with the following terms:
| Term | Value |
|---|---|
| Occurrence Limit | 30,000 USD |
| Occurrence Attachment | 10,000 USD |
| Aggregate Limit | 50,000 USD |
| Aggregate Attachment | 20,000 USD |
| Share | 20% |
| Upfront Premium | 3,000 USD |
| Inception Date | 2019-01-01T00:00:00Z |
| Expiry Date | 2020-01-01T00:00:00Z |
We express this contract definition in Graphene’s business model network as a node with the following definition:
```json
{
  "_schema": "FHCF_1.0",
  "inception_date": 1546300800,
  "expiration_date": 1577836800,
  "occurrence_attachment_value": 10000,
  "occurrence_limit_value": 30000,
  "aggregate_attachment_value": 20000,
  "aggregate_limit_value": 50000,
  "premium_value": 3000,
  "share": 0.2
}
```
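The `inception_date` and `expiration_date` fields carry the contract dates as Unix epoch seconds; a quick Python check (not part of Graphene) confirms they match the ISO timestamps in the table above:

```python
# Convert the contract dates to the Unix epoch seconds used in the definition.
from datetime import datetime, timezone

inception = datetime(2019, 1, 1, tzinfo=timezone.utc)
expiry = datetime(2020, 1, 1, tzinfo=timezone.utc)
print(int(inception.timestamp()), int(expiry.timestamp()))  # 1546300800 1577836800
```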
We then use templates to generate the financial model graph for this FHCF definition. This graph is finally evaluated by the Graphene financial engine.
Step-By-Step Breakdown¶
In order to better understand the individual elements of the financial model graph illustrated above, let’s decompose the graph into separate components.
Layering of Loss Claims¶
The primary layering of loss claims is performed in this subgraph:
The graph consists of a number of core operations that are applied to the input Ledger in sequential order (a worked numeric sketch of the layering follows this list):

1. Do not consider records past the expiry date; only records before the expiry date are kept.
2. Do not consider records before the inception date; only records on or after the inception date are kept.
3. Only consider losses as subjects for layering; all other records, such as upstream premiums or fees, are not subject to the layering.
4. Split the loss stream into the 2 largest occurrences and all other occurrences.
5. Apply an occurrence attachment of 10,000 to the 2 largest occurrences, and an occurrence attachment of 1/3 of 10,000 to all other occurrences. The attachment is applied to the sum of values of all records with identical `Trial`, `Time`, and occurrence key values. Once the attachment is applied, the result is proportionally allocated to all records included in the group of records that makes up the occurrence.
6. Apply an occurrence limit of 30,000. The limit is applied to the sum of values of all records with identical `Trial`, `Time`, and occurrence key values. Once the limit is applied, the result is proportionally allocated to all records included in the group of records that makes up the occurrence.
7. Apply an aggregate attachment of 20,000. The attachment is applied to the sum of values of all records with identical `Trial`. Once the attachment is applied, the result is proportionally allocated to all records included in the group of records that makes up the trial.
8. Apply an aggregate limit of 50,000. The limit is applied to the sum of values of all records with identical `Trial`. Once the limit is applied, the result is proportionally allocated to all records included in the group of records that makes up the trial.
9. Finally, apply the participation using a simple scaling function.
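To make the layering concrete, the sketch below applies the same attach-and-limit logic with proportional allocation to a small, made-up set of loss records. The record layout (`trial`, `time`, `occ`, `value`), the helper functions, and the example loss values are illustrative assumptions, not Graphene code; only the layer terms (10,000 and 1/3 of 10,000 occurrence attachments, 30,000 occurrence limit, 20,000 aggregate attachment, 50,000 aggregate limit, 20% share) come from the contract above.

```python
# Minimal sketch of the FHCF layering arithmetic described above.
# Record layout and helper names are illustrative, not Graphene APIs.
from collections import defaultdict

def layer(total, attachment, limit):
    # Attach-and-limit on a grouped total: min(max(total - attachment, 0), limit).
    return min(max(total - attachment, 0.0), limit)

def apply_by_group(records, key_fn, attachment, limit):
    # Layer each group's summed value, then allocate the layered result back
    # to the group's records in proportion to their original values.
    groups = defaultdict(list)
    for rec in records:
        groups[key_fn(rec)].append(rec)
    out = []
    for recs in groups.values():
        total = sum(r["value"] for r in recs)
        layered = layer(total, attachment, limit)
        for r in recs:
            share_of_group = r["value"] / total if total else 0.0
            out.append({**r, "value": layered * share_of_group})
    return out

# Example loss records: one trial, three occurrences (values assumed).
losses = [
    {"trial": 1, "time": 10, "occ": "A", "value": 15_000.0},
    {"trial": 1, "time": 10, "occ": "A", "value": 10_000.0},
    {"trial": 1, "time": 20, "occ": "B", "value": 18_000.0},
    {"trial": 1, "time": 30, "occ": "C", "value": 6_000.0},
]

occ_key = lambda r: (r["trial"], r["time"], r["occ"])

# Split into the 2 largest occurrences (by occurrence total) and the rest.
occ_totals = defaultdict(float)
for r in losses:
    occ_totals[occ_key(r)] += r["value"]
top_two = set(sorted(occ_totals, key=occ_totals.get, reverse=True)[:2])
largest_two = [r for r in losses if occ_key(r) in top_two]
others = [r for r in losses if occ_key(r) not in top_two]

# Occurrence layer: attachment 10,000 (largest two) or 10,000 / 3 (others),
# limit 30,000; then the aggregate layer per trial: attachment 20,000, limit 50,000.
layered = (
    apply_by_group(largest_two, occ_key, 10_000, 30_000)
    + apply_by_group(others, occ_key, 10_000 / 3, 30_000)
)
aggregated = apply_by_group(layered, lambda r: r["trial"], 20_000, 50_000)

# Participation (share) of 20%.
ceded = [{**r, "value": r["value"] * 0.2} for r in aggregated]
print(sum(r["value"] for r in ceded))  # ≈ 1133.33 for this example
```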
Upfront Premium, Non-loss records¶
The FHCF contract pays an upfront premium income.
- The upfront premium is simply a record that is created in the Ledger at the time of contract inception, with a value equal to the premium amount and a type of `Premium`.
- Select only records that are not of type `Loss` and pass them on to apply the contract participation (a short sketch of this path follows below).
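A minimal sketch of this non-loss path, assuming a simple dict-based record layout rather than Graphene's actual data structures: the premium record is created at inception and only the participation is applied to it.

```python
# Sketch of the non-loss path: the upfront premium record is created at
# inception and receives only the participation scaling (layout is assumed).
INCEPTION = 1546300800  # 2019-01-01T00:00:00Z
PREMIUM_VALUE, SHARE = 3_000.0, 0.2

premium_record = {"time": INCEPTION, "type": "Premium", "value": PREMIUM_VALUE}

# Non-loss records bypass the layering entirely.
ceded_premium = {**premium_record, "value": premium_record["value"] * SHARE}
print(ceded_premium)  # value == 600.0
```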
Putting it back together¶
In order to model the complete FHCF contract, all components are combined to produce a single Ledger that contains:
A single upfront premium income record on inception of the contract
Any loss records that represent a loss expense to the contract
Any non-loss records within the inception and expiry of the contract
Core operations¶
The entire FHCF contract structure consists of the following core operations (as used in the template below): `TimeCoverage`, `RecordTypeFilter`, `LargestOccFilter`, `OccurrenceCoverage`, `AggregateCoverage`, `Record`, and `Scale`.
Templates¶
Using Graphene template capabilities, we can build out the appropriate templates to generate the financial model shown above from the initial JSON definition:
```json
{
  "_schema": "FHCF_1.0",
  "inception_date": 1546300800,
  "expiration_date": 1577836800,
  "occurrence_attachment_value": 10000,
  "occurrence_limit_value": 30000,
  "aggregate_attachment_value": 20000,
  "aggregate_limit_value": 50000,
  "premium_value": 3000,
  "share": 0.2
}
```
The entire FHCF structure can be modelled using a single template of the following form:
```
{
vertices: [
# Vertex 0
{
_schema: "TimeCoverage_1.0",
inception_date: .inception_date,
expiration_date: .expiration_date
},
# Vertex 1
{
_schema: "RecordTypeFilter_1.0",
op: "EQUAL",
value: "Loss"
},
# Vertex 2
{
_schema: "LargestOccFilter_1.0",
count: 2,
invert_order: true,
invert_selection: false
},
# Vertex 3
{
_schema: "LargestOccFilter_1.0",
count: 2,
invert_order: true,
invert_selection: true
},
# Vertex 4
{
_schema: "OccurrenceCoverage_1.0",
attachment: .occurrence_attachment_value,
limit: .occurrence_limit_value,
currency: .currency
},
# Vertex 5
{
_schema: "OccurrenceCoverage_1.0",
attachment: (.occurrence_attachment_value / 3.0),
limit: .occurrence_limit_value,
currency: .currency
},
# Vertex 6
{
_schema: "AggregateCoverage_1.0",
attachment: .aggregate_attachment_value,
limit: .aggregate_limit_value,
currency: .currency
},
# Vertex 7
{
_schema: "Record_1.0",
_internal_terminal: true,
time: .inception_date,
type: "Premium",
value: .premium_value,
currency: .currency
},
# Vertex 8
{
_schema: "RecordTypeFilter_1.0",
op: "NOT_EQUAL",
value: "Loss"
},
# Vertex 9
{
_schema: "Scale_1.0",
factor: .share
}
],
edges: [
{
from: 0,
to: 1
},
{
from: 0,
to: 8
},
{
from: 1,
to: 2
},
{
from: 1,
to: 3
},
{
from: 2,
to: 4
},
{
from: 3,
to: 5
},
{
from: 4,
to: 6
},
{
from: 5,
to: 6
},
{
from: 6,
to: 9
},
{
from: 7,
to: 9
},
{
from: 8,
to: 9
}
]
}
```
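The `.field` references in the template resolve against the FHCF JSON definition shown above (the syntax resembles jq-style object construction; the exact templating engine is not described here). As a rough, hypothetical illustration of that substitution, resolving vertex 5 against the example definition gives an attachment of 10,000 / 3 ≈ 3,333.33 and a limit of 30,000:

```python
# Hypothetical illustration of resolving the template's ".field" references
# against the FHCF definition; this is not Graphene's real template engine.
definition = {
    "inception_date": 1546300800,
    "expiration_date": 1577836800,
    "occurrence_attachment_value": 10000,
    "occurrence_limit_value": 30000,
    "aggregate_attachment_value": 20000,
    "aggregate_limit_value": 50000,
    "premium_value": 3000,
    "share": 0.2,
}

# Vertex 5: the OccurrenceCoverage applied to "all other occurrences".
vertex_5 = {
    "_schema": "OccurrenceCoverage_1.0",
    "attachment": definition["occurrence_attachment_value"] / 3.0,  # ≈ 3333.33
    "limit": definition["occurrence_limit_value"],                   # 30000
}

# Vertex 9: the Scale operation that applies the 20% participation.
vertex_9 = {"_schema": "Scale_1.0", "factor": definition["share"]}
print(vertex_5, vertex_9)
```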