EQECAT Releases RQE (Risk Quantification & Engineering) Catastrophe Modeling Platform

Today EQECAT announced the release of its RQE™ (Risk Quantification & Engineering) catastrophe risk modeling platform, which enables clients to quantify and manage the potential financial impact of natural hazards.

RQE version 13 is the result of a multi-year initiative built on close collaboration with clients, prospects, and industry experts, and is the single largest release of its kind. While the new platform contains many improvements, EQECAT has preserved and leveraged the robust methodology and unique treatment of uncertainty that are the hallmarks of EQECAT risk modeling.

“We are thrilled that RQE will provide significant and increased value to the global re/insurance market,” commented Bill Keogh, president of EQECAT. “With so much that differentiates us competitively, we look forward to satisfying the pent-up demand for our analytics. Having collaborated closely with leaders from virtually every segment and geography in the global re/insurance business, we are confident that RQE will disrupt the status quo of catastrophe risk modeling. All of us at EQECAT thank our existing and new clients for their collaboration throughout this development process and their confidence in RQE.”

Highlights of RQE v. 13 include:

  • Comprehensive portfolio aggregation
  • New financial model
  • Improved user interface
  • Improved import workflow
  • New database schema with 4-tier hierarchy
  • Significant improvements in import run times
  • Catastrophe model updates

Comprehensive Portfolio Aggregation

Standard output includes both an Event Loss Table (ELT) for occurrence-based metrics and a Year Loss Table (YLT) for aggregate loss metrics. 3G Correlation™ allows users to aggregate one or more YLTs into a single aggregate YLT. Reports can be generated from the YLT for annual losses, Exceedance Probability (EP) curves, and the associated ELT.
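
For readers less familiar with these outputs, the short sketch below shows how an aggregate EP curve can be derived from a YLT. The data structures and function are hypothetical illustrations of the general technique, not RQE's data model or API.

    # Minimal sketch: deriving an aggregate Exceedance Probability (EP)
    # curve from a Year Loss Table (YLT). All names and structures here
    # are hypothetical illustrations, not EQECAT's RQE data model or API.

    # A YLT maps each simulated year to its total (aggregate) loss.
    ylt = {1: 0.0, 2: 1_200_000.0, 3: 0.0, 4: 5_500_000.0, 5: 350_000.0}

    def aggregate_ep_curve(ylt):
        """Return (loss, annual exceedance probability) pairs, largest loss first."""
        n_years = len(ylt)
        losses = sorted(ylt.values(), reverse=True)
        # The k-th largest annual loss is reached or exceeded in k of n years.
        return [(loss, (k + 1) / n_years) for k, loss in enumerate(losses)]

    for loss, prob in aggregate_ep_curve(ylt):
        print(f"annual loss >= {loss:>12,.0f} with probability {prob:.2f}")

An occurrence EP curve would be built the same way, but from the largest single event loss in each year rather than from the annual total.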

New Financial Model

The new 4-tier database hierarchy enables more complete loss modeling for excess and surplus (E&S) lines, "step" policies, and other complex financial structures.
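
As a rough illustration of the kind of logic such structures involve, the sketch below applies a per-occurrence deductible and limit, and a simple "step" payout schedule, to a ground-up loss. The terms and thresholds are invented for illustration; this is generic policy-terms logic, not a description of RQE's financial model.

    # Illustrative only: generic policy-terms logic under assumed
    # structures, not a description of RQE's financial model.

    def gross_loss(ground_up, deductible, limit):
        """Loss to the policy after a deductible and limit are applied."""
        return min(max(ground_up - deductible, 0.0), limit)

    def step_policy(ground_up, steps):
        """steps: (threshold, payout) pairs. A "step" policy pays the fixed
        payout of the highest threshold the ground-up loss has crossed;
        the values used below are invented for illustration."""
        payout = 0.0
        for threshold, amount in sorted(steps):
            if ground_up >= threshold:
                payout = amount
        return payout

    print(gross_loss(250_000, deductible=50_000, limit=150_000))      # 150000.0
    print(step_policy(80_000, [(25_000, 10_000), (75_000, 40_000)]))  # 40000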

Improved User Interface

The release offers users significant improvements to the import process, exposure management, and report selection, providing an enhanced user experience.

Improved Import Workflow

RQE v. 13 includes a new import workflow to enable easier input and editing of exposure data.

New Database Schema

This release includes an improved, uniform database schema with a 4-tier hierarchy at the account and portfolio levels.
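
The release does not spell out the four tiers here, so the sketch below is a purely hypothetical example of what a 4-tier exposure hierarchy can look like; only the portfolio and account levels are taken from the text.

    # Hypothetical 4-tier exposure hierarchy. The press release names only
    # the portfolio and account levels; the location and coverage tiers
    # are assumed for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Coverage:      # tier 4: an insured interest (building, contents, ...)
        peril: str
        value: float

    @dataclass
    class Location:      # tier 3: a physical site exposed to hazard
        latitude: float
        longitude: float
        coverages: list = field(default_factory=list)

    @dataclass
    class Account:       # tier 2: an insured's grouping of locations
        name: str
        locations: list = field(default_factory=list)

    @dataclass
    class Portfolio:     # tier 1: the book of business being analyzed
        name: str
        accounts: list = field(default_factory=list)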

Significant Improvements in Import Run Times

Clients will experience faster run times with the new and improved import capabilities.

Catastrophe Model Updates

RQE v. 13 includes updates to 178 country/peril models, spanning vulnerability, hazard, and correlation/simulation updates. Hazard and vulnerability have been updated for a number of models to incorporate new scientific research and detailed analyses of claims and exposure data from recent events. Correlation and simulation updates were made to all country/peril models so that multiple portfolios can be combined, using 3G Correlation™, without the need to re-analyze.
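
The sketch below illustrates, in the abstract, why simulation-indexed year loss tables permit this: two portfolios analyzed against the same simulated catalog can be combined year by year without re-running the models. The alignment mechanics shown are an assumption for illustration and say nothing about how 3G Correlation™ itself works.

    # Sketch: combining per-portfolio Year Loss Tables without re-analysis.
    # Assumes both portfolios were simulated against the same year catalog,
    # so year indices align. Illustrates the general idea only; it is not
    # the mechanics of EQECAT's 3G Correlation.

    def combine_ylts(*ylts):
        """Sum losses year by year across portfolios sharing one catalog."""
        years = set().union(*(ylt.keys() for ylt in ylts))
        return {year: sum(ylt.get(year, 0.0) for ylt in ylts) for year in years}

    portfolio_a = {1: 100_000.0, 2: 0.0, 3: 2_000_000.0}
    portfolio_b = {1: 0.0, 2: 450_000.0, 3: 750_000.0}
    print(combine_ylts(portfolio_a, portfolio_b))
    # -> {1: 100000.0, 2: 450000.0, 3: 2750000.0}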

EQECAT will host a multi-day conference to help clients own their view of risk by providing a thorough understanding of RQE v. 13 and the entire EQECAT catastrophe modeling process. The catastrophe modeling conference will be held April 9–11, 2013, at the Ritz-Carlton in Fort Lauderdale, Florida.

Learn more about RQE catastrophe modeling, or read the press release online.

EQECAT connects re/insurance and financial services clients with the world’s leading scientific minds to quantify and manage exposure to catastrophic risk. Leveraging decades of experience, EQECAT’s comprehensive methodology is distinguished by a unique treatment of uncertainty that helps clients set rational expectations about risk.

RQE™ (Risk Quantification & Engineering), EQECAT’s new catastrophe risk modeling platform, will provide enhanced functionality and user experience with a new financial model, import workflow, and user interface. Increased analytical speed, expanded reporting, and improved integration capabilities provide clients with increased transparency and faster access to results for 180 natural hazard software models covering 96 countries on six continents.

EQECAT, a subsidiary of ABSG Consulting Inc., was founded in 1994 and is headquartered in Oakland, California.

For more information, contact:
[email protected]
