
MaaS applied to Healthcare – Use Case Practice

MaaS (Model as a Service) might allow building and controlling shared, Cloud-ready healthcare data, affording agile data design, economies of scale, a trusted environment and scalable security. With MaaS, models map the infrastructure and make it possible to control persistent storage and to audit deployments, in order to certify that data remain coherent and linked to specific storage. As a consequence, models allow checking where data is deployed and stored. MaaS can play a crucial role in supplying services in healthcare: the model containing infrastructure properties includes the information needed to classify the on-premise data Cloud service in terms of data security, coherence, outage, availability and geo-location, and to secure an assisted service deployment and virtualization.
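The article does not prescribe an implementation, but the idea that a model records where each data partition must live and can be used to audit an actual deployment can be sketched in a few lines. In the hypothetical Python sketch below, all class and field names are assumptions made for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class PartitionSpec:
    """Hypothetical description of one data partition declared in the model."""
    name: str             # e.g. "clinical_files"
    allowed_regions: set  # geo-location constraint, e.g. {"eu-west"}
    storage_provider: str # provider the partition is expected to persist on

@dataclass
class HealthcareDataModel:
    """Sketch of a MaaS-style model that maps data to infrastructure."""
    partitions: dict = field(default_factory=dict)

    def register(self, spec: PartitionSpec) -> None:
        self.partitions[spec.name] = spec

    def audit_deployment(self, deployed: dict) -> list:
        """Compare an actual deployment {partition: region} with the model and
        report partitions stored outside the locations the model allows."""
        violations = []
        for name, region in deployed.items():
            spec = self.partitions.get(name)
            if spec is None or region not in spec.allowed_regions:
                violations.append(name)
        return violations

# Usage: certify that data remain linked to the storage declared in the model.
model = HealthcareDataModel()
model.register(PartitionSpec("clinical_files", {"eu-west"}, "provider-a"))
print(model.audit_deployment({"clinical_files": "us-east"}))  # -> ['clinical_files']
```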

Introduction
Municipalities are opening new information exchanges with healthcare institutes. The objective is sharing medical research, hospital acceptance by pathology, assistance and hospitalization with doctors, hospitals, clinics and, of course, patients. This open data [6] should improve patient care, prevention, prophylaxis and appropriate medical booking and scheduling by making information sharing more timely and efficient. From the data management point of view, this means the service should assure data elasticity, multi-tenancy, scalability and security, together with the physical and logical architectures that represent the guidelines for designing healthcare services.

Accordingly, healthcare services in the Cloud must primarily secure the following data properties [2]:
- data location;
- data persistence;
- data discovery and navigation;
- data inference;
- confidentiality;
- availability;
- on-demand data secure deleting/shredding [4] [5] [11] [12].

These properties should be defined during service design, and data models play the integral "on-premise" role in defining, managing and protecting healthcare data in the Cloud. When the healthcare data model is created, the service is created as well, and properties for confidentiality, availability, authenticity, authorization, authentication and integrity [12] have to be defined inside it: this is how MaaS provides preconfigured service properties.
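As an illustration only (the article does not define a concrete format), the properties listed above could be captured as a declarative block attached to the data model, so that they travel with the model from design to deployment. Every key and value in this sketch is a hypothetical assumption, not part of the original text.

```python
# Hypothetical "on-premise" property set attached to a healthcare data model.
service_properties = {
    "data_location":    {"allowed_regions": ["eu-west"], "geo_fencing": True},
    "data_persistence": {"provider_registered": True, "traceable": True},
    "discovery":        {"navigation_enabled": True},
    "inference":        {"allowed_aggregations": ["ambulatory -> insurance_file"]},
    "confidentiality":  {"tenants": ["hospital", "clinic", "pharmacy"], "rbac": True},
    "availability":     {"target": "99.9%", "recovery_from_model_versions": True},
    "secure_delete":    {"on_demand_shredding": True},
}

def missing_properties(props: dict) -> list:
    """Return the required properties that a model has not yet defined."""
    required = {"data_location", "data_persistence", "discovery", "inference",
                "confidentiality", "availability", "secure_delete"}
    return sorted(required - props.keys())

print(missing_properties(service_properties))  # -> [] when the model is complete
```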

Applying MaaS to Healthcare – Putting It into Practice
Applying MaaS to the design and deployment of healthcare services means showing how to apply the DaaS (Database as a Service, see [2] and [4]) lifecycle to achieve faster, positive impacts on go-live preparation with Cloud services. The Use Case introduces the practices by which the healthcare service could be defined, and then translates them into the appropriate guidelines, following the DaaS lifecycle service practices described in [4].

Take into account that healthcare is a dynamic, complex environment with many actors: patients, physicians, IT professionals, chemists, lab technicians, researchers, health operators and more. The Use Case we are introducing tries to consider the whole system. It lays out the main tasks along the DaaS lifecycle and shows how medical information might be managed and securely exchanged [12] among stakeholders across multiple entities such as hospitals, clinics, pharmacies, labs and insurance companies.
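Purely as an illustration, the DaaS lifecycle states referenced throughout the Use Case could be modeled as an explicit state set, so that each model property can declare the states in which it must be addressed. The state names below mirror those used later in this article; everything else in the sketch is a hypothetical assumption.

```python
from enum import Enum

class DaaSState(Enum):
    """DaaS lifecycle states referenced by the Use Case (names from the article)."""
    CREATE_DATA_MODEL = "Create Data Model"
    POPULATE_USE_TEST = "Populate, Use and Test"
    GENERATE_SCHEMA_UPDATE = "Generate Schema / Update Data Model"
    MODEL_ARCHIVE_CHANGE = "Model Archive and Change"
    DEPLOY_AND_SHARE = "Deploy and Share"
    SECURE_DELETE = "Secure Delete"

# Hypothetical mapping: the lifecycle states in which each MaaS property is handled.
PROPERTY_STATES = {
    "data_location": {DaaSState.CREATE_DATA_MODEL, DaaSState.MODEL_ARCHIVE_CHANGE,
                      DaaSState.DEPLOY_AND_SHARE},
    "confidentiality": {DaaSState.CREATE_DATA_MODEL, DaaSState.POPULATE_USE_TEST},
}

def states_for(prop: str) -> set:
    """Look up the lifecycle states in which a given property must be addressed."""
    return PROPERTY_STATES.get(prop, set())

print(sorted(s.value for s in states_for("data_location")))
```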

The Use Case
Here is how MaaS might cover the Use Case, and how the DaaS lifecycle best practices integrate the properties and directions above:

Objective: To facilitate services to healthcare users and to improve the information exchange experience among stakeholders. The Use Case aims to reduce the cost of services through rapid data design, update and deployment, to provide data audit and control, and to improve the user experience with healthcare knowledge.

Description: Current costs of data design, update and deployment are high, and healthcare information (clinical, pharmaceutical, prevention, prophylaxis…) is not delivered fast enough based upon user experience. Costs for hospitalization and treatment information should be predictable based upon user experience and interaction.

Actors:
- Clinical and Research Centres;
- Laboratories;
- Healthcare Institute/Public Body (Access Administrators);
- Healthcare Institute/Public Body (Credentials, Roles Providers);
- Patients;
- IT Operations (Cloud Providers, Storage Providers, Clinical Application Providers).

Requirements:
- Reducing costs and rapidly delivering relevant data to users, stakeholders and healthcare institutes;
- Enabling decision-making information for actors who regularly need access [11] [12] to healthcare services but lack the scale to exchange (and require) more dedicated services and support;
- Rapidly supporting and updating healthcare data for users, given a large reference base with many locations and disparate applications;
- Ensuring compliance and governance directions are currently applied, revised and supervised;
- Data security, confidentiality, availability, authenticity, authorization, authentication and integrity to be defined "on-premise".

Pre-processing and post-processing:
- Implementing and sharing data models;
- Designing data model properties according to private, public and/or hybrid Cloud requirements;
- Designing the data storage model "on-premise";
- Modeling data to calculate physical resource allocation "a priori";
- Modeling data to predict usage early and to optimize database handling;
- Outages are covered by versions and changes archived based on model partitioning;
- Content discovery assists in identifying and auditing data, restoring the service to previous versions and irrecoverably destroying the data, if required by regulations.

Included and extended use case:
- Deployment is guided by model properties and architecture definition;
- Mapping of data is defined and updated, checking whether the infrastructure provider offers persistence and finding out whether outages are related to on-line tasks;
- Deploying and sharing are guided by model properties and architecture definition (a sketch of such a model-driven deployment check follows).
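The "included and extended" steps above suggest that a deployment plan can be validated against the model before go-live. The sketch below is only one hypothetical way to express that check: it verifies that the target provider offers persistent storage and that each partition lands on a site the model allows. All names, structures and values are assumptions made for illustration.

```python
# Hypothetical model-driven deployment check (structures are illustrative only).
model = {
    "partitions": {
        "clinical_files":  {"allowed_sites": ["eu-west"], "requires_persistence": True},
        "ambulatory_care": {"allowed_sites": ["eu-west", "eu-north"], "requires_persistence": True},
    }
}

providers = {
    "provider-a": {"site": "eu-west", "persistent_storage": True},
    "provider-b": {"site": "us-east", "persistent_storage": False},
}

def validate_deployment(plan: dict) -> list:
    """Return human-readable issues for a {partition: provider} deployment plan."""
    issues = []
    for partition, provider_name in plan.items():
        spec = model["partitions"][partition]
        provider = providers[provider_name]
        if spec["requires_persistence"] and not provider["persistent_storage"]:
            issues.append(f"{partition}: {provider_name} offers no persistent storage")
        if provider["site"] not in spec["allowed_sites"]:
            issues.append(f"{partition}: site {provider['site']} is not allowed by the model")
    return issues

print(validate_deployment({"clinical_files": "provider-b"}))
```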


In the following, we apply a subset of MaaS properties to the above healthcare Use Case; correspondingly, a subset of Data Model properties is applied along the DaaS lifecycle states:


MaaS Property: Data Location
DaaS Lifecycle States: Create Data Model; Model Archive and Change; Deploy and Share
Healthcare Data Model Properties: Data models contain partitioning properties and can include data location constraints. User tagging of data (a common Web 2.0 practice, through the use of clinic user-defined properties) should be managed. Support for compliant storage of preventative care data records should be provided.

MaaS Property: Data Persistence
DaaS Lifecycle States: Create Data Model; Model Archive and Change; Secure Delete
Healthcare Data Model Properties: For any partition, sub-model or version of the model, the data model has to label and trace data location. The model defines a map specifying where data is stored (ambulatory care and clinical files have different storages). Provider persistence can be registered. Data discovery can update partition properties to identify where data is located.

MaaS Property: Data Inference
DaaS Lifecycle States: Create Data Model
Healthcare Data Model Properties: The data model has to support inference and special data aggregation: ambulatory data might allow inferring a patient's insurance file. All inferences and aggregations are defined, updated and tested in the model.

MaaS Property: Confidentiality
DaaS Lifecycle States: Create Data Model; Populate, Use and Test
Healthcare Data Model Properties: The data model guides rights assignment, access controls, rights management and application data security, starting from the model. As different tenants (hospitals, clinics, insurance companies and pharmacies) access the data, users and tenants should be defined inside the model. Logical and physical controls have to be set.

MaaS Property: High Availability
DaaS Lifecycle States: Deploy and Share; Model Archive and Change
Healthcare Data Model Properties: The data model and partitioning configuration, together with model changes and versions, permit mastering a recovery scheme and restoration when needed. Data inventory (classified by Surgery, Radiology, Cardiology, for example) versus discovery has to be traced and set.

MaaS Property: Fast Updates at Low Cost
DaaS Lifecycle States: Create Data Model; Generate Schema/Update Data Model
Healthcare Data Model Properties: Data reverse and forward engineering permit change management and version optimization in real time, directly on the deployed data properties.

MaaS Property: Multi-Database Partitioning
DaaS Lifecycle States: Create Data Model; Deploy and Share
Healthcare Data Model Properties: Bi-directional partitioning in terms of deployment, storage and evolution through model versioning has to be set. Multi-DBMS version management helps in sharing multi-partitioning deployments: for example, Insurance and Surgery records for the same Patient are normally partitioned and belong to different tenants and different databases.

MaaS Property: Near-Zero Configuration and Administration
DaaS Lifecycle States: Create Data Model; Generate Schema/Update Data Model
Healthcare Data Model Properties: Data models cover and contain all data properties, including scripts, stored procedures, queries, partitions, changes and all configuration and administration properties. This means administrative effort decreases, leaving more time for data design, update and deployment. Regulation compliance can be a frequent administration task: models ensure that healthcare compliance and governance remain aligned. (A minimal sketch of how a model might encode a few of these properties follows.)
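To make the table concrete, the sketch below shows one hypothetical way a model could carry partitioning, tenant confidentiality and versioning together, so that both access checks and schema generation start from the model. None of the structures or names is prescribed by the article; they are assumptions for illustration only.

```python
# Hypothetical model fragment combining partitioning, tenants and versioning.
model = {
    "version": 3,
    "partitions": {
        "surgery":   {"tenant": "hospital",  "dbms": "postgres", "region": "eu-west"},
        "insurance": {"tenant": "insurance", "dbms": "oracle",   "region": "eu-west"},
    },
    "tenants": {
        "hospital":  {"roles": ["physician", "lab_technician"]},
        "insurance": {"roles": ["claims_officer"]},
    },
}

def can_access(role: str, partition: str) -> bool:
    """Confidentiality check driven entirely by the model: a role may only
    access partitions owned by a tenant that defines that role."""
    tenant = model["partitions"][partition]["tenant"]
    return role in model["tenants"][tenant]["roles"]

def generate_ddl(partition: str) -> str:
    """Toy 'Generate Schema' step: derive a versioned DDL stub per target DBMS."""
    spec = model["partitions"][partition]
    return (f"-- {spec['dbms']} / model v{model['version']}\n"
            f"CREATE TABLE {partition}_v{model['version']} (...);")

print(can_access("physician", "insurance"))  # False: role belongs to another tenant
print(generate_ddl("surgery"))
```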



The Outcome
MaaS defines the service properties through which the DaaS process can be implemented and maintained. As a consequence, applying the Use Case along the directions introduced above, the following results can be outlined.

Qualitative Outcomes:
1)    Healthcare actors share information on the basis of defined “on-premise” data models: models can be implemented and deployed using a model-driven paradigm;
2)    Data Models are standardized in terms of naming conventions and conceptual templates (Pharma, Insurance, Municipality and so on): in fact, models can be modified and updated with respect to the knowledge they were initially designed for;
3)    Storage and partitioning in the Cloud can be defined “a priori” and periodic audits can be set to certify that data are coherent and remain linked to specific sites;
4)    Users consult the information and perform two tasks:
4.1) perform the (best) search and navigate the knowledge for personal and work activities;
4.2) give back information about user experience and practices/procedures that should be updated, rearranged, downsized or extended depending upon community needs, types of interaction, events or specific public situations.
5)    Models are "on-premise" policy-driven tools. Regulation compliance rules can be included in the data model. Changes to current compliance constraints mean changes to the data model before it is deployed as a new version.

Quantitative Outcomes:
1)    Measurable and traceable cost reduction (to be calculated as a function of the annual Cloud fee, resource tuning and TCO);
2)    Time reduction in terms of fast knowledge design, update, deployment, portability and reuse (to be calculated as a function of SLA, data and application management effort and ROI);
3)    Risk reduction according to "on-premise" Cloud service design and control (to be calculated as a function of recovery time and chargeback on the cost of applied countermeasures, compared with periodic audits based upon model information). An illustrative calculation sketch follows.
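The article leaves the exact formulas open. Purely to illustrate how these three reductions might be tracked, the sketch below compares hypothetical baseline figures with hypothetical post-MaaS figures; every number, field name and formula is invented for illustration and is not taken from the original text.

```python
# Hypothetical baseline vs. post-MaaS figures (all values invented for illustration).
baseline = {"annual_cloud_fee": 120_000, "tco": 400_000,
            "design_update_deploy_days": 90, "recovery_hours": 48}
with_maas = {"annual_cloud_fee": 95_000, "tco": 330_000,
             "design_update_deploy_days": 55, "recovery_hours": 12}

def reduction(before: float, after: float) -> float:
    """Relative reduction of a metric, as a fraction of the baseline value."""
    return (before - after) / before

cost_reduction = reduction(baseline["annual_cloud_fee"] + baseline["tco"],
                           with_maas["annual_cloud_fee"] + with_maas["tco"])
time_reduction = reduction(baseline["design_update_deploy_days"],
                           with_maas["design_update_deploy_days"])
risk_reduction = reduction(baseline["recovery_hours"], with_maas["recovery_hours"])

print(f"cost: {cost_reduction:.0%}, time: {time_reduction:.0%}, risk: {risk_reduction:.0%}")
```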

Conclusion
MaaS might provide a real opportunity to offer a unique utility-style model lifecycle that accelerates cloud data optimization and performance in the healthcare network. MaaS applied to healthcare services might be the right way to transform medical service delivery in the Cloud. MaaS defines "on-premise" data security, coherence, outage, availability, geo-location and an assisted service deployment. Models are adaptable to various departmental needs and organizational sizes, and they simplify and align healthcare domain-specific knowledge by combining the data model approach with the on-demand nature of cloud computing. MaaS agility is a key requirement of data service design, incremental data deployment and progressive data structure provisioning. Finally, the model approach allows the validation of service evolution. The models' versions and configurations form a catalogue to manage both data regulation compliance [12] and data contract clauses in the Cloud among IT, Providers and Healthcare actors [9].

References
[1] N. Piscopo - ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo - CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] D. Burbank, S. Hoberman - Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data http://cloudbestpractices.net/2012/10/21/maas/
[7] N. Piscopo - MaaS Workshop, Awareness, Courses Syllabus
[8] N. Piscopo - DaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – Applying MaaS to DaaS (Database as a Service) Contracts. An introduction to the Practice http://cloudbestpractices.net/2012/11/04/applying-maas-to-daas/
[10] N. M. Josuttis – SOA in Practice
[11] H. A. J. Narayanan, M. H. Güneş – Ensuring Access Control in Cloud Provisioned Healthcare Systems
[12] Kantara Initiative – http://kantarainitiative.org/confluence/display/uma/UMA+Scenarios+and+Use+Cases

Disclaimer
This document is provided AS-IS for your informational purposes only. In no event will the contents of "How MaaS might be applied to Healthcare – A Use Case" be liable to any party for direct, indirect, special, incidental, economic (including lost business profits, business interruption, loss or damage of data, and the like) or consequential damages, without limitation, arising out of the use or inability to use this documentation or the products, regardless of the form of action, whether in contract, tort (including negligence), breach of warranty, or otherwise, even if advised of the possibility of such damages. Specifically, any warranties are disclaimed, including, but not limited to, the express or implied warranties of merchantability, fitness for a particular purpose and non-infringement, regarding this document or the products' use or performance. All trademarks, trade names, service marks and logos referenced herein belong to their respective companies/offices.

