Planning for ERP Disaster Recovery

A standard backup is created to ensure files can be restored in the event of missing or lost files

If something happens that leaves you without your ERP system, it could be a disaster for your business. It is vital that you have a strong disaster recovery plan in place to deal with such a potential catastrophe and minimize the loss of time and money.

While disaster recovery is more commonly thought of in terms of flooding, fires, or earthquakes, your ERP system can suffer a catastrophe as a result of hardware failure, loss of electrical power, or any number of other technical problems. Unfortunately, these potential problems tend not to garner any attention until they actually happen.

The best way to handle potential disaster is to have a solid plan in place. While it's probably impossible to avoid every single possible problem that might come up, there are many issues that can be avoided with just a little forethought (and money).

Disaster recovery preparation and planning consists of two things: preventative measures to avoid disaster, and recovery measures that allow service to resume in the shortest time possible should a disaster occur. Preventative measures may include power protection through uninterruptible power supplies and the provision of redundant electrical circuits.

A commonly used preventative measure is storing data on RAID arrays, but it is important to understand that not all RAID levels provide sufficient protection. Common choices are RAID 10 (a combination of RAID 1 mirroring and RAID 0 striping) for quick recovery, and RAID 6 (RAID 5 with a second parity block, so the array can survive two simultaneous disk failures) where more than one disk is at risk of failing.
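
To put the trade-off in concrete terms, the short Python sketch below (an illustrative calculation assuming equal-sized disks, not advice for any specific array) compares rough usable capacity for RAID 10 and RAID 6.

    # Illustrative sketch: rough usable capacity for RAID 10 versus RAID 6,
    # assuming all disks are the same size.

    def raid10_usable(num_disks, disk_tb):
        # RAID 10 mirrors striped pairs: half the raw capacity is usable,
        # and it tolerates one failure per mirrored pair.
        assert num_disks >= 4 and num_disks % 2 == 0
        return (num_disks // 2) * disk_tb

    def raid6_usable(num_disks, disk_tb):
        # RAID 6 keeps two parity blocks, so two disks' worth of capacity is
        # given up, but any two disks may fail at the same time.
        assert num_disks >= 4
        return (num_disks - 2) * disk_tb

    if __name__ == "__main__":
        for disks in (4, 6, 8):
            print(f"{disks} x 4 TB disks: RAID 10 = {raid10_usable(disks, 4)} TB usable, "
                  f"RAID 6 = {raid6_usable(disks, 4)} TB usable")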

It is when things go seriously wrong that the recovery plan comes into play. This level of protection is much more secure, but it does not come cheaply. Creating backup files on a regular basis is standard practice for any IT operation, but the backups required for disaster recovery are different.

A standard backup is created to ensure files can be restored in the event of missing or lost files. A disaster recovery backup, by contrast, is capable of completely restoring your ERP software. It stores data in large blocks so that the system can be recovered and restored as quickly as possible. Data can be stored on site, but it is highly advisable that additional copies be kept off-site, either by backing up to a remote site or by creating copies on tape and moving them to a secondary location.
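
A minimal sketch of such a full backup job is shown below; the directory paths and the mounted off-site location are assumptions made for illustration, not a reference implementation for any particular ERP product.

    # Minimal sketch (hypothetical paths): create one full backup archive of an
    # ERP data directory and keep a second copy on an off-site mount.
    import datetime
    import pathlib
    import shutil
    import tarfile

    ERP_DATA_DIR = pathlib.Path("/srv/erp/data")        # hypothetical ERP data location
    LOCAL_BACKUP_DIR = pathlib.Path("/backup/local")    # on-site copy
    OFFSITE_BACKUP_DIR = pathlib.Path("/mnt/offsite")   # e.g. a mounted remote share

    def full_backup():
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        LOCAL_BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        archive = LOCAL_BACKUP_DIR / f"erp-full-{stamp}.tar.gz"
        # Write one large archive rather than file-by-file copies, so a complete
        # restore can be driven from a single object.
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(ERP_DATA_DIR, arcname="erp-data")
        # Keep an additional copy away from the primary site.
        OFFSITE_BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        shutil.copy2(archive, OFFSITE_BACKUP_DIR / archive.name)
        return archive

    if __name__ == "__main__":
        print("Backup written to", full_backup())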

If you plan to store your off-site backup in the cloud, it is important to understand the features of the service you are going to use. Cloud services come in various forms, ranging from simple data storage to offerings built around the requirements of disaster recovery, with features such as multiple redundant backups. Whichever service you choose, you must ensure that the cloud storage system can provide everything your company needs in the event of a disaster.
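
If the off-site copy goes to an S3-compatible object store, the upload step itself can be very small. The sketch below uses boto3; the bucket name and key prefix are placeholders, and credentials are assumed to come from the environment or an IAM role rather than being hard-coded.

    # Hedged sketch: upload a backup archive to an S3-compatible bucket.
    import pathlib
    import boto3

    def upload_offsite(archive_path, bucket="erp-dr-backups", key_prefix="full/"):
        # The bucket name and prefix are placeholders for this example.
        s3 = boto3.client("s3")
        key = key_prefix + pathlib.Path(archive_path).name
        s3.upload_file(str(archive_path), bucket, key)
        return key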

When preparing for a potential disaster recovery, it is important to test the backup system and recovery plan regularly, at least every few months, to ensure that they work effectively.
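
A simple periodic test might restore the most recent archive into a scratch directory and compare checksums against the source data. The sketch below is one way to do that; the paths and archive layout are assumptions carried over from the earlier example, and any files changed since the backup was taken will show up as mismatches.

    # Illustrative restore test: extract the archive into a temporary directory
    # and compare file checksums against the live data directory.
    import hashlib
    import pathlib
    import tarfile
    import tempfile

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_restore(archive, live_dir):
        mismatches = []
        with tempfile.TemporaryDirectory() as scratch:
            with tarfile.open(archive) as tar:
                tar.extractall(scratch)
            restored_root = pathlib.Path(scratch) / "erp-data"
            live_root = pathlib.Path(live_dir)
            for p in restored_root.rglob("*"):
                if not p.is_file():
                    continue
                live_file = live_root / p.relative_to(restored_root)
                if not live_file.is_file() or sha256(live_file) != sha256(p):
                    mismatches.append(p.relative_to(restored_root))
        return mismatches  # an empty list means the restore matched the live data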

More Stories By Aaron Louis

Aaron Louis is the head blogger of ERP Systems HQ. He holds a BS in Computer Information Systems and has worked as an ERP Consultant. Visit ERP Systems HQ for more information.
