The next era in IT

The history of IT is usually characterized in terms of technological change.  The mainframe era was succeeded by distributed systems and client/server computing.  User interaction has moved from terminals to thick clients to thin browser-based interfaces, which, with the emergence of Web 2.0 technologies, combine the benefits of ubiquity and a rich end-user experience.  Networking has seen a shift from proprietary protocols and wire-level standards to the near-universal adoption of TCP/IP and Ethernet.  On the software front, SOA represents another major technological shift beyond earlier distributed architectures, the ramifications of which will likely take years to unfold.

At a broader level, however, I see IT evolving through three major eras, with the industry currently transitioning from the second era into the third.

The first era, from IT’s inception through the 1970s, revolved around data processing (a name in fact given to some IT departments).  During this period, the primary focus was converting data – about customers, taxpayers, materials, and so on – from paper into electronic form, and performing batch operations on it, such as calculating interest for savings account holders.

Electronic record-keeping was a huge advance, although getting information into electronic form in the first place was a non-trivial challenge: almost all data originated in analog form and had to be manually keyed in or entered through other mechanical processes.  Representing, storing, and manipulating data efficiently was critical given the hardware limitations of the time, giving rise to the rivalry between hierarchical, relational, and other database technologies.  In fact, while many presume relational technology won out, IBM’s Information Management System (IMS), a hierarchical database management system introduced in the 1960s, is still going strong and continues to manage a large chunk of the world’s business data, in part because of its performance advantage over relational databases, including IBM’s own DB2.
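To make the hierarchical-versus-relational contrast concrete, here is a minimal sketch in Python; the customer and order records are invented for illustration, and the nested dictionary only loosely mimics IMS-style parent/child segments.

```python
# The same customer/order data modeled two ways.
# All names and records below are invented sample data.

# Hierarchical (IMS-style): child segments nest under their parent,
# so reading a customer's orders is a direct walk down the tree.
hierarchical = {
    "customer_id": 1001,
    "name": "Acme Corp",
    "orders": [
        {"order_id": 1, "item": "widgets", "qty": 500},
        {"order_id": 2, "item": "gears", "qty": 120},
    ],
}

# Relational: entities live in separate flat tables and are
# reassembled at query time by joining on a foreign key.
customers = [{"customer_id": 1001, "name": "Acme Corp"}]
orders = [
    {"order_id": 1, "customer_id": 1001, "item": "widgets", "qty": 500},
    {"order_id": 2, "customer_id": 1001, "item": "gears", "qty": 120},
]

def orders_for(customer_id):
    """Relational access: join orders to a customer via the foreign key."""
    return [o for o in orders if o["customer_id"] == customer_id]

# Both models yield the same answer; the hierarchical walk simply
# follows the predefined parent-child path instead of performing a join.
assert [o["item"] for o in hierarchical["orders"]] == \
       [o["item"] for o in orders_for(1001)]
```

The trade-off this illustrates: when the access path matches the hierarchy (customer, then that customer’s orders), the tree walk does less work than a join, while the relational layout answers ad hoc questions the hierarchy never anticipated.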

By the early 1980s, most large companies and government organizations had a firm grasp on the task of “electronicizing” data and its processing and – while the need to address new data processing requirements continues – IT’s focus shifted to implementing real-time applications to automate and manage various aspects of a company’s operations.  This era of enterprise application deployments, in which SAP, PeopleSoft, Siebel, Oracle, and others established themselves in the market, was characterized by huge investments in packaged applications.  While companies continued to do custom development, the tide shifted from IT departments building the organization’s core applications – as they mostly did in the data processing era – towards buying off-the-shelf application functionality.

Fast forward to this point in the new millennium: the wave of investment in data processing and core business applications has largely run its course, at least within the big global corporations.  They have their databases containing customer, product, and other information, and they have their systems to manage customer relationships, sales, finances, inventory, production, logistics, human resources, and every other major aspect of the business.  For the most part, large companies have filled the big areas of whitespace in their IT portfolios, a fact evident in SAP’s and Oracle’s push into small- and medium-sized enterprises, which are less IT-saturated, and into non-transactional application areas such as business intelligence and reporting.

So companies have spent the last few decades filling in pieces of the IT puzzle.  Almost by definition, the next era will focus on ensuring that the pieces work together effectively and on reshaping them to meet future business needs.  Of course, organizations will continue to deploy new applications, but these will mainly fill gaps or niche needs not addressed by the big enterprise packages, and the initiatives will generally be smaller in scope than, say, an R/3 implementation.  For most companies, the application portfolio in 2015 will not be very different from the applications already in place today.

The suggestion that “asset optimization” – for lack of a better phrase – will become IT’s defining mission isn’t a pitch for SOA, although SOA will clearly play a role.  Rather, it is a recognition of IT’s evolving circumstances.  Increasingly, delivering IT value won’t be as simple as dropping in an ERP package or creating a customer data mart, because most businesses already have these capabilities (as do their competitors).  Instead, the next era of IT advantage will come not from the IT assets that companies own, but from how effectively they exploit those assets.

I use the word “optimize” because it implies achieving the best result within a set of constraints, which is a fitting description of IT today.  IT’s constraints are many: budget increases are modest, maintenance consumes an ever-increasing portion of the spend, the labor picture is tightening despite offshoring and globalization, the technology landscape is becoming more diverse and complex, and almost all initiatives must be accomplished within the context of the existing installed base of applications.  At the same time, there is an ever-increasing business appetite for IT capability, requirements are coming from new sources (for example, regulatory compliance), and, as businesses seek to differentiate how they do things rather than simply what they do, there is a greater need for IT to empower the line worker.  IT’s challenge, in a nutshell, is finding ways to deliver on these requirements in the face of some significant restrictions.

Thus, just as data processing and enterprise application deployments required different skill sets, organizational structures, and technology capabilities, the new era of “doing more with what you have” entails some significant adaptations on IT’s part, and in my next post, I’ll get into some of these changes.

On a final note, my characterization of IT’s three eras is obviously a generalization.  Different companies, industries, and even countries will be at various stages along this timeline, and the eras overlap to some extent (think of three bell curves overlapping one another).  Nevertheless, I think the progression is clear, and even companies that aren’t as far along this path will at some point want to think about how they equip themselves for the road ahead.

Anyway, more thoughts on this to come.

