Addressing the Root Cause – A Proactive Approach to Securing Desktops

The computers on your network are protected from malware, right? If you operate an environment based largely on Windows PCs, you likely have some kind of antivirus installed and centrally managed. If you have purchased a more complete desktop protection suite, you probably even have a host-based IDS/IPS protecting your machine from incoming malicious TCP scans, or from outbound connections to known malicious sites (occasionally even google.com). Operating system firewall activated? Yep! AV signatures current? Check! Global threat intelligence updated? Uh, yeah... sure. Then you should be covered against threats targeting your organization, right? Most likely not, and at times these tools actually mask intrusions because they provide a false sense of security and protection.

The Trouble with Reactionary Behavior
The problem with these tools, all of them, is that they are purely reactionary in nature. A reactionary protection tool, at any level, only activates after an event has already occurred on your host computer. That means when you get an antivirus alert on your computer, the malware is ALREADY present on the system. Yes, the AV may have stopped it, deleted it, or quarantined it (all of which are good), but it has only done so because it either has an existing signature in its database or the malware attempted to operate in a suspicious manner, tripping the AV's heuristics detection. What about when brand-new malware, 0-day exploits, or sophisticated targeted malware executes on your host?

Do you imagine your AV will detect and mitigate it? I would suggest that your AV will be none the wiser to this yet-to-be-detected threat; only once a sample has been submitted to an AV vendor for analysis will you be provided with an updated signature. Surely, if my AV missed it, one of the other layers of protection should stop it, right? Possibly: if the malware uses outbound connections that aren't considered "normal" by your OS's firewall or HIDS/HIPS software, it could be detected. But if the malware uses standard outbound connections, port 80 or, more likely, port 443, its traffic appears "normal" to the other host-based defenses you have in place.

These tools all require known characteristics of a particular threat in order to detect its presence and mitigate it. Those characteristics come from analysis of reported and discovered threats of a similar nature, which are used to develop signatures or heuristic models for detecting malware on a host. If a threat has not yet been submitted for analysis and its callback domains have not been reported as malicious, it may be a while before it is "discovered" and signatures are made available. Until then, your computer, its files, all of your activities, and the other computers on your network are at the mercy of the attacker, unabated.

Being Proactive Is Essentially Free
This is the part that is really frustrating for me as an analyst, and also as an advocate for root cause solutions. Reactionary defenses cost consumers, businesses, state and local governments, federal agencies, and the military an unreal amount of money. You would think that with all of this time and money spent on products billed as "protecting" you from cyber threats and intrusions, your environment, whether it is an enterprise or a single computer, would be better protected. This is not the case. In fact, many studies show computer-related intrusions are on the rise. Nation-state threats, advanced persistent threats (APTs), and even less-skilled hackers continue to improve their sophistication as tools get cheaper and information is freely exchanged. Why, then, do I say proactive defenses are essentially free? And if that is in fact the case, why are they not used more frequently? Proactive defense measures are essentially free, minus the time and effort of securing the root problems within your network. For this particular blog post, I am focused on host-based proactive defensive measures.

Denying Execution at the Directory Level
The "how" is actually quite simple to explain, and in fact it is not a new protection technique at all, its just not as widely used outside of *nix based systems. All that an operating system provides is a platform for applications to run on, sometimes graphical based, sometimes a simple command line. The applications are typically stored in a common location within the operating system, allowing for dynamic linking as well as simplifying the directory structure. Not all applications require the need for linking to a dynamic library as they contain all of the requirements to run on their own, so they can easily be placed anywhere within the OS and they will execute.

This is extremely convenient when a developer wants to provide software that doesn't need an official "install" and can be easily moved around. Therein lies the issue with these "self-contained" applications: they can execute from anywhere on the host, without restriction. For a demonstration, copy "calc.exe" from the "system32" folder on your Windows PC to your "desktop". The program "calc.exe" will execute just the same as if it were under "system32" because it is a completely self-contained binary. Almost all malware is designed the same way, and it typically executes from a "temp" location or the root of the currently logged-in user's directory. The execution of malware needs to be stopped from occurring in the first place. That way, regardless of your current AV signatures or HIDS/HIPS capabilities, the malware cannot run. If the malware is unable to run, the threat is effectively mitigated before it can gain any foothold.
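As a rough illustration of that demonstration, here is a minimal Python sketch, assuming a standard Windows install with calc.exe under C:\Windows\System32 and write access to the user's temp directory:

import shutil
import subprocess
import tempfile
from pathlib import Path

# Copy a self-contained binary out of its "official" location...
source = Path(r"C:\Windows\System32\calc.exe")          # assumed standard path
target = Path(tempfile.gettempdir()) / "calc_copy.exe"  # e.g., the user's %TEMP%
shutil.copy2(source, target)

# ...and it runs just the same from the new location, with no restriction.
subprocess.run([str(target)])

Malware dropped into %TEMP% or a user profile directory relies on exactly this behavior.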

So how on earth do you stop malware from executing from these locations? Do you need some kind of "agent"-based solution monitoring those particular directories? The approach is simple: deny ALL execution of programs outside of a set of approved directories (e.g., "Program Files" and "System32"), and require every application the host needs, putty for instance, to be placed within one of those approved directories. If you are running a Windows-based environment, locking down execution outside of approved directories can be implemented through both Group Policy (GPO) and Local Policy.
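The policy logic itself is nothing more than a path allow-list with a default deny. Here is a sketch of that decision in Python, purely illustrative; the directory names mirror the example above and are not taken from any particular product's configuration (requires Python 3.9+ for Path.is_relative_to):

from pathlib import Path

# Directories from which execution is allowed; everything else is denied.
APPROVED_DIRS = [
    Path(r"C:\Program Files"),
    Path(r"C:\Program Files (x86)"),
    Path(r"C:\Windows\System32"),
]

def execution_allowed(exe_path: str) -> bool:
    # Allow only if the program resides under one of the approved directories.
    path = Path(exe_path).resolve()
    return any(path.is_relative_to(approved) for approved in APPROVED_DIRS)

# A copy of calc.exe dropped into a temp folder would be denied:
print(execution_allowed(r"C:\Users\victim\AppData\Local\Temp\calc_copy.exe"))  # False
print(execution_allowed(r"C:\Windows\System32\calc.exe"))                      # True

In practice the operating system policy engine makes this decision for you; the point is simply that the rule set is small, static, and costs nothing beyond the effort of defining it.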

By expanding on an existing Windows policy called "Microsoft Windows Software Restriction" (which has been around since 2002, BTW), you can define the directories from which applications are allowed to execute. The same technique can be employed on OSX systems as well: simply remove the execute privilege from the locations within the OS that you would like to protect. In fact, I would venture to say it is easiest to implement on any *nix-based system (if it isn't already in place, as is the case on most Unix/Linux flavors).
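On a *nix host, one way to approximate this, a sketch only, assuming you control the directory and know nothing in it legitimately needs to execute, is to strip the execute bit from every file under the locations you want to protect:

import stat
from pathlib import Path

def strip_execute_bits(directory: str) -> None:
    # Remove user/group/other execute permission from every file in the tree.
    for path in Path(directory).rglob("*"):
        if path.is_file():
            mode = path.stat().st_mode
            path.chmod(mode & ~(stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH))

# Example: a user-writable location malware commonly drops into (adjust to taste).
strip_execute_bits("/tmp")

On many Linux distributions a more durable version of the same idea is to mount such locations with the noexec option, which blocks execution at the filesystem level rather than per file.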

No Silver Bullet
No solution is 100% effective, and this is no exception; there are a number of ways to get past this protection. Having said that, it adds a layer to your defense and will stop the majority of execution-based attacks. If your software is properly patched (0-days not included) and you have user privileges locked down with separate dedicated accounts, directory protection further steps up the difficulty attackers face in gaining a presence on your network. No single solution will solve all of your problems, no matter how hard a vendor sales engineer tries to sell you one. Holistic, full-spectrum defenses are the future, not "plug & play" protection hardware or software that requires updates, patching, signatures, and "threat intelligence". The other extremely important layer of protection is the Infosec professionals supporting you. Spend the money on good, talented, well-rounded security professionals who understand the cyber threat landscape and the ways in which they can help better protect your organization.

To learn more about how your network and its assets can be better protected, please check out CyberSquared for solutions to root-cause issues.

More Stories By Cory Marchand

Cory Marchand is a trusted subject matter expert on Cyber Security Threats, Network and Host Based Assessment, and Computer Forensics. Mr. Marchand has supported customers across State, Federal and Military Government as well as the Private sector over his 10+ years in the field of Computer Security. Mr. Marchand holds several industry-related certificates including CISSP, EnCE, GSEC, GCIA, GCIH, GREM, GSNA and CEH.
