Breaking down gateway and host-based security approaches in the cloud.

For most organizations, moving to the cloud has become a requirement, not an option. In the cloud, many of the same security controls are required, but how they are delivered needs to change to fit the agile cloud environment and DevOps methodologies. In the data center, security controls are delivered either at the perimeter, using hardware security appliances for firewalling or intrusion detection and prevention (IDS/IPS), or through software (agents) on the servers themselves, such as anti-malware or file integrity monitoring. In the cloud, security is a shared responsibility, which means that both the cloud provider and the cloud user share responsibility for providing protection. Cloud providers like Azure deliver immense physical data center security and processes, which is great for cloud users as it takes a lot of work off their plate. But it also means that cloud users can’t bring hardware firewall or IPS devices to the cloud, as they don’t have access to the physical servers. That leaves two options for controls like IPS:

  1. Gateway or virtual appliance
  2. Host-based with security software (agent) running on each workload

To get a better idea of the different approaches let’s dive into an example of IDS/IPS architecture in the cloud, as it is one of the security controls that most organizations have and it is often required for compliance.

 

Intrusion Detection and Prevention (IDS/IPS) Overview

Intrusion Detection Systems (IDS) were the first generation of network security controls. A reactive control, an IDS would alert you when a breach or attack occurred so you could investigate. Intrusion Prevention Systems (IPS) overtook IDS in popularity because of their ability to proactively block attacks, not just react to them. IDS/IPS systems for data centers were network-based and consisted of dedicated hardware appliances, with performance and throughput determined by the size of the network interface, CPU, and memory.

Virtual Appliance (Gateway) Approach

With the security virtual appliance deployment model, there are two methods in which IDS/IPS can be used. Method 1 requires software to be deployed to each instance to send a copy of the network traffic to the appliance for inspection. Method 2 requires routing configuration changes so that the security virtual appliance sits inline and inspects the traffic directly. Figure 1 illustrates both deployment scenarios.

  Figure 1: Security Virtual appliance

 Host-based Security Approach

The other option is to deploy software (also known as an agent) onto each workload. This allows the security policies to be tailored to the specific software executing on that server, removing the need to run generic or extraneous rules that take up resources. For instance, with Trend Micro Deep Security you can run a Recommendation Scan that quickly determines the specific rules needed to protect the instance, depending on the OS and patches applied. Additionally, for environments with auto-scaling requirements, the deployment of security software and policies can be automated with configuration management tools such as Chef, Puppet or OpsWorks. This approach is illustrated in Figure 2. A host-based approach fits seamlessly into your existing deployment workflow.
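To make the tailoring idea concrete, here is a minimal sketch of how a recommendation scan maps the software found on a host to only the rules that host needs. The rule catalog and rule names below are made up for illustration; the real Deep Security Recommendation Scan and its rule set are far richer.

```python
# Hypothetical rule catalog: maps a detected software package to the
# IPS rules that protect it. All names here are illustrative only.
RULE_CATALOG = {
    "apache2": ["HTTP Protocol Decoding", "Web Server Exploit Filter"],
    "openssh": ["SSH Brute-Force Detection"],
    "mysql":   ["Database Protocol Enforcement"],
}

def recommend_rules(installed_packages):
    """Return only the rules relevant to the software actually present,
    instead of running every generic rule on every host."""
    rules = set()
    for pkg in installed_packages:
        rules.update(RULE_CATALOG.get(pkg, []))
    return sorted(rules)

# A web server gets web and SSH rules; a database server gets only the
# database rule, so no resources are spent on irrelevant inspection.
print(recommend_rules(["apache2", "openssh"]))
print(recommend_rules(["mysql"]))
```

The point of the sketch is the shape of the logic: protection is derived from what is running on each workload, not applied as one generic blanket policy.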

   

Figure 2: Host-based IPS from Deep Security

 Comparing Approaches

One of the biggest architectural problems with network-based IDS/IPS is the use of encryption to protect network traffic. This security practice protects the contents of network traffic, but it makes it difficult or impossible for an appliance to analyze the traffic and detect exploits. With host-based IDS/IPS, network encryption is less of an issue, as the host decrypts traffic before it is analyzed. The following is a summary comparison of the different methods that can be used to deploy IDS/IPS protection for cloud instances.

              Virtual Appliance (Method 1: Tap) | Virtual Appliance (Method 2: Inline) | Host-based Security
Scaling       Parallel to the workload          | In proportion to the workload        | With the workload
Protection    Detect only                       | Generic protection                   | Customized protection

Summary

Although both security virtual appliances and host-based software can be used to deliver IDS/IPS in the cloud, there is a strong argument that a host-based approach is easier and more cost-effective:

  • Host-based security can be deployed with automation tools like Chef or Puppet.
  • Host-based controls seamlessly auto-scale to thousands of instances without requiring additional virtual appliances or managers to be added.
  • A host-based approach reduces resource requirements as there is no duplication of traffic and no specialized instance is required for a virtual appliance.
  • Host-based controls eliminate the need to configure SSL/TLS decryption in order to analyze network traffic.

Host-based security enables controls and policies to be customized for each workload.

In the past, IPS and IDS have often been framed as one versus the other. While each offers its own unique attributes, the key to success may lie in a blend of the two. So how can you decipher what each intrusion defense tool offers? In this article, we break down the main differences between IPS and IDS and how you can leverage their capabilities to protect your workloads.
IPS vs IDS
An Intrusion Prevention System (IPS) is a security control tool; it inspects traffic for vulnerabilities and exploits within the network stream and can remove packets before they reach your applications. It can also act as a protocol enforcement tool by ensuring each packet you accept is correct for that application. For example, you can allow any HTTPS packet that comes in on port 443, but block any non-HTTPS packets, like SSH, on the same port. This allows you to do additional enforcement of the traffic for multiple protocols on the same network port.

An Intrusion Detection System (IDS) is a visibility tool; it can alert an administrator on patterns within traffic while allowing the traffic to pass through to the application. This allows you to create IDS rules that give additional information about the traffic being accepted into your environment. For example, you might have an IDS rule to inspect the SSL ciphers of servers communicating with you to ensure they follow compliance mandates and security policies. It is a powerful tool for giving in-depth information without impacting your applications.
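As a toy illustration of the protocol-enforcement idea, the sketch below allows genuine TLS on port 443 while dropping something like an SSH session tunneled over the same port. This is a sketch only; a real IPS engine inspects far more than the first few bytes of a flow.

```python
def looks_like_tls(first_bytes: bytes) -> bool:
    """A TLS record begins with a content-type byte (0x14-0x17;
    0x16 is a handshake) followed by a 0x03,0x0X protocol version."""
    return (len(first_bytes) >= 2
            and 0x14 <= first_bytes[0] <= 0x17
            and first_bytes[1] == 0x03)

def enforce_port_443(first_bytes: bytes) -> str:
    """Allow only traffic that actually speaks TLS on the HTTPS port."""
    return "allow" if looks_like_tls(first_bytes) else "drop"

print(enforce_port_443(bytes([0x16, 0x03, 0x01, 0x02, 0x00])))  # TLS handshake: allow
print(enforce_port_443(b"SSH-2.0-OpenSSH_8.9\r\n"))             # SSH banner: drop
```

The same check inverted, i.e. alerting instead of dropping, is exactly the IDS posture described next.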
Layering Your Security
Ideally, you want to use both technologies within your environment. The IPS functions protect your workloads from vulnerabilities and exploits, giving you an additional layer of security, while an IDS is helpful for monitoring and investigation without downtime to users or applications. This allows administrators to build additional IPS policies based on the information surfaced by the IDS to keep your environment protected.

Having a tool like Deep Security, which can be configured as an IPS, an IDS, or both, is extremely useful for implementing security controls and removing vulnerabilities, as well as for giving you real-time information about traffic patterns and policies. Each rule within Deep Security can be configured either in Prevent (IPS) or Detect (IDS) mode, giving you granular control of your security posture while still allowing your applications to run without impact. Combine this with our recommendation scan technology and your network security becomes context-aware, matching the correct IPS and IDS rules to the operating system and applications running on your workload. Deep Security only applies the rules you need within your environment, keeping your performance costs low.

Intrusion detection and prevention are valuable security tools, especially for cloud workloads, where you can easily spin up a workload that’s visible to the world. By using Deep Security, you can add another layer of network security controls and visibility to your security arsenal.

A Perspective from Security

Blue/green deployment (or red/black, A/B) is a methodology for eliminating downtime from your workloads by bringing up a parallel production environment and implementing required changes there before moving traffic from one group to the other. It is an effective technique for minimizing the risk of application changes, ensuring you have appropriate time to test while your users are unaffected and served as normal. There are also security events which can be handled similarly.

Let’s take a closer look at some specifics and scenarios.

In this figure, we have a set of Azure VMs running behind an Azure Load Balancer in production, labeled as blue. These workloads could be running a number of different services, such as a LAMP stack or application logic.

Next, we’ll bring up a parallel architecture that mirrors the blue workloads. These instances could be in a separate subnet or network security group; you might even place them in the exact same location with an enumerated version in their names. Your next goal is to apply the change on the green side while blue is still handling production. This change could be new application logic, a patch, or an operating system hotfix: anything that could cause an outage for your customers and therefore requires testing.

After the change has been made, it’s important to test all aspects of the Azure VMs to ensure proper functionality. Since these are about to go into production, and the whole purpose of this technique is to eliminate downtime, testing is the most critical stage.

Finally, when you complete your testing, you will promote the green side to production. You could also instantiate new instances into the blue side and delete the VMs running the older code.

As you move back and forth with this code deployment and application development, you can minimize the impact to your users to hopefully zero.

From a security perspective, this also allows you to buy time during a breach or attack. By bringing up a parallel environment, you can test new firewall or intrusion prevention rules, pull in a new security hotfix, or simply remove an attacker’s foothold in the existing instances, forcing them to start their attack over. You could also use techniques like quarantining instances into a locked-down security group for forensic analysis, or switching over the deployments automatically in the case of a malware or other alert from your security tool.

The ability to swap back and forth between parallel production environments allows you to deal with many situations since it effectively makes compute disposable. If you can move your workloads seamlessly without loss of user connectivity, it gives your environment resiliency and flexibility to respond to any situation (hopefully automatically).
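The swap logic described above can be sketched in a few lines. In this toy model the load-balancer "switch" is simulated by a field; in Azure you would repoint the Load Balancer's backend pool instead. What the sketch captures is the essential invariant: only the idle pool is ever modified, and traffic moves only after the idle pool passes testing.

```python
class BlueGreen:
    """Toy blue/green deployment: two pools, one active at a time."""

    def __init__(self, blue, green):
        self.pools = {"blue": blue, "green": green}
        self.active = "blue"  # blue starts out serving production

    @property
    def idle(self):
        return "green" if self.active == "blue" else "blue"

    def apply_change(self, change):
        """Apply a change function to the idle pool only; the active
        (production) pool is never touched."""
        self.pools[self.idle] = change(self.pools[self.idle])

    def promote(self, health_check):
        """Swap traffic to the idle pool, but only if it passes testing."""
        if health_check(self.pools[self.idle]):
            self.active = self.idle
        return self.active

deploy = BlueGreen(blue={"version": "1.0"}, green={"version": "1.0"})
deploy.apply_change(lambda pool: {**pool, "version": "1.1"})  # patch green
print(deploy.promote(lambda pool: pool["version"] == "1.1"))  # green
print(deploy.pools["blue"])  # untouched, ready for rollback
```

Because the old pool is left intact after promotion, rolling back (or quarantining it for forensics, as described above) is just another swap.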

 

When you’re tasked with meeting the compliance requirements to achieve and maintain PCI DSS compliance, you’ll soon realize that minimizing the number of security tools you use can be a huge asset. When it’s time for your PCI DSS audit, you can hit the accelerator with Trend Micro Deep Security as a Service.

What do I need to know about PCI DSS?

If your organization has applications that deal with credit or payment card data, you are required to go through a process outlined by the Payment Card Industry (PCI).

If your applications are in the cloud, like Azure, PCI compliance can be easier, as long as you choose the right service provider. Infrastructure as a Service (IaaS) providers like Microsoft Azure have PCI DSS Level 1 certification. This means they have validated their security controls, people and processes with auditors, and they take care of many aspects that you would be responsible for if your application were in a physical data center. If you’re using SaaS offerings for log management, monitoring or security, they need to be PCI DSS certified too, even if the service doesn’t directly deal with cardholder data.

Here is the real question.

Are your SaaS products also PCI DSS Level 1 certified? It’s time to check: as of version 3.0 of the standard, if you use third-party Software as a Service (SaaS) offerings, they are included in the scope of your PCI audit!

We’re happy to announce that Trend Micro™ Deep Security as a Service™ is now a PCI DSS Level 1 Service Provider for your Azure workloads! This means you can streamline your PCI DSS certification process with a single tool.

Deep Security as a Service removes the cost and effort of running the security management stack. All of your security policies and events are stored securely and managed by Trend Micro. Best of all, you can get up and running with Deep Security as a Service in just a few minutes with our 30-day free trial.

Trend Micro has saved users months of precious resource time on PCI DSS projects by meeting many of the requirements with a single tool, including critical controls that address requirements like 11.4 (intrusion prevention), 11.5 (integrity monitoring), 5.1 (anti-malware) and many more. Here are just a couple of examples:

  • For Royal Gate, Deep Security accelerated PCI DSS compliance for its payment service platform and increased security within its hybrid environment.
  • For Guess?, Inc., Deep Security helped the company segment traffic and fulfill multiple PCI requirements rapidly.

For more detailed information on how Trend Micro Deep Security can help you accelerate PCI compliance, download the detailed matrix of PCI requirements here, written by PCI Qualified Security Assessor (QSA) Coalfire.

Information Technology Management, or simply IT Management, is a broad term, and there are many disciplines tied to it: configuration management, service management and security management, to name a few.

The one thing that is constant in life is CHANGE! The Information Technology world is no different: the digital transformation started decades ago, but the speed at which these changes are happening shows no sign of slowing down; it is accelerating rapidly. And this is just the start of a world where everything is connected to everything else.


There was a reason why, in Skyfall, James Bond stopped by a garage, told M that they would have to switch cars, and opened the door to reveal the 1963 Aston Martin DB5 before they drove together to Scotland, with M complaining the car was uncomfortable and Bond jokingly threatening to use the ejector seat. Confused about why I’m bringing this up? The MI6 cars all have trackers!

This digital transformation is revolutionizing our businesses because when you migrate from the analog to the digital world, you get your hands on information that you didn’t have before. The growing volume of information collected as part of this transformation, which we call “big data”, opens up a new set of opportunities for businesses that didn’t exist before. Look at the digital thermostat transformation (think Nest and the like): businesses can now learn your temperature-adjusting patterns, and with this data in hand they can create new business models. For example, this information is used by energy companies to plan and adjust their power plant capacity, peak-rate billing and so forth.

These innovations and advancements in the technology that creates our connected world present new challenges to management solutions, and a true “single pane of glass” is now a must in your IT strategy. But is there a single-pane-of-glass solution that can equip today’s information technology professionals with the tools they need to succeed?

Since this blog post is written for our Azure site, you have guessed it right: I’m talking about Microsoft Operations Management Suite (aka OMS), Microsoft’s cloud-based IT management solution. There are four solution areas offered under OMS:

Figure: The OMS solution areas

We will focus our discussion in this blog on the Insight & Analytics offering. Log Analytics helps you collect, correlate, search, and act on log and performance data generated by operating systems, network devices and applications; simply put, you can collect and analyze machine data from virtually any source.

If this is indeed true, then let’s see how we can leverage the OMS Log Analytics service and bring Trend Micro Deep Security event data into OMS to help identify and resolve security threats. The good news is that Trend Micro Deep Security offers seamless integration with the OMS data analytics service, thanks to the OMS agent.

Architecture Components of OMS – Log Analytics

Before we look into how to integrate Trend Micro Deep Security with the OMS Log Analytics service, I’d like to share the architecture components of OMS.

OMS Repository is the key component of OMS; it is hosted in the Azure cloud. Data is collected into the repository from connected sources by configuring data sources and adding solutions to your subscription.

OMS Agent is a customized version of the Microsoft Monitoring Agent (MMA). You need to install and connect agents on all of the computers that you want to onboard to OMS in order for them to send data to OMS.

Connected Sources are the locations from which data can be retrieved for the OMS repository.

Data sources run on Connected Sources and define the specific data that’s collected.

Integration of Deep Security with OMS – Log Analytics

Now that we have a basic understanding of the components of the OMS Log Analytics service, let’s see how the integration of Deep Security with OMS Log Analytics works.

The integration of Deep Security with OMS is very simple: Deep Security can write its event data in CEF/syslog format to one of the OMS connected sources, where it is collected by the Syslog data source on the OMS Linux agent. One thing you need to know about the OMS agent and syslog support is that either rsyslog or syslog-ng is required to collect syslog messages.
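To give a feel for what arrives at the connected source, here is a minimal parser for the CEF header format that this kind of syslog output uses. The sample event below is illustrative, not a verbatim Deep Security log, and the extension handling naively assumes values contain no spaces.

```python
def parse_cef(line: str) -> dict:
    """Split a CEF line into its seven header fields plus extensions.
    Header: CEF:Version|Vendor|Product|DeviceVersion|SignatureID|Name|Severity|ext"""
    _, body = line.split("CEF:", 1)
    (version, vendor, product, dev_version,
     sig_id, name, severity, ext) = body.split("|", 7)
    event = {"vendor": vendor, "product": product,
             "signature_id": sig_id, "name": name,
             "severity": int(severity)}
    # Naive extension parsing: assumes "key=value" pairs whose values
    # contain no embedded spaces.
    for pair in ext.split():
        if "=" in pair:
            key, value = pair.split("=", 1)
            event[key] = value
    return event

sample = ("CEF:0|Trend Micro|Deep Security Agent|10.0|21|"
          "Firewall Event|5|src=10.0.0.5 dst=10.0.0.9 dpt=22")
print(parse_cef(sample)["name"])  # Firewall Event
```

This is essentially the structure that OMS's Custom Fields feature, discussed below, teases apart for you without any code.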

There are three main steps to this integration as illustrated here:

Figure: Deep Security integration with OMS

I’m going to skip the installation and configuration of a syslog data source here, assuming you already know that part. When it comes to event forwarding in Deep Security, there are two integration options available for configuring Deep Security to forward security events to the OMS connected source.

Relay via Deep Security Manager: This option sends the syslog messages from the Deep Security Manager after events are collected on heartbeats.

Direct Forward: This option sends the security events/messages in real time directly from the agents.

This decision depends on your deployment model, network design/topology, bandwidth and policy design. The simplest and most often used choice is Relay via Manager, as shown below.

Figure: Relay via Manager

Once the event data is collected and available in OMS, you can leverage log searches, where you construct queries to analyze the collected data. The raw syslog/CEF data that Deep Security sends to OMS can be extracted using OMS’s Custom Fields feature. This feature utilizes FlashExtract, a program-synthesis technology developed by Microsoft Research, to extract the fields you teach it to recognize.

Figure: Extracting fields from syslog data with Custom Fields

It’s a little work up front to extract the fields of interest, but once that’s done, all your custom fields become searchable fields which you can use to aggregate and group data.


The last thing I want to touch on is the View Designer feature. With View Designer, you can create visualizations and dashboards using the event data available in OMS as views; for example, you can use Custom Fields to build a view of what matters most to you.

Figure: A custom Deep Security firewall view in OMS

So there you have it: now you can use Deep Security to protect your workloads running across complex, hybrid infrastructures, and use OMS to gain the control and visibility needed to identify and resolve security threats rapidly. There is no need to worry about having multiple tools and interfaces, and most importantly, no need to spend valuable time on software setup and complex integration options, thanks to OMS, the all-in-one cloud IT management solution.

With Microsoft’s release of Application Gateway Web Application Firewall (or WAF), you now have an additional layer of defense built into your Application Gateway against network-based attacks.

When you’re looking to secure your workloads, you should build your defenses in layers to avoid any single weak point. The goal is to stop any attacks as far from your data as possible. This approach lowers your risk as it provides multiple controls the opportunity to detect and prevent an attack.

WAF protection is applied at each Application Gateway, meaning that these attacks never reach your Azure VMs.

This is a great first line of defense for your web applications. It includes protection against malicious sessions, HTTP DoS sessions, and covers the majority of the OWASP Top 10.


Check out the Microsoft Azure Website for more information about WAF capabilities.

For more complex attacks, Deep Security’s intrusion prevention capabilities fit the bill. Deep Security monitors all network traffic to your instances and can detect and stop attacks before they reach your applications. It can also perform protocol enforcement and drop unknown or non-conforming traffic, which can help protect your workloads from new attacks. And it’s not just for web apps: your operating system is also protected from exploits and vulnerabilities.


Working together, Microsoft Azure’s Application Gateway WAF and Deep Security provide your web applications a strong, layered defense.

To learn more about how Deep Security can help solve your security needs within Azure, contact us at azure@trendmicro.com

Learn how Trend Micro Deep Security is working hand in hand with Microsoft to provide customers a secure cloud that meets their business needs for the hybrid cloud reality of today, and tomorrow.

In this webinar you will learn:

  • Microsoft Ignite announcements for Azure
  • Maersk’s security journey to the cloud
  • How organizations should be thinking about security for the hybrid cloud
  • How to use Deep Security to manage and deploy security in the cloud
  • How to achieve cost effective compliance
Click the image below or follow this link to watch the on-demand webinar now!

Trend Micro has developed a security platform that is optimized for hybrid cloud deployments. It includes a wide range of security controls that help address an organization’s server security responsibility and simplify security management.

To learn more about Maersk and their journey to the cloud, listen to our Microsoft Ignite 2016 session featuring the Head of Security and Security Manager sharing what they learned and the role that security played in the migration from the data center to the cloud. Watch the full session here.

Are you a CISO, or do you work in cloud or security operations and architecture? The decisions you make when migrating and securing workloads at scale in Azure have a large impact on your business. Trend Micro got up close and personal with the team at Maersk to better understand their Azure migration and how security played a role in the decisions they made.

This recorded session from Microsoft Ignite 2016 will help you jump-start your migration to Azure or, if you’re already running workloads in Azure, learn how companies like yours are using Azure to improve the efficiency of their deployments. Maersk’s Head of Security and Security Manager share what the organization learned while tackling these issues at scale in their hybrid environment. You’ll hear about the security challenges they faced and the lessons learned in moving from on-premises servers to a cloud environment in Azure.

The integration with Trend Micro Deep Security on Maersk’s Azure cloud platform resulted in several security and hybrid cloud benefits, including:
  • Visibility across their physical and cloud environments
  • A much more stable system and standardized environment on a supported platform
  • Analysis of security incidents from one central, unified location
  • Decreased time invested to analyze data
  • Real-time alerts for file integrity changes
  • Compliance established in a streamlined and cost-effective manner
  • Efficient internal communication due to the holistic, single platform solution

The Maersk Group is a worldwide conglomerate operating in more than 130 countries with a workforce surpassing 89,000 employees. Owning the world’s largest container shipping company, Maersk is involved in a wide range of activities in the shipping, logistics and oil and gas industries. Due to Maersk’s breadth of business channels and global reach, it required an integrated system to manage its various data centers across the world for seamless flow of information.


To learn more about how Deep Security can protect your Azure migration and projects, read about our features or contact us for a quick demo today.

Now that public cloud is a generally accepted, and expected, path for enterprises dealing with large amounts of applications and data, it’s up to cloud solution providers and partners to make it that much easier to provide a quick and painless transition to the cloud. Thanks to the introduction of the Azure Quickstart Templates at Microsoft Ignite 2016, customers are given the tools they need to truly embrace the possibilities of the cloud.

The cloud security manager’s balancing act

Many enterprises are facing common challenges in cloud security: a lack of both time and expertise. The speed of the cloud means that needs are constantly changing, and maintaining the time and expertise on your team can be an overwhelming challenge. Time is your most valuable resource, and without the right expertise, the time you spend trying to improve your IT infrastructure can be fruitless.

Partnering to help customers embrace cloud

We often run into situations where our customers want to launch multiple products from the Azure Marketplace, but the time that it takes to do this can be daunting, not to mention the level of technical expertise required to ensure that all settings and variables are appropriately entered. That’s where Azure Quickstart Templates come in.

Azure Quickstart Templates are ARM templates created by trusted Microsoft partners and designed to help save you time when looking to get started with integrated, multi-artifact solutions on Azure. Quickstart Templates are helpers and learning tools that you can customize to meet your unique business needs.

The goal of the Trend Micro Quickstart is to reduce the amount of time and technical expertise required to launch numerous products within an Azure account, integrating pre-configured products like Deep Security, Chef and Splunk with a single click. See the Trend Micro Quickstart here.


How will Azure Quickstart Templates help you?

  • Save time: In under an hour, you can deploy fully integrated solution stacks to suit your enterprise needs, reducing the time and manpower needed to test and launch your cloud project.
  • Provide expertise: The heavy lifting is done for you; the Quickstart Templates automate much of the manual work that would otherwise be involved in stitching artifacts together.
  • Speed time to value: Whether you’re looking for a proof of concept, or are ready to start your cloud project or migration, deploying the Trend Micro Quickstart will give you an enterprise-ready environment that you can start using immediately.

For a more technical overview and steps to get started, you can read my colleague Saif’s blog and check out his video here.

Ready to get started on your own? Visit the Azure Quickstart Template website to learn more.

Technology advancements such as high-speed Internet connectivity and the ability to create abstraction layers in computing environments have allowed us to achieve things that were unthinkable ten years ago. I have personally experienced these advancements in my professional life. At one point, I was happy to get my hands on virtualization and the ability to run an integrated solution from my laptop and an external portable storage device for my work. Now, the eruption of cloud computing combined with the power of orchestration tools is mind-blowing. We have entered an era where we are looking to automate everything, aka Infrastructure as Code.

Today, I’m happy to talk about our Azure Quickstart Template. Let’s get started with the technical details.

What makes up this Quickstart Template?

This integrated stack consists of Trend Micro Deep Security, Splunk Enterprise and Chef automation platform, all running on Azure.


How is this quickstart template created?

This integrated stack is built using a JSON template based on Microsoft Azure Resource Manager (ARM) templates. With ARM templates, we can deploy topologies quickly and consistently, with multiple services along with their dependencies. You build it once and consume it many times. It’s pretty powerful: since we, working with our Azure partner, have already done this for you, you can simply consume it.

What’s in it for me?

It saves you a lot of time that you can spend on things that matter to you; I don’t know, perhaps watching a hockey game (yes, I’m Canadian and it’s our sport). Seriously, think about it: if you are familiar with these solutions, then you know that each element has various components, such as a web-based management application and a database, and requires specific communication paths, database schemas and so on. Setting up this type of environment with fully functional, integrated components would take you at least a couple of hours, and that estimate assumes your three-year-old daughter is not screaming in the background and you have focused time to build it. I don’t know about you, but I love it when someone else can do part of my job and make it easy for me.

I’m with you, tell me more.

Okay, if you are still reading, then I’d like to think I have your attention, so let’s look at this diagram for more technical detail about what is involved in this Quickstart.

To break it down, we have:

  • A storage account in the resource group.
  • A Virtual Network (vNet) with four subnets
  • Virtual Machines to host solution components
  • Network security groups to control what communication paths are allowed
  • Azure SQL DB to host Deep Security persistent data
  • Three test Virtual Machines: two VMs (Linux, Windows) with bootstrap scripts to install Trend Micro agents (through Azure VM extensions) and one VM (Linux) with a bootstrap script to install the Chef agent

There is a lot happening here, as you can see, and the only thing you need to do to consume this is provide some values for the template parameters, such as:

  • Where you want to deploy this stack.
  • Web application administrators account and Virtual machine administrator account credentials for the various stack components.
  • Communication ports for Deep Security
  • Virtual machine size and number of test virtual machines

It takes roughly 30-45 minutes to have this environment fully functional. At the end, we return the URLs for each solution component (Trend Micro, Splunk and Chef) so that you can simply log in to these applications and do whatever you want to do, e.g. protecting your Azure-based workloads from various vulnerabilities. Remember, cloud security is a shared responsibility: although cloud providers deliver an extremely secure environment, you still need to protect what you put IN the cloud, your workloads.
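If you want to peek inside an ARM template before deploying it, a few lines of Python are enough. The skeleton below is an illustrative stand-in for the real Quickstart template; the parameter and resource entries are made up for the example.

```python
import json

# Illustrative ARM-template skeleton; not the actual Quickstart template.
template = json.loads("""
{
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location":      {"type": "string"},
    "adminUsername": {"type": "string"},
    "adminPassword": {"type": "securestring"},
    "vmSize":        {"type": "string", "defaultValue": "Standard_D2_v2"}
  },
  "resources": [
    {"type": "Microsoft.Storage/storageAccounts"},
    {"type": "Microsoft.Network/virtualNetworks"},
    {"type": "Microsoft.Compute/virtualMachines"}
  ]
}
""")

# Parameters with no defaultValue are the ones you must supply at deploy time.
required = [name for name, spec in template["parameters"].items()
            if "defaultValue" not in spec]
print(required)  # ['location', 'adminUsername', 'adminPassword']
print([r["type"] for r in template["resources"]])
```

This is the same parameters-and-resources structure the Quickstart uses: the values you provide (location, credentials, ports, VM sizes) fill the parameters, and the resources section declares everything listed above.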

Figure: Quickstart architecture diagram

I’m sold, where can I get this template for quick start?

That was easy! The ARM template is available on the Azure website (here). You can simply click the “Deploy to Azure” button or browse the GitHub repository. You can also use PowerShell, the Azure CLI, etc. to start the deployment; the GitHub link provides the necessary documentation.

The Chef recipe for Deep Security Agent is available here.

Questions? Reach out to us by email at azure@trendmicro.com and we’ll be happy to help!