
What is Microsegmentation?

by Mustafa Ali | October 23, 2017

Wondering, “What is microsegmentation?” Microsegmentation is the art of using software-defined policies, instead of hardware network configurations, to make network security more flexible. It only works if implemented with the right tools and forethought.

Models Used in Microsegmentation

There are four architectural models used in microsegmentation. They are:

  1. Hybrid model – a combination of third-party and native controls.
  2. Overlay model – typically uses some form of software or agent within each host to moderate communications, rather than relying on the network itself. Vendors offering this model include Unisys, vArmour, VMware NSX, ShieldX, Juniper, Illumio, Guardicore, Drawbridge Networks, CloudPassage, and Cisco.
  3. Third-party model – mainly based on a virtual firewall provided by third-party firewall vendors, including Huawei, Sophos, SonicWall, Palo Alto Networks, Juniper, Fortinet, Check Point, and Cisco.
  4. Native model – makes use of the inherent capabilities provided by the infrastructure, hypervisor, operating system, IaaS, or virtualization platform.

When considering microsegmentation solutions, even for a single virtual machine, avoid thinking about technical solutions only. The solution should encompass technology, process, and people. Therefore, do not simply go for the model of security you think is easiest to implement. Rather, select the architectural model that best secures the operation of your modern data center.


Microsegmentation allows security policies to be defined by workload, application, VM, OS, or other characteristics. Source: VMware

Benefits of Microsegmentation

Now that you’re no longer asking yourself, “What is microsegmentation?”, the next step is understanding its benefits. This will help you see why it is important in data centers and other IT/ICT platforms. The following are the benefits of microsegmentation:

Extensibility

Security administrators depend on microsegmentation to adapt to new and unfolding scenarios. Threat topologies in data centers are constantly changing: old vulnerabilities become inconsequential as new ones are exposed, and user behavior is a variable that continually surprises security administrators. Because new security scenarios keep emerging, administrators need to be able to extend their capabilities, and microsegmentation enables that.

For example, a security administrator may start with an effective distributed firewall in the data center. They may then add IPS and stateful firewalling for deeper traffic visibility. Alternatively, the administrator may strengthen server security using agentless anti-malware.

All the same, administrators need these functions to cooperate for security to be effective. To make this happen, microsegmentation enables the sharing of intelligence between security functions. The end result is a security infrastructure working in concert to craft responses to different situations.
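As a loose illustration of that intelligence sharing, a detection made by one security function can feed directly into the policy decisions of another. The Python sketch below is a simplified model; the class names and events are hypothetical assumptions, not any vendor's API.

    # Hypothetical sketch: security functions sharing intelligence.
    # When the IPS flags a workload, the distributed firewall tightens
    # its policy for that workload automatically.
    class DistributedFirewall:
        def __init__(self):
            self.quarantined = set()

        def on_threat(self, workload: str):
            # React to intelligence shared by another security function.
            self.quarantined.add(workload)

        def allows(self, src: str, dst: str) -> bool:
            return src not in self.quarantined and dst not in self.quarantined

    class IntrusionPreventionSystem:
        def __init__(self, subscribers):
            self.subscribers = subscribers

        def detect(self, workload: str):
            # Broadcast the detection so other functions respond in concert.
            for subscriber in self.subscribers:
                subscriber.on_threat(workload)

    firewall = DistributedFirewall()
    ips = IntrusionPreventionSystem([firewall])
    ips.detect("app-vm-07")
    print(firewall.allows("web-vm-01", "app-vm-07"))  # False - quarantined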

Ubiquity

Traditional data center architecture concentrates security on important workloads, usually at the cost of minimal security for lower-priority systems. Deploying and managing traditional security in virtual networking is costly, and that cost forces data center administrators to ration security. Intruders take advantage of the weak security on low-priority systems to infiltrate the data center.

To achieve a sufficient level of defense, security administrators need a high level of security on every system in the data center. Microsegmentation makes this possible because it embeds security functions into the infrastructure itself. Taking advantage of this allows administrators to rely on security functions for all workloads in the data center.


Persistence

A security administrator needs to be sure that security enforcement persists even when the network environment changes. This is crucial because data center topologies change constantly: workloads are moved, server pools are expanded, networks are renumbered, and so on. The constants amid all this change are the workload and its need for security.

Security policies implemented when a workload was first deployed in a changing environment will no longer be enforceable after a short while. This is most common where the policy relied on loose associations with the workload, such as protocol, port, and IP address. The challenge of maintaining persistent security is aggravated when workloads move to the hybrid cloud, or even to other data centers.

Microsegmentation gives administrators more useful ways to describe a workload. They can describe the inherent characteristics of a workload instead of depending on IP addresses, and that information is then tied back to the security policy. Once this is done, the policy can answer questions such as: What kind of data will this workload handle (personally identifiable information, financial, or low-sensitivity)? What will the workload be used for (production, staging, or development)? Additionally, administrators can combine these characteristics to describe inherited policy attributes. For instance, a production workload handling financial data may get a higher level of security than a development workload handling the same kind of data.
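To make that concrete, here is a minimal Python sketch of how a policy engine might derive a security tier from workload attributes rather than IP addresses. The attribute names and tiers are illustrative assumptions, not any vendor's actual policy model.

    # Illustrative sketch: policy keyed off inherent workload attributes
    # (environment, data classification) rather than IP addresses, so the
    # policy follows the workload even when the network changes.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Workload:
        name: str
        environment: str   # e.g. "production", "staging", "development"
        data_class: str    # e.g. "financial", "pii", "low-sensitivity"

    def security_tier(workload: Workload) -> str:
        if workload.environment == "production" and workload.data_class in {"financial", "pii"}:
            return "strict"      # e.g. distributed firewall + IPS + anti-malware
        if workload.data_class in {"financial", "pii"}:
            return "elevated"    # e.g. distributed firewall + IPS
        return "baseline"        # e.g. distributed firewall only

    # A production workload handling financial data inherits a stricter
    # policy than a development workload handling the same kind of data.
    print(security_tier(Workload("billing-db", "production", "financial")))       # strict
    print(security_tier(Workload("billing-db-dev", "development", "financial")))  # elevated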

How to Make Microsegmentation Work

Similar to conventional virtualization, there are many ways to implement network microsegmentation. In most scenarios, existing protection mechanisms and legacy infrastructure are systematically augmented with new technologies, including virtual firewalls and software-defined networking. When it comes to adopting microsegmentation technology, there are three major considerations.

Visibility is the first consideration. Potential adopters need to thoroughly understand the communication patterns and network traffic flows within, from, and to the data center. Next, use a zero-trust approach to implement security policies and rules: a complete lockdown of communications. Throughout the deployment of microsegmentation, zero-trust policies should be followed, with communication across the network allowed only selectively, based on the results of the earlier traffic analysis. This is the best practice for anyone who wants to ensure both application security and connectivity.
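As a rough illustration of that zero-trust posture, the Python sketch below defaults to denying traffic and only permits flows that earlier analysis surfaced as legitimate. The flow tuples and tier names are assumptions made for the example.

    # Zero-trust sketch: deny by default, allow only flows that earlier
    # traffic analysis identified as legitimate. The flows are illustrative.
    observed_flows = {
        ("web-tier", "app-tier", 8443),
        ("app-tier", "db-tier", 5432),
    }

    def is_allowed(src: str, dst: str, port: int) -> bool:
        # Default deny: a flow passes only if analysis explicitly allowed it.
        return (src, dst, port) in observed_flows

    print(is_allowed("web-tier", "app-tier", 8443))  # True  - explicitly allowed
    print(is_allowed("web-tier", "db-tier", 5432))   # False - never observed or approved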

The process must be repeated regularly. Analyzing traffic and distilling rules is not a one-time deployment effort. It needs to be a continuous activity, so that when workloads change or traffic patterns shift with new applications, current analytical results can be used to tune the microsegmentation rules effectively. All of these considerations put an emphasis on the choice of tools and hypervisor used to facilitate microsegmentation.
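One way to picture that ongoing loop, using the same simplified flow representation as above, is to periodically compare freshly observed flows against the active rule set and flag both new flows that need review and stale rules that are candidates for removal.

    # Continuous tuning sketch: compare fresh traffic analysis against the
    # active microsegmentation rules (same simplified flow tuples as above).
    def tune_rules(current_flows: set, active_rules: set):
        new_flows = current_flows - active_rules     # observed, but not yet allowed
        stale_rules = active_rules - current_flows   # allowed, but no longer observed
        return new_flows, stale_rules

    active_rules = {("web-tier", "app-tier", 8443), ("app-tier", "db-tier", 5432)}
    current_flows = {("web-tier", "app-tier", 8443), ("app-tier", "cache-tier", 6379)}

    new, stale = tune_rules(current_flows, active_rules)
    print("Review and possibly allow:", new)     # new cache-tier flow
    print("Review and possibly remove:", stale)  # unused db-tier rule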
