
 

Whether you work from home, in a commercial building, or traverse a grand corporate campus, security is not going away, and everyone needs to be mindful of it.  Whether you are new to network security or a seasoned veteran, you know the threat landscape is evolving daily.  Throughout this article, I will share where we started with network security and the direction many are beginning to embrace to secure their empires.

 

Did you know DEC (Digital Equipment Corporation) wrote the first paper on firewall technology?

 

In 1987, DEC wrote the first paper describing firewall technology and the premise of the classic “castle-and-moat” network security paradigm.  Castles were often the control center of a kingdom, and if they were left unprotected, they would be vulnerable to a variety of attacks and could be crippled.  Thus, starting around 1016 in medieval times, moats were used to protect them.  Castles were typically built on high ground, and the ditch dug around them, usually filled with water, was called the moat.  If you have read any fairytales, the moats always had alligators living in them as another barrier of protection; in reality, alligators would not survive in one, though moats often did hold eels and fish.  DEC applied the same premise to networks, creating an electronic perimeter called a firewall to keep attacks out.
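To make the castle-and-moat idea concrete, here is a minimal sketch of how a perimeter firewall evaluates traffic: an ordered rule table checked against each connection, with the first match winning.  The rules and field names are hypothetical, purely for illustration.

```python
# A minimal sketch of perimeter ("castle-and-moat") packet filtering.
# The rules and fields below are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_port: int

# Ordered rules: first match wins, like a classic firewall rule table.
RULES = [
    {"src_prefix": "10.0.", "dst_port": None, "action": "allow"},  # trust the inside
    {"src_prefix": "",      "dst_port": 443,  "action": "allow"},  # public HTTPS
    {"src_prefix": "",      "dst_port": None, "action": "deny"},   # default deny
]

def filter_packet(pkt: Packet) -> str:
    for rule in RULES:
        src_ok = pkt.src_ip.startswith(rule["src_prefix"])
        port_ok = rule["dst_port"] is None or rule["dst_port"] == pkt.dst_port
        if src_ok and port_ok:
            return rule["action"]
    return "deny"

print(filter_packet(Packet("10.0.1.5", 22)))     # allow: inside the "castle"
print(filter_packet(Packet("203.0.113.9", 22)))  # deny: outside the moat
```

Notice the first rule: anything already “inside” is trusted automatically, which is exactly the assumption Zero Trust would later reject.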

 

Did you know that in 2001, IEEE released the 802.1X standard, a NAC protocol?

 

The IEEE released 802.1X, a standard for port-based NAC (Network Access Control).  In short, NAC is a technology that allows organizations to keep unauthorized devices and users off the corporate network.  NAC thus solves two potential security issues at once: it grants access only to permitted devices and users, and it ensures those devices comply with specific corporate security policies.
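As a rough sketch of the two checks NAC performs, consider the following.  The device table and compliance policy below are made-up stand-ins for whatever a real RADIUS server and posture-assessment engine would do.

```python
# A sketch of a NAC admission decision: authenticate the device, then check
# its security posture. KNOWN_DEVICES stands in for a real RADIUS backend,
# and is_compliant() for a real posture-assessment engine.
KNOWN_DEVICES = {"aa:bb:cc:dd:ee:ff": "alice"}  # hypothetical MAC -> user map

def is_compliant(posture: dict) -> bool:
    """Hypothetical corporate policy: disk encrypted and antivirus running."""
    return bool(posture.get("disk_encrypted") and posture.get("antivirus_running"))

def admit_to_network(mac: str, posture: dict) -> str:
    if mac not in KNOWN_DEVICES:
        return "deny"             # unknown device: authentication fails
    if not is_compliant(posture):
        return "quarantine_vlan"  # known but non-compliant: remediation network
    return "corporate_vlan"       # authenticated and compliant: full access

print(admit_to_network("aa:bb:cc:dd:ee:ff",
                       {"disk_encrypted": True, "antivirus_running": True}))
# -> corporate_vlan
```

Routing known-but-non-compliant devices to a quarantine VLAN for remediation, rather than denying them outright, is a common NAC design choice.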

 

The Jericho Forum introduced the concept of de-perimeterization in 2004

 

The Jericho Forum comprised user members, largely a loose association of CISOs (Chief Information Security Officers), along with vendor members.  The forum argued that perimeter security had become obsolete.  Paul Simmonds, a UK-based CISO, kicked off a Black Hat Briefing in Las Vegas by saying, “We’ve lost the war on good security.”  Understand that de-perimeterization doesn’t mean forgetting about firewalls; they still have a role at the perimeter and in some web service implementations.  Rather, according to Simmonds, the answer would be a new set of interoperable, OS-agnostic security standards.

 

 

Did you know DISA published its “black core” model in 2007?

 

DISA (Defense Information Systems Agency) and the DOD (Department of Defense) released their work on a more secure enterprise strategy in 2007.  They called this new strategy the “black core” (BCORE).  The central premise of the model was moving from perimeter security to security concentrated on individual transactions.  From this we can learn that the concept of “Zero Trust” is not new: NIST SP 800-207 credits the 2007 “black core” work as one of its early precursors.

 

Did you know Google implemented a “Zero Trust” computing concept in 2009?

 

Code-named “BeyondCorp,” Google’s implementation of a Zero Trust network began in 2009 in response to the Aurora attack.  It was an open-source implementation carried out on Google’s servers, and it took flight because of a research paper Google wrote on an access proxy known as “transcend.”

 

Forrester Research uses the term “Zero Trust” for the first time.

 

John Kindervag, a Forrester researcher and analyst, first coined “Zero Trust,” captured by his motto, “Never trust, always verify.”  This revolutionary insight pointed out that risk is an inherent factor both inside and outside the network.  Many companies today believe traffic is safe if it comes from within their internal network; this is far from the truth.  In fact, around 2014, Sony was not just hacked but nuked from the inside.  Had they been using Zero Trust for their network security, the attack’s spread could have been blocked, since in Zero Trust nothing is allowed until it is validated, and only then is trust granted.
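Here is a toy illustration of “never trust, always verify”: the handler below refuses to treat a request as safe just because it arrived from an internal address; every request must present a valid credential.  The token table is a hypothetical stand-in for a real identity provider.

```python
# "Never trust, always verify" in miniature: internal origin grants nothing,
# and every request must carry a valid credential. VALID_TOKENS is a
# hypothetical stand-in for a real identity provider.
VALID_TOKENS = {"token-123": "alice"}

def handle_request(src_ip: str, token: str | None) -> str:
    # The castle-and-moat mistake would be:
    #     if src_ip.startswith("10."): return "allowed"
    # Zero Trust ignores network location and validates identity instead.
    user = VALID_TOKENS.get(token)
    if user is None:
        return "denied"          # no validated identity, no access
    return f"allowed as {user}"  # trust granted only after verification

print(handle_request("10.0.0.8", None))             # denied, despite internal IP
print(handle_request("198.51.100.7", "token-123"))  # allowed as alice
```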

 

The Jericho Forum Disbands

In 2013, the Jericho Forum officially declared de-perimeterization a “fact” and disbanded.  Over a decade later, the world is realizing what many have known for years: Zero Trust is the only way to protect against internal and external network threats.

Gartner devises the CARTA Framework.

 

In 2010, Gartner released Continuous Adaptive Risk and Trust Assessment (CARTA), an IT security framework that dives deeper than ordinary role-based access control (RBAC) by adding attribute-based access control (ABAC) to support continuous, context-aware, real-time security assessment, though it didn’t become popular until 2017.  CARTA requires effective risk and cyber security programs to have 100% device visibility and to facilitate automated control.  It also stipulates continuous monitoring of all devices, assessment, and resolution delivery to mitigate cyber security risks.  Finally, it requires segmentation, so that any breaches that do occur are limited in their movement, preventing further damage.
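To see what layering ABAC on top of RBAC looks like in practice, here is a small sketch: a plain role check, with context attributes (device health, a live risk score, location) evaluated on every request so the decision adapts continuously.  All roles, attributes, and thresholds here are hypothetical.

```python
# A sketch of CARTA-style layering: RBAC answers "does the role permit this
# action?", while ABAC adds live context so trust is reassessed continuously.
# Roles, attributes, and thresholds are all hypothetical.
ROLE_PERMISSIONS = {"engineer": {"read_source"},
                    "admin": {"read_source", "deploy"}}

def rbac_allows(role: str, action: str) -> bool:
    return action in ROLE_PERMISSIONS.get(role, set())

def abac_allows(attrs: dict) -> bool:
    # Context-aware conditions evaluated on every request, not just at login.
    return bool(attrs.get("device_managed")            # corporate-managed device
                and attrs.get("risk_score", 100) < 50  # live risk feed threshold
                and attrs.get("geo") in {"US", "CA"})  # permitted locations

def authorize(role: str, action: str, attrs: dict) -> bool:
    return rbac_allows(role, action) and abac_allows(attrs)

ctx = {"device_managed": True, "risk_score": 12, "geo": "US"}
print(authorize("engineer", "read_source", ctx))  # True
ctx["risk_score"] = 90                            # risk spikes mid-session...
print(authorize("engineer", "read_source", ctx))  # False: access adapts
```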

 

 

Gartner pushed the limits by adding SASE.

 

SASE (Secure Access Service Edge), a term Gartner coined in 2019, is a technology that regulates the security of devices wherever they connect, rather than only at the data center, to prevent cyber security threats.  It is a secure framework add-on that connects users to applications and other resources seamlessly and securely while providing the best possible experience.

 

NIST creates documented standards for ZTA (Zero Trust Architecture)

 

In 2020, NIST released publication SP 800-207 to document standards for Zero Trust Architecture (ZTA).  The document describes ZTA for enterprise architects and is also meant to be a guide to understanding ZTA, the enterprise roadmap to it, and the related concepts.

 

Did you know the U.S. Government mandated that all of its agencies adopt ZTA by 2024?

 

In 2022, the U.S. Government required that all of its agencies’ offices be on ZTA by 2024.  This requirement enforces the Government’s stance against the ever-growing, persistent, malicious cyber threat campaigns.  We all remember the SolarWinds supply chain attack, in which attackers abused this widely deployed IT management tool to push malware onto many computers across government institutions’ campuses faster than anyone knew what had happened.

 

How do you implement a Zero Trust Architecture?

Setting up a ZTA infrastructure requires 100% visibility and control over all users and traffic in the environment, including any encrypted traffic.  MFA (multi-factor authentication) is required to validate access with more than a password, for example by adding biometrics, facial recognition, or an OTP (one-time password) code.  ZTA has many benefits; unfortunately, many small companies won’t implement it immediately because of the effort it requires, since every device, user, and application must be validated.  Some firms that attempt to implement it will fail because they lack the information needed to secure legacy and IoT devices.  Thus, every device, application, and user must be considered untrusted until it is validated through more than just a password.  Gartner says that “over 50% of organizations will fail to realize the benefits of zero trust.”
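As one small piece of such a rollout, here is a sketch of validating a login with both a password and a TOTP one-time code using the pyotp library.  The user record and plaintext password check are hypothetical placeholders; a real system would verify a salted hash against an identity store.

```python
# A sketch of MFA: a password plus a TOTP one-time code (pip install pyotp).
# The user record below is a hypothetical placeholder; real systems store a
# salted password hash and the TOTP secret in an identity store.
import pyotp

USER = {"password": "correct-horse-battery-staple",
        "totp_secret": pyotp.random_base32()}

def login(password: str, otp_code: str) -> bool:
    if password != USER["password"]:        # factor 1: something you know
        return False
    totp = pyotp.TOTP(USER["totp_secret"])  # factor 2: something you have
    return totp.verify(otp_code)            # valid only in the current window

current_code = pyotp.TOTP(USER["totp_secret"]).now()
print(login("correct-horse-battery-staple", current_code))  # True
print(login("correct-horse-battery-staple", "000000"))      # almost surely False
```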

 

The reasons why ZTA implementation may fail in your organization

 

ZTA is not new; it has been tire-kicked for over a decade, but only within the last few years has it come to be seen as a viable solution to mitigate cyber threats.  ZTA implementations often fail for three reasons: technological challenges, organizational challenges, and economic challenges.  I hope you have learned something and will start investigating how to move your organization toward a ZTA infrastructure.

 

 

 

Check out more of my fantastic content at http://believemeachieve.com