Category Archives: Cloud Computing

Micro-segmentation – Protecting Against Advanced Threats Within The Perimeter



Changing with the times is frequently overlooked when it comes to data center security. The technology powering today’s networks has become increasingly dynamic, but most data center admins still employ archaic security measures to protect their network. These traditional security methods just don’t stand a chance against today’s sophisticated attacks.

That hasn’t stopped organizations from diving head-first into cloud-based technologies. More and more businesses are migrating workloads and application data to virtualized environments at an alarming pace. While the appetite for increased network agility drives massive changes to infrastructure, the tools and techniques used to protect the data center also need to adapt and evolve.

Recent efforts to upgrade these massive security systems are still falling short. Since data centers by design house huge amounts of sensitive data, there shouldn’t be any shortcuts when implementing security to protect all that data. The focus remains on providing protection only at the perimeter to keep threats outside. However, implementing strictly perimeter-centric security such as a Content Delivery Network (CDN) leaves the inside of the data center vulnerable, where the actual data resides.


(Infographic Source: Internap)

Cybercriminals understand this all too well. They constantly utilize advanced threats and techniques to breach external protections and move further inside the data center. Without strong internal security protections, hackers have visibility into all traffic and the ability to steal data or disrupt business processes before they are even detected.

Security Bottleneck

At the same time, businesses face additional challenges as traffic behavior and patterns shift. There are greater numbers of applications within the data center, and these applications are all integrated with each other. The increasing number of applications has caused the amount of east-west traffic – traffic moving laterally among applications and virtual machines – within the data center to grow drastically as well.

As more data is contained within the data center and never crosses the north-south perimeter defenses, security controls are blind to this traffic – making lateral threat movement possible. With the rising number of applications, hackers have a broader choice of targets. Compounding this challenge is the fact that traditional processes for managing security are manually intensive and very slow. Applications are now being created and evolving far more quickly than static security controls can keep pace with.

To address these challenges, a new security approach is needed—one that requires effectively bringing security inside the data center to protect against advanced threats: Micro-segmentation.



Micro-segmentation works by grouping resources within the data center and applying specific security policies to the communication between those groups. The data center is essentially divided into smaller, protected sections (segments) with logical boundaries, which makes intrusions easier to discover and contain. Despite the separation, however, application data still needs to cross micro-segments in order to communicate with other applications, hosts or virtual machines. Lateral movement therefore remains possible, which is why it is vital for threat prevention to inspect traffic crossing the micro-segments.

For example, a web-based application may utilize the SQL protocol for interacting with database servers and storage devices. The application web services are all logically grouped together in the same micro-segment, and rules are applied to prevent these application services from having direct contact with other services. However, SQL may be used across multiple applications, thus providing a handy exploit route for advanced malware that can be inserted into the web service for the purpose of laterally spreading itself throughout the data center.
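As a concrete (hypothetical) illustration, a micro-segment policy can be modeled as a default-deny table of permitted cross-segment flows; the segment names and ports below are assumptions for this sketch, not any vendor's actual API:

```python
# Default-deny policy: only listed cross-segment flows may be initiated.
# Segment names and ports are hypothetical.
ALLOWED_FLOWS = {
    ("web-tier", "db-tier", 1433),      # web services may open SQL connections
    ("web-tier", "storage-tier", 443),  # web services may reach storage over TLS
}

def is_flow_allowed(src_segment, dst_segment, port):
    """Return True if a connection crossing micro-segments matches a rule."""
    if src_segment == dst_segment:
        return True  # intra-segment traffic is governed by the segment itself
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

print(is_flow_allowed("web-tier", "db-tier", 1433))  # True: permitted SQL path
print(is_flow_allowed("db-tier", "web-tier", 1433))  # False: reverse initiation denied
```

Note that even the permitted SQL path remains a potential malware route, which is why traffic matching allowed rules still needs threat-prevention inspection.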

Micro-segmentation with advanced threat prevention is emerging as the new way to improve data center security. It provides the ability to insert threat prevention security – Firewall, Intrusion Prevention System (IPS), AntiVirus, Anti-Bot, Sandboxing technology and more – to inspect traffic moving into and out of any micro-segment and prevent the lateral spread of threats. However, this presents security challenges due to the dynamic nature of virtual networks, namely the ability to rapidly adapt the infrastructure to accommodate bursts and lulls in traffic patterns or the rapid provisioning of new applications.

In order to address data center security agility so it can cope with rapid changes, security in a software-defined data center needs to learn about the role, scale, and location of each application. This allows the correct security policies to be enforced, eliminating the need for manual processes. What’s more, dynamic changes to the infrastructure are automatically recognized and absorbed into security policies, keeping security tuned to the actual environment in real-time.

Furthermore, by sharing context between security and the software-defined infrastructure, the network becomes better able to adapt to and mitigate risks. As an example, if an infected VM is identified by an advanced threat prevention security solution protecting a micro-segment, the VM can automatically be re-classified as infected. Re-classifying the VM can then trigger a predefined remediation workflow to quarantine and clean it.

Once the threat has been eliminated, the infrastructure can then re-classify the VM back to its “cleaned” status and remove the quarantine, allowing the VM to return to service. Firewall rules can be automatically adjusted and the entire event logged – including what remediation steps were taken and when the issue was resolved – without having to invoke manual intervention or losing visibility and control.
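A minimal sketch of that remediation loop, with hypothetical tag names and a stubbed cleaning step, might look like this:

```python
import datetime

def remediate(vm, log):
    """Quarantine, clean, and restore an infected VM, logging each step."""
    def record(event):
        log.append((datetime.datetime.now(datetime.timezone.utc).isoformat(),
                    vm["name"], event))

    vm["tag"] = "infected"      # threat-prevention engine re-classifies the VM
    record("tagged infected")
    vm["quarantined"] = True    # policy bound to the 'infected' tag isolates it
    record("quarantined")
    record("malware removed")   # cleaning step, stubbed out in this sketch
    vm["tag"] = "cleaned"       # infrastructure re-classifies the cleaned VM
    vm["quarantined"] = False   # firewall rules adjust; VM returns to service
    record("returned to service")
    return vm

audit_log = []
vm = remediate({"name": "web-01", "tag": "clean", "quarantined": False}, audit_log)
```

The point of the sketch is the event trail: every state change is recorded with a timestamp, so the whole incident is auditable without manual intervention.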

Strong perimeter security is still an important element to an effective defense-in-depth strategy, but perimeter security alone offers minimal protections for virtualized assets within the data center. It is difficult to protect data and assets that aren’t known or seen. With micro-segmentation, advanced security and threat prevention services can be deployed wherever they are needed in the virtualized data center environment.

By Yoav Shay Daniely

InformationWeek Reveals Top 125 Vendors Taking the Technology Industry by Storm


InformationWeek Reveals Top 125 Vendors

Five-part series details companies to watch across five essential technology sectors

SAN FRANCISCO, Sept. 27, 2016 /PRNewswire/ — InformationWeek released its list of “125 Vendors to Watch” in 2017. Selected by InformationWeek’s expert editorial team, the companies listed fall into one of five key themes: infrastructure, security, cloud, data management and DevOps.

“The rapid pace of technological change puts more pressure on IT organizations than ever before, but also offers unprecedented opportunities for companies to rethink how they do business,” said Susan Fogarty, Director of Content, InformationWeek & Interop ITX. “We are pleased to recognize technology suppliers that are helping our readers to navigate the possibilities.”

The technology industry is in a state of constant transition and evolution. In turn, new benchmarks are developing as a fresh class of innovative tools, disruptive technology and methodology, and professionals break into the space. To meet these expectations, technology vendors are hard at work to ensure they are adequately adapting to provide the enterprise with the most innovative and effective systems and products.

Across the wide spectrum of sectors that the tech industry touches, there has been a surge of innovation within a few key areas: infrastructure, security, cloud, data management and DevOps. To help professionals navigate where they should be looking for the latest and greatest technologies within these growing sectors, InformationWeek has compiled a list of top 25 companies per theme. The InformationWeek editorial team has detailed their selections for 2017 in a five-part blog series.

InformationWeek’s Top 125 Technology Vendors to Watch

Infrastructure: Businesses are rethinking their IT infrastructures and vendors are looking to software and open source solutions to help them reform. Here are the top companies providing those solutions.

  • A10 Networks
  • Barefoot Networks
  • Big Switch Networks
  • Cambium Networks
  • Cisco
  • CloudGenix
  • CoreOS
  • Cumulus Networks
  • Docker
  • ExtraHop
  • Mist
  • Nimble Storage
  • Nutanix
  • Pluribus Networks
  • Pure Storage
  • SimpliVity
  • NetApp SolidFire
  • Rubrik
  • StorageOS
  • SwiftStack
  • Teridion
  • Veeam Software
  • VeloCloud
  • Viptela
  • VMware

Security: A wave of companies is entering the security field, armed with technologies to help businesses mitigate the next generation of cyberattacks. Here are some of the most innovative security companies to watch.

  • Accenture/FusionX
  • Bay Dynamics
  • CloudFlare
  • CrowdStrike
  • Cymmetria
  • Deep Instinct
  • FireEye/Mandiant
  • IBM and Watson
  • Intel/McAfee
  • IOActive
  • Kaspersky
  • Lookout
  • Nok Nok Labs
  • Okta
  • Onapsis
  • Optiv
  • Palo Alto Networks
  • Rapid7
  • RSA
  • Splunk
  • Symantec/Blue Coat
  • Vectra Networks
  • Veracode
  • White Ops
  • Zscaler

Cloud: The need to handle big data and real-time events, alongside the ability to respond to business demands has dictated a shift towards cloud computing by companies seeking to remain competitive in the information age. Here are some of the companies staying ahead of the curve.

  • Alibaba Cloud
  • Amazon Web Services
  • Apptio
  • Bluelock
  • CloudHealth Technologies
  • CenturyLink
  • CGI IaaS
  • CSC Agility Platform
  • DigitalOcean
  • Dimension Data
  • EMC Virtustream
  • Fujitsu K5
  • GoGrid
  • Google Cloud Platform
  • IBM Cloud
  • Joyent Cloud
  • Kaavo
  • Microsoft Azure
  • New Relic
  • Oracle Cloud
  • Pantheon
  • Rackspace
  • SAP Hana Cloud Platform
  • Verizon Cloud
  • VMware

Data Management: Data is critical to get a clear picture of customers, products, and more, but in order to do so, that data must be managed across multiple systems — systems that aren’t necessarily compatible. Here are the companies that can help enterprise organizations wrangle their multiple data sources.

  • Alation
  • Ataccama
  • AtScale
  • Cloudera
  • Collibra
  • Confluent
  • Databricks
  • Dell Boomi
  • Hortonworks
  • Informatica
  • Information Builders
  • Looker
  • MapR
  • MarkLogic
  • MongoDB
  • Orchestra Networks
  • Profisee
  • Reltio
  • SAP
  • SAS
  • SoftwareAG
  • Talend
  • Teradata
  • TIBCO Software
  • Verato

DevOps: Organizations looking to transform into a DevOps organization need a solid plan, complete with executive buy-in, and the right tools to get all the jobs done. Here are InformationWeek’s picks for companies offering products organizations should know about when making the move to DevOps.

  • Atlassian
  • Canonical (Ubuntu Juju)
  • Chef
  • CFEngine
  • Electric Cloud
  • Google (Cloud Platform)
  • HashiCorp
  • Inedo
  • Jenkins
  • Kony
  • Loggly
  • Microsoft (Visual Studio)
  • Nagios
  • New Relic
  • Octopus Deploy
  • Path Solutions
  • Puppet
  • RabbitMQ
  • Red Hat (Ansible)
  • SaltStack
  • Splunk
  • Tripwire
  • UpGuard
  • UrbanCode (IBM)
  • Xamarin

Interop ITX 2017

The same core industry themes highlighted by InformationWeek will be incorporated into Interop’s revamped Conference, Interop ITX. Change in the technology industry outpaces most others, and the next phase of tech education is Interop ITX – a Conference that anticipates the X factor: anyone or anything that can impact your business, your customers, or your market. The Conference incorporates an educational program built by Interop’s trusted community of technology professionals. To learn more about Interop ITX and to register, please visit:


For more than 30 years, InformationWeek has provided millions of IT executives worldwide with the insight and perspective they need to leverage the business value of technology. InformationWeek provides CIOs and IT executives with commentary, analysis and research through its thriving online community, digital issues, webcasts, virtual events, proprietary research and live, in-person events. InformationWeek’s award-winning editorial coverage can be found online. InformationWeek is organized by UBM Americas, a part of UBM plc (UBM.L), an Events First marketing and communications services business. For more information, visit

UBM Americas

UBM Americas, a part of UBM plc, delivers events and marketing services in the fashion, technology, licensing, advanced manufacturing, automotive and powersports, healthcare, veterinary and pharmaceutical industries, among others.  Through a range of aligned interactive environments, both physical and digital, UBM Americas increases business effectiveness for customers and audiences through meaningful experiences, knowledge and connections. The division also includes UBM Brazil’s market leading events in construction, cargo transportation, logistics & international trade, and agricultural production; and UBM Mexico’s construction, advanced manufacturing and hospitality services shows. For more information,

Part 1 – Connected Vehicles: Paving The Way For IoT On Wheels


Connected Vehicles

From cars to combines, the IoT market potential of connected vehicles is so expansive that it will even eclipse that of the mobile phone. Connected personal vehicles will be the final link in a fully connected IoT ecosystem. This is an incredibly important moment to capitalize on given how much time people spend in cars. When mobility services and autonomous cars begin to take hold, it will become even more critical as people will be behind a screen instead of behind the wheel.

Industrial equipment and mobility solutions may prove to be even larger and more lucrative than the consumer side. McKinsey & Company says that by 2025, IoT will have an economic impact up to $11 trillion a year, and that 70 percent of the revenue IoT creates will be generated from B2B businesses. In the connected vehicle world, industrial applications will likely follow this trend.


(Infographic source: Spireon)

Technologies like GPS, telematics services, on-board computers, specialized sensors, internet connectivity, and cloud-based data stream management are all fairly mature. The challenge businesses are facing now is how all this technology can be interconnected and used to monetize IoT services in connected vehicles.

To get started, here is a framework for understanding the connected vehicle space—from consumer to industrial offerings. I’ve outlined challenges and opportunities associated with the various business models and lessons learned from other markets to offer guidelines, best practices, and guardrails to maximize the chances for commercial success.

What are the offerings?

The number of individual connected car services that are available or coming to the marketplace is large, but we can place them in four basic categories:

  • Transportation as a Service – Any alternative to traditional sale or lease of a vehicle that requires some amount of connectivity in order to work. Examples: peer-to-peer car sharing, multi-entity (group) leasing, and fleet subscriptions.
  • Post-Sale/Lease Secondary Services – Services offered to vehicle owners/lessees after initial vehicle acquisition. Examples: entertainment delivery, driver experience personalization, roadside assistance, mapping and geo-fencing, human-assisted services like on-demand concierge parking, and intelligent preventive maintenance subscriptions.
  • Road Use Measurement Services – Services directly based upon telematics-sourced data streams. Examples: usage-based insurance, road-use-based taxation, and commercial fleet tracking and management.
  • Secondary Data Stream Monetization – The analysis of individual driving habits and patterns. Examples: personalized discounted insurance promotions, or data sold to third parties.
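To make the road-use-measurement idea concrete, here is an illustrative sketch of a usage-based insurance calculation over a telematics stream; the rates, thresholds, and record fields are invented for the example:

```python
# Hypothetical pricing parameters for the sketch.
BASE_MONTHLY = 40.00           # assumed flat base premium, USD
RATE_PER_MILE = 0.05           # assumed per-mile charge
HARSH_BRAKE_SURCHARGE = 0.50   # assumed surcharge per harsh-braking event

def monthly_premium(trips):
    """Price a month of driving from per-trip telematics records."""
    miles = sum(t["miles"] for t in trips)
    harsh = sum(t["harsh_brakes"] for t in trips)
    return round(BASE_MONTHLY + miles * RATE_PER_MILE
                 + harsh * HARSH_BRAKE_SURCHARGE, 2)

trips = [
    {"miles": 120.0, "harsh_brakes": 2},
    {"miles": 300.0, "harsh_brakes": 0},
]
print(monthly_premium(trips))  # 40 + 420*0.05 + 2*0.5 = 62.0
```

The same per-trip stream could feed road-use taxation or fleet management; only the pricing function changes.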

What should you do next?

Although these use cases appear to be vastly different from one another, there are some common themes and guidance which can be gleaned from them.

Here are next steps for any connected car industry player:

  • Get your Data House in Order – The connected car depends on data streams produced by sensors and devices, but knowing everything about a person’s driving can be dangerous in malicious hands, so stringently secured systems and policies must be put into place. Some data is vehicle-specific and other data is individual-specific, and different data will need to go to different places, so companies need to deploy sophisticated data management systems that can handle an ever-shifting ‘many-to-many-to-many’ landscape of identity management. Savvy organizations will put systems in place at the outset in order to ‘future-proof’ themselves and lay the groundwork for market agility and consumer safety.
  • Embrace Your Data – If you have access to data, even when it has no apparent direct influence on how you choose to charge for your offering, retain it anyway. Insights are always available via analysis of the consumption patterns of users, but only if you keep the data in an organized, centralized, accessible place. Knowing how users consume a service allows you to stay nimble in a market that is guaranteed to constantly shift and change.
  • The Money Isn’t in the ‘Thing’ – In the IoT world in general, many businesses realize too late that the true monetization opportunity lies NOT with the physical ‘thing’ itself, but rather with the virtual perpetual service that it unlocks. Companies like Jawbone, GoPro, and FitBit all learned this the hard way. Recurring, perpetual services provide an ongoing linkage to a customer that cannot be achieved via any one-and-done sales model, and build an annuity for the enterprise that keeps on giving and doesn’t require constant reinvestment in costly customer acquisition.
  • Learn to ‘Count the Beans’ Differently – Connected vehicle services, like any IoT-based services, lend themselves readily to recurring revenue business models, which are measured by fundamentally different KPIs than the one-time-sale model. Profit margins in recurring revenue models are often not fully realized at the point of sale, but accrue as a ‘long tail’ of annuity-style profit once break-even has been reached. The lessons learned by OEMs years ago when many also became direct lenders are the best analog for recurring revenue success available to them. OEMs should consider housing their connected car strategies within their lending arms for this reason.
  • Take Engineers Out of the Equation Where You Can – Agility and speed to market are of paramount importance in the connected car world. Organizations that show a willingness and ability to get to market quickly, even if imperfectly, and to aggressively pursue multiple simultaneous monetization strategies knowing that not all will succeed—these will be the organizations that ‘win’ in the connected car space.
  • Prepare Back-Office Systems – Unfortunately, especially for mature enterprises like OEMs that have developed dependencies on legacy back-office systems, existing infrastructure was not built to rapidly support new offering models like recurring revenue. More modern (usually cloud-based) back-office systems offer not only dramatically shorter implementation timelines than legacy or home-grown systems, but also put the power of innovation and change in the hands of business users rather than engineers.

The Clearest True North—Connected Vehicles

The obsession with the cult of cars is understandable, but the truth is that there are many segments of “IoT on Wheels” that are moving quickly towards full-on connectedness.

Like many subsets of the IoT (e.g. wearables, connected home) consumer applications for connected personal vehicles get all the hype, but the business applications are really demonstrating the most traction.

Whether B2B or B2C, there is no magic formula for exactly how to make money in the connected vehicle space, however, we are sure to see fascinating changes occurring in the transportation landscape in the coming years. There is hardly anything as captivating, and potentially profitable, as the emergence of IoT on wheels.

Stay tuned for Part 2 of the series next month…

By Tom Dibble

Embedded Sensors and the Wearable Personal Cloud


The Wearable Personal Cloud

Wearable tech is one avenue of technology that’s encouraging cloud connections and getting us all onto interconnected networks, and with the continued miniaturization and advancement of computing, the types of wearable tech keep expanding and providing us with new opportunities. A few years ago, smartwatches were rather clunky devices with their computing power quite obviously on display, but today the sleek devices that adorn our wrists offer as much style as tech capability. How long until the stylish eyewear we sport offers more than protection from UV rays, and the clothes we don provide insights into our physical condition?

Wearable Tech & The Cloud

Much of wearable tech’s advantage is in the data it’s able to collect, store, and ultimately send out for analysis. The cloud plays an integral role in wearable tech, not least in the management of that data. Moreover, with advances in connection methods, battery life, and cloud infrastructures, the insights we’re able to take from all of this collected data are enhanced, just as the time to realization is shortened. In fact, much of the intelligence wearable devices feed back can now be delivered in real time, thereby strengthening the advantages. Without the cloud, wearables might be relegated to the awkward corner, requiring far more user interaction and administration than most are willing to give; but as the cloud makes wearable communication a smooth, sleek, and autonomous procedure, so too does it provide the added benefit of connection to social media networks for even more personal and insightful gains.

Wearable Tech & Mobile Computing


According to researchers at the University of Alabama at Birmingham (UAB) and an accompanying infographic, wearable tech could be heading in the direction of a ‘wearable personal cloud.’ With the latest embedded sensors advancing smart clothing, nodes would be able to communicate effectively with smartphones, smartwatches, and tablets. UAB researchers suggest that small computers – perhaps ten cheap and petite Raspberry Pis – embedded within a smart jacket would mean mobile devices could do away with complex and powerful processing as, instead, they become “dumb terminal devices” connected to the smart jacket mainframe. Says Ragib Hasan, Ph.D., assistant professor of computer and information sciences in the UAB College of Arts and Sciences, “Once you have turned everything else into a ‘dumb device,’ the wearable cloud becomes the smart one. The application paradigm becomes much more simple and brings everything together. Instead of individual solutions, now you have everything as a composite solution.”

The wearable personal cloud proposed by Hasan and his colleague, Rasib Khan, is a step ahead of smart clothing in that the system model can be extended to items outside of the clothing set. It’s proposed that these devices could be linked together into a shared cloud which would provide invaluable information in emergency and disaster situations. Suggests Hasan, “With seven to ten people wearing such a cloud together, they create what we call a hypercloud, a much more powerful engine. The jacket can also act as a micro or picocell tower. All of its capabilities can be shared on a private network with other devices via WiFi or Bluetooth. If a first responder is out in the field and doesn’t have complete information to act on a mission, but someone else does, it can be shared and updated through the cloud in real time.” Additional benefits of this wearable personal cloud come into play with monitoring and maintaining patient health status in hospitals, and furthermore, personal data could be retained within the wearable jacket, thus providing better data security and privacy.
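A toy sketch of the hub idea – hypothetical node names, no real networking – in which sensor nodes report readings to the jacket and paired “dumb” devices fetch only the aggregated summary:

```python
from statistics import mean

class WearableHub:
    """Aggregates readings from body-worn sensor nodes (hypothetical IDs)."""
    def __init__(self):
        self.readings = {}  # node_id -> list of raw readings

    def report(self, node_id, value):
        """Called by a sensor node each time it takes a reading."""
        self.readings.setdefault(node_id, []).append(value)

    def summary(self):
        """Composite view that a paired phone or watch would display."""
        return {node: round(float(mean(vals)), 1)
                for node, vals in self.readings.items()}

hub = WearableHub()
hub.report("heart-rate", 72)
hub.report("heart-rate", 76)
hub.report("skin-temp", 33.1)
print(hub.summary())  # {'heart-rate': 74.0, 'skin-temp': 33.1}
```

In the proposed architecture the raw readings never leave the jacket; only the composite summary is shared, which is the privacy benefit Hasan describes.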


Today, the idea of a wearable personal cloud is drawing attention, but with such rapid progress it’s hard to imagine what the next few years will bring. Some experts believe wearables will in fact morph into ‘implantables’ in the not too distant future, and it’s possible that much of the work put into today’s wearable tech will be supplanted with the future’s implantable tech. For now, most of us are more comfortable being able to take off our smart devices as we choose, and innovators still have a way to go before the general public agrees to build technology into themselves.

By Jennifer Klostermann

SWIFT Says Bank Hacks Set To Increase


Bank Hacks Set To Increase

SWIFT, whose messaging network is used by banks to send payment instructions worth trillions of dollars each day, said three clients were hacked over the summer and cyber attacks on banks are set to increase.

The theft of $81 million in February from Bangladesh’s central bank using SWIFT messages rocked faith in the system whose messages had, until then, been accepted at face value.

SWIFT Chief Executive Gottfried Leibbrandt told the Sibos conference in Geneva on Monday that hackers breached the systems of two banks over the summer and a third bank repelled an attack before fraudulent SWIFT messages could be sent.

In the two cases where hackers sent payment instructions over SWIFT, the orders were not fulfilled. In the first, the receiving bank noticed that the instruction did not conform with normal transaction patterns and queried it.

In the second case, the payment was held up because the receiving bank had concerns about the ultimate beneficiary of the transfer and flagged the transaction to the paying bank, which then realized it had been hacked.

In the third case, the bank had installed a software patch from SWIFT which allowed the lender’s system to spot the infiltration.
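The pattern check the first receiving bank performed can be sketched as a simple statistical screen; the threshold, history, and amounts below are assumptions for illustration, not SWIFT's actual controls:

```python
from statistics import mean, pstdev

def is_suspicious(history, amount, threshold=3.0):
    """Flag amounts more than `threshold` standard deviations above the
    sender's historical mean (a deliberately simple anomaly screen)."""
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return amount != mu
    return (amount - mu) / sigma > threshold

history = [10_000, 12_000, 9_500, 11_200, 10_800]  # typical transfer amounts
print(is_suspicious(history, 11_000))      # False: in line with past activity
print(is_suspicious(history, 5_000_000))   # True: hold and query the sender
```

Real fraud screening weighs many more signals (beneficiary, geography, timing), but the principle is the same: an instruction that breaks the sender's pattern is held for a human query rather than executed at face value.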

“In all of those cases no money was lost,” Leibbrandt said.

Read Full Article: Reuters

The Future of Cybersecurity and Authentication Methods


The Future of Cybersecurity

Cybersecurity has been on the minds of companies everywhere since the Dropbox and Yahoo hacks occurred. With the advent of cloud connected technology and the growing sophistication of malware and hacking attempts, it seems many common cybersecurity methods have become outdated.

So what could be in the future of companies seeking to improve their cybersecurity methods? We asked Stephen Gates, the Chief Intelligence Analyst with NSFOCUS, what his thoughts were on the situation.

Gates immediately told us that all companies need to start employing multi-factor authentication as a mandatory part of their information systems. He referenced a quote from the Cybersecurity National Action Plan: “The President is calling on Americans to move beyond just the password to leverage multiple factors of authentication when logging-in to online accounts. Private companies, non-profits, and the Federal Government are working together to help more Americans stay safe online through a new public awareness campaign that focuses on broad adoption of multi-factor authentication.” However, Gates also noted that there currently is no regulation being enforced in how companies handle their cybersecurity, and that public awareness efforts can only go so far in solving the issue.

The biggest threat to IT executives today is the prevalence of employees using work machines for personal business. Employees who have their personal accounts hacked increase the likelihood of successful phishing, malware, and ransomware attacks hitting company networks. Because of this, parts of the U.S. government banned employee access to certain online email services earlier this year. Perhaps other organizations would be well advised to follow suit in the wake of the recent security breaches.

While organizations that store millions of user account credentials for online services are getting better at protecting their data, many still fall short. Two-factor authentication should be implemented everywhere, on user accounts as well as administrator accounts; unless it is widely adopted, it will not solve the problem at large.
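For reference, the second factor most authenticator apps generate is a time-based one-time password (TOTP) as standardized in RFC 6238. A minimal sketch of generation and constant-time verification follows; production systems should use a vetted library rather than hand-rolled code:

```python
import hashlib, hmac, struct, time

def totp(secret, timestep=30, digits=6, now=None):
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    counter = int((time.time() if now is None else now) // timestep)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret, submitted, now=None):
    """Constant-time comparison of the submitted code against the server's."""
    return hmac.compare_digest(totp(secret, now=now), submitted)

# RFC 6238 test vector: at T=59s the 8-digit SHA-1 code is 94287082.
print(totp(b"12345678901234567890", now=59, digits=8))  # 94287082
```

Because both sides derive the code from a shared secret plus the current 30-second window, a stolen password alone is not enough to log in, which is the property the quoted action plan is pushing for.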

One of Gates’ suggestions for increasing cybersecurity effectiveness is preventing employees from using company machines and networks for personal business. While he agreed that this can be seen as a potential problem with ever-younger workforces – policing their usage leading to animosity, distrust, and attrition – he proposed a solution that allows companies to protect their machines and networks from personal usage without upsetting the younger generations: “create an environment whereby personal interaction with the Internet can be done at work, without using corporate devices and the corporate network.” He explained that setting up a separate network with different machines specifically labeled for personal usage by employees could help keep both companies and their employees happy and safe.

Alternative Authentication


I had heard a rumor about security experts discussing a possible future in using alternative forms of authentication instead of a password. I asked Gates about what this could mean. “Fingerprints, retinas, facial features, and even DNA are all very unique to each individual,” Gates explained to me. “In addition, researchers have recently discovered that each human’s hair proteins are also very unique. These are the types of things that must be used to authenticate someone; not passwords, tokens, and two-factor codes.

“There are some new developments in attempting to implement a better method of authentication across the board. For example, many laptops today come with fingerprint scanners. Smartphones are now using applications that can identify facial features for authentication using the cameras they come with. Physical security may include retinal scans and even hand scanners. These are things people can’t lose, can’t forget, and most likely can’t be stolen. Personally, I think facial feature authentication is a great step in the right direction. It’s not too overly intrusive and most people would not be afraid of it – like a retinal scan.”

While facial recognition is still in its infancy, the industry will soon become more proficient at identifying biomarkers on someone’s face that are difficult to spoof. The cameras on smartphones nowadays are as good as, if not better than, many of the stand-alone cameras on the market, so the next step would be to install higher quality cameras on computers that can adjust themselves automatically for different environmental conditions like lighting, makeup, hair, and aging. While other forms of authentication like retinal scanning may seem intrusive, no one seems to mind taking pictures of themselves. However, for now current two-factor authentication methods would still need to be implemented as a form of backup in case facial recognition fails.

So, it would seem that biometrics are possibly in the not-too-distant future as the new standard of authentication, with facial recognition being the most likely method to be implemented due to society already being pre-conditioned for taking selfies. In the meantime, companies need to make sure that their employees are using two-factor authentication on their company owned user accounts. Companies would also benefit from separating machines and networks intended for business related usage from machines and networks used for personal business purposes to help isolate attacks.

By Jonquil McDaniel

Security: Avoiding A Hatton Garden-Style Data Center Heist

Data Center Protection

In April 2015, one of the world’s biggest jewelry heists occurred at the Hatton Garden Safe Deposit Company in London. Posing as workmen, the criminals entered the building through a lift shaft and cut through a 50cm-thick concrete wall with an industrial power drill. Once inside, the criminals had free and unlimited access to the company’s secure vault for over 48 hours during the Easter weekend, breaking into one safety deposit box after another to steal an estimated $100m worth of jewelry.

So why weren’t the criminals caught? How did they gain free rein over all of the safety deposit boxes? It turns out that the security systems only monitored the perimeter, not the inside of the vault. Although the burglars initially triggered an alarm, to which the police responded, no physical signs of a break-in were found outside the company’s vault, so the perpetrators were able to continue their robbery uninterrupted. In other words, the theft was made possible simply by breaching the vault’s perimeter – once the gang was inside, they could move around undetected and undisturbed.


(Image Source: Wikipedia)

Most businesses do not store gold, diamonds or jewelry. Instead, their most precious asset is data, and it is kept not in reinforced vaults but in data centers. Yet in many cases, both vaults and data centers are secured against breaches in similar ways: organizations focus on reinforcing the perimeter and far less on internal security.

If attackers are able to breach the external protection, they can often move inside the data center from one application to the next, stealing data and disrupting business processes for some time before they are detected – just like the criminal gang inside the Hatton Garden vault were able to move freely and undetected. In some recent data center breaches, the hackers had access to applications and data for months, due to lack of visibility and internal security measures.

Security Challenges in Virtualized Environments

This situation is made worse as enterprises move from physical data center networks to virtualized networks in order to accelerate the configuration and deployment of applications, reduce hardware costs and cut management time. In this new data center environment, all of the infrastructure elements – networking, storage, compute and security – are virtualized and delivered as a service. This fundamental change means that the traditional approach of securing the network’s perimeter is no longer suitable for such a dynamic virtualized environment.

The main security challenges are:

Traffic behavior shifts – Historically, the majority of traffic was ‘north-south’ traffic, which crosses the data center perimeter and is managed by traditional perimeter security controls. Now, intra-data center ‘east-west’ traffic has drastically increased, as the number of applications has multiplied and those applications need to interconnect and share data in order to function. With the number of applications growing, hackers have a wider choice of targets: they can focus on a single low-priority application and then use it to start moving laterally inside the data center, undetected. Perimeter security is no longer enough.
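The north-south/east-west distinction comes down to whether both endpoints of a flow sit inside the data center. A minimal sketch, using hypothetical RFC 1918 prefixes to stand in for a data center’s internal address plan:

```python
import ipaddress

# Hypothetical internal prefixes for the example; a real deployment would
# load these from the data center's actual addressing plan.
INTERNAL_NETS = [ipaddress.ip_network(n)
                 for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]


def is_internal(ip):
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in INTERNAL_NETS)


def classify_flow(src_ip, dst_ip):
    """Label a flow 'east-west' when both endpoints are inside the data
    center, 'north-south' when it crosses the perimeter."""
    if is_internal(src_ip) and is_internal(dst_ip):
        return "east-west"
    return "north-south"
```

Perimeter controls only ever see the second category; everything the first category does is invisible to them, which is the gap micro-segmentation targets.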

Manual configuration and policy changes – In these newly dynamic data centers, traditional, manual processes for managing security are too slow, taking too much of the IT team’s time – which means security can be a bottleneck, slowing the delivery of new applications. Manual processes are also prone to human errors which can introduce vulnerabilities. Therefore, automating security management is essential to enable automated application provisioning and to fully support data center agility.
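One way to take manual steps (and their errors) out of policy changes is to declare the desired rule set and compute the delta mechanically. A toy sketch, with rules represented as hypothetical (source, destination, port) tuples rather than any real firewall’s syntax:

```python
def plan_changes(current_rules, desired_rules):
    """Given the rules currently enforced and the declared desired state,
    compute what to add and what to remove. Applying the plan is idempotent:
    running it a second time yields an empty plan."""
    current, desired = set(current_rules), set(desired_rules)
    return {
        "add": sorted(desired - current),
        "remove": sorted(current - desired),
    }


current = [("web", "app", 8443)]
desired = [("web", "app", 8443), ("app", "db", 5432)]
print(plan_changes(current, desired))
```

Because the delta is derived rather than typed in, the same declaration can be re-applied safely by an orchestration pipeline, which is what makes automated provisioning practical.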

Until recently, delivering advanced threat prevention and security technologies within the data center would involve managing a large number of separate VLANs and keeping complicated network diagrams and configuration constantly up-to-date using manual processes. In short, an unrealistically difficult and expensive management task for most organizations.

Micro-segmentation: armed guards inside the vault

But what if we could place the equivalent of a security guard on every safety deposit box in the vault so that even if an attacker breaches the perimeter, there is protection for every valuable asset inside? As data centers become increasingly software-defined with all functions managed virtually, this can be accomplished by using micro-segmentation in the software-defined data center (SDDC).

Micro-segmentation works by ‘coloring’ and grouping resources within the data center, then applying specific, dynamic security policies to the communication between those groups. Traffic within the data center is directed through virtual security gateways, where it is deeply inspected at the content level using advanced threat prevention techniques to stop attackers attempting to move laterally from one application to another via exploits and reconnaissance.
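The grouping idea can be reduced to a default-deny lookup keyed on group labels rather than addresses. The inventory and allowed pairs below are hypothetical examples, not any vendor’s policy format:

```python
# Hypothetical inventory: each workload is "colored" with a group label.
WORKLOAD_GROUPS = {
    "web-01": "web", "web-02": "web",
    "app-01": "app",
    "db-01": "db",
}

# Explicit allow-list of (source group -> destination group) pairs;
# any pair not listed is denied by default.
ALLOWED_PAIRS = {("web", "app"), ("app", "db")}


def allow_traffic(src_vm, dst_vm):
    """Decide, purely from group membership, whether the virtual gateway
    should accept a flow for deeper content inspection."""
    pair = (WORKLOAD_GROUPS.get(src_vm), WORKLOAD_GROUPS.get(dst_vm))
    return pair in ALLOWED_PAIRS
```

Note that a web server reaching the database directly is simply not an allowed pair, so a compromised front end cannot hop straight to the data tier even though both sit inside the same perimeter.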

Whenever a virtual machine or server is detected executing an attack using such techniques, it can be tagged as infected and automatically quarantined by the ‘security guard’ in the data center: the security gateway. This way, the breach of one system does not compromise the entire infrastructure.
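The quarantine step can be thought of as a tag that overrides normal policy at the gateway. A toy sketch of that override, not any product’s actual mechanism:

```python
quarantined = set()  # workloads tagged as infected


def quarantine(vm):
    """Tag a workload as infected; from then on the gateway isolates it."""
    quarantined.add(vm)


def gateway_permits(src_vm, dst_vm, policy_says_allow):
    """The quarantine tag overrides any otherwise-permitted flow, in both
    directions, so an infected machine can neither attack nor be reached."""
    if src_vm in quarantined or dst_vm in quarantined:
        return False
    return policy_says_allow
```

The key design point is precedence: the infection tag is checked before the ordinary allow/deny decision, so containment takes effect immediately without rewriting the policy itself.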

As an application is added and evolves over time, it is imperative that the security policy applies instantly and adapts automatically to these dynamic changes. Through integration with cloud management and orchestration tools, the security layer in the software-defined data center learns the role of each application, how it scales and where it is located. As a result, the right policy is enforced, enabling applications inside the data center to communicate with each other securely. For example, when servers are added or an IP address changes, the object is already provisioned and inherits the relevant security policies, removing the need for a manual process.
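Policy inheritance on provisioning can be illustrated with role tags supplied by orchestration. Here `on_provision` and the role table are hypothetical stand-ins for a real orchestration hook; the point is that policy keys on the role, not the machine or its IP:

```python
# role -> set of roles it is allowed to reach (hypothetical policy)
ROLE_POLICY = {"web": {"app"}, "app": {"db"}, "db": set()}

inventory = {}  # vm name -> role tag


def on_provision(vm_name, role):
    """Hook invoked by the (hypothetical) orchestration layer when a server
    is created. The new VM inherits its role's policy with no manual edit;
    an IP change later is irrelevant because nothing below keys on IPs."""
    inventory[vm_name] = role


def may_connect(src_vm, dst_vm):
    return inventory.get(dst_vm) in ROLE_POLICY.get(inventory.get(src_vm), set())
```

Scaling out a tier is then just another `on_provision` call: the tenth web server is governed the moment it exists, which is what removes security as a bottleneck for automated provisioning.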

Just as virtualization has driven the development of scalable, flexible, easily managed data centers, it is also driving the next generation of data center security. Using SDDC micro-segmentation delivered via an integrated, virtualized security platform, advanced security and threat prevention services can be dynamically deployed wherever they are needed in the software-defined data center. This places armed security guards inside the organization’s vault, protecting each safety deposit box and the valuable assets it holds – and helping to stop data centers falling victim to a Hatton Garden-style breach.

By Yoav Shay Daniely

Automated Application Discovery Introduced By Savision At Microsoft Ignite 2016

Automated Application Discovery

ATLANTA, GEORGIA – September 26, 2016 – Savision, a market leader in service-oriented monitoring solutions that unify IT operations with IT service management, today announced the release of its automated application discovery module for Unity iQ.

Savision’s automated application discovery module offers an agentless, trigger-based discovery method that captures infrastructure elements, application dependencies, as well as end-to-end service models in real time.

Since IT environments change frequently, manually managing changes to dependencies or configuration items (CIs) is tedious and error prone. Unity iQ’s real-time discovery and application dependency modeling not only detects changes to infrastructure elements, but it also captures the dependencies between and among applications that work in concert to deliver a complete business service.
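Dependency modeling of this kind can be illustrated, in very reduced form and not as Savision’s actual engine, as aggregating observed client-server connections into a map:

```python
from collections import defaultdict


def build_dependency_map(observed_connections):
    """Derive application dependencies from observed (client, server) pairs,
    the kind of connection data an agentless discovery engine collects.
    Re-running on fresh observations picks up changes automatically,
    which is what replaces manual CI maintenance."""
    deps = defaultdict(set)
    for client, server in observed_connections:
        deps[client].add(server)
    return {app: sorted(uses) for app, uses in deps.items()}
```

Because the map is rebuilt from live traffic rather than maintained by hand, a new dependency shows up as soon as the first connection is seen, instead of whenever someone remembers to update a diagram.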

The module works out-of-the-box across today’s cross-platform and hybrid environments, including bare metal, virtualized, and private- and public-cloud infrastructure. In addition to its real-time discovery engine, IT administrators can use dynamic rules to add human intelligence that guides automatic updates accurately in ever-transforming IT environments.

Unity iQ’s application discovery module accelerates the transformational journey of unifying IT operations and IT service management with the following features:

  • Rapid Element Discovery – Discover infrastructure elements or CIs by using its agentless, non-disruptive architecture that does not require a network port.
  • Application Topologies – Trigger-based discovery ensures that any changes to IT elements and application dependencies are discovered in real time.
  • Real-time Service Modeling – Building upon infrastructure element and application dependencies, services are automatically modeled and optimized using dynamic rules.
  • Change-aware CMDB – All CIs and services are matched and reconciled with a CMDB as changes occur.
  • Incident Impact Analysis – Real-time service modeling allows for any IT team to directly see the underlying cause of every incident.
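Incident impact analysis follows naturally from the same dependency data: invert the graph and walk upward from the failed CI to everything that transitively depends on it. A reduced sketch of that idea, not Unity iQ’s implementation:

```python
from collections import deque


def impacted_services(deps, failed_ci):
    """Given a map of service -> CIs it uses, return every service that
    directly or transitively depends on the failed CI."""
    # Invert the map: for each CI, which services depend on it?
    dependents = {}
    for svc, uses in deps.items():
        for ci in uses:
            dependents.setdefault(ci, set()).add(svc)

    # Breadth-first walk upward from the failure.
    seen, queue = set(), deque([failed_ci])
    while queue:
        ci = queue.popleft()
        for svc in dependents.get(ci, ()):
            if svc not in seen:
                seen.add(svc)
                queue.append(svc)
    return sorted(seen)


deps = {"web-portal": ["app-tier"], "app-tier": ["db-01"]}
print(impacted_services(deps, "db-01"))
```

The traversal is what turns a low-level alert ("db-01 is down") into a business-level statement ("the web portal is affected"), which is the essence of mapping incidents to services.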

“With application modernization and IT transformation efforts in full swing at many organizations, automated application and service discovery is more important than ever for maintaining service levels during such change,” said Rob Doucette, CTO, Savision. “Our discovery module is designed to handle the increasing complexity ushered in by today’s cross-platform and hybrid environments. By modeling the complete end-to-end service – end user, applications, network, storage, etc. – Unity iQ provides a single pane of glass that IT operations teams know they can always trust when identifying and resolving incidents.”

Unity iQ’s application discovery module is available immediately and it also integrates with Savision’s Live Maps product.

About Savision

Savision is the market leader in service-oriented IT monitoring solutions. The company’s solutions transform infrastructure and application monitoring data into dashboards that automate and unify IT operations with IT service management workflows. Savision allows companies to maximize the value of existing IT management tools in order to optimize IT service delivery, prevent problems, and reduce service downtime. Over 800 of the world’s most demanding companies, governments, and non-profits have deployed Savision’s solutions to visualize, rationalize, and optimize their IT service delivery workflows. Savision is privately held and headquartered in the Netherlands, with offices in the United States and Canada. For more information, visit
