Category Archives: Security

Using Private Cloud Architecture For Multi-Tier Applications


Cloud Architecture

These days, Multi-Tier Applications are the norm. From SharePoint’s front-end/back-end configuration, to LAMP-based websites using multiple servers to handle different functions, a multitude of apps require public and private-facing components to work in tandem. Placing these apps in entirely public-facing platforms and networks simplifies the process, but at the cost of security vulnerabilities. Locating everything across back-end networks causes headaches for the end-users who try to access the systems over VPN and other private links.

Many strategies have been implemented to address this issue across traditional datacenter infrastructures. Independent physical networks with a “DMZ” for public-facing components, along with complex router and firewall configurations, have all done the job, although they add multiple layers of complexity and require highly specialized knowledge and skill sets to implement.

Virtualization has made management much easier, but virtual administrators are still required to create and manage each aspect of the configuration – from start to finish. Using a private cloud configuration can make the process much simpler, and it helps segment control while still enabling application administrators to get their jobs done.

Multi-tenancy in the Private Cloud

Private cloud architecture allows for multi-tenancy, which in turn allows for separation of the networking, back-end and front-end tiers. Cloud administrators can define logical relationships between components and enable the app admins to manage their applications without worrying about how they will connect to each other.

One example is a web-based application using a MySQL back-end data platform. In a traditional datacenter, the app administrators would ask the network team either to isolate the back-end database or to isolate everything and allow only minimal web traffic to cross the threshold. This requires network administrators to spend hours working with the app team creating and testing firewall and other networking rules, ensuring the application gets the access it needs without opening any security holes that could be exploited.

Applying private cloud methodology changes the game dramatically.

Two individual virtual networks can be created by the cloud administrator. Within each network, traffic flows freely, removing the need to manually create networking links between components in the same virtual network entirely. In addition, a set of security groups can be established that will only allow specified traffic to route between the back-end data network and the front-end web server network – specifically ports and protocols used for the transfer of MySQL data and requests. Security groups utilize per-tenant access control list (ACL) rules, which allow each virtual network to independently define what traffic it will and will not accept and route.

Private cloud networking

Due to the nature of private cloud networking, it becomes much easier to not only ensure that approved data is flowing between the front and back end networks, but to ensure that traffic only flows if it originates from the application networks themselves. This allows for free-flow of required information but blocks anyone outside the network from trying to enter through those same ports.

In the front-end virtual network, all web traffic ports are opened so that users can reach the web servers. As with the back-end network, the front-end network can be configured to reject every other protocol and port, and routing from the outside world is allowed only to the front-end servers, nowhere else. This has the dual effect of letting the web servers do their jobs while keeping other administrators, or anyone else in the datacenter, from gaining access, minimizing faults due to human error or malicious intent.
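
As a concrete illustration, the sketch below models these two security groups as simple default-deny rule sets in Python. The network names, ports, and rule format are hypothetical and not tied to any particular private cloud product; they only show the logic described above.

```python
# Illustrative only: a minimal model of the two security groups described above.
# Network names and rule format are hypothetical.

WEB_NET = "frontend-web-net"
DATA_NET = "backend-data-net"

SECURITY_GROUPS = {
    # Front-end tier: accept web traffic from anywhere, nothing else.
    WEB_NET: [
        {"protocol": "tcp", "port": 80,  "source": "0.0.0.0/0"},
        {"protocol": "tcp", "port": 443, "source": "0.0.0.0/0"},
    ],
    # Back-end tier: accept MySQL traffic only if it originates
    # from the front-end virtual network.
    DATA_NET: [
        {"protocol": "tcp", "port": 3306, "source": WEB_NET},
    ],
}

def is_allowed(dest_net: str, protocol: str, port: int, source: str) -> bool:
    """Default-deny: traffic is routed only if an explicit rule matches."""
    for rule in SECURITY_GROUPS.get(dest_net, []):
        if (rule["protocol"] == protocol and rule["port"] == port
                and rule["source"] in (source, "0.0.0.0/0")):
            return True
    return False

# External web requests reach the front end...
assert is_allowed(WEB_NET, "tcp", 443, "0.0.0.0/0")
# ...but cannot "leapfrog" into the data network,
assert not is_allowed(DATA_NET, "tcp", 3306, "0.0.0.0/0")
# while MySQL traffic from the web tier is permitted.
assert is_allowed(DATA_NET, "tcp", 3306, WEB_NET)
```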

Once application and database servers are installed and configured by the application administrators, the solution is complete. MySQL data flows between the back-end and front-end networks, but no traffic from other sources reaches the data network. Web traffic from the outside world flows into and out of the front-end network, but it cannot “leapfrog” into the back-end network because external routes are not permitted to any other server in the configuration. As each tenant is handled separately and governed by individual security groups, app administrators from other groups cannot interfere with the web application, nor can they create security vulnerabilities by opening, across the board, ports that only their own apps need.

Streamlined Administration

Finally, the entire process becomes easier when each tenant has access to self-service, only relying on the cloud administrator for configuration of the tenancy as a whole and for the provisioning of the virtual networks. The servers, applications, security groups and other configurations can now be performed by the app administrator, and will not impact other projects, even when they reside on the same equipment. Troubleshooting can be accomplished via the cloud platform, which makes tracking down problems much easier. Of course, the cloud administrator could manage the entire platform, but they no longer have to.

Using a private cloud model allows for greater flexibility, better security, and easier management. While it is possible to accomplish this with a traditional physical and virtual configuration, adding the self-service and highly configurable tools of a private cloud is a great way to take control, and make your systems work the way you want, instead of the other way around.

By Ariel Maislos, CEO, Stratoscale

Ariel brings more than twenty years of technology innovation and entrepreneurship to Stratoscale. After a ten-year career with the IDF, where he was responsible for managing a section of the Technology R&D Department, Ariel founded Passave, now the world leader in FTTH technology. Passave was established in 2001, and acquired in 2006 by PMC-Sierra (PMCS), where Ariel served as VP of Strategy. In 2006 Ariel founded Pudding Media, an early pioneer in speech recognition technology, and Anobit, the leading provider of SSD technology acquired by Apple (AAPL) in 2012. At Apple, he served as a Senior Director in charge of Flash Storage, until he left the company to found Stratoscale. Ariel is a graduate of the prestigious IDF training program Talpiot, and holds a BSc from the Hebrew University of Jerusalem in Physics, Mathematics and Computer Science (Cum Laude) and an MBA from Tel Aviv University. He holds numerous patents in networking, signal processing, storage and flash memory technologies.

Battle of the Clouds: Multi-Instance vs. Multi-Tenant


Multi-Instance vs. Multi-Tenant

The cloud is part of everything we do. It’s always there backing up our data, pictures, and videos. To many, the cloud is considered a newer technology. However, cloud services actually got their start in the late ’90s, when large companies used them as a way to centralize computing, storage, and networking. Back then, the architecture was built on database systems originally designed for tracking customer service requests and running financial systems. For many years, companies like Oracle, IBM, EMC and Cisco thrived in this centralized ecosystem as they scaled their hardware to accommodate customer growth.

Unfortunately, what is good for large enterprises does not typically translate to a positive experience for customers. While the cloud provider has the advantage of building and maintaining a centralized system, the customers must share the same software and infrastructure. This is known as a multi-tenant architecture, a legacy design that nearly all clouds still operate on today.


Here are three major drawbacks of the multi-tenant model for customers:

  • Commingled data – In a multi-tenant environment, the customer relies on the cloud provider to logically isolate their data from everyone else’s. Essentially, customers’ and their competitors’ data could be commingled in a single database. While you cannot see another company’s data, the data is not physically separate and relies on software for separation and isolation. This has major implications for government, healthcare and financial regulations, not to mention the risk of a security breach that could expose your data along with everyone else’s.
  • Excessive maintenance and downtime – Multi-tenant architectures rely on large and complex databases that require hardware and software maintenance on a regular basis, resulting in availability issues for customers. While some departments, such as sales or marketing, can tolerate downtime in the off hours, applications used across the entire enterprise need to be operational nearly 100 percent of the time. Ideally, enterprise applications should not experience more than 26 seconds of downtime a month on average. They simply cannot suffer the excessive maintenance downtime of a multi-tenant architecture.
  • All are impacted – In a multi-tenant cloud, any action that affects the multi-tenant database, such as an outage, upgrade, or availability issue, affects everyone who shares that tenancy. When software or hardware issues are found on a multi-tenant database, they cause an outage for all customers; the same goes for upgrades. The main issue arises when this model is used to run enterprise-wide business services. Entire organizations cannot tolerate this shared approach for applications that are critical to their success. Instead, they require upgrades done on their own schedule for planning purposes, and they need software and hardware issues to be isolated and resolved quickly.

With its lack of true data isolation and its availability issues, multi-tenancy is a legacy cloud computing architecture that will not stand the test of time. To embrace and lead today’s technological innovations, companies need to look at an advanced cloud architecture called multi-instance. A multi-instance architecture provides each customer with their own unique database. Rather than using one large centralized database, instances are deployed on a per-customer basis, allowing the multi-instance cloud to scale horizontally and infinitely.

With this architecture and deployment model come many benefits, including data isolation, advanced high availability, and customer-driven upgrade schedules.

Here’s a closer look at each of these areas:

  • True data isolation – In a multi-instance architecture, each customer has its own unique database, ensuring its data is not shared with other customers. Because the architecture is not built on one large centralized database but on per-customer instances, hardware and software maintenance is easier to perform and issues can be resolved on a customer-by-customer basis.
  • Advanced high availability – Ensuring high availability of data and achieving true redundancy is no longer possible through legacy disaster recovery tactics; multiple sites, tested infrequently and used only in the direst of times, are simply not enough. In a multi-instance cloud, true redundancy is achieved by replicating the application logic and database for each customer instance between two paired, yet geographically separate, data centers. Each redundant data center is fully operational and active, resulting in near real-time replication of the customer instances and databases. Coupling a multi-instance cloud with automation technology, customer instances can be quickly moved between data centers, resulting in high availability of data.
  • Customer-driven upgrades – As described above, the multi-instance architecture allows cloud service providers to perform actions on individual customer instances, including upgrades. A multi-instance cloud allows each instance to be upgraded on a schedule that fits compliance requirements and the needs of individual customers, as the sketch below illustrates.
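
As a rough illustration of this model, here is a minimal Python sketch of a per-customer instance registry with paired data centers and customer-chosen upgrade windows. The class, customer names, versions, and data center labels are invented for the example.

```python
# Illustrative only: a toy registry showing one database/application instance
# per customer, replicated across a pair of data centers, with upgrades
# scheduled per customer. All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class CustomerInstance:
    customer: str
    version: str
    primary_dc: str
    secondary_dc: str    # active replica in a paired, geographically separate data center
    upgrade_window: str  # chosen by the customer, not dictated by the provider

    def upgrade(self, new_version: str) -> None:
        # Only this customer's instance is touched; other tenants are unaffected.
        print(f"[{self.upgrade_window}] upgrading {self.customer}: "
              f"{self.version} -> {new_version}")
        self.version = new_version

# One instance per customer rather than one shared database for everyone.
instances = {
    "acme":   CustomerInstance("acme",   "8.1", "dc-east", "dc-west", "Sat 02:00"),
    "globex": CustomerInstance("globex", "8.3", "dc-west", "dc-east", "Sun 23:00"),
}

# Upgrading one customer requires no downtime for any other customer.
instances["acme"].upgrade("8.4")
```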

When it comes down to it, the multi-instance architecture clearly has significant advantages over the antiquated multi-tenant clouds. With its data isolation and a fully replicated environment that provides high availability and scheduled upgrades, the multi-instance architecture puts customers in control of their cloud.

By Allan Leinwand

Cukes and the Cloud


The Cloud, through bringing vast processing power to bear inexpensively, is enabling artificial intelligence. But, don’t think Skynet and the Terminator. Think cucumbers!

Artificial Intelligence (A.I.) conjures up images of vast cool intellects bent on our destruction, or at best ignoring us the way we ignore ants. Reality is a lot different and much more prosaic: A.I. recommends products, movies, and shows you might like on Amazon or Netflix by learning from your past preferences. Now you can do it yourself, as one farmer in Japan did. He used it to sort his cucumber harvest.


Makoto Koike, inspired by seeing Google’s AlphaGo beat the world’s best Go player, decided to try using Google’s open source TensorFlow offering to address a much less exalted, but nonetheless difficult, challenge: sorting the cucumber harvest from his parents’ farm.

Now these are not just any cucumbers. They are thorny cucumbers where straightness, vivid color and a large number of prickles command premium prices. Each farmer has his own classification and Makoto’s father had spent a lifetime perfecting his crop and customer base for his finest offerings. The challenge was to sort them quickly during the harvest so the best and freshest could be sent to buyers as rapidly as possible.

This sorting was previously a “human only” task that required much experience and training – ruling out supplementing the harvest with part-time temporary labor. The result was Makoto’s poor mother would spend eight hours a day tediously sorting them by hand.

Makoto tied together a video inspection system and mechanical sorting machines with his DIY software based on Google TensorFlow, and it works! If you want a deep dive on the technology, check out the details here. Essentially, the machine is trained to recognize a set of images that represent the different classifications of quality. The catch is that using just a standard local computer required keeping the images at a relatively low resolution. The result is 75% accuracy in the actual sorting, and even achieving that required three days of training the computer on 7,000 images.
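
For readers curious what such a model looks like in code, here is a minimal, hypothetical sketch using the open-source TensorFlow/Keras API. The image size, number of quality grades, and directory layout are assumptions for illustration, not Makoto's actual pipeline.

```python
# Illustrative only: a small image classifier of the kind described above.
import tensorflow as tf

IMG_SIZE = (64, 64)   # low resolution, as a commodity PC can handle (assumed)
NUM_GRADES = 9        # hypothetical number of cucumber quality classes

# Assumes images are sorted into one sub-folder per quality grade.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cucumbers/train", image_size=IMG_SIZE, batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_GRADES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training like this on a single machine is what took days; the same code can
# be handed to a managed service such as Cloud ML to train on many machines.
model.fit(train_ds, epochs=10)
```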

Expanding to a server farm (no pun intended) large enough to raise that accuracy to 95% would be cost prohibitive and only needed during harvest. But Makoto is excited because Google offers Cloud Machine Learning (Cloud ML), a low-cost cloud platform for training and prediction that dedicates hundreds of cloud servers to training a network with TensorFlow. With Cloud ML, Google handles building a large-scale cluster for distributed training, and you just pay for what you use, making it easier for developers to try out deep learning without making a significant capital investment.

If you can do this with sorting cucumbers, imagine what might be possible as cloud power continues to increase inexpensively and the tools get easier to use. The personal assistant on your phone will really become your personal assistant, not the clunky beast it is today. In your professional life it will be your right-hand minion, taking over the tedious aspects of your job. Given what Makoto achieved, perhaps you should try your hand at it. Who knows what you might come up with?

By John Pientka

(Originally published Sept 22nd, 2016. You can periodically read John’s syndicated articles here on CloudTweaks. Contact us for more information on these programs)

Ransomware’s Great Lessons



The vision is chilling. It’s another busy day. An employee arrives and logs on to the network only to be confronted by a locked screen displaying a simple message: “Your files have been captured and encrypted. To release them, you must pay.”

Ransomware has grown recently to become one of the primary threats to companies, governments and institutions worldwide. The physical nightmare of inaccessible files pairs up with the more human nightmare of deciding whether to pay the extortionists or tough it out.

Security experts are used to seeing attacks of all types, and it comes as no surprise that ransomware attacks are becoming more frequent and more sophisticated.


(See full (ISC)2 Infographic)

Security Experts Take Note

Chris Sellards, a Certified Cloud Security Professional (CCSP) working in the southwestern U.S. as a senior security architect, points out that cyber threats change by the day, and that ransomware is becoming the biggest risk of 2016. Companies might start out with adequate provisions against infiltration, but as they grow, their defenses sometimes do not grow with them. He points to the example of a corporate merger or acquisition. As two companies become one, the focus may be on the day-to-day challenges of the transition. But in the background, the data that the new company now owns may be of significantly higher value than it was before. This can set the company up as a larger potential target, possibly even disproportionate to its new size.

What sets ransomware apart as a security threat is that its impact can be significantly reduced through adequate backup and storage protocols. As Michael Lyman, a Boston-area CCSP, states, when companies are diligent about disaster recovery, they can turn ransomware from a crisis into merely a nuisance. Organizations must pay attention to their disaster recovery plans; it’s a classic case of the ounce of prevention being worth the pound of cure. However, he points out that such diligence is not happening as often as it should.
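
As a trivial example of the kind of diligence Lyman describes, a script like the hypothetical sketch below can flag stale backups before an attacker forces the question; the path and the 24-hour threshold are invented.

```python
# Illustrative only: warn if the newest backup is too old to make a
# ransomware event a mere nuisance. Paths and thresholds are hypothetical.
import os
import time

BACKUP_DIR = "/var/backups/critical"   # hypothetical backup location
MAX_AGE_HOURS = 24

def newest_backup_age_hours(path: str) -> float:
    """Return the age, in hours, of the most recently modified file in path."""
    mtimes = [os.path.getmtime(os.path.join(path, name)) for name in os.listdir(path)]
    if not mtimes:
        return float("inf")
    return (time.time() - max(mtimes)) / 3600

age = newest_backup_age_hours(BACKUP_DIR)
if age > MAX_AGE_HOURS:
    print(f"WARNING: newest backup is {age:.1f}h old; recovery point is too wide")
else:
    print(f"OK: newest backup is {age:.1f}h old")
```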

As an independent consultant, Michael has been called into companies either to implement a plan or to help fix the problem once it has happened. He points out that with many young companies still in their first years of aggressive growth, the obligation to stop and make sure that all the strategic safeguards are in place is often pushed aside. “These companies,” he says, “tend to accept the risk and focus instead on performance.” He is usually called in only after the Board of Directors has asked management for a detailed risk assessment for the second time.

Neutralizing The Danger

Adequate disaster preparations and redundancy can neutralize the danger of having unique files held hostage. It is vital that companies practice a philosophy of “untrust,” meaning that everything on the inside must remain locked up. It is not enough to simply have a strong wall around the company and its data; it must be assumed that the bad people will find their way in somehow, which means all the data on the inside must be adequately and constantly encrypted.
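
The “lock everything on the inside” point can be made concrete with a few lines of code. The sketch below is a minimal example of encrypting a record at rest using the open-source cryptography library; the record content is invented, and a real deployment would pair this with proper key management.

```python
# Illustrative only: encrypting data at rest with the `cryptography` library
# (pip install cryptography), so that files copied out from inside the
# perimeter are useless without the key. Key management is deliberately omitted.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key management service
fernet = Fernet(key)

record = b"client-confidential: account 4711, balance 1,200,000"  # made-up data
stored = fernet.encrypt(record)    # what actually sits on disk or in the database

# An attacker who copies `stored` gets only ciphertext; the application,
# holding the key, can still recover the plaintext when it needs it.
assert fernet.decrypt(stored) == record
```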


It is essential to also bear in mind that ransomware damage does not exist solely inside the organization. There will also be costs and damage to the company-client relationship. At the worst is the specter of leaked confidential files – the data that clients entrusted to a company – and the recrimination and litigation that will follow. But even when a ransom event is resolved, meaning files are retained and no data is stolen, there is still the damage to a company’s reputation when the questions start to fly: “How could this have happened?” and “How do we know it won’t happen again?”

As cloud and IoT technologies continue to connect with each other, businesses and business leaders must understand that they own their risk. It is appropriate for security experts to focus on the fear factor, especially when conversing with members of the executive team, for whom the cost of adequate security often flies in the face of profitability. Eugene Grant, a CCSP based in Ontario, Canada, suggests that the best way to convey the significance of a proactive security plan is to use facts to back up your presentation: facts that reveal a quantitative risk assessment as opposed to a solely qualitative one. In other words, bring it down to cost versus benefit.
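
A minimal sketch of that cost-versus-benefit arithmetic, using the standard annualized loss expectancy formula (ALE = single loss expectancy × annual rate of occurrence) and entirely made-up figures, might look like this:

```python
# Illustrative only: quantitative risk assessment in the style Grant describes.
# All figures are invented for the example.

single_loss_expectancy = 250_000     # estimated cost of one ransomware incident ($)
annual_rate_of_occurrence = 0.4      # expected incidents per year, no new controls
ale_without_control = single_loss_expectancy * annual_rate_of_occurrence

control_cost_per_year = 40_000       # proposed backup/DR programme ($/year)
residual_aro = 0.05                  # expected incident rate with the control in place
ale_with_control = single_loss_expectancy * residual_aro

net_benefit = ale_without_control - ale_with_control - control_cost_per_year
print(f"ALE without control: ${ale_without_control:,.0f}")   # $100,000
print(f"ALE with control:    ${ale_with_control:,.0f}")      # $12,500
print(f"Net annual benefit:  ${net_benefit:,.0f}")           # $47,500
```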

No company is small enough to be invisible to, or immune from, the black hats. It is up to the security specialists to convey that message.

For more on the CCSP certification from (ISC)2, please visit their website. Sponsored by (ISC)2.

By Steve Prentice

Micro-segmentation – Protecting Advanced Threats Within The Perimeter



Changing with the times is frequently overlooked when it comes to data center security. The technology powering today’s networks has become increasingly dynamic, but most data center admins still employ archaic security measures to protect their network. These traditional security methods just don’t stand a chance against today’s sophisticated attacks.

That hasn’t stopped organizations from diving head-first into cloud-based technologies. More and more businesses are migrating workloads and application data to virtualized environments at an alarming pace. While the appetite for increased network agility drives massive changes to infrastructure, the tools and techniques used to protect the data center also need to adapt and evolve.

Recent efforts to upgrade these massive security systems are still falling short. Since data centers by design house huge amounts of sensitive data, there shouldn’t be any shortcuts when implementing security to protect all that data. The focus remains on providing protection only at the perimeter to keep threats outside. However, implementing strictly perimeter-centric security such as a Content Delivery Network (CDN) leaves the inside of the data center vulnerable, where the actual data resides.


(Infographic Source: Internap)

Cybercriminals understand this all too well. They are constantly utilizing advanced threats and techniques to breach external protections and move further inside the data center. Without strong internal security protections, hackers have visibility into all traffic and the ability to steal data or disrupt business processes before they are even detected.

Security Bottleneck

At the same time, businesses face additional challenges as traffic behavior and patterns shift. There are greater numbers of applications within the data center, and these applications are all integrated with each other. The increasing number of applications has caused east-west traffic – traffic moving laterally among applications and virtual machines – within the data center to grow drastically as well.

As more data stays within the data center and never crosses the north-south perimeter defenses, security controls are blind to this traffic – making lateral threat movement possible. With the rising number of applications, hackers have a broader choice of targets. Compounding this challenge is the fact that traditional processes for managing security are manually intensive and very slow. Applications are now being created and evolving far more quickly than static security controls can keep pace with.

To address these challenges, a new security approach is needed—one that requires effectively bringing security inside the data center to protect against advanced threats: Micro-segmentation.



Micro-segmentation works by grouping resources within the data center and applying specific security policies to the communication between those groups. The data center is essentially divided into smaller, protected sections (segments) with logical boundaries, which increases the ability to discover and contain intrusions. Despite the separation, however, application data still needs to cross micro-segments to communicate with other applications, hosts or virtual machines. Lateral movement therefore remains possible, which is why it is vital for threat prevention to inspect the traffic crossing micro-segments.
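
To make the idea concrete, the following toy Python model (segment names, ports, and the inspection stub are hypothetical) shows the three behaviors just described: free flow inside a segment, default-deny across segments, and inspection of the flows that are explicitly allowed to cross.

```python
# Illustrative only: micro-segments plus inspection of cross-segment traffic.
SEGMENTS = {
    "web":     {"web-vm-1", "web-vm-2"},
    "db":      {"sql-vm-1"},
    "storage": {"nas-vm-1"},
}

# Only these cross-segment flows are permitted at all.
ALLOWED_FLOWS = {("web", "db", 3306)}

def segment_of(vm: str) -> str:
    return next(seg for seg, members in SEGMENTS.items() if vm in members)

def inspect(payload: bytes) -> bool:
    """Stand-in for IPS/anti-bot/sandboxing inspection of the payload."""
    return b"malicious" not in payload

def forward(src_vm: str, dst_vm: str, port: int, payload: bytes) -> bool:
    src, dst = segment_of(src_vm), segment_of(dst_vm)
    if src == dst:
        return True              # intra-segment traffic flows freely
    if (src, dst, port) not in ALLOWED_FLOWS:
        return False             # lateral movement to other segments is blocked
    return inspect(payload)      # permitted cross-segment flows are still inspected

assert forward("web-vm-1", "sql-vm-1", 3306, b"SELECT 1")          # allowed and clean
assert not forward("web-vm-1", "nas-vm-1", 445, b"SELECT 1")        # blocked outright
assert not forward("web-vm-1", "sql-vm-1", 3306, b"malicious blob") # allowed path, caught by inspection
```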

For example, a web-based application may use the SQL protocol to interact with database servers and storage devices. The application web services are all logically grouped together in the same micro-segment, and rules are applied to prevent these application services from having direct contact with other services. However, SQL may be used across multiple applications, providing a handy exploit route for advanced malware that can be inserted into the web service in order to spread laterally throughout the data center.

Micro-segmentation with advanced threat prevention is emerging as the new way to improve data center security. It provides the ability to insert threat prevention security – firewall, Intrusion Prevention System (IPS), antivirus, anti-bot, sandboxing technology and more – to inspect traffic moving into and out of any micro-segment and prevent the lateral spread of threats. However, this presents security challenges due to the dynamic nature of virtual networks, namely the ability to rapidly adapt the infrastructure to accommodate bursts and lulls in traffic patterns or the rapid provisioning of new applications.

In order to address data center security agility so it can cope with rapid changes, security in a software-defined data center needs to learn about the role, scale, and location of each application. This allows the correct security policies to be enforced, eliminating the need for manual processes. What’s more, dynamic changes to the infrastructure are automatically recognized and absorbed into security policies, keeping security tuned to the actual environment in real-time.

What’s more, by sharing context between security and the software-defined infrastructure, the network then becomes better able to adapt to and mitigate any risks. As an example, if an infected VM is identified by an advanced threat prevention security solution protecting a micro-segment, the VM can automatically be re-classified as being infected. Re-classifying the VM can then trigger a predefined remediation workflow to quarantine and clean the infected VM.

Once the threat has been eliminated, the infrastructure can then re-classify the VM back to its “cleaned” status and remove the quarantine, allowing the VM to return to service. Firewall rules can be automatically adjusted and the entire event logged – including what remediation steps were taken and when the issue was resolved – without having to invoke manual intervention or losing visibility and control.
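
A highly simplified sketch of that workflow, with invented tag names and steps rather than any specific SDN or security product's API, might look like this:

```python
# Illustrative only: the quarantine/clean/restore workflow described above,
# expressed as a small sequence of state changes on a VM record.

def log(vm: dict, message: str) -> None:
    print(f"{vm['name']}: {message}")

def clean(vm: dict) -> None:
    vm["infection"] = None       # stand-in for the actual cleanup job

def remediate(vm: dict) -> None:
    # 1. Threat prevention flags the VM; re-classification is automatic.
    vm["tag"] = "infected"
    # 2. The infrastructure reacts to the tag: isolate the VM and log the event.
    vm["quarantined"] = True
    log(vm, "quarantined pending cleanup")
    # 3. A cleanup job runs against the isolated VM.
    clean(vm)
    # 4. Once clean, the tag change lifts the quarantine and restores service.
    vm["tag"] = "cleaned"
    vm["quarantined"] = False
    log(vm, "returned to service, firewall rules restored")

remediate({"name": "web-vm-2", "tag": "clean", "quarantined": False,
           "infection": "trojan.x"})
```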

Strong perimeter security is still an important element to an effective defense-in-depth strategy, but perimeter security alone offers minimal protections for virtualized assets within the data center. It is difficult to protect data and assets that aren’t known or seen. With micro-segmentation, advanced security and threat prevention services can be deployed wherever they are needed in the virtualized data center environment.

By Yoav Shay Daniely

InformationWeek Reveals Top 125 Vendors Taking the Technology Industry by Storm


InformationWeek Reveals Top 125 Vendors

Five-part series details companies to watch across five essential technology sectors

SAN FRANCISCO, Sept. 27, 2016 /PRNewswire/ — InformationWeek released its list of “125 Vendors to Watch” in 2017. Selected by InformationWeek’s expert editorial team, the companies listed fall into one of five key themes: infrastructure, security, cloud, data management and DevOps.

“The rapid pace of technological change puts more pressure on IT organizations than ever before, but also offers unprecedented opportunities for companies to rethink how they do business,” said Susan Fogarty, Director of Content, InformationWeek & Interop ITX. “We are pleased to recognize technology suppliers that are helping our readers to navigate the possibilities.”

The technology industry is in a state of constant transition and evolution. In turn, new benchmarks are emerging as a fresh class of innovative tools, disruptive technologies and methodologies, and professionals breaks into the space. To meet these expectations, technology vendors are hard at work to ensure they adapt quickly enough to provide the enterprise with the most innovative and effective systems and products.

Across the wide spectrum of sectors that the tech industry touches, there has been a surge of innovation within a few key areas: infrastructure, security, cloud, data management and DevOps. To help professionals navigate where they should be looking for the latest and greatest technologies within these growing sectors, InformationWeek has compiled a list of top 25 companies per theme. The InformationWeek editorial team has detailed their selections for 2017 in a five-part blog series.

InformationWeek’s Top 125 Technology Vendors to Watch

Infrastructure: Businesses are rethinking their IT infrastructures and vendors are looking to software and open source solutions to help them reform. Here are the top companies providing those solutions.

  • A10 Networks
  • Barefoot Networks
  • Big Switch Networks
  • Cambium Networks
  • Cisco
  • CloudGenix
  • CoreOS
  • Cumulus Networks
  • Docker
  • ExtraHop
  • Mist
  • Nimble Storage
  • Nutanix
  • Pluribus Networks
  • Pure Storage
  • SimpliVity
  • NetApp SolidFire
  • Rubrik
  • StorageOS
  • SwiftStack
  • Teridion
  • Veeam Software
  • VeloCloud
  • Viptela
  • VMware

Security: A wave of companies is entering the security field, armed with technologies to help businesses mitigate the next generation of cyberattacks. Here are some of the most innovative security companies to watch.

  • Accenture/FusionX
  • Bay Dynamics
  • CloudFlare
  • CrowdStrike
  • Cymmetria
  • Deep Instinct
  • FireEye/Mandiant
  • IBM and Watson
  • Intel/McAfee
  • IOActive
  • Kaspersky
  • Lookout
  • Nok Nok Labs
  • Okta
  • Onapsis
  • Optiv
  • Palo Alto Networks
  • Rapid7
  • RSA
  • Splunk
  • Symantec/Blue Coat
  • Vectra Networks
  • Veracode
  • White Ops
  • Zscaler

Cloud: The need to handle big data and real-time events, alongside the ability to respond to business demands has dictated a shift towards cloud computing by companies seeking to remain competitive in the information age. Here are some of the companies staying ahead of the curve.

  • Alibaba Cloud
  • Amazon Web Services
  • Apptio
  • Bluelock
  • CloudHealth Technologies
  • CenturyLink
  • CGI IaaS
  • CSC Agility Platform
  • DigitalOcean
  • Dimension Data
  • EMC Virtustream
  • Fujitsu K5
  • GoGrid
  • Google Cloud Platform
  • IBM Cloud
  • Joyent Cloud
  • Kaavo
  • Microsoft Azure
  • New Relic
  • Oracle Cloud
  • Pantheon
  • Rackspace
  • SAP Hana Cloud Platform
  • Verizon Cloud
  • VMware

Data Management: Data is critical to get a clear picture of customers, products, and more, but in order to do so, that data must be managed across multiple systems — systems that aren’t necessarily compatible. Here are the companies that can help enterprise organizations wrangle their multiple data sources.

  • Alation
  • Ataccama
  • AtScale
  • Cloudera
  • Collibra
  • Confluent
  • Databricks
  • Dell Boomi
  • Hortonworks
  • Informatica
  • Information Builders
  • Looker
  • MapR
  • MarkLogic
  • MongoDB
  • Orchestra Networks
  • Profisee
  • Reltio
  • SAP
  • SAS
  • SoftwareAG
  • Talend
  • Teradata
  • TIBCO Software
  • Verato

DevOps: Organizations looking to transform into a DevOps organization need a solid plan, complete with executive buy-in, and the right tools to get all the jobs done. Here are InformationWeek’s picks for companies offering products organizations should know about when making the move to DevOps.

  • Atlassian
  • Canonical (Ubuntu Juju)
  • Chef
  • CFEngine
  • Electric Cloud
  • Google (Cloud Platform)
  • HashiCorp
  • Inedo
  • Jenkins
  • Kony
  • Loggly
  • Microsoft (Visual Studio)
  • Nagios
  • New Relic
  • Octopus Deploy
  • Path Solutions
  • Puppet
  • RabbitMQ
  • Red Hat (Ansible)
  • SaltStack
  • Splunk
  • Tripwire
  • UpGuard
  • UrbanCode (IBM)
  • Xamarin

Interop ITX 2017

The same core industry themes highlighted by InformationWeek will be incorporated into Interop’s revamped conference, Interop ITX. In an industry where change outpaces most others, the next phase of tech education is Interop ITX – a conference that anticipates the X factor: anyone or anything that can impact your business, your customers, or your market. The conference incorporates an educational program built by Interop’s trusted community of technology professionals. To learn more about Interop ITX and to register, please visit the Interop website.


For more than 30 years, InformationWeek has provided millions of IT executives worldwide with the insight and perspective they need to leverage the business value of technology. InformationWeek provides CIOs and IT executives with commentary, analysis and research through its thriving online community, digital issues, webcasts, virtual events, proprietary research and live, in-person events. InformationWeek’s award-winning editorial coverage can be found on its website. InformationWeek is organized by UBM Americas, a part of UBM plc (UBM.L), an Events First marketing and communications services business.

UBM Americas

UBM Americas, a part of UBM plc, delivers events and marketing services in the fashion, technology, licensing, advanced manufacturing, automotive and powersports, healthcare, veterinary and pharmaceutical industries, among others. Through a range of aligned interactive environments, both physical and digital, UBM Americas increases business effectiveness for customers and audiences through meaningful experiences, knowledge and connections. The division also includes UBM Brazil’s market leading events in construction, cargo transportation, logistics & international trade, and agricultural production; and UBM Mexico’s construction, advanced manufacturing and hospitality services shows.

SWIFT Says Bank Hacks Set To Increase


Bank Hacks Set To Increase

SWIFT, whose messaging network is used by banks to send payment instructions worth trillions of dollars each day, said three clients were hacked over the summer and cyber attacks on banks are set to increase.

The theft of $81 million in February from Bangladesh’s central bank using SWIFT messages rocked faith in the system whose messages had, until then, been accepted at face value.

SWIFT Chief Executive Gottfried Leibbrandt told the Sibos conference in Geneva on Monday that hackers breached the systems of two banks over the summer and a third bank repelled an attack before fraudulent SWIFT messages could be sent.

In the two cases where hackers sent payment instructions over SWIFT, the orders were not fulfilled. In the first, the receiving bank noticed that the instruction did not conform with normal transaction patterns and queried it.

In the second case, the payment was held up because the receiving bank had concerns about the ultimate beneficiary of the transfer and flagged the transaction to the paying bank, which then realized it had been hacked.

In the third case, the bank had installed a software patch from SWIFT which allowed the lender’s system to spot the infiltration.

“In all of those cases no money was lost,” Leibbrandt said.

Read Full Article: Reuters

The Future of Cybersecurity and Authentication Methods


The Future of Cybersecurity

Cybersecurity has been on the minds of companies everywhere since the Dropbox and Yahoo hacks occurred. With the advent of cloud connected technology and the growing sophistication of malware and hacking attempts, it seems many common cybersecurity methods have become outdated.

So what could be in the future of companies seeking to improve their cybersecurity methods? We asked Stephen Gates, the Chief Intelligence Analyst with NSFOCUS, what his thoughts were on the situation.

Gates immediately told us that all companies need to start employing multi-factor authentication as a mandatory part of their information systems. He referenced a quote from the Cybersecurity National Action Plan: “The President is calling on Americans to move beyond just the password to leverage multiple factors of authentication when logging-in to online accounts. Private companies, non-profits, and the Federal Government are working together to help more Americans stay safe online through a new public awareness campaign that focuses on broad adoption of multi-factor authentication.” However, Gates also noted that there is currently no regulation governing how companies handle their cybersecurity, and that public awareness efforts can only go so far in solving the issue.

The biggest threat to IT executives today is the prevalence of employees using work machines for personal business. Employees whose personal accounts are hacked increase the likelihood of successful phishing, malware, and ransomware attacks hitting company networks. Because of this, parts of the U.S. government banned employee access to certain online email services earlier this year. Perhaps other organizations would be well advised to follow suit in the wake of the recent security breaches.

While organizations that store millions of user account credentials for online services are getting better at protecting their data, many still fall short. Two-factor authentication should be implemented everywhere, on user accounts as well as administrator accounts; unless it is widely adopted, it will not solve the problem at large.
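
For illustration, the sketch below shows what the second factor typically is in a time-based one-time password (TOTP) scheme, using the open-source pyotp library; the enrollment and login steps are simplified, and in practice the code would come from the user's authenticator app.

```python
# Illustrative only: a minimal TOTP two-factor check using pyotp
# (pip install pyotp).
import pyotp

# Enrollment: the service stores a per-user secret; the user loads it into an
# authenticator app (usually via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login: the password alone is not enough; the user must also supply the
# current six-digit code derived from the shared secret and the clock.
submitted_code = totp.now()   # in real life, typed in by the user
if totp.verify(submitted_code):
    print("second factor accepted")
else:
    print("second factor rejected")
```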

One of Gates’ suggestions for increasing cybersecurity effectiveness is preventing employees from using company machines and networks for personal business. While he agreed that this can be seen as a potential problem with ever-younger workforces – policing their usage can lead to animosity, distrust, and attrition – he proposed a solution that allows companies to protect their machines and networks from personal usage without upsetting the younger generations: “create an environment whereby personal interaction with the Internet can be done at work, without using corporate devices and the corporate network.” He explained that setting up a separate network, with different machines specifically labeled for personal usage by employees, could help keep both companies and their employees happy and safe.

Alternative Authentication


I had heard a rumor about security experts discussing a possible future in which alternative forms of authentication replace the password, so I asked Gates what this could mean. “Fingerprints, retinas, facial features, and even DNA are all very unique to each individual,” Gates explained to me. “In addition, researchers have recently discovered that each human’s hair proteins are also very unique. These are the types of things that must be used to authenticate someone; not passwords, tokens, and two-factor codes.

“There are some new developments in attempting to implement a better method of authentication across the board. For example, many laptops today come with fingerprint scanners. Smartphones are now using applications that can identify facial features for authentication using the cameras they come with. Physical security may include retinal scans and even hand scanners. These are things people can’t lose, can’t forget, and most likely can’t be stolen. Personally, I think facial feature authentication is a great step in the right direction. It’s not too overly intrusive and most people would not be afraid of it – like a retinal scan.”

While facial recognition is still in its infancy, the industry will soon become more proficient at identifying biomarkers on someone’s face that are difficult to spoof. The cameras on smartphones nowadays are as good as, if not better than, many of the stand-alone cameras on the market, so the next step would be to install higher quality cameras on computers that can adjust themselves automatically for different environmental conditions like lighting, makeup, hair, and aging. While other forms of authentication, such as retinal scanning, may seem intrusive, no one seems to mind taking pictures of themselves. For now, however, current two-factor authentication methods would still need to be implemented as a backup in case facial recognition fails.

So, it would seem that biometrics are possibly in the not-too-distant future as the new standard of authentication, with facial recognition being the most likely method to be implemented due to society already being pre-conditioned for taking selfies. In the meantime, companies need to make sure that their employees are using two-factor authentication on their company owned user accounts. Companies would also benefit from separating machines and networks intended for business related usage from machines and networks used for personal business purposes to help isolate attacks.

By Jonquil McDaniel
