Category Archives: Security

Cloud Startup In Focus – Anaplan


Cloud Analytics is a hot space right now, and there's no second-guessing it. Driven by the demand for quick analytics for anyone and everyone, from C-level executives down to a plant operator, startups are sprouting up and taking on the challenge of providing big data analytics for SMEs and enterprise companies alike. Anaplan is one such startup, with solid technology and a rapidly growing customer base. It provides modeling solutions for finance and operations, all served through the Cloud.

The industry has perfected the art of gathering data, huge amounts of it. This is what we often call "big data". It may be temperature readings inside a manufacturing plant streamed through a WSN (Wireless Sensor Network) into a big data store, an organization's financial transactions, consumer product sales, and so on. With humongous amounts of data being logged around the clock, what use is it if we cannot model it and derive trends, patterns and insights? Unfortunately, the "processing" part of this equation has yet to catch up, despite a plethora of new and old companies offering big data modeling and analytics solutions.


The big promise of Cloud based big data analytics is to enable decision makers to base their forecasts and decisions, however small, on real data constantly streamed into companies' own systems or available on the internet. One such case is sentiment analysis on Twitter: there are companies which offer solutions to process tweets and determine user sentiment about a particular product, service or event. Small and medium sized companies would find it hard to procure hardware and license software in order to log the Twitter data and then perform analytics on it. Cloud analytics provides a viable alternative: it helps you log and process data and generate analytic reports without owning any hardware at all.
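As a toy illustration of the kind of processing a cloud analytics service might run over a tweet stream, here is a minimal lexicon-based sentiment scorer. The word lists and scoring rule are invented for illustration; real services use far larger lexicons or trained models.

```python
# Minimal lexicon-based sentiment scoring over a batch of tweets.
# The tiny word lists below are illustrative only.

POSITIVE = {"great", "love", "excellent", "fast", "reliable"}
NEGATIVE = {"slow", "hate", "broken", "outage", "down"}

def sentiment(tweet: str) -> int:
    """Return +1 (positive), -1 (negative) or 0 (neutral) for one tweet."""
    words = {w.strip(".,!?").lower() for w in tweet.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

tweets = [
    "Love the new dashboard, excellent and fast!",
    "Service is down again, I hate this outage.",
    "Release notes published today.",
]
print([sentiment(t) for t in tweets])  # [1, -1, 0]
```

A cloud service runs exactly this kind of per-record scoring, just fanned out over millions of tweets on rented infrastructure instead of your own.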

Anaplan is one such startup. It recently closed its Series C with $30 million in funding. The company promises an "alternative to (Microsoft) Excel based analytics". This promise rests on a lightning-fast multi-dimensional modeling engine powered by its patented Hyperblock technology, a hybrid of relational, vertical and OLAP databases plus a calculation engine. One user reports processing 2.8 million transactional records in under 2 seconds, on the Cloud.

Since the complete application resides in the Cloud, there's no need for any dedicated hardware for it to work. You can fire up the application inside your browser, even on a mobile platform. This thin deployment, which they call "Zero Deployment", is what the Cloud is all about. You no longer need to license "boxes" along with the analytics solution, although there's still an aspect of vendor lock-in which needs to be addressed. The solution does come with a data migration tool which enables data to be imported and exported to and from popular platforms like Salesforce and SnapLogic.

Putting powerful yet simple-to-use analytics into the hands of decision makers frees them from the need to tie up IT and drives better decisions based on real data. Anaplan and startups like it are making headway into this territory.

By Salman Ul Haq

Are you a unique young cloud startup looking for coverage? Contact us and tell us why you believe you are the next rising star! 

Cloud Infographic: The Cloud Is Here To Stay



An infographic provided by FireHost, with statistical sources from Cisco, Gartner and Forbes:

  • The figure of 60% of business workloads running in the cloud by 2016 can be found in the Cisco Global Cloud Index.

  • "Forecast Overview: Public Cloud Services, Worldwide, 2011 – 2016" (Q4 Update, published Feb. 8, 2013) from Gartner, as well as Forbes.


Infographic Source: FireHost


First High Performance Cloud Benchmarks Unveiled


Study highlights need for transparency in the cloud

Dallas – May 14, 2013 – Until now, there have been very few verifiable, objective cloud performance benchmarks. FireHost, the secure cloud hosting company, has again bridged an important gap for enterprises that require a cloud they can trust before they'll consider moving production workloads to it. To give IT stakeholders genuine, meaningful and independently verified data about cloud performance, FireHost and six other leading cloud hosting providers were benchmarked by an independent, third-party research firm in a series of extensive tests providing a head-to-head comparison. FireHost also created several impactful infographics that depict the market and business drivers for high performance cloud in healthcare, payments and more general IT environments.

"As enterprises migrate to the cloud, it is critical that they consider the right metrics that will have positive returns on investment without impairing performance in speed, flexibility, reliability, control, efficiency and costs," said Chris Wiles, CTO of Zeta Compliance Technologies. "This kind of performance comes down to designing an infrastructure that is top spec in absolutely every department, which is why we chose FireHost as our hosting partner."

Three factors were individually tested to measure server performance – processor, memory and storage – and all three were also evaluated working jointly during workloads for an overall server performance grade. A bare-metal dedicated server was used as a benchmark control. FireHost scored highest in all areas, including overall server performance, outperforming Amazon Web Services EC2, Dell vCloud, HP Cloud, Microsoft Azure, Rackspace Cloud Express and Terremark vCloud. Only three of the seven cloud providers scored higher than the dedicated server in average overall server performance*.

"Performance is the game-changing metric that no one talks about because it's difficult and expensive to measure," said Todd Gleason, Director of Technology for FireHost. "Many cloud infrastructures were built with a single focus on price point, but FireHost serves enterprises with critical services that have more sophisticated considerations and realize that a number of factors beyond the invoice from your cloud hosting provider impact the total cost to own an IaaS solution."

The performance benchmark study offers IT stakeholders a great deal of much-needed transparency before committing to a fully managed IaaS provider. The report demonstrates that not all clouds are created equal and provides a baseline to help improve predictions of how critical production applications will perform once in a cloud infrastructure. For example, the processor, memory and storage of a high performance cloud infrastructure can produce double the efficiency with fewer resources than commodity clouds. The total cost of ownership can also be considerably lower with a high performance cloud than with a lower performing, commodity cloud option.

"FireHost has always carved a unique path in the secure cloud category, focusing first on enterprise-grade security and backing it up with a high performance cloud, demanded by enterprises for production workloads," said Chris Drake, founder and CEO of FireHost.

Additional information on FireHost’s secure, high-performance cloud services and a copy of the benchmarking study can be obtained from FireHost.

*For overall server performance testing, the benchmark observed how the server groups fared using real-life workloads, measuring the holistic performance of coordinated operations between memory, processor and storage. The control server specifications included a Dell PowerEdge server configured with 16GB DDR3 1066Mhz memory, dual quad core Intel Xeon E5506 2.13GHz processors and 2.5” Seagate SAS 15K RPM drives.

About FireHost

FireHost offers the most secure cloud hosting available, protecting sensitive data and brand reputations of some of the largest companies in the world. With infrastructure built for security, compliance, performance and scalability, responsible businesses choose FireHost to reduce risk and improve the collection, storage and transmission of their most confidential data. Secure cloud servers, available in Dallas, Phoenix, London and Amsterdam, offer robust, geographically redundant business continuity options across all sites. Based in Dallas and funded by The Stephens Group, FireHost is the chosen secure cloud service provider for brands that won’t compromise on the security of their payment card and healthcare data. http://www.firehost.com.

Company Contact:

Cathi Lane
FireHost
cathi.lane@firehost.com
+1.877.262.3473 x. 8133

Editorial Contact:

Sarah Hawley
Ubiquity Public Relations
sarah@ubiquitypr.com
480.292.4640

Google Dumps Custom Linux In Favor Of Debian For App Engine


Google has switched from its own custom version of Linux, which it called "Google Compute Engine Linux", to the more standard and generic Debian Linux for App Engine. The switch coincides with the release of Debian 7.0, codenamed "wheezy", which brings significant improvements and bug fixes contributed by the community. The change was announced last week on the Google App Engine blog. Google's support for a standard version of Linux should ease compatibility issues and also broaden its user base.

In addition to the other benefits of the switch, Google will also tap into the massive Debian community, which can now easily migrate applications and solutions onto Google App Engine without going through nightmarish migration and compatibility procedures. According to Jimmy Kaplowitz, a site reliability engineer at Google and a Debian developer, "We (Jimmy's team) are continually evaluating other operating systems that we can enable with Compute Engine". This statement suggests a continuous lookout for more operating systems which may be a good fit, from both technical and business perspectives. Apart from Debian, Google also supports CentOS, a derivative of RHEL (Red Hat Enterprise Linux) and perhaps the second most popular Linux distribution after RHEL in the enterprise world.

Even though Debian is not the first Linux distribution to support the Cloud, it stands out as a stable and secure distribution. The latest release, 7.0 "wheezy", ships with improvements which are vital for deployment in any Cloud environment. Some of these include:

  • Support for the OpenStack suite as well as XCP (Xen Cloud Platform), which gives users the choice to deploy their own cloud infrastructure. This will definitely be attractive for Google, since there is momentum building around OpenStack within the Cloud infrastructure community. In addition to using Debian on App Engine, you can also deploy your own private Cloud using Debian, thanks to its tight integration with OpenStack and XCP.

  • Support for s390x, the 64-bit port for IBM System z machines

  • Stable support for 32/64 bit machines

  • Full Disk Encryption using geli. This is important, if not vital for any serious enterprise deployment.

  • A massive addition of around 12,800 new packages, bringing the total to 37,493 packages, by far the highest number available in any Linux distribution. One of the motivations behind this push for new packages was to earn the status of "Universal Operating System".

To ease the migration and deployment of your application stack on App Engine, Google has created its own Debian package mirror which can be used by Google Compute Engine Debian instances. Debian is definitely a pleasant addition to App Engine and indicates a move by Google towards a standards-based Cloud platform.

By Salman Ul Haq

Cloudera Not Cutting It With Big Data Security


Cloudera is, for the moment, a dominating presence in the open source Hadoop landscape; but does it have staying power? While Cloudera’s Big Data platform is the darling of the Hadoop space, they and their open source distribution competitors have so far failed to adequately address the elephant in the room: enterprise data security.

Cloudera’s Chief Architect and creator of Hadoop, Doug Cutting, recently discussed the growing value of Big Data in a CNBC Squawk Box segment, but nervously glossed over the subject of data security when it was raised. Benzinga reported Cutting as saying that, “…the value of Cloudera outweighs most security concerns,” thereby demonstrating a level of hubris and naivety that should put every IT security professional on high alert.  Their dismissive approach to Big Data security should really come as no surprise. Hadoop was not written with security in mind, and to date, the open source Hadoop community, including Cloudera, has not focused on addressing this critical gap.  For enterprise organizations with data at risk, especially those companies that must adhere to regulatory compliance mandates, this should be cause for concern.

Hadoop was a spin-off sub-project of the Apache Lucene and Nutch projects, which are based on a MapReduce framework and a distributed file system. That initial application, web indexing, did not require any integrated security.  Hadoop is also the open-source version of the Google MapReduce framework, and the data being stored (public URLs) was not subject to privacy regulation. The open source Hadoop community supports some security features through the current implementation of Kerberos, the use of firewalls, and basic HDFS permissions.  However, Kerberos is difficult to install, configure, and integrate with Active Directory (AD) and Lightweight Directory Access Protocol (LDAP) services.  Even with special network configuration, a firewall has limited effectiveness, can only restrict access on an IP/port basis, and knows nothing of the Hadoop File System or Hadoop itself.
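The "basic HDFS permissions" mentioned above follow the familiar POSIX owner/group/other model. As a rough sketch of why such checks are a thin defense on their own, here is how a read check works conceptually (a simplification for illustration, not Hadoop's actual code):

```python
# Simplified POSIX-style permission check of the kind HDFS applies.
# The weakness: the decision trusts the *claimed* user identity,
# which is exactly why strong authentication (Kerberos) is needed.

R = 4  # read permission bit

def can_read(mode: int, owner: str, group: str, user: str, user_groups: set) -> bool:
    if user == owner:
        return bool((mode >> 6) & R)   # owner bits
    if group in user_groups:
        return bool((mode >> 3) & R)   # group bits
    return bool(mode & R)              # other bits

# A file with mode 640 (rw-r-----), owned by 'etl' in group 'analytics'.
mode = 0o640
print(can_read(mode, "etl", "analytics", "etl", set()))          # True  (owner)
print(can_read(mode, "etl", "analytics", "bob", {"analytics"}))  # True  (group)
print(can_read(mode, "etl", "analytics", "eve", set()))          # False (other)
```

Without Kerberos, nothing stops a client from simply claiming to be "etl", which is why permissions alone do not satisfy the compliance mandates discussed below.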

Enterprises want the same security capabilities for Big Data as they have now for “non-Big Data” information systems, including solutions that address user authentication, access control, policy enforcement, and encryption.  Many organizations require these Big Data safeguards in order to maintain regulatory compliance with HIPAA, HITECH, SOX, PCI/DSS, and other security and privacy mandates.  But they won’t find those safeguards in open source Hadoop distributions today.  Community initiatives underway such as Knox and Rhino are intended to improve Hadoop’s security posture, but tangible results will take time and will certainly lag behind more aggressive commercial efforts.

Cloudera and other distribution vendors are essentially branding open source Hadoop, along with its inherent security limitations.  While Cloudera is perceived as a software company, in reality the vast majority of its revenue is derived from professional services, training, and support.  It’s unlikely that Cloudera will suddenly invert its business model and come to the rescue with an integrated software solution for data security.  Does this mean that Cloudera and other open source Hadoop solutions are dangerous to deploy?  Only if IT organizations ignore the inherent security gaps and risks involved, and do not take adequate precautions to secure the data store.

The recent $45 million cybercrime heist involving ATM machines in New York and around the world is a perfect example of how unauthorized access to a compromised data store can result in tremendous financial loss to the victimized financial institution.  And, by the way, ATM transaction records are exactly the kind of unstructured Big Data that ends up being stored in a Hadoop environment.

For organizations needing robust Big Data security now, Orchestrator, a commercial software solution from Zettaset, provides enterprise-class security that is embedded in the Big Data cluster itself, moving security as close as possible to the data, and providing protection that perimeter security devices such as firewalls simply cannot deliver.   Zettaset’s Orchestrator software automates cluster management and security, and works in conjunction with most Hadoop distributions, including Cloudera’s, to address open source vulnerabilities in datacenter environments where security and compliance is a business imperative.

While open source Hadoop solutions such as Cloudera’s do indeed have value, make no mistake: The security demands of today’s at-risk enterprises clearly represent a much higher priority for IT professionals and the organizations they serve.

By Jim Vogt, Zettaset CEO

With more than 25 years of leadership experience in both start-up and established corporations, Jim Vogt brings a wealth of business and technology expertise to his role as president and CEO of Zettaset. Most recently, Jim served as senior vice president and general manager of the Cloud Services business unit at Blue Coat Systems. Prior to Blue Coat, he served as president and CEO at Trapeze Networks, which was acquired by Belden, Inc. He was also president and CEO at data encryption start-up Ingrian Networks (acquired in April, 2008 by SafeNet).

HR Security Risk Prevention…


With the rapid adoption of the Cloud by SMEs as well as large enterprises, it has become vital to review and update HR policies to mitigate the information security threats that come with this paradigm shift. Cloud systems differ from traditional, in-house IT infrastructure in that businesses now have less control over their software, handing most of that control to third-party Cloud service providers. For example, it is hard to keep track of an employee's browser history if he or she is connected to a virtualized environment inside the Cloud. Your business data is more vulnerable in the hands of an employee using the Cloud, since the chances of involuntary information spillage are greater in Cloud environments.

For companies moving to the Cloud, or those which have already made the transition, it is important that their CIOs sit down and review IT staff policies to adequately cover the company against the risk of employees using company information for illegitimate purposes. CIOs may make the policies, but when it comes to enforcing anything on employees, HR has to be involved, so it's better to involve them early on instead of handing them a plethora of information security policies for the Cloud.

To start with, companies should enforce technology-based restrictions on what an employee can and cannot do vis-à-vis Cloud apps. Of course, you have to make sure that the Cloud solution provider conforms to your information security requirements. For example, employees should not be allowed to send emails to their private accounts via the Cloud without prior permission. HR staff also need to include Cloud-related policy decisions in the employee handbook.

For example:

  • Whether an employee can use public Cloud storage solutions like DropBox at work and more importantly, does the company allow information to be put into public Cloud storage services?
  • Can an employee use personal handheld devices like smartphone/tablet at/for work?
  • Can an employee be allowed to send emails to private accounts to facilitate his/her work outside the office environment? If so, should that email be CC'ed to someone else as well?
  • Does the policy handbook cover in detail the use of internet, email and other IT transactions from work, and whether they can be monitored?
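Part of the private-email question above can be enforced mechanically as well as on paper. Here is a minimal sketch of a recipient-domain policy check; the domain lists and category names are hypothetical examples, not any particular product's rules:

```python
# Sketch of a recipient-domain policy check for outbound mail.
# The allowed/personal domain lists below are hypothetical.

ALLOWED_DOMAINS = {"example-corp.com", "partner.example.org"}
PERSONAL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com"}

def classify_recipient(address: str) -> str:
    """Return 'allow', 'needs-approval' or 'review' for one recipient."""
    domain = address.rsplit("@", 1)[-1].lower()
    if domain in ALLOWED_DOMAINS:
        return "allow"
    if domain in PERSONAL_DOMAINS:
        return "needs-approval"   # private account: require prior permission
    return "review"               # unknown domain: flag for HR/IT review

print(classify_recipient("alice@example-corp.com"))  # allow
print(classify_recipient("alice@gmail.com"))         # needs-approval
```

A real deployment would hang such a check off the mail gateway, but the policy logic is exactly what the handbook questions above have to spell out first.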

HR policy should clearly define what falls under 'company information' and 'company property'. IT policy also needs to be updated periodically, because along with the plethora of new possibilities the Cloud brings for businesses, it also opens loopholes in a company's information security policy.

By Salman Ul Haq

5 Cloud Computing Trends For 2013


In 2012, cloud computing became a much bigger trend in the business and networking world. IDC has predicted 130% growth in cloud computing by 2016, to $43 billion. Here are five trends to look out for in 2013 which are going to help boost cloud computing in the long term:

Subscription

With more and more companies looking into cloud computing, the hype grows every day and more businesses are using it. Colleges are using it to store lecture data for easy access by the class. Businesses are using it to work from home, to get easy access to documents and to share important company information. One of the big changes expected in 2013 is the subscription model for Cloud computing: the idea is that you pay only for the data capacity you need, rather than buying in bulk for space that may never be used. It is a powerful reassurance as well, knowing that your paid-for data is protected and accessible at any given time.
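The pay-for-what-you-use idea is simple arithmetic. As a toy sketch (the $0.10 per GB-month rate and daily metering interval are invented for illustration):

```python
# Toy pay-per-use storage bill: charge only for gigabytes consumed,
# metered daily. The $0.10/GB-month rate is an invented example.

RATE_PER_GB_MONTH = 0.10

def monthly_bill(daily_gb_used, days_in_month=30):
    """Average the daily usage samples and price the result."""
    avg_gb = sum(daily_gb_used) / days_in_month
    return round(avg_gb * RATE_PER_GB_MONTH, 2)

# 10 GB for the first 15 days, then 40 GB after a large import.
usage = [10] * 15 + [40] * 15
print(monthly_bill(usage))  # 2.5
```

Contrast this with the bulk model, where the same customer would pay for 40 GB (or more) for the whole month whether or not the space was used.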

Recovery Services

Cloud computing gives you so much freedom that it could eventually start to replace backup companies as a cost-effective and easy-to-use way to back up your whole company. Instead of resources stored only on the company intranet, important company documents can be stored online, saved and updated regularly. There has been more and more talk, and actual action, of small and medium businesses moving toward the cheaper cloud alternative, and the trends show that larger businesses are beginning to see the potential in cloud computing, too.

Security

The one hold-back of cloud computing at this moment in time is the lack of, or perceived lack of, quality in the defense it offers you and your data. Cloud computing, however, is improving all the time, and providers are making a continued effort to win new clients by producing a top-quality security structure. This is essential, as cloud computing is all about protection and privacy anyway, so getting this right could really ignite the niche. It is surprising, because you would imagine that a technology as important as cloud computing would already have top-grade security.

Specific Designs

Much like when phone apps became big and more companies started to spend money on having a mobile app designed to complement the website. Restaurants perhaps saw the most use from this, as it gave them a new dimension and something to hook in potential customers with. Businesses are beginning to view cloud computing in the same light: it can be shifted and edited to make your organization more powerful and to give you a top-of-the-range service which is unique at present. As the trends show, more and more businesses are looking to leverage the power of the internet, and a modified version of cloud computing could be an extremely valuable tool for certain niches.

Hoarding

This may sound odd, but the term 'hoarders' refers to people who pile up crazy amounts of junk in their house and hold onto it for years, believing all of it to be extremely valuable. Cloud computing has been shown to be heading toward a budding trend of hoarders getting involved: more people are filling up their cloud storage with random old files and sentimental objects that remind them of a previous time in their life. They do not want to delete these files permanently, but they don't really intend to look at them or use them ever again.

By Robert Smith

IT Disaster Recovery For SMEs


According to credible estimates, an hour of outage may cost a medium-sized company $70,000 in accumulated losses when IT systems go offline. What's interesting to note is that, in contrast to the popular belief that natural disasters are the primary cause of IT system failure, a recent study finds hardware failure to be the leading cause, by a big margin, of IT disasters and of the losses, both financial and reputational, which small and medium-sized businesses incur. However, if SMEs take the right precautions, much of the loss can be quickly remedied, even if disaster does occur.

I do not need to argue the importance of prompt recovery from IT disasters. Even if your business can absorb $70,000 per hour in losses, the loss in customer confidence, especially for consumer-facing enterprises, may never be repaired. A study by HP and SCORE also reveals that a quarter of medium-sized businesses go broke as a result of a major disaster. That alone shows the ROI of investing time and money in contingency planning and in executing dry runs to ensure your plan works.
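The $70,000-per-hour figure makes downtime cost a back-of-the-envelope calculation. A minimal sketch, using that cited figure (any other hourly rate can be substituted):

```python
# Back-of-the-envelope downtime cost, using the $70,000/hour figure
# cited above. Duration in minutes; result in dollars.

HOURLY_COST = 70_000

def outage_cost(minutes: float) -> float:
    return HOURLY_COST * minutes / 60

print(outage_cost(90))  # 105000.0 for a 90-minute outage
print(outage_cost(15))  # 17500.0 for a 15-minute blip
```

Even a short outage dwarfs the cost of most backup tooling, which is the ROI argument in numbers.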


Among the four major types of disasters (hardware failure, natural disasters, human error and software failure), only natural disasters are outside human control; everything else, including human error, can be tamed, if not controlled. The key, however, is to be prepared for extreme situations and to base your plans on the disaster-prediction studies available.

Unless your organization is unique, it's very likely that you have one SAN (Storage Area Network) or NAS (Network Attached Storage) being utilized across the organization. In keeping storage simple and scalable, organizations tend to neglect the doomsday scenario that a slight failure of that SAN may trigger. On top of it, all data, including virtualized storage, relies on this one big SAN. Now imagine the SAN failing for any reason, and there are plenty. Since the whole IT environment is connected to the SAN, the entire IT infrastructure comes to a halt. This is not a hypothetical scenario which I'm creating to drive my point home; it is one of the major causes of hardware failure resulting in IT disasters.

Let's look at some of the measures organizations may take to mitigate these risks. First comes redundancy, but even with layers of redundancy, if your SAN is not diversified (separate systems rather than one big unit), there is a good chance those added layers of redundant storage will fall like a house of cards when disaster strikes. Next comes ensuring that a standard data backup policy is made and followed in letter and spirit. However, surveys suggest that it normally takes tens of hours to recover from SAN failure with tape and disk backups, and some studies draw an even starker picture by claiming that tape backups often fail.
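A backup that "often fails" is usually one that was never verified. As a minimal sketch of the principle, here each backup records a checksum at write time and restores are checked against it; the in-memory dictionary stands in for whatever tape, disk or cloud store is actually used:

```python
# Sketch of checksum-verified backup: record a digest when the copy
# is written, re-hash and compare before trusting a restore.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def backup(data: bytes, store: dict, name: str) -> str:
    """Write a copy plus its checksum; return the checksum."""
    digest = sha256_of(data)
    store[name] = (data, digest)
    return digest

def verify(store: dict, name: str) -> bool:
    """Re-hash the stored copy and compare with the recorded digest."""
    data, digest = store[name]
    return sha256_of(data) == digest

store = {}
backup(b"critical SAN snapshot", store, "snap-001")
print(verify(store, "snap-001"))  # True
```

Running the verify step routinely, not just on restore day, is what turns a backup policy followed "in letter and spirit" into one that actually recovers data.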

Cloud backup is an emerging trend, primarily driven by the idea of 'physically' diversifying your storage network. Organizations which deeply embrace the Cloud completely let go of any internal SAN and rely on the Cloud alone. This may not be a wise move, considering that the Cloud may also fail (remember the Amazon EC2 failure which brought down major internet services like Reddit?). Using Cloud backup is a credible plan for recovering from storage-related IT failures, and diversifying your Cloud backup pool further strengthens your IT and mitigates failure risks.

No matter how strong your IT systems are, they're prone to failure, whether because a system administrator accidentally wipes out a server file system or a hurricane sweeps through your data center. Preparation is the key. Read the full Quorum Disaster Recovery Report 2013.

By Salman Ul Haq

CloudTweaks Comics
Consequences Of Combining Off Premise Cloud Storage and Corporate Data

Consequences Of Combining Off Premise Cloud Storage and Corporate Data

Off Premise Corporate Data Storage Cloud storage is a broad term. It can encompass anything from on premise solutions, to file storage, disaster recovery and off premise options. To narrow the scope, I’ve dedicated the focus of today’s discussion to the more popular cloud storage services—such as Dropbox, Box, OneDrive—which are also known as hosted,…

The Future Of Work: What Cloud Technology Has Allowed Us To Do Better

The Future Of Work: What Cloud Technology Has Allowed Us To Do Better

What Cloud Technology Has Allowed Us to Do Better The cloud has made our working lives easier, with everything from virtually unlimited email storage to access-from-anywhere enterprise resource planning (ERP) systems. It’s no wonder the 2013 cloud computing research IDG survey revealed at least 84 percent of the companies surveyed run at least one cloud-based application.…

Cloud Computing – The Real Story Is About Business Strategy, Not Technology

Cloud Computing – The Real Story Is About Business Strategy, Not Technology

Enabling Business Strategies The cloud is not really the final destination: It’s mid-2015, and it’s clear that the cloud paradigm is here to stay. Its services are growing exponentially and, at this time, it’s a fluid model with no steady state on the horizon. As such, adopting cloud computing has been surprisingly slow and seen more…

Using Big Data To Analyze Venture Capitalists’ Ability To Recognize Potential

Using Big Data To Analyze Venture Capitalists’ Ability To Recognize Potential

Big Data To Analyze Using Big Data to Analyze Venture Capitalists’ Ability To Recognize Potential For those who are regularly involved with SMEs, venture capital, and company valuations, it is common knowledge that start-ups that exit for more than $1 billion dollars are extremely rare – often termed ‘unicorn’ companies. Despite their rarity, it should…

Cloud Infographic – Monetizing Internet Of Things

Cloud Infographic – Monetizing Internet Of Things

Monetizing Internet Of Things There are many interesting ways in which companies are looking to connect devices to the cloud. From the vehicles to kitchen appliances the internet of things is already a $1.9 trillion dollar market based on research estimates from IDC. Included is a fascinating infographic provided by AriaSystems which shows us some of the exciting…

The Future Of Cloud Storage And Sharing…

The Future Of Cloud Storage And Sharing…

Box.net, Amazon Cloud Drive The online (or cloud) storage business has always been a really interesting industry. When we started Box in 2005, it was a somewhat untouchable category of technology, perceived to be a commodity service with low margins and little consumer willingness to pay. All three of these factors remain today, but with…

Cloud Computing Services Perfect For Your Startup

Cloud Computing Services Perfect For Your Startup

Cloud Computing Services Chances are if you’re working for a startup or smaller company, you don’t have a robust IT department. You’d be lucky to even have a couple IT specialists. It’s not that smaller companies are ignoring the value and importance of IT, but with limited resources, they can’t afford to focus on anything…

The CloudTweaks Archive - Posted by
Cloud Computing – The Game Changer

Cloud Computing – The Game Changer

Global Cloud Index In October, Cisco released its Global Cloud Index (GCI) report for 2014-2019, projecting a near 3-fold growth of global data center traffic, with predictions that this traffic will reach 8.6 zettabytes (cloud data center traffic) and 10.4 zettabytes (total data center traffic) per year in 2019 and 80% of it will come…

The Business of Security: Avoiding Risks

The Business of Security: Avoiding Risks

The Business of Security Security is one of those IT concerns that aren’t problematic until disaster strikes. It might be tomorrow, it could be next week or next year. The fact is that poor security leaves businesses wide open for data loss and theft. News outlets just skim the surface, but hackers cost business up…

Will Your Internet of Things Device Testify Against You?

Will Your Internet of Things Device Testify Imagine this: your wearable device is subpoenaed to testify against you. You were driving while over the legal alcohol limit, and data from a smart Breathalyzer device is used against you. Some might argue that such a use case could potentially safeguard society. However, it poses…

The Fully Aware, Hybrid-Cloud Approach

Hybrid-Cloud Approach For over 20 years, organizations have been attempting to secure their networks and protect their data. However, have any of their efforts really improved security? Today we hear journalists and industry experts talk about the erosion of the perimeter. Some say it’s squishy, others say it’s spongy, and yet another claims it’s crunchy.…

Moving Your Email To The Cloud? Beware Of Unintentional Data Spoliation!

Cloud Email Migration In today’s litigious society, preserving your company’s data is a must if you (and your legal team) want to avoid hefty fines for data spoliation. But what about when you move to the cloud? Of course, you’ve probably thought of this already. You’ll have a migration strategy in place and you’ll carefully…

3 Keys To Keeping Your Online Data Accessible

Online Data Data storage is often a real headache for businesses. Additionally, the shift to the cloud in response to storage challenges has caused security teams to struggle to reorient, leaving 49 percent of organizations doubting their experts’ ability to adapt. Even so, decision makers should not put off moving from old legacy systems to…

Cloud Native Trends Picking Up – Legacy Security Losing Ground

Cloud Native Trends Once upon a time, only a select few companies like Google and Salesforce possessed the knowledge and expertise to operate efficient cloud infrastructure and applications. Organizations patronizing those companies benefited from apps that offered new advantages in flexibility, scalability and cost effectiveness. These days, the sharp division between cloud and on-premises infrastructure…

Using Cloud Technology In The Education Industry

Education Tech and the Cloud Arguably one of society’s most important functions, teaching can still seem antiquated at times. Many schools still function much as they did five or ten years ago, which is surprising considering the amount of technical innovation we’ve seen in the past decade. Education is an industry ripe for innovation…

Why Security Practitioners Need To Apply The 80-20 Rules To Data Security

The 80-20 Rule For Security Practitioners Every day we learn about yet another egregious data security breach, exposure of customer data or misuse of data. It begs the question why, in the 21st century, we as a security industry cannot seem to secure our most valuable data assets when technology has surpassed our expectations in other regards.…

Don’t Be Intimidated By Data Governance

Data Governance Data governance, the understanding of an organization’s raw data, is an area IT departments have historically viewed as a lose-lose proposition. Not doing anything means organizations run the risk of data loss, data breaches and data anarchy – no control, no oversight – the Wild West, with IT just hoping…

What the Dyn DDoS Attacks Taught Us About Cloud-Only EFSS

DDoS Attacks October 21st, 2016 went into the annals of Internet history for the large-scale Distributed Denial of Service (DDoS) attacks that made popular Internet properties like Twitter, SoundCloud, Spotify and Box inaccessible to many users in the US. The DDoS attack happened in three waves targeting DNS service provider Dyn, resulting in a total of about…