Category Archives: Technology

Ensuring Cloud Authorizations Are Correct

Cloud Authorization

Almost all organizations in every industry now use some type of cloud application. This is because of cost, efficiency and ease of use, and because many software companies now offer their solutions in the cloud. Microsoft 365 and the Adobe suite, for example, are mostly used by organizations in their cloud versions.

Cloud applications have many benefits for both the organization and for the end user, but there also needs to be some type of guideline or solution in place to ensure that they are managed correctly. There are many account and access management issues that come with implementing cloud applications for your organization.

So what are some of the issues organizations have with access management for cloud applications? As with in-house applications, two things often happen: end users are either given too few rights and need to request additional access, or they accidentally receive rights to systems and applications that they should not have.


In the first scenario, employees can request additional access rights from the application manager at their organization, but this is very inefficient. They need to contact someone in the company who handles access and ask for an account to be created or additional rights to be granted. This is frustrating for the employee and for the manager, who is likely working on other projects. The employee then has to wait until the account is created, and may need to follow up with the admin to see whether the request is being processed.

The latter problem is a major security concern for the organization. Often, for convenience, a new employee's account is copied from that of another employee in a similar role. This can leave the employee with access rights they should not have, possibly to sensitive information.

The issue is difficult to manage, and someone needs to manually create access or check that access rights are accurate. If you are a system admin, a CIO or another technology director, you know that either no one is designated to complete these tasks, or they are delegated to an employee with an already full workload.

So much for everything your organization is having issues with. How can these problems be resolved, and what type of solution and guidelines should be put in place so that they don't regularly occur?

An identity and access governance (IAG) solution is the first way to help ensure that all rights are correct. The company sets up a model of exactly which access rights belong to each role in the organization. For example, a manager in the IT department will need certain access rights to systems, applications and resources. The model allows the person creating an account to do so easily and without access mistakes, giving the employee neither too many rights nor too few.
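
The role model described above can be sketched in a few lines. This is an illustrative sketch only; the role and application names are assumptions, and a real IAG product would hold this model centrally rather than in code.

```python
# Hypothetical role model: each role maps to exactly the rights it needs.
ROLE_MODEL = {
    "it_manager": {"ticketing", "monitoring", "admin_console"},
    "marketing": {"crm", "email_campaigns"},
}

def provision_account(username, role):
    """Create an account with exactly the rights the model defines."""
    if role not in ROLE_MODEL:
        raise ValueError(f"No model defined for role: {role}")
    # Copy the model's rights so later edits don't mutate the model itself.
    return {"user": username, "rights": set(ROLE_MODEL[role])}

account = provision_account("alice", "it_manager")
```

Because the rights come from the model rather than from another employee's account, the copy-an-existing-user shortcut (and the excess rights it brings) is avoided.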

Once an account is created for the employee how can it be ensured that going forward changes are made efficiently and the network remains secure?

Another solution that can be used is workflow management. These applications are a controlled, automated process with a defined sequence of tasks that can replace an otherwise manual process. This allows for a streamlined process for employee requests and their implementation.

Using a web portal, employees can request additional access rights to their current applications or even to new applications. A workflow is set up so that when a user requests a change, the request goes through a predefined sequence of people who need to approve it before the change is implemented. The organization can set up the workflow process however it desires, so that depending on the user and what they request, the process follows a specific sequence. There is also no need for the employee to bother their manager to check on the request: they can access the web portal and see exactly where the request is and what steps still need to be completed.
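
The approval sequence above can be modelled as a small state machine. This is a minimal sketch under assumed approver names ("team_lead", "app_owner"); a real workflow engine would add notifications, escalation and persistence.

```python
class AccessRequest:
    """A request that must pass through a fixed chain of approvers."""

    def __init__(self, user, right, approvers):
        self.user = user
        self.right = right
        self.pending = list(approvers)   # approvals still outstanding, in order
        self.approved = []               # approvals already given

    def approve(self, approver):
        # Approvals must arrive in the predefined sequence.
        if not self.pending or self.pending[0] != approver:
            raise ValueError(f"{approver} is not the next approver")
        self.approved.append(self.pending.pop(0))

    @property
    def status(self):
        """What the employee sees when checking the portal."""
        if not self.pending:
            return "ready to implement"
        return f"waiting on {self.pending[0]}"

req = AccessRequest("alice", "crm", ["team_lead", "app_owner"])
req.approve("team_lead")
print(req.status)  # waiting on app_owner
```

The `status` property is what removes the need to "bother their manager": the employee can see at any time which step the request is stuck on.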

There are also several ways to double-check access rights throughout the year, or at any interval, to ensure that everything is correct. These methods allow someone to verify everything easily and efficiently.

One way this can be achieved is with reconciliation. This module of an IAG solution compares the access rights defined in the model with the rights actually in place, and reports any differences. Anything inaccurate can then be sent to the appropriate manager to review and correct if needed.
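
At its core, reconciliation is a set comparison between modelled and actual rights. A minimal sketch, with hypothetical right names:

```python
def reconcile(modelled, actual):
    """Compare modelled rights with actual rights and report differences."""
    return {
        "missing": modelled - actual,  # rights the model grants but are absent
        "excess": actual - modelled,   # rights present that the model forbids
    }

report = reconcile(
    modelled={"crm", "email_campaigns"},
    actual={"crm", "admin_console"},   # admin_console should not be there
)
```

A non-empty `excess` entry is exactly the copied-account problem from earlier: rights that exist in the system but appear nowhere in the model.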

Attestation is yet another form of checking access, and goes one step further to verify everything is correct. A report is sent to each department manager, listing all of their employees, for them to verify. For example, the marketing manager receives a report on the access rights of everyone in the marketing department. He or she reviews it and either marks access rights for deletion, changes them directly, or creates a ticket in the helpdesk system to change them. After reviewing everything, the manager gives final approval of the proposed set of changes to ensure that everything is correct.

For organizations to get the most benefit from cloud applications, there need to be guidelines and solutions in place to help manage the accounts in these applications. These are just some of the many ways IAG solutions let an organization easily ensure correct access rights.

By Dean Wiech

Was The Promised Land Of Cloud False? Or Did It Just Take A While?

Cloud Consumption

A new day has dawned! Computing will now be accessed and consumed like a power utility. Just flip the switch and consume what you need; when done, turn it off and pay only for what you used. Why, it is so cheap and easy to use that you can buy it with your credit card. No more waiting months to get equipment purchased, installed and verified. Welcome to the Promised Land – or so Amazon Web Services (AWS) promised us when it launched its cloud offering ten years ago.

But look where we are today. Sure, AWS is a behemoth with an annual run rate of over $10 billion. On the other hand, the promise of a simple, easy-to-use utility has been replaced by a wild garden of over sixty products and services. A growing number of firms are lining up to be AWS Certified MSPs (Managed Service Providers) just to help you navigate this thicket. And AWS's competitors, Microsoft and Google, are proliferating their offerings as well as they chase the market leader. What happened? Amazon will tell you it is just responding to the needs customers share with it. That's true, but let's look deeper.

Consider our power utility analogy. All power in a household comes out of standard outlets at standard voltages and amperages. What we often don't think about is how we turn that power into useful work. I am writing this on a computer whose power supply steps the voltage down to the low levels needed to process information through integrated circuits and memory drives.

I had toast this morning created by a toaster that took the full power and turned it into heat. The vacuum cleaner used a different amount to turn it into mechanical work. Think of your appliances as applications that take the raw standard electrical power and create some useful outcome for you. The key is that they manipulate that power – raising it up or down – to produce the needed outcomes.


That’s not quite the way it works in computing. Applications need different amounts of resources, depending on what they are designed to do, in order to function well. We are used to the applications we run on our personal computers and mobile devices; these were all designed for those standard platforms. Even today, some run better than others depending on the machine you have. Some newer applications won’t even run on old machines, or run so slowly as to be impractical to use.

Imagine the difference between running the applications for a retail website, versus processing checks for payroll, versus analyzing a piece of the human genome. These are very different tasks needing very different levels of capability to be effective. So, was the cloud’s promise of computing being an easy to use utility a bogus come-on designed to draw in the unsuspecting? Not really, it was more of an imperfect analogy. (Aren’t they all?).

In the “early days” of cloud computing, developers were used to thinking in terms of servers, memory, storage and so on. When AWS started, it packaged its offerings in this familiar way, which meant the developer had to know the processing speed and capacity the application needed to run well. Lots of different applications mean lots of different sizes and combinations – that’s how we got the unruly garden.

But what if that was not necessary? What if the machines were “smart” enough to know what the application needed? (I know, this takes a little time to get used to.) That’s where AWS Lambda comes in. The application is written to the Lambda service – you do not specify any infrastructure – and is then activated by a triggering event. The event can be almost anything; let’s say someone wants to place an order on your site. The Lambda service then turns on the right resources, executes the application, and you are billed only for as long as it took to execute your application. Billing is $0.00001667 for every GB-second used – voila! – a true utility.
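
Using the per-GB-second rate quoted above, the utility-style billing is easy to work out. The memory size, duration and invocation count below are illustrative assumptions, and the per-request fee Lambda also charges is deliberately left out:

```python
PRICE_PER_GB_SECOND = 0.00001667  # the rate quoted in the article

def lambda_compute_cost(memory_mb, duration_ms, invocations):
    """Approximate Lambda compute cost (excludes the per-request fee)."""
    gb_seconds = (memory_mb / 1024) * (duration_ms / 1000) * invocations
    return gb_seconds * PRICE_PER_GB_SECOND

# One million invocations of a 512 MB function running 200 ms each:
cost = lambda_compute_cost(memory_mb=512, duration_ms=200,
                           invocations=1_000_000)
```

That works out to 100,000 GB-seconds, roughly $1.67 of compute for a million orders: the "pay only for what you used" promise taken literally.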

Microsoft and Google have responded and launched their own services in what is being called “serverless computing”. Although the service is almost two years old, we are still in the early days: while almost all customers use the original standard AWS offerings, only about 17% have used Lambda. But could it be that we are entering the Promised Land?

By John Pientka

(Originally published September 1st, 2016. You can periodically read John’s syndicated articles here on CloudTweaks. Contact us for more information on our programs)

Connecting Big Data and IoT

Data Connection

Big Data and the Internet of Things (IoT) are two of the most discussed tech topics of late, and the progress of each eggs the other on: as the increasing amount of information collected by an expanding range of IoT devices bulks up Big Data stores, so Big Data and Big Data analytics influence the design and development of new IoT sensors and mechanisms. Often working hand in hand, IoT and Big Data are changing our lives in big and small ways across a variety of sectors, from healthcare management, to education approaches, to marketing and advertising.

The Elementary Connection

IoT, a quickly expanding collection of internet-connected sensors, involves the multiple measurements those sensors take as they track our daily lives. These measurements are the Big Data so coveted today: large amounts of structured and unstructured information, typically collected in real time. It is important, however, to recognise that not all Big Data holds equal value, and the tools used to process it play a significant role in its final value. To get the best out of IoT and the Big Data it collects, organizations must obtain high-value, relevant data that is current, reflects an adequately-sized information footprint, and can provide the necessary insights through analysis. This is easier said than done, and so far much of the data we collect isn't able to give us considerable value.

What a Lot We’ve Got


Gartner predicted 6.4 billion connected ‘things’ would be in use in 2016, and expects this number to reach 20.8 billion by 2020. In 2016, Gartner contends, 5.5 million new things will be connected each day. With the cost of sensor technology steadily decreasing, as well as developments in low-power hardware and spreading wireless connectivity, it’s no wonder we’re seeing such an explosion of IoT devices. On the other hand, there was really no shortage of Big Data before IoT technology became popular, and analysts predicted in 2012 that we’d see our digital universe, the digital data created, replicated, and consumed in one year, doubling every two years to reach 40 zettabytes by 2020. This enormous number has since been revised by some to approximately 10% higher than the original prediction. An even more astounding prediction came from Cisco, estimating that data generated from Internet of Everything devices (including people-to-people, machine-to-people, and machine-to-machine connections) would hit 403 zettabytes by 2018.
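
The "doubling every two years" claim above is simple compound growth. Assuming a 2012 baseline of roughly 2.5 zettabytes (a figure the 40 ZB prediction implies, not one stated in the article), four doublings land exactly on the predicted number:

```python
def project_doubling(start_zb, start_year, end_year, period_years=2):
    """Project a data volume that doubles every `period_years` years."""
    doublings = (end_year - start_year) / period_years
    return start_zb * 2 ** doublings

# Roughly 2.5 ZB in 2012, doubling every two years, reaches 40 ZB by 2020:
projected = project_doubling(2.5, 2012, 2020)
```

The same function shows why such forecasts balloon so quickly: extending the horizon by just two more years doubles the result again.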

Big Data & IoT Disruptions

Such enormous quantities easily leave one feeling overwhelmed, and though it's fairly obvious that Big Data and IoT will be disrupting our landscape, it's almost too much to comprehend. Luckily for us, some brilliant data scientists and developers have simplified the processes, and by implementing effective tools we're seeing the positive outcomes: improved global visibility, more efficient and intelligent operations, and improved market agility and business systems through real-time information and insight.

The realm of influence of Big Data and IoT is already large, but to effectively meet expectations a few challenges will have to be dealt with. Standardisation is one area with no clear solution as the increasing number of devices comes with a growth in the applications and programs required to operate devices and analyse collected data; most IoT devices don’t work together, and their manufacturers are hesitant to join forces with competitors. Furthermore, we’re still waiting for a single framework which allows devices and applications to securely exchange data. Suggests OneM2M, “The emerging need for interoperability across different industries and applications has necessitated a move away from an industry-specific approach to one that involves a common platform bringing together connected cars, healthcare, smart meters, emergency services, local authority services and the many other stakeholders in the ecosystem.” Further barriers include concerns for privacy and security of data, as well as relevant skill sets and practical analytics tools.

The Big Data and IoT connection continues to grow and develop, and though not yet delivering everything we’re hoping for, it’s possible to see just how influential these two spheres will be in our future lives.

By Jennifer Klostermann

Three Ways To Secure The Enterprise Cloud

Secure The Enterprise Cloud

Data is moving to the cloud. It is moving quickly and in enormous volumes. As this trend continues, more enterprise data will reside in the cloud and organizations will be faced with the challenge of entrusting even their most sensitive and critical data to a different security environment that comes with using the cloud. Cloud service providers need to take the necessary steps to keep pace with these changes, all while instilling in customers the utmost confidence in the security of their environments. Due to the prevalence and public visibility of hacks and data breaches, confidence in cloud security may not come easily. However, for every apprehension or concern about cloud security, there is a tool or method available to properly secure the cloud and allow customers to enjoy the benefits of cloud computing while maintaining the proper level of security.

While there are many ways to secure the enterprise cloud, this article will highlight some of the most important features used to secure data in the cloud including authentication, authorization and encryption.

Let’s start with authentication. To make sure only authenticated users can log into a cloud service, enterprises should use an authentication mechanism held outside the cloud and in an enterprise datacenter. Many enterprises authenticate users by using Secure Sockets Layer (SSL) to establish an encrypted connection between their cloud provider service and their existing internal Active Directory Federation Services (ADFS) or Lightweight Directory Access Protocol (LDAP) server. Another popular authentication method is to use Security Assertion Markup Language (SAML) for Single Sign-On (SSO) that makes it easier for users to log in to multiple systems without remembering multiple passwords. Cloud service providers should also offer ways to integrate user authentication with two-factor authentication or multi-factor authentication tools that provide additional layers of enterprise security.
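
As one concrete illustration of the multi-factor tools mentioned above, a time-based one-time password (TOTP) generator of the kind used by authenticator apps can be written with the standard library alone. This is a sketch of the RFC 6238 algorithm for illustration, not a production implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32)
    # Number of completed time steps since the Unix epoch.
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and the user's device share the secret once, then independently compute the same short-lived code, which is why the second factor works without any password being transmitted.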

Second, authorizing the functionality a user can access is another way to help secure data in the cloud. After a user is logged in, a cloud platform needs to provide rich functionality to authorize user actions. An enterprise cloud platform should also include Role-Based Access Control (RBAC), which allows users to be authorized by source IP address, by username or by group. The most advanced cloud platforms let users build customized Access Control Lists to express simple or complex authorization rules.
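
An authorization check combining the role and source-IP criteria described above might look like the following sketch. The resource name, roles and network ranges are all hypothetical:

```python
import ipaddress

# Hypothetical ACL: each resource lists the roles and networks allowed.
RULES = {
    "billing_reports": {
        "roles": {"finance", "admin"},
        "networks": [ipaddress.ip_network("10.0.0.0/8")],  # corporate range
    },
}

def authorized(resource, role, source_ip):
    """Allow access only when both the role and the source IP match."""
    rule = RULES.get(resource)
    if rule is None:
        return False  # deny by default for unknown resources
    ip = ipaddress.ip_address(source_ip)
    return (role in rule["roles"]
            and any(ip in net for net in rule["networks"]))
```

Denying by default when no rule exists is the important design choice: new resources are invisible until someone explicitly grants access.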

Finally, encryption is an additional level of security that encodes all the data so that only users who hold the proper key can read it. Users without the key either cannot see the data at all or see it as an unintelligible string of characters. The first way cloud providers use encryption is to secure all data in flight between client browsers and the cloud provider using Transport Layer Security (TLS), a protocol sometimes referred to by its legacy name, SSL. This use of encryption secures all data between the enterprise customer site and the cloud service provider so it cannot be read in transit across the Internet.
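
On the client side, enforcing the TLS protections described above takes only a few lines with Python's standard library. The hostname in the comment is a placeholder, and this is a sketch of sensible defaults rather than a complete client:

```python
import ssl

# create_default_context() verifies server certificates and hostnames.
context = ssl.create_default_context()
# Refuse legacy protocol versions explicitly.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# context.wrap_socket(sock, server_hostname="cloud.example.com") would then
# secure a connection to the (hypothetical) cloud provider endpoint.
```

The point of using the default context is that certificate verification and hostname checking are on from the start; downgrading security requires a deliberate, visible change.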

In addition to using encryption for data in flight, many cloud providers can also encrypt data at rest while it is stored in a database, using technologies like column encryption. Database column encryption, as the name suggests, can encrypt each database column with a unique private encryption key. This usually takes the form of authorizing specific fields to be visible to certain users or roles. For example, this use of data-at-rest encryption could permit only users with an authorized Human Resources role to see database fields showing employees' home addresses and other personal information in unencrypted form.


For some cloud service providers, there is an additional way to use encryption: encrypting data in the enterprise before it is sent to the cloud service provider. This technique uses a proxy application that resides in the enterprise network and encrypts data with a private key before sending it to the cloud. The data remains encrypted while in flight and at rest in the cloud; it is then sent back to the proxy application when requested and decrypted by the proxy. While this approach may seem to have security advantages, it can severely limit the usefulness of the data in the cloud, as it is all encrypted and not readable by any cloud services.

While securing the cloud is a complicated, technical process, these main features represent the most foundational parts of properly securing the cloud. With consistent and thorough application of the proper security measures cloud service providers will enable customers to unlock the potential of the cloud.

By Allan Leinwand

Trading Routine: How To Track Suspicious Events In Different Locations

Tracking Suspicious Events

Financial security can be compared to a constant arms race between cyber criminals and the businesses trying to grow their assets. Trading and financial organizations bear the brunt of losses incurred through fraud because their assets are more liquid, which attracts criminals of all shapes and forms. Security expenditures, too, turn out to be forced losses.

In late 2013, for example, the United States entered the age of the mega breach when Target Corp. lost 40 million credit-card numbers to Russian hackers. And it didn’t stop there; other companies, such as Adobe Systems Inc., Home Depot Inc., J.P. Morgan Chase & Co., Anthem Inc. and eBay Inc., fell victim to hackers.

Tense situations like these call for efficient tools for tracking suspicious events. The ability to detect and analyze these threats produces an outsized return for the business in losses avoided and revenues preserved.

In fact, trading companies generate huge amounts of information, and the main purpose of any corporate security system is to analyze that data and identify suspicious events.

How do you create an effective system to analyze and monitor corporate information?

Every day, companies are entrusted with the personal and highly confidential information of their customers, so creating an effective security policy, and executing it as planned, is extremely important. Experts in custom trading and brokerage solutions emphasize the following security issues, which should be taken into account when designing and integrating a security system:

1) Flexible scenarios

It is well known that swindlers continuously search for sophisticated and innovative ways to commit fraud. Since hackers will scan for vulnerabilities the minute they are discovered, an organization must have a routine in place for checking its own networks regularly. Universal scenarios cannot address this challenge; specific methods are needed. A ‘Threats and Alerts’ system should support a flexible parametric structure with individually adjustable indicators, giving the operator the ability to tune the basic security scenarios and take all the factors into consideration.
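
A parametric scenario of the kind described above can be sketched as a rule whose thresholds are data rather than code, so the operator can retune them without redeploying anything. The field names and limits below are illustrative assumptions:

```python
def make_alert_rule(max_amount, max_per_hour, blocked_countries):
    """Build a configurable alert scenario from operator-chosen parameters."""
    def check(tx):
        if tx["amount"] > max_amount:
            return "amount threshold exceeded"
        if tx["count_last_hour"] > max_per_hour:
            return "velocity threshold exceeded"
        if tx["country"] in blocked_countries:
            return "blocked jurisdiction"
        return None  # no alert raised
    return check

# The operator tunes the indicators; the scenario logic stays the same.
rule = make_alert_rule(max_amount=10_000, max_per_hour=20,
                       blocked_countries={"XX"})
```

Building the rule from parameters means two deployments can run the same scenario with very different sensitivities, which is the "flexible parametric structure" in practice.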

2) Analysis algorithms plugged on demand


The same information security tools and analysis algorithms show different levels of efficiency over time: some stay up to date, others become obsolete. That is why the operator needs a base of analytic tools that can be plugged in as the context requires, and why the solution provider should keep that tool base refreshed and updated.

3) Online Geoscreening

When analyzing hacker attacks and fraudulent operations, specialists in custom e-commerce apps agree that visualizing information on transactions and the use of financial tools is of great importance during the initial stage of detecting suspicious events. Sometimes an expert’s intuition and analytic skills prevail over automatic monitoring systems, so it is crucial to provide the operator with well-organized, visualized information.

4) Machine learning algorithms


Many specialists recommend a second, parallel system to track suspicious events, based on machine learning algorithms. The efficiency of such a system becomes visible only after a certain period of time, once the algorithms have analyzed a sufficient amount of information. That is why it is vital to launch this system as an independent sub-program as early as possible, to gain another security tool against financial fraud.
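
As a stand-in for the learning component, the warm-up behaviour described above can be illustrated with a simple statistical detector: it refuses to judge until it has seen enough history, then flags values far from the norm. The window size and threshold are assumptions, and real systems use far richer models:

```python
import statistics

def is_suspicious(history, value, threshold=3.0, min_history=30):
    """Flag values more than `threshold` standard deviations from the mean."""
    if len(history) < min_history:
        return False  # warm-up period: not enough data to judge yet
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return value != mean  # constant history: any deviation is suspicious
    return abs(value - mean) / stdev > threshold
```

The early-return during warm-up is the point the article makes: the sub-program gives no value at first, so launching it early is what makes it useful later.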

Conclusion

As no one can predict the nature of a future threat, internal or external, a company must have its own dynamic platform for analyzing information streams both inside and outside the institution.

By Yana Yelina

Cloud Computing – The Good and the Bad

The Cloud Movement

Like it or not, cloud computing permeates many aspects of our lives, and it’s going to be a big part of our future in both business and personal spheres. The current and future possibilities of global access to files and data, remote working opportunities, improved storage structures, and greater solution distribution have the pundits encouraging the cloud move for one and all; on the other hand, complete reliance on electronic networks and external service providers comes with its own set of dangers, along with a sometimes insufficient understanding of the products and tools implemented.

The Increasing Demand for Cloud Computing

The last ten years have seen a marked increase in demand for and implementation of cloud computing. Thanks in part to smartphones, real-time streaming, connected devices, and always-on social media needs, this flexible, off-site, and highly scalable technology has become indispensable. Gartner estimates that we’ll see the public cloud services market reach $204 billion in 2016, an annual growth of 16.5%, and the highest growth is set to come from cloud system infrastructure services. Says Sid Nag, research director at Gartner, “The market for public cloud services is continuing to demonstrate high rates of growth across all markets and Gartner expects this to continue through 2017. This strong growth continues to reflect a shift away from legacy IT services to cloud-based services, due to increased trend of organizations pursuing a digital business strategy.”

The Good


Availability

Cloud services mean solutions and resources once accessible only to the elite or to giants are now open to all. With options such as pay-per-use and global reach, organizations of all shapes and sizes can tailor packages to suit both their needs and their budgets.

Reduced Costs of Infrastructure

In the three top cloud computing categories – IaaS, PaaS, and SaaS – organizations typically don’t need to lay down their own infrastructure or spend money on hardware. Cloud service providers supply the IT teams, connections, software and storage facilities, reducing a business’s Capex costs.

Improved Disaster Recovery

Thanks to the distribution of data across multiple failover points, disaster recovery is a prime benefit of cloud computing. Implementing cloud-based disaster recovery means it’s possible to switch over to backup systems when necessary and resume the use of local systems thereafter.
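
The failover-point idea above reduces to trying endpoints in priority order and taking the first healthy one. A minimal sketch, with placeholder hostnames and a stubbed health check standing in for a real probe:

```python
def pick_endpoint(endpoints, healthy):
    """Return the first endpoint whose health check passes."""
    for endpoint in endpoints:
        if healthy(endpoint):
            return endpoint
    raise RuntimeError("no healthy endpoint available")

# Priority order: local datacenter first, cloud replica as the failover point.
order = ["primary-dc.example.com", "cloud-replica.example.com"]

# Simulate the primary being down; the cloud replica takes over.
active = pick_endpoint(order, healthy=lambda ep: ep != "primary-dc.example.com")
```

When the primary recovers, the same call simply starts returning it again, which is the "resume the use of local systems" step.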

Collaboration & Flexibility

Providing advanced solutions for team collaboration, cloud computing allows numerous users to work simultaneously on the same projects and files with real-time updates and no restrictions that bind them to specific sites.

The Bad


Opex Costs

Although cloud computing certainly reduces Capex costs, it naturally increases operational costs, adding a monthly burden for the services used. It’s important to weigh up the pros and cons of outsourcing versus keeping infrastructure in-house to suit each business and its budget.

Security

Security is a concern in all things IT, and cloud computing is no different. It’s important for organizations to identify which data they’re comfortable storing in the cloud, and which perhaps should be off-network. It should be noted, however, that most reputable cloud service providers offer security superior to that which the average business is able to implement. Security doesn’t have to be considered a negative of cloud computing, as long as organizations take the time to ensure the tools they’re using comply with regulations and standards, and confirm their service providers implement the necessary security features.

Always-On Connection

Cloud computing, of course, requires an always-on internet connection, good bandwidth, and suitable speeds – only a negative when you haven’t got it.

Limited Control

Although cloud computing provides much flexibility and choice, it’s important to remember that the infrastructure is owned by someone else and so organizations are limited to the services they pay for and the solutions a service provider is willing to provide.

Overall, the drawbacks of cloud computing tend not to cause too much disruption and are easily outweighed by the benefits. It’s important to understand the risks and disadvantages, but the constantly evolving cloud computing environment is rapidly stamping out weaknesses and replacing them with constructive innovations.

By Jennifer Klostermann

Three Tips To Simplify Governance, Risk and Compliance

Governance, Risk and Compliance

Businesses are under pressure to deliver against a backdrop of evolving regulations and security threats. In the face of such challenges they strive to perform better, be leaner, cut costs and be more efficient. Effective governance, risk and compliance (GRC) can help preserve the business’ corporate integrity and protect the brand, but in an ever-changing technology landscape and with complex, inter-related business operations to manage, implementing GRC can seem like a complex undertaking.

Many businesses still manage operations departmentally, with activities separated by business silos. This can make implementing policies and processes with pan-business reach seem difficult. GRC falls into this category – it has to span all business departments – but it doesn’t have to be such a headache.

Businesses can simplify GRC with these three tips:

1. Don’t boil the ocean

GRC covers a lot of ground – operational risk, compliance, cybersecurity, third party management, auditing and so on – and incorporates hundreds of rules and regulations, dozens of policies and scores of risk management activities.

The trick to simplification is to take it one step at a time; to not try and do everything at once. Anyone attempting to deploy an integrated solution for all GRC activities in one go is courting failure.

Instead, take on two or three activities to be prioritized within an integrated GRC program. A few simple questions about your business processes – how they work, how they can be more effective, and how they can be audited and monitored – will reveal where the priorities lie for efficient GRC.

Many companies start with internal auditing and Sarbanes-Oxley (SOX) compliance for financial reporting; others with enterprise risk management or operational risk management; still others with IT compliance and policy management.

Developing common frameworks and taxonomies – a critical foundation for effective GRC – is simpler when begun with two or three key activities. Over time, additional activities can be brought into an integrated GRC program.

2. Develop common frameworks and taxonomies

A valuable benefit of an integrated GRC solution is that different activities – risk management, compliance, auditing and so on – can share information. For this to work effectively, they need to conform to common taxonomies. As well as enabling collaboration, common taxonomies can help identify redundancies so that rationalization can take place. This keeps the system up to date and helps reduce the cost of control testing and risk assessments.

Policies and rules held within common frameworks give companies the control they need for rapid change when it comes about. It’s one thing to embed automation within systems so that, for example, payments over a certain authorization level get the required sign-off before they’re authorized, but what happens after a merger or acquisition? If all those rules are hardwired into individual systems, there’s a whole bunch of work to do to achieve consistency across the merged companies. When the rules sit outside individual workflows, they can trigger action inside them. These rule ‘libraries’ are maintained in the GRC system.
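
The payment sign-off example above can be sketched as a rule library that sits outside any one workflow. The thresholds and approver titles are assumptions; the point is that after a merger, policy changes mean editing one table, not rewiring every system:

```python
# Shared rule library: (threshold, required approver), highest first.
APPROVAL_RULES = [
    (100_000, "cfo"),        # payments above 100k need CFO sign-off
    (10_000, "controller"),  # above 10k, the controller signs off
    (0, "manager"),          # everything else needs a line manager
]

def required_approver(amount):
    """Ask the rule library which sign-off a payment workflow must obtain."""
    for threshold, approver in APPROVAL_RULES:
        if amount > threshold:
            return approver
    return "manager"  # zero or negative amounts still get a manager's eye
```

Each payment workflow calls `required_approver` instead of embedding its own thresholds, so every system enforces the same, centrally maintained policy.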

3. Use pre-packaged cloud-based applications

Most vendors offer both on-premises and cloud-based applications. Going with the cloud relieves the business’s IT infrastructure of supporting the GRC solution, and GRC in the cloud helps consign manual processes to the past. Furthermore, future upgrades are simpler with pre-packaged solutions that haven’t been customized.

The cloud approach also ensures that you are set up for real-time content integration, such as regulatory change management. This matters because GRC is not only about systems and tools; it is also about staying abreast of, and ahead of, a constantly evolving regulatory landscape.

Risk and regulation are always evolving, and the way businesses manage them cannot stand still either. The future of GRC lies in automation, integrated reporting and a culture of compliance. By heeding these three tips for simpler GRC, businesses can mitigate risk, minimize compliance firefighting and smoothly manage change wherever it comes from, driving better business performance.

By Vidya Phalke
