Category Archives: Contributors

If Not Managed Correctly, The Cloud Can Cost An Organization


Cloud Application Management

When organizations implement cloud applications, they do so as a means to be more efficient and in the hopes of saving a great deal of money. What many organizational leaders often don’t realize is that they need some type of solution to help them manage these applications in the background for them to be successful. Without a way to properly manage cloud applications, organizations can actually spend a great deal of time and money having someone manually manage these processes, rather than saving time and resources.

While the management of cloud applications doesn’t sound difficult, it is very time consuming. Think about an organization that has frequent movement of employees or that employs temporary workers. Creating accounts for each of these employees and making changes when needed often requires a full-time admin. Admins can quickly become overwhelmed with work and calls for changes to accounts, leaving them no time to handle other, more technical or important projects.


The management of these applications not only affects admins, but also other groups: the helpdesk, the end users, and the organization as a whole. For example, the organization and its managers are concerned with the overall cost and ROI of the technology they use and how efficiently everything works. If the cloud is not managed correctly, it can end up costing the organization more time and money.

And, of course, what about the end user? They want to access what they need quickly and efficiently, and have any changes to their accounts or access made in a timely manner so that they can complete their work and projects. Who wants to wait around for additional access to work on a project that has a deadline? For example, an employee often needs to contact a manager or admin to get access to an application or to make a change to their account. If the request is time sensitive, they may repeatedly contact the manager to check on its progress and see if the change is being implemented.

Why Organizations Are Hesitant

Why, then, are organizations hesitant about a solution such as identity and access management (IAM) to help manage their cloud applications, if it can benefit so many different people and groups in the organization? One reason for the resistance is that many of the early account management solutions were large-scale offerings that cost a lot, took a long time to implement, and were aimed at larger organizations.

Many organizations also think they can just do it themselves. The reality is that these tasks are extremely time consuming and take time away from highly technical employees who could be working on other projects. It also might cost more to have a full-time employee manually managing cloud applications and issues. Many IAM vendors now offer the ability to choose exactly which modules are needed, so organizations don’t need to purchase a large enterprise solution with modules they won’t use. They can tell the vendor exactly what is needed and have the solution customized, which drastically reduces both the cost and the time to implement. This allows even smaller organizations to benefit from IAM solutions.

How IAM Solutions Can Help

So now that we’ve talked about why a solution is needed and why many organizations are hesitant to employ one, let’s look at how different types of IAM solutions can assist with the cloud applications that organizations use.

Account management of cloud applications can easily be automated along with in-house applications. Automated account management allows the organization to link its HR system to the systems and applications the company uses, so that any change made in the HR system is automatically reflected in all connected applications. So, for example, when a new employee starts at the organization, they can simply be added to the HR system and have their accounts automatically generated for them. This allows the admin to quickly create accounts and the end user to begin work right away without needing to wait around.
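As a rough illustration of the idea (the data shapes and function names here are hypothetical, not any particular vendor's API), the HR-to-application sync can be thought of as a reconciliation loop: the HR system is the source of truth, and connected applications are brought into line with it.

```python
# Hypothetical sketch of HR-driven account provisioning:
# the HR system is the single source of truth, and connected
# applications are reconciled against it.

def reconcile(hr_records, app_accounts):
    """Return the account actions needed to match the HR system."""
    hr_active = {e["id"]: e for e in hr_records if e["status"] == "active"}
    actions = []
    # Create accounts for employees in HR but missing from the app.
    for emp_id, emp in hr_active.items():
        if emp_id not in app_accounts:
            actions.append(("create", emp_id, emp["name"]))
    # Disable accounts for people no longer active in HR.
    for emp_id, name in app_accounts.items():
        if emp_id not in hr_active:
            actions.append(("disable", emp_id, name))
    return actions

hr = [
    {"id": "e1", "name": "Alice", "status": "active"},
    {"id": "e2", "name": "Bob", "status": "terminated"},
]
accounts = {"e2": "Bob"}  # Bob still has an account; Alice has none.
print(reconcile(hr, accounts))
```

A real connector would call each application's provisioning API instead of returning a list, but the shape is the same: new hires get accounts automatically, and leavers are disabled without anyone filing a ticket.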

Another solution that can be used is workflow management. Using a web portal, employees can request additional access rights to their current applications or even new applications. A workflow is set up so that when a user requests a change, the request goes through a predefined sequence of people who need to approve it before the change is implemented. The organization can set up the workflow process however it desires, so that depending on the user and what they request, the process goes through a specific sequence. There is also no need for the employee to bother their manager to check on the request; they can easily access the web portal and see exactly where the request is and what steps still need to be completed.
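A minimal sketch of that approval flow, with invented names (not any specific IAM product's API): a request carries its predefined approver sequence, and its status is visible at every step, just as it would be in the self-service portal.

```python
# Hypothetical sketch of a predefined approval workflow: a request
# moves through an ordered sequence of approvers, and its status is
# visible at every step (as it would be in a self-service portal).

class AccessRequest:
    def __init__(self, user, resource, approvers):
        self.user = user
        self.resource = resource
        self.pending = list(approvers)   # the predefined sequence
        self.approved_by = []

    def approve(self, approver):
        # Only the next approver in the sequence may act.
        if not self.pending or self.pending[0] != approver:
            raise ValueError(f"{approver} is not the next approver")
        self.approved_by.append(self.pending.pop(0))

    def status(self):
        # What the end user would see in the portal.
        if not self.pending:
            return "implemented"
        return f"waiting on {self.pending[0]}"

req = AccessRequest("alice", "CRM app", ["manager", "app_owner"])
print(req.status())        # waiting on manager
req.approve("manager")
print(req.status())        # waiting on app_owner
req.approve("app_owner")
print(req.status())        # implemented
```

The key property is that the sequence is data, not code: the organization configures a different approver chain per user or per resource without changing the workflow engine.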

These are just some of the many solutions that help with the management of cloud applications behind the scenes. There are many other ways that IAM solutions can be customized to meet the unique needs of each organization. Since we discussed how the management of the cloud affects several groups within the organization, let’s now look at how an IAM solution can benefit these different groups.

Admins can easily manage user accounts, or even delegate this task to a less technical employee so that they can work on other, more technical issues and projects. They no longer need to perform tedious account management tasks that are extremely time consuming. End users can efficiently get any access or application they need without repeatedly contacting an admin; with the portal in the workflow management module, they have an easy way of requesting changes when needed. Lastly, managers and the overall organization can realize the true benefits of cloud applications without needing to focus on the manual tasks of creating, disabling, and making changes to user accounts.

By Dean Wiech

Connecting the Power of IoT


Connection Power

I come not to bury Caesar. Nor do I come to bury his estimates. Estimates are attempts, based on data filled with holes, to produce a best-guess scenario. Start with how you gather data, and from there determine what the risks are going forward. Estimation is tough, so you want to caveat your assumptions. You want to make sure you are on the low side, not on the high side.

Alternatively, you can be slightly on the high side, but not way over-estimating the change or impact. So it is my expectation that the estimates published so far for Internet of Things solutions are in fact low. The question isn’t whether they are low, but rather by how much.


So a couple of meanderings first… A connected car would count as X number of IoT devices. I say X here because at this point there really isn’t a standard, and frankly every car has a different level of IoT connectivity. If your car has a device that connects to your cellular phone, that counts as one IoT connection. If you have a second connection in your car that is linked to the manufacturer, that represents a second IoT connection. Between those two connections, however, you could have nearly 100 IoT sensors. These would include automatic braking, cruise control and lane control, just to name a few. There is also the broader concept of crash-and-report sensors: when the airbags deploy, the car automatically calls 911.

Geek Connection

(Geez, he is meandering all over the place. What’s the point?) Well, first, I would like to mention that I personally am a geek. I have begun the process of automating my home. I have connected TVs and many of the other connected devices available today. In fact, I now have over 100 connected devices in my home. The reason I bring up this number is that I know there are MANY people who have more IoT devices deployed in their homes than I do.

I also happen to know you always throw out the highest end when providing an estimate. If you base your estimate on the high end, you will invariably become frustrated, mostly because the high end isn’t a true overall number. Still, there are between 2.5 and 3 billion cellular phones on the planet today. If we take industrial, government and home IoT devices and add in the cellular phones, this number is greater than 12 billion worldwide. My basis for this is the logical breakdown of the numbers.

The IoT POWER Users

First off, I think analysts missed the high end of power users when it comes to IoT devices. Even if we remove the outliers, we still have a large population, maybe as high as 500 million people, with more than 12 devices per member of the household on average. Just this population plus deployed cellular devices (also IoT devices) would come out to between 7.5 and 8 billion deployed devices. There are between 2 and 3 million deployed cameras for business and home security in New York City alone, so this number would likely rise to closer to 20 billion IoT devices actually deployed in the global market today.


By the way, you probably remember that I opened with the fact that I didn’t come to bury Caesar. I understand why the analysts choose to publish lower numbers. My numbers come with an inherent risk: I could be wrong. I could be off by a factor of .2 or even .3. Even with this margin of error, I still think the published numbers are on the low side. My gut, in playing with and weighing what I believe the real numbers to be, is that the analysts are off by 50% today. I believe there are between 18 and 22 billion IoT devices deployed right now. Depending on how things move, by 2020 this number will be closer to 100 billion deployed devices.

Noise-to-Signal Ratio

Frankly, I worry a bit based on the reality of these numbers, and that’s why I started evaluating this. I believe home networks will be the first to topple: first, it is fairly easy to crack most home routers, and second, they are not made to support the load of all these new IoT devices. The noise-to-signal ratio in an overused home network will allow more and more hacks to go undetected for longer periods of time.

SO, the risk is that the home network becomes saturated and falls over. This presents significant risk for businesses, as every employee who works from home represents a new and potentially hard-to-catch security leak. When friends ask what they can do to prevent this, I always give them this simple answer: Go and buy a new Wi-Fi router. Create an easy password for the networks on that router and plug it directly into your existing router. Then connect only the IoT devices in your home to that network. You can still get all the data from the IoT devices that you want, but now you have a separate network segment that will reduce total bandwidth consumed. And if something happens, you can unplug that router from the internet. The best security strategy for Internet-connected devices remains removing the Internet connection.

I come not to bury Caesar, but I would like to know where he keeps his IoT devices!

By Scott Anderson

RCS In Emerging Markets Means A Step Forward For Cloud Computing


Rich Communication Services

As a cloud service provider operating in emerging markets, we’re excited about the possibilities of rich communication services (RCS) offered by mobile carriers.

You may remember, in the era before smartphones (and for a while after their launch), SMS ruled the mobile messaging landscape. There were no over-the-top (OTT) messaging platforms to compete with text-based mobile chatting, and mobile carriers charged a premium for text messages.

And then came WhatsApp, LINE, Facebook Messenger, WeChat, Kakao, and Skype (among many others). All of a sudden, mobile carriers saw customers place less of a premium on SMS and data became the real breadwinner.

So why does the development of a chat platform excite a white label cloud service provider?

Simply put, RCS may be the messaging platform, but cloud is where the chats can be stored, backed up and accessed. If RCS is on the rise, we feel that cloud is going to be right there with it in the search for the perfect 4G bundle. Given that mobile carriers are involved in implementation of RCS, we feel that the cloud capabilities to accompany it are up for grabs.


So when Google announced last February that they had signed with the GSMA and mobile carriers around the world to develop an RCS for Android, we were just as excited about what that meant for the future of cloud on Android phones.

What is RCS you ask?

Think of RCS as SMS on steroids. While SMS lacks a lot of the emoticons, gifs, and video chatting options we typically find on OTT messengers, RCS would allow SMS to compete with similar features.

While early reception to RCS is mixed, when it’s implemented well, users are eager to adopt it. T-Mobile’s RCS app, Advanced Messenger, came armed with features such as real-time typing display, message delivery reports, and file sharing. After seven months, the US mobile carrier reported over 5.5 million users on the platform.

However, RCS does have a spotty history. Joyn, an RCS app by Jibe, failed to pick up any traction when it launched. There were many reasons for Joyn’s failure, but chief among them was that, despite its potential, it was poorly executed.

Thankfully, Google bought Jibe in September 2015, signing with the GSMA and mobile carriers globally just 5 months later.

RCS and the Emerging Market

In the first quarter of 2016, Android held the largest mobile market share by a mile, with 84.1% of all units sold. And with Android devices sweeping emerging markets, the gap between first-place Android and second-place iOS looks like it’s only going to widen.

So here we are, with our own service targeting emerging markets, when all of a sudden, the leading device OS manufacturer comes out and says they want to implement a service that will require users to save more data on their existing plans.

RCS presents mobile carriers an opportunity to claim the messaging crown back from the likes of OTT. Putting things in perspective, mobile carriers are expected to earn US$96.7 billion in 2018 from SMS. In 2013, they were making US$120 billion from SMS revenue. You can see clearly the 23.3 billion reasons mobile carriers want to see RCS come to life.

In emerging markets where Android has dominated, RCS could take a healthy bite out of the user bases of LINE, WeChat, Kakao, QQ, and even giants Facebook and WhatsApp. If you take into account the network effect that OTT messengers benefit from (you download the messenger your friends and family use), RCS would benefit in the same vein iMessage does on iPhones.

So when you factor in the network effect plus the segmentation of the OTT messenger space in emerging markets like Asia, this means Google could penetrate these markets instantly and be the first stop for mobile users with a pre-installed messenger.

Plus White Label Cloud

Looking at our own internal usage statistics for Cloudike personal, we’ve found that about 80% of the files stored are photos. While some users store files just to have backups ready in case of disaster, we would wager that many store them because of limited storage capacity on their phones.

So here comes RCS, with media sharing capabilities, pre-installed on a phone. With Google’s backing, we’ve no doubt that RCS constitutes another source of data to occupy user storage space. And what better way for mobile carriers to get in on this action than by pushing a white label cloud branded under their own flag to preserve and back up all this beautiful data shared over their network?

Again, we’re excited for RCS…

By Max Azarov

4 Tech Cleanups For National Cyber Security Awareness


National Cyber Security Awareness

October is Cyber Security Awareness Month, and President Obama recently called upon the American people to recognize how important it is to have tight cyber security — not just for keeping personal details secure, but to support the nation’s security. Now is the perfect time to ensure you’re doing all you can to lock down your tech tools.

Here are four actionable tips you can take:

1. Restore Devices to Factory Settings Before Parting With Them

With the holiday season just around the corner, perhaps you’re considering selling a tablet you no longer need so you can generate some cash. There are many websites and physical shops that’ll accept devices in good condition. Or maybe you want to play Santa and give an old tablet to someone in your family who’s not concerned about having the newest model.


Before doing away with yours, though, select the option to restore the gadget to factory settings. That way, you can rest assured the next owner won’t see personal details you forgot to delete.

2. Verify App Permissions Regularly

Many of the apps you use every day probably share personal details. During setup, you’ll usually see a pop-up window that indicates the kinds of information normally disclosed through the app. If you don’t agree with the outlined permissions, there are usually ways to alter what’s shared.

However, many people absentmindedly accept the default permissions sharing option and only skim through the section that lists shared information. If you relate to that approach, there’s an easy way to take action — with another app.

MyPermissions works on the Android and iOS platforms and offers a streamlined way to scan your social media networks, computers and devices and check which permissions you’ve authorized for each application. It’s also possible to set alerts and receive notifications when apps access personal information.

After you’ve done an initial check, continue verifying permissions every month, or more often if you install apps frequently. Furthermore, adopt the habit of taking time to read details about permissions when downloading a new app rather than just blindly giving acceptance. Finally, scroll through your device and decide whether you really need everything you’ve installed.

3. Delete Old Files in All Applicable Places

Hopefully you’re already accustomed to periodically deleting unwanted files from your tablets, computers and smartphones. That’s a smart step to take because it not only helps those devices run faster, but it also gives hackers less data access if your files become compromised.

However, if you use a cloud-backup service, don’t forget to delete old files from the cloud instead of just from physical devices. The process for doing that varies based on the service you use, but it’s normally very straightforward.

This tip applies both to cloud-based backup services and cloud-storage services such as Dropbox and Google Drive. When people use the cloud-storage services, they often upload important documents, especially if they’re nervous about losing work due to a hardware failure or if they want to access those files from anywhere. When deleting files for good, don’t overlook those cloud-based storage sites, especially if you rely on them to hold sensitive information.

4. Set up Automatic Software Updates

There are dozens of ways to make your time online as secure as possible, and you can adjust settings so some of them occur in the background. Namely, see if your software has an automatic updating feature, and if so, turn it on. Software manufacturers publish security updates, patches and other enhancements to improve the user experience, especially if they’ve identified security vulnerabilities.

By enabling automatic updates, it’s easy to keep software as current as possible without further action on your behalf. Plus, ongoing updates allow you to always access the latest versions of software, preventing you from unknowingly using buggy, outdated versions that don’t include all available features.

The tips above all only take a few minutes of your time, but they could safeguard you from hours of headaches that result from compromised data. Tune up your tech today while simultaneously celebrating Cyber Security Awareness Month.

By Kayla Matthews

Cloud Native Trends Picking Up – Legacy Security Losing Ground


Cloud Native Trends

Once upon a time, only a select few companies like Google and Salesforce possessed the knowledge and expertise to operate efficient cloud infrastructure and applications. Organizations patronizing those companies benefitted with apps that offered new benefits in flexibility, scalability and cost effectiveness.

These days, the sharp division between cloud and on-premises infrastructure is quickly becoming a thing of the past. In fact, the cloud has become so ingrained in the fabric of the enterprise computing experience that we often don’t even use the term “cloud” as a descriptive qualifier, but rather take it for granted as an inherent and vital component of all IT environments.


In the enterprise, where once traditional on-premises software like Oracle and SAP dominated the IT environment, organizations are now increasingly turning to cloud and cloud native capabilities – that is, applications built from microservices running in containers, or installed in cloud-based virtual machines (VMs) – to achieve greater efficiency and better economic value of IT services.

Why the surge of interest in cloud native technologies? Organizations that are making new and ambitious forays into the world of cloud native are allowed to press the proverbial “reset button.” For them, it’s an opportunity to do things differently, from customer-facing applications all the way down to the infrastructure layer.

And the advantages are tremendous. The ability to develop and manage applications in a truly modular fashion – to troubleshoot and update components up and down the stack without impacting other parts of the application – delivers better efficiency and strong economic benefits, which is one of the reasons why more and more organizations are rolling up their sleeves and diving headfirst into this new arena.

One of the driving forces behind this technological and economic transformation is the proliferation of container technologies like Docker*, which helps to enable automated deployment of cloud native applications. All you have to do is look at the numbers to wrap your head around Docker’s exponential growth rates. In February 2016, 2 billion Docker images had been pulled from Docker Hub.


That number surpassed 5 billion in August 2016, according to Docker’s published statistics. If this kind of growth trajectory remains consistent, it’s very likely that by 2020 nearly 100 percent of net new enterprise applications will be cloud native and a significant portion of legacy applications will be migrated to cloud native infrastructure.
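As a back-of-envelope check on the figures cited above (illustrative arithmetic only, not a forecast), the two data points imply a roughly 2.5x multiplication of cumulative pulls every six months:

```python
# Back-of-envelope check on the cited figures: roughly 2 billion
# cumulative Docker Hub pulls in February 2016 versus 5 billion
# by August 2016, i.e. about six months apart.
feb_2016 = 2_000_000_000
aug_2016 = 5_000_000_000

six_month_growth = aug_2016 / feb_2016
print(f"{six_month_growth:.1f}x growth in six months")

# If that rate held, cumulative pulls would multiply by 2.5 every
# six months -- the kind of trajectory behind the 2020 expectation.
periods = 7  # six-month periods from August 2016 to early 2020
projected = aug_2016 * six_month_growth ** periods
print(f"~{projected / 1e12:.0f} trillion cumulative pulls by 2020 at that rate")
```

Compounding rarely holds that long in practice, but even a fraction of this trajectory supports the article's point about where new enterprise applications are heading.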

The ripple effects of this massive shift are extensive. One ramification is that traditional IT tooling suites are going to lose quite a bit of “real estate.” For example, traditional storage mechanisms will likely give way to software-defined storage. Traditional networking, with physical routers connected to physical endpoints, will be replaced with virtual overlay networks whose topologies can change in an instant. And security mechanisms that work on traditional host or VM boundaries will need to adopt a new semantic lens to address containers or their equivalents.

Is this shift occurring already? The short answer is yes. Many user organizations are either already in the middle of the transformation or are actively preparing for this impending reality. Adobe, the Silicon Valley-based digital media company, is moving its hugely popular Creative Cloud services to cloud native infrastructure. Online payroll service provider ADP made an early and critical bet on Docker technology and is transforming many of its applications and services to a cloud native implementation. GE Digital’s Predix system will be largely built on container infrastructure. Even the GSA, the largest service provider to the U.S. government, has invested heavily in Docker and microservice-related technologies to modernize service delivery to government agencies.

But what may be an even bigger harbinger of changes to come is that many startups aren’t investing in legacy products, but instead are leap-frogging over traditional solutions right into container technologies and cloud native apps. And the startup companies of today will be the new industry visionaries and leaders of tomorrow. While the apex of this technological shift might still be some time in the future, organizations that are laying the foundation for this transformation today will not only have a competitive edge tomorrow, but will also help pioneer an entirely new era of digital transformation.

By Chenxi Wang,

Dr. Chenxi Wang, current chief strategy officer at Twistlock, is a security industry veteran and a respected thought leader. She has held a variety of strategy leadership positions at Intel and CipherCloud, following a stint as a highly respected industry analyst at Forrester Research. Earlier in her career, Chenxi held a faculty position at Carnegie Mellon University. She has a Ph.D. in Computer Science from the University of Virginia.

Introducing and Implementing Voice Biometrics in Call Centers


Voice Biometrics in Call Centers

It wouldn’t be wrong to say that voice biometrics is the way of the future, when it comes to verifying the identity of customers contacting call centers. Market research firm Forrester, for one, predicts it will be the go-to authentication solution for financial institutions by 2020.

But it is just as accurate to say that voice biometrics is rapidly being recognized as today’s best practice as well. Already, major businesses in such sectors as banking and finance, healthcare, telecom, and other security-sensitive fields are recognizing that voice authentication offers a wide array of compelling benefits.

For one thing, it vastly improves the customer experience, by doing away with the unwelcome interrogations that call centers traditionally needed to go through to identify each caller. Since voice authentication relies on the caller’s normal conversation itself, and verifies a caller’s identity in real time without requiring any effort on the caller’s part, the process is frustration-free, unlike a barrage of questions. In fact, most consumers say they prefer voice authentication to jumping through the current hoops. Secondly, because voice authentication takes into account more than 100 variables of speech in a sophisticated mathematical expression, it offers a high degree of accuracy and security that rivals or exceeds the certainty of the fingerprint.


(Infographic Source: NJIT)

And, in no small matter for businesses, it offers benefits that go directly to the bottom line. By eliminating time spent on verifying identity every time the phone rings, it frees up employees for the revenue-generating activities at the heart of their jobs.

That said, it is still the case that making the transition from the old way to the new and improved way doesn’t come without challenges. Fortunately, with the right guidance for efficient implementation, these adoption challenges become negligible.

Facing the hurdles, and clearing them

  • As much as deploying a voice authentication solution is a technical challenge, it is also a legal one in many jurisdictions. It can’t happen without the consent of the customers, so investigating the requirements and potential issues is an essential starting point.
  • Once legal questions are resolved, the next step is optimizing the process of actually asking for consent. The key to mounting a successful recruitment campaign includes not only making an effective pitch by way of carefully selected channels (mass media, email etc.), but also providing consumers with the information they need about voice biometrics to make an informed decision about whether they want to opt in.
  • Enrolling those who give consent demands yet another optimized process to collect and maintain all the necessary records, but it also calls for attention to a crucial factor: if done less than optimally, enrollment can be a lengthy, complex, and expensive proposition. In an “active” approach, customers might be asked to repeat a phrase a number of times to create a voiceprint, a process that undermines the very customer experience gains voice biometrics offers. The alternative, as pointed out by the experts at NICE, a leading provider of voice biometrics solutions, is to enroll customers “passively”: a solution that integrates with existing call recording capabilities can leverage historical calls, so once customers have given their consent, they can be enrolled without having to do anything at all.
  • The need for integration doesn’t end with enrollment. It is not uncommon for call centers to need to integrate voice biometrics technology with a number of other systems such as security and CRM software. That can be a lengthy and costly process when an ad-hoc integration is attempted, but selecting a biometrics product that offers ready-made end-to-end support or that features embedded APIs can alleviate the problems.
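To make the passive-enrollment idea concrete, here is a hypothetical sketch (function and field names are invented for illustration; this is not NICE's or any vendor's actual API): consenting customers are enrolled from historical call recordings, with consent acting as a hard gate.

```python
# Hypothetical sketch of "passive" enrollment: build a voiceprint
# from historical call recordings once (and only once) a customer
# has opted in. All names here are illustrative.

def build_voiceprint(recordings):
    # Placeholder for the real signal processing; a production
    # system would extract the many speech features described above.
    return f"voiceprint({len(recordings)} calls)"

def enroll_passively(customers, call_archive, min_calls=3):
    """Enroll consenting customers who have enough recorded calls."""
    enrolled = {}
    for cust in customers:
        if not cust["consented"]:
            continue  # consent is a legal prerequisite, never skipped
        calls = call_archive.get(cust["id"], [])
        if len(calls) >= min_calls:
            enrolled[cust["id"]] = build_voiceprint(calls)
    return enrolled

customers = [
    {"id": "c1", "consented": True},
    {"id": "c2", "consented": False},  # never enrolled without consent
    {"id": "c3", "consented": True},   # too few recordings so far
]
archive = {"c1": ["call1", "call2", "call3"], "c3": ["call1"]}
print(enroll_passively(customers, archive))
```

The point of the sketch is the ordering of the gates: consent first, sufficient historical audio second, and no action required from the customer at any step.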

The advantages outweigh the challenges

Everybody wins with voice biometrics. It puts the customer first, because it eliminates extra steps and frustration. Businesses and customers alike benefit from the added security it provides, and from the shorter call times, which pay off in convenience for the customer and increased ROI for the company – especially when the company has selected a biometrics solution that adapts to all necessary integrations.

By Naomi Webb

3 Keys To Keeping Your Online Data Accessible


Online Data

Data storage is often a real headache for businesses. Additionally, the shift to the cloud in response to storage challenges has caused security teams to struggle to reorient, leaving 49 percent of organizations doubting their experts’ ability to adapt.

Even so, decision makers should not put off moving from old legacy systems to a more flexible and accessible solution — whether public or private. By putting the right system in place, businesses can free up IT staff for more strategic projects, ensure content is available and retrievable whenever and wherever it’s needed, and analyze data effectively for actionable insights.

Keeping Data Accessible

There are several obstacles that must be overcome when keeping data accessible:

1. Storage silos have been a problem since the first digital storage devices hit the market. Directory and folder hierarchical structures were fairly useful when dealing with a limited number of files, all accessed by users or a handful of applications that knew where to find the files.

But in today’s connected world, where remote collaboration and access by many devices and applications is the norm, these hierarchical structures are hindering new workflows and locking files in place (hence the term “silo”).

2. Search issues present a number of operational and financial challenges to businesses. Searching for data from multiple systems spread across several geographic locations is a laborious task, and the need to use both past and present data makes it even trickier.

The data being searched is often indexed in a database tied to a specific application. This valuable “data about the data,” known as metadata, needs to be stored in a way that enables portability. The Internet of Things has opened businesses to a world of new data possibilities, but going back to a specific application to search for a file, or continuously migrating entire data sets to different analysis applications, wastes valuable time and introduces the possibility of errors.

3. Scalability dilemmas in storage capacity, both in file count and amount of data, as well as expansion to different geographies, prevent businesses from keeping pace with modern data accessibility requirements.

Most organizations keep data forever because they don’t know what will have value. There are also many use cases in which government regulations require longer retention times and tighter security, creating a compounding effect on storage needs. This growth, combined with the need to keep the data accessible, poses a serious problem for traditional network attached storage solutions, file systems, and their complex hierarchical structures.

Making Your Storage Efficient


While certainly challenging, these problems are far from insurmountable. Here are three easy-to-implement solutions to help keep storage simple and efficient:

1. Consolidate your data in one storage platform.

The dawn of the cloud was a major breakthrough for data storage, and the first step toward a simplified storage process is to embrace that technology. Sharing resources in a virtual environment is at the heart of the transformation we’re seeing to a more service-based approach in IT.

You can now stand up a storage service within your own data center (a private solution) or use any one of the services on the market (a public solution). If you need to keep your data secure or plan to keep the data for more than three years, private is most likely your best option. However, if you have limited data center space or only need to store data for a few months or years, public is probably the way to go.

2. Leverage metadata.

Data is growing at an astonishing rate, and experts predict the digital universe will reach 44 trillion gigabytes by 2020. But what use is that data if it can’t be found or identified? Metadata is an essential tool for simplifying data storage because it allows managers to quickly and automatically identify characteristics about data in a way that can continuously evolve, providing new views of ever-changing data sets.

The key is making metadata portable and accessible by any application or device in a way that’s easy to protect. For this reason, metadata searching and management must be native features of your storage systems — not just afterthoughts.
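One way to picture portable metadata is a plain-text “sidecar” file stored alongside the data itself, readable by any application rather than locked inside one application’s database. The sketch below is purely illustrative (the file names and fields are invented for the example), not a description of any particular product:

```python
import json
import tempfile
from pathlib import Path

def write_with_metadata(directory: Path, name: str, data: bytes, metadata: dict) -> None:
    """Store a file together with a portable JSON metadata 'sidecar'.

    Any application that can parse JSON can discover the metadata,
    instead of it living only inside one application's database.
    """
    (directory / name).write_bytes(data)
    (directory / f"{name}.meta.json").write_text(json.dumps(metadata))

def find_by_metadata(directory: Path, key: str, value) -> list[str]:
    """Search every sidecar file for a matching metadata field."""
    matches = []
    for meta_path in directory.glob("*.meta.json"):
        meta = json.loads(meta_path.read_text())
        if meta.get(key) == value:
            matches.append(meta_path.name.removesuffix(".meta.json"))
    return matches

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    # Hypothetical files and tags, for illustration only
    write_with_metadata(root, "scan-001.tif", b"...", {"department": "radiology", "year": 2016})
    write_with_metadata(root, "scan-002.tif", b"...", {"department": "oncology", "year": 2016})
    print(find_by_metadata(root, "department", "radiology"))  # ['scan-001.tif']
```

Because the metadata travels with the data in an open format, a new analysis tool can index it without migrating the underlying files, which is the portability this section argues for.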

3. Adopt object storage.

Object storage is a core feature of many major cloud storage services on the market and is the most efficient and cost-effective way to reliably store and provide access to petabytes of data and trillions of files. Object storage is highly automated, resilient, and easily available, resulting in a vastly improved capacity-to-management resource ratio. It’s common to find one system administrator managing more than 10 PB of storage (compared to 1 PB for a NAS solution).

Object storage stores and retrieves data by a key or name, supplemented with metadata. Think of it like a valet service for your data: When you store something, you get a key or associate a tag (metadata) with it. All you need to do is present the key or search for the specific tag or combination of tags, and the storage system will retrieve the data that matches your request.
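The valet analogy can be sketched in a few lines. This is a toy in-memory model of object storage’s flat key-plus-metadata design, not a client for any real service; real systems expose similar put/get/search operations over HTTP:

```python
class ObjectStore:
    """Toy model of object storage: a flat namespace of keys,
    each carrying data plus searchable metadata tags."""

    def __init__(self):
        self._objects = {}  # flat namespace: key -> (data, tags); no folders

    def put(self, key: str, data: bytes, **tags) -> str:
        # Storing an object hands back a key -- the valet ticket.
        self._objects[key] = (data, tags)
        return key

    def get(self, key: str) -> bytes:
        # Present the ticket; the store fetches the data directly.
        return self._objects[key][0]

    def search(self, **tags) -> list:
        # Or search by any combination of metadata tags.
        return [k for k, (_, t) in self._objects.items()
                if all(t.get(name) == value for name, value in tags.items())]

store = ObjectStore()
# Hypothetical keys and tags, for illustration only
store.put("7f3a", b"<jpeg bytes>", kind="photo", year=2015)
store.put("9c21", b"<mp4 bytes>", kind="video", year=2015)
print(store.get("7f3a"))        # b'<jpeg bytes>'
print(store.search(year=2015))  # ['7f3a', '9c21']
```

Note there is no directory hierarchy anywhere: scalability comes from the flat key space, and findability comes from the tags rather than from a file’s location.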


This approach not only makes data easier to find, but it also enables continuous, self-healing protection and virtually unlimited scalability. Certain vendors are also making significant advancements in integrating search and providing interfaces that plug right into existing workflows in a way that’s transparent to current users and applications.

The most effective and simple storage solutions incorporate data consolidation and the use of metadata with object storage. This provides greater data access, better protection from data corruption, and the streamlined performance necessary to keep any amount of data online, accessible, and providing value for growing businesses and organizations.

Whether you want to attribute the quote “With great power comes great responsibility” to Voltaire or Spider-Man, in the world of business, we need to preface that by saying “With great knowledge comes great power.” Once you simplify your storage, it gives you the knowledge to not only help run your business, but to also gain actionable insight and the power to make discoveries that can help you solve problems and propel your business forward.

By Jonathan Ring

Jonathan Ring is co-founder and CEO of Caringo, a leading scale-out storage provider. Prior to Caringo, Jonathan was an active angel investor advising a broad range of companies, and he was a vice president of engineering at Siebel Systems, where he was a member of the executive team that grew Siebel from $4 million to $2 billion in sales. Jonathan’s passion and experience are shaping the future of Caringo.

Cloud and the Convenience Solution

Cloud Mobility

Buying a new phone is always an exciting endeavour. Whether you’ve just broken your phone (ouch) or are re-upping after a contract expired, it’s something most people look forward to. As a mobile carrier, while your line-up of fancy new phones and best-bang-for-buck service plans will entice customers, one feature that doesn’t often make headlines is convenience.

It may be low-key, but over the years, mobile carriers have made an effort to make switching carriers, renewing plans, or signing a contract easier and easier.

Yet one issue of convenience not being tackled by mobile carriers is backing up personal effects from an existing phone and moving them onto a new one.

For example, we’ve seen offers like Verizon’s which helps customers keep their existing phone numbers or T-Mobile’s providing discounts for bringing in your pre-owned devices. All of this is done in the name of bringing in and retaining customers.


But all of these existing promotions deal with hardware and assets. From our vantage point, consumer-side convenience is an up-and-coming market. If you’ve ever bought a new phone and sat there moving, downloading, and checking everything onto it, I’m willing to bet you’ve later come across one or two things you inexplicably missed. While software to help transition personal data between mobile phones exists, the process is still left outside of mobile carrier control.

The smooth, consolidated transfer of personal effects is something mobile carriers should add as more and more users buy their second or third smartphones, and we feel that white label personal clouds are just the way to access this market.

The Status of Cloud Integration by Carriers

According to research done by Ericsson in 2012, mobile carriers have dedicated significant amounts of time and money to integrating cloud into their existing bundles of services. The report cites Verizon as having spent “well over US$2 billion” trying to capture a large share of the global cloud market, while Australia’s largest telecom, Telstra, predicted that cloud would make up roughly 20-30% of its total revenue by 2018.

However, even with all that money spent, cloud computing has yet to headline mobile plans as a pillar of a winning 4G offering for customers.

This possibility of seamless file transfer is an opportunity left unexplored by most mobile carriers.

Convenience and Transition

In an open questionnaire by Android Police in 2015, over 63% of respondents said they had owned between one and four smartphones (the survey looked exclusively at Android device ownership), with three being the most frequent response at 21%. Given an average smartphone plan length of 2-3 years, this indicates that most users have only gone through the hassle of transferring data once:

  • The first commercially available Android device, HTC’s Dream in 2008, had only 256 MB of memory, so moving personal data off it would have been easy
  • On that 2-3 year plan cycle, the next generation of smartphones, such as the Galaxy SII, was the first to give users significant internal storage for personal effects like photos, videos, and documents
  • Their next phone (roughly 2013 onwards, on the same 2-3 year cycle) was likely the first where users had difficulty transferring data, given the sheer amounts involved (8 GB and up)

Granted, Android and iOS allow users to sync contacts, notes, and calendars through their own infrastructure, while downloaded apps can be backed up through each platform’s app store.

However, everything from photos and videos to documents and messages has to be transferred by the individual through third-party apps or a PC rather than through the operating system.

For consumers in emerging markets, this solution becomes even more pivotal, given that many users there have skipped the ‘256 MB era’ altogether. These smartphone owners will face the issue on their second phone.

This is the opportunity white label cloud solutions should target.

Backup and Access As Needed – All Under Your Brand

Given the costs associated with integrating a cloud onto existing services (i.e. Verizon’s US$2 billion cloud budget), we wanted to offer mobile carriers a scalable and affordable cloud option. For us, that included making sure that mobile service subscribers wouldn’t be handcuffed to packages (like Dropbox’s Pro option) should they decide to exceed their allotted 2 GB.


(Comic above is free for commercial/personal reuse, courtesy of CloudTweaks)

With white label, carriers have the opportunity to brand and advertise their newfound cloud feature, gaining another leg up on the competition when they can promote features such as ‘transfer all of your personal effects from one phone to another’ without issue.

In conclusion, yes, we feel that white label cloud solutions should set their sights on this convenience market. As more and more users go through smartphones with sizeable memory, we’re banking on this issue becoming more and more prominent.

The question is: Will your mobile carrier be ready with a solution?

By Max Azarov