Category Archives: Cloud Computing

Cloud Predictions 2017: Forrester Research Highlights 10 Trends

Cloud Predictions 2017

Gartner projects that by 2020, corporate no-cloud policies will be as rare as no-internet policies are today. That’s not surprising, considering how valuable the cloud and cloud tools already are to businesses across a range of industries. But although we’re seeing mainstream uptake of popular cloud products and services, cloud developers aren’t resting on their laurels; existing services are being refined and new ones developed, which is likely to keep the cloud top of mind and increasingly appreciated in the years to come.

Trends for 2017 and Beyond

Forrester Research has released a new report outlining cloud predictions for 2017 and highlighting ten trends for the coming year it believes businesses must act on. The cloud is expected to save users money in many ways, not just through traditional pay-per-use models but also through the advancement of cost-optimizing best practices, while integrated cost management tools may bring expense transparency. And though Forrester says ‘size still matters,’ it’s clear that mega cloud providers will be balanced by niche providers. The public cloud, with its scalability and good economics, continues to be a popular choice for enterprises averse to setting up their own private cloud networks, but the customization available through niche cloud services promises the smaller dedicated providers a slice of the pie as well.

Furthermore, Forrester suggests that ‘hyper-converged infrastructure will help private clouds get real.’ These systems package compute, storage and networking into tightly integrated building blocks, and Forrester believes hyper-converged infrastructure should be the foundation for private cloud networks, ensuring straightforward and effective implementations. It’s also likely that containers will ‘shake up’ cloud platform and management strategies: Forrester predicts container-driven software code management will advance, with Linux containers available in the majority of private and public cloud platforms early in 2017. This, however, increases security challenges and is one reason to believe cloud service providers will begin building better security into their offerings.

And further encouraging the cloud shift, Forrester believes migration will become easier thanks to ‘lift-and-shift’ tools. Cloud migration applications are expected to be highly relevant in 2017, enabling smooth implementation and making the switch from public to private cloud, or vice versa, straightforward. Forrester also expects enterprises to avoid large, complex and expensive cloud software suites, while concluding that networking will remain a challenge for the hybrid cloud. We might also see SaaS moving toward regional and industry-specific solutions instead of the prevalent one-stop shops of today. Finally, Forrester suggests we keep an eye on what’s coming out of China, as it’s expected that ‘Chinese firms will be key drivers of global cloud evolution.’

What the Cloud has in Store for Enterprises

Taking a look at enterprise advances, it’s suggested that the cloud market will accelerate more rapidly in 2017 as businesses attempt to improve efficiencies while scaling computing resources for better customer service. Says Forrester analyst Dave Bartoletti, “The number one trend is here come the enterprises. Enterprises with big budgets, data centers, and complex applications are now looking at cloud as a viable place to run core business applications.” Forrester recognizes Amazon Web Services as the originator of the first wave of cloud computing, launching basic storage and computing services back in 2006; ten years on, the results are mind-boggling. With 38% of surveyed North American and European enterprise infrastructure technology directors building private clouds, and a further 32% securing public cloud services, it’s evident that businesses are well into their cloud journey and nudging providers toward greater developments and innovations for the future.

By Jennifer Klostermann

5 Simple Tips to Make Strong and Robust Business Continuity Plans

Business Continuity Plans

Today’s organizations need comprehensive and robust business continuity planning for swift and effective action in case of a disaster or crisis. As trade and supply chains have gone global, businesses now expect crisis response in seconds, not hours, to ensure the ripple effect is minimized. As organizations go digital, an IT failure can cripple the whole supply chain and business operations, causing extreme losses within hours and requiring countless hours to recover from. Plans to mitigate IT failures are also affected by the complexity of today’s IT infrastructure: as applications and systems are added based on business and market requirements, newer technologies and infrastructure pose new challenges.

Most businesses leverage cloud-based platforms for at least part of their enterprise needs. The cloud helps businesses minimize costs and maximize efficiency; built for speed and convenience, it can scale up and down as needs demand and bring flexibility to business operations. However, the added overhead of managing cloud data centers, planning and performing test exercises across multiple locations and vendors, and managing crisis recovery requires that organizations pay critical attention to their cloud solutions in combination with legacy infrastructure.

Today, an effective business continuity plan requires the dynamic, continuous collection of information across the extended organization. Organizations need to overcome the traditional fragmented approach to business continuity and formulate a strategy that adheres to the following five-point agenda:

1. Champion Business Continuity at the Highest Level

With senior management sponsorship, the business continuity plan will occupy its rightful position, high up in business priorities. This is important for sufficient budget, resourcing and training to be assigned to it. Senior leaders must set the tone at the top by insisting on robust crisis planning and regular reviews as a standard practice rather than a mere formality.

In August, Delta suffered a major IT outage that resulted in a $100 million loss in revenues for the airline. The impact was far-reaching, affecting check-in systems, flight information screens, the airline’s website and smartphone apps. The disruption to customers was extensive as well.

This is just one example of many; unfortunately, downtime of one type or another is a common occurrence in business. According to the Continuity Insights and KPMG Global BCM 2016 report, 39 percent of global organizations estimated the cost of business disruption in the last 12 months at $100,000 or less, and 27 percent estimated disruptions ranging from $100,000 to $5 million or more. This highlights the need for robust business continuity planning, championed at the highest level.


Types of Instances and Interruptions in Past Year


2. Review, Update and Test Regularly

The business continuity plan is a living document; it isn’t one to be created, filed and never looked at again. Risks evolve. Exercising plans on a regular schedule ensures businesses keep pace with the changing environment and understand what’s needed to protect critical infrastructure and preserve operations during a physical or virtual attack. Companies must learn from their own experience. Worryingly, according to Forrester and the Disaster Recovery Journal, 33 percent of businesses that had to invoke a business continuity plan said one lesson learned from the experience was that the plan was out of date. Yet 60 percent never carry out a full simulation of their business continuity plan for the entire organization; most walk through the plan as a document review.

It is of utmost importance that business continuity plans be reviewed by senior management and the planning team. Also, test results should be periodically evaluated and reported to the board, to assess the nature and scope of any changes to the organization’s business.

3. Include Partners, Suppliers and Third Parties

Companies don’t pay enough attention to the significant role of partners, suppliers and third parties in their business continuity. Deloitte found that over 94 percent of survey respondents had low to moderate confidence in the tools and technology used to manage third party risk and 88 percent felt the same about risk management processes. This, despite 87 percent having experienced disruption in the past three years that involved a third party.

Business continuity planning and disaster recovery has to be part of early third party discussions with responsibilities documented in service level agreements. Plans need to be aligned so that it is clear and easy to identify who does what, and where the handover points are when a plan is executed. The tools and systems used for collaboration must support transparency of information so that both parties are able to work from up to date information and take swift action in the event of a crisis.

4. Prioritize Ongoing Business Operations

The continuity plan should demonstrate that the business understands the priority level of its systems and that mitigating plans are in place to restore core operations as quickly as possible.


In the case of the Delta crisis, the outage was so extensive that it paralyzed business critical operations. The range of problems that can disrupt business – natural disasters, industrial action, cybercrime, IT failures, political or economic upheaval, suppliers ceasing to trade and so on – is so vast, and the systems and operations that can be impacted can be so wide that prioritization is a must.

A cloud-based option provides many benefits as an off-site backup solution for ensuring the efficacy of your continuity plan. As you develop your plan, ask whether a cloud-based option would increase its efficiency and cost-effectiveness, and cover off essential considerations such as due diligence and service reliability with your provider. Another option is establishing a backup plan independent of the cloud by leveraging personalized file backups, cross-device continuity solutions and communication software. The main aim is to get back up faster and limit the time spent without access to critical systems and information, by having a clearly defined continuity plan in place.
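As a deliberately minimal sketch of the cloud-independent file-backup idea, the script below copies a list of critical files into a timestamped folder. All file names and paths are hypothetical, and a real plan would add verification, rotation and off-site replication:

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def backup_files(sources, backup_root):
    """Copy each critical file into a fresh, timestamped folder."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / stamp
    dest.mkdir(parents=True, exist_ok=True)
    for src in sources:
        shutil.copy2(src, dest / Path(src).name)  # copy2 preserves timestamps
    return dest

# Demonstration with a throwaway file:
workdir = Path(tempfile.mkdtemp())
critical = workdir / "contacts.csv"
critical.write_text("name,phone\nops-lead,555-0100\n")
dest = backup_files([critical], workdir / "backups")
print(sorted(p.name for p in dest.iterdir()))  # ['contacts.csv']
```

Running such a script on a schedule, to storage that fails independently of your primary systems, is what turns it from a convenience into part of a continuity plan.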

5. Define the Communications Plan Clearly

The business continuity plan has to be absolutely clear on how all stakeholders will be kept informed, and how upstream and downstream communication channels will be enabled in times of crisis. Stakeholders include employees at all levels of the organization, as well as suppliers, partners and customers.

The goal of the communications plan is to outline the channels and mechanisms for sharing information that will support efforts to resolve the issue at hand and limit the extent of its damage. How a company handles a crisis has an enormous impact on how it comes out of the incident – people remember how the organization dealt with and reacted to the issue, and how convincing its efforts to make things right were. For this reason, crisis management communications must be engaged at the earliest opportunity.

Service disruption is damaging to all businesses not only in terms of immediate revenue loss but also in the longer term brand and reputational impact. The business continuity plan is an essential, living document that aims to protect the ongoing sustainability of the business. Those that plan and execute well will see better performance in the long-run and be best-placed to weather the storms, whatever form they take.

By Vibhav Agarwal

How the Cloud Is Improving DNA Sequencing

DNA Sequencing

For many of us, the cloud is part of our daily lives.

We use these virtual storage servers to hold our pictures, our memories and our work documents, just to name a few. Cloud storage is also making its mark in the medical industry, with electronic health records making patient care easier no matter where you’re making your appointments.

This utilization of virtual information storage is also being used to improve the speed and accuracy of DNA sequencing. How can cloud storage change the way we look at DNA?

The Importance of DNA Sequencing

DNA, which stands for deoxyribonucleic acid, is the smallest building block of life. It’s found in almost all living things on the planet. Your DNA, found in every cell in your body, holds the blueprint that governs why you are the way you are.

Do you have red hair, or blue eyes? That’s written into your DNA. Are you tall, short, fat, skinny or athletic? You guessed it — that’s written into your DNA as well. Do you hate cilantro and think it tastes like soap? Believe it or not, that’s something that’s written into your DNA too.

In that DNA blueprint, there are answers to thousands of questions that we’ve been posing for centuries, including things like how long we’ll live, what diseases we may be predisposed to, and many others. That is where DNA sequencing comes in.

To stick with our metaphor from a moment ago, you wouldn’t be able to read a blueprint without a key telling you what the different symbols mean, right? DNA sequencing provides researchers with the key to our DNA blueprint. By learning the order of the four nucleotide bases that make up DNA, researchers can determine which combinations of genes produce which results.

Old Tech, New Tech

Until recently, DNA sequencing was performed on standalone, non-networked computers. While breakthroughs were made, they were limited by the small subset of information available and by insufficient processing speed. In other words, individual computers used for DNA sequencing are limited by the amount of processing power they can possess.

Moore’s Law, coined by Gordon Moore — one of the founders of Intel — observes that the number of transistors that can be placed on a single chip, and thus a computer’s processing power, doubles roughly every two years. Even with today’s advances, current trends show that Moore’s Law still largely holds true.
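The doubling arithmetic behind Moore’s observation is easy to sketch. The starting figure below is hypothetical, chosen only to illustrate the calculation:

```python
def transistors_after(years, initial, doubling_period=2):
    """Project a transistor count under Moore's observation:
    the count doubles every `doubling_period` years."""
    return initial * 2 ** (years // doubling_period)

# Starting from a hypothetical chip with one million transistors:
print(transistors_after(0, 1_000_000))   # 1000000
print(transistors_after(10, 1_000_000))  # five doublings: 32000000
```

A 32-fold jump in a decade is why processing power that once bottlenecked sequencing keeps catching up, and why pooling many machines in the cloud compounds the gain further.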

Advances in DNA sequencing are appearing exponentially, and in many cases are only being limited by the available processing power.

Predictive Analytics

Predictive analytics, or the study of patterns to make predictions, has already made its way into the medical fields. When applied to DNA sequencing, it’s often dubbed Predictive Genomics. Cloud computing is a key component in the success of predictive genomics for a variety of reasons, including:

  • The amount of data — The sheer amount of data in one human being’s genome is almost mind-boggling. Each individual’s genome contains up to 25,000 genes, and the full genome spans roughly 3 billion base pairs. When you break that down into digital data, you’re looking at upwards of 100 gigabytes of data per person.
  • The cost — Right now, having your personal genetic code sequenced costs between $1,500 and $4,000. This also plays a large role in the high cost of testing for specific genetic markers, like the BRCA1 and BRCA2 genes that indicate a higher chance of breast cancer.
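The per-person storage figure can be sanity-checked with a back-of-envelope calculation. The inputs below are illustrative assumptions, not figures from the article: roughly 3 billion base pairs per genome, 30× sequencing coverage (each base read about 30 times for accuracy), and about 2 bytes stored per sequenced base (the base call plus its quality score):

```python
# Illustrative assumptions for a rough data-volume estimate:
BASE_PAIRS = 3_000_000_000   # ~3 billion base pairs per human genome
COVERAGE = 30                # each base sequenced ~30 times
BYTES_PER_BASE = 2           # base call plus quality score, uncompressed

raw_bytes = BASE_PAIRS * COVERAGE * BYTES_PER_BASE
print(f"~{raw_bytes / 1e9:.0f} GB of raw sequence data per genome")
```

The uncompressed estimate lands around 180 GB; compression and lower coverage bring it down toward the 100 GB range, so the figure above is plausible.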

The use of cloud computing and predictive genomics can reduce costs, ensure quality and improve accuracy throughout the world of DNA sequencing.

Amazon, our favorite online shopping mall, is doing what they can to help in the world of cloud computing and genomics. Amazon Web Services provides a cloud computing service that a number of companies, including DNAnexus and Helix, are using to improve the speed and accuracy of their genome sequencing.

There’s an App for That

While sending off a saliva-soaked q-tip to have your DNA tested isn’t a new concept, this is the first time it’s heading to both the cloud and the App Store.

A new startup from Silicon Valley named Helix has recently hit the DNA sequencing market with a new twist on the DNA game. Now, not only can you have your DNA tested for all sorts of information, but you can also have your genetic ancestry analyzed by the minds at National Geographic.

As the icing on the cake, all of your information will be stored on the cloud and accessible through Helix’s app.

Cloud computing is becoming an invaluable tool for a variety of different industries, with DNA sequencing as just the latest in a long line of innovations. As this advancement becomes more mainstream, only time will tell what secrets our DNA holds, and what we’ll be able to do with them once we find them.

By Kayla Matthews

Is Machine Learning Making Your Data Scientists Obsolete?

Machine Learning and Data Scientists

In a recent study, almost all the businesses surveyed stated that big data analytics is fundamental to their business strategies. Although the ranks of computer and information research scientists are growing faster than any other occupation, the increasing applicability of data science across business sectors is driving an exponential gap between supply and demand.

When a 2012 article in the Harvard Business Review, co-written by U.S. chief data scientist DJ Patil, declared the role of data scientist “the sexiest job of the 21st century,” it sparked a frenzy of hiring people with an understanding of data analysis. Even today, enterprises are scrambling to identify and build analytics teams that can not only analyze the data received from a multitude of human and machine sources, but also can put it to work creatively.

One of the key areas of concern has been the ability of machines to gain cognitive power as their intelligence capacities increase. Beyond the ability to leverage data to disrupt multiple white-collar professions, signs that machine learning has matured enough to execute roles traditionally done by data scientists are increasing. After all, advances in deep learning are automating the time-consuming and challenging tasks of feature engineering.

While reflecting on the increasing power of machine learning, one disconcerting question comes to mind: Would advances in machine learning make data scientists obsolete?

The Day the Machines Take Over


Advances in the development of machine learning platforms from leaders like Microsoft, Google, and a range of startups mean that a lot of work done by data scientists would be very amenable to automation — including multiple steps in data cleansing, determination of optimal features, and development of domain-specific variations for predictive models.

With these platforms’ increasing maturity and ability to create market-standard models and data-exchange interfaces, the focus shifts toward tapping machine-learning algorithms with a “black box” approach and away from worrying about the internal complexities.

However, as with any breakthrough technology, we need to recognize that the impact of the technology is limited unless it is well-integrated into the overall business flow. Some of the most successful innovations have been driven not by a single breakthrough technology but by reimagining an end-to-end business process through creative integration of multiple existing components. Uber and Netflix offer prime examples of intelligence gleaned from data being integrated seamlessly into a company’s process flow. Data scientists play a key role in this by leveraging data to orchestrate processes for better customer experience and by optimizing through continuous experimentation.

While organizations across industries increasingly see a more strategic role for data, they often lack clarity around how to make it work. Their tendency to miss the big picture by looking for “easy wins” and working with traditional data sources means that data scientists have an opportunity to help frame problems and to clearly articulate the “realm of the possible.”

From Data to Strategy

It is easy to get carried away by the initial hype and assume machine learning will be a panacea, solving every problem and sweeping aside the data science practitioners whose roles it touches. However, let us recall the AI winters of the mid-’70s, and later the ’90s, when the journey to the “promised land” did not pan out.


Today, we don’t face the same constraints as in the past — lack of data, data storage costs, limitations of compute power — but we still find true challenges in identifying the right use cases and applying AI in a creative fashion. At the highest level, it helps to understand that machine learning capability needs to translate into one of two outcomes:

  • Interaction: Understanding user needs and building better and more seamless engagement
  • Execution: Meeting customer needs in the most optimal manner with ability to self-correct and fine-tune

Stakeholder management becomes extremely important throughout the process. Framing key business problems as amenable to data-led decision-making (in lieu of traditional gut feel) to secure stakeholder buy-in is critical. Consequently, multiple groups need to be involved in identifying the right set of data sources (or best alternatives) while staying conscious of data governance and privacy considerations. Finally, stakeholders need to be fully engaged to ensure that the insights feed into business processes.

Data Scientists Become Core Change Agents

Given the hype surrounding big data analytics, data scientists need to manage responses at both ends of the spectrum, tempering extreme optimism while handling skepticism. A combination of the following skills, going beyond platforms and technology, is thus needed:

  • Framing solutions to business problems as hypotheses that will require experimentation, incorporating user input as critical feedback
  • Identifying parameters by which outcomes can be judged and being sensitive to the need for learning and iteration
  • Safeguarding against correlations being read as causal factors
  • Ensuring the right framework for data use and governance, given the potential for misuse

This requires pivoting a data scientist’s remit in a company from a pure data-analysis function into a more consultative role, engaging across business functions. Data scientists are not becoming obsolete. They are becoming bigger, more powerful, and more central to organizations, morphing from technicians into change agents through the use of data.

By Guha Ramasubramanian

Guha heads Corporate Business Development at Wipro Technologies and is focused on two strategic themes at the intersection of technology and business: cybersecurity framed from a business risk perspective, and how to leverage machine learning for business transformation.

Guha is currently leading the development and deployment of Apollo, an anomaly detection platform that seeks to mitigate risk and improve process velocity through smarter detection.

Accelerite Announces CloudPlatform Integration with Docker Containers at ApacheCon 2016

Accelerite Announces CloudPlatform Integration

Provides Enterprises with a Single Orchestration Platform for Bare-metal, VM, and Container Management

SANTA CLARA, Calif. – November 16, 2016 — Accelerite, a provider of software for simplifying and securing enterprise infrastructure, announced that the Accelerite CloudPlatform, powered by Apache CloudStack™, has achieved another major milestone in enhancing cloud services for enterprises. Accelerite CloudPlatform now integrates with Kubernetes, an open-source system for automating deployment, scaling, and managing containerized applications.

The Accelerite CloudPlatform integration with Kubernetes makes it one of the most powerful cloud platforms for private clouds and a highly advanced solution for provisioning VMs and Docker containers on bare-metal and virtualized environments. This new feature is immediately available to all CloudPlatform users and new customers in the CloudPlatform 4.7.1 release. Other important features include:

  • Hosting virtual routers on the XenServer and KVM hypervisors for deploying bare-metal hosts
  • Using destination CIDR in egress rules
  • Upgrading virtual routers with minimal downtime
  • Performing cold migration directly from the source host to the destination host
  • Creating multiple snapshots simultaneously
  • Resizing the root disk on XenServer and VMware

“With the integration with Kubernetes, Accelerite continues to accelerate enhancement of CloudPlatform. As an already proven technology for running large scale private clouds with tens of thousands of nodes, this integration enables our customers to develop next generation cloud native applications and deploy them in their existing infrastructure — a potentially significant return on their original investment,” said Rajesh Ramchandani, General Manager, Cloud Products.

Accelerite is also a Gold sponsor of ApacheCon Europe 2016, the official conference of the Apache Software Foundation, held in partnership with The Linux Foundation in Seville, Spain, November 14-18, 2016. Accelerite will share insights on container deployments and the current architecture of bare-metal provisioning in Apache CloudStack in a presentation, “Bare Metal Provisioning in CloudStack,” to be held Thursday, November 17, 2016, 15:10 – 16:00.

“We are very pleased that Accelerite will share insights and valuable expertise about the cloud industry at ApacheCon Europe,” said Angela Brown, VP of Events at The Linux Foundation. “Accelerite continues to grow its involvement in the open source cloud community, going back to the company’s acquisition of the Citrix CloudPlatform and their support for Apache CloudStack, and we look forward to working together as a community to create ever more advanced open cloud technologies.”

Accelerite Cloud Life Cycle Management Products

Accelerite’s cloud life cycle management solutions include CloudPlatform, a unified cloud services orchestration platform; CloudPortal Business Manager, which delivers IT as a Service through cloud services automation; and rCloud, a cloud-based disaster recovery service.

About Accelerite

Accelerite’s software suite of cloud and IoT solutions and advanced endpoint management makes it easy for enterprises to simplify and secure today’s complex, ever-evolving infrastructure. Fortune 500s, SMEs, operators, service providers and VARs around the world rely on Accelerite products to secure connected enterprises from a single-pane view, quickly and easily build private and public enterprise clouds, and bring connected things to life with rapid IoT service creation and enrichment. With headquarters in Silicon Valley, Accelerite is a wholly owned business of Persistent Systems (BSE & NSE: PERSISTENT), a leader in software product development and technology services. Learn more at www.accelerite.com.

Big Data Comes to Bear on Healthcare

Big Data Comes to Bear on Healthcare

Big Data Healthcare

The National Institutes of Health (NIH) is examining the use of big data for infectious disease surveillance, exploring information drawn from social media, electronic health records, and a range of other digital sources to provide the detailed and timely intelligence around infectious disease threats and outbreaks that traditional surveillance methods cannot. On the other side of the disease spectrum, big data analytics is also helping with the management of diabetes, a disease affecting over 422 million people globally and resulting in 1.5 million deaths per year, according to the World Health Organization. Today, big data and big data analytics are delivering a range of innovative health care options, as well as disease monitoring and prevention tools that improve the wellbeing of the world’s population.

Where is All This Data Coming From?

Thanks to the digitization of records, the spreading use of sensors, and the prolific use of mobile and standard computing devices, the volume of data collected and recorded today is immense. But just because all of this data exists doesn’t mean it’s necessarily useful. Consider the ‘information’ gleaned from Twitter and Facebook posts, Snapchat and text messages, and Google and Siri question-and-answer sessions: certainly, some of it will be relevant to someone, but the sheer volume of low-quality data can be a deterrent. Fortunately, the technology evolving to collect all of this data is working hand in hand with big data management and analytics tools to ensure value.

Today, big data applications can predict future actions and with the widespread use of Internet of Things tech personalized data can be collected and monitored on an individual level. Applications such as Google Trends provide practical methods for using big data, while big data analytics helps navigate and utilize unstructured data that might otherwise seem irrelevant. And thanks to tools such as Hadoop, developers are able to construct predictive models that help organizations understand user responses and better tailor applications to these results.

Solutions for Healthcare

Already big data plays a role in biomedicine, advancing methodologies and skills and creating new cultures and modes of discovery. Some experts, in fact, believe these advances suggest we’ll be facing disruption in the industry as new systems and approaches prove their worth. Precision medicine initiatives already involve over a million volunteers in the US alone, along with several NIH-funded cohorts, and it’s likely we’ll see lifestyle information, genomic data, and biological samples linked to electronic health records as these schemes search for superior health care solutions. The benefits of these initiatives include collaborative and cooperative science, more efficient and better-funded research enterprises, and training advances, but all of this must be carefully balanced against the privacy and security demands of big data.

Other advantages provided by big data analysis include a better understanding of rare diseases through the precision provided by aggregated integrated data, as well as predictive modeling able to advance diagnosis of illnesses and diseases both common and rare. Though many opportunities available through big data and big data analytics require a particular cultural shift, our high-tech environment already encourages this change.

Concerns and Further Investigations

Although experts see potential in the use of big data in the healthcare field, we’re also cautioned that unconventional data streams may lack necessary demographic identifiers or provide information that underrepresents particular groups. Further, social media can’t always be relied on as a stable data source. Nevertheless, big data research continues in many unique health care areas: multiple studies are investigating social media and online health forums for drug use and the existence of adverse reactions; one European surveillance system is collecting crowdsourced data on influenza; ResistanceOpen monitors antibiotic resistance at regional levels; and many others provide unique insight into our healthcare systems. The combination of traditional and digital disease surveillance methods is promising, and says Professor Shweta Bansal of Georgetown University, “There’s a magnitude of difference between what we need and what we have, so our hope is that big data will help us fill this gap.”
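In miniature, the keyword-counting idea behind these digital surveillance systems looks something like the sketch below. The posts and keywords are invented; real systems ingest millions of messages and apply far more careful text processing:

```python
from collections import Counter

# Hypothetical posts tagged with an ISO week:
posts = [
    ("2016-W44", "feeling fine, great weekend"),
    ("2016-W45", "fever and chills, staying home"),
    ("2016-W45", "awful cough, think it's the flu"),
    ("2016-W46", "flu shot queue around the block"),
    ("2016-W46", "high fever again today"),
    ("2016-W46", "cough won't quit"),
]
KEYWORDS = {"fever", "chills", "cough", "flu"}

# Count symptom-mentioning posts per week to form a crude signal.
signal = Counter()
for week, text in posts:
    words = set(text.lower().replace(",", "").replace("'", " ").split())
    if KEYWORDS & words:
        signal[week] += 1

print(sorted(signal.items()))  # rising counts hint at an outbreak
```

A rising week-over-week count is only a hint, which is why such signals are combined with traditional surveillance rather than replacing it.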

By Jennifer Klostermann

Net Neutrality Part 1 – What Is Net Neutrality And Why Is It So Important?

Net Neutrality

Net neutrality is a concept that has been at the centre of much debate recently. It is based on the idea that all internet traffic should be treated equally by your internet service provider. The fact that all traffic has (in most countries across the world) been treated equally is one of the greatest strengths of the internet – it is a level playing field.

However, there is a risk at the moment that this could all change. Large ISPs like AT&T, Verizon, and Comcast want to classify different types of traffic and treat them differently, so they can charge you more depending on what you use, and potentially even restrict content that they disagree with.

There are several fundamental fears that drive people to advocate for net neutrality. Firstly, they fear that without it, ISPs (Internet Service Providers) could provide a “fast lane” for “favoured” (and, many suspect, sponsored) content. They also fear that it could restrict market access by preventing small businesses from getting a foot on the ladder, leading to ever more entrenched monopolies. Finally, they worry that some ISPs would strategically slow the speed of websites that can’t afford to pay (or don’t want to), effectively holding young businesses to ransom.

There are a number of independent sites and organisations, such as theopeninter.net, that promote and advocate for net neutrality and seek to spread the word about its importance to society. They argue that free and open internet is important because:

  1. Free and open internet is the single greatest technology of our time, and control should not be at the mercy of corporations.
  2. Free and open internet stimulates ISP competition.
  3. Free and open internet helps prevent unfair pricing practices.
  4. Free and open internet promotes innovation.
  5. Free and open internet promotes the spread of ideas.
  6. Free and open internet drives entrepreneurship.
  7. Free and open internet protects freedom of speech.

Theopeninter.net was created by entrepreneur Michael Ciarlo, and his story is a perfect example of why net neutrality is so important: without it, many people might never have found his site, and his message would be lost. A level playing field of access and bandwidth is key to ensuring the internet continues to promote innovation and collaboration across the world.

Thisisnetneutrality.org suggests that the openness of the internet needs to be preserved in order to ensure its continued success. They argue that:

“The open internet has fostered unprecedented creativity, innovation and access to knowledge and to other kinds of social, economic, cultural, and political opportunities across the globe”

Nobody is arguing that service providers shouldn’t be able to manage their networks. However, a line needs to be drawn to prevent them from becoming the gate-keepers of internet content, limiting access to certain parts of the internet, or even trying to police or censor our online experience.

As you can see from the map here, few countries have so far put actual legislation in place. Even so, internet access in most countries around the world is currently essentially unrestricted and neutral; the map was assembled using internet speed data rather than legal precedent or legislation. There have, however, been efforts to legally secure this trend and ensure the survival of net neutrality.


Brazil has arguably been the most successful at codifying and protecting net neutrality, through the passing of what was dubbed Brazil’s “Internet Constitution”. The legislation was adopted in April 2014 and prevents ISPs from charging higher rates for access to content that requires more bandwidth, such as Netflix; it also limits the gathering of metadata and holds corporations like Facebook and Google accountable for the security of Brazilians’ data, even when it is stored abroad.

I like to think of the internet as a globalised and modernised version of the American dream. It can provide a fresh start or blank canvas for anyone who wants to sell their product, spread their message, or just go viral with a home video. The quality of the content is what matters, not where it came from.

Keep an eye out on CloudTweaks.com for the next part of the series, where we will be exploring the arguments against net neutrality….

By Josh Hamilton

IoT, Smart Cities and the Future

Smarter IoT

When we use the term smart cities, a series of images begins to run before our very eyes. The reason is simple: over the last few years, the definition of a ‘smart city’ has changed drastically. Amidst this series of definitions, however, two things have remained consistent: Information and Communication Technology (ICT) and the Internet itself.

In this article, the latter is our primary point of discussion. As a rapidly growing urban population and various organizations switch over to IoT, an important question arises:

What exactly is IoT in the first place?


Internet of Things, popularly known as IoT, is essentially a networked connection between physical objects that creates a dynamic, smarter approach to just about everything we do in our daily lives.

Let us take an example: today, we have abundant access to the internet through our smartphones, tablets, PCs, televisions, etc. and are personally connected with each other through prominent social networking sites like Instagram, Twitter, and Facebook.

While IoT is hard at work making our smart cities smarter, it can save us a lot of time and effort by converting ever-growing amounts of raw data into useful information.
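That conversion of raw data into useful information can be sketched in a few lines: aggregate raw samples per device, then derive actionable summaries. The sensor names, readings, and threshold below are hypothetical, meant only to illustrate the idea.

```python
from statistics import mean

# Hypothetical raw readings from city air-quality sensors: (sensor_id, pm2_5).
readings = [
    ("sensor-a", 12.0), ("sensor-b", 35.5), ("sensor-a", 14.0),
    ("sensor-b", 33.0), ("sensor-c", 80.5),
]

def summarize(readings, threshold=50.0):
    """Turn raw per-sensor samples into averages plus an alert list."""
    by_sensor = {}
    for sensor_id, value in readings:
        by_sensor.setdefault(sensor_id, []).append(value)
    averages = {sid: mean(vals) for sid, vals in by_sensor.items()}
    alerts = [sid for sid, avg in averages.items() if avg > threshold]
    return averages, alerts

averages, alerts = summarize(readings)
print(averages)  # per-sensor means
print(alerts)    # sensors exceeding the threshold
```

A real city platform would do this at scale and in real time, but the pattern is the same: raw streams in, summaries and alerts out.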

For some time now, it has been readily apparent that the use of IoT will help not only the private sector, but also the public sector.

In light of this paradigm shift, let us look at a few applications that are assisting cities in becoming smarter.

Mobile applications contributing to building IoT-powered smart cities

As mentioned earlier in this article, IoT entails connectivity between anything and everything. Certain mobile applications have been introduced with the same idea in mind, including:

  • Tado: An application that helps you control your home heating. It calculates the distance you have to travel and the time you’ll take to reach home, then adjusts your room heating accordingly based on the collected data.

  • Smart Lighting: This application allows you to control the lighting in your home or office directly through your smartphone. With this, you can have command over everything, starting from the brightness up to the color. Now, let your surroundings match your mood without any special effort on your part.
  • SIM Tools: SIM stands for Smart Identity Management. This application spares you from carrying keys everywhere by giving you a smarter lock system, leaving behind the worries linked with theft or losing and misplacing your keys.
  • Bluesmart Suitcases: This piece of tech belongs to the same family as SIM tools, but this time it’s for your suitcase. With this application loaded on your mobile phone, you can lock and unlock your suitcase with the touch of a button.
  • Flower Power H2O: Imagine having a personalized guide that shows you how much water your plants need, all at the right time and in the right manner, automatically tracked and monitored via your smartphone. Sounds exciting, does it not?
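The Tado-style behaviour described above, pre-heating a home based on the occupant’s estimated time of arrival, can be sketched as a small scheduling rule. This is a minimal illustration under stated assumptions; the function, parameters, and warm-up rate are hypothetical, not Tado’s actual algorithm.

```python
def target_temperature(distance_km, speed_kmh, comfort=21.0, away=16.0,
                       warmup_minutes_per_degree=6.0):
    """Estimate the occupant's arrival time and pre-heat just in time.

    Returns the temperature (Celsius) the thermostat should hold right now.
    """
    eta_minutes = distance_km / speed_kmh * 60.0
    warmup_needed = (comfort - away) * warmup_minutes_per_degree
    # Start heating only once the warm-up window overlaps the ETA;
    # otherwise stay in energy-saving "away" mode.
    return comfort if eta_minutes <= warmup_needed else away

print(target_temperature(2.0, 30.0))   # 4 minutes away: start heating
print(target_temperature(40.0, 60.0))  # 40 minutes away: stay in away mode
```

The same distance-and-ETA logic generalizes to most presence-aware home automation: the app’s job is simply to decide when the device’s slow physical process must start so it finishes by the time you arrive.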

These are just a few members of the IoT family. Mobile apps are indeed a powerful way to make our IoT-powered smart cities smarter. Here are three benefits of such mobile apps:

1- Ease of Use: This undoubtedly stands first on the list, as smartness is all about doing things in a better way with the least amount of time, resources, and effort. And as we all know, these three are the most important requirements for living a truly smart life.

2- Security: Second on our list is the security that these applications provide. All the commands sit in your password-protected smartphone, right in your pocket. Be it your home, your locker, or your suitcase, each is waiting for a touch on your smartphone.

3- No more Irregularity: With these IoT applications around, you’ll have better, more consistent knowledge about the things related not only to your individual needs, but also to everything around you. With most events happening in real time, the way they were meant to, you will barely have to lift a finger to keep track of things. I don’t think anyone could wish for something better, or rather smarter, than a nice dose of peace and relaxation.

So friends, at this point I must sign off; I trust that the information presented here is of immediate value and use. I hope you will have a smarter life in the (near) future.

By Shahid Mansuri

CloudTweaks Comics
Cloud Security Risks: The Top 8 According To ENISA

Cloud Security Risks Do cloud security risks ever bother you? It would be weird if they didn’t. Cloud computing has a lot of benefits, but also a lot of risks if done the wrong way. So what are the most important risks? The European Network and Information Security Agency did extensive research on that, and…

Cloud Infographic – Big Data Analytics Trends

Big Data Analytics Trends As data and cloud computing continue to work together, the need for data analytics continues to grow. Many tech firms predict that big data volume will grow steadily at 40% per year, and by 2020 will have grown up to 50 times over. This growth will also bring a number of cost…

What the Dyn DDoS Attacks Taught Us About Cloud-Only EFSS

DDoS Attacks October 21st, 2016 went into the annals of Internet history for the large scale Distributed Denial of Service (DDoS) attacks that made popular Internet properties like Twitter, SoundCloud, Spotify and Box inaccessible to many users in the US. The DDoS attack happened in three waves targeting DNS service provider Dyn, resulting in a total of about…

Digital Transformation: Not Just For Large Enterprises Anymore

Digital Transformation Digital transformation is the acceleration of business activities, processes, and operational models to fully embrace the changes and opportunities of digital technologies. The concept is not new; we’ve been talking about it in one way or another for decades: paperless office, BYOD, user experience, consumerization of IT – all of these were stepping…

Cloud Infographic: The Future of File Storage

The Future of File Storage A multi-billion dollar market Data storage has been steadily increasing for decades. In 1989, an 8MB Macintosh Portable was top of the range; in 2006, the Dell Inspiron 6400 became available, boasting 160GB; and now, we have the ‘Next Generation’ MacBook Pro with 256GB of storage built in. But, of course,…

The Internet of Things Lifts Off To The Cloud

The Staggering Size And Potential Of The Internet of Things Here’s a quick statistic that will blow your mind and give you a glimpse into the future. When you break that down, it translates to 127 new devices online every second. In only a decade from now, every single vehicle on earth will be connected…

Cloud Computing Offers Key Benefits For Small, Medium Businesses

Cloud Computing Benefits A growing number of small and medium businesses in the United States rely on cloud computing as a means of deploying mission-critical software products. Prior to the advent of cloud-based products — software solutions delivered over the Internet — companies were often forced to invest in servers and other products to run software and…

Cloud Infographic – Monetizing Internet Of Things

Monetizing Internet Of Things There are many interesting ways in which companies are looking to connect devices to the cloud. From vehicles to kitchen appliances, the internet of things is already a $1.9 trillion market, based on research estimates from IDC. Included is a fascinating infographic provided by AriaSystems which shows us some of the exciting…

The Future Of Cybersecurity

The Future of Cybersecurity In 2013, President Obama issued an Executive Order to protect critical infrastructure by establishing baseline security standards. One year later, the government announced the cybersecurity framework, a voluntary how-to guide to strengthen cybersecurity and meanwhile, the Senate Intelligence Committee voted to approve the Cybersecurity Information Sharing Act (CISA), moving it one…

Five Cloud Questions Every CIO Needs To Know How To Answer

The Hot Seat Five cloud questions every CIO needs to know how to answer The cloud is a powerful thing, but here in the CloudTweaks community, we already know that. The challenge we have is validating the value it brings to today’s enterprise. Below, let’s review five questions we need to be ready to address…

Cloud-Based or On-Premise ERP Deployment? Find Out

ERP Deployment You know how ERP deployment can improve processes within your supply chain, and the things to keep in mind when implementing an ERP system. But do you know if cloud-based or on-premise ERP deployment is better for your company or industry? While cloud computing is becoming more and more popular, it is worth…

Three Factors For Choosing Your Long-term Cloud Strategy

Choosing Your Long-term Cloud Strategy A few weeks ago I visited the global headquarters of a large multi-national company to discuss cloud strategy with the CIO. I arrived 30 minutes early and took a tour of the area where the marketing team showcased their award winning brands. I was impressed by the digital marketing strategy…

Why Security Practitioners Need To Apply The 80-20 Rules To Data Security

The 80-20 Rule For Security Practitioners  Every day we learn about yet another egregious data security breach, exposure of customer data, or misuse of data. It raises the question of why, in the 21st century, we as a security industry cannot seem to secure our most valuable data assets when technology has surpassed our expectations in other regards.…

The Cloud Is Not Enough! Why Businesses Need Hybrid Solutions

Why Businesses Need Hybrid Solutions Running a cloud server is no longer the novel trend it once was. Now, the cloud is a necessary data tier that allows employees to access vital company data and maintain productivity from anywhere in the world. But it isn’t a perfect system — security and performance issues can quickly…

Moving Your Email To The Cloud? Beware Of Unintentional Data Spoliation!

Cloud Email Migration In today’s litigious society, preserving your company’s data is a must if you (and your legal team) want to avoid hefty fines for data spoliation. But what about when you move to the cloud? Of course, you’ve probably thought of this already. You’ll have a migration strategy in place and you’ll carefully…

Despite Record Breaches, Secure Third Party Access Still Not An IT Priority

Secure Third Party Access Still Not An IT Priority Research has revealed that third parties cause 63 percent of all data breaches. From HVAC contractors, to IT consultants, to supply chain analysts and beyond, the threats posed by third parties are real and growing. Deloitte, in its Global Survey 2016 of third party risk, reported…

How To Humanize Your Data (And Why You Need To)

How To Humanize Your Data The modern enterprise is digital. It relies on accurate and timely data to support the information and process needs of its workforce and its customers. However, data suffers from a likability crisis. It’s as essential to us as oxygen, but because we don’t see it, we take it for granted.…

Cost of the Cloud: Is It Really Worth It?

Cost of the Cloud Cloud computing is more than just another storage tier. Imagine if you’re able to scale up 10x just to handle seasonal volumes or rely on a true disaster-recovery solution without upfront capital. Although the pay-as-you-go pricing model of cloud computing makes it a noticeable expense, it’s the only solution for many…

Your Biggest Data Security Threat Could Be….

Paying Attention To Data Security Your biggest data security threat could be sitting next to you… Data security is a big concern for businesses. The repercussions of a data security breach range from embarrassment to costly lawsuits and clean-up jobs – particularly when confidential client information is involved. But although more and more businesses are…