Category Archives: Cloud Computing

DreamHost Debuts Havana Version Of Its DreamCompute OpenStack Cloud

Continuous Integration Approach Delivers Powerful New Features in Real Time

LOS ANGELES, California—November 4, 2013—DreamHost®, a global leader in Web hosting and cloud services, today announced an upgraded Havana version of its DreamCompute™ public cloud computing service, and significant new levels of OpenStack™ code and community contribution. DreamCompute is entirely software-defined, powered by OpenStack, Ceph™, and NSX to deliver unprecedented control, configurability, and security of virtual machine, storage, and network resources for expert and aspiring developers worldwide.

Havana Upgrade

Commencing with the OpenStack Havana release in October 2013, DreamHost utilizes continuous integration and deployment of OpenStack software to ensure DreamCompute customers can access the latest OpenStack features as soon as they are available. OpenStack Havana has hundreds of new features and enhancements contributed by over 900 developers in the OpenStack community – representing 145 organizations worldwide.

DreamHost uses best-of-breed open source tools like Jenkins, Python, virtualenv, fpm, and tox to automate the creation, testing, and deployment of software packages. Leveraging these tools, Opscode Chef, and Nephology (a DreamHost-developed bare-metal provisioning system), the software powering DreamCompute is continuously upgraded to maintain the feature-set at the leading edge of OpenStack cloud computing innovation.
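
To illustrate how tools like these can fit together, the sketch below chains tox, virtualenv, and fpm from a small Python driver of the kind a Jenkins job might invoke. It is a hypothetical sketch only; the package name, paths, and versions are placeholders and are not taken from DreamHost's actual pipeline.

    import subprocess

    def run(cmd):
        """Run a command and fail the build if it exits non-zero."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def build_and_package(project_dir, name, version):
        # 1. Run the project's test matrix with tox (reads tox.ini in project_dir).
        run(["tox", "-c", f"{project_dir}/tox.ini"])
        # 2. Install the project into a clean virtualenv so the package is self-contained.
        venv = f"/tmp/{name}-venv"
        run(["virtualenv", venv])
        run([f"{venv}/bin/pip", "install", project_dir])
        # 3. Wrap the virtualenv into a system package with fpm
        #    (-s dir: source is a directory, -t deb: build a Debian package).
        run(["fpm", "-s", "dir", "-t", "deb", "-n", name, "-v", version, venv])

    if __name__ == "__main__":
        # Hypothetical package name and version, for illustration only.
        build_and_package(".", "example-openstack-service", "2013.2.0")

In a setup like the one described above, a continuous integration job would run a script of this kind on every merge, with configuration management and bare-metal provisioning tools then rolling the resulting package out to the hosts.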

Contributions to OpenStack Havana

OpenStack has rapidly become the cloud platform of choice for enterprises and service providers wishing to implement a public, private or hybrid cloud. DreamHost made a commitment to OpenStack in early 2011, and since then has become one of the leading contributors of code, technical leadership, and community-building. Benefiting from an expert, senior OpenStack development team, DreamHost is now home to two OpenStack Technical Committee members, two Program Technical Leads (PTL), and three core developers, placing DreamHost in the top 15 companies worldwide for code contributions during the Havana release cycle.

The OpenStack Networking (Neutron) community elected DreamHost senior developer, Mark McClain, to the PTL position for a second consecutive term. Doug Hellmann, senior developer at DreamHost, is also the newly elected PTL for the OpenStack Common Libraries project (Oslo), after having made major contributions to the Metering and Orchestration project. And senior developer Mike Perez continues his contributions as a core developer on OpenStack Block Storage (Cinder).

DreamHost openly shares its technical and operational learnings with the OpenStack community to drive the project forward.  DreamHost employees also lead OpenStack Meetup groups in Los Angeles and Atlanta, where DreamHost offices and key developers are located.

At the Foundation level, DreamHost is a founding Gold Member, with chief executive Simon Anderson elected to the Board, focusing on strengthening the promotion and adoption of OpenStack worldwide, and supporting open, democratic governance of the project through participation in prospective Gold Member and individual election committees.

Expanded Beta Features

As DreamHost opens up DreamCompute to more Beta users, the company is also releasing more details on the flexible and powerful configurations available. DreamCompute instances range from 1 vCPU with 1GB memory to 64 vCPUs and 128GB memory, with plenty of choice to power a wide range of web and application workloads. A la carte block storage is delivered by DreamBlocks™, the reliable, fast and distributed block storage service powered by Ceph. In addition, DreamCompute provides fully isolated, secure, programmable tenant networks, without the use of VLANs, and with full support for IPv6.
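
To show what working with an OpenStack cloud of this kind looks like from a developer's point of view, here is a minimal sketch using the openstacksdk Python client. The cloud entry, image, flavor, and network names are placeholders for illustration, not actual DreamCompute values.

    import openstack

    # Credentials come from clouds.yaml or OS_* environment variables;
    # "dreamcompute" is a placeholder cloud entry, not an official name.
    conn = openstack.connect(cloud="dreamcompute")

    # Boot an instance on an isolated tenant network.
    server = conn.create_server(
        name="demo-web-01",
        image="ubuntu-22.04",       # placeholder image name
        flavor="1vcpu-1gb",         # placeholder flavor name
        network="private-network",  # placeholder tenant network
        wait=True,
    )

    # Create and attach a 50 GB block-storage volume (Ceph-backed in this model).
    volume = conn.create_volume(size=50, name="demo-data")
    conn.attach_volume(server, volume)

    print(server.name, server.status)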

DreamCompute Beta is complemented by DreamHost’s generally available object-based cloud storage service, DreamObjects™, together delivering a powerful, programmable cloud platform for entrepreneurs and developers to build and launch scalable websites and applications in the cloud.

“Since the early days of web hosting, DreamHost has focused on enabling entrepreneurs and developers to experiment, explore and innovate using the best open source web application software, programming languages, frameworks and tools,” said Simon Anderson, CEO of DreamHost and Board Member of the OpenStack Foundation. “OpenStack and Ceph are transformational open source software projects for building the world’s best cloud computing services, and DreamCompute and DreamObjects will enable the next generation of developers to rapidly build and scale disruptive web applications for a global audience.”

Supporting Quotes


“Inktank is excited to support DreamHost’s expansion of the company’s Ceph-based cloud computing offering,” said Bryan Bogensberger, CEO of Inktank. “DreamHost is leveraging Ceph’s tunable reliability, scalability and efficient manageability at scale to deliver extremely cost-competitive block storage. By combining OpenStack and Ceph, DreamHost is now aggressively proving its leadership in the cloud computing industry!”

About DreamHost

DreamHost is a global web hosting and cloud services provider with over 375,000 customers and 1.3 million blogs, websites and apps hosted. The company offers a wide spectrum of web hosting and cloud services including Shared Hosting, Virtual Private Servers (VPS), Dedicated Server Hosting, Domain Name Registration, the cloud storage service DreamObjects, the cloud computing service DreamCompute, and the managed WordPress service DreamPress. Please visit for more information or follow us on Twitter at @DreamHost.

About Inktank

Inktank is the company delivering Ceph—the massively scalable, open source, software-defined storage system. Launched by some of the leading developers of Ceph, Inktank is dedicated to helping organizations fully leverage the transformative power of Ceph to decrease storage costs, increase operational flexibility and help free them from restrictive and expensive proprietary storage systems. Inktank provides best-in-class professional services and support offerings to enterprises, service providers, and cloud platforms.  Please visit for more information or follow us on Twitter at @inktank.

About Ceph

Ceph, The Future of Storage™, is a massively scalable, open source, software-defined storage system that runs on commodity hardware. Ceph has been developed from the ground up to deliver object, block, and file system storage in a single software platform that is self-managing, self-healing and has no single point of failure. Consuming Ceph storage is easy because it is in the mainline Linux kernel and has many userland implementations. For example, Ceph can be accessed through the kernel, via Hadoop, Samba and FUSE, via the S3 and Swift APIs, and via the OpenStack™ and Apache CloudStack cloud operating systems. Because of its open source, scalable, software defined storage architecture, Ceph is an ideal replacement for legacy storage systems and object and block storage for cloud computing environments. Please visit for more information.
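
As a concrete illustration of the S3-compatible access mentioned above, the following sketch uses the boto3 Python library to talk to a Ceph RADOS Gateway endpoint. The endpoint URL and credentials are placeholders.

    import boto3

    # Point a standard S3 client at a Ceph RADOS Gateway endpoint.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://objects.example.com",  # placeholder Ceph/RGW endpoint
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # Ordinary S3 calls work unchanged against the Ceph backend.
    s3.create_bucket(Bucket="demo-bucket")
    s3.put_object(Bucket="demo-bucket", Key="hello.txt", Body=b"Hello from Ceph")
    print(s3.get_object(Bucket="demo-bucket", Key="hello.txt")["Body"].read())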


10 Steps To Success With The Cloud

The pace of cloud computing shows no sign of letting up, with the likes of Salesforce, Amazon Web Services and Workday expanding rapidly and gaining market value.


Utilising cloud services allows companies to scale, drive IT costs down and become more agile. However, to keep cloud usage in check and systems running smoothly, there are a number of factors that should be considered:

1) Assess what business infrastructure can be outsourced.

Before considering the cloud, make sure you know which business infrastructure, processes and systems you’re willing to outsource. Whether it’s email or data storage, having an on-premise solution gives you direct control, so in the event of downtime you are able to act.

2) Calculate the savings.

When it comes to justifying the cost of implementing cloud services, it’s crucial to be sure you’re going to make a saving. Replacing and maintaining hardware can be a costly expenditure and the cloud can remove much of that cost; however, cloud-based services need licences, and the cost of multiple licences can quickly mount up.

3) Be aware of compliance issues.

When data is moved between an internal network and cloud storage, it is important to know how the data is going to be stored and secured. Laws such as the UK’s Data Protection Act protect personal information from misuse, while in the US all publicly traded companies must comply with Sarbanes-Oxley, which covers financial systems and forces SSL encryption policies upon finance departments.

4) Understand the differences between the private, public and hybrid cloud.

There are a few key distinctions between the differing cloud types. A public cloud is just that: public; SaaS providers use a public cloud to offer their services. A private cloud is hosted internally for employees and gives a high level of security. A hybrid cloud combines the two and can allow for the most effective deployment, such as a business using public cloud computing resources to scale more quickly.

5) Get an SLA with your cloud provider.

Downtime affects everyone, but having a service level agreement with a cloud hosting company can ensure that if downtime does happen, your business is back up and running as quickly as possible. Applications that are migrated to the cloud are expected to perform at the same level as on dedicated hardware, if not better, so an SLA is key to keeping performance maximised.

6) Create Cloud policies.

Policies for how the cloud should be used within a company are important to make sure resources are used correctly, whether for data storage, file sharing or shared web services.

7) Be aware of all the security implications.

The cloud has security implications that need to be addressed. Endpoint security can be put in place to monitor information as it travels outside of an organisation and reduce the risk of data leakage. Heavy fines can be levied for data leaks, so it’s important to take the appropriate measures to reduce the risk.

8) Monitor usage of the cloud and the cloud itself.

Whether your main use of the cloud is public or private, it’s important to monitor both the status of the implementation and the usage it is experiencing. Monitoring can give you greater insight into what is happening on the network and allows you to foresee future issues or trends, such as downtime or server latency (a minimal monitoring sketch follows this list).

9) Manage users and manage licenses.

The cloud brings with it a whole host of issues surrounding users, from multiple logins for several different SaaS platforms to individual licensing management. Making sure that licenses are used effectively across teams is an important step in making the most effective use of the cloud. Management tools that allow for single sign-on are a worthy investment but come with additional risks that should be taken into consideration.

10) Keep up-to-date with changes and improvements from providers.

SaaS providers are now in a race to the top: functionality that used to arrive through a physical update can now be rolled out across the network, giving users the latest and greatest functionality immediately. Keeping up with these changes helps make sure you’re using your cloud-based applications to the best of their ability.
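
To make the monitoring advice in step 8 concrete, here is a minimal sketch of a latency check against a cloud-hosted endpoint. The URL and threshold are placeholders; a production setup would use a dedicated monitoring tool rather than a one-off script.

    import time
    import urllib.request

    ENDPOINT = "https://app.example.com/health"  # placeholder endpoint
    THRESHOLD_SECONDS = 2.0                      # placeholder alert threshold

    def check_latency(url):
        """Return the time taken to fetch the URL, in seconds."""
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        return time.monotonic() - start

    if __name__ == "__main__":
        latency = check_latency(ENDPOINT)
        status = "OK" if latency < THRESHOLD_SECONDS else "SLOW"
        print(f"{ENDPOINT}: {latency:.2f}s ({status})")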

While there is no silver bullet for getting the most from the cloud, these steps can make implementation, deployment and day-to-day usage easier for your business.

By Brian King,

This article was written by Brian King, Digital Marketing Manager at Opsview, a leading network performance monitoring company.


The Lighter Side Of The Cloud – Birds & The Bees


By David Fletcher

All David Fletcher comic images on this site are owned by CloudTweaks. If you would like to reuse them on your social media network, please feel free to do so as long as there is a clearly defined link to the original comic source. If you would like to use them in a newsletter, in print, in a PowerPoint presentation or on a website, please contact us regarding the licensing details and policies.

What The Gartner Big Data 2013 Report Means For The Industry

Gartner’s 2013 Big Data survey was recently released, and it backs up some long-held beliefs about Big Data with survey evidence.

The survey itself was primarily focused on companies currently working with Gartner. Despite this, the number and variety of respondents is broad enough to give a realistic and representative picture.

One of the most striking results to come out of the survey was that 64% of companies were planning or have already implemented Big Data systems. This is a significant number, as it shows that there is a genuine drive within companies to adopt the technology. That more than half of companies are looking at the ways they use data and at new data initiatives can only be a positive for industry growth.



This is especially pertinent when discussing the future of the industry, as this 64% represents overall plans or implementations, but only 8% of those who are planning on implementing have actually made moves towards it. The 92% gap represents a large list of potential clients for consultants, cloud database products and technology providers.

The 92% potential growth shows not only the willingness of companies to implement, but also the future business that will give Big Data longevity over the next decade. Far from being a flash in the pan, with this kind of demand and potential revenue investments become safer, and the industry therefore has a better foundation on which to grow.

Banking and media are the industries that have made the biggest strides in this area. That these are also two of the traditionally rich industries is unsurprising, but the banking sector in particular shows that there is a business case for utilising this technology and a definite profit to be made from it.

So why are these companies looking to implement? 

There have been countless success stories of companies using Big Data to make billions; the Facebooks, Googles and YouTubes of this world have set a strong precedent for how correct data usage can have a significant impact on revenues. It is therefore unsurprising that many companies are looking at the ways these behemoths have used their data.

In each of these cases it has been about customer experience, putting the correct information in front of them at the correct time to create money making situations. This aligns with the survey findings, where the most important reason for wanting to implement Big Data systems is the improvement of customer experience.

According to the survey though, organisations struggle most with knowing how to get value from Big Data. This means that although the merits of a successful implementation are evident from the success of several companies, the individual company rewards from this are difficult to find.

A lack of knowledge in value creation is fed by one of the main stumbling blocks for new data implementations: talent acquisition. This is a key concern for many companies, as converting meaningful data analysis into actionable business strategies is difficult and requires the kind of skills that are hard to find in such a new and quickly growing industry.

Through discussions with companies looking at implementation, Gartner has found that despite the drive towards new data initiatives, trends are uncovered during the experimentation stage but the skills are not available to turn them into significant business gains. Leaders therefore become reluctant to move forward when the tentative first steps promised much but delivered little.

What are people doing with data? 

Despite the hype around new types of data such as text, image and sensor data, the vast bulk of data collected is still transactional in nature. This is testament to the customer experience focus of many companies today, who can associate individual customer actions with viable business processes.

Gartner’s studies have revealed that there is an increase in the use of machine gathered information such as sensor fed data collection. With the increased use of sensors within electronic devices this is likely to only increase in future.

Overall, the Gartner report gives backing to many of the widely held beliefs within the industry. As the famous saying goes, ‘if it cannot be measured it cannot exist’, and this report puts numbers behind assumptions and gives genuine weight to what industry insiders have been saying for a long time.

The report will give companies looking to implement a real incentive to put serious investment behind Big Data service initiatives, whilst also giving practitioners the confidence to expand. We are told that one of the main reasons startups fail is expanding too aggressively and too quickly, creating unsustainable business models with no genuine numbers behind future business. This report will hopefully give these companies the confidence to go ahead and expand to fill what will be a shortfall in the market within the next decade.

By Gil Allouche

Gil Allouche is the Vice President of Marketing at Qubole. Gil began his marketing career as a product strategist at SAP while earning his MBA at Babson College and is a former software engineer.

Better Data Risk Mitigation For SaaS Providers


We live in a world that is rife with internal-controls breakdowns that result in security and data breaches, which can cause tremendous business and reputational damage for organizations.

The rise of cloud computing systems has now created an even greater need for organizations to develop the right controls to protect data that reside in the ‘cloud.’ Virtually every organization leverages Software-as-a-Service (SaaS) solutions – where data can be easily accessed through a web browser.

As most technology providers migrate away from larger enterprise data systems to the cloud, the door is opened to new vulnerabilities. With SaaS providers hosting vital client data, they need to provide the right level of assurance that their clients’ sensitive data resides in a highly trusted environment.

Created by the American Institute of Certified Public Accountants (AICPA), Service Organization Control 2 (SOC 2) reporting allows any SaaS provider to mitigate risk when it comes to managing sensitive customer data in a virtualized environment.

Going through a SOC 2 security audit and receiving a favorable report allows SaaS providers to build in a level of controls and trust in their relationships with clients. The challenge, however, is that many SaaS providers are unaware of SOC 2 reporting, and not having an audit completed can cause significant business damage.

In addition, it is often the SaaS providers’ clients who inquire about SOC 2 reporting, and an “I don’t know” response does not provide clients with the critical assurance that they seek.

Fortunately, there are new tools that help SaaS providers determine their readiness to undergo a SOC 2 security audit and gain a ‘clean opinion.’

As more organizations are seeking support from SaaS providers, we will continue to see the true value of cloud computing emerge for any business sector. Providing the right level of assurance is critical for SaaS providers to further grow their businesses, and the little secret of undergoing a SOC 2 audit is now out of the bag.

Now is the time to make sure that all of your clients’ data resides in a truly trusted environment, and there are solutions for meeting this goal.

By Paul L. Shifrin, CPA, Director of Audit Services at SC&H Group

Paul directs SC&H’s SOC/SSAE 16 auditing practice, providing companies with audit services for their outsourcing of key components of their clients’ internal controls.


How Cloud Services Have Changed The Mindset Of Outsourcing

The business of information technology outsourcing has been dominated by sizable companies such as IBM, CSC, ASC and others that provide a comprehensive line card of IT services. These mammoth outsourcers were viewed as a one-stop shop delivering IT services ranging from level-1 help desk to level-3 network engineers and even C-level executives. This one-throat-to-choke mentality was popular with companies looking to outsource, as it was thought to replace the finger-pointing between multiple vendors with a single, unified IT solution. In some cases operational excellence was achieved and IT services improved while cost was contained. While there were success stories, many outsourcing arrangements were a failure based on the customer’s experience: costs often increased over the duration of the contract, while SLAs went unmet and response times grew.

Single Source Dominated

So why did these single-source providers dominate when there appeared to be so many disgruntled customers? These providers were uniquely capable of building IT teams with competence ranging from help-desk to CIO-level skills, on technologies from PCs to mainframes, an ability that required massive recruiting and training efforts beyond the reach of smaller companies.

Secondly, these outsourcing contracts were not only diverse in technical skill requirements but also geographically challenging. Many of the companies that chose to outsource had numerous locations dispersed not only across the United States but also internationally.


Only the large outsourcer with a global breadth of resources was able to support these complex opportunities.

Finally, most outsourcing contracts were a minimum of 5 years in duration and valued in the tens or hundreds of millions of dollars.  In order to win contracts of this size the providers needed to demonstrate financial stability and long-term viability.

As a result, companies seeking to outsource developed all-encompassing requests for proposals with the goal of selecting a single provider capable of delivering a full complement of IT services. By its very nature, this thought process eliminated the smaller outsourcers focused on fewer technologies and services, even though they were often better qualified in their areas of specialization than their larger counterparts.

The Problem

The larger outsourcers, through no fault of their own, often fell into the category of “jack of all trades, master of none”. Trying to be the best at everything in IT is a monumental challenge that very few, if any, can meet. Also, in order to assure profitability, these providers implemented common computing platforms, which benefited the customer by introducing stability while allowing the outsourcers to move employees between customers with a minimal learning curve. The downside was that these common platforms were often based on older technology, thereby stifling the customer’s ability to innovate with the newer technologies available.

The Shift in Thinking

The Cloud, as defined by NIST, “is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”.   So what does the Cloud have to do with outsourcing? The Software as a Service (SaaS) delivery method of the Cloud Model is the catalyst to the outsourcing mind shift.

SaaS is the capability provided to the consumer to use the provider’s applications running on a cloud infrastructure. Popular SaaS providers deliver targeted IT solutions to companies entirely from their managed cloud platforms, available to any end-user with Internet connectivity and a browser. These solutions are quickly provisioned, require minimal customer intervention and are available for a monthly fee. As more and more customers leverage SaaS solutions ranging from hosted Customer Relationship Management, Sales Automation, Accounting, Email, SPAM/AntiVirus and Archiving systems to countless other cloud-based software solutions, it becomes apparent that the IT group’s thinking is shifting. SaaS has even enabled a new outsourcing model called “Shadow IT”, in which lines of business contract directly for IT services without involving their internal IT departments. IT, now often driven by Shadow IT, is open to outsourcing specific IT solutions to best-of-breed providers. This is a significant change from the one-throat-to-choke mentality of the past.

Another driver of this shift is the Disrupters: companies that leverage the Cloud to quickly start up with nothing more than an idea. The relatively low cost, flexibility and speed with which the Cloud can provision IT services has allowed these new companies to bring a concept to market so inexpensively that business plans, and the funding they are often written for, are not necessary. Market research is no longer theoretical but delivered in real time based on social media feedback. The barriers to launching a business have been eliminated for these Disrupters. Often SaaS providers themselves, they leverage the other two Cloud delivery models, Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), to build and distribute their offerings.

The end result is that companies of all sizes leverage the Cloud to provide best-of-breed solutions, and organizations consume those solutions on a best-of-breed basis.

Benefitting The Corporations

As enterprises continue to utilize Cloud Services to fulfill their needs, Cloud-specific positions are appearing within these companies. At this time, many of these positions are dedicated to managing an organization’s Cloud Provider partners. As these positions become more prevalent and companies grow more comfortable delegating their IT functions across many Cloud Providers, they will get more granular with their IT outsourcing needs.

Also, these functions will not be limited to SaaS offerings. IaaS is a general platform being offered by Cloud Providers. It provides a cost-effective medium for companies to acquire compute and storage resources while providing the flexibility and scalability required to meet ever-changing business demands. Companies want to get out of “the server hardware and storage business” and spend their time focusing on solving business issues. Skilled IaaS providers, those with a systems integration background, are leveraging their IaaS platforms to deliver turnkey, outsourced solutions to their customers for a monthly fee.



Disaster recovery is a perfect example of such a turnkey solution. Disaster Recovery as a Service (DRaaS) replicates a customer’s mission-critical servers to a Cloud Provider’s IaaS. If the customer declares a disaster, the Cloud Provider activates the replicated servers at its location and turns them into the customer’s production site. The Cloud Provider is responsible for monitoring the replication, activating the servers to production and managing all other aspects of making the replicated environment accessible to the customer. Another example of a turnkey solution is Desktop as a Service (DaaS), where customers host their Windows desktops at a Cloud Provider, or arrangements where outsourcers leverage their IaaS offering to host and manage a customer’s entire IT infrastructure.
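
The DRaaS workflow described above boils down to two provider-side responsibilities: watching replication health and promoting the replicas once a disaster is declared. The sketch below is purely illustrative; the class, thresholds, and server names are hypothetical and do not represent any particular provider's tooling.

    from dataclasses import dataclass

    @dataclass
    class ReplicatedServer:
        name: str
        replication_lag_seconds: int
        active: bool = False  # True once promoted to production at the provider site

    def monitor(servers, max_lag_seconds=300):
        """Provider-side check: alert if replication falls too far behind."""
        for server in servers:
            if server.replication_lag_seconds > max_lag_seconds:
                print(f"ALERT: {server.name} is {server.replication_lag_seconds}s behind")

    def declare_disaster(servers):
        """Activate the replicas and make them the customer's production site."""
        for server in servers:
            server.active = True
            print(f"{server.name} promoted to production at the provider site")

    fleet = [ReplicatedServer("erp-db", 45), ReplicatedServer("erp-app", 600)]
    monitor(fleet)
    declare_disaster(fleet)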

This shift will allow smaller, more focused companies to win their share of the outsourcing business.  The customer will benefit by working with specialists in each area to fulfill their needs rather than settling for the bigger, often less skilled, single source solution.

By Marc Malizia,

Marc is the Chief Technology Officer and a founding partner of RKON Inc. As the CTO, he is responsible for designing and enhancing both RKON’s Professional and Cloud Service offerings. During his 15 years growing RKON, Marc has served as a pre-sales subject matter expert on technologies ranging from application delivery and security to Cloud and managed services. Marc earned a B.S. in Computer Science from the University of Illinois in 1987 and an M.S. in Telecommunication from DePaul University in 1992.
