Category Archives: Cloud Computing

Driverless Cars – The End Of The Human

We should have seen it coming. The decline of the relationship between human and car has been gathering pace for years. First we saw automatic transmission replacing manual gear changes, then cruise control removed the need for pedal pushing on long open roads, and finally GPS systems replaced the map. Now, it seems the next major revolution of the motor industry will be driverless cars, doing away with the human input altogether.

In truth, it’s probably a good thing. Despite what we tell ourselves, we aren’t very good drivers. We’re easily distracted, we crash, we break speed limits, and we run red lights. It is believed that of the 10 million annual road accidents in North America, 9.5 million are the result of driver error.

What role do cloud computing and big data have in this emergent industry? Could Google soon be driving you to work?

It appears that widespread adoption of driverless cars is now very much on the horizon. As of August 2012 Google had completed 500,000 miles of driverless testing, with not one reported accident or incident. Ironically, the only crash in which a driverless car was involved occurred near Google’s headquarters while a human was controlling the vehicle.

The technology is impressive. Using a 64-beam laser rangefinder mounted on top of the car, the vehicle produces a constant 3D model of its surroundings. This model is then correlated with existing high-resolution maps of the area, producing a wide variety of data sets that the car uses to drive itself.

The ‘car cloud’ has huge benefits to drivers aside from the removal of the physical need to operate the car. Imagine a situation where a crash has created long delays five miles in front of you. The car cloud would relay the information and details of the crash to the other cars in the area – instantly rerouting them via alternative routes. The cloud would also help ensure none of the alternative routes got too busy, streaming cars round a variety of different routes in the most efficient way possible.
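The rerouting described above is essentially a shortest-path problem recomputed whenever the cloud learns of a delay. Here is a minimal Python sketch of the idea; the road names and travel times are invented for illustration, and raising the cost of the congested road is what triggers the new route:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts road graph.
    graph[a][b] is the travel time in minutes from a to b."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical road network.
roads = {
    "home": {"highway": 5, "back_road": 9},
    "highway": {"office": 10},
    "back_road": {"office": 12},
}

print(shortest_route(roads, "home", "office"))  # -> (15, ['home', 'highway', 'office'])
roads["highway"]["office"] = 60                 # crash reported: big delay on the highway
print(shortest_route(roads, "home", "office"))  # -> (21, ['home', 'back_road', 'office'])
```

A real car cloud would also have to balance load so the alternative routes don’t themselves jam, but the core recomputation step looks like this.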

The huge amounts of data collected by the cars have both advantages and disadvantages for an owner. On the positive side, the data will help you with insurance claims. Until adoption of driverless cars nears 100 percent there will still be accidents, but because the car constantly models its environment and records every part of a journey in the car cloud, liability will be easy to apportion and prove.

On the negative side, there is a question of privacy. Who owns the data produced by the cars? Recording where, how and when the car was driven, and including information about the performance of parts within the car, the data is inherently valuable. Already there are examples of manufacturers asking buyers to sign a waiver that grants the manufacturer permission to use the car’s data, and producers are likely to stick with this process as the cars become mainstream. Younger drivers may not question the idea of signing away their privacy, but for older drivers it will certainly be a sticking point.

What’s your opinion? Are driverless cars set to make our roads safer? Who is responsible in an accident between two computer controlled cars? What is the future of the existing car industry? Let us know in the comments below.

By Daniel Price


Could Indie Gaming Grow Into A Challenge For Big Game Publishers?

I will be the first to admit, I am a Steam junkie. The sheer number of different games available for purchase or demo is staggering, providing choices no matter what budgetary constraints I am under. All of the industry staples are here, including giants like the Elder Scrolls, Bioshock and Total War franchises. No matter what mood you are in, you are sure to find tons of good titles to satisfy it using the Steam service.

Whilst poking around the other day looking for a new game to master, I noticed the Indie section for the first time. It caught my eye, mainly due to all of the hype that has surrounded this category of games in recent news. While big-developer games are still the reigning champs of the gaming world, indie producers are fast making serious inroads and claiming their slice of the pie.

What Are Indie Games?

Indie games set themselves apart from traditional games in that their creation and development is handled by private individuals or small teams. This is a sharp contrast to traditional games, which usually have hundreds of people working on them and millions of dollars spent creating them.

This production process yields games that, while usually less graphics-intensive than their mainstream counterparts, can actually have a heightened feel of creativity. In addition, because indie studios don’t carry the immense operating costs of big-game publishers, indie games are usually much less expensive to buy.

What Is Available Now?

If you want an example of just how successful an indie game can be, look no further than Minecraft. The designers of Minecraft took the simple activity of building and turned it into one of the most loved and lucrative games currently available.

Minecraft is not the only game that has caused industry experts and players alike to sit up and take notice. In fact, two staples of the mainstream gaming industry, Sony and Microsoft, have already professed their love for indie games. Each of these behemoths recently announced different features and packages designed specifically with the indie game in mind.

What Is on the Horizon?

Obviously, given that technology is changing and evolving each and every day, we are by no means at the pinnacle of possibility when it comes to indie games. In fact, independent developer Octav predicts that many current gaming models used by industry giants will fall by the wayside, clearing the way for more indie games to rise to prominence.

The creativity previously mentioned that is so prevalent in indie games is also expected to turn the tide in their favor. Big game developers take a narrow view when developing a game, usually constrained by the ever-present need to turn a profit. While indie game developers do want to make money, the driving force behind their creations is often simply turning a good idea into a cool game. This will lead to a much more diverse landscape of indie games to choose from.

While indie gaming has certainly begun to come into its own, the possibilities and potential it holds signal a true rise in prominence for this genre in the coming years. If you enjoy cloud gaming, as well as an inexpensive and diverse menu of options to choose from, indie gaming may well be your perfect niche.

By Joe Pellicone

The Truth About Clouds And Data: Can Your Storage Survive Climate Change?

Just like real clouds, cloud infrastructure is in constant flux. No cloud looks the same way twice. That’s the trouble with traditional storage architectures: they pretend that the cloud they serve either stays the same in a neat little box, or grows steadily in nice predictable spurts. As a data manager, you know that’s poppycock. Workloads are dynamic. Just like the weather, data storage demands are not as easy to predict as people think – and guessing wrong can be embarrassing, expensive, or even disastrous.

That’s why you need an environment that fits your business now and grows with you. Application-specific storage is out; scalable is in. Let’s compare. Sure, you can buy big now, but all that wasted processing power and capacity still costs you CAPEX. What if you start small? Fast forward: by the time you feel the growing pains and wait for a forklift upgrade, your performance is already suffering. With a modular, pay-as-you-grow approach, you immediately eliminate the top-heavy up-front investment. Meanwhile, your compute and storage capacity keep pace with your data and applications. That agility makes cloud service providers’ rates and SLAs more competitive.

The agility of modular, “grow-as-you-go” storage goes beyond flexible capacity. You can see it in operational efficiency. In days of yore (and maybe even in your days), companies were forced to use proprietary hardware solutions found in legacy storage: solutions that only allowed them to run one application at a time per physical server. Scalable environments, on the other hand, not only use industry standard hardware for just-in-time scaling, but also provide intelligent software to simplify and even automate many operational tasks, including reallocation. This allows providers to run different workloads on shared infrastructures. Such an environment can move with your data, and not just grow with it. Now isn’t that much more cloud-like?

Here are the two key capabilities that infrastructures must enable for cloud service providers to survive:

  • Independent scaling of compute and capacity. To address the rapid, unpredictable growth of real-world data and application demands, cloud service providers need the flexibility of a storage architecture that is immune to performance plateaus. Modular, scalable architecture allows the addition of performance and/or capacity as needed, so scale-on-demand cloud builders are not forced into a rip-and-replace upgrade.
  • The capacity to embrace dynamic workloads—not just endure them. Cloud service providers need the capacity to support highly dynamic workloads so they can deliver differentiated performance, resilience, and availability per workload across multiple applications. Free from the glass ceiling of pre-upgrade plateaus, scale-on-demand capacity enables providers to reconfigure storage as workload profiles change, and dynamically deliver the desired level of performance, resilience, and availability for new requirements.

Tomorrow’s cloud service models demand dexterity in capacity and performance. That’s why they cannot be shoehorned into legacy storage architectures that don’t fit properly and can’t grow without replacement.

To compete with large public clouds, builders of public and private clouds need radically different storage architectures that remain as flexible, dynamic, and scalable as the data they handle. Modular, pay-as-you-grow, scalable environments are much more capable of weathering the storms of change faced by cloud service providers and their customers. That’s what’s on the horizon for modern data centers, like yours.

By Gokul Sathiacama

Gokul is the senior director of product management for Coraid, responsible for the product strategy, definition and life cycle for Coraid’s EtherDrive Data Storage solutions. Prior to joining Coraid, Sathiacama held product management positions at Pillar Data Systems (acquired by Oracle) and Sun Microsystems, Inc. He holds a bachelor’s degree in computer engineering from the University of Southern California and a master’s degree from Pepperdine University.

10 Useful Cloud Security Tools: Part 1

Cloud computing has become a business solution for many organizational problems. But there are security risks involved with using cloud servers: service providers generally only take responsibility for keeping systems up, and security is often neglected. Therefore, it is important that clouds are properly penetration (pen) tested and secured to ensure the safety of user data.

There are many tools available that can be used to automate the process of pen testing. Most of them can be found in pen testing distributions like BackTrack or BackBox. Here is a list of recommended tools for pen testing cloud security:

Acunetix – Web Vulnerability Scanner


This information gathering tool scans web applications on the cloud and lists possible vulnerabilities that might be present in the given web application. Most of the scanning is focused on finding SQL injection and cross site scripting vulnerabilities. It has both free and paid versions, with paid versions including added functionalities. After scanning, it generates a detailed report describing vulnerabilities along with the suitable action that can be taken to remedy the loophole.

This tool can be used for scanning cloud applications. Beware: there is always a chance of false positives. Any security flaw, if discovered through scanning, should be verified. The latest version of this software, Acunetix WVS version 8, has a report template for checking compliance with ISO 27001, and can also scan for HTTP denial of service attacks.
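To make the scanning step concrete, here is a hedged Python sketch of the signature-matching idea such scanners rely on: after injecting a test payload (such as a stray single quote), look for tell-tale database error strings in the response. The signatures and function name below are illustrative, not Acunetix’s actual implementation:

```python
import re

# A few common database error signatures that scanners look for
# in responses after injecting a test payload (illustrative list).
SQL_ERROR_SIGNATURES = [
    r"you have an error in your sql syntax",   # MySQL
    r"unclosed quotation mark",                # SQL Server
    r"pg_query\(\): query failed",             # PostgreSQL
]

def looks_sql_injectable(response_body):
    """Return True if the response contains a known SQL error signature."""
    body = response_body.lower()
    return any(re.search(sig, body) for sig in SQL_ERROR_SIGNATURES)

print(looks_sql_injectable(
    "Warning: You have an error in your SQL syntax near ''' at line 1"))  # True
print(looks_sql_injectable("<h1>Welcome back!</h1>"))                     # False
```

Exactly as the article warns, a signature match like this is only a lead: it must be verified manually, since error text can appear for reasons unrelated to an exploitable flaw.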

Aircrack-ng – A Tool for Wi-Fi Pen Testers

This is a comprehensive suite of tools designed specifically for network pen testing and security. This tool is useful for scanning Infrastructure as a Service (IaaS) models. Having no firewall, or a weak firewall, makes it very easy for malicious users to exploit your network on the cloud through virtual machines. This suite consists of many tools with different functionalities, which can be used for monitoring the network for any kind of malicious activity over the cloud.

Its main functions include:

  • Aircrack-ng – Cracks WEP or WPA encryption keys with dictionary attacks
  • Airdecap-ng – Decrypts captured packet files of WEP and WPA keys
  • Airmon-ng – Puts your network interface card, such as an Alfa card, into monitoring mode
  • Aireplay-ng – A packet injection tool
  • Airodump-ng – Acts as a packet sniffer on networks
  • Airtun-ng – Creates virtual tunnel interfaces
  • Airolib-ng – Acts as a library for storing captured passwords and ESSIDs
  • Packetforge-ng – Creates forged packets, which are used for packet injection
  • Airbase-ng – Used for attacking clients through various techniques
  • Airdecloak-ng – Capable of removing WEP cloaking

Several other tools are also available in this suite, including easside-ng, wesside-ng and tkiptun-ng. Aircrack-ng can be used from both the command line and a graphical interface. The GUI version is named Gerix Wi-Fi Cracker, a freely available network security tool licensed under the GNU GPL.

Cain & Abel

This is a password recovery tool. Cain is used by penetration testers for recovering passwords by sniffing networks, brute forcing and decrypting passwords. It also allows pen testers to intercept VoIP conversations that might be occurring through the cloud. This multi-functional tool can decode Wi-Fi network keys, unscramble passwords, discover cached passwords, etc. An expert pen tester can analyze routing protocols as well, thereby detecting any flaws in the protocols governing cloud security. The feature that separates Cain from similar tools is that it identifies security flaws in protocol standards rather than exploiting software vulnerabilities. This tool is very helpful for recovering lost passwords.

In the latest version of Cain, the ‘sniffer’ feature allows for analyzing encrypted protocols such as SSH-1 and HTTPS. This tool can be utilized for ARP cache poisoning, enabling sniffing of switched LAN devices, thereby performing Man in the Middle (MITM) attacks. Further functionalities have been added in the latest version, including authentication monitors for routing protocols, brute-force for most of the popular algorithms and cryptanalysis attacks.


Ettercap

Ettercap is a free and open source tool for network security, designed for analyzing computer network protocols and detecting MITM attacks. It is often used alongside Cain. This tool can be used for pen testing cloud networks and verifying leakage of information to an unauthorized third party. It has four modes of operation:

  • IP-based scanning – Packets are filtered based on their IP addresses.
  • MAC-based scanning – Packets are filtered based on their MAC addresses; this is useful for sniffing connections through a gateway.
  • ARP-based functionality – ARP poisoning is used to sniff a switched LAN through an MITM attack operating between two hosts (full duplex).
  • Public-ARP-based functionality – In this mode, ettercap uses one victim host to sniff all other hosts on a switched LAN (half duplex).

John the Ripper

The name of this tool was inspired by the infamous serial killer Jack the Ripper. It was written by Black Hat Pwnie winner Alexander Peslyak. Usually abbreviated to just “John”, this freeware has very powerful password cracking capabilities and is highly popular among information security researchers as a password testing and breaking tool. It is capable of brute forcing cloud panels; if a security breach is found, a security patch can then be applied to secure enterprise data.

Originally created for UNIX platforms, John now has supported versions for all major operating systems. Numerous password cracking techniques are embedded into this pen testing tool to create a concise package that is capable of identifying hashes through its own cracker algorithm.
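John itself supports many hash formats and word-mangling rules; the core dictionary-attack idea it builds on can be sketched in a few lines of Python. SHA-256 and the sample wordlist below are chosen purely for illustration:

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Try each candidate word; return the one whose SHA-256 digest
    matches the target hash, or None if the wordlist is exhausted."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# A leaked (unsalted) SHA-256 hash of a weak password.
leaked = hashlib.sha256(b"letmein").hexdigest()
print(dictionary_attack(leaked, ["password", "123456", "letmein", "qwerty"]))  # letmein
```

Real crackers add salts, mangling rules, and incremental brute force on top of this loop, but the principle, hash each guess and compare, is the same.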

Cloud vendors need to embed security within their infrastructure, rather than emphasizing high uptime at the expense of security.

By Chetan Soni

The Challenges Of Multi-tenancy

Regarded as one of the most important features of cloud computing, multi-tenancy is a key common attribute of both public and private spaces. It applies to all three layers of a cloud (IaaS, PaaS and SaaS) and refers to a software architecture design in which a single instance of a software application serves multiple customers.

Multi-tenancy architecture has many benefits over multi-instance architecture. It is often cheaper to run because software development and maintenance costs are shared, updates are faster because the provider only has to make changes once, and it is easily scalable. Nonetheless, running software for a large number of tenants still presents challenges – what are they?
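As a rough illustration of the single-instance model, here is a toy Python sketch (all names hypothetical) in which one shared store serves every customer, with each query scoped by a tenant identifier:

```python
class MultiTenantStore:
    """One shared instance; every record is tagged with a tenant_id,
    and every query is scoped to exactly one tenant."""

    def __init__(self):
        self._rows = []  # one shared table for all tenants

    def insert(self, tenant_id, record):
        self._rows.append({"tenant_id": tenant_id, **record})

    def query(self, tenant_id):
        # Tenant scoping: forgetting this filter is exactly the kind of
        # human error that leaks one customer's data to another.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

store = MultiTenantStore()
store.insert("acme", {"invoice": 1})
store.insert("globex", {"invoice": 7})
print(store.query("acme"))  # only Acme's rows come back
```

Real multi-tenant databases enforce this scoping with row-level security or per-tenant schemas rather than an application-level filter, but the isolation requirement is the same.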



Security

Software providers will naturally argue that their software is protected with the highest level of security available and that a company’s data is more secure than ever on their servers. Nonetheless, there is scope for human error – a database administrator might accidentally grant access to an unauthorized person or contravene the security policy of an organisation.

There is also the threat of hackers – no matter how secure an encryption scheme is, it can always be broken with the right knowledge. A hacker who breaks the encryption of a multi-tenant database will be able to steal the data of the hundreds of businesses that have data stored on it.

Capacity Optimization

Database administrators need the tools and the knowledge to understand which tenant should be deployed on which network in order to maximise capacity and reduce costs. This process is further complicated by the need to continuously align capacity with business demand, which requires providers to manage the actual and forecasted resource utilization of all their servers.

Service Delivery and High Availability

When failures occur, or when certain services generate abnormal loads, service delivery can be interrupted – yet business clients will often request high availability, typically 99.999 percent. Therefore, monitoring service delivery and availability is critical to ensure that the service is properly delivered and meets SLAs. Without effective monitoring, problems are hard to locate and downtimes increase – often leading to lost revenue.



According to Librato CTO and co-founder Joseph Ruscio, “modern IT environments are incredibly dynamic and their operators require sophisticated alerting capabilities”. He believes effective monitoring can be the solution for successfully managing the ever changing IT landscape and thus many of the challenges of multi-tenancy.

Ruscio’s company, the San Francisco-based Librato, offers clients a secure, stable and resilient platform that has been optimised for time series data analytics. It allows users to see all the metrics required to track the health of web-scale applications, enabling them to quickly find the cause of unexpected patterns and events. The software accepts both a company’s operational metrics and any additional custom metrics via a REST API, and presents all the information in a web-based application that is highly detailed and easy to use.

Indeed, the company has recently launched a brand new alerting platform that it hopes will provide a framework for industry-leading new features. Amongst other features, users can now be alerted on application-level SLAs, when a source stops reporting, and when all data points in a given duration exceed a threshold.

Librato is rapidly becoming one of the ‘must-haves’ of multi-tenant architecture monitoring – a view echoed by Scott Turnquest, application developer at ThoughtWorks. He says, “Librato is one of the most important live dashboards that we have running in our team room. By watching out for particular trends, we’re usually able to be proactive about issues before they affect customers”.

Turnquest’s quote undoubtedly highlights the key reason for using an effective monitoring solution – addressing problems before they reach the customer. It means less downtime, reduced costs, improved client feedback, a better reputation in the marketplace, and improved business prospects long term. Ultimately, a high quality cloud monitoring tool such as Librato will help administrators of multi-tenant architecture improve its security, capacity optimisation, service delivery, and high availability by helping them configure problem detection and perform root-cause analysis.

What do you think are the challenges of multi-tenant architecture? What about solutions? Do you use an effective monitoring tool? Let us know in the comments below.

By Daniel Price

Post Sponsored By Librato

How Haptic Technology Could Revolutionize The Cloud Gaming Experience


Haptic Technology

When it comes to picking a word to describe the pinnacle of the online gaming experience, in this author’s humble opinion, that word would have to be “realistic.” Sure, plenty of other attributes are important to the gaming world in general, and the online gaming world in particular. However, no Holy Grail is more important than finding the most realistic gaming experience imaginable.

This quality is actually more important in cloud gaming situations than in single-player games. Granted, most people who play video games like to experience a realistic environment in the games they play. However, given that much of the efficacy of cloud play is based on how quickly you react to, and interact with, your surroundings, the tiny details found in more realistic games usually give the gamer an edge.

So how do we further increase the realism found in the cloud games we play? Graphics and sound have reached pinnacles unseen in years past, contributing greatly to the level of realism we enjoy. There is, however, one area that has garnered an exceptional amount of buzz from recent innovations that could very well be the answer to the ever-present quest of supreme realism: haptic technology.

What Is Haptic Technology?

Haptic technology is defined as “a tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user.” Basically what this means is that using different types of vibrations and motions, haptic technology can mimic one of the areas neglected in most current game hardware: the tactile.

What We Have Today

Haptic technology has actually been around for quite a while, with the first patent based on it being filed in the 1970s. Unfortunately, the tech hardware and software available at the time were in no way sophisticated enough to fully utilize this new innovation. As a result, similarly to what has happened with all types of inventions unsupported by what was currently available, it was shelved.

One of the first popular uses of haptic technology in the gaming industry was the Sony DualShock controller. This controller differed from all other offerings at the time by providing a vibratory feeling whenever predefined actions occurred in the game. Instead of just seeing the gun you are using, now you can feel it buck in your hand in tandem with the shots you fire.

Another exciting product just starting to show its face on the mainstream consumer market is the haptic glove. Also known as wired gloves or data gloves, haptic gloves take the premise first established by products like the DualShock and enhance it by an order of magnitude. The number of different ways to customize these gloves, such as the Keyglove, is astounding.

What Is On the Horizon?

If you think the DualShock or the Keyglove is cool, imagine if you had the capacity to feel haptic feedback throughout your entire body. Sound too good to be true? A new Kickstarter project, named the ARAIG, says otherwise.

The ARAIG, which stands for As Real As It Gets, is a suit that you wear, containing 48 different haptic feedback sensors placed strategically throughout it. This allows individual sensations to be felt exactly where they should be. Things like the impact of a bullet, the explosion of a bomb, or even pouring rain can be felt exactly where you would feel them in a real-world scenario.

Given that a split second can turn a killstreak into a fatality, having every possible way to experience things more quickly can only serve to enhance the game you are playing, as well as its playability. Haptic technology can provide this advantage, potentially turning you into the gamer only found in your dreams.

By Joe Pellicone


Challenges Faced By Cloud Security



Cloud computing has revolutionized the way businesses manage their data. The amount of data produced by the corporate sector has increased at a rapid rate over the past few years. In order to handle this exponential need for storage space, organizations need a reliable and secure approach they can use to optimize their operations, which in turn will reduce costs. Cloud computing provides suitable development environments, rapid resources for operating platforms, application environments and backup and storage of data at low costs. But some of the factors that make cloud computing such a convenience for managing resources also raise considerable security concerns.

Challenges Faced by Cloud Security

Cloud computing inherits the security issues present in the technologies it uses, chief among them the risk of a breach of the integrity or confidentiality of information. One security measure is encrypting stored data, but encryption has drawbacks and does not always protect data. This presents a very challenging situation for cloud security professionals. Seven of these challenges are discussed below:

1) Breach of Trust

In cloud services, it is very important that the service provider has the trust of its customers and does not exploit that trust in any way. There is no way to be 100% sure that your cloud service provider is trustworthy. There are certain legal issues entangled with cloud security as well, because cloud service providers must comply with certain laws, and these laws vary from country to country. Users have no idea of, or control over, where or in what jurisdiction their data is physically stored in the cloud.

2) Maintaining Confidentiality

Maintaining confidentiality means preventing improper disclosure of information. Service providers have full access to your data, so they have the opportunity to misuse it. This issue requires proper attention from an information security analyst to ensure your data is not being shared without your permission.

3) Preserving Integrity

Preserving integrity means preventing unauthorized modification of data or its instances. Users with privileged access to your data can easily modify it unless it is encrypted – and one entity with such privileges is the cloud service provider itself. Preserving the integrity of data in the cloud remains a real challenge for security researchers.

4) Authenticity and Completeness

In a cloud, there may be multiple users with varying levels of access privilege to your data. A user with limited access may see only a subset of the data, but he needs assurance that this subset is valid and verified. Digital signatures are used to provide validation – proof that the subset authentically belongs to the larger data set. Certain approaches inspired by Merkle trees and signature aggregation are used for digital validation of data, but vulnerabilities remain in this area of cloud security.
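As a rough sketch of the Merkle-tree approach mentioned above, the following Python example (illustrative only, using SHA-256) computes a root hash over a set of data blocks; tampering with any block changes the root, which is how a verifier holding only the root can detect invalid data:

```python
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute a Merkle root over the leaf hashes, duplicating the
    last node when a level has an odd number of entries."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
root = merkle_root(blocks)

# Tampering with any single block changes the root.
tampered = [b"block-0", b"block-X", b"block-2", b"block-3"]
print(root != merkle_root(tampered))  # True
```

In practice a verifier checks a subset using a logarithmic-length proof path rather than rehashing everything, but the tamper-evidence property is the one shown here.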

5) Risk Factors Associated with Virtual Machines

In a typical cloud model, application processes run inside virtual machines. These virtual machines sit on a shared server alongside other virtual machines, some of which may be malicious. Security researchers have shown that attacks from one virtual machine on another are possible, so cloud security experts consider this a serious issue.

6) Vulnerabilities from Shared Resources

Cloud data running on multicore processors is vulnerable to application data being compromised because, as research has shown, applications can communicate across cores and may exchange data as well. With the multi-tenancy architecture of a cloud server, in which many applications are hosted on the same server, it is always possible for malicious users to intercept data from the network channel.

7) Issues with Encryption

Although encrypting data seems like the solution for preserving confidentiality, integrity and authenticity in the cloud, the approach has shortcomings. For one, it is not cost effective: decrypting data adds an enormous amount of computational time to processing. Each time a query runs against the database, both cost and time increase dramatically, especially if the amount of data is very large. Encryption algorithms are also subject to being broken over time. Cloud security professionals face the challenge of continually reinforcing this technique.
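The query-cost problem can be illustrated with a toy sketch: when rows are stored encrypted and there is no searchable index, a lookup must decrypt every row it scans, so cost grows linearly with table size. The XOR "cipher" below is deliberately trivial and insecure; it only stands in for a real cipher to show the cost model:

```python
# Toy XOR "cipher" purely to illustrate the cost model -- NOT secure.
KEY = 0x5A

def encrypt(s):
    return bytes(b ^ KEY for b in s.encode())

def decrypt(c):
    return bytes(b ^ KEY for b in c).decode()

rows = [encrypt(name) for name in ["alice", "bob", "carol", "dave"]]

def find(encrypted_rows, target):
    """With encrypted storage and no searchable index, a query must
    decrypt row after row until it hits a match."""
    decryptions = 0
    for c in encrypted_rows:
        decryptions += 1
        if decrypt(c) == target:
            return decryptions
    return decryptions

print(find(rows, "carol"))  # 3 rows decrypted just to find one match
```

Techniques like searchable or homomorphic encryption aim to avoid this full decrypt-and-scan, but they bring their own cost and security trade-offs.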

Cloud computing can be used to carry out a wide variety of IT functions, and securing the cloud is not an easy task for cloud security professionals given the range of concerns involved.

There are many benefits to cloud computing. It provides a viable means of building cost-effective solutions that are substantially flexible. By using virtual servers on the internet, cloud computing provides easy delivery platforms for serving businesses and eases the burden of more expensive consumer IT services.

However, there are serious risks of integrity and confidentiality for data shared on a cloud. This is because required services are often outsourced from a third party, which makes it difficult to ensure security and privacy of data.

Security professionals still need to deal with the architectural flaws of the cloud computing model so that cloud computing can be made more reliable and trustworthy.

By Chetan Soni

Mozilla’s Decision To Promote Brendan Eich To CEO Inspires Boycotts



Mozilla has announced Brendan Eich will take over as the long-term replacement for interim CEO Jay Sullivan, who is leaving to “pursue new opportunities”. Sullivan originally stepped in for former CEO Gary Kovacs, who left in April 2013.

While working for Netscape in 1995, Eich invented JavaScript, which became the most widely used programming language for webpages and internet applications. Eich co-founded the Mozilla project in 1998 and was promoted to CTO of Mozilla in 2005.

His promotion to CEO has drawn criticism from gay and human rights activists. This is due to controversy that arose in 2012, when it was revealed that Eich had donated $1,000 to a campaign supporting Prop 8 in California, a ballot measure that denied gay and lesbian couples the right to marry. Prop 8 passed, but was later struck down in federal court, and that ruling was left in place by the United States Supreme Court.

Many users, now former users, of Mozilla services have begun boycotting its products, including the popular Firefox web browser. Hampton Catlin, who co-founded his company, rarebit, with his husband, wrote an open letter to Mozilla informing them he would no longer develop or test his apps on Firefox.

Box Goes Public with IPO

After over a year of anticipation, sparked by a January 2013 interview in which Box’s 29-year-old CEO Aaron Levie announced eventual plans to go public, on Monday afternoon Box filed the necessary paperwork with the United States Securities and Exchange Commission to sell shares to the general public. Box shares will trade on the New York Stock Exchange under the ticker symbol BOX.

According to its S-1 filing, as of January 31st, 2014, Box had more than 25 million registered users, over 34,000 businesses paying to use its services and 972 employees. Box reported revenues of $124.2 million in 2013, but also notable losses of $50.3 million, $112.6 million and $168.6 million in 2011, 2012 and 2013, respectively.

Eight banks, including Morgan Stanley and JP Morgan Chase, worked with Box on the initial offering. It is a dual-class offering: shareholders who buy shares after Box goes public will receive one vote per share, while company insiders who acquired shares before the company went public will receive ten votes per share. Box expects to raise $250 million in its first month on the stock exchange.
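
The effect of the dual-class structure on control is simple arithmetic. The share counts below are hypothetical (Box's actual post-IPO share breakdown is not given in the article); only the 1-vote and 10-vote ratios come from the offering terms:

```python
# Dual-class voting math: Class A (public, 1 vote/share) vs.
# Class B (insiders, 10 votes/share). Share counts are made up
# purely to illustrate how the structure concentrates control.
public_shares = 12_500_000    # hypothetical post-IPO public float
insider_shares = 80_000_000   # hypothetical pre-IPO insider holdings

public_votes = public_shares * 1
insider_votes = insider_shares * 10
total_votes = public_votes + insider_votes

print(f"insiders hold {insider_shares / (public_shares + insider_shares):.1%} "
      f"of shares but {insider_votes / total_votes:.1%} of votes")
```

Even when insiders hold a minority of the economic stake, the 10:1 ratio can leave public shareholders with almost no say in governance, which is exactly why dual-class IPOs draw scrutiny.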

By Adam Ritche
