Category Archives: Cloud Computing

Data Breaches: Incident Response Planning – Part 2

Incident Response Planning – Part 2

Continued from Part 1… As an estimated 50 million consumers had yet to be informed more than a month after the breach discovery, a Senate health committee had to intervene. But that wasn’t the end of Anthem’s missteps — customers waited days for a call back after calling a dedicated phone line…

What Post-Breach Response ‘Should’ Have Looked Like

As Verizon so aptly observed in its soldier analogy, it’s challenging to defend your perimeter if you don’t know what to expect. There’s no doubt that some of the incident-response scenarios that played out in the public eye would have unfolded differently if the companies had been better prepared not only to address a breach but also to plan for a response at the right scale.

In eBay’s case, for example, knowing that there is no such thing as foolproof security might have led the company to an “assume compromise” philosophy. That means having a clear understanding of where the data resides and what risk each category of data is exposed to, based on which systems are compromised. Refusing to give an estimate a week after a breach is the first ingredient in the recipe for a PR disaster.


Social Media Voice

The second ingredient in that recipe is ignoring your own social media channels — eBay’s reaction should have been immediate in urging customers to change passwords, with a promise of more information to come as soon as details were available. One component of a communications plan in a crisis like a data breach is a handful of pre-approved templates, with ready-to-go messaging, that can be immediately disseminated to stakeholders. These messages need not alarm customers, but they should state transparently that a potential breach is being investigated and that, as a precaution, customers should change their passwords for their own protection.

Another channel that eBay should have quickly used was its own website — and not by posting confusing, hard-to-see banners. That same collection of templates in the crisis-communication plan should have included a succinct but transparent message about the potential breach and what the company was doing to secure customers’ information.

The great thing about a well-thought-out plan is that it involves various internal and external teams, not just IT or PR but everyone from legal to risk. In the heat of the moment, it’s hard to know which teams should be activated — but with advance planning, this “all hands on deck” scenario will unfold much more smoothly.

Basic Elements of an Incident-Response Plan

Even with the increased awareness of cybersecurity risk at the BOD and C-suite level, organizations are still lagging in planning for breaches. In its annual Global Information Security Survey, EY found that of the 1,755 executives who responded, only 43 percent had formal incident-response programs for their organizations. Worse yet, only 7 percent of those with plans had integrated a comprehensive approach that included third-party vendors, law enforcement and playbooks. Much work remains to be done in this regard.


Let’s look at some basic components of a plan and rewrite the Anthem response scenario to show how things could have played out differently.

  1. Start with an inventory of data — what types of data your company collects, processes and stores; where it’s stored and how it’s transmitted; who has access both in-house and at third-party contractors, and so on. In our Anthem scenario, with a precise inventory, the insurance provider would know immediately that among the impacted stakeholders are third-party customers, and the risk would be communicated to stakeholders accordingly.
  2. Outline your procedures for monitoring access and conduct regular audits. While monitoring may be mostly an IT concern, it should be spelled out in your plan because it involves cross-company functions and it’s one of the steps that determines the extent of your breach.

Take advantage of the built-in cybersecurity capability of vendors like Salesforce, which not only offers robust security but also provides training for your employees.

  3. Secure the infrastructure. This goes hand in hand with inventorying and monitoring. It should already be part of your daily IT routine but should also be integrated into the master response plan, with additional post-breach steps such as contacting outside forensic investigators.
  4. Create your crisis-communications plan. As previously discussed, this plan should include exact messaging, pre-approved and ready to go with a few “fill in the blank” areas, for different types of incidents. This should also include the categories of recipients for the communications, the delivery schedule and dissemination vehicles (typically more than one channel).

Based on this plan, in the ideal Anthem response, a process would be in place to reach not only its 80 million employees and customers but also its various associates, like Blue Cross and Blue Shield, that were also compromised. Additionally, the digital media team would go all hands on deck to update website and social media information, monitor social channels and respond to common questions and concerns. Plus, an external vendor would be brought in temporarily to staff a dedicated 24/7 customer service center, fielding calls related to the breach and signing up customers for credit monitoring.

  5. Assess the legal risks. These are not just based on government regulations and other legal obligations. The possibility of lawsuits is very real, and your post-breach actions can add fuel to the fire if not properly executed. It’s a good idea to engage not just your regular counsel but an outside firm that specializes in breaches, and begin that engagement in the planning stage. This will allow you to begin your public disclosure and mitigation immediately instead of waiting to start a process.
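Parts of such a plan can be captured as data so they are ready the moment an incident hits. Below is a minimal sketch in Python (all names, fields, and message wording here are hypothetical illustrations, not an actual template): a pre-approved "fill in the blank" notification and a data-inventory entry.

```python
from string import Template

# Hypothetical pre-approved notification with "fill in the blank" fields
# ($system, $update_date); real wording would be drafted in advance and
# signed off by legal and PR.
BREACH_NOTICE = Template(
    "We are investigating a potential security incident affecting $system. "
    "As a precaution, please change your password. "
    "We will share verified details by $update_date."
)

# Hypothetical data-inventory entry: what is stored, where it lives,
# and who has access (in-house and third-party).
DATA_INVENTORY = [
    {"category": "customer_records", "location": "crm_database",
     "access": ["support_team", "third_party_vendor"], "risk": "high"},
]

def fill_notice(system: str, update_date: str) -> str:
    """Fill a pre-approved template so it can be disseminated immediately."""
    return BREACH_NOTICE.substitute(system=system, update_date=update_date)

print(fill_notice("our customer database", "June 10"))
```

Keeping the templates and the inventory in one reviewed, versioned place means the communications team fills in two blanks and ships, instead of drafting messaging from scratch mid-crisis.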

This list is just a basic starting point. Incident-response plans are highly tailored to the individual organization, but best practices should be used when developing them. Not unlike a marketing plan or an HR hiring manual, this plan is an important tool that helps safeguard your organization’s success. When a breach happens, a plan in hand may not make you less stressed, but you will know exactly how to proceed without second-guessing your actions and missing critical steps.

By Sekhar Sarukkai

Cloud Computing Then & Now

The Evolving Cloud 

The possibility of distributing computing resources has been explored since the earliest days of modern computing. Today’s cloud computing environment goes well beyond what most could even have imagined at the birth of modern computing, and innovation in the field isn’t slowing.

A Brief History

Matillion’s interactive timeline of the cloud begins with the first stored-program computer, the Manchester Baby, which ran its first program in 1948. Time sharing quickly became necessary, as the 250 computers available in 1955 were rented out to users in an effort to ensure as little downtime as possible. When packet switching was introduced in the 1960s, the foundation for resource sharing and the internet was laid, and shortly thereafter, in a speech at MIT, John McCarthy suggested that computing resources would one day be shared like any other service.


Through the ’70s, ’80s, and ’90s, the world saw the development of the internet and the mainstreaming of computers, and in 1996 the term “cloud computing” was first used by George Favaloro and Sean O’Sullivan, executives at Compaq Computer. During the 2000s, mobile and smartphone technology took off, and access to the cloud quickly became commonplace. The last five years have seen the greatest advances in cloud computing; as with seemingly all technology, development has been exponential. Global giants such as Amazon, Google, and Apple rely heavily on the cloud, and in 2013 global spending on cloud services was estimated to have reached $47 billion.

The Evolution

From digital assistants to smart cars to virtual reality to the internet of things, all of the latest modernizations rely on cloud technology. But so too do most of the traditional services individuals and organizations rely on. Although we’ve seen new products and services focused on managing money, the traditional banking institutions are developing their own services, and the environment is nearly unrecognizable from that of ten years ago. Who can even imagine a world without internet banking?


Healthcare has similarly advanced, and not only in the laboratories and offices of pioneering doctors and scientists. Large hospital and patient-management institutions are taking up the reins and following suit, albeit more slowly, and patient care programs are being implemented to combine the benefits of modern devices such as wearables with healthcare regimens. Two years ago, an HIMSS Analytics survey of cloud adoption in healthcare organizations found that 83% of those surveyed were using cloud services. Common uses included the hosting of clinical applications and data, health information exchange, and backup and data recovery.

And the benefits cloud computing promises education are immense. Already, cloud technology is changing the way students learn and extending access to schooling into remote and impoverished areas. Though schools and universities are adopting cloud technologies themselves, many startups such as Education Modified, Kiko Labs, and HSTRY, are coming up with new methods and platforms which enhance and further learning.

Into the Future

It’s predicted that the cloud services market will be worth around $108 billion next year, and by 2020 the number of connected devices worldwide is expected to reach 25 billion. Further estimates suggest cloud computing offers green benefits too: US organizations moving to the cloud before 2020 are projected to save $12.3 billion in energy costs. Gartner points to hybrid cloud infrastructure in the coming years, with analyst Ed Anderson saying, “I start to think of a multi-cloud environment as a foundation for a next wave of applications.” And according to Forrester Research, we’re on the cusp of the second wave of cloud computing, with service providers focused on next-gen applications that require omnichannel support, time-based analytics, and microservice support. The barrier to entering the cloud seems likely to shrink significantly due to adjusted compliance requirements and regulations, and although security is already a primary focus, its importance will only be magnified as the cloud expands. Finally, due to the high demand for cloud services, providers will soon, if they aren’t already, be building next-generation architecture on hyper-converged platforms, further reducing maintenance costs and speeding up scalability.

By Jennifer Klostermann

Data Breaches: Incident Response Planning – Part 1

Incident Response Planning – Part 1

The topic of cybersecurity has become part of boardroom agendas in the last couple of years, and not surprisingly — these days, it’s almost impossible to read the news without noticing yet another story about a data breach. As cybersecurity shifts from being a strictly IT issue to being a mission-critical business component, BODs are also becoming more interested in what their organizations are doing to plan their incident response.

Cybersecurity professionals are smart to use the philosophy of “assumed compromise” — knowing that no matter how robust the defenses, they will be breached. Just like disaster preparedness helps in the aftermath of a major earthquake, hurricane or another natural calamity, incident-response planning helps organizations prepare in advance for the aftermath of a data breach.

In its recently released “2016 Data Breach Investigations Report,” Verizon compared being part of an infosec team to being a soldier who’s tasked with guarding a hill at all costs, but without knowing who the enemy is, what it looks like, where it’s coming from and when. And to make matters worse, that soldier has only an old rifle with a few rounds of ammunition.

Incident Response Planning

That is certainly a fitting description of today’s cybersecurity threat landscape. Using this analogy, now imagine this soldier has extensively practiced a variety of scenarios covering what an attack “may” look like and the steps he needs to take when one does happen, regardless of how the attack plays out. This soldier still doesn’t have any more specific details about the enemy or the impending attack, but he is much better equipped for whatever unknown comes his way. That is exactly what an incident-response plan does.

You don’t have to look hard for statistics to know why you need this plan: Last year, the number of discovered zero-day vulnerabilities more than doubled from 2014, according to the newly released 2016 Internet Security Threat Report from Symantec. In other words, a new zero-day vulnerability popped up every week, on average. At the same time, a McAfee Labs whitepaper predicts a significant shift in the next five years toward new threats that are more difficult to detect, including file-less attacks, exploits of remote shell and remote control protocols, encrypted infiltrations and credential theft.

The size of the organization doesn’t matter, as bad actors don’t discriminate when they look for the lowest-hanging fruit. In its 2015 Internet Security Threat Report, Symantec found a 40 percent increase in the number of large companies targeted compared to the year before — with five out of six large companies becoming a target. But small businesses aren’t doing any better: In its 2015 Year-End Economic Report, the National Small Business Association found that 63 percent of businesses had fallen victim to cyberattacks in the past year. Since almost 90 percent of attacks are driven by financial motivation or espionage (based on the 2016 Verizon study), if you collect and store any type of information — employee records, customer data, intellectual property, etc. — you’re on cybercriminals’ radar.

What Not To Do After an Incident



If you find yourself in the middle of a cyberattack without a plan, you’re going to scramble — and not just from a tactical IT standpoint to secure your information infrastructure as fast as you can. That’s just step one. If sensitive data was breached, you have a long road ahead: notifying multiple layers of stakeholders, being inundated by customer and media calls, responding to government inquiries, offering mitigation such as credit monitoring and potentially bracing for lawsuits. When you are in crisis mode, it’s difficult to think strategically about all these phases — it’s unlikely you’ll even know all the ramifications if you’ve never gone through an incident like this before.

Incident Response 

That’s where incident-response planning comes in. You can give yourself ample time to consider potential scenarios and then train your employees — even taking them through actual drills and tabletop scenarios.

Look at some of the big companies’ responses to appreciate why a well-planned response is necessary. In many of the breaches we’ve seen in the past two or three years, the post-breach actions didn’t play out as well as they should have, resulting in PR nightmares.

Target, for example, took a week to announce its data breach in 2013, in the middle of the peak shopping season, as news began to hit customers through media reports. A gridlocked customer service line and a negative social media outburst were just some of the consequences — to say nothing of the class-action suit that eventually followed, costing the company $10 million in customer settlements and another $6.75 million in legal costs. As Target struggled to contain the damage and set up an official breach-communication website, scammers acted quickly to take advantage of the chaos — sending out fake messages that claimed to be from the company.

EBay topped Target by not only taking three months to realize it had been breached (which is not that uncommon) but also waiting two weeks after that to notify customers. What followed, however, was awkward for such a big player: The first announcement was posted on a little-known corporate website, and when it finally made its way to the eBay ecommerce site, it went only as far as telling users to change passwords, without any explanation. Meanwhile, PayPal customers were confused because a banner posted on that website didn’t clarify whether PayPal accounts were compromised as well.

While eBay was nonchalant on social media — simply responding to a storm of complaints with a tweet saying it would take a while for all customers to receive the password-reset email — it worked hard to downplay the magnitude of the breach, even going so far as to refuse to give an estimate, based on its best knowledge, of the number of records potentially affected.

Anthem also seemed overwhelmed by the magnitude of the impact from its data breach. It took the company five days to announce the breach (which had taken two months to discover) and quite some time to assess the scale and communicate with stakeholders. Its original disclosure, in February 2015, put the number of records potentially stolen by hackers at 37.5 million, but 20 days later it more than doubled that estimate, to 78.8 million.

As an estimated 50 million consumers had yet to be informed more than a month after the breach discovery, a Senate health committee had to intervene. But that wasn’t the end of Anthem’s missteps — customers waited days for a call back after calling a dedicated phone line…

What Post-Breach Response ‘Should’ Have Looked Like…

Read Part 2

By Sekhar Sarukkai

Fintech – Programs, Events and the Future

Fintech Programs and Events

The Financial Services Roundtable (FSR) has just launched a fintech collaborators program that aims to bring financial and tech industry leaders together. The hope is that this new program will encourage fintech innovation and collaboration while mapping out a successful future.

Tech Collaborator

Projects from FSR’s Tech Collaborator will enable technology and financial firms to jointly develop best practices and guidelines to advance security and efficiency. The first project studies the integration of wearables and creates best practices for securing, moving, and accessing sensitive ‘data-in-motion’ in a mobile financial services world; the second develops standards and best practices for data security, integrity, and accessibility in the cloud. Once the results of these two projects are revealed this fall, two new projects will be announced, with possible topics including blockchain/distributed ledgers, identity proofing, and the internet of things (IoT). Says Tim Pawlenty, FSR CEO: “Technology is changing the world and what customers demand at lightning speed. Financial and technology companies competing and forming partnerships benefits consumers, and we look forward to creating forums to enhance these opportunities.”

FinTech Ideas Festival



FSR has also introduced its FinTech Ideas Festival, bringing together CEO-level leaders in the financial and technology spheres from across the globe. Partnering with TechNet, the invitation-only event for CEOs is set for January 2017 and will focus on artificial intelligence, biometrics, cybersecurity, data access and security in the cloud, financial inclusion, the future of the workforce, IoT and big data, managing regulations in the future, and payments. Ajay Banga, President and CEO of MasterCard, remarks, “Sometimes the best ideas come about when people with different perspectives and experience engage in new ways. We see the FinTech Ideas Festival as a way to accelerate our respective and new efforts to make a positive impact on people’s lives.”

The Future of Fintech

The Singularity University Exponential Finance Conference takes place in New York June 7-8, an opportunity to measure how disruptive technologies such as big data, artificial intelligence, and blockchain are affecting the financial industry. According to co-moderator of the event Bob Pisani, the massive growth of investment in private tech firms over the last five years has encouraged the creation of startups determined to steal market share from traditional financial institutions. However, though slower to react, banks and other conventional financial institutions “are not lying down.”

According to Pisani, two issues are driving fintech: control of customer relationships and cutting costs in a low-growth environment. He further points to three areas of growth: mobile money, consumer lending, and personal finance management. It’s unlikely, though, that any of the ‘easy money’ is left in the industry. Citigroup estimates that only 1% of North American banking revenue has migrated to a digital model, but believes that will increase to 10% by 2020 and 17% by 2023. Though it has previously been predicted that bank branches would decline dramatically, the drop-off so far has been a modest 15% since 2007. Today it seems far more likely that the large financial and banking institutions will duke it out with the likes of PayPal, ApplePay, Betterment, and Lending Club. For the consumer, there could be nothing better.

Top Fintech Organizations

Fintech Innovators believes the financial services industry is ‘facing a wave of digital disruption that is starting to reshape the sector,’ and it provides the Fintech 100 to celebrate the top companies in the space. The list includes 50 leading established players and 50 emerging stars. Among the leaders are ZhongAn, tailoring insurance; Oscar, for health insurance; and Wealthfront, for investment, while up-and-coming organizations such as Avoka provide frictionless digital sales and service, Bankable offers banking as a service, and BioCatch promises ‘less friction, less fraud.’ Though the fintech industry certainly isn’t providing easy pickings any longer, innovators and long-established institutions alike are ensuring the space advances and improves.

By Jennifer Klostermann

HP and Paramount Pictures Partner in Star Trek Beyond

Hewlett Packard Enterprise and Paramount Pictures 

HPE Film Integrations and New Ad Campaign Feature Futuristic Concept Technologies Inspired by The Machine

PALO ALTO, CA–(Marketwired – Jun 6, 2016) – Hewlett Packard Enterprise (NYSE: HPE) has teamed with Paramount Pictures to imagine technology 250 years into the future for the upcoming Star Trek Beyond feature film from Paramount Pictures and Skydance, set for worldwide release beginning July 22. The upcoming film will include several futuristic concept technologies created exclusively for the movie based on HPE’s The Machine, a groundbreaking research project being developed by the company. Also in connection with the film, HPE will debut a new advertising campaign featuring The Machine, the first prototype of which is slated to launch later this year.

As the official enterprise technology partner for Star Trek Beyond, HPE designers and researchers worked closely alongside the filmmaking team to design futuristic technologies that could one day be powered by The Machine and that would contribute to bringing the world of Star Trek Beyond to life.

“This collaboration presented a unique backdrop against which we could highlight the vast potential of The Machine,” said Henry Gomez, Chief Marketing and Communications Officer, Hewlett Packard Enterprise. “Star Trek has a long tradition of boldly enabling us to imagine mankind’s future. The Machine will reinvent computing architecture from the ground up, so the connection to the film’s exciting technologies is easy to envision.”

“Planets definitely aligned for this HPE partnership with Star Trek Beyond given the timing of the launch of The Machine technology and the film’s release,” said LeeAnne Stables, President of Worldwide Marketing Partnerships and Licensing for Paramount Pictures. “It was exciting to see the HPE designers working in a room with our creative teams to infuse actual future world technology into this action-packed film. We know movie audiences around the world love seeing what’s ahead.”

HPE’s The Machine project is focused on reinventing the computing architecture on which all computing is currently based — from smart phones to data centers to super computers. It aims to break through the limitations of today’s computing, enabling a massive and essential leap forward in computing performance and efficiency.

The Machine plans to use light (photonics) for communications rather than copper wires; it aims to make the “save” button a concept of the past, instead capturing and remembering everything in memory as it is created, even after the machine is turned off; and it plans to revolutionize how security is embedded throughout a computer.

The Machine technologies are currently being designed to address technological challenges in the not-so-distant future — from dealing with the massive quantities of data generated through connected devices to being able to translate health data from millions of people into solutions for the healthcare industry.

The new HPE advertising campaign, Accelerating Beyond, will feature Star Trek Starfleet Academy recruits in a world powered by The Machine 250 years in the future. It is designed to illustrate the progress that could one day be achieved through the lens of The Machine technology. The campaign, which will include 60- and 30-second television ads shot in a futuristic landscape in Iceland along with digital treatments, will debut at HPE’s Discover Las Vegas conference, June 7-9. It will begin wide rotation on July 1.

Star Trek Beyond will be in theaters beginning July 22.

About Hewlett Packard Enterprise
Hewlett Packard Enterprise is an industry-leading technology company that enables customers to go further, faster. With the industry’s most comprehensive portfolio, spanning the cloud to the data center to workplace applications, our technology and services help customers around the world make IT more efficient, more productive and more secure.

About Star Trek Beyond

“STAR TREK BEYOND,” the highly anticipated next installment in the globally popular Star Trek franchise, created by Gene Roddenberry and reintroduced by J.J. Abrams in 2009, returns with director Justin Lin (“The Fast and the Furious” franchise) at the helm of this epic voyage of the U.S.S. Enterprise and her intrepid crew. In “Beyond,” the Enterprise crew explores the furthest reaches of uncharted space, where they encounter a mysterious new enemy who puts them and everything the Federation stands for to the test.

From Paramount Pictures and Skydance, “STAR TREK BEYOND” is a Bad Robot, Sneaky Shark, Perfect Storm Entertainment production. The film stars John Cho, Simon Pegg, Chris Pine, Zachary Quinto, Zoë Saldana, Karl Urban, Anton Yelchin and Idris Elba. Directed by Justin Lin, the third film in the franchise series is produced by J.J. Abrams, Roberto Orci, Lindsey Weber, and Justin Lin; and executive produced by Jeffrey Chernov, David Ellison, Dana Goldberg, and Tommy Harper. Based upon “Star Trek” created by Gene Roddenberry, the screenplay is written by Simon Pegg & Doug Jung.

“STAR TREK BEYOND” opens in U.S. theaters July 22, 2016.

Intel Targets Autonomous Cars and IoT With New Acquisition

Intel Targets Autonomous Cars

To the casual observer, Intel may have looked like it was in trouble after shedding nearly 12,000 jobs this year alone and cutting back on many lines of business it has deemed unnecessary, but that isn’t stopping the company from investing in new startups and gaining a foothold in the worlds of IoT and autonomous cars.

IoT, or the Internet of Things, is a growing movement to make as many normally mundane objects as possible ‘smart,’ or connected to the internet. Items like clothes washers and refrigerators are being designed with internet connectivity and smart functions that make them easier to use, especially for those of us who aren’t always home. This, coupled with the growing popularity of autonomous cars and cars with autonomous functions, provides the perfect niche for Intel and could create an era of unprecedented growth for the tech giant.



Intel’s latest acquisition is computer vision company Itseez Inc, a small tech startup that specializes in technologies that allow computers to obtain and process visual information. This is ideal for Intel’s upcoming plans to branch out into the world of autonomous and self-driving cars.

Itseez’s technology will likely be used to help Intel compete with Google, Tesla, and other companies that are already making names for themselves in the self-driving car arena. The idea is to allow these autonomous vehicles to collect visual information and make decisions on how to act based on that information.

While this may seem like a great advance in technology, it does raise the same ethical dilemma that has been facing other autonomous car giants – what happens when the laws of robotics and ethics don’t always match up?

Ethical Dilemmas

There is a common ethical dilemma that comes up when discussing autonomous cars and the programs that drive them – when these programs are created by humans and subject to human error, how can we expect them to make the right decision when lives are on the line?



Consider this scenario: You’re riding as the passenger in an autonomous vehicle. Ahead of you on the road is a pedestrian. There is no way to safely avoid the pedestrian without crashing the vehicle. What decision does the autonomous car make — to strike and potentially kill the pedestrian, or to crash the vehicle and potentially kill its passenger?

Even less life-threatening scenarios can potentially cause a problem, such as the recent crash between a Google self-driving car and a bus, where both the car’s driver and the autonomous programming made an incorrect assumption leading to a collision between the two vehicles.

This and other similar scenarios are giving lawmakers and manufacturers alike serious pause, because without an acceptable answer to these issues, there is no way to make autonomous cars truly safe.

Introducing advanced computer vision technology like the innovations Itseez can offer means we’re one step closer to creating a safer autonomous car. The advances in car autonomy are also among the first major steps toward creating an environment steeped in the IoT.

IoT Advances

This isn’t the first step Intel has taken toward becoming an IoT superpower. Earlier in 2016, the company acquired Yogitech, an Italian company that focuses on functional safety for semiconductors. In a nutshell, by purchasing Yogitech, Intel acquired the tools it needs to make sure all the chips in autonomous vehicles work just the way they’re supposed to.

This is just the beginning.

Intel is planning on turning its Internet of Things niche into an empire, leading to an integration of 50 billion devices and the possibility of trillions of dollars in economic impact by 2020 – a mere 4 years away.

Can They Do It?

Can Intel pull it off? Definitely, if the market continues to move the way it has for the past couple of years. People love to be connected to everything — how many of you have a home security system you can access in real time from your smartphone, or a Wi-Fi-enabled thermostat that lets you adjust your home’s temperature during your commute home?

This desire for internet-enabled appliances and an increased level of connectivity will only continue to grow, and if Intel has gotten its foot in the door in time, the company could achieve everything it has promised and more.

By Kayla Matthews

Norway Looking To Eliminate Gas Powered Vehicles By 2025


Future Vehicles

One tweet by Tesla Motors CEO Elon Musk has given fans of electric vehicles the kind of news they never could have imagined just a few years ago.

Musk tweeted out the front page of the Norwegian newspaper “Dagens Naeringsliv” and wrote: “Just heard that Norway will ban sales of fuel cars by 2025. You guys Rock!!”

The Norwegian headline says four political parties have agreed on an energy message: “Stop sales of diesel and gasoline vehicles in 2025.”

The question is whether Norway has officially signed off on a ban of gas-powered cars nine years from now. While the country offers some of the biggest tax incentives in Europe to buyers of electric vehicles, the majority of cars and vans sold each year are gasoline and diesel powered…

Read Full Article Source: CNBC

The Linux Foundation’s Core Infrastructure Initiative Invests In Security Tool


Identifying Web Application Vulnerabilities

Grant Accelerates Work to Deliver OWASP ZAP as a Service, Making It Accessible to More Developers

SAN FRANCISCO, CA–(Marketwired – June 03, 2016) – The Core Infrastructure Initiative (CII), a project managed by The Linux Foundation that enables technology companies, industry stakeholders and esteemed developers to collaboratively identify, fund and improve the security of critical open source projects, today announced it is investing in the Open Web Application Security Project Zed Attack Proxy project (OWASP ZAP).

This testing tool helps developers automatically find security vulnerabilities in web applications during development and testing. Both easy to use and freely available, it appeals to a wide range of users with varying security knowledge, even first-time testers.
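The checks a tool like ZAP automates range from passive inspection of responses to active attack injection. As a rough, self-contained illustration (plain Python, not ZAP’s actual API), a passive check for missing security headers might look like this:

```python
# Toy sketch of a passive scan rule: flag HTTP responses that lack
# common security headers. This illustrates the *kind* of check a
# scanner like OWASP ZAP automates; it is not ZAP's real interface.

EXPECTED_HEADERS = {
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Strict-Transport-Security",
}

def missing_security_headers(response_headers):
    """Return expected security headers absent from a response,
    comparing header names case-insensitively."""
    present = {name.title() for name in response_headers}
    return sorted(h for h in EXPECTED_HEADERS if h not in present)

# Example: a response that sets a content type and a frame policy
# but omits the other expected headers.
headers = {"Content-Type": "text/html", "X-Frame-Options": "DENY"}
print(missing_security_headers(headers))
```

In a real scanner, a rule like this runs against every response in the proxy history and raises an alert per finding; active rules go further and send crafted requests to probe for issues such as injection flaws.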

CII’s sponsorship adds a full-time core developer to accelerate work on ZAP as a Service, which will allow ZAP to also be deployed as a long-running, highly scalable, distributed service accessed by multiple users with different roles.

Recently voted the most preferred open source testing tool for the second time in three years by users and ToolsWatch readers, OWASP ZAP is one of the world’s most popular security tools. Hundreds of volunteers around the globe help to continually improve and enhance OWASP ZAP, according to Project Lead Simon Bennetts, who works for Mozilla as part of its security team.

“OWASP ZAP is a proven and powerful security tool that will gain even broader applicability with an increase in dedicated resources,” said Emily Ratliff, senior director of infrastructure security, The Linux Foundation. “CII is excited to help advance work that’s already underway to run ZAP in new, different ways, especially in partnership with like-minded organizations like OWASP and Mozilla as they work to ensure the Internet is a safe, global resource.”

OWASP ZAP joins projects like OpenSSL, OpenSSH, NTPd and other fundamental projects CII and its members invest in to encourage software development best practices and secure coding processes.

“The CII grant has had an immediate impact on OWASP ZAP. We’ve added a developer, improved coding best practices, set up a predictable release schedule and roadmap and performed audits to help future-proof our code,” said Bennetts.

“I’m very excited to see ZAP get the commitment of a full time developer,” said Michael Coates, former chairperson of the OWASP board, a not-for-profit that ensures ongoing availability and support for OWASP. “ZAP is a pivotal tool for use in assessing the security of a web site. As an open source project that is free for everyone to use, the commitment of development resources from CII will greatly advance its capabilities and usability for all.”

“With a service-based offering, ZAP will extend itself to a whole new level of maturity and usability that will amplify its value to the community,” said Matt Konda, chair of the OWASP Board of Directors. “Even more than that, ZAP continues to be a model for what OWASP can achieve.”

CII funds projects that strengthen the open source community’s ability to deliver and maintain secure code. Additionally, communication security is a critical need, so funding is also prioritized for projects that improve related, often at-risk services across embedded, IoT, mobile, server and web applications. To submit a grant proposal, apply online using the CII grants management solution. Funding decisions are made on a rolling basis, so grants are issued at any time.

About Core Infrastructure Initiative

CII is a multimillion-dollar project that funds and supports critical elements of the global information infrastructure. It is organized by The Linux Foundation and supported by Amazon Web Services, Adobe, Bloomberg, Cisco, Dell, Facebook, Fujitsu, Google, Hitachi, HP, Huawei, IBM, Intel, Microsoft, NetApp, NEC, Qualcomm, Rackspace and VMware. Moving beyond funding projects, CII is introducing pre-emptive tools and programs to help the open source ecosystem and the companies that support it deploy secure coding practices. For a full list of CII grantees, please visit:

About The Linux Foundation

The Linux Foundation is a nonprofit consortium dedicated to fostering the growth of Linux and collaborative software development. Founded in 2000, the organization sponsors the work of Linux creator Linus Torvalds and promotes, protects and advances the Linux operating system and collaborative software development by marshaling the resources of its members and the open source community. The Linux Foundation provides a neutral forum for collaboration and education by hosting Collaborative Projects and Linux conferences, including LinuxCon, and by generating original research and content that advances the understanding of Linux and collaborative software development. More information can be found at

CloudTweaks Comics
Cloud Infographic – DDoS attacks, unauthorized access and false alarms


DDoS attacks, unauthorized access and false alarms Ranking above DDoS attacks, unauthorized access and false alarms, malware is the most common incident that security teams reported responding to in 2014, according to a recent survey from SANS Institute and late-stage security startup AlienVault. The average cost of a data breach? $3.5 million, or $145 per sensitive…

A New CCTV Nightmare: Botnets And DDoS attacks


Botnets and DDoS Attacks There’s just so much that seems as though it could go wrong with closed-circuit television cameras, a.k.a. video surveillance. With an ever-increasing number of digital eyes on the average person at all times, people can hardly be blamed for feeling like they’re one misfortune away from joining the ranks of Don’t…

Reuters News: Powerful DDoS Knocks Out Several Large Scale Websites


DDoS Knocks Out Several Websites Cyber attacks targeting the internet infrastructure provider Dyn disrupted service on major sites such as Twitter and Spotify on Friday, mainly affecting users on the U.S. East Coast. It was not immediately clear who was responsible. Officials told Reuters that the U.S. Department of Homeland Security and the Federal Bureau…

The DDoS That Came Through IoT: A New Era For Cyber Crime


A New Era for Cyber Crime Last September, the website of a well-known security journalist was hit by a massive DDoS attack. The site’s host stated it was the largest attack of that type they had ever seen. Rather than originating at an identifiable location, the attack seemed to come from everywhere, and it seemed…

Timeline of the Massive DDoS DYN Attacks


DYN DDOS Timeline This morning at 7am ET a DDoS attack was launched at Dyn (the site is still down at the time of writing), an Internet infrastructure company headquartered in New Hampshire. So far the attack has come in two waves, the first at 11.10 UTC and the second at around 16.00 UTC. So…

Cloud Infographic: Security And DDoS


Security, Security, Security!! Get used to it, as we’ll be hearing more and more of this in the coming years. Collaborative security efforts from around the world must start, as sometimes there is a sense of fait accompli, that it’s simply too late to feel safe in this digital age. We may not…

The Conflict Of Net Neutrality And DDoS-Attacks!


The Conflict Of Net Neutrality And DDoS-Attacks! So we are all cheering as the FCC last week made the right choice in upholding the principle of net neutrality! For the general public it is a given that an ISP should be allowed to charge for bandwidth and Internet access but never to block or somehow…

Using Private Cloud Architecture For Multi-Tier Applications


Cloud Architecture These days, Multi-Tier Applications are the norm. From SharePoint’s front-end/back-end configuration, to LAMP-based websites using multiple servers to handle different functions, a multitude of apps require public and private-facing components to work in tandem. Placing these apps in entirely public-facing platforms and networks simplifies the process, but at the cost of security vulnerabilities. Locating everything…

Security: Avoiding A Hatton Garden-Style Data Center Heist


Data Center Protection In April 2015, one of the world’s biggest jewelry heists occurred at the Hatton Garden Safe Deposit Company in London. Posing as workmen, the criminals entered the building through a lift shaft and cut through a 50cm-thick concrete wall with an industrial power drill. Once inside, the criminals had free and unlimited…

The Rise Of BI Data And How To Use It Effectively


The Rise of BI Data Every few years, a new concept or technological development is introduced that drastically improves the business world as a whole. In 1983, the first commercially handheld mobile phone debuted and provided workers with an unprecedented amount of availability, leading to more productivity and profits. More recently, the Cloud has taken…

How Formal Verification Can Thwart Change-Induced Network Outages and Breaches


How Formal Verification Can Thwart Breaches Formal verification is not a new concept. In a nutshell, the process uses sophisticated math to prove or disprove whether a system achieves its desired functional specifications. It is employed by organizations that build products that absolutely cannot fail. One of the reasons NASA rovers are still roaming Mars…



Big Data and DNS Analytics Big Data is revolutionizing the way admins manage their DNS traffic. New management platforms are combining historical data with advanced analytics to inform admins about possible performance degradation in their networks. Not only that, but they also have the ability to suggest ways to optimize network configurations for faster routing.…

Cloud-based GRC Intelligence Supports Better Business Performance


Cloud-based GRC Intelligence All businesses need a strategy and processes for governance, risk and compliance (GRC). Many still view GRC activity as a burdensome ‘must-do,’ approaching it reactively and managing it with non-specialized tools. GRC is a necessary business endeavor but it can be elevated from a cost drain to a value-add activity. By integrating…

