Category Archives: Big Data

The Industrial Internet Arises With Big Data And The Internet of Things

The Industrial Internet Arises

Both Big Data and the Internet of Things arm us with a near-infinite source of data thanks to Internet-connected sensors and data analysis tools, and energy efficiency is a field that’s starting to reap the rewards. Says Jeff Immelt, CEO of General Electric: “The combination of software and machines, from airplane engines to power plants to wind turbines, has laid the foundation for a new wave of innovation – and the economic and environmental impact of industry and software cannot be understated.”

Industrial Internet

As the global industrial sector incorporates Big Data and the Internet of Things, the Industrial Internet arises, with a projected $15 trillion increase in global GDP over the next 20 years thanks to optimized performance, increased productivity, and considerable savings in fuel and energy. A mere 1% decrease in the combined operating expenditures of 2014’s top 40 miners could have resulted in savings of $5.3 billion, making it clear that improved efficiency offers the potential for significant gains in profit.

(Infographic Source: Visualcapitalist)

Industry leaders recognize that connecting hardware with predictive analytics through sensors leads to valuable insight and optimization, and the use of heavy-duty machinery is evolving with these innovations. Airlines are saving millions by avoiding downtime and delays thanks to warnings of potential engine failure; real-time reporting of engine temperatures, fuel efficiency, speed, and vibration patterns is available to engineers; and mill assets and process information can be consolidated on one common platform, creating an overall picture for better throughput, recoveries, and quality. Predictive analytics further benefits industrial organizations not only through energy cost savings but also through increased productivity and reduced maintenance costs.
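To make the idea more concrete, here is a minimal, hypothetical Python sketch (not tied to GE or any vendor’s product) of the kind of check predictive maintenance builds on: flagging sensor readings, such as engine vibration, that drift far outside recent behaviour so engineers get a warning before a failure. The field names and thresholds are illustrative assumptions.

    # Illustrative only: flag readings that deviate strongly from the
    # trailing window, a crude stand-in for predictive-maintenance alerting.
    from statistics import mean, stdev

    def flag_anomalies(readings, window=20, z_threshold=3.0):
        """Return indices of readings far outside the trailing window."""
        flagged = []
        for i in range(window, len(readings)):
            history = readings[i - window:i]
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
                flagged.append(i)
        return flagged

    # A vibration trace with one abnormal spike that warrants inspection.
    vibration = [1.0 + 0.01 * (i % 5) for i in range(100)]
    vibration[60] = 2.5  # simulated spike
    print(flag_anomalies(vibration))  # -> [60]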

Leading Organizations Implementing Renewable Energy

Google has been carbon neutral since 2007, buying carbon credits for any emissions made, and the organization claims to be more than a third of the way to being 100% renewable. Microsoft, committed to being carbon neutral since 2013, has implemented an internal carbon fee, reducing emissions of carbon dioxide equivalents by 7.5 million metric tons since 2012 and purchasing 10.2 billion kilowatt-hours of renewable energy. TJ DiCaprio, senior director of environmental sustainability at Microsoft, writes, “Our carbon fee represents a proactive step to make our business groups accountable for their carbon emissions while creating a fund to support efficiency and innovation.” In Austria, Sony’s CD manufacturing site uses 100% renewable energy, and Sony plans to have a zero environmental footprint by 2050. By 2020, Ikea intends to be powered entirely by renewable energy, while Walmart is committed to buying or producing 7 billion kWh of renewable energy.

How Smaller Businesses Can Maximize Energy Efficiency

Going green isn’t a strategy only big business is opting for; smaller organizations can use their flexibility to integrate renewable energy by fixing operational inefficiencies to reduce energy bills, improving building infrastructure with green tech, implementing renewable energy technology such as wind turbines and solar panels, and utilizing organic materials for power generation. As global energy needs increase, every venture into renewable energy and energy efficiency benefits both the environment and the organization’s potential revenues.

By Jennifer Klostermann

What The FITARA Scorecard Tells Us About Government Cyber Security Preparedness

Government Cyber Security Preparedness

Last year’s massive data breach of the Office of Personnel Management, as well as other recent cyber security incidents affecting federal agencies, underscored the urgency of bringing the federal government’s security infrastructure up to date. Although many agencies have made strides toward hardening their cyber security, outdated IT infrastructure and architecture is still common, making the federal government an easy cyber attack target.

In 2014, Congress enacted the Federal Information Technology Acquisition Reform Act (FITARA), giving CIOs significant powers in IT decisions, including new technology acquisitions. But based on an analysis of a scorecard created to measure the implementation of four key provisions of the legislation, the top 24 federal agencies received an average overall “grade” of D.

This raises the question: How well is the federal government prepared for cyber attacks?

FITARA’s impact on security

FITARA was “the most comprehensive overhaul of government IT in 18 years,” according to a Gartner analysis. Its purpose was to reform IT procurement and management to make it more agile and efficient.

Because FITARA’s intent was to drastically cut spending on outdated, legacy technology, the expectation was that it would minimize the vulnerabilities that create the perfect storm for cyber attackers.

But a recently published audit of the Department of Homeland Security — considered to have some of the best cyber security measures among federal agencies — showed that its IT infrastructure still relies on dozens of unpatched, vulnerable databases.

The “Evaluation of DHS’s Information Security Program for Fiscal Year 2015” report, by the Office of Inspector General (OIG), found a long list of other shortcomings. They included 220 “sensitive but unclassified,” “secret” and “top secret” systems with “expired authorities to operate,” which would imply that those systems were no longer regularly patched and maintained. And even many systems that were actively maintained didn’t have current security patches.

It’s worth mentioning that the entities where cyber security is especially critical were at the top of the OIG list as having the most vulnerable systems: 26 systems inside the Coast Guard, 25 at FEMA, 11 at DHS’ own headquarters, 14 at Customs and Border Protection, and 10 at the Transportation Security Administration. This audit shows that the government is still a long way from coming up to speed with its cyber attack defenses.

FITARA compliance

While the OIG audit focused on compliance related to the Federal Information Security Modernization Act (FISMA), it reflects the same concerns exposed by the recent FITARA scorecard.

In releasing the scorecard, members of the House Oversight Committee wrote, “For decades, the federal government has operated with poorly managed and outdated IT infrastructure. Cyber attacks are a real threat to this country. Federal agencies must act now.”

(Image Source: Oversight.house.gov)

The scorecard looked at FITARA implementation progress in four areas: data center consolidation, IT portfolio review savings, risk assessment transparency and incremental development. Factors considered for the grades included implementation of best practices for risk assessment, increasing the powers of CIOs and trimming wasteful spending.

The Department of Commerce and the General Services Administration received the only Bs (there were no As), while five agencies including DHS scored Cs. Energy and Education failed, and most of the rest came in with Ds.

In the data center category, 15 of the agencies received Fs (three of the 24 did not report consolidation), while 16 agencies had Fs in the review savings category.

The grades in these two areas were calculated based on how well the agencies saved money by reviewing their IT portfolios and consolidating their data centers. In fact, David Powner, GAO’s director of information technology management issues, said during a committee hearing that the number of federal data centers has actually grown, to 11,700, and that only 275 of those are considered “core.”

In the incremental development area, the grades were based on how many IT projects that were part of major investments successfully achieved completion and delivery every six months. Again, a dozen agencies failed completely (and three others didn’t have any projects meeting the criteria).

The last area measured how well agencies managed the major IT projects’ risks. This is one category where many agencies fared much better, with only four receiving Fs while 10 received As and Bs.

While the entire scorecard poses a major concern, it’s especially troubling to see DHS, which is tasked with overseeing the country’s security, only managing to score a C.

The same goes for the State Department — which scored a D — considering the email server scandal former Secretary of State Hillary Clinton had been embroiled in. Not to mention the major cyber attack that had crippled State’s unclassified email system, which had to be completely shut down and couldn’t be mitigated for months.

It’s also worth calling out OPM, which received a D, since its breach compromised personal data of 21.5 million federal employees. Security experts pointed out that OPM essentially “left the barn doors open” because of its poor security measures.

Veterans Affairs’ C and the Department of Education’s F add a layer of concern because of the Health Insurance Portability and Accountability Act (HIPAA), which was designed to protect and secure protected health information (PHI).

The VA, which runs the country’s “largest integrated healthcare system” through its Veterans Health Administration, is subject to HIPAA as a “business associate” of the VHA. The rash of cyber security breaches of healthcare providers last year was proof that bad actors are increasingly seeking out PHI because its value on the black market is much higher than financial information. The federal government should practice what it preaches, and make sure its own departments that are subject to HIPAA have strong cybersecurity defenses.

The Department of Education itself is not subject to HIPAA, but many of the public schools and institutions that it funds — and which report data back to the agency — are. The House Oversight Committee Chairman Rep. Jason Chaffetz has warned that OPM’s breach would pale in comparison to the damage that cybercriminals could inflict on the Department of Education.

“I think ultimately that’s going to be the largest data breach that we’ve ever seen in the history of our nation,” he said, regarding a breach of Education, at a Brookings Institution event.

The FITARA scorecard, of course, wasn’t intended to point fingers at the federal government for doing a poor job. Still, considering the estimated federal IT budget of $79.8 billion for fiscal year 2016 (which ends on Sept. 30, 2016), the scorecard results raise serious concerns about the nation’s cyber resilience.

As Federal CIO Tony Scott put it during the committee’s hearing, “FITARA presents a historic opportunity to reform the management of information technology across the federal government.” He also said that the work and commitment required to fully implement this law shouldn’t be underestimated.

Let’s just hope that the next FITARA scorecard shows much better progress than an average of “D.”

By Sekhar Sarukkai

Big Data And Quantum Computers

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data—far more information than people can actually process, manage, or understand.

Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at MIT, the University of Waterloo, and the University of Southern California…

Full Article Source: Phys.org
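As a toy illustration of the topological idea in the excerpt above (and not the researchers’ quantum approach), the simplest topological property of a network, its number of connected components, stays the same however the graph is bent, stretched, or redrawn. A short Python sketch, assuming the networkx library:

    # Betti-0, the number of connected components, is unchanged by any
    # continuous deformation of the network's layout.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("b", "c"), ("d", "e")])  # two separate clusters

    components = list(nx.connected_components(G))
    print(len(components))  # -> 2
    print(components)       # -> [{'a', 'b', 'c'}, {'d', 'e'}]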

Savision Discusses Hybrid Cloud, Microsoft System Center, And Live Maps Unity

Savision Discusses Hybrid Cloud

Savision, the first Dutch software company to sell to the US government via the GSA approval list, was founded in 2006 by two Dutch nationals and today has a team of 35 people across Amsterdam, Dallas, New York, and Ottawa. An independent software provider selling enterprise software for the IT operations market, Savision currently focuses primarily on Microsoft technology but plans to extend operations across other platforms in the future. With a client base including the International Atomic Energy Agency, the US Library of Congress, KPMG, and 20th Century Fox, the company has established itself firmly in both the hybrid cloud and Microsoft System Center markets. Matthew Carr, Savision’s Business Development Manager, has 15 years’ experience in IT&T companies around the world and has held a range of positions at Savision, from Finance to Business Intelligence. He discusses the pros and cons of hybrid cloud with us and delves into the advantages Microsoft System Center offers.

What do you believe are the chief benefits of moving to hybrid cloud?

Price, productivity, and security. Pricing, because companies don’t need additional data room capacity for peak times of the day. If done correctly, public cloud capacity can be purchased on a per-hour basis, meaning that bursting capacity will be considerably cheaper than managing more fixed assets.

Productivity improves because less time is spent maintaining the IT environment, part of which is now handled by the public cloud providers. Less time fixing problems affords more time for application development teams to focus on delivering value to the business. The counter-argument, however, is that the increased complexity the hybrid cloud introduces can mean that when issues do arise, the time spent troubleshooting them increases.

Finally, security is a contentious issue at the moment, but people are realizing that the security controls built by the public cloud providers are actually far more robust than a simple company firewall, which is regularly being circumvented by the BYOD phenomenon. Data is actually safer within a hybrid cloud that uses best-in-class security.

Are there any organizations that you believe are more likely to benefit from hybrid cloud?

Pretty much any organization with over 100 employees can start to think about the hybrid cloud. We are seeing now that early-stage companies start on the pure public cloud because of the simplicity it affords, allowing them to focus on their core business. As companies mature, though, it makes sense to keep critical workloads on a private cloud and use the public cloud for more ancillary requirements.

Are there any organizations that you believe won’t benefit from hybrid cloud?

Government departments in some countries, such as Germany, are legally required to keep all data on their own premises. Additionally, there is an ongoing saga of US law enforcement agencies attempting to access data located overseas but held by the big US public cloud providers (Microsoft, Amazon, and Google) in order to track potential criminal activity via social media and other data sources. Currently, the likes of Microsoft are rightfully attempting to block access to data held in other countries. Should this situation change, we may see a flight of non-US companies from the large public cloud providers.

What can Microsoft System Center do for organizations, and how do you make that solution even better?

Microsoft System Center is a suite of products that enterprises use to manage their IT infrastructure and applications. System Center Operations Manager (SCOM) is an agent-based monitoring technology that tells an IT administrator when there is a problem or outage somewhere in the environment. Savision Live Maps Unity works natively with SCOM to provide business context to these infrastructure problems. Since 80% of incidents are non-critical and don’t affect the business in a meaningful way, Live Maps Unity enables organizations to focus only on the areas affected by outages and hence save time on problem resolution.

Savision believes that Microsoft System Center provides insight and control over IT environments, but points out some significant challenges, including the integration of monitoring tools; an inability to determine impact, priority, and responsible teams; and the difficulty of identifying root causes of service outages. Could you detail some of the challenges organizations using Microsoft System Center might face?

A common problem is alert noise caused by individual physical or virtual servers having problems. With the use of clusters and failovers, this means that most of these outages are not important to the business. In today’s IT environment, there’s a huge amount of complexity which is only increasing over time with virtualization and applications distributed over various geographic and functional locations. Agent-based monitoring of servers is still the main method of understanding problems within the IT fabric but, more and more, it doesn’t actually say anything about the business impact. The first thing that IT administrators hear about a problem is still when a business user calls the help desk to say their email or web application is down.

How would Savision address these challenges?

Firstly, Savision is able to correlate related alerts and simplify them. Instead of an IT administrator receiving ten alerts that cause confusion or wasted time as they are processed one by one, Live Maps Unity correlates alerts to Business Services and raises alerts based on that higher-level context. It immediately reduces the alert noise and lets people see only the alerts they need to act upon. This time saving is invaluable.
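For illustration only (this is not Savision’s implementation, and the server-to-service mapping and names below are hypothetical), here is a short Python sketch of the underlying idea: roll raw server alerts up to the business services they support, so operators see one service-level picture instead of ten machine-level alerts.

    # Group raw (server, message) alerts by the business service they affect.
    from collections import defaultdict

    SERVER_TO_SERVICE = {  # hypothetical mapping maintained by the monitoring team
        "mail-01": "Email", "mail-02": "Email",
        "web-01": "Customer Portal", "db-01": "Customer Portal",
    }

    def correlate(alerts):
        by_service = defaultdict(list)
        for server, message in alerts:
            service = SERVER_TO_SERVICE.get(server, "Unmapped infrastructure")
            by_service[service].append(f"{server}: {message}")
        return dict(by_service)

    alerts = [("mail-01", "disk latency high"),
              ("mail-02", "failover triggered"),
              ("db-01", "connection pool exhausted")]
    for service, grouped in correlate(alerts).items():
        print(f"[{service}] {len(grouped)} related alert(s)")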

Also, by focusing on the business context, Live Maps Unity raises the importance of the end-user experience with regard to application performance. By introducing a Business Service Management framework, our technology forces IT administrators to think about and design end-user synthetic transactions. These transactions sit on the business-user side and simulate how an end user interacts with an application or service. If there are any problems from the user’s side, a notification is sent to the IT administrator to act upon before a time-consuming phone call is put in to the help desk.
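Below is a minimal sketch of a synthetic transaction, assuming nothing more than a plain HTTP health check; real products script complete user journeys (login, search, checkout) rather than a single request. The URL and thresholds are placeholders.

    # Simulate an end user hitting a service and raise an alert on failure
    # or slowness, before a real user has to call the help desk.
    import time
    import urllib.request

    def check_service(url, timeout=5.0, max_seconds=2.0):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                elapsed = time.monotonic() - start
                if elapsed > max_seconds:
                    return f"ALERT: {url} responded in {elapsed:.1f}s (too slow)"
                return f"OK: {url} responded in {elapsed:.1f}s (HTTP {response.status})"
        except Exception as exc:  # timeout, DNS failure, HTTP error, etc.
            return f"ALERT: {url} unreachable ({exc})"

    print(check_service("https://example.com/"))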

The post is sponsored by Savision. For more, head over to Savision now. Organizations that believe they could benefit from Savision’s products can try Live Maps Unity free online.

By Jennifer Klostermann

Startups Soaring With The Cloud

Startup Cloud Boom

In 2015, the cloud was primarily used by large and mid-sized organizations for mission-critical workloads, but this year it’s entrenching itself with startups and driving transformation. In India, which currently boasts 2.75 million developers, second only to the US’s 3.6 million, startups are reportedly adopting IBM Cloud to expand their businesses and develop applications with the developer-friendly Bluemix and its catalog based on open standards, choice, and portability. Says Vivek Malhotra, director of IBM Cloud India/South Asia, “Based on open source, Bluemix helps startups get a single development and management experience across any combination of public, dedicated and local Bluemix instances. Even if clients have existing infrastructure setups or application program interface (API), they can securely connect those to Bluemix for a hybrid solution.”

Successful Startups & Bluemix

(Image Source: Shutterstock)

Yipeedo, a mobile-only platform helping city dwellers get the most from their environs, uses Bluemix for its application development. The application analyzes user tastes and preferences and uses this data to recommend suitable activities. Wolken Software also utilized Bluemix when building its TeamToq app, an enterprise-class app enabling secure, collaborative, device-agnostic discussions between individuals, enterprise apps, and enterprise cloud apps. And KlickDoc, a comprehensive cloud-based healthcare platform connecting patients with doctors, diagnostic centers, and pharmacies, uses IBM Bluemix’s PHP runtime as its development platform. Srinath Ranga, founder of Opteamize Cloud Solutions, a corporate connect marketplace built on Bluemix, asserts, “As a startup, we need app development platforms that are fast, secured, and easy to use. IBM has a deep understanding of the startup environment and is helping us create a market differentiation through its cloud-based solutions. Bluemix, with cutting edge features like deep analytics or cognitive capabilities, makes apps a lot more innovative and interesting for our audiences.”

Alternative Platforms

Of course, Bluemix isn’t the only cloud platform supporting startups; along with big names like Microsoft Azure and Amazon Web Services, a variety of platforms are available to back emerging companies. Fusio is an open source API management platform that helps developers build and manage RESTful APIs, providing endpoint versioning, schema definition, secure authorization, and the ability to handle data from different sources. Apigee, another platform offering free tools for API testing, debugging, protection, and analytics, is a highly scalable hybrid that helps developers build apps faster, predict the next best action, and connect with other developers. Along with Cloud Foundry, Cloudify, and OpenShift, Apache CloudStack is notable, designed to deploy and manage large networks of virtual machines as a highly scalable and available Infrastructure as a Service (IaaS) cloud computing platform.
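As a rough, hypothetical illustration of the kind of endpoint these platforms help teams publish and govern (not specific to Fusio or Apigee), here is a minimal versioned REST route with a crude stand-in for schema validation, sketched in Python with Flask:

    # A hypothetical v1 endpoint; an API management layer would normally
    # handle versioning, schema checks, and authorization in front of it.
    from flask import Flask, abort, jsonify, request

    app = Flask(__name__)

    @app.route("/api/v1/devices", methods=["POST"])
    def register_device_v1():
        payload = request.get_json(silent=True) or {}
        if "device_id" not in payload:  # crude schema validation
            abort(400, description="device_id is required")
        return jsonify({"registered": payload["device_id"], "api_version": "v1"}), 201

    if __name__ == "__main__":
        app.run(port=8080)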

Whatever platform is chosen, the cloud market is thriving. Says Malhotra, “Cloud is the wave of the future and given the inter-linkages with mobility, big data, and social, we foresee the migration of all enterprise data to public, private or hybrid cloud model within the next few years.”

By Jennifer Klostermann

The Meaning Of Secure Business Agility In The Cloud

Secure Business Agility In The Cloud

As cloud continues to accelerate business delivery and shift the balance of power away from IT and InfoSec to business users, organizations need to find ways to ensure that security is part of the business process rather than an afterthought. Today’s organizations transact some of their most valuable data and services in the cloud. While the promise of instant availability, convenience, and cost savings is very attractive, the damage to brand, reputation, and trust could be irreparable if security is not built in.

Many CISOs and InfoSec teams continue to struggle with the new order in which business users have unprecedented freedom over how they work, what devices and applications they use to accomplish their work, and where they work from. Most want to partner with their business users to figure out optimal ways to engage cloud services securely, but few think about how IT security integrates into business processes. The result is that we often see burdensome processes within organizations where business users have to take extra steps to categorize data or to register new cloud services. In doing so, InfoSec and IT might be creating a bigger risk, as business users will simply do an end run around them. When business users are pressed for time, extra processes become doubly burdensome.

Insider Threat Vectors

Over the last year there has been a rise in both accidental and malicious insider threat vectors. With the lines between personal and business work so blurred, it’s easy for business users to accidentally drag and drop the wrong attachment into an email, post a message in the spur of the moment that alludes or pertains to confidential company information, or put a regulated file on an unsecured file-sharing site in order to make it easier to work on.

The key to secure business agility in the cloud is ongoing dialog and automation.

Ongoing dialog:

  • Given the fast-changing pace of today’s business environments, IT, InfoSec, and business users need constant check-ins to ensure a fruitful relationship. Needs are going to change rapidly as increasingly more services are migrated to the cloud.
  • Security processes need to be designed to be intuitive for the business. If business users are going to be required to own the data classification process, categories should be few and very intuitive. The same goes for the process of onboarding new cloud services.

Automation:

There are now a slew of cloud security services that enable business users to remain agile while preserving security in a less intrusive way.

  • Emerging data security toolsets leverage big data analytics and machine learning to automate the data classification process (a minimal sketch follows this list). Such toolsets should be evaluated against the business’s culture and geographies, and trialed before going broadscale.
  • Self-service portals can be designed with a standard set of security profiles built in. This not only helps automate the cloud security provisioning process but also allows for consistent implementation company-wide and across the many different types of cloud services a company may engage.
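Here is a minimal sketch of that classification idea, assuming scikit-learn: train a tiny text classifier on a handful of labelled examples and let it suggest a sensitivity category for new documents. The labels and samples are illustrative only; real toolsets train on far richer corpora and combine many more signals.

    # Toy ML-assisted data classification: suggest a sensitivity label
    # so business users aren't left to categorize everything by hand.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    samples = [
        ("Quarterly revenue forecast and board minutes", "Confidential"),
        ("Customer credit card dispute records", "Restricted"),
        ("Office holiday party schedule", "Public"),
        ("Press release draft for product launch", "Public"),
    ]
    texts, labels = zip(*samples)

    classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
    classifier.fit(texts, labels)

    print(classifier.predict(["Board meeting minutes with revenue numbers"]))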

As we enter 2016, I encourage IT and business users to find more meaningful ways to securely accelerate cloud services.

By Evelyn de Souza

Employment Trends: Cloud Technology Leads The Way

Employment Trends: Cloud Technology

The World Economic Forum’s Future of Jobs report, developed in collaboration with the Global Agenda Council, draws on a survey of CHROs and other senior talent and strategy executives at 371 leading global employers representing over 13 million employees. It predicts that the rapidly evolving employment landscape will force businesses, governments, and individuals to anticipate and prepare for future skill requirements, and it studies the drivers of change, employment trends, and emerging skills in order to propose future workforce strategies and recommendations.

Driving Change

The study finds that the majority of respondents believe urgent adaptive action is required to address drivers of change they predict will take hold within the next five years. The changing nature and flexibility of work tops the list of demographic and socio-economic drivers, while mobile internet, cloud technology, processing power, and big data are considered fundamental technological drivers. On average, those surveyed believe the impacts of mobile internet and cloud technology, Big Data, and crowdsourcing are already being felt, and expect the IoT, 3D printing, and ethical and privacy concerns to make their impact between 2015 and 2017. Advanced robotics, artificial intelligence, and advanced materials are deemed to be just around the corner, with respondents predicting that their impact on industries and business models will be felt from 2018 to 2020.

Employment Trends

Current trends across the surveyed countries may lead to a net loss of 5.1 million jobs to disruptive labor market changes between 2015 and 2020: 7.1 million jobs, largely concentrated in routine white-collar functions such as administrative and office roles, are lost, while 2 million jobs are created in the architectural, engineering, computer, and mathematical fields. It’s likely new and emerging roles will become critically important; both data analysts and specialized sales representatives were consistently pointed to across all industries and regions. Across in-demand job families, recruitment is likely to become even more difficult than it already is, and organizations will have to find efficient ways of ensuring reliable talent sourcing.

Skill Challenges

The Future of Jobs report further notes the reduced shelf life of employees’ existing skill sets, a challenge that technological disruption is likely to exacerbate quickly.

It’s estimated that by 2020 more than a third of the core skills required for most occupations will be skills not considered crucial today, and strong social skills such as emotional intelligence and a collaborative attitude will have to supplement technical skills. Workers in lower-skilled roles are likely to face redundancy without significant reskilling and upskilling.

Workforce Strategies and Recommended Actions

Though business leaders have been slow to act on these looming challenges to date, most of those surveyed now prioritize future workforce planning reasonably or very highly. Approximately two-thirds of respondents intend to invest in reskilling current employees, while some promising possibilities, such as making better use of the experience of older employees and building an ageless workforce, seem to be underutilized.

The report details four areas requiring immediate focus: Reinventing the HR Functions; Making Use of Data Analytics; Talent Diversity; and Leveraging Flexible Working Arrangements and Online Talent Platforms, as well as three areas for longer term focus: Rethinking Education Systems; Incentivizing Lifelong Learning; and Cross-Industry and Public-Private Collaborations. Government policies and reforms will need to complement these strategies, and organizations and individuals need to become involved in the management and direction of these disruptive changes that are likely to entirely reinvent our economies.

By Jennifer Klostermann

Ad Infinitum – Internet For Everything

Internet For Everything

“The hypothesis that a new Internet-for-everything society will come, as it is desired by the fundamentalists, is in fact very weak, not to say improbable” —Philippe Breton

Despite what Breton wrote in 2011, small devices across the globe are increasingly capable of fully fledged networking. This technological advancement of small, autonomous devices equipped with adequate sensors lays the foundation for the Internet of Things. What Breton was pointing out is that this development is like a Trojan horse, carrying massive social implications. His key message was that this transformation of society goes largely unquestioned. Under a populist notion of practicality, the issue is presented as inevitable, despite the challenges it poses to what he identifies as the core values of his society: the Law, Speech, and the Individual.

Clearly, with its close connection to contemporary globalization, the increasing number of tiny, autonomous devices operating throughout society will also raise concerns and research questions about security, privacy, and ethics. Consequently, more and more research is being published on the technical security of these devices, the networking between them, and their backend systems; take, for example, what Hossain, Fotouhi, and Hasan contributed in their recent paper for the IEEE World Congress. While technical solutions essentially and comprehensively identify and classify the parts and their interconnected links, they leave out the important questions of “who governs” and “whose security”.

Furthermore, technical maneuvers rarely bring about direct financial advantages for businesses.

Backdoor in the refrigerator

The technical vulnerabilities of interconnected devices are often explained using rather abstract, if not surreal, scenarios. Yet the fact is that networked small devices often provide new injection points for various rogue actors, and also generate new business for security appliance providers.

These fictional examples are often reinforced by referring to more severe environments, such as healthcare, industrial, or military applications, where a backdoor in one small device could compromise the whole system. Many nations present these threats as real and are investing in research both to identify them and, sometimes, to gain offensive capabilities. As the basis of the Westphalian state is possession of the ultimate coercive force, the local law enforcement office eagerly wants to secure its ability to invade your fridge. The armed forces, on the other hand, might want to do the same thing abroad for the sake of national security.

The threat is not that far-fetched, as recent headlines have demonstrated how innocent game consoles were used to plot against the sovereign. While competent security agencies are well aware that state security involves much more than taking away or intercepting digital toys, this kind of headline carries huge value for the securitization process in the public mind.

Global Business Infrastructure

The fundamental aims when securing any information system are to ensure that the data stays coherent, that confidentiality is not lost, and that the data is available when needed. While these and any derived requirements are commonly implemented today in traditional web applications and infrastructure, the complex and evolving IoT has, by definition, some particularly restrictive characteristics. Yet for global businesses, and indeed, as noted, increasingly for states too, it is essential that they and their customers are able to operate safely in the world of Things.

Many devices in the mesh-like network of Things are expected to be rather autonomous, yet they need to stay in contact with other devices. As such, a backend system is usually included in the architecture to coordinate communication across the devices. While useful from the point of view of the application, this kind of dependence and transfer of data expands the borders of the IoT security domain. While completely autonomous devices could conceptually be developed, in practice business and legal requirements often lead to hybrid solutions, where parts of the application and data are stored on the device and parts are shared across the network.
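A minimal sketch of that device-to-backend pattern follows, assuming a hypothetical HTTPS endpoint at backend.example.com: a constrained device reports a reading, and the backend acknowledges it and coordinates the rest of the mesh. Real deployments typically add a broker protocol such as MQTT, device identities, and mutual authentication.

    # A device posts its reading to the coordinating backend (hypothetical URL).
    import json
    import urllib.request

    def report_reading(device_id: str, temperature_c: float) -> int:
        payload = json.dumps({"device": device_id, "temperature_c": temperature_c})
        request = urllib.request.Request(
            "https://backend.example.com/api/readings",  # placeholder endpoint
            data=payload.encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status  # backend acknowledges and fans out the update

    # Example call (requires a real backend to succeed):
    # print(report_reading("fridge-42", 4.2))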

Perhaps one of the most widespread IoT-like systems is the RFID or biometric passport. Capable of storing essential details and drawing power over the air, it contains essential cryptographic features to ensure that gates at the border are not easily led astray.

(Image Source: Automatic Border Control Process – Wikipedia)

Active chips are equipped with an internal power source, so that they can initiate communication as well. While they are forerunners in the market of Things, these small devices have also provided tragic examples of security failures. Setting up a trivial antenna on the street could initiate a connection to any passport within range and, by knowing or guessing its password, gain access to personal details. While the feature is apparently designed for the border gates, it demonstrates the practical dangers of building backdoors into the Internet of Things.
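To show why that password is guessable, here is a simplified Python sketch of the Basic Access Control key-seed derivation described in ICAO Doc 9303 (not a complete implementation): the key material comes only from data printed in the machine-readable zone, i.e. the document number, date of birth, and date of expiry plus their check digits, which an attacker can often narrow down or brute-force.

    # Simplified BAC key-seed derivation: the "password" is just a hash of
    # printed MRZ fields, so guessable inputs mean a guessable key.
    import hashlib

    def bac_key_seed(doc_number_cd, birth_date_cd, expiry_date_cd):
        """Return the 16-byte key seed from the concatenated MRZ fields."""
        mrz_information = (doc_number_cd + birth_date_cd + expiry_date_cd).encode("ascii")
        return hashlib.sha1(mrz_information).digest()[:16]

    # ICAO specimen values (each field ends with its check digit).
    print(bac_key_seed("L898902C<3", "6908061", "9406236").hex())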

By Kristo Helasvuo
