Category Archives: Big Data

Educating Our Future Data Scientists

Future Data Scientists

Data scientists are among the many sought-after IT professionals currently enjoying their pick of positions thanks to a general shortage of IT staff. Universities and colleges have created, and continue to develop, master’s programs attempting to address this skills shortage, and there has also been a reliance on boot camps to help fill the gaps. However, undergraduate programs tailored to big data have been somewhat lacking in the past.

With the Department of Labor’s forecast of 25% growth in data jobs by 2018, it’s no surprise that US colleges and universities are changing tactics and developing undergraduate programs that address big data skills requirements. The good news is that there is growing interest in this field, which means a number of excellent publications and influencers are starting to cover the topic.

Universities & Colleges with Data Science Undergrad Programs


(Image Source: Shutterstock)

College of Charleston

The College of Charleston offers its own undergraduate program in data science; students complete internships at tech giants before choosing from 14 degree disciplines, which include accounting, biomechanics, CRM, economics, exercise physiology, finance, geoinformatics, molecular biology, organismal biology, physics and astronomy, psychology, sociology, and supply chain management.

DePaul

The Bachelor of Arts in Decision Analytics provided by DePaul teaches students practical applications of big data including the ethical collection and securing of data, analysis and communication of findings, and the development of solutions to data problems.

Drexel University

Starting in the fall of 2016, Drexel’s Bachelor of Science program in Data Science provides students with a well-rounded data education, including the analysis and meaningful use of data, determining a business’s data needs, and securing data.

University of Iowa

This Bachelor’s Degree in Business Analytics and Information Systems claims a 100% placement rate and offers two degree tracks. The business analytics track revolves around improving a business’s data strategy and building new processes, while the information systems track focuses on managing the technologies that collect, store, and secure data.

Ohio State University

The data analytics program offered by Ohio State is an interdisciplinary major: a Bachelor of Science degree from the College of Arts and Sciences offered through partnerships with the Fisher College of Business, the College of Engineering, and the College of Medicine. Students can specialize in biomedical, business, or computational analytics, covering a range of industries requiring data scientists.

University of San Francisco

The University of San Francisco’s data science major, grounded in computer science and mathematics, promises students a wealth of mathematical, computational, and statistical skills. Students choose among three streams (mathematical data science, computational data science, and economic data science), and courses range from linear algebra to microeconomics to programming.

University of Wisconsin

Through their undergraduate degree in Data Science and Predictive Analytics, University of Wisconsin students receive not only an education in data science but also instruction in business and the application of big data concepts. Skills taught include data mining and collection, data analysis, and the creation of data visualizations, preparing students for work in fields across finance, marketing, economics, management, and more.

Worcester Polytechnic Institute

The program at Worcester Polytechnic Institute earns students both a Bachelor of Science Degree in Data Science as well as a master’s degree. With research across almost every industry, students will find few limitations to their big data education.

Free Big Data Programs


Of course, formal education programs are not the only providers of big data skills. A number of online courses have emerged to help job seekers beef up their big data and analytics skills, covering topics such as machine learning, Hadoop, and various programming languages. Udemy offers Big Data Basics: Hadoop, MapReduce, Hive, Pig, & Spark, aimed at beginners interested in the tech foundations of the big data ecosystem, and for slightly more advanced students, the Hadoop Starter Kit offers access to a multi-node Hadoop training cluster. Introduction to Spark, R Basics – R Programming Language Introduction, and Python for Beginners with Examples provide primers for some of the different skills data scientists need, while MIT’s Artificial Intelligence course teaches students to develop intelligent systems. These and many other short courses provide an excellent starting point for those interested in data science, while the programs being drawn up by top colleges and universities are advancing quickly to meet industry skills needs.
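For a sense of what these introductory courses actually have students build, the canonical first exercise in the Hadoop, MapReduce, and Spark courses mentioned above is a word count. The snippet below is a minimal sketch of that exercise in plain Python (the sample text and function names are invented for illustration), so it runs without any Hadoop or Spark installation.

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce step: sum the counts for each distinct word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# Illustrative sample input; a real course exercise would read a large
# text file or an HDFS dataset instead.
sample = [
    "big data needs data scientists",
    "data scientists study big data",
]
print(reduce_phase(map_phase(sample)))
# {'big': 2, 'data': 4, 'needs': 1, 'scientists': 2, 'study': 1}
```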

By Jennifer Klostermann

Leveraging IoT & Open Source Tools

IoT and Data Growth

Though the data regarding connected devices is anything but cohesive, a broad overview of IoT stats affords a clear picture of how quickly our world is becoming a connected ecosystem: in 1984, approximately 1,000 devices were connected to the Internet; in 2015, Gartner predicted 4.9 billion connected things would be in use; and by 2020 analysts expect we’ll have somewhere between 26 and 50 billion connected devices globally. Said Padmasree Warrior, Chief Technology and Strategy Officer at Cisco, “In 1984, there were 1,000 connected devices. That number rose up to reach a million devices in 1992 and reached a billion devices in 2008. Our estimates say… that we will have roughly 50 billion connected devices by the year 2020.”
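Taking the milestone figures in Warrior’s quote at face value, the implied compound annual growth rate between each pair of milestones can be worked out directly. The short sketch below does that arithmetic; the device counts are simply the ones quoted above, with 50 billion being the 2020 forecast.

```python
# Implied compound annual growth rate (CAGR) between the device-count
# milestones quoted above.
milestones = [
    (1984, 1_000),            # roughly 1,000 connected devices
    (1992, 1_000_000),        # roughly 1 million
    (2008, 1_000_000_000),    # roughly 1 billion
    (2020, 50_000_000_000),   # forecast of roughly 50 billion
]

for (year0, count0), (year1, count1) in zip(milestones, milestones[1:]):
    cagr = (count1 / count0) ** (1 / (year1 - year0)) - 1
    print(f"{year0}-{year1}: about {cagr:.0%} growth per year")

# Output:
# 1984-1992: about 137% growth per year
# 1992-2008: about 54% growth per year
# 2008-2020: about 39% growth per year
```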

What’s Connected?


(Infographic Source: industrial-ip.org)

Of course, we’re well past the days when ‘connected’ meant connected to your computer or mobile phone. Connected devices today include household systems such as heating, lighting, and refrigerators; personal wearables, including smart watches and clothing; and industrial equipment such as shipping pallets and automation gear. Innovators are already dreaming up the next big thing, and in the future we can expect smart couches that keep you warm in winter, smart crockery that tracks what you’re eating, and smart toothbrushes that help fight gum disease. IoT is being implemented in the running of businesses and product manufacturing, as well as in the new designs and concepts these firms generate, and according to Vision Mobile, 91% of IoT developers use open source technology in their IoT projects.

IoT & Open Source Tools

With data from 3,700 IoT developers across 150 countries, Vision Mobile found that eight out of ten IoT developers use open source whenever they can, and six out of ten contribute to open source projects. The cost (free) of these open source tools tends to be the leading driver behind their use, but developers also point to open source tools providing the best support along with the best technology, thanks to constant improvements and peer-to-peer support in the open source community. Open source technology is also considered a valuable way to improve skills and learn new technologies.

Oliver Pauzet, VP of Market Strategy at Sierra Wireless, additionally points out that “closed, proprietary systems can make interoperability difficult.” Inter-brand connection is thus another challenge open source technology addresses, enabling the devices of different developers to communicate. Pauzet also points to the necessity of creating and employing industry standards, which will encourage interoperability for greater choice and flexibility. This would mean developers could use cross-brand devices in the development of specific solutions, promising greater innovation along with cost efficiency. Finding an open source license that is “business-friendly,” along with industrial-grade components released as an open standard, is Pauzet’s tip for quickly taking IoT concepts from prototype to mass deployment. Says Pauzet, “The fact that so much of the integration, testing, and validation work is already done, they no longer have to invest big money when the time comes to expand on a global scale.”

Open Source Support

Recently announced, Farnell element14 is calling its mangOH Green Open Hardware IoT Platform the first “all-in-one Hardware, Software and Cloud-based solution for Industrial IoT applications.” The platform allows developers rapid testing and prototyping of ideas, so IoT solutions can purportedly be taken to market within weeks, and it is compatible with other open source initiatives, including Linear Technology’s Dust Networks, a Texas Instruments ZigBee module, an NXP Thread module, and Wi-Fi and Bluetooth modules.

The range of developers making use of and creating open source tools and solutions is extensive; Postscapes Internet of Things Awards 2015/16 takes a look at some of the best IoT open source projects. Projects nominated include platforms for building IoT gateways as well as interaction with everyday objects, CNC farming machinery, tools for the generation of HTML and mobile apps for IoT, and more. Postscapes believes the open source movement is “championing openness, transparency, and the power of collaborative development”; the range of quality open source IoT projects is the proof.

By Jennifer Klostermann

University of Wisconsin – Bridging Together VR and Big Data in Future Courses

Virtual Reality Tools Could Bring Big Data to Life for University of Wisconsin Data Science Students

MADISON, WI –(Marketwired – May 16, 2016) – Soon, students in University of Wisconsin data science courses might be putting on virtual reality headsets and bracing themselves for a spectacular ride through mountains of data. The courses are part of the online Master of Data Science degree program offered by UW-Extension in collaboration with six UW campuses.

Ryan Martinez, an instructional designer for UW-Extension, sees both a need and an opportunity to make Big Data come to life in a way that can dramatically change people’s behavior. His proposal to use Oculus virtual reality technology in the online UW Master of Data Science curriculum earned Martinez a highly coveted spot at the Oculus Launch Pad Boot Camp at Facebook headquarters in Menlo Park, California, on May 21.

Oculus VR will provide feedback and mentorship to Martinez and 100 other pioneers joining him at the Oculus Launch Pad Boot Camp. Following the forum, Oculus will determine which concepts will earn scholarships ranging from $5,000 – $50,000, distributed with the objective of helping leaders realize their ideas.

“If data scientists could more quickly analyze data that’s already available, they could make faster decisions, act sooner, and potentially even save lives,” says Martinez. “All someone needs is a phone and a VR headset. What we can do to bring the data from virtual reality into real life — it’s incredible.”

“We’re proud to continue to lead the way with new technologies and practices in higher education,” says David Schejbal, dean, UW-Extension’s Continuing Education Division. “Ryan’s concept may potentially lead to innovative tools we could offer our data science students, and is an interesting option they probably haven’t even considered.”

About the University of Wisconsin-Extension Continuing Education Division

The University of Wisconsin Master of Data Science joins a growing list of online degree and certificate programs offered in collaboration with UW-Extension and UW System campus partners, including the existing bachelor’s degree in Health and Wellness Management; bachelor’s, master’s, and certificate programs in Sustainable Management; a bachelor’s degree in Health Information Management and Technology; bachelor of science in Nursing (RN to BSN) programs; and nine additional degree and certificate programs offered in the self-paced, competency-based UW Flexible Option format.

Savision Delivers New Multi-Platform Business Service Intelligence Solution

Unity iQ is the One Groundbreaking Solution That Solves The Main Problem Enterprises Face Today: Increased Data & Legacy Tools That Are Used In Silos

Las Vegas, NV, May 16, 2016 – Savision (www.savision.com) today announced the availability of its new multi-platform Business Service Intelligence Solution, Unity iQ at the Knowledge16 event in Las Vegas, NV. The world of IT is becoming increasingly complex, with IT professionals facing an assortment of challenges. Frequently, ITSM and monitoring systems are disconnected, with teams working in operational silos, not fully understanding the impact that each system has on the overall business. As a result, IT professionals spend significant time, effort and funds being reactive rather than proactive, putting out fires vs. innovating to drive business growth.

Now, Savision, the market leader in Business Service Intelligence solutions, is offering a groundbreaking solution, Unity iQ. Unity iQ connects the worlds between IT, the help desk and the business, transforming silos into unity. This new solution aggregates and analyzes dispersed data from existing ITSM and monitoring systems, delivering relevant and actionable information for IT and business stakeholders. It does not replace these existing domain tools, but rather it collects, correlates and prioritizes the data these systems produce and measures it against pre-defined business KPIs.

“Business service delivery is stuck in a world of silos. Unity iQ is a smart and easy-to-deploy solution that reduces complexity and brings these worlds together,” said Diana Krieger, CEO of Savision. “Unity iQ allows you to aggregate, analyze and act upon dispersed data from different monitoring and ITSM systems. It provides a holistic view for your IT, help desk and business teams so they can solve problems faster and predict outages. With Unity iQ, you can spend less time on operations and more on innovation.”

Unity iQ turns data into Business Service Intelligence. Alerts and incidents are displayed in real-time, giving all stakeholders a holistic view of their complete IT environment. The benefits of Unity iQ include:

  • Enables business and IT alignment (People): Facilitates business optimization and motivates an entrepreneurial-focused culture. Helps organizations on their maturity journey and resolves miscommunication between business and IT.
  • Increases innovation (Technology) and revenue – Unity iQ helps increase a company’s maturity level by providing business contribution metrics. The more mature a company becomes, the more of its operational budget it can invest in innovation, lowering risks and costs.
  • Increases operating efficiency (Process) – Access to relevant information makes room for better decision making, as well as predicting and minimizing downtime.

With Unity iQ, IT will spend less time resolving problems and more time on planning for the future. Unity iQ:

  • Reduces downtime by up to 70%
  • Reduces the number of service outages by up to 60%
  • Results in cost savings of up to 82%
  • Is easy to install, configure, and maintain

“We developed Unity iQ in line with requests from our customers, and our customers also provided valuable feedback by using a beta version of Unity iQ,” said Rob Doucette, CTO of Savision. “We’re confident that this unique, innovative solution will solve enterprises’ main problem of too much data across silos, by providing a more robust, holistic view.”

About Unity iQ

Unity iQ is a smart solution that optimizes your IT service delivery. It provides your IT, help desk and business teams the service intelligence they need to solve problems faster and predict outages. Unity iQ allows you to aggregate, analyze and act upon dispersed data from different monitoring and ITSM systems. The unified data is presented in a holistic view so everyone understands the business context. Within seconds, you can determine the business impact of an outage or perform a root-cause analysis.

About Savision

Savision is the market leader in Business Service Intelligence solutions. We provide your IT, help desk and business teams the service intelligence they need allowing them to become business partners. With our solutions you can prevent problems and reduce downtime. Since our start in 2006 in the Netherlands, we have helped over 800 customers optimize their IT service delivery. This includes clients from the public sector to Fortune 500 companies worldwide. For more information, visit www.savision.com.

The Collision of Cloud and Data Privacy

Cloud and Data Privacy

The “cloudification” of everything from data storage to applications to security services has increased the availability of free-flowing data, allowing businesses to access anything from anywhere. However, it’s raised serious concerns about the security of personally identifiable information (PII) collected and shared by businesses and government agencies across international borders, and a global data privacy movement was born. Leading the charge on data privacy reform is the European Union (EU) – where consumer privacy is seen as a fundamental right. As a result, data location now matters in the cloud, and businesses must be prepared to know exactly when, where and how this data is shared across geographic borders.


(Image Source: Shutterstock)

While data privacy is quickly gaining steam across the entire globe, steps the U.S. and EU are currently taking will likely shape the debate for years to come. The recently passed General Data Protection Regulation (GDPR), which goes into effect in 2018, establishes a framework for all 28 EU member nations, providing a comprehensive and unified way for businesses to properly handle sensitive data belonging to EU citizens. Of the restrictions the GDPR places on global, multi-national businesses, the proper handling of PII is front and center.

The other major data privacy issue, the EU-US Data Privacy Shield intended to replace Safe Harbor, more narrowly addresses the flow of personal data from the EU to the U.S. However, an initial draft of the new framework was deemed inadequate by the EU’s influential Article 29 Working Party and cannot be relied upon until it passes the test in the EU courts, leaving thousands of businesses in limbo.

No More “Go With the Flow”

Information-intensive business processes rely on SaaS, and this, coupled with a shift to mobile computing platforms, means controlling data location and complying with privacy regulations is extremely challenging. As new regulations come to pass, they may put U.S. companies at an even greater disadvantage by adding to the confusion over the consequences of non-compliance. According to the latest draft of the GDPR, for example, any U.S. business involved in the processing of EU consumer data – whether directly or via a third-party entity – can be held liable for a breach, resulting in fines of anywhere from $1.7 million up to 4 percent of a business’s global revenue, depending on where the data violations occurred.


“Whether your data lies in the public, private or hybrid cloud – it needs to be constantly evaluated in order to truly assess risk potential,” said Simon Leech, chief technologist, Security, Hybrid IT at Hewlett Packard Enterprise. “The owner of the information is ultimately responsible, which is why it is vital for companies to establish a true culture of security at all levels within the business.”

Businesses should be addressing potential data privacy violations now in order to make complying with new regulations easier. There are some approved mechanisms that can be put in place while the specifics are hammered out, such as:

  • Binding corporate rules (BCR) – BCR are a set of legally enforceable rules for the processing of personal data that ensure a high level of protection is applied when personal data is transferred between members of a corporate group. Once a set of BCR has been approved by the relevant national data protection authorities, they will ensure that adequate data privacy safeguards are in place to meet compliance.
  • Hiring a Chief Privacy Officer (CPO) – With data privacy regulations like the GDPR and EU-US Data Privacy Shield, companies that regularly handle sensitive data on a large scale or collect information on many customers should consider designating a data protection officer who can quickly make decisions based on the evolving regulatory landscape. The CPO will be responsible for all data protection matters on a day-to-day basis, and should be involved in decisions about vendors that may handle PII.
  • Investing in the IT team – Let’s be clear: complying with these new data privacy regulations will be expensive. But the cost of non-compliance will be even greater, meaning IT teams will face more pressure than ever to protect data from breaches and unauthorized access – both from internal and external threats. Fines will be levied whether the transfer of data was intentional or accidental. Unfortunately, IT teams are woefully underprepared to comply with GDPR as it is.
  • End data hoarding – Technology has made it so cheap and easy to store data that many businesses simply do so as a matter of course. But big data isn’t necessarily better data, and businesses should adopt a data-minimalist approach to ensure greater control and reduce risk.

Data privacy has become a global issue affecting all companies that operate internationally, particularly those that have adopted cloud technologies. Companies can continue using the cloud as long as they’ve put procedures and systems in place to ensure that EU citizen data resides in the country of record. This includes not only validating how any personal data is collected, stored, processed and shared, but also how the business can prove continuous compliance. Setting up local datacenters will help solve the location-focused burdens of the new regulations, but it’s not enough. Companies will still need to maintain control over the entire lifecycle of EU citizen data, as well as who has access to it and from where.
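As a concrete illustration of the kind of lifecycle control described above, the sketch below shows one hypothetical way a business might flag records stored outside the data subject’s country of record. The region names, record fields, and mapping are illustrative assumptions rather than any particular product’s API.

```python
# Hypothetical data-residency check: flag records whose storage region does
# not match the region required for the data subject's country. The mapping,
# record layout, and region names are invented for illustration.
REQUIRED_REGION = {"DE": "eu-central", "FR": "eu-west", "IE": "eu-west"}

def residency_violations(records):
    """Return the records stored outside their required region."""
    violations = []
    for record in records:
        required = REQUIRED_REGION.get(record["subject_country"])
        if required is not None and record["storage_region"] != required:
            violations.append(record)
    return violations

records = [
    {"id": 1, "subject_country": "DE", "storage_region": "eu-central"},
    {"id": 2, "subject_country": "FR", "storage_region": "us-east"},  # out of region
]
print(residency_violations(records))
# [{'id': 2, 'subject_country': 'FR', 'storage_region': 'us-east'}]
```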

By Daren Glenister

Breakthroughs in Clinical Trials Utilizing the Power of the Cloud

Clinical Trials and the Power of the Cloud

Clinical trials play an essential role in the drug development process by effectively demonstrating the efficacy and safety of a pharmaceutical compound. Although led by scientific endeavor with patient safety and therapeutic benefits in mind, the process of bringing drugs to market is long, complex, bureaucratic and, above all else, expensive.

Inefficiencies in the clinical trials process continue to stymie industry stakeholders anxious to rein in the cost of product development and adhere to tighter timelines. There is an urgent need to expedite the time-to-market for new drugs and to make the approval process simpler. Discontent with the ‘status quo’ and dismal performance metrics are driving a cacophony of infrastructural changes, with stakeholders embracing technologies that are finally moving the needle. Cloud-based solutions such as clinical trial management systems (CTMS), electronic data capture (EDC), electronic trial master files (eTMF), and study startup (SSU) are all quantum leaps and are collectively referred to as the eClinical stack.

Why the cloud?

Cloud computing continues to be a disruptive force in IT with no signs of slowing down. According to the Synergy Research Group, the worldwide cloud computing market grew 28% to $110B in revenues in 2015, and forecasts from the International Data Corporation (IDC) indicate worldwide spending on public cloud services will grow at a 19.4% compound annual growth rate (CAGR) – almost six times the rate of overall IT spending growth – from nearly $70 billion in 2015 to more than $141 billion in 2019. “Cloud computing provides a dramatic opportunity across all industries,” according to Randy Bias, Director, OpenStack Foundation, and author of Grasping the Cloud Is Essential to Business Efficiency. “Old businesses are leveraging cloud to disrupt the existing incumbents. Cloud computing is profoundly disruptive in a way few can truly understand.”

By playing a critical role in enabling digital transformation, cloud computing lowers the typical IT barriers of slow time to value, risky implementations, limited resources, heavy maintenance, and incompatible systems. By freeing up the resources needed to run the business, cloud computing enables organizations to focus their time and energy on the pursuit of innovation and growth.

Some of the key reasons driving cloud-based adoption are:

  • Ease of deployment and management
  • Greater flexibility in supporting evolving business needs from both a technical and business perspective
  • Lower cost of operations
  • Easier way to scale and ensure availability and performance
  • Overall ease of use

According to Nan Bulger, Executive Director of SCIP, the Strategic & Competitive Intelligence Professionals society, and author of The New Decision Influencer, “In profit and nonprofit based businesses alike, the future of anything rests in the ability to influence the bottom line through operational efficiency and effectiveness, customer revenue generation and social impact.”

The need for more efficient clinical trials is driving greater use of cloud-based solutions in the pharmaceutical industry – historically slow in adopting new technologies – especially with the rise in outsourcing and globalization. The cloud gives stakeholders the ability to access value-added services from anywhere at any time with a level of simplicity, flexibility, and cost-efficiency never seen before.

Leveraging the cloud for speed


(Image Source: Shutterstock)

The public’s growing dissatisfaction with the clinical trial process is evident in the press, with the recent push for expedited programs such as the 21st Century Cures Act, compassionate use, and “Right to Try” laws leading the vanguard of change in an industry that has historically been mired in regulation and slow to adopt innovative technologies – technologies that can significantly reduce cycle times and get much-needed therapies to those in need faster.

Significant financial losses bolster the insistent calls for change. Data from the Tufts Center for the Study of Drug Development indicate that mean clinical development time is 6.7 years, and the daily revenue lost because a drug is not yet on the market has been estimated at $1 million to $8 million. To confront these issues of cost and time, the industry has been evolving from its slow paper-based methods toward cloud-based systems. With the flurry of attention focused on the issue of speeding clinical trials, the need for collaborative, cloud-based solutions has never been greater.
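To put the Tufts figures in perspective, a rough back-of-the-envelope calculation shows what even a modest reduction in cycle time is worth. The daily revenue range below comes from the estimate quoted above, while the 30-day reduction is an arbitrary illustrative assumption.

```python
# Back-of-the-envelope value of reducing clinical development time, using the
# quoted estimate of $1 million - $8 million in revenue lost per day a drug
# is not yet on the market. The 30-day reduction is an illustrative assumption.
daily_loss_low = 1_000_000
daily_loss_high = 8_000_000
days_saved = 30

low, high = days_saved * daily_loss_low, days_saved * daily_loss_high
print(f"A {days_saved}-day reduction in cycle time is worth "
      f"${low / 1e6:.0f}M-${high / 1e6:.0f}M in earlier revenue")
# A 30-day reduction in cycle time is worth $30M-$240M in earlier revenue
```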

In the cloud, data is available in real-time from anywhere in the world, and the rapid elasticity afforded by cloud-based hosting solutions can offer virtually infinite scalability – a proposition that is attractive for large Pharma and Contract Research Organizations (CROs). Cloud-based technologies also allow results to be analyzed more quickly and facilitate communication amongst clinical research teams across the globe. The introduction, and growing adoption, of cloud technologies for clinical trials will lower the cost of technology and thus the barrier to entry, making the cloud attractive for small-to-mid-sized biotechs, medical device companies and universities. For small companies, cloud computing services can provide a fast way to launch a new product, while keeping the focus on developing product features instead of fine-tuning office servers.

Improving Study Startup with Cloud-Based Services

While companies have often focused on improving study conduct in order to make gains in clinical trial efficiency, stakeholders are becoming increasingly aware that better study startup (SSU) processes – a perpetual bottleneck – are linked to shorter clinical timelines, and the emphasis is slowly shifting in that direction. SSU includes activities such as country selection, pre-study visits, site selection and initiation, regulatory document submission, contract and budget execution, and enrolling the first patient.


Research indicates that lengthy start-up times are problematic for many stakeholders: companies seeking to develop new treatments, insurers formulating policy, providers, and patients. Addressing this issue is a challenge because too often, information needed to launch clinical trials still resides in multiple databases, leaving SSU activities to be performed using Excel spreadsheets, e-mail, and shared file drives. Consequently, too much time is spent on non-productive activities, such as status meetings, because the desired information is housed in various locations and is not readily available.

These inefficiencies can be minimized using a purpose-built SaaS SSU solution. With this type of solution, real-time viewing of data and smart workflows that standardize processes become possible. Some key advantages of the solution are: it functions as a single repository for study documents; information only needs to be entered once; and documents from the study database can be accessed using a single logon. Overall, the technology is designed to provide better collaboration with sites, improve business processes, identify bottlenecks, and avoid redundant processes. Using cloud-based technology, a better SSU methodology aligns with the goal of faster development by significantly impacting cycle times. This approach leads to greater cost savings and faster market entry, making valuable therapies available to patients sooner.

Conclusion

Industry analysts estimate that the data generated by the pharmaceutical industry doubles every six months, and recently published research indicates that by 2020 approximately 70% of clinical trials will be outsourced to CROs. How will on-premise or custom-built applications handle these scale and business operational challenges? Simply put, they won’t.

Cloud computing is attractive because its inherent scalability, availability and flexibility offer the potential to streamline the clinical development process, accelerate timelines, and cut information technology costs. Additionally, the cloud can add a layer of security and control that is simply not possible with paper-based processes. Introducing these important efficiencies into routine clinical processes helps companies adhere to increasingly aggressive timelines, and comply with the changing nature of global regulations in a timely manner.

And while the pharmaceutical industry might not be the vanguard of innovative cloud technology adoption, one thing remains clear – the cloud will continue to revolutionize the healthcare industry by enabling pharmaceutical companies to bring their drugs to patients faster at a lower cost.

By Craig Morgan, brand development director at goBalto Inc.

Craig is a technology and life sciences management professional with more than 15 years of experience in the application of informatics and bioinformatics to drug discovery. He currently heads up the marketing and brand development functions at goBalto, working with sponsors, CROs and sites to reduce cycle times and improve collaboration and oversight in clinical trials.

Brand Identity Is Now The Crux Of Technology And Business

Identity, Technology and Business

When Tim Cook and Apple pushed back against the FBI’s iPhone hack request, the resulting conflict hit on where we are, and where we’re going, with technology and business. It’s not just about useful tools people can use for convenience and entertainment anymore. It’s about identity.

Apple pushed back because the hack represents an intrusion on privacy. Hacking Syed Farook’s phone provides a direct window on who he is. This would open a wormhole to other iPhone users, too. Now, Apple wants to know how an anonymous third party was able to hack the phone.

Should Apple’s technology, or any sort of tech, protect identities from prying eyes? Or should people’s identities be fair game for organizations that want to use them? So far we’ve seen more of the latter.

Power and Liquidity of Identity

This has been brewing for a while. Businesses and customers both have a stake. Consumer identity has become a commodity. Every touch-point in high-tech commerce hinges on who the customer is. Business and consumer alike trade in the power and liquidity of identity.


First, an example of how the identity protection issue is playing out for American businesses, in a very concrete way. According to Square’s guide to EMV (Europay, Mastercard, Visa), “Almost half of the world’s credit card fraud happens in the United States.” As a result, businesses must switch to a new “processing device” that will accept EMV cards. You’ve probably seen local businesses that have complied, and some that haven’t.

Because of the new EMV requirement, if someone commits credit card fraud with a magnetic stripe card, the business is liable. Credit card fraud is identity theft. Now businesses have to protect customer identity by staying technologically relevant.

Is this appropriate, or ironic? Businesses trade in customer identity, oftentimes without giving the customer a choice; now they don’t have any choice but to protect the identities they trade in.

Analyzing customer preferences, location data, spending habits, and other factors linked to who you are is a part of tech-savvy marketing and sales. Consumer data is a hot item—the topic doesn’t just come up on marketing blogs.

Appnovation is a web development company with a blog post titled “Integrating Customer Data into Your Business Decisions”. The author (whose last name isn’t provided—apparently they wanted to protect his identity) says, “If good information promotes growth and growth allows for success, why doesn’t every organization just do it?” He’s not merely observing the trendy practice of using customer data to grow business—he’s promoting it. There’s technology for tracking “customer experience and actions”, which gives the business an advantage. According to the author, “The insights and data from your analytics product do more than predict and hone customer behavior, they can be a window into your infrastructure’s current health.”

This final statement holds the key to the commoditization of identity. To the observer, what we do determines who we are. And now, what we do, including what we post online and the sites we visit, directly influences business determinations. They want to “hone and influence our behavior”, but that behavior also plays into how a business views itself. Even further than that—consumer behavior determines a business brand’s identity.

Consumers Dictating Brand Messaging


(Image Source: Shutterstock)

Base Creative is an international branding agency. Their brand strategist and senior writer, Rod Parkes, has this to say about the evolving meaning of brands:

The brand owner can no longer dictate the meaning of the brand – the customer defines this, and today’s customers cannot be so easily told what to think. From Amazon to TripAdvisor, websites and social media enable the experience of others to help shape the potential purchaser’s perceptions.”

Web technology—primarily social media and peer reviews—gives consumers the same type of power businesses want to have over them. It’s the power to determine what an entity will do.

Consumer perception and the intersection of technology create a fluid, relative identity for brands, because brands must react to consumer perception. They’re watching us, we’re watching them, and on either side we’re making decisions based on our observations. It’s a dynamic feedback loop.

Personalization 

This relationship between identity, technology, and power is reflected in the trends to watch in 2016. Dynamic personalization, in which brands analyze data to market directly to individuals, reportedly delivers ROI (Return on Investment) five to eight times greater than non-personalized efforts. Personalization is also estimated to boost sales by ten percent. At least ninety percent of the time, though, consumers conduct their own research before they buy something. Brands want to influence research efforts.

On Facebook, for one, brands seek to establish an identity alongside users. Despite that, 62 percent of consumers in a Gallup poll report social media has no influence on their buying decisions whatsoever. This doesn’t stop marketers from pursuing social marketing strategies, such as influencer marketing.

According to this influencer marketing infographic from Simplilearn, influencer marketing is the most effective channel for customer acquisition. Of all the social networks, Facebook is considered the most effective for influencer marketing, with 27 percent of the share. Why would marketers use influencer marketing on social media if 62 percent of people aren’t influenced by it? There’s clearly a disjunction between consumer perception and brand perception.

The respondents to the Gallup poll may not have realized influencers are marketing to them. Word of mouth is wrapped up in the identity of the speaker. You trust what an influencer says because, ostensibly, you know them.

Clearly, the intersection of identities and technologies has created a new playing field for business. As we’re seeing with the Apple vs. FBI case, Apple’s struggle is to maintain a brand image that people associate with consumer identity protection.

The struggle for businesses that use data to personalize marketing is also a power struggle. Does the brand influence the consumer’s purchases more than the consumer influences brand identity? The answer to this question will ultimately determine how people identify with brands, and what brands do with data.

By Daniel Matthews

Did You Know That There Is A Real SHIELD?

The Real Shield

You cannot make this up. The ODNI (Office of the Director of National Intelligence), an Act of Congress and a European Commission special “working group” known as Article 29 are all involved. Blame it on Edward Snowden. The Europeans are “concerned” (meaning: terrified) about the privacy protections surrounding any of their data stored in the US.

What are we talking about? Facebook, Google, Amazon and many more B2C and B2B organizations collect customers’ data and often hold it in their cloud platforms in the US. If your firm works with anyone in the EU and you use the cloud, you need to be aware of the major change that has taken place in just the last six months or so. You could be legally liable and suffer penalties for not following these new regulations.


A little background – until October of 2015, the relationship between the US and EU around privacy protection of EU citizens’ data stored in the US was governed by something set up in 2000 called Safe Harbor. It was basically a self-policing agreement that stipulated that any US company that collected data from EU citizens needed to:

  • Inform them their data was being gathered,
  • Tell them what would be done with it,
  • Obtain permission to pass on the information to a third party,
  • Allow EU citizens access to the data gathered,
  • Ensure data integrity and security and
  • Provide a way to enforce compliance.

But then came the revelations of Snowden. The Europeans were antsy about American Intelligence’s ability to view their personal data, but Snowden really drove them wild. A privacy activist named Max Schrems filed suit in the European Court of Justice against the Irish data protection authority based on his concerns about Facebook transferring his data from Ireland to the US.

The court ruled last October that the Safe Harbor agreement was invalid under the EU’s rules. As you might guess, there was immediately a great deal of confusion over what this meant for the various providers and consumers. There was also a recognition that it would be in all parties’ best interest to create a replacement that would meet the EU restrictions. Hence, SHIELD was born.

The EU-US Privacy Shield, commonly called “Shield”, was forged out of a set of EU-US consultations and changes of law on both sides. There were a few hair-raising moments when it appeared that all the needed steps might not be accomplished by the deadline imposed by the court. But, in the end, they were, and when you look back, it is amazing how fast governments can actually work.

The European Commission did all of the following:

  • Reformed the EU Data protection rules, which apply to all companies providing services on the EU market,
  • Passed the EU-U.S. Umbrella Agreement ensuring high data protection standards for data transfers between the EU and U.S., and
  • Established the Shield for commercial data exchange, which contains obligations on U.S. companies who handle personal data.

For its part, the US Congress passed the Judicial Redress Act of 2015 and President Obama signed it. This has significant consequences for US-based businesses because it means that EU citizens will have the right to obtain judicial redress in the US if American authorities mishandle their data.

So what are some of the consequences and differences from Safe Harbor?

  • Safeguards related to intelligence activities will extend to all data transferred to the U.S., regardless of the transfer mechanism used.
  • The Shield’s dispute resolution framework provides multiple avenues for individuals to lodge complaints, more than those available under the Safe Harbor and alternative transfer mechanisms such as Standard Contractual Clauses or Binding Corporate Rules.
  • An organization’s compliance with the Privacy Shield will be directly and indirectly monitored by a wider array of authorities in the U.S. and the EU, possibly increasing regulatory risks and compliance costs for participating organizations.
  • The Department of Commerce will significantly expand its role in monitoring and supervising compliance, including carrying out ex officio compliance reviews and investigations of participating organizations.
  • Participating organizations will be subjected to additional compliance and reporting obligations, some of which will continue even after they withdraw from the Privacy Shield.

For the big cloud-based providers, none of this represents a real burden, but medium and smaller firms need to ensure their own compliance even if their underlying cloud provider is one of the big boys like Amazon or Microsoft. As they always say: “Consult Your Attorney”.

So, what about the spooks? The EU is still worried that representations by the ODNI are not sufficient (“we don’t do bulk spying”) to assure protections. The bet is the European Commission will probably approve the Shield but the whole thing will still land up in court. Meanwhile, commerce continues to march on and hopefully we will see a complete resolution soon.

By John Pientka
