Category Archives: Technology

New Report Finds 1 Out Of 3 Sites Are Vulnerable To Malware


1 Out Of 3 Sites Are Vulnerable To Malware

A new report published this morning by Menlo Security alarmingly suggests that at least a third of the top 1,000,000 websites in the world are at risk of being infected by malware. It’s worth prefacing the findings with the fact that Menlo used Alexa to compile the list of the top million – a ranking service notoriously skewed by spammers and by companies paying to inflate their own numbers – but the trend is still a worrying one.

Although the use of the Alexa data might be questionable, the Menlo study’s methodology was sound. The researchers scanned 1.75 million URLs, checking each one against third-party classification systems to see whether it had been reported as malicious, checking IP addresses against a reputation database, and issuing a web request to each URL so they could fingerprint the response and determine what software was in use. The results are astounding: one in five sites is running software with known vulnerabilities, and one in twenty was identified by third-party domain classification services as serving malware or spam, or as being part of a botnet.
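The fingerprinting step described above can be sketched in a few lines. Everything below is illustrative: the vulnerability table, the `Server` headers, and the URLs are made up, and a real audit would query live sites and a real vulnerability feed rather than this canned data.

```python
# Flag URLs whose fingerprinted server software is on a known-vulnerable list.
# VULNERABLE and the sample responses are hypothetical, for illustration only.
VULNERABLE = {
    ("nginx", "1.4.0"),
    ("Apache", "2.2.15"),
}

def fingerprint(server_header):
    """Parse a 'Server' header like 'nginx/1.4.0' into (name, version)."""
    name, _, version = server_header.partition("/")
    return name, version or "unknown"

def audit(responses):
    """Return the URLs whose fingerprinted software is known-vulnerable."""
    flagged = []
    for url, server_header in responses:
        if fingerprint(server_header) in VULNERABLE:
            flagged.append(url)
    return flagged

sample = [
    ("http://example-a.test", "nginx/1.4.0"),
    ("http://example-b.test", "nginx/1.9.2"),
    ("http://example-c.test", "Apache/2.2.15"),
]
print(audit(sample))  # the two sites running outdated server versions
```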


The report claims that its findings prove that the concept of a ‘trusted’ site is a fallacy – with a billion websites already online and an extra 100,000 being added every day, companies’ websites are now being threatened by other sites that are out of their control. They use the example of the recent Forbes.com hacking, which saw attackers exploit a WordPress vulnerability to insert malicious code into the site that was then delivered via the ‘trusted site’ for an unspecified amount of time – possibly months.

Menlo Security’s CTO, Kowsik Guruswamy, had the following to say: “Vulnerable servers (like in Forbes, Jamie Oliver, etc.) are being exploited by cyber criminals as a launching pad for delivering malware to unsuspecting end users. If anything, this activity is trending higher. As SSL on the Internet becomes more prevalent, enterprises are going to face a much higher risk. This is partly because most enterprises, for privacy reasons, don’t inspect SSL traffic, and this is an easy channel for malware to ride on without getting noticed.”

The Worst Offenders

So which sites and sectors can you trust? Unsurprisingly, the worst offenders in the report were sites labelled as “Hate and Intolerance”, with sites promoting violent and child-abuse content showing vulnerability rates of almost 35 percent. Across the regular web, the report noted concern about the number of sites in typically trusted sectors that showed vulnerabilities; transport, health, and medicine sites had a rate of 20 percent, for example, while tech and business sites both exhibited rates of around 18 percent.

What is the solution? More than $70 billion was spent on cyber security tools in 2014, yet somehow malware always manages to stay one step ahead. Menlo argues that incidents similar to the Forbes hack will become increasingly common until someone addresses the source of the problem by developing a tool that can stop web attacks before they reach their target, rather than simply investing in new tools that do a better job of detecting infected systems and limiting the impact of security breaches.

What do you think? Is Menlo’s report accurate? How do you envisage the future of cybersecurity? Let us know in the comments below.

By Daniel Price

Cloud Infographic – Big Data Predictions By 2023


Big Data Predictions By 2023

Everything we do online – from social networking to e-commerce purchases, chatting, and even simple browsing – yields tons of data that certain organizations collect and pool together with other partner organizations. The result is massive volumes of data, hence the name “Big Data”. This includes personal and behavioural profiles that are stored, managed, and analyzed on demand.

Big Data is growing at an explosive rate. Last week we discussed how data is growing exponentially. Today, our best estimates suggest that at least 2.5 quintillion bytes of data are produced every day, and Google alone processes roughly 3.5 billion requests per day. That is a lot of data in transit, all of which requires a lot of storage space at the end of its journey.

Today, we’ve come across a very insightful infographic, courtesy of Open4u, which focuses on a number of Big Data trends and predictions over the next 5-10 years.


Big Data and Financial Services – Security Threat or Massive Opportunity?


Big Data and Financial Services

Our Cloud Banking Insights Series focuses on big data in the financial services industry and asks whether it is a security threat or actually a massive opportunity. How does big data fit into an overall cloud strategy? Most financial institutions (FIs) have a positive mind-set towards cloud IT consumption, as it not only enables savings across IT investment but frees up precious resources and surplus so FIs can invest in customer-centric activity. Financial institutions have long – and on many occasions lifelong – relationships with their customers, and the convergence of cloud and data analytics helps FIs work towards the common goals of building those relationships and increasing product uptake by tailoring and appropriately segmenting their approach. There is a great deal of “noise” around this topic, both in terms of its real meaning and in terms of the disruption it can create around security and privacy. Throughout this article we will share some insight on both, to help reduce the “noise levels” from the industry.

What is Big Data?


Big data has been around for a number of years now and everyone has an opinion on it. It is not unusual to see two individuals having a conversation about big data where neither person is talking about the same thing: one says it is all about volume, the other that it is about variety. Some even suggest that small and fast data shouldn’t be classified under big data at all. It is like two individuals discussing Ruby and Java – both are programming languages, but neither has much to do with the other, and only when you apply some context does the conversation become productive. So let us establish some context on the topic and then see how it applies to financial services.

Big data is really all about four things, known as the 4 V’s of Big Data:

  • Volume – relates to the size of the data. New approaches process huge amounts of data by cutting it into small “bite-sized” pieces and running the algorithms with massive parallelism.
  • Variety – relates to the structure of the data. This also covers new ways of ingesting, handling and processing data that can originate from multiple structure types, from relational to non-relational sources, including social platforms such as Twitter, Facebook and other service-type data aggregators.
  • Veracity – relates to the accuracy of Big Data. The focus is on the uncertainty introduced by imprecise and inaccurate data.
  • Velocity – relates to the speed at which the data is ingested or processed. This matters because big data brings different ways to treat data depending on the ingestion or processing speed required: it can be, for example, data coming from a sensor that sends small status updates very quickly, or data that must be processed in “near real-time” because of its criticality, as in fraud detection and compliance mechanisms.
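The Volume idea above – cutting data into “bite-sized” pieces and processing them with parallelism – can be illustrated with a toy word-counting job. The data, chunk size, and worker count are arbitrary choices for the sketch, not recommendations:

```python
# Split a dataset into small chunks and count words across them in parallel.
from concurrent.futures import ThreadPoolExecutor

def chunk(data, size):
    """Cut data into bite-sized pieces of at most `size` items."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def count_words(piece):
    """The per-chunk workload: count the words in one piece."""
    return sum(len(line.split()) for line in piece)

lines = ["big data is big"] * 100 + ["four vs"] * 50
pieces = chunk(lines, 25)                      # 6 pieces of 25 lines each
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(count_words, pieces)) # process pieces in parallel
print(total)  # 100*4 + 50*2 = 500 words
```

The same split-then-combine shape is what frameworks like Hadoop and Spark apply at cluster scale.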


(Image Source: IBM)

Now that we understand the 4 V’s, you might be asking why this is important and what generated these massive amounts of data. It is a great question, and the answer is quite simple.

Data Explosion


Data has been growing around 10x every five years, and much of that growth comes from newly introduced data types – from clickstream, time-series and columnar data to spatial data types. The potential is sky-high when this data explosion is placed alongside the consumerization of IT and the huge number of people connected through social channels.

The potential is huge, but the “noise” levels right now are so high that it is difficult to achieve real value. This happens because, as with many other innovations, companies already have some kind of Big Data solution which, in their opinion, will revolutionize the way they work – but when you dig a bit deeper, do they really know how to capitalise on it? Another misconception is the idea that Big Data is all about Hadoop and what you can achieve if you leverage it. This is definitely not the case. Big Data is an approach, a strategy to generate better business insights and intelligence. The technologies and products we use will always depend on what the enterprise’s goals really are. If we are talking about storing the data (yes, Big Data can also be about how you store data) we might focus on MongoDB, Cassandra, RavenDB and others; if we are talking about processing, we might look at Hadoop, Spark, Kinesis, Streaming Analytics and others. Every choice depends on the context, and the Hadoop ecosystem is not the one that will solve everything. There is no such thing as “one size fits all” or a “silver bullet”, and that type of thinking leads to disastrous consequences.

Looking at the market, it would be fantastic to see a common, unified approach to products that enable Big Data analytics, but unfortunately we are still far from that. This is very similar to what is happening with the cloud: just because you have it doesn’t make you an expert in it, and by no means does it guarantee that you are optimising its potential. Let’s not delude ourselves – we are all still trying to figure out the real power of this for our businesses, and success will depend on how we leverage it. To succeed in this transformation we first need to be humble enough to stop and learn without preconceptions, and to understand that this is really another tool in our toolbelt to help our business succeed. Most importantly, we need to learn to get the right data, at the right time, to the right person. Only then will we make this successful.

Level-setting our Big Data Conversations

To reduce the noise levels that come with anything new, we need to start using some level-setting questions. The following should help you successfully achieve your commercial and customer-engagement objectives:

  • What is the data frequency of the data being processed? Is it on-demand, continuous feed or real-time?
  • What is the analysis type which should be used? Is this batch or streaming/real-time?
  • What’s the processing methodology we should use? Predictive analytics, analytical (like social network analysis), or pre-emptive analytics?
  • What is the structure of data we are receiving? Is it unstructured, semi-structured or highly structured?
  • What type of data sources do we need to work with? Web & Social, Machine generated, human generated, biometric, transactional system or other?
  • What is the volume of data received? Are these massive chunks or small and fast chunks?
  • What will be the consumers of this data? Will it be human, business process, other enterprise applications or other repositories?
  • Are we talking about how to store this data or how to process it?

After answering these questions you should have a clear notion of how much “noise” you have been dealing with, and why you sometimes have conversations about big data where it seems you and the other person are speaking different languages. This context should enable a truly productive discussion, because everyone will be singing from the same hymn sheet.

Now that we understand the context, we also need to understand how the Big Data approach works and why. It is based on four phases:

  1. Aggregate – This is where we focus on understanding the different data sources and how the data is ingested. The goal is to aggregate all the data and gain a unified view across all the different sources, independently of where they came from.
  2. Enrich – This is the phase where we refine the data, transform it, and perform data cleansing to make sure we have all the data we need, in its most accurate form.
  3. Analyse – This is where we implement our algorithms, whether analytical, predictive or pre-emptive.
  4. Visualize/Expose – Finally, this is where we focus on creating a concrete visualization over all our data, and exposing the results of our processes to both people and other solutions.
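The four phases can be sketched as a tiny pipeline. The records and the “flag large amounts” rule below are invented purely to show the shape of the flow, not a real analysis:

```python
# A minimal Aggregate -> Enrich -> Analyse -> Visualize/Expose pipeline.
def aggregate(*sources):
    """Unify records from multiple sources into one view."""
    return [rec for source in sources for rec in source]

def enrich(records):
    """Cleanse: drop incomplete records and normalise the amount field."""
    return [
        {**r, "amount": round(float(r["amount"]), 2)}
        for r in records if r.get("amount") is not None
    ]

def analyse(records):
    """A trivial stand-in 'algorithm': flag unusually large amounts."""
    return [r for r in records if r["amount"] > 1000]

def visualize(flagged):
    """Expose the results in a human-readable form."""
    return [f"{r['id']}: {r['amount']:.2f}" for r in flagged]

crm = [{"id": "a1", "amount": "1500.004"}]
web = [{"id": "b2", "amount": "12.5"}, {"id": "b3", "amount": None}]
report = visualize(analyse(enrich(aggregate(crm, web))))
print(report)  # only the cleansed, flagged record survives the pipeline
```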

Big Data in Financial Services


Financial services has everything to gain from Big Data when the strategy in place is sound. Consider retail banking: with more data available it becomes much easier to define which products will suit a specific customer, because when all the social data is merged with the customer’s internal financial data, it forms a near-accurate picture of the customer’s preferences, likelihood of investment, personality and, most importantly, any associated risk. Applications of Big Data in the financial services industry are just scratching the surface; many more will appear in the meantime – for example, the ability for FIs to build a massive database of social and professional information about customers, against which they can run near real-time risk analysis to determine the insurance premium you will need to pay, and so on.

Fraud Detection Example

Let’s use an example to check whether what we do today to perform fraud detection on credit cards is basically a Big Data problem or not.

Let us start by setting the context of what Big Data means in this example.

  • What is the data frequency of the data being processed? Is it on-demand, continuous feed or real-time?
    • In this case we will have two different speeds and processes running. First, we have a real-time feed provided by the consumer transaction systems, which gives us the transactions happening right now and which we need to respond to. Second, we will have a feed from the Anti-Money Laundering (AML) system, which can be either continuous or on-demand depending on the system we are using or connecting to.
  • What is the structure of data we are receiving? Is it poly-structured or highly structured?
    • This is a poly-structured data problem, because on one side we have unstructured information provided by our card payment machines (“our connected things”), and on the other we have massive data warehouse processes containing the historical data that enables us to rate each transaction’s trust level.
  • What is the analysis type which should be used? Is this batch or streaming/real-time?
    • The decisions need to happen in near real-time, because we need to increase the security of our customers by pre-empting issues before they happen – but that is not all that is required. To score a transaction’s fraud level we also need batch processing, because we need the historical analysis to train our processes and avoid false positives.
    • So in reality, in this example, we will need to use a combination of both, and only then will we be successful.
  • What’s the processing methodology we should use? Predictive Analytics, Analytical (like Social Network Analysis or Pre-emptive Analytics?
    • In this example we need both: analytical processing, since we need to understand what happened in the past and what patterns we should look for to find fraud; and predictive analytics, to predict whether a given customer transaction happening right now is really fraud or not. This is done by using the historical data to train an artificial intelligence process using LDA (Latent Dirichlet Allocation), Bayesian learning neural networks, and peer group analysis.
  • What is the structure of data we are receiving? Is it unstructured, semi-structured or highly structured?
    • This data is received in multiple ways. Sometimes it is structured, but more often it is really unstructured or semi-structured data.
  • What type of data sources do we need to work with? Web & Social, Machine generated, human generated, biometric, transactional system or other?
    • In fraud detection we have multiple data sources generating the data the system uses. Some of it comes from a device, like the card machine used for the payment; other data is generated by the AML system, by other systems, and even by humans, as in peer group analysis.
  • What is the volume of data received? Are these massive chunks or small and fast chunks?
    • The volume of data received is huge. Normally the consumer transaction systems produce small, fast chunks of data, whereas the AML system delivers more massive chunks.
  • What will be the consumers of this data? Will it be human, business process, other enterprise applications or other repositories?
    • Mainly the consumers of this data will be other enterprise applications and other repositories, since we need to act on it in real-time, store it for audit purposes, and train our AI algorithms.
  • Are we talking about how to store this data or how to process it?
    • No, we are not talking only about storing data. This is mainly about data processing, even though for audit and training purposes we do need to store it – some of it time-series based, some document based, and some relational.
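Pulling the answers together, a drastically simplified version of the batch-plus-real-time scoring might look like the following. A real system would use the LDA, Bayesian and peer-group models mentioned above; here a per-customer statistical baseline and an arbitrary threshold stand in for them, and the transaction history is invented:

```python
# Batch phase: learn per-customer spending baselines from history.
# Near real-time phase: score a live transaction against its owner's baseline.
from statistics import mean, stdev

def train_baselines(history):
    """Batch: compute each customer's mean and standard deviation of amounts."""
    by_customer = {}
    for customer, amount in history:
        by_customer.setdefault(customer, []).append(amount)
    return {c: (mean(a), stdev(a)) for c, a in by_customer.items()}

def score(baselines, customer, amount, threshold=3.0):
    """Near real-time: flag amounts far outside the customer's baseline."""
    mu, sigma = baselines.get(customer, (0.0, 1.0))
    return abs(amount - mu) / (sigma or 1.0) > threshold

history = [("alice", 20), ("alice", 25), ("alice", 22),
           ("bob", 500), ("bob", 480), ("bob", 510)]
baselines = train_baselines(history)
print(score(baselines, "alice", 5000))  # wildly out of pattern: flagged
print(score(baselines, "bob", 505))     # typical for this customer: not flagged
```

The key point is the combination the example calls for: the baselines come from batch processing of historical data, while each incoming transaction is scored against them immediately.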

Now we see the real complexity, and how important setting the context is. By now you should have a fairly good idea of the context of this example, how Big Data can help, and how to structure your conversation around it.

Summary

So, in summary, Big Data is an overhyped buzzword which by itself doesn’t mean anything. We need to dig much deeper to really understand what it is all about, and always remember to:

  1. Understand the business goals so we know what we need to achieve.
  2. Perform a level-set around which part of the Big Data problem we are talking about, so we can focus the discussion.
  3. Understand what to focus on in each stage of the approach: Aggregate, Enrich, Analyse and Visualize/Expose.
  4. Provide the right data, to the right person, at the right time.
  5. Focus also on visualizing the data, because without visualization it might not be understood.

We hope this article helps you achieve your goals and better understand how Big Data can change financial services. With all the data we have, it is only a matter of getting the right people (data scientists, data stewards, data engineers) working on it, and we will be much better placed to fulfil our customers’ needs.

By Diaz Ayub

Listening To Music: From The Paleolithic To iPods


Music, in the human sense, has been around for as long as people could grunt. While some modern music might make audiophiles believe we haven’t come that far, the medium through which we enjoy it has evolved as fast as humanity itself.

While recorded music is obviously a modern invention, one of the earliest known musical instruments dates back nearly 40,000 years. Found in 1995, the Divje Babe flute predates the term ‘woodwind’, as it was actually carved out of a cave bear’s femur. This simple flute is thought to be the work of Neanderthals, who sadly didn’t stick around for the invention of recorded sound. The first device capable of recording sound was built by Parisian Édouard-Léon Scott de Martinville in 1857.

High Speed Music Revolution

From there, the evolution of sound recording and enjoyment entered a high-speed revolution, beginning with the 1877 invention of the phonograph by renowned inventor Thomas Edison. This physical sound recorder used physical deviations cut into discs or cylinders to record sound. Playback followed a similar process, using a stylus to track the deviations and produce vibration – thus music. These recordings didn’t hit the mainstream until the arrival of radio broadcasting, with Reginald Fessenden making the first audio broadcast on Christmas Eve in 1906, followed by the eventual rise of public radio in the 1920s.

The buying and selling of musical recordings was born in 1926 with the invention of the prototype LP record. This early record had roughly five minutes of playtime per side and, oddly enough, actually played from the inside out. The first commercial LP was released in 1948 by Columbia Records; it featured Nathan Milstein performing the Mendelssohn violin concerto. LP records remain a popular medium for listening to music to this day.

The 8 Track

From here, music mediums evolved quickly but with less success. In 1963 the 8-track tape was invented by William Powell Lear. This format gave way to the magnetic cassette tape in the 1970s, which ultimately led to arguably one of the most important consumer inventions of the century: the TPS-L2, otherwise known as the Sony Walkman.


(Image Credit: Hugo Rodriguez)

For the first time, music was truly portable. Released in Japan in 1979 and in the United States in 1980, the Walkman allowed two people to listen to tapes on the go and even initially featured a “hot line” function, allowing users to enable a microphone and talk over the music during playback. Sony would introduce a series of portable tape players over the following decades before the next big wave hit in 1982.

(Infographic Credit To Virgin Music)


Digital Music Begins

1982 marked the launch of the compact disc, a format which, now in its dying days, brought LP records to their knees through the ’90s and into the 2000s. The small disc allowed users to listen to full albums without flipping records or sides. The medium was a joint product of Philips and Sony, with the first discs manufactured in Germany, and it was followed by the launch of portable CD players, once again pioneered by Sony.

The next big wave was a slow burner. Introduced to the mainstream in 1988-89, the MP3 was the first isolated, digital format for music. It allowed users to listen to music on their computers and on primitive MP3 players. Most people probably didn’t know what an MP3 was until 1999 and the launch of Napster. That story, in all its legalities and intricacies, requires its own chapter.


The iGeneration Begins…

But what exactly is a primitive MP3 player? Well, the answer is any MP3 player released before 2001, the year Apple launched its first iPod. The product took the world by storm, launching Apple to the top of not just the technology field but also making it a huge player in the music industry. That statement was validated in 2003 with the launch of the iTunes Music Store. For the first time, massive catalogues of music were available with just one click online. The iTunes Music Store continues to be a driving force in the music industry today.

Cloud Streaming Music Begins…

The late 2000s marked a series of consumer music improvements that are still being felt to this day. Popular platforms like MySpace, SoundCloud, and Bandcamp allow smaller artists to promote and expose their music to new audiences. Likewise, services like Spotify, Pandora, and Rdio launched in the late 2000s, bringing streaming to a wide variety of devices including the aforementioned iPod, Apple’s iPhone, and Google’s Android devices. These cloud services represent the forefront of music enjoyment today, with features such as preloading Spotify streams to your smartphone for enjoyment in out-of-service areas, and cross-platform listening through the cloud becoming ever more prevalent.

The future is where things always get interesting. At this point it will be very hard to detach the smartphone from the music player for the common consumer, meaning the launch of another revolutionary standalone music player is unlikely, with the exception of niche products such as Neil Young’s PonoPlayer, an extreme high-fidelity player for those not content with standard audio formats. Alternatively, the future could lie in past technologies, with formats such as the LP record making an incredible comeback post-iPod. Another route could come with the advent of virtual reality: products like the Oculus Rift could merge music and gaming to create musical virtual-reality interfaces, allowing music to grow from a single-sensory experience into something grander. Regardless, the future is bright for those looking to listen to music.

By Keith Holland

The History Of 3D Printing – From Kidneys To Cars


The History Of 3D Printing

3D printing, also called additive manufacturing, has been one of the break-out technologies of the last 18 months. People are now clamouring to get involved, with printable objects ranging from handguns to medical equipment. Yet, despite its recent popular emergence and contrary to popular belief, 3D printing has been in existence for more than 35 years, with the inception of the concept traced back to 1976 and the first working examples arriving in the early 1980s.

(TED Talk – What is next for 3D printing?)

Here we take a brief look at its history:

1981/2

The first published account of a printed solid model was made by Hideo Kodama of the Nagoya Municipal Industrial Research Institute in either 1981 or 1982 (accounts vary). His paper theorised about a rapid prototyping system that used photopolymers to build a solid, ‘printed’ object up in layers, each corresponding to a cross-sectional slice of a model.

1984

Charles Hull, who went on to found 3D Systems Corporation, invented stereolithography – a process which lets designers create 3D models using digital data that can then be used to create a tangible object. Hull went on to file a number of patents on the concept of 3D printing, many of which are used in today’s processes.

1992

3D Systems Corporation built the first stereolithographic apparatus (SLA) machine. It used a UV laser to solidify photopolymers, making 3D objects layer by layer. Although the machine faced difficulties, it proved for the first time that complex objects could be ‘printed’ overnight.

1999

The stroke of the new millennium saw a world first as the first 3D printed organ was implanted into a human. Scientists at the Wake Forest Institute for Regenerative Medicine printed a human bladder, covered it in the recipient’s own cells, and then implanted it. It was a scientific breakthrough: because the device used the patient’s own cells, there was virtually no chance the implant would be rejected.

2005

3D printing collided with the open-source movement for the first time in the middle of the decade, in an idea championed by Dr Adrian Bowyer at the University of Bath in England: a printer that could print itself, thus making units and parts cheaper, more accessible, and easier to distribute.

2008

After the success of the bladder in 1999 and of the first printed kidney in 2002, 2008 saw the first 3D printed prosthetic limb. It incorporated all the parts of a biological limb and was printed ‘as is’, without the need for any later assembly.

2011

The world’s first 3D printed car was launched by Kor Ecologic at the TEDxWinnipeg conference. Designed to be ‘green’, the prototype could reportedly do 200 mpg and would retail for between $10,000 and $50,000, depending on the model.

Today

The cost of 3D printers is falling phenomenally. Machines that cost around $20,000 just three years ago are now being developed to sell for under $500, making the technology increasingly available to the average consumer and laying the foundations for a potential explosion in the home-inventor and home-DIY markets. It’s not hard to imagine a future where a trip to the dentist ends with a new tooth printed on the spot, a car crash is fixed by printing a new chassis, or furniture is simply downloaded and printed at home.

Included below is an infographic by engineering.com which illustrates the brief history of this growing industry.


What do you think the future holds for the 3D printing industry? Let us know in the comments below.

By Dan Price

To Be Heard And To Be Paid: The Cloud Computing Music Industry


The Cloud Computing Music Landscape

For the duration of my college life certain questions were a given when I came home to see my parents. I’d be asked about my grades. I’d be asked about my eating habits. But, first and foremost, I’d be asked why the hell I was still playing in that band I started in high school. You know, that band that never made a cent playing dive bars across southwestern Ontario and Québec. Usually the conversation was short, but from time to time we’d dig deep into the artistry of the whole “playing guitars loud game”, something I called the balance (or lack thereof) between being heard and being paid.


Over the last decade, the gray area between these two polar opposites has widened into an even greater divide. Where before the biggest concern was making enough gas money to get to the next show, the advent of streaming services like Spotify, Pandora, and Rdio has thrown a complicated and ironic wrench into the already rusted gears of being an independent or aspiring musician. Off the bat, I would say 99% of artists probably couldn’t care less what Spotify pays them. In a world where paying for music is largely a last resort, employed only when streaming and/or downloading fails, independent artists have bigger fish to fry than Pandora’s embarrassingly small artist payouts. Having this problem is a luxury for “Band X” on their fifth straight day of Taco Bell and old socks in a van with five smelly men or women. The birth of streaming monsters like Spotify, Pandora, and Rdio isn’t really about artists at all (sadly). It’s about end users and service providers, and both have never had it better.

Streaming Leaders Battling It Out

It has never been easier to find music. Between streaming platforms steadily serving up personalized, themed, and curated content, and ever more ways to communicate with artists, music is everywhere for those who wish to find it and support it. It truly is a golden age for music fans who want something for nothing. Streaming providers, meanwhile, are living the high life, with The Motley Fool reporting that Pandora was leading the charge with 76.5 million subscribers as of Q3 2014. Spotify, with 60 million subscribers, has added 10 million in the last two months and is well on its way to becoming the most popular cloud-based music service, doubling Pandora’s growth and boasting a much higher ratio of paying users (25% for Spotify versus 5% for Pandora).

While this is great news for the emerging industry, it hasn’t all been roses for the big guns. There have been public splits (Taylor Swift pulled her music in November 2014), leaked royalty payments (showing Pharrell Williams generated only $2,700 in songwriting royalties in Q1 after racking up 43 million streams), and industry lawyers chumming the water as they try to hold onto their last card in the deck in the wake of the Napster explosion of the 2000s. It’s a turbulent time. As Swift’s move proved, though, you can be very successful without streaming services behind you. Swift captured 2014’s first platinum record following her public break-up with Spotify, telling Time Magazine that other services like Rhapsody are more artist-friendly, listener-exclusive, and place “a perception of value on what I have created.” On the other hand, the streaming services not only survived but thrived after losing their big fish, continuing to grow healthily.

The Rise Of Vinyl

music-sales

It is my opinion that the music industry in general will go down as one of the most interesting to observe, and I believe we are living through its most interesting time period. Are we watching a 20-year shuffle of deck chairs after the iceberg that was Napster ripped into its hull? Have the rising sales of vinyl records thrown the industry a line as a more niche product? Could Spotify someday "be" the music industry, producing and owning the content it streams? Is a legal battle brewing that will see artists and labels deny Spotify, Pandora, and RDIO access to their respective catalogues? The only thing that is certain in all this is that as the distance between listeners and their music shrinks, the gap between artists and their cut will grow, begging the old question: are you in this to be heard, or are you in this to be paid?

(Image source: Shutterstock)

By Keith Holland

Keith is based out of Montreal, Canada, has a background in journalism, and has been writing for several years. Keith is a music technophile who not only loves listening to music, but also creates his own tunes with his current band.

The Internet of illness

The number of postings about IoT solutions has continued to rise; it is a wave that hasn't crested yet. I've posted several here on CloudTweaks, as have a number of other authors, covering IoT topics from industrial use to what IoT is going to change around the world. It got me thinking, for some reason, about the common cold. Why couldn't the IoT revolution help limit the spread of the common cold? The Internet of illness, if you will.

First off, let's consider illness and work. Many times we feel we can go into work with the minor cold we have. However, it would behoove employers to support workers staying home when they are ill. Remote technology and cloud-based meeting solutions make it easier than ever to support home-based workers. Yes, you may lose a small amount of productivity, but if that person comes into the office and infects 20 other people, you suddenly go from a (potentially) small loss of productivity to a much bigger one.
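The trade-off above can be sketched as a quick back-of-the-envelope model. All the figures below (sick days, number of coworkers infected, remote-work efficiency) are illustrative assumptions, not data from the article:

```python
# Back-of-the-envelope comparison: a sick employee working from home
# vs. coming in and infecting coworkers. All numbers are illustrative.

def lost_person_days(sick_days: float, infected_coworkers: int,
                     remote_efficiency: float = 0.8) -> float:
    """Total productivity lost, in person-days, for one illness scenario."""
    # The sick employee still works remotely, at reduced efficiency.
    own_loss = sick_days * (1 - remote_efficiency)
    # Each infected coworker is assumed to lose the full sick period.
    return own_loss + infected_coworkers * sick_days

stay_home = lost_person_days(sick_days=3, infected_coworkers=0)
come_in = lost_person_days(sick_days=3, infected_coworkers=20)
print(f"stay home: {stay_home:.1f} person-days, come in: {come_in:.1f}")
```

Even with generous assumptions about presenteeism, the infection term dominates as soon as a couple of coworkers catch the cold.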

tech-cold

Second, there are a number of health solutions popping up for people to consider and use: smart thermometers that connect to your phone and check whether your symptoms are actually part of something worse, and devices that can sniff the contents of the food you are considering for pathogens and things you can't eat.

Third, there is the reality of connections. The Internet supports connections of many types, and medical information can now go directly from a home monitoring device to your doctor or medical professional. The value of this growing body of information the IoT generates is that it can help you feel better, faster.

PrescriptionMeds

Epidemics cause people to do things differently than expected. But if you know up front who is sick and who is not, you might not have to wear that surgical mask everywhere. The next step beyond knowing who might be sick around you is the question: are you sick? A smart thermometer or smart medical device could quickly tell you that, yes, you are sick. The next question becomes: are you contagious? You could then decide whether to go to work or stay home.

Illness Alert! Stay Home

stay-home

(Image Source: Shutterstock)

Going one step further, there could also be sensors placed in the doors of offices that scan people on the way into the building, with a screen in front of you flashing "you are asked to work from home today due to illness." Again, one person's annoying cold can take another employee out for two weeks, so while this is extreme, it is something that could be done in the best sense of doing the right thing. Of course, if you didn't work there and were waiting in reception, you could still infect the entire office building as a visitor; but we did say it was going a step too far, not all the way to crazy.

The Internet of Illness could help reduce the spread of infections quickly. Simply knowing in advance that you are sick would allow you to work from home, and the technology is now available to make working from home effective, and for some people even more productive than going into the office. It is, in the end, the next step in the Internet of Things revolution: your phone can quickly tell you whether what you are feeling is a cold, the flu, or worse, and can notify your doctor so that when you arrive to have the illness verified you are routed away from other people (if you are contagious).

Science hasn't beaten the common cold yet, but technology could radically limit its spread.

The Internet of illness – it’s a thing!

By Scott Anderson

The Importance Of Having A Flexible Monitoring Tool

(Post Sponsored By Site 24×7)

The world is increasingly moving towards a web-based economy. Regardless of what industry you are in, it is almost impossible to be competitive and maintain a brand presence without a strong online offering. This means that your company's website is increasingly the first interaction that users, clients, and customers have with your business. As the old saying goes, first impressions are everything, so it's vital that your IT team (specifically the DevOps and operations teams) has all the necessary tools at its disposal to make sure everything is working as it should, looks great, and is live.

With this in mind, it is important to use the most flexible and all-encompassing tools available. Why split performance monitoring, web app monitoring, server monitoring, and app performance monitoring between different providers and different tools when there are offerings on the market that provide an all-in-one solution? If it's possible to monitor both your internal network and your public/private cloud infrastructure with the same software, would that not be more beneficial, cheaper, and more streamlined?
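As a rough illustration of what such a tool does under the hood, here is a minimal availability probe in Python. The URLs are placeholders, and a real monitoring product adds scheduling, alerting, and many geographically distributed probe locations on top of this basic check:

```python
# Minimal sketch of an availability probe, the basic unit of website
# monitoring. The URLs below are placeholders, not real endpoints.
import time
import urllib.request

def check(url: str, timeout: float = 10.0) -> dict:
    """Fetch a URL once; report whether it is up and how fast it answered."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except Exception as exc:  # DNS failure, refused connection, timeout...
        return {"url": url, "up": False, "error": str(exc)}
    return {"url": url, "up": 200 <= status < 400, "status": status,
            "response_time_s": round(time.monotonic() - start, 3)}

for url in ("https://www.example.com", "https://intranet.example/health"):
    print(check(url))
```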

One of the better tools available at the moment is Site24x7's SaaS product. Widely recognised as one of the most flexible monitoring tools for IT and DevOps, it has featured heavily in the media and has been covered by Gartner, 451 Research, EMA, and Virtualisation Practice, amongst others. Indeed, Gartner placed it in their ‘Magic Quadrant’ for Application Performance Monitoring, and it was Network World’s ‘Product of the Week’.

But what makes Site24x7 so flexible? As one of their clients said, “Site24x7 is outstanding in the way it provides a swiss-army knife of various network monitoring tools at an affordable price” (Sridhar P, Director of Engineering, Sastra Technologies).

Let’s look at a number of their free tools:

System Admin:

  • Website Monitoring: Site24x7 offers global website availability testing from more than 50 locations around the world.
  • Traceroute Generator: Helps you troubleshoot routing problems and alerts you if something breaks.
  • Server and Application Monitoring: Instant notifications and performance tracking for physical hosts.

Validation Tools:

  • Server Header: Check headers and verify HTTP status codes
  • HTML Validator: Find errors using W3C standards
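To illustrate what a server-header tool like the one above checks, a bare-bones version can be written with the Python standard library alone. This is a generic sketch, not Site24x7's implementation, and the URL in the usage comment is a placeholder:

```python
# Bare-bones version of a server-header check: issue a HEAD request and
# return the status code plus response headers for inspection.
import urllib.request

def server_headers(url: str, timeout: float = 10.0):
    """Return (status_code, headers_dict) for a HEAD request to url."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status, dict(resp.getheaders())

# Example usage (placeholder URL):
#   status, headers = server_headers("https://www.example.com")
#   print(status, headers.get("Server"), headers.get("Content-Type"))
```

A hosted tool wraps the same idea with redirect tracing, multiple probe locations, and a friendlier report.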

Content Tools:

  • Speed Report: Find out if your webpage’s loading time is optimised
  • Link Explorer: Explore links in a certain URL

Developer Tools:

  • JSON Formatter: A formatter and validator to help create coherent JSON data
  • XML Formatter: To make sure your XML data is formatted correctly
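What a JSON formatter/validator does can be sketched in a few lines: parsing doubles as validation, and re-serialising gives consistent formatting. This is a generic standard-library sketch, not Site24x7's implementation:

```python
# Generic sketch of a JSON formatter/validator: json.loads validates the
# input, json.dumps re-serialises it with consistent indentation and
# key ordering.
import json

def format_json(text: str, indent: int = 2) -> str:
    """Raise json.JSONDecodeError on invalid input; otherwise return a
    cleanly indented, key-sorted rendering."""
    return json.dumps(json.loads(text), indent=indent, sort_keys=True)

messy = '{"b": 1, "a": {"c": [1, 2]}}'
print(format_json(messy))
```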

They also offer SLA tools, on-premise and mobile network pollers, and end user experience monitoring; you can check out their website for a full list of free tools. Their monitoring capabilities include:

  • Website Performance Monitoring
  • Web Application Monitoring
  • Web Page Analyzer
  • Service Monitoring
  • Real User Monitoring (RUM)
  • Application Performance Monitoring (APM)
  • Cloud Monitoring
  • VMWare Monitoring
  • Server Monitoring (Windows & Linux)
  • Internet Network Monitoring (On-Premise Poller)
  • Exchange Server Monitoring
  • DNS Server Monitoring
  • SSL Certificate Monitoring
  • FTP RTT Monitoring
  • Mail Server Monitoring
  • Mobile Application Performance Monitoring (Mobile APM)

site24-7-infographic_001

The company offers five plans at incredibly affordable prices. The basic plan, perfect for bloggers and freelancers, starts at $4.50 per month. The full-feature plans, which give you the full power of the software, start at $35 per month; the lesser of these is great for small IT teams and MSPs, while the more expensive plans ($89 and $449 per month) are aimed at SMEs and large-scale businesses.

If you’ve got any questions about their products, or you want to discuss signing up for a free 30-day trial, you can contact them via a web-form or at sales@site24x7.com.

By Dan Price

CloudTweaks Comics
Cloud Security Risks: The Top 8 According To ENISA

Cloud Security Risks Do cloud security risks ever bother you? It would be weird if they didn’t. Cloud computing has a lot of benefits, but also a lot of risks if done in the wrong way. So what are the most important risks? The European Network and Information Security Agency did extensive research on that, and…

Mobile Connected Technologies – The Future Of The Healthcare Industry

Mobile Connected Technologies Clinics, hospitals, and other healthcare facilities are embracing new mobile technologies in order to be more efficient in their daily tasks. With faster communication and better collaboration, clinicians can spend much less time handling medical devices and more time administering care to their patients. Industry experts are stating that mobile connected technologies…

Will Your Internet of Things Device Testify Against You?

Will Your Internet of Things Device Testify Imagine this: your wearable device is subpoenaed to testify against you. You were driving while over the legal alcohol limit, and data from a smart Breathalyzer device is used against you. Some might argue that such a use case could potentially safeguard society. However, it poses…

5 Reasons Why Your Startup Will Grow Faster In The Cloud

Cloud Startup Fast-tracking Start-ups face many challenges, the biggest of which is usually managing growth. A start-up that does not grow is at constant risk of failure, whereas a new business that grows faster than expected may be hindered by operational constraints, such as a lack of staff, workspace and networks. It is an unfortunate…

Low Cost Cloud Computing Gives Rise To Startups

Balancing The Playing Field For Startups According to a Goldman Sachs report, cloud infrastructure and platform spending could reach $43 billion by 2018, up $16 billion from last year and representing growth of around 30% from 2013, said the analyst. This phenomenal growth is laying the foundation for a new breed of startup…

Cloud Infographic – What Is The Internet of Things?

What Is The Internet of Things? “We’re still in the first minutes of the first day of the Internet revolution.”  – Scott Cook The Internet of Things (IOT) and Smart Systems are based on the notions of Sensors, Connectivity, People and Processes. We are creating a new world to view and measure anything around us through…

Explosive Growth Of Data-Driven Marketing

Data-Driven Marketing There is an endless amount of data being accumulated, dissected, and analyzed, with the important bits extracted and used for a number of purposes. The amount of data in the world has already reached multiple zettabytes annually; a zettabyte is one million petabytes, or one thousand exabytes. With data…

Cloud Infographic: Programming Languages To Build Your Cloud

Programming Languages What programming languages are the building blocks to help develop and facilitate these present and future cloud platforms? Where can we learn and develop these skills in order to help us build our own careers? A couple of options would be to visit sites such as Stackoverflow which can provide you with a good source of information.…

5 Predictions For Education Technology

Education Technology Although technology has fast influenced most sectors of our world, education is an area that’s lagged behind. Many classrooms still employ the one-to-many lecturing model wherein the average student is catered for while a few are left behind, and others bored. Recently, there’s been a drive to uncover how to use technology successfully…

Cloud Computing – The Good and the Bad

The Cloud Movement Like it or not, cloud computing permeates many aspects of our lives, and it’s going to be a big part of our future in both business and personal spheres. The current and future possibilities of global access to files and data, remote working opportunities, improved storage structures, and greater solution distribution have…

Your Biggest Data Security Threat Could Be….

Paying Attention To Data Security Your biggest data security threat could be sitting next to you… Data security is a big concern for businesses. The repercussions of a data security breach range from embarrassment to costly lawsuits and clean-up jobs, particularly when confidential client information is involved. But although more and more businesses are…

Three Reasons Cloud Adoption Can Close The Federal Government’s Tech Gap

Federal Government Cloud Adoption No one has ever accused the U.S. government of being technologically savvy. Aging software, systems and processes, internal politics, restricted budgets and a cultural resistance to change have set the federal sector years behind its private sector counterparts. Data and information security concerns have also been a major contributing factor inhibiting the…

Are Cloud Solutions Secure Enough Out-of-the-box?

Out-of-the-box Cloud Solutions Although people may argue that data is not safe in the Cloud because using cloud infrastructure requires trusting another party to look after mission critical data, cloud services actually are more secure than legacy systems. In fact, a recent study on the state of cloud security in the enterprise market revealed that…

Achieving Network Security In The IoT

Security In The IoT The network security market is experiencing a pressing and transformative change, especially around access control and orchestration. Although it has been mature for decades, the network security market had to transform rapidly with the advent of the BYOD trend and emergence of the cloud, which swept enterprises a few years ago.…

Lavabit, Edward Snowden and the Legal Battle For Privacy

The Legal Battle For Privacy In early June 2013, Edward Snowden made headlines around the world when he leaked information about the National Security Agency (NSA) collecting the phone records of tens of millions of Americans. It was a dramatic story. Snowden flew to Hong Kong and then Russia to avoid deportation to the US,…

3 Keys To Keeping Your Online Data Accessible

Online Data Data storage is often a real headache for businesses. Additionally, the shift to the cloud in response to storage challenges has caused security teams to struggle to reorient, leaving 49 percent of organizations doubting their experts’ ability to adapt. Even so, decision makers should not put off moving from old legacy systems to…

Disaster Recovery – A Thing Of The Past!

Disaster Recovery  Ok, ok – I understand most of you are saying disaster recovery (DR) is still a critical aspect of running any type of operations. After all – we need to secure our future operations in case of disaster. Sure – that is still the case but things are changing – fast. There are…

Protecting Devices From Data Breach: Identity of Things (IDoT)

How to Identify and Authenticate in the Expanding IoT Ecosystem Protecting IoT devices and their associated data is a necessity. As the IoT ecosystem continues to expand, the need to create an identity for newly-connected things is becoming increasingly crucial. These ‘things’ can include anything from basic sensors and gateways to industrial controls…

How To Overcome Data Insecurity In The Cloud

Data Insecurity In The Cloud Today’s escalating attacks, vulnerabilities, breaches, and losses have cut deeply across organizations and captured the attention of regulators, investors and, most importantly, customers. In many cases such incidents have completely eroded customer trust in a company, its services and its employees. The challenge of ensuring data security is far more…