Category Archives: Technology

Hot Emerging Trends – The Pizza Delivery Drone

The Pizza Delivery Drone

Recently, drone delivery systems have been discussed as future plans by many vendors. I personally think the traceable, IoT-based pizza delivery drone is the next big thing. If you think about it, your pizza makes it to your house much faster. Since the drone carries both a temperature sensor and a GPS locator, you can tell how hot your pizza is and exactly where it is. Traffic issues will no longer result in a soggy crust topped with lukewarm cheese disappointing your now-ravenous family after an hour-plus wait.

From the vendor perspective, there are risks, of course. The first is pizza drone theft. Once the drone is airborne, you still have to control it. That control can be compromised and the pizza intercepted. Worse yet, the far more expensive drone would likely be stolen as well.


Now you could change the way the drone operates. Lay out simple GPS-based maps to the homes of your customers and then allow the drone to fly autonomously to the customer's address. Once the pizza is removed from the drone, you have the customer hit the home button and away the drone goes. In between, the drone avoids objects based on its sensors, but doesn't have remote control options. It would also activate a beacon if it were unable to complete its flight home for some reason.
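
That autonomous flow can be sketched in a few lines. This is a rough illustration only: the state names, methods, and beacon behavior are assumptions for the sketch, not a real drone control API.

```python
# Minimal sketch of the autonomous delivery loop described above.
# States, method names, and the beacon trigger are illustrative
# assumptions, not a real drone API.

class DeliveryDrone:
    def __init__(self, home_gps):
        self.home_gps = home_gps
        self.state = "IDLE"

    def deliver(self, customer_gps):
        self.state = "EN_ROUTE"
        self.fly_to(customer_gps)       # autonomous: GPS waypoint + obstacle sensors
        self.state = "AWAITING_PICKUP"  # customer removes pizza, presses home button

    def home_button_pressed(self):
        self.state = "RETURNING"
        if self.fly_to(self.home_gps):
            self.state = "IDLE"
        else:
            self.state = "BEACON"       # could not complete the flight home

    def fly_to(self, gps):
        # placeholder for onboard navigation; always "succeeds" in this sketch
        return True
```

The point of the sketch is that no remote-control channel exists to hijack: the only inputs are the pre-loaded GPS map and the customer's home button.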

To reduce the overall cost, the pizza place would need to operate a number of drones (based on restaurant volume and density of population). The fleet would probably split its resources with ¼ of the drones at the pizza shop charging and waiting for deliveries, ¼ stopped at the customer's location, ¼ on the way to the customer, and the final ¼ on the way back from the customer. Each drone would have three batteries: one charging, one already charged, and one on the drone, so that they could all stay in the air. I suspect the pizza drone system would take a page from the days of messenger pigeons: the "drone coop" would need to be on the roof of the building, away from where people walk and eat.
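
A quick back-of-the-envelope for sizing such a fleet might look like this. The order volume, round-trip time, and the ¾-in-flight assumption are made-up illustrative numbers, not vendor data.

```python
# Rough fleet sizing for the quarter-split described above.
# All inputs are hypothetical examples.

import math

def fleet_size(orders_per_hour, round_trip_minutes):
    # drones simultaneously occupied by deliveries (Little's law)
    in_flight = orders_per_hour * round_trip_minutes / 60
    # only ~3/4 of the fleet is out at once (1/4 is charging at the coop),
    # so scale up by 4/3 and round up
    return math.ceil(in_flight * 4 / 3)

def batteries_needed(drones):
    # one on the drone, one charged and waiting, one charging
    return drones * 3

drones = fleet_size(orders_per_hour=12, round_trip_minutes=30)
print(drones, batteries_needed(drones))  # 8 drones, 24 batteries
```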

If your delivery range (furthest customer from the store) is less than 10 miles, the drones would be extremely effective and faster than cars. They can fly straight to the destination like the proverbial crow, something cars cannot do. It would also allow you to expand your delivery zone. The deluxe drone could include a way to maintain the temperature of the pizza while flying greater distances.

Beyond pizza, there could also be late-night craving delivery via drone. Drones are pretty quiet in flight and could deliver those special cravings any time of the day or night without disturbing the neighbors. Back in the day, we used to have to get dressed, get in the car and head to the store for those pickle and peanut butter snacks. This way, we hop on the web and then wait on the back porch for the drone. For apartments and hotels, the drones would probably head toward the main entrance and text you when they arrive.

Reality says that this would have to be a credit card-driven industry. That means that since we would be using our cards more, the vendors having this service would need to do a better job of making sure those card numbers were safe. One possible solution would be integrating a secure card reader in the actual drone that would confirm receipt of the midnight snack or the pizza. Upon the door closing, the drone would route home.

This drone delivery model isn’t that far away. The drones available right now can carry upwards of 1-2 pounds. It’s a small peanut butter and a small pizza, but it isn’t that far off from drones with greater capacity. Imagine ordering pizza from your favorite place and when the kids say “where’s the pizza” you simply point at the map on your phone and say “right there.”

By Scott Andersen

The Future Of Cybersecurity

The Future of Cybersecurity

In 2013, President Obama issued an Executive Order to protect critical infrastructure by establishing baseline security standards. One year later, the government announced the Cybersecurity Framework, a voluntary how-to guide to strengthen cybersecurity; meanwhile, the Senate Intelligence Committee voted to approve the Cybersecurity Information Sharing Act (CISA), moving it one step closer to a floor debate.

Most recently, President Obama unveiled his new Cybersecurity Legislative Proposal, which aims to promote better cybersecurity information-sharing between the United States government and the private sector. As further support, the White House hosted a Summit on Cybersecurity and Consumer Protection at Stanford University in Palo Alto on February 13, 2015, which convened key stakeholders from government, industry and academia to advance the discussion on how to protect consumers and companies from mounting network threats.

No doubt we have come a long way, but looking at the front-page headlines today reminds us that we've still got a long way to go. If the future is going to be different and more secure than today, we have to do some things differently.

I recently participated on a panel titled "The Future of Cybersecurity" at the MetricStream GRC Summit 2015, where I was joined on stage by some of today's leading thinkers and experts on cybersecurity: Dr. Peter Fonash, Chief Technology Officer, Office of Cybersecurity and Communications, Department of Homeland Security; Alma R. Cole, Vice President of Cyber Security, Robbins Gioia; Charles Tango, SVP and CISO, Sterling National Bank; Randy Sloan, Managing Director, Citigroup; and moderator John Pescatore, Director of Emerging Security Trends, SANS Institute.

The purpose of this panel was to convene a diverse group of experts who believe in a common and shared goal – to help our customers, companies, governments and societies become more secure. This panel followed on the heels of a keynote address by Anne Neuberger, Chief Risk Officer of the NSA, who spoke about a simple challenge that we can all relate to: operations. Speaking on her experience at the NSA, Neuberger articulated that a lot of security problems can be traced back to the operations, and more precisely, this idea that ‘we know what to do, but we just weren’t doing it well’ or ‘we had the right data, but the data wasn’t in the right place.’

Moderator John Pescatore from SANS Institute did an exceptional job asking the questions that needed to be asked, and guiding a very enlightening discussion for the audience. For one hour on stage, we played our small part in advancing the discussion on cybersecurity, exploring the latest threats and challenges at hand, and sharing some of the strategies and solutions that can help us all become more secure.

Here are the five key takeaways that resonated most.


Topic 1: Threat information sharing tends to be a one-way street. There is an obvious desire from the government to get information from private industry, but a lot more needs to be done to make this a two-way street.

According to Dr. Peter Fonash, Chief Technology Officer at the Office of Cybersecurity and Communications at the Department of Homeland Security, the DHS is looking to play a more active role in threat information sharing. To that end, the DHS is actively collecting a significant amount of information, and even paying security companies for information, including the reputation information of IP addresses. However, the government faces some challenges in participating in that sharing: first, getting the information as "unclassified as possible," and second, the many lawyers involved in making sure that everything shared is shared in a legal manner. Dr. Fonash stressed that government faces another challenge: private industry thinking that government is in some way an adversary or industry competitor when it comes to threat information – this is simply not the case.

Topic 2: There are lots of new tools, the rise of automation, big data mining – but the real challenge is around talent.

Simply stated, our organizations need more skilled cybersecurity professionals than the current supply offers. For cybersecurity professionals, it is a great time to be working in this field – job security for life – but it is a bad time if you are charged with hiring for this role. Automation and big data mining tools can definitely help when they are optimized for your organization, with the right context and analysts who can review the results of those tools. According to Alma R. Cole, Vice President of Cyber Security at Robbins Gioia, in the absence of the skill sets that you aren't able to find, look internally. Your enterprise architecture, business analysis, or process improvement leaders can directly contribute to the outcome of cybersecurity without themselves having a PhD in cybersecurity. While cybersecurity experts are needed, we can't just rely on the experts. Cole makes the case that as part of the solution, organizations are building security operations centers outside of larger city centers like New York and DC – where salaries aren't as high, and there isn't as much competition for these roles. Some organizations are also experimenting with virtual security operations centers, which provide employees with flexibility, the ability to work from anywhere, and improved quality of life, while also providing the organization with the talent it needs.

Topic 3: We are living and doing business in a global economy – we sell and buy across the world and we compete and cooperate with enemies and business partners around the world. We are trying to make our supply chains more secure but we keep making more risky connections.

According to Charles Tango, SVP and CISO at Sterling National Bank, this might be a problem that gets worse before it gets better. We’ve seen a dramatic increase in outsourcing, and many organizations have come to realize that the weakest link in the chain is oftentimes their third party. At this moment in time, as an industry, banks are largely reactionary, and there’s a lot of layering of processes, people and tools to identify and manage different risks across the supply chain. The industry needs a new approach, wherein banks can start to tackle the problem together. According to Tango, we won’t be able to solve this challenge of managing our third and fourth parties on an individual bank-by-bank basis; we have to start to tackle this collaboratively as an industry.

Topic 4: No doubt, the future of applications is changing dramatically, and evolving every day – just look at the space of mobile computing.

According to Randy Sloan, Managing Director at Citigroup, from a dev-ops automation perspective, if you are introducing well-understood components and automation such as pluggable security, you are way out in front, and you are going to be able to tighten things up to increase security. More challenging from an app-dev perspective is the rapidness – the rapid development and the agile lifecycles that you have to keep up with. The goal is always to deliver software faster and cheaper, but that does not always mean better. Sloan advocates for balance – investing the right time in IS architecture and putting the right security testing processes in place rather than focusing only on speed – slowing things down and doing things a bit more thoughtfully.

Topic 5: We’ve got dashboards, and threat data, and more sharing than ever before. But what we need now are more meaningful approaches to analytics that aren’t in the rear view mirror.

I believe over the next few years, organizations will be more analytics driven, leveraging artificial intelligence, automation, machine learning and heuristic-based mechanisms. Now the challenge is figuring out how to sustain it. This is the value of an ERM framework, where you can bring together different technologies and tools to get information that can be distilled and reported out. This is about managing and mitigating risk in real time, and intercepting threats and preventing them from happening rather than doing analysis after the fact.

We live in an increasingly hyper-connected, socially collaborative, mobile, global, cloudy world. These are exciting times, full of new opportunities and technologies that continue to push the boundaries and limits of our wildest imaginations. Our personal and professional lives are marked by very different technology interaction paradigms than just five years ago. Organizations and everyone within them need to focus on pursuing the opportunities that such disruption and change brings about, while also addressing the risk and security issues at hand. We must remember that the discussions, strategies, and actions of today are helping to define and shape the future of cybersecurity.

By Vidya Phalke, CTO, MetricStream

Are You Sure You Are Ready For The Cloud: Type of Cloud

Type of Cloud

Continuing this theme of "Are you ready for the Cloud", we are going to move forward with a new question: What type of cloud? That question carries many different connotations. Is it going to be hosted by a provider, or is it going to be an on-prem cloud? Is it going to be managed or unmanaged? Is it a PRODUCTION cloud, or is it a DR or a DEV cloud?

Today, we are going to talk about a new type, one based on function. How is that possible? Some clouds are actually designed for a specific purpose. It's easy to say that it is a PRODUCTION cloud or a DEV/TEST instance. But what if you need a cloud to perform a certain way for only the applications that get assigned to it?

Take this for example: You need a cloud that can outperform your other clouds because you are going to move some high-IOPS-dependent systems to it. What do you do? How do you build that cloud out to support what you need?

First, you need to have control over the whole cloud, not just the instances. You can create individual high-speed instances if you use a hosted automated system, but if it is going to be an on-prem cloud, then you will need to have control over it.

High Speed


The high-speed cloud is broken down several ways, but we are going to focus on five: Compute, RAM, Network, Local Disk and Remote Disk. So, let's break it down:

  • Compute is easy: it's your CPU cores. At one time we would just say CPUs, but nowadays we know the cores are everything.
  • RAM is RAM. The faster you get, the better you are.
  • Networking is HUGE! If you are building your cloud in a corporate environment, you should be taking advantage of the fastest network speeds available. If you have 10 Gb Ethernet connections, then you should be using a few for every compute node.
  • Local disk is important also. Not only do you need it for your OS, but also for direct storage capability. Local storage is handled differently, so you should be aware of that. If you have 15K RPM disks spinning in the compute nodes, that will give you a great start for simple storage, as long as you watch the sizes of the volumes you create.
  • Remote disk can be a corporation's saving grace. Local storage can fill up fast, and you are limited to the number of IOPS your disks can spit out. Going with a remote flash/SSD system can get you upwards of 500K IOPS – WAY faster than a local disk.

So you start assembling your cloud. You take several Compute nodes, and you load them up with CPU cores, RAM and high-speed local disks. Then you have your Compute nodes attach to the remote disks (or sub systems) to create and provide the big, fast chunks of storage.

Now, when you send your high-speed applications to it, you separate out the application servers and database servers within the cloud so the Compute nodes can focus on running one task, instead of hundreds.

Can you add other server instances to the Compute nodes? Yes, but make sure you do some system tests first to make sure you are not using too much horsepower for one application, and then starving another.
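
That pre-placement test can be as simple as a headroom check before each new instance lands on a node. This is only a sketch: the capacity numbers and the 80% ceiling are illustrative assumptions, not a sizing recommendation.

```python
# Rough sketch of the "don't starve a neighbor" check described above.
# Capacities and the 80% utilization ceiling are made-up assumptions.

def can_place(node_free, instance_need, ceiling=0.8):
    """node_free and instance_need are dicts of cpu_cores, ram_gb, iops."""
    return all(instance_need[k] <= node_free[k] * ceiling
               for k in instance_need)

node_free = {"cpu_cores": 8, "ram_gb": 64, "iops": 20000}
db_server = {"cpu_cores": 4, "ram_gb": 32, "iops": 18000}

print(can_place(node_free, db_server))  # False: it would starve the node's IOPS
```

Here the database server fits on CPU and RAM but would consume nearly all of the node's remaining IOPS, which is exactly the starvation scenario the system test is meant to catch.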

(Image Source: Shutterstock)

By Richard Thayer

Survey Says SaaS Help Desk Is A Must-Have

SaaS Help Desk Is A Must-Have

Cloud-based help desk software lets you automate the management of your IT support, customer support, and IT assets. And it's moving beyond its nice-to-have status and fast becoming a must-have application for companies of all sizes. That's what my company, ManageEngine, discovered in a recent user survey — and that discovery means different things to different people. We'll look at the survey revelations and ramifications below, but here's the key takeaway: Put the right help desk in place, and you will reap substantial rewards in productivity and end-user satisfaction.

Sure, that takeaway might not sound surprising, especially when it comes from a vendor of help desk software, which we are. But here’s the thing. Help desk rewards continue to elude a lot of people.

Some still use email and spreadsheets to manage their end users’ support requests. And others took the plunge and adopted a help desk app only to regret the decision and wish they were still using email and spreadsheets. Why? Because their help desk software creates a big pain in the butt — usually because the software is bulky and bloated, requires extensive customization and results in rigid processes that reduce productivity.

Both groups are missing out on the benefits that the right, purpose-built help desk software confers. And when they don’t see those benefits, they don’t see the necessity of help desk software. Well, if you find yourself in either camp — or just generally dissatisfied with your current help desk solution — our survey results will open your eyes. The rewards are out there.

Numbers Reveal Help Desk Necessity


For context, the survey was conducted among organizations that deployed our help desk software, ServiceDesk Plus Standard Edition, after we made it completely free in March 2014. The Standard Edition includes ITIL incident and knowledge management, and it has no restrictions on the number of technicians, tickets or users — all of which makes it appealing to a lot of first-time users as well as users looking to switch to a more suitable solution.

Now let’s see what the survey respondents reported:

  • 98 percent improved help desk productivity and attained incident management maturity
  • 95 percent saw a significant increase in end-user satisfaction levels
  • 71 percent were able to measure performance by identifying and tracking key metrics
  • 47 percent implemented a help desk software solution for the first time ever
  • 39 percent implemented a knowledge base for the first time ever
  • 20 percent use help desk software beyond IT in other departments such as HR, travel, and maintenance and facilities

Those are all phenomenal numbers, especially the improvements in help desk productivity (98 percent) and end-user satisfaction (95 percent). Email and spreadsheets will only take you so far. To take productivity and satisfaction to the next level, you need a true help desk that has the self-service portal, service level agreement management, help desk reports and related features to enhance workflows and business processes.

Additional Findings

Just as promising is the finding that 71 percent of the respondents are positioned for additional performance gains by virtue of their ability to identify and track key metrics. Simply put, being able to measure their performance better enables them to improve it over time.

In one of the survey’s more striking findings, 47 percent of the respondents reported that they adopted IT help desk software for the first time. When you consider the relative maturity of the help desk market, this result reveals that there’s still a lot of unmet demand from people who have yet to invest in a help desk app.

Clearly, though, users don’t want just any solution. After all, vendors have been offering free versions of their help desks for years, yet we still have this huge demand. No, users want the right solution, and that usually means the right mix of features, usability and cost.

More than one in three respondents (39 percent) implemented a knowledge base for the first time ever. This is a perfect example of opening users' eyes to help desk advantages that they previously didn't know about. A knowledge base is pivotal to productivity and satisfaction gains because it establishes a repository of solutions that can be accessed instantly by users and help desk technicians alike.

Finally, 20 percent of the survey respondents reported that they use help desk software beyond the IT department to handle the support duties in other departments such as HR, travel, and maintenance and facilities. We believe this is the future of the help desk, which will expand beyond IT to centralize the management of services provided to any department. In any given department, the help desk will automate, track, measure, monitor, alert and/or report on tasks and activities that were previously done by hand, by a patchwork of general purpose apps (think email and spreadsheets) or by purpose-built apps that are data silos.

If you’ve been waiting to roll out a dedicated help desk, now would be a good time to make your move. The help desk has become a must-have, and you’ve got more options and more opportunities than ever before. Adopt the right help desk, and your business, your help desk techs and your end users will all thank you.


By Raj Sabhlok

Raj is the president of ManageEngine, a division of Zoho Corp. Raj has particular interest in IT management software and its power to change the fortunes of a business when implemented effectively. Prior to Zoho, Raj spent nearly 20 years working with some of the world’s most innovative technology companies including Embarcadero Technologies, BMC Software and The Santa Cruz Operation (SCO). In his career, he has held technical, marketing, sales and executive management positions within the enterprise software industry. Raj has a bachelor’s degree in mathematics from the University of California, Santa Cruz and an MBA from Duke University’s Fuqua School of Business. He lives in Silicon Valley with his wife and four boys – and his iPhone.

The Future of M2M Technology & Opportunities

The Future Of The Emerging M2M

Here at CloudTweaks, most of our coverage is centered around the growing number of exciting and interconnected emerging markets: wearables, IoT, M2M, mobile and cloud computing, to name a few. Over the past couple of weeks we’ve covered Machine to Machine (M2M) topics such as the differences between IoT and M2M, as well as The Job Future of M2M. Today we will look a little bit further into the opportunities available.

But first, here is a tremendous and well-crafted visual produced by the group at ImperialTechForesight, covering everything from Insect Burger Food Vans to Smart Dust Sensor Networks.



The future of this technology will depend in large part on the advances that are made in the next few years. However, there are many industries which will benefit from having this technology and will expand to exploit it more readily.


Healthcare

This may be the industry most affected by M2M, for a couple of reasons. Currently, this technology is being used in very creative ways, including sensors for patients (which we will be discussing in the next day or two) who are vulnerable to heart attack. The sensor can detect the signs of an impending cardiac arrest and contact emergency medical services.


In addition, the aging baby boomer population is creating a wealth of new jobs in the health care industry. This massive growth, combined with the advancing technology of M2M, will mean far more opportunities for jobs related to M2M in the future.


Consumer Goods

Another area where M2M will have a great impact is the consumer industry, where concepts found in many science fiction films are now becoming reality. When your refrigerator detects that you are low on certain foods and beverages, it will contact your local supermarket, order what is needed, and then pay for it using a credit or debit card number.

This type of system can be used with any products that are consumed or used in the home and need to be replaced. The consumer can control the ordering process by having restrictions placed, but the convenience is a great advantage. There will be a growing market of jobs in the M2M field when it comes to consumer-related goods and services.
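
The ordering logic behind such a system can be sketched very simply. The item names, par levels, and restriction idea below are hypothetical examples, not a real appliance API.

```python
# Toy sketch of the refrigerator-to-supermarket flow described above.
# Item names and par (restock) levels are made-up assumptions.

PAR_LEVELS = {"milk_l": 2.0, "eggs": 6}

def items_to_reorder(current_stock, par_levels=PAR_LEVELS):
    """Return item -> quantity needed to restock up to the par level."""
    return {item: par - current_stock.get(item, 0)
            for item, par in par_levels.items()
            if current_stock.get(item, 0) < par}

order = items_to_reorder({"milk_l": 0.5, "eggs": 6})
print(order)  # {'milk_l': 1.5}
```

Consumer-set restrictions would simply be another filter on this dictionary (for example, a monthly spending cap) before the order is transmitted and paid for.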

Indoor Environmental Monitoring

The HVAC industry may soon undergo another revolution on par with the introduction of air conditioning to the home. However, the use of M2M will expand beyond just HVAC to involve all the energy applications found within commercial buildings and facilities, and perhaps residential homes as well. Recently, at Google I/O, Google experimented at San Francisco's Moscone Center by placing over 4,000 environmental sensors that monitored the temperature, air quality, pressure, motion, light and noise level throughout the conference held in the building. The result was a very strong gain in energy efficiency that improving technology will only build on in the near future.

Heavy service-based industries, for example, have relied on field personnel to make troubleshooting visits that can now be replaced with M2M technology that not only senses what adjustments to make, but also offers early indications of which machinery may break down, so it can be addressed before a major failure occurs. Here, those in the HVAC industry, along with electricians, will gain considerably from this technology.
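
That early-warning idea boils down to a threshold rule on a monitored reading. The vibration metric, thresholds, and action names below are made-up assumptions for illustration, not an industry standard.

```python
# Sketch of the predictive-maintenance rule described above: schedule a
# service visit when a reading drifts past a warning band, well before
# it reaches the failure point. All numbers are hypothetical.

def maintenance_action(vibration_mm_s, warn=7.0, fail=11.0):
    if vibration_mm_s >= fail:
        return "shutdown"        # imminent breakdown
    if vibration_mm_s >= warn:
        return "schedule_visit"  # early indication: dispatch before failure
    return "ok"

print(maintenance_action(8.2))  # schedule_visit
```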

By Brent Anderson

Basic Cloud Risk Assessment Tips

Basic Cloud Risk Assessment

You should worry about the risks of cloud computing. But don’t get too scared. With a few simple steps you can easily get a basic understanding of your risks in the cloud and even have a good start in managing these risks.

If you are a large corporation in a regulated industry, a cloud risk assessment can take weeks or months of work. But even that process starts from simple principles.

Oddly enough, I think any risk assessment of a cloud plan should start with the benefit you are expecting from the cloud service. There are two reasons for that. First, the benefit determines the risk appetite. You can accept a little risk if the benefit is large enough. But if the benefit is small, why take any chances?

The second reason is that not realizing the benefit is a risk as well.


For example, if there is a choice between running your CRM system in-house versus in the cloud, you might find that it takes too long to set up the system in-house and it won’t be accessible by sales people in the field. The cloud system will be quicker to deploy and easier to access from outside your company, so the benefit can be realized quicker.

Pretty essential in any cloud risk assessment is figuring out what data you want to store in the cloud. Most cloud risk management is built on that pillar.

Pay particular attention to data that identifies persons, log files, credit card numbers, intellectual property, and anything that is essential to the conduct of your business. You can easily guess what this means for a CRM system: customers, proposals, contact details.

The second question then is:

What do you want to do with that data?

How is the cloud provider giving you access to that data? Is the access convenient enough? Can you get the reports that you need? In this step you sometimes need to revisit the previous step. For example, as you do your reports, you figure out that you not only stored customer orders in the cloud, but also your product catalog. So add that to the data you should worry about.

Once you have a clear idea of the data and the functionality, you can start looking at the value at risk.
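
One lightweight way to organize that step is a likelihood-times-impact score per data set and failure mode. The scales, scores, and categories below are illustrative assumptions only, not the ENISA methodology or my triage worksheet.

```python
# A minimal risk-triage sketch: score likelihood x impact for a few
# failure modes per data set, then surface the ones above a threshold.
# All scores here are made-up examples on a 1-5 scale.

RISKS = {
    # data set -> failure mode -> (likelihood 1-5, impact 1-5)
    "customer_records": {"leak": (2, 5), "loss": (2, 4), "tamper": (1, 4)},
    "product_catalog":  {"leak": (3, 1), "loss": (2, 2), "tamper": (2, 3)},
}

def top_risks(risks, threshold=8):
    scored = {(data, mode): lik * imp
              for data, modes in risks.items()
              for mode, (lik, imp) in modes.items()}
    return sorted((score, key) for key, score in scored.items()
                  if score >= threshold)

print(top_risks(RISKS))
# [(8, ('customer_records', 'loss')), (10, ('customer_records', 'leak'))]
```

Even a crude table like this makes the next two questions (what is the worst that can happen to the data, and who is likely to cause it) concrete enough to discuss.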


Beginning with the data, think about what the worst thing is that can happen to the data. What about it getting lost, or falling into the hands of the wrong people? What about the chance that it is changed without you knowing (maybe by a colleague who happens to have too many access rights)? In my experience, people overestimate the risk of the cloud provider leaking your data, and underestimate the risk of internal people leaking your data.

Similarly, what happens to the business if the data or the reports are not available for some period of time? How long can your business get by without having full access to the data? In the worst case the provider goes out of business. Can you survive the time it takes to set up a new service?

With that general picture in your head, you can start looking at the threats. The top risks are that the cloud provider fails to deliver, and that the cloud provider leaks information.

A little more subtle are the cases where you think they should be doing something, but they don’t. If you use IaaS, you may think that the cloud provider is patching your operating systems. Typically, they don’t. And any backup that the cloud provider makes does not protect you from a provider going out of business. So you want to review your assumptions on who takes care of which risk.

If anything, you should think about which data you still want to use after you stop working with that cloud service. This is easier to do before the cloud provider runs into trouble. Regular data extraction can be fairly simple. If your provider does not make that easy, well, maybe they should not be your provider.

Further reading? The European Network and Information Security Agency (ENISA) has produced a very good list of cloud risks. I also produced a brief video on that; search for “ENISA top 8 risks” and you will find it on YouTube. For risk assessment purposes I have also created a brief risk triage worksheet, which you can get by signing up to my cloud newsletter.

(Image Source: Shutterstock)

By Peter HJ van Eijk

Cloud Infographic – Internet of Things (IoT) Will Be Top Technology Investment

Internet of Things (IoT) Top Technology Investment

Investors are jumping all over the abundant opportunities when it comes to the Internet of Things and Big Data. There is simply too much money at stake to ignore a potential that is going to truly define how we live and do business in the future.

IDC predicts: “A transformation is underway that will see the worldwide market for IoT solutions grow from $1.9 trillion in 2013 to $7.1 trillion in 2020. IDC defines the Internet of Things as a network of networks of uniquely identifiable endpoints (or “things”) that communicate without human interaction using IP connectivity – be it “locally” or globally.”

As markets for wearables, smart TVs, connected cars and the smart home begin to mature, the venture capitalists are sensing the time for them to take the plunge is ripening. “The connected car and home are as big an opportunity as the connected phone,” said Venky Ganesan, a managing director at Menlo Ventures.

Included is an infographic by IDG which explores IoT adoption.


The Job Future of Machine To Machine (M2M)

The Job Future of M2M

Machine to Machine (M2M), which is associated with the Internet of Things, is arguably the fastest growing area of technology of the past five years. What is still a relatively new concept in a number of fields has grown remarkably in that time and is estimated to reach upwards of $90 billion by 2017.

The concept of M2M is fairly simple: connecting millions of devices together through a network, ranging from heart monitors and vending machines to appliances and building environmental systems. Pretty much anything with software or sensors that can report information to other devices is part of M2M technology.
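The reporting half of that pattern can be sketched in a few lines of Python. The device ID, field names, and message format here are purely illustrative, not any vendor's actual protocol:

```python
import json
import time

def build_report(device_id, sensor_type, value, unit):
    """Package a sensor reading as a JSON message another machine can parse."""
    return json.dumps({
        "device_id": device_id,
        "sensor": sensor_type,
        "value": value,
        "unit": unit,
        "timestamp": int(time.time()),
    })

# A vending machine reporting its stock level -- no human in the loop:
msg = build_report("vend-042", "inventory", 17, "cans")
print(msg)
```

In a real deployment the message would be sent over an IP network (often a cellular link) to a collection service rather than printed.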

M2M Jobs

To be fair, M2M and the Internet of Things (IoT) are inter-related, but not necessarily the same. Actually, M2M goes back several decades to the telecommunication industry where machines were networked with machines as the phone company expanded its services to businesses and the general public. IoT is an outgrowth of M2M.

Thanks to the development of the internet and the foresight of many who saw M2M expanding into industries beyond telecommunications, we now live in a world that is being primed for machine-to-machine technology to grow far beyond what was thought possible two decades ago. However, before this revolution can fully take place, the infrastructure must be set up and functioning, which is where a great deal of M2M job opportunities can be found.

The M2M Job Markets

The relatively simple concept of M2M does require advanced technology to connect. Today, the field includes industries that monitor devices, activities, or changes in status from a distance. It also covers areas where inventory needs to be tracked, and can even detect rising radiation levels. The data is sent through an IP network to other applications or devices, which analyze the information so the appropriate action can be taken.
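That monitor-analyze-act flow can be sketched roughly as follows; the device names, metrics, and thresholds are hypothetical examples, not real system values:

```python
# Hypothetical monitoring loop: readings arrive over the network, are
# analyzed against a threshold, and trigger an action automatically.
THRESHOLDS = {"radiation_uSv_h": 0.5, "tank_level_pct": 10.0}

def analyze(reading):
    """Return an action string if the reading crosses its alert threshold."""
    limit = THRESHOLDS.get(reading["metric"])
    if limit is None:
        return None
    if reading["metric"] == "tank_level_pct" and reading["value"] < limit:
        return f"reorder for {reading['device_id']}"
    if reading["metric"] == "radiation_uSv_h" and reading["value"] > limit:
        return f"alert operators about {reading['device_id']}"
    return None

incoming = [
    {"device_id": "geiger-07", "metric": "radiation_uSv_h", "value": 0.8},
    {"device_id": "tank-3", "metric": "tank_level_pct", "value": 42.0},
]
for r in incoming:
    action = analyze(r)
    if action:
        print(action)  # prints "alert operators about geiger-07"
```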

Currently, M2M does not dominate any single industry; rather, it can be found in many different applications across a breadth of industries, including the following:

  • Industrial Automation
  • Smart Grid & Smart Cities
  • Healthcare
  • Defense
  • Logistics and More

The technology is mostly used for monitoring and, depending on the application, subsequently controlling from a distance. Today, the industries that make the most use of this technology are as follows:

  • Oil & Gas
  • Communications
  • Military
  • Manufacturing
  • Precision Agriculture
  • Public Utilities and more

Currently, these are the major growth industries for M2M technology, which has created a myriad of small businesses that supply these areas. For anyone looking to jump in on machine-to-machine tech, these are the places to concentrate on, particularly the supporting companies that serve the major industries.

In addition, the same technology is found in data networking, transmission, and mobile mesh networking using 3G and 4G cellular features. The communications field is focused on developing the networks that allow M2M to exist and branch out into the many other industries where it is currently available in a more limited fashion. So the job market for this particular technology is also focused on the infrastructure that will allow it to grow exponentially in the near future.

(Image Source: Shutterstock)

By Brent Anderson
