Computing Before And Now – Side-Effects of Technology
It was during the 1970s that we witnessed the beginning of personal computing, with IBM introducing its first portable computer, Intel releasing its first microprocessor, and Apple launching its first computer. Before this period, organizations functioned much as they do today, even without the sheer power of computing technologies. Fifty years later, we find ourselves increasingly dependent on digital computing. Everything is run, managed, or controlled by computers and sophisticated machines. While the advancement of technology over the last five decades has been instrumental in changing our work environment, it has also thoroughly penetrated our daily routines.
(Image Source: Apple/Wikipedia)
Generally speaking, just like any other tool, the primary purpose of personal computing is to perform complex manual tasks faster and more accurately. However, as noted earlier, technology has over time inadvertently become an inseparable part of our personal lives. For example, tablet PCs, mini-computers, and Android devices – all with immense computing power – are in constant daily use, reflecting dependency in its true sense. Researchers are now beginning to reveal the consequences of adopting this technology, including adverse effects on general health. For instance, according to Vibrant Life, “Repetitive motion from typing at a computer keyboard and using a mouse can cause carpal tunnel syndrome. Awkward postures create neck and back problems in children, just as with adults. These problems can interfere with a child’s sleep, performance in school, and the discomfort can create negative attitudes”. Likewise, according to WebMD, eye problems known as Computer Vision Syndrome (CVS) may develop from staring at a computer screen for hours, and research shows that 50% to 90% of people who work at computer screens for prolonged hours develop symptoms of eye-related problems. Despite the innumerable benefits offered by computing technology, the important distinction that needs to be made is the ‘proper’ usage of such technology.
Perhaps the newest wave of technological advancement is ‘cloud computing’. Built on a ‘pay-as-you-go’ model, cloud computing allows you to use computers and servers hosted remotely on the Internet to store, back up, manage, and process data, instead of relying on your own hardware and software. This innovation deepens our dependency on technology because of its ease of use, not to mention the cost reductions it offers corporations and individuals. In other words, cloud computing is not only changing how we work but has also opened many new avenues. Unlike the physical borders between countries, cloud computing is boundary-less. Corporations can reach a broader customer base without having to invest in physical IT infrastructure in the parts of the world where they intend to do business. One of the foremost advantages cloud computing offers is access to skilled human resources throughout the world.
Having said that, our dependency on technology continues to grow as it advances towards ever greater innovations. As we become accustomed to this, we must strike the right balance between using technology as a tool and allowing it to engulf our daily lives. Undeniably, the future promises new innovations and advancements, with greater penetration and adoption of technology across various industries and spheres of society. Nevertheless, given the evident side-effects of technology, we must ensure adequate and ‘proper’ use of this tool.
By Syed Raza