by Rick Bretz
In the movie “Six Days, Seven Nights,” Harrison Ford tells Anne Heche, “we’re going to be here for a long, long, long… long… long, long time.” Well, that’s where we are with information technology and the cloud computing environment, and this technology wasn’t built in six days and seven nights. Computers have been with us since the earliest mechanical devices, such as the abacus and the Babbage Engine concept, and they have steadily infused themselves into our lives in many forms since then.
From the abacus to the SMART 8K television, computers and their handy friends, the microchips that create the SMARTness of the “Internet of Things,” are all around us. The microprocessor jump-started the imaginations of engineers and inventors in the years after its arrival. Its computing power, combined with its small size and modest heat requirements, is what got the whole Internet of Things started.
It’s in the mobile phone, the electric car, the hybrid vehicle, and even gas-powered transportation. It’s in our airplanes, trains, and hotel rooms, and it is a big part of our security posture through the cameras on houses and businesses. It lets us get cash from ATMs and buy things at grocery and mall stores by inserting our chip-enabled credit or bank card when we just can’t get to the ATM.
Once we get into the house, information technology lives in our security system, our home network, our SMART refrigerators, and our high-tech televisions. It has even let bloggers reach thousands of people with the creation of a blog site on WordPress.
The security that protects all of the data on our hard drives and in our CPUs has historically lagged behind, but it has not stalled at a red light on the information superhighway. A black hat hacker has to be right only once, while network security defenses and protocols have to defend against every attack, much like the problem of terrorists going after soft targets. That area of technology is trying to keep up, but it is a wall that is constantly renovated: security professionals keep stacking the bricks while the territory gets longer and wider. All the while, the buying public has purchased and purchased, waiting for the next biggest, smallest, and fastest device in all of the bright colors associated with it.
Moore’s Law is based on Gordon Moore’s prediction that the number of components on a computer chip would double every two years. More precisely, Moore predicted that the number of transistors that could be placed on a single square inch of an integrated circuit would double every two years.
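To see how fast that doubling compounds, here is a minimal sketch of the prediction in code. The starting figure of 2,300 transistors (the Intel 4004 from 1971) is a commonly cited data point used here purely for illustration; the function itself just applies one doubling per two-year step.

```python
def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming a doubling every two years.

    base_count of 2300 corresponds to the Intel 4004 (1971),
    used here only as an illustrative starting point.
    """
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Ten years of doubling multiplies the count 32-fold;
# twenty years multiplies it more than a thousand-fold.
for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

The takeaway is that the growth is exponential: every decade under the prediction means roughly a 32x increase, which is why chips went from thousands of transistors to billions within a working lifetime.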
With all the “whiz bang” stuff out there, what’s really important is not the latest and greatest from a historical standpoint. The worthy topic is a discussion about value. In another string of words: “How much did people pay for the latest hotshot device, and how many forked over the cash?” The follow-up question would be, “How long did the cash-light set have to wait before they could get it?” Let’s take a look.
For several years, the power and features of computers increased while the cost dropped on a rapid cycle; some people put it at 18 to 24 months for costs to drop and computer specifications to rise to another level. That cycle is obsolete now because companies need more time to develop ideas, test concepts, and generally raise the technology bar.
According to a USA Today article from June 2018, an HP 3000 cost 95 thousand dollars in 1971. In 1977, an Apple II cost 1,298 dollars, which, adjusted for inflation, would be more than 5,300 dollars. In 1990, a Commodore VIC cost about 300 dollars; adjusted for inflation, it would be a little over 900 dollars. These prices were for computers with considerably less computing power and speed than today’s models. Today, a consumer can get a laptop, PC, or even a SMART phone for a few hundred dollars and run apps and software with speed and power that people in 1971 and 1980 could only dream of.
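The inflation adjustment behind those figures is simple arithmetic: multiply the historical price by the ratio of today’s price level to the price level in the original year. A quick sketch, where the 4.1 multiplier is an assumption backed out of the article’s own Apple II numbers (1,298 then, roughly 5,300 now), not an official CPI figure:

```python
def adjust_for_inflation(price, multiplier):
    """Scale a historical price by a price-level ratio
    (price level in the target year / price level in the original year)."""
    return price * multiplier

# ~4.1 is implied by the article's own 1977 -> 2018 Apple II figures.
apple_ii_then = 1298
print(round(adjust_for_inflation(apple_ii_then, 4.1)))  # a little over 5,300
```

The same ratio method, with a different multiplier for each starting year, produces the HP 3000 and Commodore figures quoted above.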
Today, you can access the World Wide Web from just about any device, anywhere the infrastructure exists. That brings us to the age of Cloud Computing.
Technology seeks the most efficient form, unless otherwise constrained. Efficient form is defined qualitatively as one that is best adapted to its application or as one with the least number of problems.
This means that the evolution of technology has brought us to the point where we can pull up any document and work any place we want, be it at the beach, the cafe, a hotel room, or a hotel lobby. Today you don’t have to be in the “Office” to finish a job you started in the “Office.” To bring the idea home, you don’t even have to start working in an office when beginning a new job today.
Cloud Computing serves as the next step on the timeline: from local area networks, local storage and infrastructure, and desktop applications residing on a local hard drive, to accessing personalized desktops, data files, and applications while sitting at your favorite coffee shop, with the power of the Cloud doing the work for you.
Since this began with a movie reference, here are two more to ponder, both recently released films that reveal the potential of technology. The movie “Lucy” showed how integrating computer technology and information with human beings can bring dangerous, unintended consequences in the wrong hands but can be a force for good in the right hands.
The other movie aired on Netflix and is called “Extinction.” You have to decide who is doing the dirty work and who is being selected for extinction. The movie centers on artificial intelligence and how far humans are prepared to advance robot technology 100 or more years from now. Without spoiling the plot and conclusion, it surprises you when you try to decide who the protagonists are, who the evildoers are, and whether those categories apply at all.
Here’s an article on the history of computer prices.
Here’s an excellent computer timeline that outlines when each type of computer was introduced to the scientific community and then the buying public.