Sonicu's Blog

Timeline to remote temperature monitoring: Where are you?

Written by Jim Mayfield | Jul 8, 2016 8:41:04 PM

A lot has happened since the good doctor Allbutt noodled up the first analog clinical thermometer in the 19th century.

There's that little incident at Kitty Hawk where man broke the bonds of gravity, leading to a Moon Shot or two; the splitting of the atom that changed the way we see ourselves; and a host of other inventions, discoveries, and revolutions from the industrial to the digital.

In today's digital world, staying the same in business is simply not an option: the more things change, the more you probably should change the way you do things.

Unless you're using automated, wireless temperature monitoring with sensors paired to a cloud-based platform, you might as well be using a version of Allbutt's analog thermometer. Truth is, you essentially are; it just takes less time to register.

Where are you on the temperature monitoring timeline?

1866 – English physician Sir Thomas Clifford Allbutt develops the first analog clinical thermometer, which registers body temperature in five minutes.

1973 – Royal Medical Corp. is granted a U.S. patent for the first digital medical thermometer, which revolutionizes temperature monitoring in terms of accuracy, speed, and efficiency. The new technology, however, still requires manual reading and recording.

1977 – early 1980s – Personal desktop computers become available with the introduction of the Commodore PET, Apple II, and Tandy TRS-80.

1985 – Local alarming is introduced: an audible signal sounds when temperature limits at a single unit are exceeded. Temperature data must still be read and logged manually to keep conditions from ever reaching the alarm point.

1985 – 1990 – Desktop computing comes to the masses, and a new paradigm is established with local storage of temperature data. The computer becomes the first temperature data logger. Wired networks and hardware requirements continue to hamper wider application of the technology.

1989 – The first commercial U.S. internet service provider, The World, becomes operational.

1992 – U.S. telecom company Sprint offers the first commercially available dial-up internet access over the public switched telephone network. Remote temperature alarms are now possible through a wired system via automated telephone calls and pagers. Many monitoring companies still rely on these decades-old telephone and pager technologies.

1995 – The internet is fully commercialized with the decommissioning of the National Science Foundation Network, enabling commercial traffic to move across the World Wide Web.

1999 – Wi-Fi wireless networking is introduced by Apple on its new iBook computers. The cord is cut, and wireless, remote temperature monitoring becomes possible.

1999 – Salesforce.com introduces delivery of enterprise applications over the internet via a single website. Cloud-based computing goes mainstream, eliminating the need to invest in and maintain expensive server rooms.

2002 – Amazon Web Services launches a suite of cloud-based services, including data storage and computation.

2008 – Sonicu pioneers a cloud-based light and sound monitoring system for hospital neonatal intensive care units across the country, utilizing Amazon Web Services’ servers, technology, and encryption features.

2016 – Sonicu continues to expand its wireless, remote temperature monitoring and measurement applications, using state-of-the-art cellular radio technology to transmit data from wireless temperature sensors to its cloud-based platform and eliminating the critical security issues that accompany outdated Wi-Fi systems. Sonicu’s platform operates independently of enterprise IT infrastructure, and end users have 24/7/365 access to data and alarming from anywhere in the world.