Sonicu's Blog

Timeline to Remote Temperature Monitoring: Where Are You?

Written by Jim Mayfield | Jul 8, 2016 8:41:04 PM

A lot has happened since the good doctor Clifford Allbutt noodled up the first practical clinical thermometer in the 19th century.

There's that little incident at Kitty Hawk where man broke the bonds of gravity, leading to a Moon Shot or two; the splitting of an atom that changed the way we see ourselves; and a host of other inventions, discoveries, and revolutions from industrial to digital.

In today's digital world, standing still in business is simply not an option: the more things change, the more you should probably change the way you do things.

Unless you're using automated wireless temperature sensors paired to a cloud-based platform, you might as well be using a version of Clifford's analog thermometer. Truth is, you essentially are; it just takes less time to register.

Where are you on the temperature monitoring timeline?

1866 – English physician Sir Thomas Clifford Allbutt develops the first practical clinical thermometer, a compact analog instrument that registers body temperature in about five minutes.

1973 – Royal Medical Corp. is granted a U.S. patent for the first digital medical thermometer, which revolutionizes temperature monitoring in terms of accuracy, speed, and efficiency. However, the new technology still requires manual reading and recording.

1977 – Early 1980s – Personal desktop computers become available with the introduction of the Commodore PET, Apple II, and Tandy TRS-80.

1985 – Introduction of a local alarm that emits an audible signal when temperature limits at a single unit are exceeded. The burden of manually reading and logging temperature data continues.

1985 – 1990 – Desktop computing comes to the masses, and a new paradigm is established with local storage of temperature data. The computer becomes the first temperature data logger. Wired networks and dedicated hardware, however, continue to limit how far the technology can spread.

1989 – First commercial U.S. internet service provider, The World, becomes operational.

1992 – U.S. telecom company Sprint becomes the first major carrier to offer commercial dial-up internet access over the public switched telephone network. Remote temperature alarms become possible through wired systems that issue automated telephone calls and pages. Many monitoring companies still rely on these decades-old technologies.

1995 – The internet is fully commercialized with the decommissioning of the National Science Foundation Network (NSFNET), opening the network to commercial traffic.

1999 – Wi-Fi wireless networking reaches the consumer market when Apple builds it into its new iBook computers. The cord is cut, and wireless remote temperature monitoring becomes possible.

1999 – Salesforce.com introduces the delivery of enterprise applications over the internet from a single website. Cloud computing goes mainstream, eliminating the need for investment in and maintenance of expensive server rooms.

2002 – Amazon Web Services launches a suite of cloud-based services, including data storage and computation.

2008 – Sonicu pioneers a cloud-based light and sound monitoring system for hospital neonatal intensive care units across the country, utilizing Amazon Web Services’ servers, technology, and encryption features.

2016 – Sonicu continues to expand its wireless remote temperature monitoring and measurement applications. State-of-the-art cellular radio technology transmits data from wireless temperature sensors to its cloud-based platform, avoiding the critical security issues that accompany outdated Wi-Fi systems.

Sonicu’s platform operates independently of enterprise IT infrastructure, and end users have 24/7/365 access to data and alarms from anywhere in the world.
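The shift that timeline describes, from a person reading a thermometer and writing the number down to a sensor reporting on its own, is easier to picture with a small sketch. The Python below shows the general idea of an automated reading cycle: sample a temperature, package it with a timestamp and an alarm flag, and push it to a cloud endpoint. The endpoint URL, sensor ID, and alarm thresholds are hypothetical placeholders for illustration, not Sonicu's actual interface.

```python
import json
import time
import urllib.request
from datetime import datetime, timezone

# Hypothetical cloud endpoint and thresholds -- placeholders, not Sonicu's API.
CLOUD_ENDPOINT = "https://monitoring.example.com/api/readings"
LOW_ALARM_C = 2.0   # example limits for a vaccine refrigerator
HIGH_ALARM_C = 8.0


def read_sensor() -> float:
    """Stand-in for reading a wireless temperature probe."""
    return 4.6  # fixed value for illustration


def build_payload(temp_c: float) -> dict:
    """Package the reading with a timestamp and an alarm flag."""
    return {
        "sensor_id": "fridge-01",
        "temperature_c": temp_c,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "alarm": temp_c < LOW_ALARM_C or temp_c > HIGH_ALARM_C,
    }


def push_reading(payload: dict) -> None:
    """POST the reading as JSON to the cloud platform."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("cloud responded:", resp.status)


if __name__ == "__main__":
    while True:
        try:
            push_reading(build_payload(read_sensor()))
        except OSError as exc:
            print("transmit failed, will retry next cycle:", exc)
        time.sleep(300)  # one automated reading every five minutes
```

In practice the transmit step rides over a cellular radio rather than a wired or Wi-Fi connection, which is the point of the 2016 entry above; the person with the clipboard drops out of the loop entirely.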