There's that little incident at Kitty Hawk where humanity took to the skies, leading to a Moon Shot or two; the splitting of the atom changed the way we see the universe itself; and a host of other inventions, discoveries, and revolutions, from industrial to digital.
In today's digital world, standing still in business is simply not an option: The more things change, the more willing you should be to change how you do things.
Unless you're using automated, wireless temperature monitoring and sensors paired to a cloud-based platform, you might as well be using a version of Allbutt's analog thermometer. The truth is, you essentially are; your readings just register faster.
1866 – English physician Sir Thomas Clifford Allbutt develops the first analog-based clinical thermometer that displays body temperature in five minutes.
1973 – Royal Medical Corp. is granted a U.S. patent for the first digital medical thermometer, which revolutionizes temperature monitoring in terms of accuracy, speed, and efficiency. However, the new technology still requires manual reading and recording.
1977 – early 1980s – Personal desktop computers become available with the introduction of the Commodore PET, Apple II, and Tandy TRS-80.
1985 – Introduction of a local alarm that emits an audible signal when temperature parameters at a single unit are exceeded. Temperature data must still be read and logged manually, and staff must respond on-site to keep the alarm from persisting.
1985 – 1990 – Desktop computing comes to the masses, and a new paradigm is established: local storage of temperature data. The computer becomes the first temperature data logger, but the limitations of wired networks and hardware continue to hamper wider adoption.
1989 – First commercial U.S. internet service provider, The World, becomes operational.
1992 – U.S. telecom company Sprint offers the first commercially available dial-up modem providing internet access via the public switched telephone network. Remote temperature alarms are now possible through a wired system via automated telephone calls and pagers. These decades-old telephone and pager technologies are still employed by many monitoring companies.
1995 – The Internet is fully commercialized with the decommissioning of the National Science Foundation Network, enabling commercial traffic to move across the World Wide Web.
1999 – Wi-Fi wireless networking is introduced by Apple on its new iBook computers. The cord is cut, and remote, wireless temperature monitoring becomes possible.
1999 – Salesforce.com introduces the delivery of enterprise applications over the internet via a single website. Cloud-based computing goes mainstream, eliminating the need to invest in and maintain expensive server rooms.
2002 – Amazon Web Services develops a suite of cloud-based services, including data storage and computation.
2008 – Sonicu pioneers a cloud-based light-and-sound monitoring system for hospital neonatal intensive care units across the country, using Amazon Web Services’ servers, technology, and encryption features.
2016 – Sonicu continues to expand its wireless, remote temperature-monitoring and measuring applications, using state-of-the-art cellular radio technology to transmit data from wireless temperature sensors to its cloud-based platform, eliminating the critical security issues associated with outdated Wi-Fi systems.
Sonicu’s platform operates independently of enterprise IT structures, and end users have 24/7/365 access to data and alarms anywhere in the world.