In this world of IoT, iBeacons, Eddystone, mobiles with 3+ GB of RAM, light/gesture sensors, GPS, accelerometers, compasses, cameras and quad/octa-core processors, a natural upgrade would be to add sensors such as:
1. Temperature,
2. Movement/motion,
3. Smell,
4. Pollution, etc. onto the mobiles themselves, and then fuse them with IoT/sensor networks. Mobile apps make this fusion relatively straightforward.
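A minimal sketch of what the app side of such fusion could look like: local readings packaged into a message for a sensor network. The sensor names, values and JSON message format here are all assumptions for illustration, not a real mobile API (on an actual phone these values would come from platform sensor APIs such as Android's SensorManager).

```python
import json
import time

def read_local_sensors():
    # Hypothetical on-device readings; values are illustrative only.
    return {
        "temperature_c": 24.5,   # temperature sensor
        "motion": 0.02,          # movement/motion magnitude
        "smell_index": 1,        # hypothetical smell sensor
        "pollution_aqi": 80,     # hypothetical pollution sensor
    }

def build_fusion_message(device_id):
    """Package local readings for sharing with an IoT/sensor network."""
    return json.dumps({
        "device": device_id,
        "timestamp": int(time.time()),
        "readings": read_local_sensors(),
    })

msg = build_fusion_message("phone-001")
```

A message like this could then be pushed to nearby IoT nodes or other phones over whatever network is available.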
Imagine being stuck in a natural calamity such as a fire, earthquake or flood. If these sensors were fused onto the mobile, together with IoT networks, mobile networks and WiFi, a huge amount of information could flow to and from each person.
The phones could become part of a sensor network (ad-hoc over WiFi, or over 2G/3G/CDMA/4G and so on) and help find better outcomes in any troubling situation. I mention the ad-hoc/WiFi option because, in an emergency, the 2G/3G/4G/CDMA network may go down; we could then use ad-hoc WiFi to build a small local network that fuses with IoT sensors, and perhaps get a temporary VSAT or other internet uplink, which are now available on mobile vehicles as well (broadcasters, news channels, etc.).
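To make the idea concrete, here is a small sketch of how readings received from nearby phones over such an ad-hoc network might be aggregated to flag danger. The message format, device names and the fire-temperature threshold are assumptions for illustration; the networking layer itself is out of scope here.

```python
def aggregate(readings, key):
    """Average a given sensor field across messages from neighboring phones."""
    values = [r[key] for r in readings if key in r]
    return sum(values) / len(values) if values else None

def fire_alert(readings, temp_threshold_c=60.0):
    """Flag a possible fire if the average nearby temperature is high.

    The 60 C threshold is an assumed, illustrative value.
    """
    avg = aggregate(readings, "temperature_c")
    return avg is not None and avg >= temp_threshold_c

# Hypothetical messages received from nearby phones over ad-hoc WiFi:
neighbors = [
    {"device": "phone-002", "temperature_c": 72.0},
    {"device": "phone-003", "temperature_c": 68.5},
    {"device": "phone-004", "temperature_c": 65.0},
]
```

Each phone running logic like this could both raise local alerts and forward the aggregated picture upstream once any internet uplink appears.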
The world of mobility is just opening up for all of us. Information sharing will reach a new level, with the aim of saving lives. All of this can be enabled at the end point via mobile apps built with the Android SDK or Apple's iOS SDKs.