Patent: Sensors To Predict Actions, iChat VoIP Integration

A new Apple patent application was recently published showing that Apple has been researching more advanced uses of the iPhone’s sensors to predict what a user wants to do. It describes various scenarios in which information picked up by proximity sensors, ambient light detectors, temperature sensors, and accelerometers is used to determine the state of the device, letting it automatically change its configuration to better suit the circumstances.

A few examples of how these sensors, sometimes in conjunction with a Bluetooth headset, might be used to determine the device’s conditions:

  • Accelerometer to determine if device is being picked up
  • Temperature, ambient light, and proximity sensors to determine if the device is in a pocket
  • Proximity sensor to see if the phone is held up to your ear or a Bluetooth headset is being worn
  • Audio detection to see if you start talking into the Bluetooth headset vs. the phone

Based on the information it picks up, the phone could then determine what to do. For example, if it’s in your pocket, it would set the ringer to vibrate. If it detects that it’s near your face, it would know you’re holding it to your ear, so it could automatically turn off the speakerphone. It could also route a call directly to your wireless headset or to the phone, depending on which one it thinks you’re currently using, judging by which you’re speaking into, wearing, or picking up.
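The patent’s sensor-to-action logic can be pictured as a simple two-step mapping: combine raw sensor readings into an inferred device state, then pick a configuration change for that state. The sketch below is purely illustrative — the state names, thresholds, and actions are assumptions, not anything from the patent filing itself:

```python
# Illustrative sketch only, not Apple's actual implementation.
# Step 1: fuse hypothetical sensor readings into a device state.
# Step 2: map that state to a configuration change.
from dataclasses import dataclass

@dataclass
class SensorReadings:
    being_picked_up: bool   # accelerometer detects the device being lifted
    in_dark: bool           # ambient light sensor reads low light
    warm: bool              # temperature sensor reads body heat
    proximity_near: bool    # proximity sensor detects a nearby surface
    headset_audio: bool     # user is speaking into a Bluetooth headset

def infer_state(r: SensorReadings) -> str:
    """Combine sensor inputs to guess the device's situation."""
    if r.in_dark and r.warm and r.proximity_near:
        return "in_pocket"          # dark + warm + covered: likely pocketed
    if r.proximity_near and r.being_picked_up:
        return "at_ear"             # lifted and near a surface: at the ear
    if r.headset_audio:
        return "headset_in_use"     # voice arriving via the headset
    return "idle"

def choose_action(state: str) -> str:
    """Pick the configuration change for the inferred state."""
    return {
        "in_pocket": "set ringer to vibrate",
        "at_ear": "turn off speakerphone",
        "headset_in_use": "route call audio to Bluetooth headset",
        "idle": "no change",
    }[state]
```

So a pocketed phone (dark, warm, covered) would infer `in_pocket` and switch the ringer to vibrate, matching the scenario described above.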

The most interesting of these scenarios is what the iPhone would automatically do when it is placed in or removed from the dock. For example, when placed in the dock during a call, it could automatically switch over to a VoIP call on the computer, routing audio to the computer’s speakers and letting the user talk into a connected microphone. Based on this, it looks like Apple may be planning to add VoIP capabilities to iChat on the Mac.
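The dock scenario amounts to an event-driven handoff decision: the dock event plus the call state determines where audio should go. A minimal sketch, with entirely hypothetical event and action names:

```python
# Hypothetical sketch of the dock handoff scenario described in the patent.
# Event and action names are illustrative assumptions, not Apple APIs.
def on_dock_event(docked: bool, call_active: bool) -> str:
    """Decide how to handle audio when the phone's dock state changes."""
    if docked and call_active:
        # Mid-call docking: hand the call off to a VoIP session on the
        # computer, using its speakers and a connected microphone.
        return "handoff call to computer VoIP"
    if not docked and call_active:
        # Removed from the dock mid-call: bring audio back to the phone.
        return "route call audio to phone"
    return "no change"
```

Docking during an active call yields the VoIP handoff, while undocking mid-call would pull the audio back to the handset.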
[via MacRumors]
