Fusion of Active and Passive Sensors for Fast 3D Capture
Yang, Qingxiong; Tan, Kar-Han; Culbertson, Bruce; Apostolopoulos, John
Keyword(s): 3D, depth, sensors, time-of-flight, stereo, multiview, real time, remote collaboration, human interaction
Abstract: We envision a conference room of the future where depth sensing systems capture the 3D position and pose of users, enabling them to interact with digital media and content shown on immersive displays. The key technical barrier is that current depth sensing systems are noisy, inaccurate, and unreliable. It is well understood that passive stereo fails in non-textured, featureless portions of a scene. Active sensors, on the other hand, are more accurate in these regions but tend to be noisy in highly textured regions. We propose a way to synergistically combine the two to create a state-of-the-art depth sensing system that runs in near real time. In contrast, the only previously known fusion method is slow and fails to take advantage of the complementary nature of the two types of sensors.
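The complementary behavior described in the abstract can be illustrated with a minimal sketch: weight the passive (stereo) depth estimate more heavily where the image is textured, and the active sensor's estimate more heavily where it is featureless. This is not the paper's actual algorithm; the gradient-based texture confidence and the convex per-pixel blend below are illustrative assumptions.

```python
import numpy as np

def texture_confidence(image, eps=1e-6):
    """Per-pixel texture measure from image gradients, normalized to [0, 1].
    High values mark textured regions where passive stereo is more reliable."""
    gy, gx = np.gradient(image.astype(np.float64))
    mag = np.sqrt(gx ** 2 + gy ** 2)
    return mag / (mag.max() + eps)

def fuse_depth(stereo_depth, active_depth, confidence):
    """Convex per-pixel blend: textured pixels lean on the stereo estimate,
    featureless pixels lean on the active sensor's estimate."""
    return confidence * stereo_depth + (1.0 - confidence) * active_depth

# Toy scene: left half is featureless, right half is textured (random pattern).
rng = np.random.default_rng(0)
img = np.zeros((4, 8))
img[:, 4:] = rng.random((4, 4))

stereo_depth = np.full((4, 8), 2.0)  # hypothetical passive stereo depth map
active_depth = np.full((4, 8), 3.0)  # hypothetical active sensor depth map

conf = texture_confidence(img)
fused = fuse_depth(stereo_depth, active_depth, conf)
# In the flat region, confidence is ~0, so the fused depth follows the
# active sensor; in the textured region it shifts toward the stereo estimate.
```

A real system would derive confidence from matching costs and sensor noise models rather than raw image gradients, but the blend structure is the same.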
Additional Publication Information: To be presented at IEEE International Workshop on Multimedia Signal Processing 2010, Saint-Malo, France. October 4, 2010
External Posting Date: August 21, 2010 [Fulltext]. Approved for External Publication
Internal Posting Date: August 21, 2010 [Fulltext]