Recently, PrimeSense (acquired by Apple for $350 million in 2013) announced an advanced UI/UX technology (demo video) for controlling smart home IoT (Internet of Things) devices using its 3D sensor (demo video). Patents can provide insight into the technical details behind these smart home control applications. The following sections illustrate the technical details of the UI/UX technology based on the related PrimeSense patents' disclosures.
Flexible Smart Home Controls
PrimeSense patent application US20140225824
describes technical details regarding the virtual smart home control buttons: A
control unit projects images of control devices onto a wall of the room and
remotely senses contact with and manipulation of the projected devices by a
user (or more precisely, contact with the wall on which the images are
projected and gestures of the user's hand and fingers while in this situation).
The projected devices may range from a simple on/off switch to more complex
controls, such as dials, sliders, and keypads. The user may modify, add or
remove control devices at will by interaction with the control system, such as
by holding and dragging a projected device along the wall to a new location.
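As a rough sketch of how a control unit might represent these projected controls internally, each virtual device can be modeled as a typed control with a wall position that the user is free to change. The class and field names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from enum import Enum, auto


class ControlType(Enum):
    SWITCH = auto()   # simple on/off button
    DIAL = auto()     # rotary control, e.g. dimming
    SLIDER = auto()   # linear control, e.g. temperature
    KEYPAD = auto()   # multi-key input


@dataclass
class ProjectedControl:
    control_id: str
    control_type: ControlType
    wall_id: str                   # wall the image is projected onto
    position: tuple[float, float]  # (x, y) on that wall, in meters
    value: float = 0.0             # current state (0/1 for a switch, 0..1 otherwise)

    def move_to(self, wall_id: str, position: tuple[float, float]) -> None:
        """Relocate the projected control, e.g. after a hold-and-drag gesture."""
        self.wall_id = wall_id
        self.position = position
```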
The following figure illustrates smart home control with a projection-based virtual control system.
The key element of the projection-based
control system 20 is a room control unit 22, which controls the operation of
electrical equipment in the room, such as lights 24, 26, an air conditioner 28,
and a media system 30 (which may play audio, video or other content), for
example. These items of equipment are typically wired through the walls and ceiling of the room to control unit 22, rather than to conventional electrical control devices mounted on the walls of the room. Alternatively,
control unit 22 may operate items of electrical equipment via wireless links.
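A minimal sketch of how the control unit might abstract over wired and wireless equipment links so that either can be used interchangeably follows; the class names and command format are assumptions for illustration, not part of the patent disclosure:

```python
from abc import ABC, abstractmethod


class EquipmentLink(ABC):
    """Transport the room control unit uses to reach one item of equipment."""

    @abstractmethod
    def send(self, command: str, value: float) -> None: ...


class WiredLink(EquipmentLink):
    """Equipment hard-wired through the walls/ceiling to the control unit."""

    def __init__(self, circuit_id: int):
        self.circuit_id = circuit_id

    def send(self, command: str, value: float) -> None:
        print(f"[wired circuit {self.circuit_id}] {command} = {value}")


class WirelessLink(EquipmentLink):
    """Equipment operated over a radio link; the protocol is not specified here."""

    def __init__(self, address: str):
        self.address = address

    def send(self, command: str, value: float) -> None:
        print(f"[wireless {self.address}] {command} = {value}")


class RoomControlUnit:
    """Loosely corresponds to control unit 22: maps equipment names to links."""

    def __init__(self):
        self.links: dict[str, EquipmentLink] = {}

    def register(self, name: str, link: EquipmentLink) -> None:
        self.links[name] = link

    def command(self, name: str, command: str, value: float) -> None:
        self.links[name].send(command, value)


unit = RoomControlUnit()
unit.register("light_26", WiredLink(circuit_id=3))
unit.register("air_conditioner_28", WirelessLink(address="ac-bedroom"))
unit.command("light_26", "power", 1.0)
```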
Room control unit 22, which may be
conveniently mounted on or in the ceiling, projects images of control devices
32, 34, 36 onto a wall of the room. A user 38 interacts with these projected
devices with gestures of his hand and fingers, as though they were actual electrical controls. A sensor in control unit 22 detects contact between the
user's fingers and the projected devices and controls the electrical equipment
in the room accordingly. Thus, for example, user 38 can touch an on/off switch
in device 36 to cause the control unit to turn light 26 on or off, or may turn a
dial in device 32 to cause the control unit to brighten or dim lights 24. As
another example, user 38 can move a slider in device 34 to change the room
temperature (and control unit 22 can project the actual and/or target
temperature onto device 34, as well, as though the device were an actual
thermostat).
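The way touches on projected controls translate into equipment commands, as in the light, dimmer, and thermostat examples above, could look roughly like the sketch below; the control representation and the `send_command` callback are illustrative assumptions:

```python
def handle_touch(control: dict, touch_value: float, send_command) -> None:
    """Translate a touch on a projected control into an equipment command.

    `control` carries the control's type, current value, and target equipment;
    `send_command(target, command, value)` delivers the command to the equipment.
    """
    if control["type"] == "switch":
        # e.g. device 36: toggle light 26 on or off
        control["value"] = 0.0 if control["value"] else 1.0
        send_command(control["target"], "power", control["value"])
    elif control["type"] == "dial":
        # e.g. device 32: brighten or dim lights 24
        control["value"] = touch_value
        send_command(control["target"], "brightness", touch_value)
    elif control["type"] == "slider":
        # e.g. device 34: set the target room temperature; the control unit can
        # also project the actual/target temperature back onto the wall
        control["value"] = touch_value
        send_command(control["target"], "temperature_setpoint", touch_value)


handle_touch({"type": "switch", "value": 0.0, "target": "light_26"},
             touch_value=1.0,
             send_command=lambda target, cmd, val: print(target, cmd, val))
```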
One of the advantages of the projection-based
control system 20 is that the locations and forms of control devices 32, 34, 36
can be changed by user 38 at will. For this purpose, control unit 22 can
implement a touch interface with functionality similar to that offered by
current touch screens. For example, when control unit 22 senses extended
contact between the user's finger and one of the control devices, the control
unit selects and visually highlights the device. The user can then drag and drop the control device to a new location by sliding his finger along the wall
of the room to the desired location. Control unit 22 can simultaneously project
a moving image of the control device along the wall next to the user's finger
until the user "drops" the device in its new location. User 38 can
use appropriate gestures or other inputs to enlarge or shrink the control devices, as well as to copy a device appearing on one wall to a location on
another wall. In this manner, for example, the user will be able to add a
"switch" for light 26, so that it will be possible to turn the light
on and off from both a location next to the door of the room and a location
next to the light itself. No additional wiring or other modifications
whatsoever are needed for this purpose.
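The hold-to-select and drag-and-drop behavior described above can be sketched as a small gesture tracker. The hold threshold and method names are assumptions, and the control is represented here as a simple dictionary with a `position` entry:

```python
import time

HOLD_SECONDS = 0.8  # extended contact required to select a control (assumed value)


class DragDropTracker:
    """Tracks one finger's contact with the wall and relocates a projected control."""

    def __init__(self):
        self.selected = None       # control currently highlighted and being dragged
        self.contact_start = None  # time at which the current contact began

    def on_contact(self, control: dict, position: tuple, now: float | None = None):
        now = time.monotonic() if now is None else now
        if self.selected is None:
            if self.contact_start is None:
                self.contact_start = now
            elif now - self.contact_start >= HOLD_SECONDS:
                self.selected = control       # select and visually highlight it
        else:
            # While dragging, project the control next to the moving finger.
            self.selected["position"] = position

    def on_release(self) -> dict | None:
        """Finger lifted from the wall: 'drop' the control where it is now."""
        dropped, self.selected, self.contact_start = self.selected, None, None
        return dropped
```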
Flexible IoT Device Remote Control
3D sensing technology can be exploited for
controlling IoT devices remotely. PrimeSense patent application US20140304647 describes technical details regarding the remote IoT device control system. The following figure illustrates the gesture-mediated remote information input system.
The gesture-mediated remote information input
system 10 incorporates a sensing device 12, typically a three-dimensional
camera, which detects information that includes the body (or at least parts of
the body) of a user 14.
Information detected by sensing device 12 is
processed by a remote device 18, which drives a display screen 20 accordingly.
Sensing device 12 is connected to remote device 18 via a sensing interface 22,
which may comprise a Bluetooth adapter, an Infrared Data Association (IrDA)
device, a cable connection, a universal serial bus (USB) interface, or any
communication interface for outputting sensor data that allows remote device 18
to import remote sensing data. Remote device 18 typically comprises a
general-purpose computer processor, which is programmed in software to carry out these functions. The software may be downloaded to the processor in
electronic form, over a network, for example, or it may alternatively be
provided on tangible storage media, such as optical, magnetic, or electronic
memory media. Alternatively, some or all of the image functions may be
implemented in dedicated hardware, such as a custom or semi-custom integrated
circuit or a programmable digital signal processor (DSP).
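A sketch of how remote device 18 might pull depth frames from sensing device 12 through an interchangeable sensing interface and drive the display from the tracked gesture; the class names, device path, frame format, and tracking stub are assumptions for illustration:

```python
from abc import ABC, abstractmethod


class SensingInterface(ABC):
    """Transport from sensing device 12 to remote device 18 (USB, Bluetooth, IrDA, ...)."""

    @abstractmethod
    def read_frame(self) -> bytes:
        """Return one raw depth frame from the 3D camera."""


class UsbSensingInterface(SensingInterface):
    def __init__(self, device_path: str):
        self.device_path = device_path

    def read_frame(self) -> bytes:
        return b""  # placeholder: a real implementation would read from the USB device


class RemoteDevice:
    """Loosely corresponds to remote device 18: processes frames, drives display 20."""

    def __init__(self, sensing: SensingInterface):
        self.sensing = sensing

    def step(self) -> None:
        frame = self.sensing.read_frame()
        gesture = self.track_hand(frame)   # body/hand tracking runs in software
        self.update_display(gesture)       # move pointer 24, actuate symbols, etc.

    def track_hand(self, frame: bytes):
        return None                        # tracking algorithm omitted in this sketch

    def update_display(self, gesture) -> None:
        pass                               # export image data over display interface 38


device = RemoteDevice(UsbSensingInterface("/dev/primesense0"))
device.step()
```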
Display screen 20 presents user interface
elements comprising a pointer 24 and a remote information input interface 26,
which comprises symbols 28, 30, 32, 34, 36. A display interface 38 connects
display screen 20 to remote device 18, and may comprise a Bluetooth adapter, an
IrDA device, a cable connection, or any communication interface for outputting
image data that allows remote device 18 to export visual display data, e.g., in
the form of a compressed image. The symbol selection layout provides a
simplified example for purposes of illustration. Symbols 28, 30, and 32 represent numerals, and symbols 34 and 36 represent actions. Each symbol can be remotely
selected or actuated to control the computer application. Remote information
input interface 26 can also comprise a zoom level indicator 40 to provide a visual indication of its zoom level.
Zoom level indicator 40 may be shown as a slider, similar to sliders utilized
in web browsers and other applications. The zoom level is typically allowed to
range within certain limits, e.g., from 50% to 500%.
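The zoom behavior of remote information input interface 26 can be sketched as a simple value clamped to the 50% to 500% limits mentioned above; the class and method names are illustrative:

```python
class ZoomLevel:
    """Zoom level of the remote input interface, clamped to configured limits."""

    MIN_PERCENT = 50
    MAX_PERCENT = 500

    def __init__(self, percent: int = 100):
        self.percent = self._clamp(percent)

    def _clamp(self, percent: float) -> int:
        return int(max(self.MIN_PERCENT, min(self.MAX_PERCENT, percent)))

    def zoom(self, factor: float) -> int:
        """Scale the current zoom level, e.g. in response to a zoom gesture."""
        self.percent = self._clamp(self.percent * factor)
        return self.percent


z = ZoomLevel()         # starts at 100%
print(z.zoom(2.0))      # 200
print(z.zoom(3.0))      # 500 (clamped at the upper limit)
```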
©2015 TechIPm, LLC All Rights Reserved http://www.techipm.com/