| Patent application number | Description | Published |
| --- | --- | --- |
20080309511 | Selectively adjustable icons for assisting users of an electronic device - Systems and methods for providing selectively adjustable icons to assist users of an electronic device are provided. Icons can be selectively adjusted to assist users in connecting and disconnecting accessories to and from the electronic device. | 12-18-2008 |
20090066647 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 03-12-2009 |
20090066648 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 03-12-2009 |
20090088204 | Movement-based interfaces for personal media device - Systems and methods are provided for a media device including one or more movement-based interfaces for interfacing with or controlling the media device. | 04-02-2009 |
20090153389 | Scroll bar with video region in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090153475 | Use of a remote controller Z-direction input mechanism in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090153478 | Centering a 3D remote controller in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090158203 | Scrolling displayed objects using a 3D remote controller in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090158222 | Interactive and dynamic screen saver for use in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-18-2009 |
20090284532 | CURSOR MOTION BLURRING - An electronic device for displaying a cursor with a trail is provided. The user may control electronic device operations by navigating the cursor on a display. To assist the user in identifying the current location of the cursor, the electronic device may define and display a trail indicating the prior positions of the cursor. For example, the electronic device may identify previous cursor positions and draw a curve, for example a spline, connecting the previous cursor positions and the current cursor position. The curve may have a varying width, thus forming a trail for which the wider portion is adjacent the cursor, and for which the narrower portion is adjacent the tip of the curve. The electronic device may instead or in addition modify the opacity of the curve, for example based on the instantaneous speed of the cursor. In some embodiments, other trail characteristics (e.g., size, color, opacity, path) may be modified based on prior cursor movements or cursor speed. | 11-19-2009 |
20090313584 | SYSTEMS AND METHODS FOR ADJUSTING A DISPLAY BASED ON THE USER'S POSITION - An electronic device for providing a display that changes based on the user's perspective is provided. The electronic device may include a sensing mechanism operative to detect the user's position relative to a display of the electronic device. For example, the electronic device may include a camera operative to detect the position of the user's head. Using the detected position, the electronic device may be operative to transform displayed objects such that the displayed perspective reflects the detected position of the user. The electronic device may use any suitable approach for modifying a displayed object, including for example a parallax transform or a perspective transform. In some embodiments, the electronic device may overlay the environment detected by the sensing mechanism (e.g., by a camera) to provide a more realistic experience for the user (e.g., display a reflection of the image detected by the camera on reflective surfaces of a displayed object). | 12-17-2009 |
20090322676 | GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application. | 12-31-2009 |
20110163944 | INTUITIVE, GESTURE-BASED COMMUNICATIONS WITH PHYSICS METAPHORS - A user can make an intuitive, physical gesture with a first device, which can be detected by one or more onboard motion sensors. The detected motion triggers an animation having a “physics metaphor,” where the object appears to react to forces in a real-world, physical environment. The first device detects the presence of a second device and a communication link is established allowing a transfer of data represented by the object to the second device. During the transfer, the first device can animate the object to simulate the object leaving the first device and the second device can animate the object to simulate the object entering the second device. In some implementations, in response to an intuitive gesture made on a touch sensitive surface of a first device or by physically moving the device, an object can be transferred or broadcast to other devices or a network resource based on a direction, velocity or speed of the gesture. | 07-07-2011 |
20110164029 | Working with 3D Objects - Three-dimensional objects can be generated based on two-dimensional objects. A first user input identifying a 2D object presented in a user interface can be detected, and a second user input including a 3D gesture input that includes a movement in proximity to a surface can be detected. A 3D object can be generated based on the 2D object according to the first and second user inputs, and the 3D object can be presented in the user interface. | 07-07-2011 |
20110164163 | SYNCHRONIZED, INTERACTIVE AUGMENTED REALITY DISPLAYS FOR MULTIFUNCTION DEVICES - A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link. | 07-07-2011 |
20110175821 | Virtual Drafting Tools - Techniques for using virtual tools are disclosed. In one aspect, a user interface is presented. A first touch input including touch inputs at two or more locations is received, and a virtual tool corresponding to the relative positions of the two or more locations is identified. A second touch input interacting with the virtual tool is received, and a graphical object corresponding to the identified virtual tool and the second touch input is presented. In another aspect, an input activating a drafting mode of a device is received, and a drafting user interface is presented. A second touch input including touch inputs at two or more locations is received, and a third touch input is received. A graphical object corresponding to the third touch input and a virtual drafting tool corresponding to the second touch input is generated and presented. | 07-21-2011 |
20110179368 | 3D View Of File Structure - A file structure or data hierarchy can be navigated using 3D gesture inputs. For example, objects can be arranged in a plurality of layers. A user input, including a 3D gesture input having a movement in proximity to a display surface can be detected. Different layers can be navigated in response to a movement component that is perpendicular to the display surface. | 07-21-2011 |
20110193788 | GRAPHICAL OBJECTS THAT RESPOND TO TOUCH OR MOTION INPUT - A first graphical object on a user interface of a device can be transformed to a second graphical object on the user interface. The second graphical object can be manipulated by a user on the user interface using touch input or by physically moving the device. When manipulated, the object can be animated to appear to have mass that responds to real-world, physical forces, such as gravity, friction or drag. The data represented by the second graphical object can be compressed or archived using a gesture applied to the second graphical object. Graphical objects can be visually sorted on the user interface based on their mass (size). The visual appearance of graphical objects on the user interface can be adjusted to indicate the age of data represented by the graphical objects. | 08-11-2011 |
20110197153 | Touch Inputs Interacting With User Interface Items - Techniques for managing user interactions with items on a user interface are disclosed. In one aspect, a representation of an opening is presented in response to touch input. A display object is moved over the opening, and the display object is processed in response to the moving. In another aspect, touch input pinching two opposite corners of a display object followed by touch input flicking the display object is received and the display object is deleted in response to the inputs. In another aspect, touch input centered over a display object is received and the display object is deleted in response to the input. In another aspect, touch inputs corresponding to swiping gestures are received and a display object is securely deleted in response to the gestures. | 08-11-2011 |
20120262379 | GESTURE VISUALIZATION AND SHARING BETWEEN ELECTRONIC DEVICES AND REMOTE DISPLAYS - The disclosed embodiments provide a system that facilitates interaction between an electronic device and a remote display. The system includes a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display. The encoding apparatus obtains graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen. Next, the encoding apparatus encodes the graphical output, and the first application transmits the graphical output and the first set of touch inputs to the remote display. Upon receiving the graphical output and the first set of touch inputs at the remote display, the decoding apparatus decodes the graphical output. The second application then uses the graphical output and a visual representation of the first set of touch inputs to drive the remote display. | 10-18-2012 |
20120268410 | Working with 3D Objects - Three-dimensional objects can be generated based on two-dimensional objects. A first user input identifying a 2D object presented in a user interface can be detected, and a second user input including a 3D gesture input that includes a movement in proximity to a surface can be detected. A 3D object can be generated based on the 2D object according to the first and second user inputs, and the 3D object can be presented in the user interface. | 10-25-2012 |
20130073095 | Protective Mechanism for an Electronic Device - An electronic device includes a processor, a sensor in communication with the processor, and a protective mechanism. The protective mechanism is in communication with the processor and is configured to selectively alter a center of mass of the electronic device. Additionally, the electronic device also includes an enclosure configured to at least partially enclose the processor and the sensor. | 03-21-2013 |
20130082974 | Quick Access User Interface - Providing quick access to certain applications on a computing device is disclosed. A security wall is enforced with respect to applications on the computer, wherein enforcing the security wall includes preventing access to the applications until a security input is received. A predefined input is received through a home button on a touch-sensitive display of the computer. Access is provided to a particular application in response to receiving the predefined input, wherein providing access to the particular application includes allowing a user to access the particular application without receiving the security input from the user. | 04-04-2013 |
20130083240 | Method for Synchronizing Content Displayed on Multiple Devices in an Array - A method comprising providing multiple video units in an array, playing video content on the video units in a synchronized manner, and detecting when one of the video units is removed from the array. In response to detecting removal of the video unit, the video content played on the video units remaining in the array in a synchronized manner is adjusted. | 04-04-2013 |
20130135288 | Using a Three-Dimensional Model to Render a Cursor - In some implementations, a cursor can be rendered based on a three-dimensional model. In some implementations, the three-dimensional cursor can be manipulated to change the orientation of the three-dimensional cursor based on the context of the cursor. In some implementations, parameters associated with the three-dimensional model can be manipulated based on the context of the three-dimensional cursor to change the appearance of the cursor. | 05-30-2013 |
20130135309 | Dynamic Graphical Interface Shadows - Dynamic window and cursor shadows are described. In some implementations, graphical user interface display objects can be configured with elevation offset information to give the display objects a three-dimensional surface that can have pixels of varying height. In some implementations, shadows that are rendered upon display objects configured with pixel elevation offset information can be adjusted to reflect the three-dimensional surface of the objects thereby better approximating real-life shadows. In some implementations, shadows can be dynamically rendered in real-time and adjusted according to the elevations of display objects onto which they are cast. | 05-30-2013 |
20130151967 | SCROLL BAR WITH VIDEO REGION IN A MEDIA SYSTEM - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video. | 06-13-2013 |
20130155307 | SYNCHRONIZED, INTERACTIVE AUGMENTED REALITY DISPLAYS FOR MULTIFUNCTION DEVICES - A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link. | 06-20-2013 |
20140108998 | MULTIMEDIA CONTROL CENTER - Techniques and systems for centralized access to multimedia content stored on or available to a computing device are disclosed. The centralized access can be provided by a media control interface that receives user inputs and interacts with media programs resident on the computing device to produce graphical user interfaces that can be presented on a display device. | 04-17-2014 |
20140111547 | SYNCHRONIZED, INTERACTIVE AUGMENTED REALITY DISPLAYS FOR MULTIFUNCTION DEVICES - A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link. | 04-24-2014 |
20140176811 | Adaptive Media Content Scrubbing on a Remote Device - Systems and techniques are disclosed for controlling, from a mobile device, the delivery of media content stored on the mobile device to a media client for presentation on a display device. Data can be provided from the mobile device to the media client for identifying the location of the media content and a playback time. Based on the data, the media client can obtain a portion of the media content associated with the playback time. Also, playback of the media content on the display device can be controlled by a user of the mobile device. | 06-26-2014 |
20140325458 | TOUCH INPUTS INTERACTING WITH USER INTERFACE ITEMS - Techniques for managing user interactions with items on a user interface are disclosed. In one aspect, a representation of an opening is presented in response to touch input. A display object is moved over the opening, and the display object is processed in response to the moving. In another aspect, touch input pinching two opposite corners of a display object followed by touch input flicking the display object is received and the display object is deleted in response to the inputs. In another aspect, touch input centered over a display object is received and the display object is deleted in response to the input. In another aspect, touch inputs corresponding to swiping gestures are received and a display object is securely deleted in response to the gestures. | 10-30-2014 |
20140351726 | GRAPHICAL OBJECTS THAT RESPOND TO TOUCH OR MOTION INPUT - A first graphical object on a user interface of a device can be transformed to a second graphical object on the user interface. The second graphical object can be manipulated by a user on the user interface using touch input or by physically moving the device. When manipulated, the object can be animated to appear to have mass that responds to real-world, physical forces, such as gravity, friction or drag. The data represented by the second graphical object can be compressed or archived using a gesture applied to the second graphical object. Graphical objects can be visually sorted on the user interface based on their mass (size). The visual appearance of graphical objects on the user interface can be adjusted to indicate the age of data represented by the graphical objects. | 11-27-2014 |
20140365909 | SELECTIVE ROTATION OF A USER INTERFACE - This is directed to rotating an entire user interface of a portable electronic device. In particular, this is directed to defining a UI orientation mode in which a user can direct the device to rotate a UI. When the UI orientation mode is enabled, the electronic device can detect particular inputs, for example based on the outputs of motion sensing components such as an accelerometer and a magnetometer, to determine how to rotate the UI. Once the UI has been rotated to a desired orientation, a user can lock the UI orientation and exit the UI orientation mode. | 12-11-2014 |
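The cursor-trail application above (20090284532) describes keeping a short history of prior cursor positions and drawing a connecting curve whose width and opacity taper away from the current cursor. A minimal sketch of that bookkeeping is shown below; it is not the patented implementation, and the class name, parameters, and linear taper are all illustrative assumptions.

```python
from collections import deque


class CursorTrail:
    """Illustrative sketch: track the last few cursor positions and
    derive per-segment width and opacity so the trail is widest and
    most opaque next to the cursor, tapering toward the tail.
    (Names and the linear taper are assumptions, not the patent's.)"""

    def __init__(self, max_points=8, max_width=6.0):
        self.max_width = max_width
        # deque(maxlen=...) automatically discards the oldest point.
        self.points = deque(maxlen=max_points)

    def record(self, x, y):
        """Call once per cursor-move event."""
        self.points.append((x, y))

    def segments(self):
        """Yield (start, end, width, opacity) for each trail segment.
        The renderer would draw these as a tapering polyline; the
        patent abstract mentions a spline, which a renderer could
        substitute for the straight segments used here."""
        pts = list(self.points)
        n = len(pts) - 1
        for i in range(n):
            t = (i + 1) / n  # ~0 at the tail, 1.0 at the cursor
            yield pts[i], pts[i + 1], self.max_width * t, t
```

A drawing loop would call `record()` on each mouse-move event and redraw the segments each frame; the abstract also suggests scaling opacity by instantaneous cursor speed, which could be added by storing a timestamp with each point.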