Patent application number | Description | Published |
20090058823 | Virtual Keyboards in Multi-Language Environment - The disclosed implementations include displays of accented or related characters for characters selected by a user through a virtual keyboard operating in a multi-language environment. In one aspect, when a user clicks and holds down a key, a popup displays accented characters for the character associated with the key. In another aspect, the order of accented characters can be based on the frequency of occurrence of each accented character in the current language being typed by the user. In another aspect, when a character is at an edge of the display, the popup is displayed in a different location and the ordering of the accents in the display is set so that the more frequently occurring accents are more quickly accessible. In another aspect, auto-correction is used to correct accented equivalents for compounds. In another aspect, a different virtual keyboard layout is provided for different languages. | 03-05-2009 |
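The frequency-ordered accent popup described in 20090058823 can be sketched roughly as below. The accent variants, the per-language frequency table, and the function name are all illustrative assumptions, not details taken from the application.

```python
# Hypothetical sketch of a frequency-ordered accent popup.
# Variant sets and frequency values are invented for illustration.

ACCENT_VARIANTS = {"e": ["é", "è", "ê", "ë", "ē"]}

# Assumed relative frequency of each accented form per language.
FREQUENCY = {
    "fr": {"é": 0.50, "è": 0.25, "ê": 0.15, "ë": 0.05, "ē": 0.0},
    "de": {"é": 0.10, "è": 0.05, "ê": 0.02, "ë": 0.01, "ē": 0.0},
}

def popup_order(base_char: str, language: str) -> list[str]:
    """Return the accent variants for base_char, most frequent first,
    so the most common accents are the most quickly accessible."""
    variants = ACCENT_VARIANTS.get(base_char, [])
    freq = FREQUENCY.get(language, {})
    return sorted(variants, key=lambda c: freq.get(c, 0.0), reverse=True)
```

For an unknown language the sort keys are all equal, so Python's stable sort preserves the default variant order.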
20090077464 | INPUT METHODS FOR DEVICE HAVING MULTI-LANGUAGE ENVIRONMENT - Text input is corrected on a touch-sensitive display by presenting a list of candidate words in the interface which can be selected by touch input. The candidate list can include candidate words having two or more character types (e.g., Roman, kana, kanji). In one aspect, the candidate list can be scrolled using a finger gesture. When a user's finger traverses a candidate word and the touch is released, the candidate word is inserted into a document being edited. In another aspect, characters can be erased by touching a key (e.g., a backspace or delete key) and making a sliding, swiping, or other finger gesture. A number of characters proportional to a distance (e.g., a linear distance) of the finger gesture across the display are erased. If there are characters in a text input area, those characters are erased first, followed by characters in the document being edited. | 03-19-2009 |
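The distance-proportional erase gesture in 20090077464 (erase from the text input area first, then from the document) can be sketched as follows; the pixels-per-character ratio and function names are assumptions for illustration only.

```python
def chars_to_erase(gesture_distance_px: float, px_per_char: float = 12.0) -> int:
    """Number of characters to erase, proportional to the linear
    distance of the finger gesture across the display (assumed ratio)."""
    return max(0, int(gesture_distance_px / px_per_char))

def apply_erase(input_area: str, document: str, distance_px: float):
    """Erase characters from the input area first, then from the
    document being edited, as the abstract describes."""
    n = chars_to_erase(distance_px)
    erased_from_input = min(n, len(input_area))
    remaining = n - erased_from_input
    new_input = input_area[: len(input_area) - erased_from_input]
    new_doc = document[: len(document) - remaining] if remaining else document
    return new_input, new_doc
```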
20090265669 | LANGUAGE INPUT INTERFACE ON A DEVICE - Methods, systems, devices, and apparatus, including computer program products, for inputting text. A user interface element is presented on a touch-sensitive display of a device. The user interface element is associated with a plurality of characters, at least a subset of which is associated with respective gestures. A user input performing a gesture with respect to the user interface element is received. The character from the subset that is associated with the gesture performed with respect to the user interface element is inputted. | 10-22-2009 |
20130311997 | Systems and Methods for Integrating Third Party Services with a Digital Assistant - The electronic device with one or more processors and memory receives an input of a user. The electronic device, in accordance with the input, identifies a respective task type from a plurality of predefined task types associated with a plurality of third party service providers. The respective task type is associated with at least one third party service provider for which the user is authorized and at least one third party service provider for which the user is not authorized. In response to identifying the respective task type, the electronic device sends a request to perform at least a portion of a task to a third party service provider of the plurality of third party service providers that is associated with the respective task type. | 11-21-2013 |
20140028600 | Touch Screen Device, Method, and Graphical User Interface for Inserting a Character from an Alternate Keyboard - A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises displaying a first soft keyboard. While displaying the first soft keyboard, a key for selecting a second soft keyboard different from the first soft keyboard is displayed. A first contact is detected on the key for selecting the second soft keyboard. In response to detecting the first contact, the second soft keyboard is displayed. Movement of the first contact is detected to a character-insertion key in the second soft keyboard. Lift off of the first contact is detected at the character-insertion key in the second soft keyboard to which the first contact moved. In response to detecting the lift off, a character is inserted that corresponds to the character-insertion key in the second soft keyboard to which the first contact moved and the display of the second soft keyboard is ceased. | 01-30-2014 |
20140125609 | Portable Multifunction Device, Method, and Graphical User Interface for Adjusting an Insertion Point Marker - In accordance with some embodiments, a computer-implemented method is performed at a portable electronic device with a touch screen display. The method includes: displaying graphics and an insertion marker at a first location in the graphics on the touch screen display; detecting a finger contact with the touch screen display; and in response to the detected finger contact, expanding the insertion marker from a first size to a second size on the touch screen display and expanding a portion of the graphics on the touch screen display from an original size to an expanded size. The method further includes detecting movement of the finger contact on the touch screen display and moving the expanded insertion marker in accordance with the detected movement of the finger contact from the first location to a second location in the graphics. | 05-08-2014 |
20140327629 | Touch Screen Device, Method, and Graphical User Interface for Customizing Display of Content Category Icons - A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the one or more finger contacts to determine a command for the device, and processing the command. The one or more heuristics comprise: a heuristic for determining that the one or more finger contacts correspond to a one-dimensional vertical screen scrolling command, a heuristic for determining that the one or more finger contacts correspond to a two-dimensional screen translation command, and a heuristic for determining that the one or more finger contacts correspond to a command to transition from displaying a respective item in a set of items to displaying a next item in the set of items. | 11-06-2014 |
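One of the heuristics named in 20140327629 — distinguishing a one-dimensional vertical scroll from a two-dimensional translation — can be sketched with a simple angle test. The tolerance angle and function name are hypothetical, not values from the application.

```python
import math

def classify_drag(dx: float, dy: float,
                  vertical_tolerance_deg: float = 27.0) -> str:
    """Heuristic: a drag within a tolerance angle of vertical is a
    one-dimensional vertical scroll; any other drag is a
    two-dimensional screen translation. Tolerance is an assumption."""
    if dx == 0 and dy == 0:
        return "none"
    # Angle of the drag vector measured from the vertical axis.
    angle_from_vertical = math.degrees(math.atan2(abs(dx), abs(dy)))
    if angle_from_vertical <= vertical_tolerance_deg:
        return "vertical_scroll"
    return "2d_translate"
```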
20150149955 | PORTABLE MULTIFUNCTION DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR ADJUSTING AN INSERTION POINT MARKER - In accordance with some embodiments, a computer-implemented method is performed at a portable electronic device with a touch screen display. The method includes: displaying a window on the touch-sensitive display; detecting a single-finger gesture on the touch-sensitive display; in response to the detected single-finger gesture on the touch-sensitive display, displaying a magnifying screen that includes an expanded view of a first portion of the window that comprises at least a part of the text and an insertion marker; detecting movement of a single finger on the touch-sensitive display while the magnifying screen is displayed; and in response to detecting the movement of the single finger on the touch-sensitive display, replacing the display of the expanded view of the first portion of the window with an expanded view of a second portion of the window in the magnifying screen. | 05-28-2015 |
20150317078 | METHOD, DEVICE, AND GRAPHICAL USER INTERFACE PROVIDING WORD RECOMMENDATIONS FOR TEXT INPUT - A portable electronic device having a touch screen display displays a soft keyboard with a plurality of key icons representing a plurality of letters. The device detects a finger contact on a respective key icon representing a respective letter, and in response to detecting the finger contact on the respective key icon representing the respective letter, displays an enlarged view of the respective letter. The device then detects a liftoff of the finger contact from the respective key icon, and in response to detecting the liftoff, inputs the respective letter in a text input area, the text input area being displayed adjacent to the soft keyboard, and ceases to display the enlarged view of the respective letter. | 11-05-2015 |
20160034133 | CONTEXT-SPECIFIC USER INTERFACES - Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein. | 02-04-2016 |
20160034148 | CONTEXT-SPECIFIC USER INTERFACES - Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein. | 02-04-2016 |
20160034152 | CONTEXT-SPECIFIC USER INTERFACES - Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein. | 02-04-2016 |
20160034166 | CONTEXT-SPECIFIC USER INTERFACES - Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein. | 02-04-2016 |
20160034167 | CONTEXT-SPECIFIC USER INTERFACES - Context-specific user interfaces for use with a portable multifunction device are disclosed. The methods described herein for context-specific user interfaces provide indications of time and, optionally, a variety of additional information. Further disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform the methods described herein. | 02-04-2016 |
20160062598 | MULTI-DIMENSIONAL OBJECT REARRANGEMENT - A method includes displaying, on a touch-sensitive display, a plurality of application icons in a first configuration at locations on a hexagonal grid in relation to an origin. The application icons have corresponding ranks based on their respective locations in relation to the origin. In response to detecting a movement of a user contact from a first position to a second position: the display of a first application icon is translated from a first location to a second location; a second configuration of the application icons is determined based on the first location and the second location; and the display of the application icons is transitioned from the first configuration to the second configuration. In the second configuration, no application icon except the first application icon is displaced by more than one location relative to the first configuration. | 03-03-2016 |
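The hexagonal-grid ranking and the one-location displacement constraint in 20160062598 can be sketched using axial hex coordinates. The coordinate scheme and all names below are illustrative assumptions, not details from the application.

```python
def hex_distance(a, b):
    """Grid distance between two hexes in axial coordinates (q, r)."""
    aq, ar = a
    bq, br = b
    return (abs(aq - bq) + abs(ar - br) + abs((aq + ar) - (bq + br))) // 2

def rank(location):
    """An icon's rank, taken here to be its hex distance from the origin cell."""
    return hex_distance(location, (0, 0))

def valid_reconfiguration(old, new, moved_icon):
    """Check the stated constraint: apart from the dragged icon, no icon
    moves by more than one grid location between configurations.
    old/new map icon ids to axial coordinates."""
    return all(
        hex_distance(old[icon], new[icon]) <= 1
        for icon in old
        if icon != moved_icon
    )
```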
20110179380 | Event Recognition - An electronic device executes one or more software elements. Each software element is associated with a particular view, which includes one or more event recognizers. Each event recognizer has one or more event definitions based on one or more sub-events, and an event handler. The event handler is configured to send an action to a target in response to the event recognizer detecting an event corresponding to a particular event definition. The electronic device detects a sequence of sub-events, and identifies actively involved views. The electronic device delivers a respective sub-event to event recognizers for actively involved views. At least one event recognizer for actively involved views has a plurality of event definitions, one of which is selected in accordance with an internal state of the electronic device. The at least one event recognizer processes the respective sub-event in accordance with the selected event definition. | 07-21-2011 |
20110179386 | Event Recognition - A method includes displaying one or more views of a view hierarchy, and executing software elements associated with a particular view. Each particular view includes event recognizers. Each event recognizer has one or more event definitions, and an event handler that specifies an action for a target and is configured to send the action to the target in response to event recognition. The method includes detecting a sequence of sub-events, and identifying one of the views of the view hierarchy as a hit view that establishes which views are actively involved views. The method includes delivering a respective sub-event to event recognizers for each actively involved view. A respective event recognizer has event definitions, and one of the event definitions is selected based on the internal state. The respective event recognizer processes the respective sub-event prior to processing a next sub-event in the sequence of sub-events. | 07-21-2011 |
20110179387 | Event Recognition - While displaying one or more views of a first software application, an electronic device detects a sequence of touch inputs. The electronic device, in accordance with a determination that at least one gesture recognizer in the first software application recognizes a first portion of the sequence, delivers the sequence to the first software application without delivering the sequence to a second software application, and in accordance with a determination that a first gesture recognizer in the first software application recognizes the sequence, processes the sequence with the first gesture recognizer. The electronic device, in accordance with a determination that no gesture recognizer in the first software application recognizes the first portion, delivers the sequence to the second software application, and in accordance with a determination that a second gesture recognizer in the second software application recognizes the sequence, processes the sequence with the second gesture recognizer. | 07-21-2011 |
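The event-recognizer structure described in the three Event Recognition abstracts above (an event recognizer holding multiple event definitions, one of which is selected according to the device's internal state, with a handler fired when a sub-event sequence matches) can be sketched as below. The class and method names, and the string-based sub-events, are hypothetical simplifications.

```python
class EventRecognizer:
    """Minimal sketch: multiple event definitions keyed by device state;
    sub-events are matched against the selected definition, and the
    handler fires when the definition completes."""

    def __init__(self, definitions, handler):
        self.definitions = definitions  # {state: [sub-event, ...]}
        self.handler = handler
        self.active = None
        self.progress = 0

    def select_definition(self, device_state):
        """Pick the event definition matching the device's internal state."""
        self.active = self.definitions[device_state]
        self.progress = 0

    def deliver(self, sub_event) -> bool:
        """Feed one sub-event; return True when the event is recognized."""
        if self.active is None:
            return False
        if sub_event == self.active[self.progress]:
            self.progress += 1
            if self.progress == len(self.active):
                self.handler()  # send the action to the target
                self.progress = 0
                return True
        else:
            self.progress = 0  # sequence broken, start over
        return False
```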
20120159380 | Device, Method, and Graphical User Interface for Navigation of Concurrently Open Software Applications - An electronic device includes a touch-sensitive display and one or more programs stored in memory for execution by one or more processors. The one or more programs include instructions for displaying a first application view that corresponds to a first application in a plurality of concurrently open applications. The one or more programs include instructions for detecting a first input, and in response, concurrently displaying a group of open application icons that correspond to at least some of the plurality of concurrently open applications with at least a portion of the first application view. The open application icons are displayed in accordance with a predetermined sequence of the open applications. The one or more programs include instructions for detecting a first gesture distinct from the first input, and in response, displaying a second application view that corresponds to a second application adjacent to the first application in the predetermined sequence. | 06-21-2012 |
20130321466 | Determining to Display Designations of Points of Interest Within a Map View - Methods and apparatus for a map tool for determining which points of interest in a map region for which to display designations or labels in a map view such that a displayed designation does not disappear and reappear as a user zooms in or out of a map view or as a user pans across a map region. Also disclosed are methods and apparatus for a ranking tool that uses a hierarchy of categories in order to classify points of interest, where for each given hierarchical category, the points of interest within the given hierarchical category are further ranked according to ranking data for each given point of interest and also ranked according to the quantity of the ranking data for the given point of interest. | 12-05-2013 |
20140033131 | Event Recognition - While displaying one or more views of a first software application, an electronic device detects a sequence of touch inputs. The electronic device, in accordance with a determination that no gesture recognizer of the first software application recognizes a portion of the sequence of touch inputs, delivers the sequence of touch inputs to a second software application, and in accordance with a determination that at least one gesture recognizer in the second software application recognizes the sequence of touch inputs, processes the sequence of touch inputs with the at least one gesture recognizer in the second software application that recognizes the sequence of touch inputs. | 01-30-2014 |
20140089839 | Apparatus and Method for Conditionally Enabling or Disabling Soft Buttons - A method of operating a multifunction device includes displaying a soft keyboard having a plurality of buttons including one or more unconditionally enabled buttons and one or more conditionally enabled buttons. The method further includes detecting a first input with a first button at a first time; detecting a second input with a second button at a second time after the first time, where the second button is a conditionally enabled button; and in response to the detecting the second input with the second button at the second time: in accordance with a determination that the period of time between the first time and the second time is above a predefined threshold, activating the second button; and in accordance with a determination that the period of time between the first time and the second time is below the predefined threshold, preventing the second button from being activated. | 03-27-2014 |
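The time-threshold gate on conditionally enabled buttons in 20140089839 amounts to a debounce-style check; a minimal sketch follows, with the threshold value and function name as assumptions.

```python
def should_activate(first_time: float, second_time: float,
                    conditionally_enabled: bool,
                    threshold_s: float = 0.3) -> bool:
    """A conditionally enabled button activates only if the time since
    the previous keypress is above the predefined threshold;
    unconditionally enabled buttons always activate."""
    if not conditionally_enabled:
        return True
    return (second_time - first_time) > threshold_s
```

This kind of gate is commonly used to suppress accidental presses of destructive or mode-changing keys that land too soon after ordinary typing.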
20140173494 | Input Methods for Device Having Multi-Language Environment - Text input is corrected on a touch-sensitive display by presenting a list of candidate words in the interface which can be selected by touch input. The candidate list can include candidate words having two or more character types (e.g., Roman, kana, kanji). In one aspect, the candidate list can be scrolled using a finger gesture. When a user's finger traverses a candidate word and the touch is released, the candidate word is inserted into a document being edited. In another aspect, characters can be erased by touching a key (e.g., a backspace or delete key) and making a sliding, swiping, or other finger gesture. A number of characters proportional to a distance (e.g., a linear distance) of the finger gesture across the display are erased. If there are characters in a text input area, those characters are erased first, followed by characters in the document being edited. | 06-19-2014 |
20140267362 | Device, Method, and Graphical User Interface for Adjusting the Appearance of a Control - An electronic device with a display displays a user interface on the display. The device determines a first set of content-display values for one or more content-display properties of first content that corresponds to a respective region of the display. The device determines a first set of control-appearance values for one or more control-appearance parameters based on the first set of content-display values. The device displays a control in the respective region of the display, wherein an appearance of the control is determined based on the first content and the first set of control-appearance values. | 09-18-2014 |
20140267363 | Device, Method, and Graphical User Interface for Adjusting the Appearance of a Control - An electronic device with a display displays a user interface on the display. The device determines a first set of content-display values for one or more content-display properties of first content that corresponds to a respective region of the display. The device determines a first set of control-appearance values for one or more control-appearance parameters based on the first set of content-display values. The device displays a control in the respective region of the display, where an appearance of the control is determined based on the first content and the first set of control-appearance values, and displaying the control includes applying a blur operation to the first content to generate first blurred content and overlaying a translucent colored layer over the first blurred content. | 09-18-2014 |
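The content-driven control appearance in the two abstracts above (control-appearance values derived from content-display values, with a blur plus a translucent colored overlay) can be sketched as a mapping from content brightness and saturation to overlay parameters. All parameter names and numeric values here are invented for illustration.

```python
def control_appearance(content_brightness: float, content_saturation: float):
    """Derive hypothetical control-appearance values from content-display
    values: dark content gets a light translucent control and vice versa;
    more saturated content gets a more opaque overlay."""
    light_control = content_brightness < 0.5
    return {
        "overlay_white_level": 0.9 if light_control else 0.1,
        "overlay_opacity": 0.6 + 0.2 * content_saturation,
        "blur_radius_px": 20,  # fixed blur applied before the overlay
    }
```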