Hildreth
Adam Hildreth, Leeds GB
Patent application number | Description | Published |
---|---|---|
20100174813 | METHOD AND APPARATUS FOR THE MONITORING OF RELATIONSHIPS BETWEEN TWO PARTIES - A computer implemented method and data processing device for assessing electronically mediated communications is described. A plurality of messages sent by a first party are captured. The content of the messages is processed to determine a quantitative metric reflecting a first property. The behavior over time of the quantitative metric is analyzed to assess the nature of a relationship involving the first party. | 07-08-2010 |
20110087485 | NET MODERATOR - A method and an apparatus for moderating an inappropriate relationship between two parties by analyzing a dialog between them. The method and apparatus create an alert depending upon the nature of the dialog and send the alert to a third party who can moderate the relationship between the two parties. The third party can ban or block the dialog based upon the inappropriate relationship; banning or blocking the dialog can also be automated. | 04-14-2011 |
20120323565 | METHOD AND APPARATUS FOR ANALYZING TEXT - An apparatus, a method, an application programming interface and a computer program product for analyzing text. The text is transmitted between users of a text-based network mediated system. The text is analyzed by intended word filter rule processing elements to determine a presence of a variation word of an intended word in the text. A method for creating the intended word filter rule processing elements is also disclosed. | 12-20-2012 |
Evan Hildreth, Thornhill CA
Patent application number | Description | Published |
---|---|---|
20140040835 | ENHANCED INPUT USING RECOGNIZED GESTURES - A representation of a user can move with respect to a graphical user interface based on user input. The graphical user interface comprises a central region and interaction elements disposed outside of the central region. The interaction elements are not shown until the representation of the user is aligned with the central region. A gesture of the user is recognized, and, based on the recognized gesture, the display of the graphical user interface is altered and an application control is output. | 02-06-2014 |
20140092013 | VIDEO-BASED IMAGE CONTROL SYSTEM - A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application. | 04-03-2014 |
20140118249 | ENHANCED CAMERA-BASED INPUT - Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object. | 05-01-2014 |
20140331181 | ITEM SELECTION USING ENHANCED CONTROL - An enhanced control, in which a guide line is defined relative to an object in a user interface, and items aligned with the guide line are displayed without obscuring the object. A selected item is output based on receiving a selection of one of the displayed items. | 11-06-2014 |
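The guide-line selection in application 20140331181 amounts to mapping a tracked position onto a slot along the line. A minimal sketch, assuming normalized screen coordinates; the class and function names are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class GuideLine:
    start: float  # guide-line endpoints in normalized screen coordinates
    end: float

def select_item(hand_pos: float, guide: GuideLine, items: list[str]) -> str:
    """Return the item whose slot along the guide line is nearest hand_pos."""
    # Clamp the hand position onto the guide line, then map it to an index.
    t = (min(max(hand_pos, guide.start), guide.end) - guide.start) / (guide.end - guide.start)
    index = min(int(t * len(items)), len(items) - 1)
    return items[index]
```

Clamping keeps a hand that drifts past either endpoint selecting the nearest end item rather than raising an out-of-range error.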
Evan Hildreth, Markham CA
Patent application number | Description | Published |
---|---|---|
20130329966 | MEDIA PREFERENCES - An electronic media device may be controlled based on personalized media preferences of users experiencing content using the electronic media device. Users experiencing content using the electronic media device may be automatically identified and the electronic media device may be automatically controlled based on media preferences associated with the identified users. | 12-12-2013 |
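One plausible reading of the control logic in application 20130329966 is intersecting the preferences of everyone identified in front of the device. A sketch under that assumption; the user names, channel sets, and function name are invented:

```python
def control_device(identified_users, preferences):
    """Pick content acceptable to every identified user.

    preferences maps a user name to the set of channels they enjoy;
    returns the channels all identified users have in common.
    """
    common = None
    for user in identified_users:
        prefs = preferences.get(user, set())
        common = prefs if common is None else common & prefs
    return common or set()
```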
Evan R. Hildreth, Thornhill CA
Patent application number | Description | Published |
---|---|---|
20130271360 | INTERACTING WITH A DEVICE USING GESTURES - Systems, methods, apparatuses, and computer-readable media are provided for engaging and re-engaging a gesture mode. In one embodiment, a method performed by a computer system detects an initial presence of a user pose, indicates to a user progress toward achieving a predetermined state while continuing to detect the user pose, determines that the detection of the user pose has reached the predetermined state, and responds to the detection of the user pose based on determining that the detection has reached the predetermined state. The computer system may further prompt the user by displaying a representation of the user pose corresponding to an option for a user decision, detecting the user decision based at least in part on determining that the detection of the user pose has reached the predetermined state, and responding to the user decision. | 10-17-2013 |
20130271397 | RAPID GESTURE RE-ENGAGEMENT - Systems, methods, apparatuses, and computer-readable media are provided for use with a system configured to detect gestures. In one embodiment, a method includes detecting a first user gesture meeting a first condition to enter a mode of operation. The method may further include exiting the mode of operation. The method may further include detecting a second user gesture meeting a second condition to reenter the mode of operation based on the detecting the first user gesture, wherein the second condition is less stringent than the first condition. | 10-17-2013 |
20140149859 | MULTI DEVICE PAIRING AND SHARING VIA GESTURES - Embodiments of the present disclosure relate to multi device pairing and sharing via non-touch gestures. In an embodiment, a method comprises detecting, by a parent device, an initiating non-touch gesture performed by a user towards the parent device. The method also comprises initiating an action based on the detecting. The method further comprises triggering, by the parent device, a gesture recognition mode on one or more secondary devices based on the detecting, and completing the action upon a positive gesture recognition by the one or more secondary devices. | 05-29-2014 |
20140267790 | ADAPTIVE DATA PATH FOR COMPUTER-VISION APPLICATIONS - Embodiments of the present invention provide an adaptive data path for computer-vision applications. Utilizing techniques provided herein, the data path can adapt to the needs of a computer-vision application to provide the needed data. The data path can be adapted by applying one or more filters to image data from one or more sensors. Some embodiments may utilize a computer-vision processing unit comprising a specialized instruction-based, in-line processor capable of interpreting commands from a computer-vision application. | 09-18-2014 |
20140368422 | SYSTEMS AND METHODS FOR PERFORMING A DEVICE ACTION BASED ON A DETECTED GESTURE - Systems and methods for performing an action based on a detected gesture are provided. The systems and methods provided herein may detect a direction of an initial touchless gesture and process subsequent touchless gestures based on the direction of the initial touchless gesture. The systems and methods may translate a coordinate system related to a user device and a gesture library based on the detected direction such that subsequent touchless gestures may be processed based on the detected direction. The systems and methods may allow a user to make a touchless gesture over a device to interact with the device independent of the orientation of the device since the direction of the initial gesture can set the coordinate system or context for subsequent gesture detection. | 12-18-2014 |
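The coordinate translation in application 20140368422 can be sketched as a rotation that aligns the initial gesture direction with a fixed axis, so subsequent gestures are interpreted in the user's frame regardless of device orientation. Treating this as a plain 2-D rotation, and the function names, are assumptions for illustration:

```python
import math

def rotation_from_initial_gesture(dx: float, dy: float) -> float:
    """Angle (radians) that aligns the initial swipe direction with the +x axis."""
    return -math.atan2(dy, dx)

def translate_point(x: float, y: float, angle: float) -> tuple[float, float]:
    """Rotate a gesture coordinate into the frame set by the initial gesture."""
    return (x * math.cos(angle) - y * math.sin(angle),
            x * math.sin(angle) + y * math.cos(angle))
```

For example, if the initial swipe points "up" in device coordinates, later gestures along that same direction map onto the +x axis of the translated frame.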
Evan R. Hildreth, Thornhill GB
Patent application number | Description | Published |
---|---|---|
20140278441 | SYSTEMS AND METHODS FOR SWITCHING PROCESSING MODES USING GESTURES - Systems and methods for switching between voice dictation modes using a gesture are provided so that an alternate meaning to a dictated word may be applied. The provided systems and methods time stamp detected gestures and detected words from the voice dictation and compare the time stamp at which a gesture is detected to the time stamp at which a word is detected. When it is determined that a time stamp of a gesture approximately matches a time stamp of a word, the word may be processed to have an alternate meaning, such as a command, punctuation, or action. | 09-18-2014 |
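The timestamp comparison in application 20140278441 reduces to checking each dictated word's time stamp against nearby gesture time stamps. A sketch under that reading; the 0.25-second tolerance and the function name are invented:

```python
def apply_alternate_meanings(words, gestures, tolerance=0.25):
    """Flag each (time, word) pair whose time stamp approximately matches
    a detected gesture time stamp; flagged words would be processed with
    an alternate meaning (command, punctuation, or action)."""
    out = []
    for t_word, word in words:
        is_alternate = any(abs(t_word - t_gesture) <= tolerance for t_gesture in gestures)
        out.append((word, is_alternate))
    return out
```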
Evan Robert Hildreth, Thornhill CA
Patent application number | Description | Published |
---|---|---|
20130027503 | ENHANCED INTERFACE FOR VOICE AND VIDEO COMMUNICATIONS - An enhanced interface for voice and video communications, in which a gesture of a user is recognized from a sequence of camera images, and a user interface is provided that includes a control and a representation of the user. The process also includes causing the representation to interact with the control based on the recognized gesture, and controlling a telecommunication session based on the interaction. | 01-31-2013 |
20150062158 | INTEGRATION OF HEAD MOUNTED DISPLAYS WITH PUBLIC DISPLAY DEVICES - Various arrangements for presenting private information are presented. Private information to be displayed via a head mounted display to a user may be identified. A marker displayed by a public display device may also be identified. This public display device may be visible in a vicinity of the user. The private information and an indication of the marker may be output to the head mounted display of the user, such that the private information is displayed by the head mounted display in relation to the marker displayed by the public display device. | 03-05-2015 |
20150062159 | DYNAMIC DISPLAY MARKERS - Various arrangements for defining a marker are presented. A first defined marker presented by a public display device may be determined to be insufficient for use by a head mounted display. The first defined marker may be used as a reference point for positioning information for display by the head mounted display. In response to determining that the first defined marker is insufficient, a second marker displayed by the public display device may be defined. The second marker may have a display characteristic different from the first defined marker. The second defined marker may then be used as the reference point for positioning the information for display by the head mounted display. An indication of the second marker may be transmitted to the head mounted display. | 03-05-2015 |
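The marker fallback in application 20150062159 can be sketched as retrying with a varied display characteristic until the head mounted display can use the marker as a reference point. Treating size as the varied characteristic, and both names below, are purely assumptions:

```python
def redefine_marker(marker: dict, is_trackable) -> dict:
    """If the current marker is insufficient, vary one display characteristic
    (size here, as an illustrative example) until a sufficient marker is found.

    `is_trackable` is a predicate standing in for the sufficiency check."""
    candidate = dict(marker)  # leave the original marker definition intact
    while not is_trackable(candidate):
        candidate["size"] = candidate.get("size", 1) * 2  # illustrative change
    return candidate
```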
Nicholas Leon Hildreth, Auckland NZ
Patent application number | Description | Published |
---|---|---|
20100048113 | SHELLFISH POSITIONING AND OPENING APPARATUS - A shellfish positioning and opening apparatus having at least one processing lane, the apparatus including, in order, an in-feed and singulation station, a reorientation assembly station utilizing a vision system, and a holding and opening assembly station. These processing stations are operatively connected in that order to receive shellfish having meat therein and to position the shellfish according to their shape and orientation, enabling each shellfish to be halved so that one shellfish half has the shellfish meat thereon. A method is also included having the steps of: singulating the shellfish; applying the vision system to determine and compare the orientation of each shellfish; reorienting each shellfish so that it points in the right direction; abutting the shellfish against the vertical alignment device to make it substantially vertical; and loading the shellfish onto the lifting assembly, which lifts it so that the holding and opening assembly can hold it against the hinge breaker and apply a vacuum to create a gape, whereby the knife assembly slidably operates to cut the meat from one shellfish half. | 02-25-2010 |