Patent application number | Description | Published |
20090210627 | METHOD AND SYSTEM FOR HANDLING CACHE COHERENCY FOR SELF-MODIFYING CODE - A method for handling cache coherency includes allocating a tag when a cache line is not exclusive in a data cache for a store operation, and sending the tag and an exclusive fetch for the line to coherency logic. An invalidation request is sent within a minimum amount of time to an I-cache, preferably only if it has fetched to the line and has not been invalidated since, which request includes an address to be invalidated, the tag, and an indicator specifying the line is for a PSC operation. The method further includes comparing the request address against stored addresses of prefetched instructions, and in response to a match, sending a match indicator and the tag to an LSU, within a maximum amount of time. The match indicator is timed, relative to exclusive data return, such that the LSU can discard prefetched instructions following execution of the store operation that stores to a line subject to an exclusive data return, and for which the match is indicated. | 08-20-2009 |
20090210632 | MICROPROCESSOR AND METHOD FOR DEFERRED STORE DATA FORWARDING FOR STORE BACKGROUND DATA IN A SYSTEM WITH NO MEMORY MODEL RESTRICTIONS - A pipelined processor includes circuitry adapted for store forwarding, including: for each store request, and while a write to one of a cache and a memory is pending: obtaining the most recent value for at least one block of data; merging store data from the store request with the block of data, thus updating the block and forming a new most recent value and an updated complete block of data; and buffering the updated block of data into a store data queue; and, for each additional store request that requires at least one updated block of data: determining if store forwarding is appropriate for the additional store request on a block-by-block basis; if store forwarding is appropriate, selecting an appropriate block of data from the store data queue on a block-by-block basis; and forwarding the selected block of data to the additional store request. | 08-20-2009
20090210679 | PROCESSOR AND METHOD FOR STORE DATA FORWARDING IN A SYSTEM WITH NO MEMORY MODEL RESTRICTIONS - A pipelined microprocessor includes circuitry for store forwarding by performing: for each store request, and while a write to one of a cache and a memory is pending: obtaining the most recent value for at least one complete block of data; merging store data from the store request with the complete block of data, thus updating the block and forming a new most recent value and an updated complete block of data; and buffering the updated complete block of data into a store data queue; and, for each load request that may require at least one updated complete block of data: determining if store forwarding is appropriate for the load request on a block-by-block basis; if store forwarding is appropriate, selecting an appropriate block of data from the store data queue on a block-by-block basis; and forwarding the selected block of data to the load request. | 08-20-2009
20090216516 | METHOD, SYSTEM AND COMPUTER PROGRAM PRODUCT FOR GENERATING TRACE DATA - There is provided a method, system and computer program product for generating trace data related to a data processing system event. The method includes: receiving an instruction relating to the system event from a location in the system; determining a minimum number of trace segment records required to record instruction information; and creating a trace segment table including the number of trace segment records, the number of trace segment records including at least one instruction record. | 08-27-2009 |
20090216949 | METHOD AND SYSTEM FOR A MULTI-LEVEL VIRTUAL/REAL CACHE SYSTEM WITH SYNONYM RESOLUTION - Method and system for a multi-level virtual/real cache system with synonym resolution. An exemplary embodiment includes a multi-level cache hierarchy with a set of L1 caches associated with one or more processor cores and a set of L2 caches, wherein the set of L1 caches is a subset of the set of L2 caches, and the L1 caches underneath a given L2 cache are associated with one or more of the processor cores. | 08-27-2009
20090240995 | METHOD AND APPARATUS FOR IMPROVING RANDOM PATTERN TESTING OF LOGIC STRUCTURES - A test method and apparatus for randomly testing logic structures. The method includes identifying and analyzing a functional behavior of a logic structure to be covered during the random testing, modifying the logic structure such that the logic structure behaves in a functional manner during random testing, and generating patterns to exercise the modified logic structure. | 09-24-2009 |
20110320774 | OPERAND FETCHING CONTROL AS A FUNCTION OF BRANCH CONFIDENCE - A system for data operand fetching control includes a computer processor that includes a control unit for determining memory access operations. The control unit is configured to perform a method. The method includes calculating a summation weight value for each instruction in a pipeline, the summation weight value calculated as a function of branch uncertainty and a pendency in which the instruction resides in the pipeline relative to other instructions in the pipeline. The method also includes mapping the summation weight value of a selected instruction that is attempting to access system memory to a memory access control, each memory access control specifying a manner of handling data fetching operations. The method further includes performing a memory access operation for the selected instruction based upon the mapping. | 12-29-2011 |
20130091343 | OPERAND FETCHING CONTROL AS A FUNCTION OF BRANCH CONFIDENCE - Data operand fetching control includes calculating a summation weight value for each instruction in a pipeline, the summation weight value calculated as a function of branch uncertainty and a pendency in which the instruction resides in the pipeline relative to other instructions in the pipeline. The data operand fetching control also includes mapping the summation weight value of a selected instruction that is attempting to access system memory to a memory access control. Each memory access control specifies a manner of handling data fetching operations. The data operand fetching control further includes performing a memory access operation for the selected instruction based upon the mapping. | 04-11-2013 |
20140082252 | Combined Two-Level Cache Directory - Responsive to receiving a logical address for a cache access, a mechanism looks up a first portion of the logical address in a local cache directory for a local cache. The local cache directory returns a set identifier for each set in the local cache directory. Each set identifier indicates a set within a higher level cache directory. The mechanism looks up a second portion of the logical address in the higher level cache directory and compares each absolute address value received from the higher level cache directory to an absolute address received from a translation look-aside buffer to generate a higher level cache hit signal. The mechanism compares the higher level cache hit signal to each set identifier to generate a local cache hit signal and responsive to the local cache hit signal indicating a local cache hit, accesses the local cache based on the local cache hit signal. | 03-20-2014 |
20140108743 | STORE DATA FORWARDING WITH NO MEMORY MODEL RESTRICTIONS - Embodiments relate to loading data in a pipelined microprocessor. An aspect includes issuing a load request that comprises a load address requiring at least one block of data the same size as the largest contiguous granularity of data returned from a cache. Another aspect includes determining that the load address matches at least one block address. Another aspect includes, based on determining that there is an address match, reading a data block from a buffer register and sending the data to satisfy the load request; comparing a unique set id of the data block to the set id of the matching address after sending the data block; and, based on determining that there is a set id match, continuing the load request, or, based on determining that there is not a set id match, setting a store-forwarding state of the matching address to no store-forwarding and rejecting the load request. | 04-17-2014
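The store-forwarding abstracts above (20090210632 and 20090210679) describe the same core mechanism: each store merges its bytes into the most recent value of a complete data block, the merged block is buffered in a store data queue, and later requests are forwarded from that queue on a block-by-block basis. A minimal Python sketch of that idea follows; the 8-byte block size and the dict-backed memory are illustrative assumptions, not details taken from the patents:

```python
# Sketch of block-granular store forwarding: stores merge into the most
# recent value of a complete block, the result is buffered in a store data
# queue, and loads are satisfied from the queue block-by-block.
BLOCK_SIZE = 8  # bytes per block; an assumed parameter for illustration

class StoreDataQueue:
    def __init__(self, memory):
        self.memory = memory  # backing store: dict of block_addr -> bytes
        self.queue = {}       # block_addr -> most recent merged block

    def _latest(self, block_addr):
        # Most recent value: the queued merged copy if present, else memory.
        if block_addr in self.queue:
            return bytearray(self.queue[block_addr])
        return bytearray(self.memory.get(block_addr, bytes(BLOCK_SIZE)))

    def store(self, addr, data):
        # Merge store data into the complete block and buffer the result.
        block_addr, offset = addr - addr % BLOCK_SIZE, addr % BLOCK_SIZE
        block = self._latest(block_addr)
        block[offset:offset + len(data)] = data
        self.queue[block_addr] = bytes(block)

    def load(self, addr, length):
        # Forward from the queue block-by-block where a merged copy exists.
        out = bytearray()
        for a in range(addr, addr + length):
            block_addr = a - a % BLOCK_SIZE
            out.append(self._latest(block_addr)[a % BLOCK_SIZE])
        return bytes(out)
```

Because each store merges into a complete, up-to-date block before queueing, a later load never needs to stitch bytes from multiple queue entries for the same block, which is the property the abstracts emphasize.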
Patent application number | Description | Published |
20100305850 | Vehicle Route Representation Creation - Techniques and systems are disclosed that provide for creating an accurate representation of a roadway network, such as for planning vehicle travel routes. Positioning data is obtained, such as GPS data points from a plurality of vehicles, which mark traces of vehicular travel. A location of a trace is clarified using adjustment forces that are related to the traces, for example, to form coherent groups of traces. From these groups of clarified traces, a graph line is created by merging the traces. | 12-02-2010 |
20100332131 | ROUTING, ALERTING, AND TRANSPORTATION GUIDANCE BASED ON PREFERENCES AND LEARNED OR INFERRED RISKS AND DESIRABILITIES - Techniques and systems are disclosed that provide a risk-based assessment for a user based on user location information. Incident data is acquired for incidents that involve potential risks (e.g., to people and/or property) from a plurality of locations and contexts, considering such factors as date, time, weather, traffic, and velocity. The incident data is matched to the user's location and context directly or indirectly to provide one or more potential outcomes of interest (e.g., accidents, injuries, fatalities), and inferences regarding the likelihood of events are made available. These measures are compared to desired risk thresholds for the user. In one embodiment, some routes, times, and conditions of travel may be preferred over other routes, times, and conditions. In another embodiment, users may be notified of a condition, or a vehicle's maximum velocity may be reduced, when the matched incident data meets or exceeds a user's risk threshold. | 12-30-2010
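The threshold step in 20100332131 (match incident data to the user's context, then compare the resulting risk measure against the user's risk threshold to decide whether to alert or cap velocity) can be sketched in a few lines. The context keys and rate values below are invented for illustration; the abstract does not specify a data model:

```python
# Sketch of the risk-threshold check: incident rates matched to the user's
# current context are compared to the user's desired risk threshold, and an
# alert (or velocity cap) fires when the threshold is met or exceeded.
def assess_risk(incident_rates, context, risk_threshold):
    """Return (risk, alert) for a matched context; unmatched contexts get 0."""
    risk = incident_rates.get(context, 0.0)
    return risk, risk >= risk_threshold

# Illustrative incident rates keyed by a (road segment, weather) context.
rates = {("I-5N", "rain"): 0.04, ("I-5N", "clear"): 0.01}
```

For example, with a user threshold of 0.03, the rainy I-5N context above would trigger an alert while the clear-weather context would not.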
Patent application number | Description | Published |
20100010733 | ROUTE PREDICTION - Driving history of a user with regard to a particular road intersection can be collected and retained in storage. A Markov model can be used to predict likelihood of the user making a particular decision regarding the intersection. A highest likelihood decision can be identified and used to create a travel route. In addition, contextual information can be taken into account when creating the route, such as time of day, road conditions, user situation, and the like. | 01-14-2010 |
20110205125 | INFERRING BEACON POSITIONS BASED ON SPATIAL RELATIONSHIPS - Estimating positions of beacons based on spatial relationships among neighboring beacons. Beacon reference data defining positions of beacons is stored from beacon fingerprints observed by devices (e.g., enabled with global positioning system receivers). For a received beacon fingerprint having at least one beacon for which the beacon reference data is missing (e.g., from a device without a GPS receiver), beacons in the received beacon fingerprint for which beacon reference data is available are identified. Based on these identified beacons, the missing beacon reference data is calculated. In some embodiments, a set of spatially diverse beacons is selected from the identified beacons prior to calculating the beacon reference data. | 08-25-2011 |
20120010996 | RECOMMENDATIONS AND TARGETED ADVERTISING BASED UPON DIRECTIONS REQUESTS ACTIVITY AND DATA - Concepts and technologies are described herein for providing recommendations and/or advertisements based upon route query activity. A web server is configured to receive queries from an entity. The queries and contextual data associated therewith can be analyzed, and data relating to the queries can be stored by the web server as route activity logs. Adjacent routes and explicit waypoint routing can be abstracted, via address directories and ontologies, to higher-level goals. The route activities and goals can serve as case libraries for constructing, via machine learning, models that predict interests and preferences associated with visits to locations and the sequencing of such visits. Training data can include correlated contextual data, such as the time and day, prior route queries, and weather, to learn predictive models. Predictions about context- and destination-centric goals and interests can be harnessed to predict preferences, to target advertising about waypoints and alternative destinations of potential interest, or to deliver advertisements about location-centric or location-independent products or services, all of which can drive recommendations in the present or at a future time. | 01-12-2012
20120158289 | MOBILE SEARCH BASED ON PREDICTED LOCATION - A method includes receiving one or more search terms at a mobile computing device while the mobile computing device is located at a particular location. A search query that includes the one or more search terms and a location history of the mobile computing device is transmitted to a server. The method also includes receiving one or more search results in response to the search query, where the one or more search results include content identified based on a predicted destination of the mobile computing device. An interface identifying the one or more search results is displayed at the mobile computing device. | 06-21-2012 |
20120310376 | OCCUPANCY PREDICTION USING HISTORICAL OCCUPANCY PATTERNS - Methods and systems for occupancy prediction using historical occupancy patterns are described. In an embodiment, an occupancy probability is computed by comparing a recent occupancy pattern to historic occupancy patterns. Sensor data for a room, or other space, is used to generate a table of past occupancy which comprises these historic occupancy patterns. The comparison identifies a number of similar historic occupancy patterns, and data from these similar patterns is combined to generate an occupancy probability for a time in the future. In an example, time may be divided into discrete slots and binary values may be used to indicate occupancy or non-occupancy in each slot. An occupancy probability for a defined future time slot then comprises a combination of the binary values for corresponding time slots from each of the identified similar occupancy patterns. | 12-06-2012
20130053054 | USING PREDICTIVE TECHNOLOGY TO INTELLIGENTLY CHOOSE COMMUNICATION - Selecting communication settings. A method includes observing at least one of present, prior, or anticipated future movement of a user. Based on the observed user movement, embodiments may predict one or more future locations of the user. Based on the one or more future locations of the user, a communication setting of a device is selected to be used by the user. | 02-28-2013 |
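The slot-based scheme in 20120310376 can be sketched directly: occupancy is a binary value per discrete time slot, similar historic patterns are found by comparing their past slots against the recent pattern, and the corresponding future slots of the matching patterns are averaged into a probability. The `max_mismatch` similarity cutoff is an assumed parameter, not something the abstract specifies:

```python
# Sketch of occupancy prediction from historical binary occupancy patterns:
# find historic patterns whose past slots resemble the recent pattern, then
# average their values at the requested future slot.
def occupancy_probability(recent, history, future_offset, max_mismatch=1):
    """P(occupied) at `future_offset` slots past the end of `recent`.

    `recent` is a list of 0/1 slot values; `history` is a list of longer
    0/1 patterns whose first len(recent) slots are the comparable past.
    Returns None when no historic pattern is similar enough.
    """
    window = len(recent)
    future_values = []
    for pattern in history:
        # Count disagreements between this pattern's past and the recent one.
        mismatch = sum(a != b for a, b in zip(pattern[:window], recent))
        has_future = len(pattern) > window + future_offset - 1
        if mismatch <= max_mismatch and has_future:
            future_values.append(pattern[window + future_offset - 1])
    if not future_values:
        return None
    return sum(future_values) / len(future_values)
```

With patterns of four past slots plus one future slot, a recent pattern of `[0, 1, 1, 1]` would match histories such as `[0, 1, 1, 1, 1]` exactly and `[0, 1, 1, 0, 1]` within one mismatch, and the returned probability is the mean of their final slots.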