Abstract of: US2025307306A1
Systems and methods for responding to a subscriber's text-based request for content items are presented. In response to a request from a subscriber, word pieces are generated from the text-based terms of the request. A request embedding vector of the word pieces is obtained from a trained machine learning model. Using the request embedding vector, a set of content items, from a corpus of content items, is identified. At least some content items of the set of content items are returned to the subscriber in response to the text-based request for content items.
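The abstract does not disclose the tokenizer or the trained model, so the following is a minimal sketch of the retrieval flow under stated assumptions: word pieces are approximated by character trigrams, the trained embedding model is replaced by a hash-based toy embedder, and similarity is cosine similarity. All names here are illustrative, not from the patent.

```python
import hashlib
import math

def word_pieces(term):
    # Toy word-piece split: the whole term plus its character trigrams.
    return [term] + [term[i:i + 3] for i in range(len(term) - 2)]

def embed(pieces, dim=32):
    # Stand-in for the trained model: hash each piece into a dense vector.
    vec = [0.0] * dim
    for p in pieces:
        h = int(hashlib.md5(p.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a, b):
    # Vectors are unit-normalized, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def retrieve(request, corpus, top_k=2):
    # Embed the request, rank corpus items by similarity, return the best.
    q = embed([p for t in request.split() for p in word_pieces(t)])
    scored = sorted(corpus, key=lambda item: -cosine(q, item["vec"]))
    return [item["id"] for item in scored[:top_k]]

corpus = [{"id": name, "vec": embed(word_pieces(name))}
          for name in ["machine learning", "cooking recipes", "machine tools"]]
print(retrieve("machine learning", corpus))
```

A production system would replace `embed` with the trained model's inference call and an approximate-nearest-neighbor index over the corpus embeddings.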
Abstract of: US2025307673A1
Various examples are directed to systems and methods for executing a computer-automated process using trained machine learning (ML) models. A computing system may access first event data describing a first event. The computing system may execute a first ML model to determine an ML characterization of the first event using the first event data. The computing system may also apply a first rule set to the first event data to generate a rule characterization of the first event. The computing system may determine an output characterization of the first event based at least in part on the rule characterization of the first event and determine to deactivate the first rule set based at least in part on the ML characterization of the first event.
Abstract of: US2025307662A1
Various embodiments of the present invention provide methods, apparatus, systems, computing devices, computing entities, and/or the like for performing temporally dynamic location-based predictive data analysis. Certain embodiments of the present invention utilize systems, methods, and computer program products that perform temporally dynamic location-based predictive data analysis by using at least one of cohort generation machine learning models and cohort-based growth forecast machine learning models.
Abstract of: US2025307660A1
The disclosed embodiments include computer-implemented apparatuses and processes that perform causal inferencing in distributed computing environments using trained double machine learning models and trained classifiers. For example, an apparatus may receive, from a device, a request that includes an identifier associated with the device and exception data that includes a requested modification to a value of a parameter of a data exchange. The apparatus may also obtain labelling data based on an application of a trained classifier to a first input dataset that includes a value of an elasticity parameter associated with the request, may generate elements of decision data associated with the requested modification based on the labelling data and the exception data, and may transmit, to the device, a response to the request that includes the elements of decision data.
Abstract of: US2025307671A1
A data correction method, a computing apparatus used for machine learning, and a computer-readable medium are provided. In the method, multiple pieces of sensing data are related to one another, and a causal relationship is generated. The causal relationship is compared, and a comparison result is generated; the comparison result is used to modify the sensing data. The machine learning model is then trained on the modified sensing data, thereby ensuring the correctness of the data.
Abstract of: US2025307503A1
A wellbore temperature optimization and prediction method integrating numerical models and machine learning includes the following steps: establishing a wellbore-formation transient heat transfer model; obtaining an initial data set composed of relevant parameters; normalizing the initial data set; training a wellbore temperature prediction model using a random forest algorithm; optimizing the hyperparameters of the random forest algorithm using a genetic algorithm; performing global optimization using an annealing algorithm to obtain the optimized wellbore temperature and related parameters; calculating the wellbore temperature by substituting the optimized parameters into the wellbore-formation transient heat transfer model; and performing comparative verification between the optimized wellbore temperature and the calculated wellbore temperature.
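The hyperparameter-optimization step can be illustrated with a toy genetic algorithm. The `fitness` function below is a stand-in for the random-forest validation error on the normalized wellbore data set; the hyperparameter names (`n_trees`, `max_depth`), the quadratic stand-in objective, and all numeric ranges are illustrative assumptions, not taken from the patent.

```python
import random

random.seed(0)

def fitness(n_trees, max_depth):
    # Assumed stand-in for cross-validation error of the random forest;
    # minimized near the hypothetical optimum (120 trees, depth 8).
    return (n_trees - 120) ** 2 / 1e4 + (max_depth - 8) ** 2 / 10

def mutate(ind):
    # Perturb both hyperparameters while keeping them in a valid range.
    return (max(10, ind[0] + random.randint(-20, 20)),
            max(2, ind[1] + random.randint(-2, 2)))

def genetic_search(generations=40, pop_size=20):
    pop = [(random.randint(10, 300), random.randint(2, 20))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind))
        elite = pop[: pop_size // 4]  # selection: keep the best quarter
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda ind: fitness(*ind))

best = genetic_search()
print(best, round(fitness(*best), 4))
```

In the patented method the winning hyperparameters would then parameterize the random forest, with simulated annealing applied afterward for the global optimization stage.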
Abstract of: WO2025204495A1
An information processing device according to the present invention extracts, from combinations of words included in each of a plurality of sentences to which a teacher label is assigned, a combination of words in which an index value indicating a relation between the combination of words and the teacher label satisfies a predetermined condition, determines whether or not the extracted combination of words is included in the same context in each of the plurality of sentences, and generates training data associated with the teacher label by using, as a feature, presence/absence of the combination of words determined to be included in the same context, in each of the plurality of sentences.
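A rough sketch of the feature-generation idea, under assumptions the abstract leaves open: the "index value" is taken to be the fraction of same-label sentences containing a word pair, and "same context" is approximated by a fixed token window. The labels, threshold, and window size are all illustrative.

```python
# Labeled sentences standing in for teacher-labeled training text.
data = [("the server crashed after the update", "bug"),
        ("the server crashed during the update", "bug"),
        ("the update improved server speed", "ok")]

def pair_label_score(pairs_by_sentence, labels, pair, label):
    # Assumed index value: fraction of sentences with this teacher label
    # that contain the word pair.
    with_label = [ps for ps, l in zip(pairs_by_sentence, labels) if l == label]
    return sum(pair in ps for ps in with_label) / len(with_label)

def extract_features(data, label="bug", threshold=1.0, window=3):
    sents = [s.split() for s, _ in data]
    labels = [l for _, l in data]
    # "Same context" approximated as: both words within `window` tokens.
    pairs_by_sentence = [{(a, b) for i, a in enumerate(ws)
                          for j, b in enumerate(ws)
                          if i < j and j - i <= window} for ws in sents]
    all_pairs = set().union(*pairs_by_sentence)
    selected = sorted(p for p in all_pairs
                      if pair_label_score(pairs_by_sentence, labels,
                                          p, label) >= threshold)
    # Training rows: one binary presence/absence feature per selected pair.
    rows = [[int(p in ps) for p in selected] for ps in pairs_by_sentence]
    return rows, selected

rows, feats = extract_features(data)
print(feats)
```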
Abstract of: WO2025207773A1
Electrical impedance tomography (EIT) is a non-destructive and non-radioactive imaging technique used to detect anomalies in materials. Deep learning (DL)-based EIT reconstruction addresses the non-linear, ill-conditioned nature of EIT inverse problems and allows for the conductivity of materials to be reconstructed directly through neural networks (NNs), as opposed to iteratively with conventional inverse reconstruction algorithms. To advance the DL-based reconstruction for EIT, a neural network (NN) architecture is trained with a custom loss function that serves as a surrogate model for the compressed sensing-based EIT reconstruction algorithm. The NN is trained to mimic a compressed sensing algorithm that performs the EIT conductivity reconstruction. The resultant NN is able to accurately capture the electrical properties and characteristics of the sensing domain when trained with limited data of varying quality, and this improves upon other DL models trained with the traditional MSE loss function.
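The custom loss is not specified in the abstract. A common way to make a network mimic a compressed-sensing reconstructor is to add an l1 sparsity term to the data-fidelity (MSE) term; the sketch below assumes exactly that combination, with an illustrative weight `lam`.

```python
def surrogate_loss(pred, target, lam=0.1):
    # MSE data-fidelity term plus an l1 sparsity term, mimicking the
    # l1-regularized objective a compressed-sensing reconstructor minimizes.
    n = len(pred)
    mse = sum((p - t) ** 2 for p, t in zip(pred, target)) / n
    l1 = sum(abs(p) for p in pred) / n
    return mse + lam * l1

# The sparsity term favors reconstructions with few nonzero conductivity
# deviations, as compressed sensing does.
target = [0.0, 0.0, 0.0]
sparse = [0.0, 0.0, 0.1]
dense = [0.1, 0.1, 0.1]
print(surrogate_loss(sparse, target), surrogate_loss(dense, target))
```

With `lam=0` this reduces to the traditional MSE loss that the abstract says the surrogate improves upon.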
Abstract of: WO2025199901A1
Embodiments of the present disclosure provide a solution for switching a machine-learning (ML) model-related operation. In the solution, a first device determines that a switching condition is met, the switching condition being associated with at least one of the following: a first performance of a first model of the first device, a second performance of a second model of a second device, a first processing capability of the first model of the first device, or a second processing capability of the second model of the second device. In response to the switching condition being met, the first device transmits a first message to the second device, wherein the first message is used for switching the ML model-related operation from the first device to the second device or from the second device to the first device, and wherein the first and second models are associated with a same function.
Abstract of: WO2025207584A1
Onset and/or continuation of a migraine attack is predicted in a subject using a machine learning model. Subject health data are accessed with a computer system, where the subject health data include clinical test or measurement data received from the subject and/or subject symptom data received from the subject. A trained machine learning model is accessed with the computer system, where the trained machine learning model has been trained on training data to predict a likelihood of migraine attack occurring within a specified timeframe based on features in subject health data. The subject health data are input to the trained machine learning model using the computer system, generating classified feature data as an output. The classified feature data indicate a likelihood of the subject having a migraine attack within the specified timeframe.
Abstract of: US2025307245A1
Examples detect equivalent subexpressions within a computational workload. Examples include converting a query plan tree associated with a first subexpression into a matrix. The first subexpression is a portion of a database query from the computational workload. Each node in the query plan tree is represented as a row of the matrix. The matrix is converted into a first vector. The first subexpression is determined to be equivalent to a second subexpression by comparing the first vector to a second vector associated with the second subexpression. The comparison includes computing a distance between the first and second vectors that is lower than a distance threshold. The computational workload is modified, based on the determining, to perform the first subexpression and exclude performance of the second subexpression as duplicative.
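A compact sketch of the plan-tree-to-vector comparison, with an assumed node encoding (operator id plus estimated row count), depth-first flattening, and zero-padding so plans of different sizes share one vector space. The real encoding and distance threshold are not given in the abstract.

```python
import math

# Assumed node encoding: one matrix row per node, (operator id, row estimate).
OPS = {"scan": 0, "filter": 1, "join": 2}

def plan_to_matrix(plan):
    # Flatten the query plan tree depth-first, one row per node.
    rows = []
    def walk(node):
        rows.append([OPS[node["op"]], node["rows"]])
        for child in node.get("children", []):
            walk(child)
    walk(plan)
    return rows

def matrix_to_vector(matrix, max_nodes=8):
    # Flatten row-major and zero-pad so all plans compare in one space.
    flat = [float(x) for row in matrix for x in row]
    return flat + [0.0] * (2 * max_nodes - len(flat))

def equivalent(plan_a, plan_b, threshold=1.0):
    va = matrix_to_vector(plan_to_matrix(plan_a))
    vb = matrix_to_vector(plan_to_matrix(plan_b))
    return math.dist(va, vb) < threshold  # Euclidean distance test

a = {"op": "join", "rows": 100, "children": [
        {"op": "scan", "rows": 50}, {"op": "scan", "rows": 60}]}
b = {"op": "join", "rows": 100, "children": [
        {"op": "scan", "rows": 50}, {"op": "scan", "rows": 60}]}
c = {"op": "filter", "rows": 5, "children": [{"op": "scan", "rows": 50}]}
print(equivalent(a, b), equivalent(a, c))
```

When `equivalent` fires, the workload rewriter would keep one subexpression and drop the duplicate, as the abstract describes.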
Abstract of: EP4625199A1
Examples detect equivalent subexpressions within a computational workload. Examples include converting a query plan tree associated with a first subexpression into a matrix. The first subexpression is a portion of a database query from the computational workload. Each node in the query plan tree is represented as a row of the matrix. The matrix is converted into a first vector. The first subexpression is determined to be equivalent to a second subexpression by comparing the first vector to a second vector associated with the second subexpression. The comparison includes computing a distance between the first and second vectors that is lower than a distance threshold. The computational workload is modified, based on the determining, to perform the first subexpression and exclude performance of the second subexpression as duplicative.
Abstract of: GB2639745A
According to various examples of the present disclosure, there is provided a location management function (LMF) entity configured to: subscribe to or request artificial intelligence/machine learning (AI/ML)-related services from a network data analytics function (NWDAF) entity in relation to an AI/ML model for determining positioning of a user equipment (UE); and receive, from the NWDAF entity, an indication that training has been performed for the AI/ML model. According to various examples of the present disclosure, there is provided a network data analytics function (NWDAF) entity configured to: receive, from a location management function (LMF) entity, a subscription to or request for artificial intelligence/machine learning (AI/ML)-related services in relation to an AI/ML model for determining positioning of a UE; obtain data for training the AI/ML model from at least one other entity; train the AI/ML model based on the obtained data; and transmit, to the LMF entity, an indication that training has been performed for the AI/ML model; wherein the NWDAF entity includes a model training logical function (MTLF).
Publication No.: EP4624960A1, 01/10/2025
Applicant:
LG ENERGY SOLUTION LTD [KR]
Abstract of: EP4624960A1
A battery life estimation apparatus includes a charging/discharging unit configured to charge/discharge a battery and a controller configured to calculate a partial accumulative capacity corresponding to a partial voltage period defined as a period from a first voltage to a second voltage by charging/discharging the battery in the partial voltage period, and estimate a life corresponding to an entire voltage period of the battery by inputting the first voltage, the second voltage, and the partial accumulative capacity to an estimation model trained based on machine learning.
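As an illustration of the final step, the sketch below stands in for the trained estimation model with a one-feature least-squares fit from partial accumulative capacity to life; the training pairs are synthetic, and the voltage arguments are carried only to mirror the claimed three-input interface.

```python
# Synthetic training pairs: (partial accumulative capacity, life metric).
# A real model would be trained on measured charge/discharge cycle data.
train = [(1.00, 100.0), (0.95, 80.0), (0.90, 60.0), (0.85, 40.0)]

def fit(pairs):
    # Ordinary least squares on a single feature: life = k * capacity + b.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    k = (sum((x - mx) * (y - my) for x, y in pairs)
         / sum((x - mx) ** 2 for x, _ in pairs))
    return k, my - k * mx

def estimate_life(model, v1, v2, partial_capacity):
    # v1/v2 delimit the partial voltage period; in this toy model only the
    # capacity drives the estimate, but the claimed model takes all three.
    k, b = model
    return k * partial_capacity + b

model = fit(train)
print(round(estimate_life(model, 3.6, 4.0, 0.92), 1))
```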
