Results for "full pass through data"
Learning structure from unlabeled data, such as discovering groups, compressing representations, or modeling data distributions.
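One such structure-discovery technique is clustering. A minimal sketch, assuming a toy one-dimensional k-means with made-up data and hand-picked initial centers (all names and values here are illustrative):

```python
# Toy 1-D k-means: discover two groups in unlabeled data.
# Data, initial centers, and iteration count are illustrative assumptions.

def kmeans_1d(points, centers, iters=10):
    """Assign each point to its nearest center, then recompute centers."""
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        centers = [sum(ps) / len(ps) for ps in clusters.values() if ps]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]   # two visibly separated groups
print(kmeans_1d(data, centers=[0.0, 5.0]))
```

No labels are involved: the grouping emerges purely from distances between points.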
Information that can identify an individual, directly or indirectly; requires careful handling and compliance with privacy regulations.
Inferring sensitive attributes of the examples a model was trained on, based on the model's outputs or parameters.
Training one model on multiple tasks simultaneously to improve generalization through shared structure.
Configuration choices not learned directly (or not typically learned) that govern training or architecture.
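The distinction is easiest to see in code: hyperparameters are fixed before training, while parameters (weights) are what training updates. A minimal sketch with illustrative names and values:

```python
# Hyperparameters are chosen before training; parameters are learned.
# All names and values below are illustrative, not from any specific model.

hyperparams = {
    "learning_rate": 1e-3,   # governs optimizer step size
    "batch_size": 32,        # examples per gradient update
    "num_layers": 4,         # architecture choice
    "weight_decay": 1e-4,    # regularization strength
}

def train_step(weight, grad, hp):
    """One SGD step with weight decay, governed by the hyperparameters."""
    return weight - hp["learning_rate"] * (grad + hp["weight_decay"] * weight)

print(train_step(0.5, 0.1, hyperparams))
```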
A scalar measure optimized during training, typically expected loss over data, sometimes with regularization terms.
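As a concrete sketch, here is one common form of such an objective: mean squared error over the data plus an L2 regularization term. The squared-error loss and the penalty weight `lam` are illustrative choices, not the only options:

```python
# Regularized objective: mean loss over (x, y) pairs + lam * ||w||^2.
# Squared error and lam = 0.01 are illustrative assumptions.

def objective(weights, data, lam=0.01):
    def predict(x):
        return sum(w * xi for w, xi in zip(weights, x))
    mse = sum((predict(x) - y) ** 2 for x, y in data) / len(data)
    l2 = sum(w * w for w in weights)
    return mse + lam * l2

data = [([1.0, 2.0], 5.0), ([0.0, 1.0], 2.0)]
print(objective([1.0, 2.0], data))   # loss is 0 here, so only the penalty remains
```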
Crafting prompts to elicit desired behavior, often using role, structure, constraints, and examples.
System design where humans validate or guide model outputs, especially for high-stakes decisions.
A wide, flat basin in the loss landscape, often correlated with better generalization than a sharp minimum.
Low-latency prediction per request.
Compromising AI systems via libraries, models, or datasets.
Models accessible only via service APIs.
An operation that measures similarity between vectors and underlies the projection of one vector onto another.
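A minimal sketch of how this operation yields both a similarity score and a projection; the example vectors are arbitrary:

```python
import math

# Dot product: measures alignment between vectors, and divides out to
# give cosine similarity and the scalar projection of a onto b.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine_similarity(a, b):
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

def scalar_projection(a, b):
    """Length of a's shadow along the direction of b."""
    return dot(a, b) / math.sqrt(dot(b, b))

a, b = [3.0, 4.0], [1.0, 0.0]
print(dot(a, b))                # 3.0
print(cosine_similarity(a, b))  # 0.6
print(scalar_projection(a, b))  # 3.0
```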
Model relies on spurious or irrelevant signals that happen to correlate with the label in the training data.
Model performs well during training and evaluation but degrades after deployment, often because deployment conditions differ from training.
AI used without governance approval.
Acting to minimize surprise or free energy.
Interpreting human gestures.
Automated assistance in identifying disease indicators, for example in medical images or test results.
The US FDA review process for approving or clearing AI-based medical devices.
Predicting borrower default risk.
AI discovering new compounds/materials.
Training with a small labeled dataset plus a larger unlabeled dataset, leveraging assumptions like smoothness/cluster structure.
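One simple instance of this idea is self-training: fit on the labeled points, pseudo-label the unlabeled points the model is confident about, and refit. The sketch below uses a toy 1-D threshold classifier; the data and the confidence margin are illustrative assumptions:

```python
# Self-training sketch for semi-supervised learning: fit a 1-D threshold
# on a few labeled points, pseudo-label unlabeled points far from the
# threshold, and refit. Data and margin are made-up examples.

def fit_threshold(points):
    """Midpoint between the two class means; predict class 1 above it."""
    lo = [x for x, y in points if y == 0]
    hi = [x for x, y in points if y == 1]
    return (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2

labeled = [(0.0, 0), (1.0, 0), (9.0, 1)]
unlabeled = [0.5, 8.5, 9.5]

t = fit_threshold(labeled)           # threshold from labeled data alone
margin = 2.0                         # confidence margin (assumed)
pseudo = [(x, int(x > t)) for x in unlabeled if abs(x - t) > margin]
t_refit = fit_threshold(labeled + pseudo)
print(t, t_refit)
```

The cluster assumption does the work here: unlabeled points are trusted only when they sit clearly on one side of the current decision boundary.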
A mismatch between training and deployment data distributions that can degrade model performance.
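A crude way to detect such a mismatch is to compare summary statistics of a training sample against a deployment sample. The threshold and data below are illustrative assumptions, not a recommended production check:

```python
# Naive drift check: flag a shift when feature means diverge.
# Threshold and samples are made-up examples.

def mean(xs):
    return sum(xs) / len(xs)

def shifted(train, deploy, threshold=0.5):
    """True when the two samples' means differ by more than `threshold`."""
    return abs(mean(train) - mean(deploy)) > threshold

train = [1.0, 1.2, 0.9, 1.1]
deploy = [2.0, 2.1, 1.9, 2.2]
print(shifted(train, deploy))   # the deployment mean has drifted upward
```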
Learning where data arrives sequentially and the model updates continuously, often under changing distributions.
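The defining loop is: receive one example, update immediately, discard it. A minimal sketch fitting a single weight by streaming SGD; the simulated stream and true slope are assumptions for illustration:

```python
import random

# Online learning sketch: update a single weight after each arriving
# example instead of training on a stored dataset.

random.seed(0)
w, lr = 0.0, 0.05
for _ in range(500):
    x = random.uniform(-1, 1)      # one example arrives from the stream
    y = 3.0 * x                    # assumed true relation y = 3x
    grad = 2 * (w * x - y) * x     # gradient of squared error for this example
    w -= lr * grad                 # immediate update; the example is discarded
print(w)                           # converges toward the true slope 3.0
```

Under a changing distribution the same loop keeps adapting, since old examples exert no direct influence once processed.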
The internal space where learned representations live; operations here often correlate with semantics or generative factors.
Designing input features to expose useful structure (e.g., ratios, lags, aggregations), often crucial outside deep learning.
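The three patterns named above (ratios, lags, aggregations) can be sketched directly; the `transactions` records are made-up examples:

```python
# Feature-engineering sketch: derive ratio, lag, and running-aggregate
# features from raw columns. All field names and values are illustrative.

transactions = [
    {"amount": 120.0, "balance": 600.0},
    {"amount": 30.0,  "balance": 570.0},
    {"amount": 90.0,  "balance": 480.0},
]

features = []
for i, row in enumerate(transactions):
    features.append({
        "spend_ratio": row["amount"] / row["balance"],                 # ratio
        "prev_amount": transactions[i - 1]["amount"] if i else None,   # lag-1
        "mean_so_far": sum(t["amount"] for t in transactions[: i + 1]) / (i + 1),  # aggregation
    })
print(features[2])
```

Each derived column makes a relationship explicit that a simple model could not easily recover from the raw values alone.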
Reconstructing training examples from shared model gradients, a privacy risk in settings such as federated learning.
Diffusion model trained to remove noise step by step.
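The "step by step" refers to reversing a gradual noising process. The forward (noising) direction is easy to sketch on a single scalar; the linear beta schedule and step count are illustrative assumptions, and the learned denoiser itself is not shown:

```python
import math, random

# Forward diffusion sketch: repeatedly mix a clean value with Gaussian
# noise. A diffusion model would be trained to undo each of these steps.

random.seed(0)
T = 10
betas = [0.01 + 0.04 * t / (T - 1) for t in range(T)]   # assumed noise schedule

x = 1.0                                  # a "clean" data point
for beta in betas:
    noise = random.gauss(0.0, 1.0)
    x = math.sqrt(1 - beta) * x + math.sqrt(beta) * noise
print(x)                                 # a noisier version of the original
```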
Diffusion performed in latent space for efficiency.