Search

Your search for '"knowledge distillation"' returned 35 results.

Search Constraints

You searched for: Descriptor: "knowledge distillation"; Publication Type: Magazines

Search Results

1. Federated probability memory recall for federated continual learning.

2. Rethinking attention mechanism in time series classification.

3. Real-time masked face classification and head pose estimation for RGB facial image via knowledge distillation.

4. Hierarchical knowledge amalgamation with dual discriminative feature alignment.

5. KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification.

6. Distillation enhanced time series forecasting network with momentum contrastive learning.

7. Rethinking deep learning for supercontinuum: Efficient modeling based on integrated and compressed networks.

8. Discretization and decoupled knowledge distillation for arbitrary oriented object detection.

9. Bridging knowledge distillation gap for few-sample unsupervised semantic segmentation.

10. Online deep hashing for both uni-modal and cross-modal retrieval.

12. A novel method for reducing arrhythmia classification from 12-lead ECG signals to single-lead ECG with minimal loss of accuracy through teacher-student knowledge distillation.

13. Accelerating Monte Carlo Bayesian Prediction via Approximating Predictive Uncertainty Over the Simplex.

14. UAWC: An intelligent underwater acoustic target recognition system for working conditions mismatching.

15. Improving adversarial robustness using knowledge distillation guided by attention information bottleneck.

16. Online knowledge distillation with elastic peer.

17. Item-side ranking regularized distillation for recommender system.

18. Distilling from professors: Enhancing the knowledge distillation of teachers.

19. Top-aware recommender distillation with deep reinforcement learning.

20. Classifier-adaptation knowledge distillation framework for relation extraction and event detection with imbalanced data.

21. Distilling Ordinal Relation and Dark Knowledge for Facial Age Estimation.

22. MHAT: An efficient model-heterogenous aggregation training scheme for federated learning.

23. Federated distillation and blockchain empowered secure knowledge sharing for Internet of medical Things.

24. Hybrid mix-up contrastive knowledge distillation.

25. Heterogeneous graph knowledge distillation neural network incorporating multiple relations and cross-semantic interactions.

26. Neighbor self-knowledge distillation.

27. Multi-stage knowledge distillation for sequential recommendation with interest knowledge.

28. Graph neural networks with deep mutual learning for designing multi-modal recommendation systems.

29. Block change learning for knowledge distillation.

30. A resource-efficient ECG diagnosis model for mobile health devices.

31. INFER: Distilling knowledge from human-generated rules with uncertainty for STINs.

32. Dynamic data-free knowledge distillation by easy-to-hard learning strategy.

33. Human activity recognition based on multiple inertial sensors through feature-based knowledge distillation paradigm.

34. Customizing a teacher for feature distillation.

35. WLAN interference signal recognition using an improved quadruple generative adversarial network.
