Search

Your search for '"knowledge distillation"' returned 355 results.

Search Constraints

You searched for: Descriptor: "knowledge distillation"; Topic: distillation

Search Results

1. TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.

2. Hierarchical Knowledge Propagation and Distillation for Few-Shot Learning.

3. Improving Differentiable Architecture Search via self-distillation.

4. Dual Distillation Discriminator Networks for Domain Adaptive Few-Shot Learning.

5. Inferior and Coordinate Distillation for Object Detectors.

6. Ssd-kdgan: a lightweight SSD target detection method based on knowledge distillation and generative adversarial networks.

7. DPAL-BERT: A Faster and Lighter Question Answering Model.

8. Variational AdaBoost knowledge distillation for skin lesion classification in dermatology images.

10. Descriptor Distillation: A Teacher-Student-Regularized Framework for Learning Local Descriptors.

11. Optimal Knowledge Distillation through Non-Heuristic Control of Dark Knowledge.

12. Lightweight Algorithm for Rail Fastener Status Detection Based on YOLOv8n.

13. A hybrid model compression approach via knowledge distillation for predicting energy consumption in additive manufacturing.

14. Bi-Level Orthogonal Multi-Teacher Distillation.

15. Empowering lightweight detectors: Orientation Distillation via anti-ambiguous spatial transformation for remote sensing images.

16. Cross-Architecture Knowledge Distillation.

17. Dual-student knowledge distillation for visual anomaly detection.

18. A Multi-Level Adaptive Lightweight Net for Damaged Road Marking Detection Based on Knowledge Distillation.

19. MicroBERT: Distilling MoE-Based Knowledge from BERT into a Lighter Model.

20. MKDAT: Multi-Level Knowledge Distillation with Adaptive Temperature for Distantly Supervised Relation Extraction.

21. Knowledge Distillation via Hierarchical Matching for Small Object Detection.

22. FA-VTON: A Feature Alignment-Based Model for Virtual Try-On.

23. Multistage feature fusion knowledge distillation.

24. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.

25. A Pruning and Distillation Based Compression Method for Sonar Image Detection Models.

26. Self-Knowledge Distillation via Progressive Associative Learning.

27. PKD-Net: Distillation of prior knowledge for image completion by multi-level semantic attention.

28. Shared Knowledge Distillation Network for Object Detection.

29. Efficient Speech Detection in Environmental Audio Using Acoustic Recognition and Knowledge Distillation.

30. Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation.

31. A Multi-Teacher Distillation Scheme for BERT Model.

32. Channel-level Matching Knowledge Distillation for object detectors via MSE.

33. Distillation-based fabric anomaly detection.

34. Joint data augmentation and knowledge distillation for few-shot continual relation extraction.

35. Tomato leaf disease recognition based on multi-task distillation learning.

36. Teacher–student complementary sample contrastive distillation.

37. Kernel-mask knowledge distillation for efficient and accurate arbitrary-shaped text detection.

38. DSP-KD: Dual-Stage Progressive Knowledge Distillation for Skin Disease Classification.

39. "Putting Science into Action": A Case Study of How an Educational Intermediary Organization Synthesizes and Translates Research Evidence for Practice.

40. Modeling different effects of user and product attributes on review sentiment classification.

41. Personalized Federated Learning Based on Bidirectional Knowledge Distillation for WiFi Gesture Recognition.

42. Few-Shot Image Classification via Mutual Distillation.

43. Mutual Information-Based Neural Network Distillation for Improving Photonic Neural Network Training.

44. Self-Supervised Network Distillation for Exploration.

45. Forest Fire Object Detection Analysis Based on Knowledge Distillation.

46. Improving relation classification effectiveness by alternate distillation.

47. FMDL: Federated Mutual Distillation Learning for Defending Backdoor Attacks.

48. Research on a High-Performance Rock Image Classification Method.

49. Environment-Aware Knowledge Distillation for Improved Resource-Constrained Edge Speech Recognition.

50. A Method for Image Anomaly Detection Based on Distillation and Reconstruction.
