Search

Showing 50 results for Author "Wang, Shuohuan".


Search Results

1. MA-RLHF: Reinforcement Learning from Human Feedback with Macro Actions

2. Upcycling Instruction Tuning from Dense to Mixture-of-Experts via Parameter Merging

3. NACL: A General and Effective KV Cache Eviction Framework for LLMs at Inference Time

4. DHA: Learning Decoupled-Head Attention from Transformer Checkpoints via Adaptive Heads Fusion

5. HFT: Half Fine-Tuning for Large Language Models

6. Autoregressive Pre-Training on Pixels and Texts

7. On Training Data Influence of GPT Models

8. Tool-Augmented Reward Modeling

9. ERNIE-Music: Text-to-Waveform Music Generation with Diffusion Models

10. ERNIE-Code: Beyond English-Centric Cross-lingual Pretraining for Programming Languages

11. X-PuDu at SemEval-2022 Task 6: Multilingual Learning for English and Arabic Sarcasm Detection

12. X-PuDu at SemEval-2022 Task 7: A Replaced Token Detection Task Pre-trained Model with Pattern-aware Ensembling for Identifying Plausible Clarifications

13. ERNIE-UniX2: A Unified Cross-lingual Cross-modal Framework for Understanding and Generation

14. ERNIE-SAT: Speech and Text Joint Pretraining for Cross-Lingual Multi-Speaker Text-to-Speech

15. Clip-Tuning: Towards Derivative-free Prompt Learning with a Mixture of Rewards

16. Nebula-I: A General Framework for Collaboratively Training Deep Learning Models on Low-Bandwidth Cloud Clusters

17. ERNIE 3.0 Titan: Exploring Larger-scale Knowledge Enhanced Pre-training for Language Understanding and Generation

18. ERNIE 3.0: Large-scale Knowledge Enhanced Pre-training for Language Understanding and Generation

19. ERNIE-Doc: A Retrospective Long-Document Modeling Transformer

20. ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora

21. Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models

22. ERNIE at SemEval-2020 Task 10: Learning Word Emphasis Selection by Pre-trained Language Model

23. kk2018 at SemEval-2020 Task 9: Adversarial Training for Code-Mixing Sentiment Classification

24. ERNIE 2.0: A Continual Pre-training Framework for Language Understanding

25. ERNIE: Enhanced Representation through Knowledge Integration

26. Dual Modalities of Text: Visual and Textual Generative Pre-training

33. Correcting Chinese Spelling Errors with Phonetic Pre-training

35. abcbpc at SemEval-2021 Task 7: ERNIE-based Multi-task Model for Detecting and Rating Humor and Offense

48. OleNet at SemEval-2019 Task 9: BERT based Multi-Perspective Models for Suggestion Mining

50. A stacking ensemble model for predicting the occurrence of carotid atherosclerosis.
