208 results for "William Merrill"
Search Results
2. Transparency Helps Reveal When Language Models Learn Meaning
3. The Parallelism Tradeoff: Limitations of Log-Precision Transformers
4. Saturated Transformers are Constant-Depth Threshold Circuits
5. Provable Limitations of Acquiring Meaning from Ungrounded Form: What Will Future Language Models Understand?
6. 2 OLMo 2 Furious.
7. Evaluating n-Gram Novelty of Language Models Using Rusty-DAWG.
8. OLMo: Accelerating the Science of Language Models.
9. Can You Learn Semantics Through Next-Word Prediction? The Case of Entailment.
10. Review of 'Nómadas y sedentarios en el Norte de México: homenaje a Beatriz Braniff', edited by Marie Areti Hers and José Luis Mirafuente
11. The Illusion of State in State-Space Models.
12. How Language Model Hallucinations Can Snowball.
13. The Expressive Power of Transformers with Chain of Thought.
14. Formal Languages and the NLP Black Box.
15. Let's Think Dot by Dot: Hidden Computation in Transformer Language Models.
16. ReCLIP: A Strong Zero-Shot Baseline for Referring Expression Comprehension.
17. Entailment Semantics Can Be Extracted from an Ideal Language Model.
18. A Logic for Expressing Log-Precision Transformers.
19. A Tale of Two Circuits: Grokking as Competition of Sparse and Dense Subnetworks.
20. Transformers as Recognizers of Formal Languages: A Survey on Expressivity.
23. Competency Problems: On Finding and Removing Artifacts in Language Data.
24. Effects of Parameter Norm Growth During Transformer Training: Inductive Bias from Gradient Descent.
25. Dynasty, Declension, and the Endurance of the House of Adams
26. Epilogue. Color outside the Lines
27. Index
28. Title Page, Copyright Page
29. Cover
30. Acknowledgments
31. Chapter 5. Notes from Underground: Richard Wright, Ralph Ellison, James Baldwin, Malcolm X
32. Bibliography
33. Notes
34. Chapter 6. Next Worlds: Toni Morrison and Octavia Butler
35. Chapter 4. Domestic Uplift and Escape Abroad: Booker T. Washington, Ida B. Wells-Barnett, W. E. B. Du Bois
36. Chapter 3. Slave State to Free: David Walker, Frederick Douglass, Harriet Jacobs
37. Chapter 2. Voices from the Global South: Phillis Wheatley and Olaudah Equiano
38. Prologue
39. Chapter 1. Signifying Space: Geographies of Domination and Resistance
40. Formal languages and neural models for learning on sequences.
41. A Formal Hierarchy of RNN Architectures.
42. Transformers Implement First-Order Logic with Majority Quantifiers.
43. Extracting Finite Automata from RNNs Using State Merging.
44. Log-Precision Transformers are Constant-Depth Uniform Threshold Circuits.
46. Detecting Syntactic Change Using a Neural Part-of-Speech Tagger.
47. Finding Hierarchical Structure in Neural Stacks Using Unsupervised Parsing.
48. Context-Free Transductions with Neural Stacks.
49. End-to-End Graph-Based TAG Parsing with Neural Networks.
50. Formal Language Theory Meets Modern NLP.