Background: Rigorous meta-analysis holds promise for answering the most important question in education: which interventions work, and for whom? Evidence-based reform in education has contributed to a proliferation of well-designed experiments in recent years, which calls for updated systematic reviews. This proliferation is a double-edged sword, however: the more new studies there are, the more labor- and time-consuming meta-analysis becomes. Choosing an appropriate screening tool is one of the first key steps in setting up the review process. In the field of education, there is a lack of research comparing different screening tools, although similar comparisons have been conducted in other fields, such as biomedical research (Mierden et al., 2019) and healthcare (Harrison et al., 2020). Through comparison and feature analysis, this narrative review examines currently available screening tools in education to equip educational meta-analysts with the information needed for tool selection.

Research Questions:
1. What is the status of reporting screening tools in published educational research?
2. What screening tools are available to educational analysts?
3. How do the available screening tools differ in features, and how should educational meta-analysts select an appropriate tool based on this information?

Practice: The software market offers many meta-analytical tools developed for specific purposes in particular fields, such as SyRF for preclinical studies, SRDB.PRO for the pharmaceutical industry, and Parsifal for software engineering. Not all of these tools are feasible for educational systematic research. This paper aims to identify available and appropriate tools for educational researchers; therefore, to be included, a screening tool must have been used in educational research. To assess this criterion, we searched for "tool name + education/school/students + meta-analysis/systematic review" on Google Scholar to find systematic reviews in education that cited the screening tools. Furthermore, we screened publications from the journal "Review of Educational Research" as a complementary search for additional tools. To locate available screening tools on the market, we conducted a web-based search and gathered tool names from publications (Harrison et al., 2020; Mierden et al., 2019; Schoot et al., 2021), blogs (Bradburn, 2018), university library resources (Roth, 2021), and research centers (Center for Evidence Synthesis in Health, 2021).

Research Design: This research combines a systematic review, a narrative review, and a feature analysis to locate and document different meta-analytical tools. Figure 1 shows the record selection process, conducted in Covidence (Covidence systematic review software). The comparison tables (Tables 1 and 2) present the main research findings, and the qualitative synthesis provides further analysis of the advantages and disadvantages of the different screening tools.

Data Collection and Analysis: Data on screening tools were collected from computerized databases to identify citations in educational studies. To understand the products' functions, we examined the tools' official websites, training videos, and the publications introducing them. For tools developed by academics, these publications give details on the functions, robustness, and limitations of the tools. We are also interested in the current status of screening-tool reporting in educational research; therefore, we screened publications from the journal "Review of Educational Research" (RER) in a systematic search and screening. Tools' features were coded as dummy variables (0/1) and separated into title and abstract screening features and full-text review features. Title and abstract screening features include title/abstract screening, bulk application, machine learning classifiers, deep learning, and duplicate removal; beyond the presence of machine learning classifiers, we also coded whether a tool uses deep neural network models for improved performance. Full-text review features include six functions: full-text review, team work, blind review, inter-rater reliability, research update, and reason labels.
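To make the coding scheme concrete, the following is a minimal sketch of how dummy-coded features can be summed into feature scores for ranking. The tool names and 0/1 values are hypothetical placeholders rather than the actual codes in Table 1, and the unweighted sum is an assumption about how scores are aggregated.

```python
# Minimal sketch of the dummy-variable coding described above.
# The tool names and 0/1 values are HYPOTHETICAL placeholders,
# not the actual codes reported in Table 1.
import pandas as pd

features = [
    "title_abstract_screening", "bulk_application", "ml_classifiers",
    "deep_learning", "duplicate_removal", "full_text_review",
    "team_work", "blind_review", "inter_rater_reliability",
    "research_update", "reason_labels",
]

coding = pd.DataFrame(
    [
        [1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1],  # hypothetical "ToolA"
        [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],  # hypothetical "ToolB"
        [1, 0, 0, 0, 1, 1, 1, 1, 1, 0, 1],  # hypothetical "ToolC"
    ],
    index=["ToolA", "ToolB", "ToolC"],
    columns=features,
)

# Feature score = unweighted sum of the dummies; tied sums share a rank.
scores = coding.sum(axis=1).sort_values(ascending=False)
print(scores)
```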
Findings: This research identifies 24 screening tools. Among them, 7 have been used by educational researchers: Abstrackr, Covidence, ASReview, RevMan, Rayyan, EPPI-Reviewer, and DistillerSR. Table 1 documents the functions, developer information, and accessibility of these 7 tools. The current adoption rate of transparent tool reporting is dismally low: of 178 studies published in "Review of Educational Research" since 2015, only seven (3.9%) reported their screening tools. Through citation search, we identified eight screening tools used by educational reviewers and ranked them by descending feature scores; the top-ranked tools are EPPI-Reviewer and DistillerSR (tied), followed by Covidence, Rayyan, Abstrackr, ASReview, and Excel. Finally, we present a decision tree to assist educational systematic reviewers in identifying suitable tools.

In terms of functional differentiation, all 7 tools enable title and abstract screening, while 4 of them also enable full-text review. Bulk application is a necessary function for managing studies efficiently; all tools except RevMan enable bulk import and export of studies. For collaborative systematic reviews, being web-based and supporting team work are two essential features. All included tools except ASReview are web-based; ASReview, which runs locally as a terminal- and Python-based application, precludes team work and review updates. More than five of the screening tools use machine learning algorithms to facilitate meta-analysis automation. Accessibility is based on the cost of the tools: Abstrackr, ASReview, and Rayyan are free, while RevMan is free only for purely academic use, not for commercial purposes. Figure 2 presents the feature analysis ranking of the included screening tools from high to low.

Conclusions: When deciding on the right screening tool, meta-analysts may first determine the desirable screening functions and exclude tools lacking these essential functions. Figure 3 shows a decision tree to assist reviewers in the tool selection process (a sketch of this kind of selection logic appears below). This paper identified several tools that support screening large numbers of studies for systematic reviews, each with varying features. Research teams should seek a solution that fits their needs, but features such as the ability to collaborate, document the process, and assess inter-rater reliability support higher-quality research syntheses. One problem this paper identifies is that many researchers report only meta-analytical packages and software but seldom report the screening tools they used, which presents obstacles to identifying screening tools in the field of education. In the spirit of the open science movement, researchers should report their systematic reviews in as much detail as possible.
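Figure 3 contains the actual decision tree; as an illustration of the feature-based filtering such a tree encodes, here is a minimal sketch. The capability flags below are placeholders to be filled in from Table 1, not verified claims about specific products.

```python
# Illustrative sketch of tool selection in the spirit of Figure 3.
# Capability flags are HYPOTHETICAL placeholders, not product claims.
TOOLS = {
    "ToolA": {"full_text": True,  "team": True,  "free": False},
    "ToolB": {"full_text": False, "team": True,  "free": True},
    "ToolC": {"full_text": False, "team": False, "free": True},
}

def suggest(tools, *, full_text=False, team=False, free=False):
    """Drop tools missing any required feature, mirroring the
    'exclude tools lacking essential functions' step above."""
    required = {"full_text": full_text, "team": team, "free": free}
    return [
        name for name, feats in tools.items()
        if all(feats[k] for k, needed in required.items() if needed)
    ]

print(suggest(TOOLS, team=True, free=True))  # -> ['ToolB']
```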
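Relatedly, the inter-rater reliability feature coded above generally means a tool can report a chance-corrected agreement statistic for double screening, commonly Cohen's kappa. A self-contained sketch follows, with hypothetical include/exclude decisions for ten abstracts.

```python
# Cohen's kappa for two raters' screening decisions.
# The decision lists are HYPOTHETICAL example data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal rates.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

a = ["include", "exclude", "exclude", "include", "exclude",
     "exclude", "include", "exclude", "exclude", "include"]
b = ["include", "exclude", "include", "include", "exclude",
     "exclude", "include", "exclude", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 2))  # 0.58 for this example data
```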