Modern information technology and computer science curricula employ a variety of graphical tools and development environments to facilitate student learning of introductory programming concepts and techniques. While interactive features and visualization can enhance students' understanding and help them grasp fundamental ideas, the real difficulty for many students lies in making the transition from relying on the graphical features of these tools to writing programming code statements from a set of plain English instructions. This article opens with a systematic review of the literature on alternative approaches to teaching object-oriented programming (OOP) to novice programmers. It then describes the rationale behind an "objects first, class user first" approach to introducing OOP, arguing for the use of interactive GUI-based visualization tools such as BlueJ as cognitive tools that allow learners to represent and manipulate their mental models or schemas.

Finally, it reports on a study involving a cohort of students undertaking an introductory OOP unit in Java. The study investigated the effectiveness of (i) the graphical features of BlueJ as a cognitive tool while performing coding tasks as part of a test, and (ii) the use of screencasts (video screen captures) of BlueJ to provide scaffolding during learning, that is, temporary support structures that help learners reach the next stage or level in their development. The screencasts were used in conjunction with a series of structured exercises, providing an intermediate stepping stone to ease the transition to writing program code. The study found no significant effect of screencasts during the learning phase and no significant effect of BlueJ during testing. This result runs counter to theoretical predictions and is consequently important both for researchers focusing on the pedagogy of learning programming and for those interested in the broader applications of animated instructional resources and cognitive tools.

The authors postulate a number of reasons for the lack of significant effects in support of their hypotheses. First, it is possible that some, or perhaps many, participants who had access to BlueJ during the testing phase did not actually use it to help answer the test questions. Second, because the screencasts and BlueJ were intended to ease students' transition to code, data collection took place immediately after the participants' initial exposure to code statements, raising the possibility that they were not yet ready to attempt the questions framed at a high level of abstraction, which accounted for the majority of the test marks. The authors had hypothesized that the screencast-based scaffolding and the use of BlueJ as a cognitive tool would be most beneficial in helping students write code from English instructions at this high level of abstraction; at this point in the semester, however, students may not have been adequately prepared for such questions, which required them to interpret the high-level task requirements and decompose them into the individual object and class operations that would achieve the desired outcome (object state).
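To make the nature of this transition concrete, the following is a minimal sketch of the kind of exercise being described; it is not drawn from the study's test instrument, and the Library, Book, and Member classes and their methods are hypothetical. It illustrates how a plain English instruction such as "create a library, add a book to it, and lend that book to a member" must be decomposed into individual object creations and method calls that change object state:

    // Hypothetical illustration only; class and method names are invented for this sketch.
    class Book {
        private final String title;
        private boolean onLoan = false;

        Book(String title) { this.title = title; }

        String getTitle() { return title; }
        boolean isOnLoan() { return onLoan; }
        void setOnLoan(boolean onLoan) { this.onLoan = onLoan; }
    }

    class Member {
        private final String name;

        Member(String name) { this.name = name; }

        String getName() { return name; }
    }

    class Library {
        private final java.util.List<Book> books = new java.util.ArrayList<>();

        // "add a book to it": an operation on the Library object's state
        void addBook(Book book) { books.add(book); }

        // "lend that book to a member": an operation that changes the Book object's state
        void lendBook(Book book, Member member) {
            if (books.contains(book) && !book.isOnLoan()) {
                book.setOnLoan(true);
                System.out.println(book.getTitle() + " lent to " + member.getName());
            }
        }
    }

    public class LibraryDemo {
        public static void main(String[] args) {
            // Each statement realises one step of the English instruction as an object or class operation.
            Library library = new Library();                   // "create a library"
            Book book = new Book("The Hobbit");
            library.addBook(book);                             // "add a book to it"
            library.lendBook(book, new Member("Alice"));       // "lend that book to a member"
            System.out.println("On loan? " + book.isOnLoan()); // desired outcome: the book's state
        }
    }

It is exactly this mapping, from a sentence of task requirements to a sequence of object and class operations, that the questions at the high level of abstraction demanded of the participants.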
Further research will need to be carried out to determine whether these hypothesized reasons for the lack of an identifiable difference between conditions can be supported, whether other factors are responsible, or whether in fact neither the BlueJ screencasts nor the use of BlueJ as a cognitive tool actually enhances learning. One possible approach to a follow-up study would be to use a smaller number of students but carry out intensive observation during the experiment to determine the degree to which, and the ways in which, BlueJ is used. This could include an oral component incorporating think-aloud protocols (Ericsson & Simon, 1993) and/or follow-up interviews to gain deeper insight into participants' thought processes as they attempt the test questions and to identify gaps in their understanding. In addition to shedding light on the value of screencasts and cognitive tools for learning programming, such a study would also reveal in greater depth the cognitive stages involved in learning to write object-oriented program code from English instructions.