About the Author(s)

Erica Munnik
Department of Psychology, Faculty of Community and Health Sciences, University of the Western Cape, Cape Town, South Africa

Emma Wagener
Department of Psychology, Faculty of Community and Health Sciences, University of the Western Cape, Cape Town, South Africa

Mario Smith
Department of Psychology, Faculty of Community and Health Sciences, University of the Western Cape, Cape Town, South Africa


Munnik, E., Wagener, E., & Smith, M. (2021). Validation of the emotional social screening tool for school readiness. African Journal of Psychological Assessment, 3(0), a42. https://doi.org/10.4102/ajopa.v3i0.42

Original Research

Validation of the emotional social screening tool for school readiness

Erica Munnik, Emma Wagener, Mario Smith

Received: 20 Nov. 2020; Accepted: 12 May 2021; Published: 21 June 2021

Copyright: © 2021. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The need for contextually appropriate and accessible school readiness assessment instruments in South Africa is well documented. The Emotional Social Screening tool for School Readiness (E3SR) screens for emotional and social competencies as a component of school readiness. This competency-based screening instrument was developed as a nine-factor model consisting of 56 items. This study reports on the psychometric properties and factor structure of the E3SR as assessed through exploratory factor analysis. Ten preschool centres registered under the Social Welfare Act in the Cape Town Metropolitan region, situated in high-, middle- and low-socio-economic status (SES) areas, constituted the research setting. A pilot study using a survey design was conducted. Teachers completed E3SR protocols on Grade R children during the fourth term of the academic year. The data set of 330 protocols satisfied the assumptions for inferential statistics, except for normal distribution. Although normality was violated statistically, learners were expected to have mastered the measured competencies by that point in the year, so the violation was supported theoretically. Exploratory factor analysis yielded a six-factor structure: Emotional maturity, Emotional management, Sense of self, Social skills, Readiness to learn and Communication. All extracted factors displayed adequate internal consistency, and the overall reliability was excellent (α = 0.97). The E3SR can be shortened from 56 to 36 items without losing important content. The E3SR can supplement formative assessments and enhance communication between role players to build children’s emotional and social competencies.

Keywords: emotional social competence; E3SR; factor structure; school readiness; South Africa; validation study.


Introduction

Moving from early learning experiences into formal schooling constitutes a profound change (Yunas & Dahlan, 2013). School readiness and associated skills create a platform for learning and lifelong growth (Rimm-Kaufman, Pianta, & Cox, 2000). Early childhood development (ECD) and school readiness are exponentially compromised by contextual factors in developing countries (Munnik & Smith, 2019a; Raikes et al., 2015). In South Africa, ECD and school readiness are adversely affected by sociocultural and political factors (Bruwer et al., 2014; Foxcroft, 2013). School readiness is regarded as a multidimensional concept. Learning starts through early stimulation, where external factors impact the personal readiness of the child, including the expectations of the parents, the readiness of the school, preschool experiences and the child’s environment (Bruwer et al., 2014; Munnik & Smith, 2019a). The primary domains identified in school readiness include cognition and general knowledge, language and literacy, perception, emotion regulation, social skills, approaches to learning, physical well-being, and neurological and motor development (Mohamed, 2013; Rimm-Kaufman & Sandilos, 2017). The inclusion of emotional and social readiness as a domain of readiness has received increasing focus (DBE, 2013; Mohamed, 2013; Munnik, 2018). Emotional and social development is perceived as an important domain of school readiness (Ngwaru, 2012). Self-understanding and awareness, social confidence, empathy, emotional growth, and self- and emotion regulation are identified as important competencies in the emotional and social realm (Bustin, 2007). Understanding, regulating and expressing emotions are attributes of school readiness (Ştefan, Bălaj, Porumb, Albu, & Miclea, 2009). Similarly, compliance with rules, interpersonal skills and pro-social behaviour have been identified as attributes of school readiness (Mohamed, 2013).

Laher and Cockcroft (2014) reported progress in the development of assessment protocols for educators and professionals. However, school readiness assessment in South Africa remains a focus of further research. School readiness assessment needs to be seen as a multidimensional process. In South Africa, formal assessment practices are still largely child focused. School readiness assessments are mainly performed by educators and healthcare professionals. Preschool teachers use observations and assessment measures built into the National Policy and Curriculum Statement to assess children’s readiness on a physical, cognitive, affective, normative, sociocultural and linguistic level (Powell, 2010).

Instruments that are currently used by professionals to assess school readiness include the Aptitude Test for School Beginners (ASB) (Roodt et al., 2013), the Junior South African Individual Scale (JSAIS) (Madge, Van den Berg, & Robinson, 1985), the School Readiness Evaluation by Trained Testers (SETT) (HSRC, 1984), the Griffiths Mental Developmental Scales (GMDS) (Jacklin & Cockcroft, 2013; Luiz, Barnard, Knoetzen, & Kotras, 2004), the School-entry Group Screening Measure (SGSM) (Foxcroft, 1994) and the School Readiness Test of the University of Pretoria (Van Rooyen & Engelbrecht, 1997). The Health Professions Council of South Africa has not included any additional tests to assess school readiness on its list of classified tests since 2007 (HPCSA, 2010). Scientific evidence for the validity and reliability of currently used tests in the multi-cultural context is lacking.

Access and cost are barriers that limit the applicability of these instruments in South Africa. Assessment is often compromised by the limited availability of instruments that accommodate variation in sociocultural status, multilingualism and access to resources (Amod & Heafield, 2013). Professional screening and assessment remain unaffordable for the general population (Makhalemele & Nel, 2016). Problems of access and affordability, together with a bias towards cognitive functioning, underscore the need for contextually relevant measures of social-emotional competencies (SECs) as a domain of school readiness assessment (Bustin, 2007). There is a need for accessible, affordable and contextually relevant instruments to assess social-emotional competencies in preschool-aged children (Amod & Heafield, 2013; Munnik & Smith, 2019b).

Munnik (2018) developed the Emotional Social Screening tool for School Readiness (E3SR) in response to the expressed need for contextually relevant screening tools. The construction followed a multi-phased procedure in which each phase used distinct methodologies. Munnik (2018) reported that multiple design approaches were used to ensure a strong theoretical foundation for the E3SR and recommended the examination of its psychometric properties. The theoretical foundation was derived from two systematic reviews that informed the definition of the constructs and the operationalised definitions and attributes used to build the model. Applicability to the South African context was enhanced through consultation with stakeholders, including parents and professionals from the education and healthcare sectors. These processes ensured a strong theoretical basis for the construction of the original pilot version of the E3SR. The E3SR was constructed with two broad domains, namely emotional competence and social competence. The emotional competence domain consisted of five subdomains: Emotional maturity, Emotional management, Independence, Positive sense of self, and Mental well-being and Alertness. The social competence domain included four subdomains: Social skills, Pro-social behaviour, Compliance with rules and Communication. The screening tool included 56 items across the nine theoretical subdomains. For a detailed description of the theoretical model underpinning the E3SR, refer to Munnik (2018). The initial validation of the E3SR used a confirmatory factor analysis (CFA) that tested the theoretical underpinning of the instrument. Munnik (2018) reported that the CFA supported the theoretical model, while recommending further refinement and investigation of the instrument.
This article reports on a post hoc analysis and data reduction as a further exploration of the factor structure of the E3SR.



Research method and design

Participants

Grade R teachers working in 10 educare centres or preschools in the Cape Town area of the Western Cape were recruited as the respondent group to complete the E3SR protocols. The teachers were full-time employees who currently taught Grade R. They had to be familiar with the child’s behavioural patterns, abilities and general traits across environments through their day-to-day interaction with the child in the preschools. Seventeen teachers gave consent to participate in the pilot study. A total of 330 protocols were received. The preschools included one alternative (n = 36), one private (n = 71), three governmental (n = 201) and five community-based (n = 22) preschools. All protocols were obtained for children between the ages of 6 and 7 years. The demographic profile of the children on whom the protocols were based is presented in Table 1.

TABLE 1: Demographic composition of the target group (children aged 6–7 years; n = 330).

Table 1 shows that in this sample, 64% of children were male and 36% were female. English was the most frequently spoken first language (56%), followed by Afrikaans (37%) and Xhosa (6%). Other mother tongues (2%) included French, Congolese languages and other South African languages such as Sepedi or Zulu.

Research design

A cross-sectional survey design was used for data collection.


The measuring instrument

The E3SR is a strength-based screening instrument designed to screen for emotional and social readiness in preschool children in South Africa, and was used for data collection (Munnik, 2018). The E3SR uses a questionnaire format with a five-point Likert scale (Never, Rarely, Some of the Time, Most of the Time and Almost Always). The instrument is scored summatively and includes clear instructions on how to complete the questionnaire. The E3SR consists of two sections.

Section A: The demographic section includes questions on the demographics of the child, such as chronological age, birth order, gender, ethnicity, language of instruction at school, home language, and history of illness or disability. Information about the respondents is also recorded, such as the length of time the child has been known to the teacher, a rating of how well the child is known, and whether the child has been referred for special interventions.

Section B: This section includes items for each of the two domains and nine sub-domains. The emotional competence domain comprises 31 items across five sub-domains, namely Emotional maturity, Emotional management, Positive sense of self, Independence, and Mental well-being and Alertness. The social competence domain comprises 25 items across four sub-domains, namely Social skills, Pro-social behaviour, Compliance with rules and Communication. The theoretical and operational definitions for each scale and subscale, with their attributes, can be accessed from the unpublished doctoral thesis of Munnik (2018).

A composite score can be calculated for the full scale, each domain and sub-domain. The full-scale score reflects the level of readiness to enter mainstream education on the emotional–social level. The domain scores reflect the level of readiness on an emotional or social level.


Procedure

A stratified sampling frame of preschools or educare centres registered under the Social Welfare Act in the Cape Town Metropolitan area was established. Socio-economic status (SES) was used to stratify the sample into high, middle and low SES geographical areas. Schools within these areas were invited to take part in the study. The recruitment process entailed a multi-layered stakeholder consultation. Firstly, an invitation was sent to the principals of the preschools, which included the proposed purpose of the pilot and an outline of what their involvement would entail. The principals discussed the invitation with the teachers and identified teachers who expressed interest in participating. The research team then contacted the identified teachers for recruitment purposes. Schools where both the principal and teachers were willing were regarded as ‘willing schools’. A meeting was convened with parents of preschool children at these schools. The aim of the meeting was to explain that the school was participating in a pilot study and to outline the value of the E3SR. Parents were also informed that the information would be provided by the teachers as part of the administrative process described above. Thus, principals, parents and teachers all had to agree for a school to be included.

Meetings between the research team, teachers and the principals were scheduled at identified settings. The main purpose was to give an outline of the research (pilot study) and to clarify what teachers’ involvement would entail. Teachers were invited to ask questions about the study, the test administration, the layout of the E3SR and the dissemination strategy. The questionnaires were delivered to an identified teacher at each preschool for completion. The respondents (teachers) were able to contact a nominated researcher at any time to discuss reservations or difficulties that arose during the pilot study. These steps increased compliance with the administration guidelines and, by extension, the reliability of the data. The completed questionnaires were collected as soon as the teachers indicated that they had completed them.

Data analysis

Data analysis entailed: (1) data curation and testing thresholds for validation, (2) testing assumptions and (3) data reduction.

Data curation

The data set included 330 protocols for children aged 6–7 years. A total of 107 protocols (32.4%) were received from two independent schools, 201 (60.9%) from three governmental schools and 22 (6.7%) from five community-based settings. The number of protocols in the data set (n = 330) exceeded the threshold requirements for cases per item and the overall threshold for robustness recommended by DeVellis (2016). The minimum ratio for validation studies is five cases per item, up to 300 cases, after which the ratio can be relaxed. The pilot E3SR consisted of 56 items, which set the minimum threshold at 280 (5 × 56 = 280). The recommended threshold sample size of 300 was also exceeded, which increased the robustness of the analysis in this validation study (DeVellis, 2016).
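
The cases-per-item arithmetic above can be expressed as a quick sanity check. This is a minimal illustration only; the function name `meets_thresholds` and its defaults are our own naming for the two heuristics described in the text, not part of any standard library.

```python
def meets_thresholds(n_cases: int, n_items: int,
                     per_item: int = 5, robust: int = 300) -> bool:
    """Check both heuristics described above: at least `per_item`
    cases per item, and at least `robust` cases overall
    (after DeVellis, 2016)."""
    return n_cases >= n_items * per_item and n_cases >= robust

# The 56-item pilot E3SR sets the per-item minimum at 5 * 56 = 280,
# and the 330 available protocols clear both thresholds.
print(meets_thresholds(330, 56))  # True
```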

Testing assumptions

Before the multivariate statistical analysis commenced, the assumptions for multivariate statistical analysis and data reduction were tested, as recommended by Field (2013). Normal distribution was assessed with the Shapiro–Wilk test, homogeneity of variance with Bartlett’s test of sphericity, and sampling adequacy with the Kaiser–Meyer–Olkin (KMO) test.
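
The study ran these assumption checks in SPSS. As an illustration only, the same three checks can be sketched in Python on simulated ratings (not the E3SR data); Bartlett’s statistic and the KMO index are computed from their textbook formulas.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated five-point ratings for illustration only: 330 cases, 6 items.
X = rng.integers(1, 6, size=(330, 6)).astype(float)
n, k = X.shape

# Shapiro-Wilk on the scale total; p < 0.05 flags non-normality.
w_stat, p_normal = stats.shapiro(X.sum(axis=1))

# Bartlett's test of sphericity:
# chi2 = -(n - 1 - (2k + 5) / 6) * ln|R|, with df = k(k - 1) / 2.
R = np.corrcoef(X, rowvar=False)
chi2 = -(n - 1 - (2 * k + 5) / 6) * np.log(np.linalg.det(R))
df = k * (k - 1) / 2
p_sphericity = stats.chi2.sf(chi2, df)

# Kaiser-Meyer-Olkin: squared correlations relative to squared
# correlations plus squared partial correlations (off-diagonal only).
S = np.linalg.inv(R)
partial = -S / np.sqrt(np.outer(np.diag(S), np.diag(S)))
off = ~np.eye(k, dtype=bool)
kmo = (R[off] ** 2).sum() / ((R[off] ** 2).sum() + (partial[off] ** 2).sum())
```

On real scale data a KMO near 1 (such as the 0.96 reported below) indicates that factor analysis is appropriate.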

Data reduction

Statistical analysis was conducted using SPSS (version 25). Internal consistency was assessed with Cronbach’s coefficient alpha. The dimensional structure of the E3SR was assessed by exploratory factor analysis (EFA), as recommended by Henson and Roberts (2006). The main aim of the EFA was to clarify how many items loaded on the identified factors and to identify a reduced set of factors that would describe the structural inter-relationships amongst the domain and sub-domain scales of the E3SR (Henson & Roberts, 2006). Principal axis factoring (PAF) was used, as PAF aims to explain the common variance amongst variables by means of factors (Henson & Roberts, 2006). Direct oblimin rotation was used, which allows factors to be correlated (Laher, 2010). Factor loadings were pushed towards 0 or 1.0 by decreasing the standard errors of the loadings for variables with small communalities or increasing those of the correlations amongst oblique factors (Kline, 2013).
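
The study performed PAF in SPSS. For readers unfamiliar with the method, a minimal numpy sketch of iterated principal axis factoring is shown below on simulated two-factor data; this is our own illustrative implementation, and the oblimin rotation step is omitted for brevity.

```python
import numpy as np

def principal_axis_factoring(X, n_factors, n_iter=50):
    """Iterative PAF: place communality estimates (starting from
    squared multiple correlations) on the diagonal of R,
    eigendecompose, and recompute communalities from the loadings."""
    R = np.corrcoef(X, rowvar=False)
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))  # SMC starting values
    Rr = R.copy()
    for _ in range(n_iter):
        np.fill_diagonal(Rr, h2)
        vals, vecs = np.linalg.eigh(Rr)
        top = np.argsort(vals)[::-1][:n_factors]
        loadings = vecs[:, top] * np.sqrt(np.clip(vals[top], 0.0, None))
        h2 = (loadings ** 2).sum(axis=1)
    return loadings, h2

# Illustration on simulated data with two latent factors:
# six items, three loading on each factor.
rng = np.random.default_rng(1)
f = rng.normal(size=(330, 2))
X = np.repeat(f, 3, axis=1) + 0.5 * rng.normal(size=(330, 6))
loadings, communalities = principal_axis_factoring(X, n_factors=2)
```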

Decision criteria set for this study

The interpretation and review of items were performed by inspecting factor loadings, communalities and factor over-determination (Nunnally & Bernstein, 1994).

Item inspection on the correlation matrix. Items that did not correlate with at least one other item significantly above 0.3 were omitted. Items with few significant correlations with other items above 0.3 were flagged for further inspection consistent with the recommendation from Kline (2013). Measures of Sampling Adequacy (MSA) were also assessed per item, with a threshold of 0.90.

Communalities. Communalities express the shared variance accounted for by all the extracted factors (Field, 2013). High communalities indicate a reliable item. The desirable level of communality was set at 0.40, and communalities should not vary over a wide range (Gaskin, 2016; Osborne, Costello, & Kellow, 2014).

Number of factors to extract. Factor extraction was informed by cross-validation of methods: (1) eigenvalues exceeding 1 and inspection of the scree plot, consistent with the recommendation of Henson and Roberts (2006), (2) parallel analysis (PA) (Horn, 1965) and (3) Velicer’s minimum average partial (MAP) test (Velicer, Eaton, & Fava, 2000).
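
Horn’s parallel analysis compares the observed eigenvalues with eigenvalues obtained from random data of the same dimensions. The sketch below is our own minimal numpy implementation on simulated data, using the 99th-percentile variant attributed to Glorfeld (1995); the study itself ran PA via O’Connor’s SPSS syntax.

```python
import numpy as np

def parallel_analysis(X, n_sims=100, percentile=99, seed=0):
    """Horn's (1965) parallel analysis: keep the leading eigenvalues
    of the observed correlation matrix that exceed the chosen
    percentile of eigenvalues from random normal data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sim = np.empty((n_sims, p))
    for i in range(n_sims):
        Z = rng.normal(size=(n, p))
        sim[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    thresh = np.percentile(sim, percentile, axis=0)
    n_keep = 0
    for o, t in zip(obs, thresh):
        if o <= t:
            break
        n_keep += 1
    return n_keep

# Simulated two-factor data: PA should recover two factors.
rng = np.random.default_rng(2)
f = rng.normal(size=(330, 2))
X = np.repeat(f, 3, axis=1) + 0.5 * rng.normal(size=(330, 6))
```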

Factor loadings. Following Costello and Osborne (2005), the threshold for factor loadings was set at 0.50. Items that loaded above 0.32 on more than one factor were considered cross-loading. Items that cross-loaded on two factors were retained in the factor on which they obtained the highest loading, on condition that they obtained a minimum loading of 0.50, based on the recommendation of Williams, Onsman and Brown (2010). Items that did not load on any factor were removed after examination.

Ethical considerations

The Humanities and Social Sciences Research Ethics Committee at the University of the Western Cape granted ethical approval during the PhD study and again for the post-PhD project (reference number: HS19/24). Permission to conduct research at the preschools was obtained from the Department of Basic Education and the principals.

An information sheet explained what the study entailed. Permission to include protocols in the final assessments was provided by parents. Teachers at participating schools consented to complete protocols as respondents. All participation was voluntary, and the right to withdraw without fear or negative consequence was upheld. All protocols were anonymised. Participants were informed of their rights and recourse if dissatisfied. The learners were the unit of analysis; however, they did not participate directly in the study. Parents were appropriately informed of the study and consented to the school’s participation. Teachers recorded their assessments based on observation of the children during the normal course of their teaching responsibilities.


Results

Testing the data set for assumptions

The assumption of normality was violated in this sample for the overall scale as well as all nine subscales (Shapiro–Wilk, p < 0.05). The distribution was positively skewed. This violation was in line with expectations, as 6- to 7-year-old children are assumed to have already mastered most of the attributes of emotional social readiness by the fourth term of the academic year, when the data were collected. The non-normal distribution was therefore an accurate representation of where the cohort should be in terms of the measured competencies.

The KMO statistic of 0.96 indicated that sampling adequacy was within the accepted range. All items reported individual MSA values above 0.90, with the exception of Item 3 on the Mental Well-being subscale (MW3), whose MSA of 0.73 was still considered acceptable according to Field (2013). Bartlett’s test of sphericity indicated that correlations between items were sufficiently large for PAF, suggestive of a correlation matrix rather than an identity matrix (χ²(1540) = 18 918.98, p < 0.01).

Principal axis factoring

First extraction

The first extraction confirmed the nine-factor solution reported by Munnik (2018). The nine factors accounted for 75.80% of the shared variance, which exceeded the threshold set by Field (2013), according to which the extracted factors should account for a minimum of 60% of the variance to be a good fit. This initial extraction thus confirmed the theoretical formulation of the subscales of the E3SR and served as a baseline to be refined in subsequent extractions, even though several items were found to be problematic.

For further refinement, items IN4, IN6, SOS1 and CR4 were removed because they cross-loaded on more than one factor with none of the loadings reaching the 0.50 cut-off. All of these items had communalities below 0.6 and low item-total correlations. Item MW3 was also removed because of its low MSA (0.73) and an item-total correlation of 0.190.

Second extraction

The second extraction resulted in seven factors based on eigenvalues exceeding 1 and inspection of the scree plot, accounting for 74.97% of the shared variance. To confirm the number of factors for extraction, Horn’s (1965) PA was conducted using Glorfeld’s (1995) extension of sensitivity at the 99th percentile. The analysis was run in SPSS using O’Connor’s (2000) syntax. The results were not meaningful, as the observed eigenvalues consistently exceeded the random eigenvalues produced in the PA. This would have meant retaining an extraordinarily high number of factors (31).

The over-extraction of factors can be attributed to the following: firstly, the E3SR contains a large number of variables that are not discrete (Jones, 2018). Secondly, adjusted correlation matrices (e.g. in principal axis factoring) use squared multiple correlations on the diagonal, which tends to suggest more factors than are justified (Buja & Eyuboglu, 1992). The over-extraction was cross-validated by inspecting the average partial correlations in Velicer’s MAP test, which revealed a seven-factor solution consistent with that reported in the second extraction. Upon inspection of the pattern matrix, several items were still loading poorly and cross-loading on multiple factors. Items EMX6, EMX7, IN2, MW2, MW6 and SS6 were removed because their loadings were less than 0.5 and their communalities low. Item EMX3 was removed because of poor communality and a low item-total correlation.
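
Velicer’s MAP test takes the opposite approach to PA: it removes principal components one at a time and tracks the average squared partial correlation, retaining the number of components at which this average is smallest. The sketch below is our own minimal numpy implementation for illustration on simulated data; the study used SPSS syntax for this test.

```python
import numpy as np

def velicer_map(R):
    """Velicer's minimum average partial (MAP) test: after removing
    the first m principal components, average the squared partial
    correlations; the m that minimises this average is retained."""
    p = R.shape[0]
    vals, vecs = np.linalg.eigh(R)
    order = np.argsort(vals)[::-1]
    loadings = vecs[:, order] * np.sqrt(vals[order])
    off = ~np.eye(p, dtype=bool)
    avg_sq = [np.mean(R[off] ** 2)]      # m = 0: raw correlations
    for m in range(1, p):
        A = loadings[:, :m]
        C = R - A @ A.T                  # partial covariances
        d = np.sqrt(np.diag(C))
        P = C / np.outer(d, d)           # partial correlations
        avg_sq.append(np.mean(P[off] ** 2))
    return int(np.argmin(avg_sq))

# Simulated two-factor data: MAP should point to two factors.
rng = np.random.default_rng(3)
f = rng.normal(size=(330, 2))
X = np.repeat(f, 3, axis=1) + 0.5 * rng.normal(size=(330, 6))
R = np.corrcoef(X, rowvar=False)
```

Because MAP penalises components that mostly absorb unique variance, it tends to under- rather than over-extract, which is why it is a useful cross-check against PA here.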

Third extraction

The third extraction was conducted on the reduced number of items and was based on eigenvalues (> 1) and inspection of the scree plot. A six-factor solution was suggested, accounting for 75.48% of the shared variance. After this extraction, item IN1 was removed as it did not load on any factor and had a communality of 0.27. Item SS1 loaded below 0.5 and was removed. Items PB6 and CR5 were both removed because they cross-loaded on two factors with loadings below the 0.5 threshold. Subsequent extractions specified a six-factor solution, as this appeared to be the best fit for the data.

Fourth extraction

The fourth extraction also yielded a six-factor solution, accounting for 77.92% of the shared variance. Upon inspection of the pattern matrix, items IN5 and SS4 cross-loaded on more than one factor with loadings below the 0.5 threshold and were removed. Items PB4 and CR1 loaded below 0.5 and were also removed.

Final extraction

The final extraction yielded a six-factor solution, accounting for 78.94% of the shared variance amongst the 36 remaining items. The results of the final extraction are presented in Table 2. All items loaded above 0.5 on their respective factors. Item CR6 cross-loaded on factors 4 and 6 but was retained in factor 4, where it obtained its highest loading (above 0.5).

TABLE 2: Final factors and factor loadings after the final extraction.
Summary of factors and revised sub-domains

Factor 1 consisted of seven items from the original Social Skills and Pro-social Behaviour subscales, with loadings ranging from 0.504 to 0.897. The amended scale had a Cronbach’s alpha of 0.94, and reliability would not increase meaningfully by further removal of any items. This component was retained and labelled Social skills.

Factor 2 consisted of five items from the original Positive Sense of Self subscale, with loadings between 0.601 and 0.889. The reduced scale had a Cronbach’s alpha of 0.93, and reliability would not increase meaningfully by further removal of any items. This component was retained and labelled Sense of self.

Factor 3 consisted of the seven original items from the Communication subscale, with loadings between 0.642 and 0.967. The subscale had a Cronbach’s alpha of 0.95, and reliability would not increase meaningfully by further removal of any items. The component was retained and labelled Communication.

Factor 4 consisted of seven items: one from the original Independence subscale, three from Mental Well-being and Alertness, and three from the Compliance with Rules subscale. The factor had loadings between 0.549 and 0.796. As these constructs are related, the retained items were combined to form a new scale. The newly merged subscale had a Cronbach’s alpha of 0.94, and reliability would not increase meaningfully by further removal of any items. This component was retained and renamed Readiness to learn.

Factor 5 consisted of five items from the original Emotional Management subscale, with loadings ranging from 0.624 to 0.901. The reduced subscale had a Cronbach’s alpha of 0.92, and reliability would not increase meaningfully by further removal of any items. The component was retained and labelled Emotional management.

Factor 6 consisted of five items from the original Emotional Maturity subscale. The loadings on this factor ranged from 0.596 to 0.808, and the subscale had a Cronbach’s alpha of 0.95. Reliability would not increase meaningfully by further removal of any items. The component was retained and labelled Emotional maturity.

Twenty items were removed from the original set of 56. A six-factor solution was recommended with the following sub-domains: (1) Emotional maturity, (2) Sense of self, (3) Communication, (4) Emotional management, (5) Readiness to learn and (6) Social skills. The amended scale consisted of 36 items and obtained a high level of internal consistency, as evidenced by a Cronbach’s alpha of 0.97. Correlations between factors are presented in Table 3. All sub-domains of the revised scale were significantly correlated with one another (r = 0.48–0.81, p < 0.01).

TABLE 3: Means, standard deviations and correlations between factors.

Figure 1 is a graphical depiction of the data reduction process and the revised sub-domains.

FIGURE 1: Original and revised emotional social screening tool for school readiness domains.

Internal consistency reliability

Table 4 provides an overview of the internal consistency of the original and revised scales and subscales of the E3SR. Overall, the scale and subscales of the original E3SR were internally consistent, as demonstrated by alpha levels indicative of good to excellent reliability, suggesting that the items hang together reliably. The internal consistency of the revised scale was also excellent. The final Cronbach’s alphas for the E3SR range from 0.92 to 0.95 across the six sub-domains. The revised scale had an overall Cronbach’s alpha of 0.97 (36 items). The emotional competence domain reported a Cronbach’s alpha of 0.95 (22 items), and the social competence domain a Cronbach’s alpha of 0.94 (14 items).
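
For reference, Cronbach’s alpha is computed as α = (k / (k − 1)) × (1 − Σ item variances / variance of the total score). A minimal numpy sketch on simulated data is shown below; this is an illustration of the formula, not a reproduction of the E3SR analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances
    divided by the variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Perfectly parallel items give alpha = 1; independent items give
# alpha near 0 (illustrative simulated data).
rng = np.random.default_rng(4)
col = rng.normal(size=(200, 1))
alpha_parallel = cronbach_alpha(np.tile(col, (1, 4)))
alpha_noise = cronbach_alpha(rng.normal(size=(200, 4)))
```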

TABLE 4: Internal consistency for the original and revised scales and subscales of the emotional social screening tool for school readiness before and after principal axis factoring.


Discussion

This study evaluated the psychometric properties of the newly constructed E3SR through post hoc analysis of a data set of 330 protocols. The data set supported an exploration of the factorial structure of the E3SR. The results suggested a revised factor structure with six subscales instead of the nine-factor solution of the original instrument.

The EFA yielded a six-factor solution. Four of the subscales, Emotional Maturity, Emotional Management, Sense of Self and Communication, were retained as conceptualised in the theoretical model. The EFA proposed two mergers. Firstly, the Social Skills and Pro-social Behaviour sub-domains, two separate domains in the theoretical model, were merged in the data reduction process. This merger is understandable, as these domains are inter-related and interdependent, with attributes that tap into similar hypothetical constructs (Munnik, 2018; Ştefan et al., 2009). The merged subscale was termed Social skills.

Secondly, three separate domains in the theoretical model were merged into one. The proposed merger of Independence, Mental Well-being and Alertness, and Compliance with Rules highlighted the interdependent nature of these constructs. The attributes of these sub-domains speak to the child’s general readiness for learning, including awareness of surroundings and the ability to reason within the context of social rules. This includes the ability to follow and adhere to ground rules stipulated in specific contexts, to be responsive to feedback about one’s behaviour in relation to complying with rules, and to focus and attend to tasks independently. The merger of these domains of competence is not limited to one area of development or functioning but embraces the interrelationships between skills and behaviours across domains of development and learning (Mohamed, 2013; Munnik, 2018). The inter-related attributes of the merged sub-domain resonate with research indicating that a child’s attitude towards learning is linked to several constructs, such as task persistence, attention, creativity, initiative, curiosity and problem solving (Amod & Heafield, 2013; Mohamed, 2013).

The results suggested good psychometric properties. The Emotional Social Competence scale had a Cronbach’s alpha of 0.97, indicative of excellent reliability and suitability for use in psychological research. The Emotional Competence and Social Competence domains had Cronbach’s alphas of 0.95 and 0.94 respectively, both indicative of excellent reliability as per the classification provided by Taber (2018). The revised subscales also showed excellent reliability, with Cronbach’s alphas ranging from 0.92 to 0.95.

The following limitations are noted. The timing of the data collection, towards the end of the last term of the academic year, affected the assumption of normality required for data reduction and multivariate analysis. Although the violation of normality was supported theoretically, it detracts from the robustness of the analysis. In addition, the sample for the initial validation was limited to the Cape Metropole in the Western Cape; the results, however encouraging, must therefore be interpreted cautiously until a more inclusive target group can be recruited. Finally, the psychometric properties cannot be retested on the same sample, and thus a confirmatory factor analysis (CFA) can only be conducted on a new sample.


The E3SR is a valid and reliable screening tool for emotional–social competence as a domain of school readiness. The data reduction process supported a six-factor model, consisting of (1) Emotional maturity, (2) Sense of self, (3) Communication, (4) Emotional management, (5) Readiness to learn and (6) Social skills. The E3SR was successfully reduced to 36 items without losing important content.


Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.

Authors’ contributions

E.M., E.W. and M.S. all contributed equally to this work.

Funding information

The authors thank the National Research Foundation (NRF) for financial support through the following grants awarded to Erica Munnik: an NRF Sabbatical grant for completion of the PhD (2018) and a Thuthuka post-PhD track grant (2019–2021). The research study was not commissioned and does not represent the opinions of the NRF. No commissions or prohibitions were placed on the study or dissemination protocol as a result of the funding.

Data availability

Data sharing is not applicable to this article as no new data were created or analysed in this study.


The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.


Amod, Z., & Heafield, D. (2013). School readiness assessment in South Africa. In K. Cockcroft & S. Laher (Eds.), Psychological assessment in South Africa: Research and applications (1st edn., pp. 74–85). Johannesburg: Wits University Press.

Bruwer, M., Hartell, C., & Steyn, M. (2014). Inclusive education and insufficient school readiness in Grade 1: Policy versus practice. South African Journal of Childhood Education, 4(2), 18–35. https://doi.org/10.4102/sajce.v4i2.202

Buja, A., & Eyuboglu, N. (1992). Remarks on parallel analysis. Multivariate Behavioral Research, 27(4), 509–540. https://doi.org/10.1207/s15327906mbr2704_2

Bustin, C. (2007). The development and validation of a social emotional school readiness scale. Doctoral dissertation. Bloemfontein: University of the Free State.

Costello, A.B., & Osborne, J.W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1–9.

Department of Basic Education. (2013). Promotion requirements of the national curriculum statement grades R – 12. Retrieved from ww.acsi.co.za/legallegislativeadvocacy/policy-documents/the-national-policy-pertaining-to-the-programme-and-promotion-requirements-of-the-national-curriculum-statement-grade-r-12/

DeVellis, R.F. (2016). Scale development: Theory and applications (Vol. 26). Sage.

Field, A. (2013). Discovering statistics using SPSS. Sage.

Foxcroft, C.D. (1994). The development of a group screening measure for South African children. Unpublished manuscript. University of Port Elizabeth.

Foxcroft, C.D. (2013). Developing a psychological measure. In C. Foxcroft & G. Roodt (Eds.), Introduction to psychological assessment in the South African context (4th edn., pp. 69–81). Oxford University Press.

Foxcroft, C.D., & Roodt, G. (2013). Introduction to psychological assessment in the South African context. (4th edn.). Oxford University Press.

Gaskin, J. (2016). Exploratory factor analysis: Communalities. Gaskination’s Stat Wiki. Retrieved from http://statwiki.kolobkreations.com/index.php?title=Exploratory_Factor_Analysis#Communalities

Glorfeld, L.W. (1995). An improvement on Horn’s parallel analysis methodology for selecting the correct number of factors to retain. Educational and Psychological Measurement, 55(3), 377–393. https://doi.org/10.1177/0013164495055003002

Health Professions Council of South Africa. (2010). The Professional Board for Psychology: List of tests classified as being psychological tests. Form 207. Retrieved from http://www.hpcsa.co.za/Uploads/editor/UserFiles/downloads/psych/psychom_form_207.pdf

Henson, R.K., & Roberts, J.K. (2006). Use of exploratory factor analysis in published research: Common errors and some comments on improved practice. Educational and Psychological Measurement, 66(3), 393–416. https://doi.org/10.1177/0013164405282485

Horn, J.L. (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30(2), 179–185. https://doi.org/10.1007/BF02289447

Human Sciences Research Council of South Africa. (1984). Manual for the school readiness evaluation by trained testers. Pretoria: HSRC.

Jacklin, L., & Cockcroft, K. (2013). The Griffiths Mental Developmental Scales: An overview and a consideration of their relevance for South Africa. In K. Cockcroft & S. Laher (Eds.), Psychological assessment in South Africa: Research and applications (1st edn., pp. 169–185). Johannesburg: Wits University Press.

Jones, J. (2018). The influence of a proposed margin criterion on the accuracy of parallel analysis in conditions engendering under-extraction. Masters theses & specialist projects. Paper 2446. Retrieved from https://digitalcommons.wku.edu/theses/2446

Kline, R.B. (2013). Exploratory and confirmatory factor analysis. In Y. Petscher & C. Schatsschneider (Eds.), Applied quantitative analysis in the social sciences (pp. 171–207). New York, NY: Routledge.

Laher, S. (2010). Using exploratory factor analysis in personality research: Best-practice recommendations. SA Journal of Industrial Psychology, 36(1), 1–7. https://doi.org/10.4102/sajip.v36i1.873

Laher, S., & Cockcroft, K. (2014). Psychological assessment in post-apartheid South Africa: The way forward. South African Journal of Psychology, 44(3), 303–314. https://doi.org/10.1177/0081246314533634

Luiz, D., Barnard, A., Knoetzen, N., & Kotras, N. (2004). Griffiths mental development scales extended revised. (GMDS-ER). Technical manual. Amsterdam: Association for Research in Infant and Child Development (ARICD).

Madge, E.M., Van den Berg, A.R., & Robinson, M. (1985). Manual for the Junior South African individual scales (JSAIS). Pretoria: Human Science Research Council.

Makhalemele, T., & Nel, M. (2016). Challenges experienced by district-based support teams in the execution of their functions in a specific South African province. International Journal of Inclusive Education, 20(2), 168–184. https://doi.org/10.1080/13603116.2015.1079270

Mohamed, S.A. (2013). The development of a school readiness screening instrument for grade 00 (pre-grade R) learners. Doctoral dissertation. University of the Free State.

Munnik, E. (2018). The development of a screening tool for assessing emotional social competence in preschoolers as a domain of school readiness (Doctoral dissertation). University of the Western Cape. Retrieved from http://hdl.handle.net/11394/6099.

Munnik, E., & Smith, M.R. (2019a). Contextualising school readiness in South Africa: Stakeholders’ perspectives. South African Journal of Childhood Education, 9(1), a680. https://doi.org/10.4102/sajce.v9i1.680

Munnik, E., & Smith, M.R. (2019b). Methodological rigour and coherence in the construction of instruments: The emotional social screening tool for school readiness. African Journal of Psychological Assessment, 1(0), a2. https://doi.org/10.4102/ajopa.v1i0.2.

Ngwaru, J.M. (2012). Parental involvement in early childhood care and education: Promoting children’s sustainable access to early schooling through social-emotional and literacy development. Southern African Review of Education, 18(2), 25–40.

Nunnally, J.C., & Bernstein, I.H. (1994). Psychometric theory. New York, NY: McGraw-Hill.

O’Connor, B.P. (2000). SPSS and SAS programs for determining the number of components using parallel analysis and Velicer’s MAP test. Behavior Research Methods, Instrumentation, and Computers, 32, 396–402. https://doi.org/10.3758/BF03200807

Osborne, J.W., Costello, A.B., & Kellow, J.T. (2014). Best practices in exploratory factor analysis (pp. 86–99). Louisville, KY: CreateSpace Independent Publishing Platform.

Powell, P.J. (2010). The messiness of readiness. Phi Delta Kappan, 92(3), 26–28. https://doi.org/10.1177/003172171009200307

Raikes, A., Dua, T., & Britto, R. (2014). Measuring early childhood development: Priorities post 2015. Early Childhood Matters, 124, 74–78. Netherlands: Bernard van Leer Foundation.

Rimm-Kaufman, S., & Sandilos, L. (2017). School transition and school readiness: An outcome of early childhood development. Updated July 2017. Encyclopedia on Early Childhood Development [online]. CEECD, SKC-ECD. Retrieved from https://www.child-encyclopedia.com/sites/default/files/dossiers-complets/en/school-readiness.pdf

Rimm-Kaufman, S.E., Pianta, R.C., & Cox, M.J. (2000). Teachers’ judgments of problems in the transition to kindergarten. Early Childhood Research Quarterly, 15(2), 147–166. https://doi.org/10.1016/S0885-2006(00)00049-1

Roodt, G., Stroud, L., Foxcroft, C., & Elkonin, D. (2013). The use of assessment measures in various applied contexts. In C. Foxcroft & G. Roodt (Eds.), Introduction to psychological assessment in the South African context (4th edn., pp. 240–249). Cape Town: Oxford University Press.

Ştefan, C.A., Bălaj, A., Porumb, M., Albu, M., & Miclea, M. (2009). Preschool screening for social and emotional competencies – Development and psychometric properties. Cognition, Brain, Behavior. An Interdisciplinary Journal, 13(2), 121–146.

Taber, K.S. (2018). The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education, 48(6), 1273–1296. https://doi.org/10.1007/s11165-016-9602-2.

Van Rooyen, A.E., & Engelbrecht, P. (1997). Die effektiwiteit van enkele skoolgereedheidstoetse vir die voorspelling van skolastiese prestasie by die skool beginner. South African Journal of Education, 17(1), 7–10.

Velicer, W.F., Eaton, C.A., & Fava, J.L. (2000). Construct explication through factor or component analysis: A review and evaluation of alternative procedures for determining the number of factors or components. In R.D. Goffin & E. Helmes (Eds.), Problems and solutions in human assessment (pp. 41–71). New York: Kluwer.

Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide for novices. Australasian Journal of Paramedicine, 8(3). https://doi.org/10.33151/ajp.8.3.93

Yunus, K.R.M., & Dahlan, N.A. (2013). Child-rearing practices and socio-economic status: Possible implications for children’s educational outcomes. Procedia - Social and Behavioral Sciences, 90, 251–259. https://doi.org/10.1016/j.sbspro.2013.07.089

