
E-Learning Pedagogy and School Leadership Practices to Improve Hong Kong Students’ Computer and Information Literacy: Findings from ICILS 2013 and beyond

Nancy LAW, Johnny YUEN, Yeung LEE Centre for Information Technology in Education, University of Hong Kong

Copyright © Centre for Information Technology in Education First published 2015

Published by Centre for Information Technology in Education (CITE) Faculty of Education The University of Hong Kong The book is published with open access at icils.cite.hku.hk

Open Access. This book is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited. All commercial rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the Copyright Law of the Publisher’s location, in its current version, and permission for commercial use must always be obtained from the Publisher. Violations are liable to prosecution under the respective Copyright Law. The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The National Research Centre for the Hong Kong participation in ICILS 2013 would like to acknowledge the financial support provided by the Hong Kong Quality Education Fund (QEF).

E-Learning Pedagogy and School Leadership Practices to Improve Hong Kong Students’ Computer and Information Literacy: Findings from ICILS 2013 and beyond / by Nancy Law, Johnny Yuen and Yeung Lee.
ISBN 978-988-18659-6-0
Cover design by Debbie Pang

Acknowledgements

Hong Kong’s participation in ICILS 2013 would not have been possible without the funding support for the project. We are very grateful for the support given by all the principals, teachers, ICT coordinators and students who participated in the pilot and main studies. Without their participation, it would not have been possible to complete the study so successfully and smoothly. We need to highlight the enormous help we received from our Honorary Consultant, Mrs Grace Kwok, who worked tirelessly to help us with liaison with school principals. We are also very thankful for the generous support given by the Education Bureau and its staff, especially in helping the project to secure participation from schools. We are also indebted to the steering committee for their valuable advice over the duration of the project. Finally, we would like to extend our hearty thanks to all the colleagues involved at various stages of this project. In particular, we would like to acknowledge the contributions of a number of colleagues and students: Dr. Zhan Wang for her contribution to the analysis of the data, Dr. Ling Li for her copyediting of the English version of this book, and Mr Leming Liang for copyediting and formatting the Chinese version. We are also grateful to Mrs Liliana Farias for her careful formatting of the English version.


Project Team

Project Leader
Prof. Nancy LAW, Deputy Director, CITE, Faculty of Education, The University of Hong Kong

Research Team Members
Dr. Y. LEE, Assistant Director, CITE, Faculty of Education, The University of Hong Kong
Dr. Johnny YUEN, Research Officer, CITE, Faculty of Education, The University of Hong Kong (2013–2015)
Dr. Emily OON, Research Officer, CITE, Faculty of Education, The University of Hong Kong (2011–2013)
Ms. Ada TSE, Research Assistant, CITE, Faculty of Education, The University of Hong Kong
Mrs. Grace KWOK, Honorary Consultant: School Liaison

Test Administrators

AU Lok Yee

AU Suet Yee

CHAN Ka Lai

CHAN Ki Yan

CHENG Kwan Yee

CHEUNG Sau Ping

CHIANG Lai Hang

CHOW Yan

FUNG Hiu Tung

FUNG Lik Yan

KAN Wai Kit

LAM Wai Sum

LEE Ying Yin

LIU Sin Yi

NGAI Yik Long

WONG Chi Lai

YAU Hiu Ying

YUNG Man Yi


Steering Committee
Mr. Danny CHENG, Ex-Chairman, The Hong Kong Association for Computer Education
Mr. K. W. LAM, Ex-Chairman, Hong Kong Direct Subsidy Scheme Schools Council
Mr. Y. T. LAU (10/2014–09/2015), CCDO, Information Technology in Education Section, Education Bureau
Mr. S. T. LEUNG, Chairman, Hong Kong Aided Primary School Heads Association
Mr. A. C. LIU, Ex-Chairman, Hong Kong Subsidized Secondary Schools Council
Mrs. K. Y. LIU NG, Chairperson, Union of Government Primary School Headmasters and Headmistresses
Ms. M. M. NG, Ex-CEO, HKEdCity
Mr. S. K. NG, Ex-Vice-Chairman, Hong Kong Association of Heads of Secondary Schools
Mr. M. SHE (08/2011–10/2014), Ex-CCDO, Information Technology in Education Section, Education Bureau
Mr. George TAM, Ex-Chairman, Hong Kong Grant School Council
Mr. Albert WONG, Chairman, Association of IT Leaders in Education (AiTLE)


Table of Contents

Acknowledgements
Project Team
Steering Committee
Table of Contents
List of Tables
List of Figures
Executive Summary

Chapter 1: Computer & Information Literacy and its Assessment
1.1 Computer and Information Literacy: a Brief History
1.2 Computer and Information Literacy in the Hong Kong School Curriculum
1.3 Assessing Computer and Information Literacy
    1.3.1 European Computer Driving Licence
    1.3.2 Computer-Based Assessments in PISA
1.4 Definition of CIL in ICILS
1.5 Test Administration and Module Design in ICILS
1.6 Summary

Chapter 2: Hong Kong Students’ Performance in ICILS
2.1 Sampling Method and Participation Rates
2.2 Computation of CIL Scores and Countries’ Performance
2.3 CIL Proficiency Levels
2.4 Hong Kong Students’ CIL Proficiency Levels in an International Context
2.5 Students’ Performance across CIL Aspects
    2.5.1 Knowing about and understanding computer use (Aspect 1.1)
    2.5.2 Accessing and evaluating information
    2.5.3 Managing information
    2.5.4 Transforming information
    2.5.5 Creating information
    2.5.6 Sharing information
    2.5.7 Using information safely and securely
    2.5.8 Students’ performance across CIL aspects and proficiency levels
2.6 Students’ CIL proficiency trajectories
    2.6.1 Hong Kong students’ improvements in competency profile as they move to higher CIL proficiency levels
    2.6.2 Australian and Korean students’ improvements in competency profile as they move to higher CIL proficiency levels
2.7 Summary

Chapter 3: Influence of Students’ Background and ICT Use Experience on their CIL
3.1 Contextual Factors Derived from the Student Survey
3.2 Influence of Students’ Personal and Family Background
    3.2.1 Gender and CIL achievement
    3.2.2 Educational aspirations and CIL achievement
    3.2.3 Immigrant status and CIL achievement
    3.2.4 Language use at home and CIL achievement
    3.2.5 Socioeconomic background and CIL achievement
    3.2.6 Home ICT resources and CIL achievement
3.3 Influence of Students’ ICT Self-efficacy and Interest
    3.3.1 Self-efficacy in basic ICT skills and CIL achievement
    3.3.2 Self-efficacy in advanced ICT skills and CIL achievement
    3.3.3 Interest and enjoyment in using ICT and CIL achievement
3.4 Influence of Students’ ICT Use Experience at Home and in School
    3.4.1 Computer experience and CIL achievement
    3.4.2 Use of computers for and at school
    3.4.3 Use of computers outside school
3.5 Students’ Contextual Factors and Their CIL Achievement
    3.5.1 Contextual factors and overall CIL score
    3.5.2 Contextual factors and the seven CIL aspect scores
3.6 Summary

Chapter 4: How Do Schools Influence Students’ CIL Achievement?
4.1 ICT Infrastructure and Resources in Schools
    4.1.1 Digital learning resources
    4.1.2 Computer resources for teaching and/or learning
    4.1.3 Student:computer ratios
    4.1.4 Summary
4.2 School Policies and Practices Regarding ICT Use
    4.2.1 Principals’ views on educational purposes of ICT use
    4.2.2 Principals’ expectations of teachers’ knowledge and skills in professional use of ICT
    4.2.3 The extent to which principals monitored teachers’ ICT use to achieve different learning outcomes
    4.2.4 The extent to which principals took main responsibility for ICT management and implementation
    4.2.5 The extent to which schools had measures regarding ICT access and use
    4.2.6 The extent to which teachers participated in different forms of professional development as reported by principals
    4.2.7 Principals’ priorities for facilitating use of ICT
    4.2.8 Obstacles that hinder schools’ capacity to realize their e-Learning goals
4.3 Teachers’ ICT-using Pedagogy
    4.3.1 Teacher confidence in using ICT
    4.3.2 Teachers’ reported use of ICT tools in teaching
    4.3.3 Teachers’ reported student use of ICT in different learning tasks
    4.3.4 Teachers’ reported use of ICT for various types of teaching and learning activities
    4.3.5 Teachers’ emphasis on developing students’ information literacy
4.4 School Level Factors and Hong Kong Students’ CIL Achievement
    4.4.1 School level factors and overall CIL score
    4.4.2 School level factors and the seven CIL aspect scores
4.5 Summary

Chapter 5: Learning & Assessment Designs to Foster Students’ CIL
5.1 Principles of Learning Design for CIL Development
5.2 Principles of Assessment Design
5.3 Learning Designs Targeting Specific CIL Aspects
    5.3.1 Learning designs to foster skills for accessing and evaluating information
    5.3.2 Learning designs to foster skills for managing information
    5.3.3 Learning designs to foster skills for transforming information
    5.3.4 Designs to foster skills for sharing information
5.4 An Extended Learning Design to Foster Learning in Multiple CIL Aspects
5.5 Summary

References

List of Tables

Table 1.1  The key standards areas specified in the 1998 and 2007 NETS
Table 1.2  The Computer and Information Literacy Framework adopted in ICILS
Table 1.3  A comparison of the IL assessment framework adopted by the ILPA Study in Hong Kong (Law et al. 2008) and that used in ICILS
Table 1.4  Overview of the short tasks in the After-School Exercise module
Table 1.5  Summary of the performance expectation and associated CIL aspect for each of the score point criteria in the After-School Exercise long task module
Table 2.1  Allocation of module score points to the ICILS assessment framework
Table 2.2  Country averages for CIL, years of schooling, average age, ICT Index, student–computer ratios and percentile graph
Table 2.3  Descriptions of students’ typical performance capabilities at the different levels of the CIL scale
Table 2.4  Examples of ICILS test items mapped to the four CIL proficiency levels based on their item difficulty
Table 2.5  The percentages correct (S.D.) and relative difficulty for the seven CIL aspects for Hong Kong, Australia and Korea
Table 2.6  Sample responses from Hong Kong students to the three tasks related to the phishing email
Table 2.7  Samples of Hong Kong students’ responses to the short task on problems that may arise by making one’s email address public
Table 2.8  Percentage of students who have correctly answered each short task in the After-school Exercise module (mapped to CIL assessment aspect and level)
Table 2.9  Percentage of students who have achieved partial or full score for each assessment criterion (mapped to CIL level) in the poster design large-task
Table 3.1  List of key context variables* derived from the student questionnaire
Table 3.2  Gender differences in CIL
Table 3.3  The percentages of students at each level of educational aspiration and their respective CIL scores
Table 3.4  The percentages of students at each level of parental education reached and their respective CIL scores
Table 3.5  The percentages of students at each level of home literacy and their respective CIL scores
Table 3.6  The percentages of students with different numbers of computers at home and their respective CIL scores
Table 3.7  The percentages of students confident in performing each basic ICT skills task, S_BASEFF and correlation with CIL scores
Table 3.8  The percentages of students confident in performing each advanced ICT skills task, S_ADVEFF and correlation with CIL scores
Table 3.9  Percentages of students agreeing with statements about computers, S_INTRST and correlation with CIL scores
Table 3.10 Percentages of students with different years of experience with computers
Table 3.11 Percentages of students with frequent computer use (i.e. at least once a week) at home, school and other places
Table 3.12 Percentages of students using computers in most lessons or almost every lesson in different learning areas, S_UELRN and association with CIL score
Table 3.13 Percentages of students using computers for study purposes at least once a month, S_USESTD and association with CIL score
Table 3.14 Percentages of students who reported having learnt CIL-related tasks at school, S_TSKLRN and the association with CIL score
Table 3.15 Percentages of students using work-oriented applications outside school at least once a week, S_USEAPP and the association with CIL score
Table 3.16 Percentages of students using the Internet outside school at least once a week for exchange of information, S_USEINF and the association with CIL score
Table 3.17 Percentages of students using the Internet outside school at least once a week for social communication, S_USECOM and the association with CIL score
Table 3.18 Percentages of students using the Internet outside school at least once a week for recreation, S_USEREC and the association with CIL score
Table 3.19 Multilevel model results for students’ CIL scores using student context variables as level 1 predictors
Table 3.20 Multilevel model results for each of the seven standardized CIL aspect scores using student context variables as level 1 predictors
Table 3.21 List of key student context scale variables and their correlation with CIL scores for Hong Kong
Table 4.1  Percentages of students studying at schools with available internet-related teaching and learning resources
Table 4.2  Percentages of students studying at schools with available software resources for teaching and/or learning
Table 4.3  Percentages of students at schools with computer resources for teaching and/or learning
Table 4.4  Percentages of students at schools with school computers at different locations
Table 4.5  Percentages of students at schools where the principals consider ICT use as very important for achieving different educational outcomes
Table 4.6  Percentages of students at schools where the principals expect and require teachers to have different technological pedagogical knowledge and communication skills
Table 4.7  Percentages of students at schools where the principals use various means to monitor teachers’ ICT use to develop students’ advanced ICT skills
Table 4.8  Percentages of students at schools where the principals use various means to monitor teachers’ ICT use
Table 4.9  Percentages of students at schools where the principals took main responsibilities for different aspects of ICT management and implementation
Table 4.10 Percentages of students at schools where the principals took main responsibilities for different aspects of ICT management
Table 4.11 Percentages of students at schools where teachers participate in different ICT-related professional development as reported by their principals
Table 4.12 Percentages of students at schools where principals indicate medium or high priority to ways of facilitating use of ICT in teaching and learning
Table 4.13 Percentages of Hong Kong principals who indicate “somewhat” or “a lot” of hindrance caused by the issues listed to their school’s capacity to realize e-Learning goals
Table 4.14 Percentages of teachers expressing confidence in doing different computer tasks
Table 4.15 Percentages of teachers using ICT tools in most, almost every, or every lesson
Table 4.16 Percentages of teachers who indicate that students often use ICT in classroom teaching activities
Table 4.17 Percentages of teachers often using ICT for teaching practices in classrooms
Table 4.18 Percentages of teachers putting strong or some emphasis on developing students’ ICT-based …
Table 4.19a List of key school-level variables* derived from the principal questionnaire
Table 4.19b List of key school-level variables* derived from the teacher questionnaire
Table 4.20 Multilevel model results for students’ CIL scores using school-level variables as level 2 predictors (z-scores)
Table 4.21 Multilevel model results for each of the seven standardized CIL aspect scores using school level variables as level 2 predictors
Table 5.1  Assessment rubrics for evaluating “creating information”
Table 5.2  An assessment checklist for “creating information” in a poster creation task
Table 5.3  An assessment rubric for students’ work on evaluating information
Table 5.4  A self-evaluation checklist and associated assessment rubric for the website evaluation task
Table 5.5  Assessment criteria for the poster and route map
Table 5.6  An assessment rubric for peer review type of forum postings

List of Figures

Figure 1.1  Logo for European Computer Driving Licence Foundation (ECDL)
Figure 1.2  National Education Technology Standards
Figure 1.3  The seven pillars of information literacy according to SCONUL
Figure 1.4  Relationship of IL to lifelong learning
Figure 1.5  Modules included in the Information Technology Learning Targets guideline
Figure 1.6  The nine generic skills specified in the Reform Proposal Consultation documents published by the Education Commission in 2000 with a focus on preparing students for lifelong and lifewide learning
Figure 1.7  The Information Literacy Framework released by the EMB in 2005 and the graphical representation for information literacy used in the document
Figure 1.8  Graphical representations of the Information Literacy and Scientific Inquiry frameworks used by the ILTOOLS team
Figure 1.9  The Australian ICT Literacy Assessment framework
Figure 1.10 An item in the Primary 5 and Secondary 2 ILPA generic test that assesses the ‘integrate’ dimension
Figure 1.11 An item in the Primary 5 ILPA mathematics test that assesses the ‘integrate’ dimension
Figure 1.12 Screen dumps for a task in the Secondary 2 ILPA Science test that assesses the ‘integrate’ dimension
Figure 1.13 A sample screen in an ICILS test module showing the functions for different sections of the screen
Figure 1.14 The poster design environment, information resources, instructions and assessment criteria provided to students for working on the large task
Figure 2.1  Percentages of Hong Kong, Australian and Korean students performing at each CIL proficiency level
Figure 2.2  The three tasks that test students’ ability to navigate to a designated webpage using different forms of instruction
Figure 2.3  A poster designed by a Hong Kong student
Figure 2.4  Sample poster designed by a Hong Kong student
Figure 2.5  Screenshot of the short task in After-school Exercise that tests students’ ability to change the sharing setting of a web document
Figure 2.6  Sample poster designed by a Hong Kong student
Figure 2.7  Sample poster designed by a Hong Kong student
Figure 2.8  The short task that assessed students’ awareness of risks in making personal information public
Figure 2.9  The phishing email with three suspicious elements, A, B and C
Figure 2.10 A radar diagram of the mean percentages correct per CIL aspect
Figure 2.11 A radar diagram showing the mean percentages correct per CIL aspect for Hong Kong students at each of the five CIL proficiency levels
Figure 2.12 A radar diagram showing the mean percentages correct per CIL aspect for Australian students at each of the five CIL proficiency levels
Figure 2.13 A radar diagram showing the mean percentages correct per CIL aspect for Korean students at each of the five CIL proficiency levels
Figure 3.1  Contextual factors influencing students’ CIL outcomes
Figure 5.1  Sample* of students’ submitted work on ancient calculating tools
Figure 5.2  Sample of the comic strips created by the students
Figure 5.3  A P.6 student’s work on evaluating information from news reports
Figure 5.4  A Secondary 3 student’s work on evaluating information from websites
Figure 5.5  Sample of P.2 students’ work on managing information
Figure 5.6  Samples of P.5 students’ template for recording experimental data
Figure 5.7  Data collection template resubmitted by Group 3
Figure 5.8  A sample poster and field trip route on Google map produced by a group of P.5 students
Figure 5.9  Google calendar used by some students to plan their work schedule
Figure 5.10 An example of a Learning Management System customized to support shared reading and peer commenting of essays
Figure 5.11 A discussion forum where students commented on the lyrics written by their classmates
Figure 5.12 Postings of information on the discussion forum in Stage 1, demonstrating students’ ability to use keywords to access information and to use the forum to manage and share information
Figure 5.13 The target shop, route plan and interview questions prepared by one group of students in Stage 2, demonstrating their abilities to access, manage and transform information
Figure 5.14 In Stage 3, students conducted the interview during the field trip, then posted the collected information and their own reflections online after the visit
Figure 5.15 Artefacts created by students in Stage 4: the group mind map and a 3D paper model created by a student who completed his essay writing early. These artefacts demonstrate students’ ability to transform and create information
Figure 5.16 An essay written by one of the students, demonstrating his ability to transform and create information
Figure 5.17 The rubric for self- and peer-assessment of the essays, and the peer feedback given by students to their peers online

Executive Summary

Background

As the title of this book indicates, the focus of this publication is to help teachers, principals, education policy makers, teacher educators and members of the community concerned about student learning to understand Hong Kong students’ levels of Computer and Information Literacy (CIL) achievement in comparison with their international peers, and what e-learning pedagogy and e-leadership practices in schools will help to foster students’ ability to make use of ICT tools productively for lifelong learning in the 21st century. The core of this book is an in-depth analysis of the Hong Kong results from the International Computer and Information Literacy Study 2013 (ICILS 2013)[1], drawing on comparisons with the international, Australian and Korean data. Findings from this analysis will help us to understand the strengths and weaknesses of Hong Kong students’ CIL achievement, as well as the contextual factors (personal, family, teacher and school leadership factors) that influence them. This book also provides curriculum exemplars to illustrate the key characteristics of pedagogical and assessment designs that are conducive to enhancing students’ CIL, drawing on past and current CITE projects.
ICILS 2013[2] is the first large-scale international comparative study of students’ ability to make use of computer and information technology to conduct inquiry, create, communicate and use information safely at home, at school and in different social and workplace contexts. Including Hong Kong, a total of 21 countries and education systems participated in this study, which was conducted under the auspices of the International Association for the Evaluation of Educational Achievement (IEA).
The Hong Kong component of the ICILS 2013 study was conducted by the Centre for Information Technology in Education (CITE) of The University of Hong Kong, funded by the Quality Education Fund (QEF). The actual data collection took place between March and July of 2013.

Computer and Information Literacy—concept and test design

The assessment framework for ICILS 2013 (Fraillon et al., 2013) comprises two strands of abilities. The first strand, collecting and managing information, can be further differentiated into three aspects: knowing about and understanding computer use, accessing and evaluating information, and managing information.

[1] Hong Kong ICILS 2013 Study website: http://icils.cite.hku.hk/
[2] ICILS 2013 International Study website: http://www.iea.nl/icils_2013.html

The second strand, producing and exchanging information, encompasses four aspects: transforming information, creating information, sharing information, and using information safely and securely. The performance assessment was conducted on computers located in the computer labs of the sampled schools. The student test consisted of questions and tasks presented in four 30-minute modules, and each participating student was randomly assigned two of the four test modules to complete. Each module comprises multiple-choice and constructed-response items pertaining to authentic tasks, designed according to the CIL assessment framework. As in all IEA studies of student achievement, the ICILS 2013 CIL scale has its average set to 500 and its standard deviation to 100. In addition, students’ CIL performance is categorized into five proficiency levels in descending order: level 4, level 3, level 2, level 1, and below level 1. Table 2 presents detailed descriptions of the five CIL proficiency levels.
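As a rough illustration of this reporting convention, the sketch below standardizes a set of raw scores to the mean-500/SD-100 scale and bands scaled scores into proficiency levels. The two interior cut-offs (492 and 576) are illustrative assumptions; the report itself only quotes the below-level-1 boundary (407) and the level 4 threshold (661).

```python
# Sketch of how an IEA-style scale is standardized and banded into
# proficiency levels. The mean-500/SD-100 convention is from the text;
# the interior cut-offs (492, 576) are illustrative assumptions.

def standardize(raw_scores, mean=500.0, sd=100.0):
    """Rescale raw scores linearly to a scale with the given mean and SD."""
    n = len(raw_scores)
    raw_mean = sum(raw_scores) / n
    raw_sd = (sum((x - raw_mean) ** 2 for x in raw_scores) / n) ** 0.5
    return [mean + sd * (x - raw_mean) / raw_sd for x in raw_scores]

# Highest cut-off first: 661 and above is level 4; below 407 is "below level 1".
CUTOFFS = [(661, "Level 4"), (576, "Level 3"), (492, "Level 2"), (407, "Level 1")]

def proficiency_level(cil_score):
    """Map a scaled CIL score to a proficiency-level label."""
    for cutoff, label in CUTOFFS:
        if cil_score >= cutoff:
            return label
    return "Below Level 1"
```

Under these assumed cut-offs, the Hong Kong mean of 509 reported below would sit within level 2.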

ICILS 2013 Study design and HK participation statistics

ICILS 2013 required the following from all participating systems:
School sample: a random sample of at least 150 secondary schools from the target population of schools offering grade 8 classes in the 2012–13 academic year.
School questionnaire data collection: online questionnaires completed by the school principal, the ICT coordinator, and 15 teachers who taught grade 8 classes in the 2012–13 academic year (approximately 20–30 minutes each).
Student sample: 20–25 students randomly sampled from all grade 8 students in each participating school.
Student data collection: a computerized student CIL test (60 minutes) and a student questionnaire (20 minutes).
In Hong Kong, 118 secondary schools participated in the ICILS 2013 study. A total of 2089 Secondary 2 students, 1338 Secondary 2 teachers, 115 principals and 105 ICT coordinators from the sampled schools took part. The overall participation rates, after weighting and replacement, were: students 68.6%, teachers 58.3%, and schools 70.8%. The Hong Kong dataset is therefore classified as category 2: it does not meet the IEA standard for statistical comparisons across countries (category 1), which requires overall participation rates after weighting and replacement of at least 75%.
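The category check above is simple arithmetic once the weighted counts are known. The sketch below follows the usual IEA convention (overall rate = weighted school participation rate × weighted within-school student participation rate); the weighted counts used are hypothetical, chosen only so the resulting rates fall close to those reported for Hong Kong.

```python
# Hedged sketch of the IEA participation-rate check. The formula
# (overall = school rate x within-school rate, after weighting and
# replacement) is the usual IEA convention; the weighted counts below
# are hypothetical, not the report's actual figures.

def rate(participating: float, sampled: float) -> float:
    """Weighted participation rate as a fraction."""
    return participating / sampled

school_rate = rate(106.2, 150.0)    # hypothetical weighted school counts (~70.8%)
within_rate = rate(2089.0, 2156.0)  # hypothetical weighted student counts

overall_rate = school_rate * within_rate
meets_category_1 = overall_rate >= 0.75  # IEA category 1 threshold cited in the text
```

With these illustrative counts the overall student rate comes out near the reported 68.6%, below the 75% threshold, which is why the Hong Kong data fall into category 2.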

Students’ CIL achievement

Hong Kong Secondary 2 students’ overall average CIL score is 509, slightly above the ICILS 2013 average of 500. However, this is lower than the averages of all the other economically developed participating education systems.


HK’s ICT Development Index is ranked fifth among the 21 participating systems, but unfortunately this strength in ICT infrastructure contrasts with our students’ relatively low CIL scores. Analysis of HK students’ performance across the items in the two CIL strands indicates relatively weaker performance in strand 2 (producing and exchanging information). Further in-depth analysis of each of the seven aspects of CIL shows that Hong Kong students performed better in CIL tasks on knowing about and understanding computer use, as well as on using information safely and securely. Their performance was poorest when they had to process the information collected, that is, in evaluating information, transforming information, sharing information and managing information. Of the five proficiency levels of CIL competence defined by the IEA on the basis of students’ performance, a student needs to perform at least at level 3 to cope with everyday CIL demands. However, only 26% of Hong Kong students reached this level, compared with 34% for Australia and 35% for Korea. At the same time, 38% of Hong Kong students achieved level 1 or below, which is relatively high among the economically developed participating systems, and 15% of HK students were assessed to be below level 1 CIL proficiency, which compares very poorly with Australia (5%) and Korea (9%). The standard error of HK students’ mean CIL score is the second highest among all participating systems, indicating large variations in CIL among HK students.
Comparing the improvements in mean percentage scores for each of the seven aspects as students move from below level 1 (mean CIL score below 407) to level 4 (mean CIL score above 661), we find that Australian and Korean students generally showed balanced advancement across all seven aspects. Hong Kong students, on the other hand, did not show a similar magnitude of advancement in all seven CIL aspects. In particular, no significant advancement could be found between level 3 and level 4 students in the two CIL aspects in which Hong Kong students were poorest: managing information and sharing information. Clearly, helping students to improve their performance in these CIL aspects is a high priority.

HK students’ personal and home background, and how these relate to their CIL performance

Gender and SES
As in most of the other countries participating in the study, the CIL score of HK Secondary 2 female students is significantly higher than that of their male counterparts. Not surprisingly, there is a statistically significant positive correlation between HK students’ family socioeconomic status (SES) and their CIL scores.


However, the effect of SES on CIL achievement for Hong Kong students is not as strong as in other education systems.

ICT access

Access to at least one computer at home (desktop, notebook or tablet) among the surveyed students is high, at 98%. The mean CIL score of students with no computer at home is significantly lower, falling at CIL proficiency level 1, whereas students with at least one computer at home have a mean CIL score at proficiency level 2. The influence of having more than one computer at home on CIL performance is relatively minor. It is also observed that fewer than 1% of the participating students reported having no Internet access at home; since 2% had no computer at home, some students could presumably access the Internet at home only through smartphones.

Student level contextual factors and CIL achievement
We investigated whether and to what extent Hong Kong students' personal and family context affects their achievement in CIL. Correlational analyses show that students' CIL scores correlate with all student level contextual factors. However, many of these contextual factors are themselves highly correlated. Multilevel analyses using student level factors show that students' self-efficacy in basic ICT skills has the largest significant positive influence on their CIL achievement. In addition, students' educational aspirations and their reported opportunities to learn CIL-related tasks at school also had significant positive coefficients. In contrast, students' self-efficacy in advanced ICT skills and reported use of ICT during lessons at school had significant negative coefficients in predicting students' CIL achievement. Further multilevel modelling of the relationship between students' context variables and their performance in each of the seven CIL aspects shows similar results. These findings indicate that CIL proficiency is different from advanced ICT skills, and that the use of ICT during lessons in Hong Kong schools, even when it happened, was not conducive to the development of students' CIL proficiency. Hence it is important to explore the school level factors that contribute positively to students' CIL.
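The correlational step described above can be illustrated with a minimal pure-Python Pearson correlation. The toy data are hypothetical and the Study's own analyses used weighted survey data, so this is a sketch of the technique, not a reproduction of the reported results.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical toy data: self-efficacy in basic ICT skills (a 1-6 rating)
# and CIL scale scores rising together give a strongly positive r.
self_efficacy = [2, 3, 3, 4, 5, 5, 6]
cil_score = [410, 455, 480, 520, 560, 590, 640]
r = pearson_r(self_efficacy, cil_score)
```

Because many context factors correlate with one another like this, the Study moves beyond pairwise correlations to multilevel models, which estimate each factor's contribution while holding the others constant.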

School factors and their influence on students' CIL achievement
Students' e-learning opportunities and experiences are very much determined by the ICT infrastructure and digital learning resources available, as well as by the nature and intensity of ICT use in pedagogical practices in their schools. In this study, three types of school level factors were explored: the ICT infrastructure and resources available, school policies and practices regarding ICT use, and teachers' ICT-using pedagogy.


ICT infrastructure and resources available in schools
HK students had relatively good access to computers and the Internet at school for instructional purposes: 100% had a computer lab and 84% had computers available in most classrooms in their schools. There was also no relative lack of digital learning resources for students in Hong Kong. However, the availability of computers that students could access flexibly for e-learning at school, e.g. class sets of computers that can be moved between classrooms or computers brought by students to class, was relatively low. In terms of network infrastructure to support learning, HK was comparatively weaker on Internet-based applications for collaborative work, and on access to a learning management system (65%) as compared with Korea (94%).

School policies and e-learning leadership factors in schools
School principals play an important part in determining the priorities and strategic directions of the school. HK principals' top three priorities related to e-learning were establishing or enhancing an online learning support platform (87%), increasing the bandwidth of Internet access for computers (84%), and increasing the range of digital learning resources (83%), whereas their mean priorities for pedagogical use of ICT in teaching and learning were lower than those reported in other participating systems. In contrast, Australian and Korean principals' top priorities reflect their concern about teachers' pedagogical use of ICT as well as the range of e-learning resources at school. Providing for participation in professional development on the pedagogical use of ICT was considered by 97% of Australian students' principals and 89% of Korean students' principals to be of at least medium priority. Among Korean students' principals, the top three priorities were: increasing the professional learning resources for teachers in the use of ICT (96%), establishing or enhancing online learning support platforms (94%), and providing teachers with incentives to integrate ICT use in their teaching (90%).

In terms of the educational purposes of ICT use in learning, HK principals gave the highest priority to three skill-oriented outcomes (all above 80%): (1) basic skills in using the office suite of applications and email, (2) proficiency in accessing and using information, and (3) safe and appropriate use of ICT. Goals related to improving students' general learning outcomes and fostering students' responsibility for their own learning were considered very important by only 64% and 65%, respectively. This contrasts strongly with the Australian principals (93% and 85%) and Korean principals (79% and 78%).
Only 38% of Hong Kong students attended schools whose principals considered ICT use to be very important for developing students' collaborative and organizational skills, whereas the international mean was 53%, and the Australian and Korean means were even higher, at 73% and 68%, respectively.


It is thus evident that Hong Kong principals' views on the role of ICT for learning and teaching were still focused on traditional outcomes, giving much lower priority to ICT use in fostering 21st century skills.

HK principals had moderate to low expectations of teachers' ICT-related knowledge and skills, except for their ability to use ICT to communicate with other staff, at 86%. The next highest expectation was of teachers' ability to integrate ICT into teaching and learning, at 68%. Only 57% expected their teachers to be able to collaborate with other teachers via ICT. The lowest expectations were related to the use of ICT for assessment and monitoring of students, neither of which had a percentage higher than 30%. In particular, the percentage of students whose principals expected teachers to be able to use ICT to develop authentic (real-life) assignments for students was only 16%. This is very disappointing, as bringing authentic contexts into the classroom is one of the potential strengths of ICT use.

Not all principals would monitor teachers' ICT use in teaching. In Hong Kong and internationally, the most popular means principals used to monitor teachers' ICT use in teaching was classroom observation. Another means of monitoring was teacher self-reflection.

Internationally, the aspect of ICT implementation for which the highest proportion of principals took key responsibility was implementing ICT-based approaches in administration, with a mean of 81%. Compared to the international and Australian means, the percentages of Hong Kong principals taking main responsibility for the various aspects of ICT management were relatively low, except for the implementation of ICT-based approaches in administration. In general, principals were least likely to take main responsibility for ICT maintenance issues.
It is interesting to note that none of the respective percentages for Korean principals was higher than 25%, indicating that in Korean schools most of the ICT-specific management and implementation responsibilities were devolved to other staff members.

In terms of measures regarding ICT access and use by students in school, HK principals were most concerned about ensuring that students would not access unauthorized or inappropriate sites, and that they would honour intellectual property rights: 100% reported having measures in place to ensure these conditions. Internationally, these are also the aspects for which the vast majority of students' principals reported having implemented relevant measures. HK principals were least concerned about restricting the total number of hours students were allowed to sit in front of a computer, with only 33% of students' principals reporting having measures in place regarding this. An even lower percentage of Australian students' principals reported having this type of measure in place (18%), whereas the respective percentage for Korea was much higher, at 64%.


Based on the principals' reports, the most popular form of ICT-related professional development activity was courses provided by the school, followed by informal discussions within groups of teachers, as well as discussions on ICT use as a regular item embedded in staff meetings. In both Hong Kong and Korea, the most popular professional development activities were courses provided by the school and observing colleagues using ICT in their teaching. However, the levels of participation reported by Hong Kong principals were very much lower than even the international average. This may be one of the reasons for the low levels of ICT adoption in teaching and student learning reported by teachers in Hong Kong.

The top issues that HK principals indicated as hindrances to e-learning in their schools were resource and pedagogy related: insufficient time for teachers to implement e-learning (59%) was the number one hindrance as perceived by HK principals, followed by insufficient budget for the needs of ICT implementation (e.g. an LMS) (46%) and insufficient qualified technical personnel to support the use of ICT (35%). More than one third (34%) of HK principals indicated that pressure to score highly on standardized tests was an obstacle. The data further suggest that lack of hardware, or of general ICT skills among teachers, was not perceived by HK principals as a big obstacle to e-learning implementation in their schools.

Teachers' ICT-using pedagogy
Teachers play the biggest role in determining the learning experiences of students. One factor that potentially impacts teachers' use of ICT in their pedagogical practices is their confidence in the use of ICT. A large majority of the HK teachers surveyed knew how to perform general and some advanced ICT tasks, with percentages higher than the international average. This stands in stark contrast to the teachers' self-reported competence in ICT use for pedagogically related tasks, namely monitoring students' progress (52%) and assessing students' learning (58%), which were much lower than the corresponding percentages reported by Australian (86% and 83% respectively) and Korean (62% and 82% respectively) teachers.

HK teachers reported very low usage of ICT by Secondary 2 students for learning activities in the classroom, with most of the learning activities surveyed at only a single-digit percentage. The highest percentages recorded were for working on extended projects (i.e. over several weeks) (12%) and searching for information on a topic using outside resources (11%). Hong Kong students' ICT usage in all types of learning activities was either equal to or lower than the corresponding international averages. On the other hand, the corresponding percentages reported by Australian teachers were generally higher than the international means, with frequent use of ICT reported by more than 30% of teachers for three activities: working on extended projects, submitting completed work for assessment, and searching for information on a topic using outside resources.


HK teachers reported somewhat higher levels of ICT use in their teaching activities than ICT use by students for learning. However, only one aspect of ICT use by Hong Kong teachers in the classroom was higher than the ICILS international average, namely presenting information through direct class instruction (38%). All other types of ICT usage in teaching activities by Hong Kong teachers were lower than the ICILS international averages, by 1% to 8%. The greatest differences between the Hong Kong and ICILS averages were found in two kinds of teacher use of ICT to support students' collaborative inquiry activities: supporting collaboration among students (8%) and supporting inquiry learning (6%), both of which are often considered important pedagogical activities for fostering 21st century learning outcomes.

HK teachers also reported much lower emphasis on developing their students' ICT-based capabilities compared to the international average. The largest gaps are found in teachers' emphasis on developing students' ICT-based capabilities for (1) exploring a range of digital resources when searching for information (33% for Hong Kong compared to the international mean of 53%); (2) evaluating the relevance of digital information (36% compared to 52%); and (3) evaluating the credibility of digital information (also 36% compared to 52%). Only two of the listed ICT-based capabilities were reported as emphasized by more than 50% of Hong Kong teachers, namely accessing information efficiently (53%) and using computer software to construct digital work products (51%). In contrast, more than 50% of Australian teachers reported giving emphasis to the development of 11 of the 12 ICT-based capabilities surveyed, and more than 50% of Korean teachers reported giving emphasis to 9 of the 12.
HK teachers' reported use of ICT for collaboration among fellow teachers was in general lower than the international average. Specifically, only 39% of HK Secondary 2 teachers had collaborated with peers to develop lessons involving the use of ICT, compared to the international average of 58%.

Status of e-learning related school level factors in Hong Kong
In summary, the Study reveals that HK principals in general gave relatively low priority to pedagogical use of ICT, particularly with respect to the use of ICT for developing students' 21st century skills such as collaborative and organizational skills. HK teachers' participation in ICT-related professional development activities was very much lower than the corresponding international average. Their reported use of ICT for teaching and learning was also very low: for most of the student learning activities surveyed, fewer than 10% of teachers reported that their Secondary 2 students often used ICT for those activities. HK teachers also reported low emphasis on developing their students' CIL capabilities.


How did school level factors influence students' CIL?
Multilevel modelling of school level variables (including principals' e-learning leadership practices and teachers' ICT-using pedagogy factors) found only two statistically significant positive predictors of students' CIL achievement: more frequent use of ICT by students in traditional learning tasks, and the extent to which curriculum and assessment related obstacles hindered the school's realization of its e-learning goals, as reported by principals. These indicate that students' CIL achievement benefits from more e-learning opportunities (as opposed to e-teaching), as well as from heightened leadership awareness of the need to change curriculum and assessment practices in order to realize the school's e-learning goals.

There were only two other significant school level predictors of students' CIL, both of which were negative: teachers' reported use of pedagogical ICT tools by themselves, and the extent to which insufficient ICT hardware and software hindered the school's e-learning development, as reported by the principals.

These findings show that the most important influences on Hong Kong students' CIL outcomes are those at the school level. Among these influences, the single most important factor is the opportunity to use ICT in learning that teachers provide to students. Furthermore, the findings show that having access to computers and the Internet, as well as using them for personal and social communication purposes, does not per se affect students' CIL outcomes.
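The idea that school level differences dominate can be illustrated with a toy intraclass-correlation calculation: the share of total score variance that lies between schools rather than within them. This is a crude plug-in sketch with hypothetical numbers, not the ANOVA-corrected estimator used in formal multilevel modelling and not the Study's actual estimates.

```python
def variance(xs):
    """Population variance of a sequence."""
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

def icc_plugin(groups):
    """Crude plug-in intraclass correlation: the fraction of total
    variance attributable to differences between group (school) means."""
    all_scores = [score for group in groups for score in group]
    total_var = variance(all_scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Replace every score by its school mean, weighting means by size.
    expanded_means = [m for g, m in zip(groups, group_means) for _ in g]
    return variance(expanded_means) / total_var
```

When schools differ sharply in mean CIL (e.g. one school clustered around 410 and another around 610), almost all the variance sits between schools, which is the pattern that makes school level factors the key lever for improvement.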

e-Learning and assessment designs to foster students' CIL
Studies of ICT-enabled innovative pedagogical practices show that students are much more likely to develop CIL if they engage in solving real-world problems in collaboration with their peers. The key pedagogical design principles of e-learning activities that foster students' CIL include:
- Give priority to ICT use that supports students' learning, not teaching;
- Make learning activities learner-centric and inquiry-oriented;
- Extend learning tasks in time, comprising multiple stages and/or parts, with interim products generated in the process;
- Make learning activities authentic and related to students' daily life experiences;
- Keep learning tasks open-ended, providing opportunities for students to make judgments;
- Provide learners with the opportunity to use ICT to access different sources of information, and to organize, compare and contrast, analyse and integrate information.
Assessment practices also need to change to more effectively foster students' CIL.
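A design team could treat these principles as a simple machine-checkable checklist when reviewing an activity design. This is a minimal sketch: the principle keys and the sample design are hypothetical, not an instrument from the Study.

```python
# Hypothetical keys summarizing the six design principles in the text.
PRINCIPLES = [
    "ict_supports_student_learning",
    "learner_centric_inquiry",
    "extended_multi_stage_task",
    "authentic_daily_life_context",
    "open_ended_judgment",
    "multi_source_information_work",
]

def missing_principles(design: dict) -> list:
    """Return the design principles that a proposed activity does not satisfy."""
    return [p for p in PRINCIPLES if not design.get(p, False)]

# A hypothetical activity design under review: everything is in place
# except extension over multiple stages.
sample_design = {
    "ict_supports_student_learning": True,
    "learner_centric_inquiry": True,
    "extended_multi_stage_task": False,
    "authentic_daily_life_context": True,
    "open_ended_judgment": True,
    "multi_source_information_work": True,
}
```

Running the check on `sample_design` would flag only the missing time-extension principle, pointing the designer at what to revise before implementation.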


Assessment as learning conceptualizes assessment as an integral part of learning and teaching, with students actively involved in the process. Assessment as learning occurs when students reflect on and monitor their own progress to inform the formulation of future learning goals and take responsibility for their own past and future learning. Traditional paper and pencil tests are not suitable for assessment as learning. Assessment rubrics and self-evaluation checklists are two of the most commonly used instruments in performance assessment.

Rubrics are essential instruments for implementing assessment as learning. These are descriptive scoring tools for rating authentic student work qualitatively. Apart from rubrics, self-evaluation checklists are often used in the context of self- and peer-evaluation. A checklist provides a list of measurable categories and indicators for a project, product or performance, allowing students to judge their own or their peers' performance and determine whether they have met the established criteria of a task.

The size of a learning unit often limits the complexity of the learning tasks that can be presented to students. Longer learning units that span days or weeks, involving tasks that need to be conducted both in school and at home, often provide more opportunities for students to develop higher level CIL outcomes, since these tasks require students to exercise skills in multiple CIL aspects in a meaningful and holistic fashion.

It is desirable for students to be provided with the same technology platform for learning in different subject matter contexts and tasks. It takes time for both teachers and students to get accustomed to the interfaces and functionalities of a new technology, creating additional cognitive and organizational burdens for all involved. This additional effort would be minimized if the same technology were used throughout the course of a student's learning.
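A descriptive rubric of the kind discussed above is naturally represented as criteria with level descriptors, which an online platform could score automatically once a student or peer assigns a level per criterion. The criteria and descriptors below are hypothetical illustrations, not instruments from the Study.

```python
# A hypothetical two-criterion rubric: each criterion maps achievement
# levels to qualitative descriptors of authentic student work.
RUBRIC = {
    "evaluating information": {
        1: "Accepts sources at face value",
        2: "Notes obvious bias in sources",
        3: "Compares several sources and justifies which to trust",
    },
    "sharing information": {
        1: "Product is hard for the audience to follow",
        2: "Product is clear but generic",
        3: "Product is tailored to audience and purpose",
    },
}

def rubric_total(ratings: dict) -> int:
    """Sum the level awarded per criterion, validating against the rubric."""
    total = 0
    for criterion, level in ratings.items():
        if level not in RUBRIC[criterion]:
            raise ValueError(f"invalid level {level} for {criterion}")
        total += level
    return total
```

The same structure supports self- and peer-evaluation: students read the descriptor for each level, pick the one that matches the work, and the totals (and the descriptors themselves) feed reflection on what the next level of achievement looks like.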

Conclusion
Hong Kong's participation in the ICILS 2013 Study provided us with a good overview of Hong Kong students' proficiency in computer and information literacy and how it compared with that of their international counterparts. The picture we get from the many different analyses is consistent. Our students' CIL proficiency is low compared to all economically developed countries participating in the study. This is despite the fact that both the overall ICT Development Index and the level of ICT provision in schools in Hong Kong are internationally relatively high. Students' access to computers and the Internet at home is almost 100%, which also compares extremely well with other participating countries.

Hong Kong has the second highest standard deviation in CIL scores (second only to the City of Buenos Aires, Argentina), indicating a very wide spread in students' CIL proficiency. On the other hand, students' SES background has amongst the weakest relationships with their CIL scores of all the participating countries. This indicates that school factors have a major influence on students' CIL achievement.

In Hong Kong, teachers are confident about their own basic ICT competence but not about pedagogical uses of ICT. The level of ICT use for student learning is particularly low, even though this was found to be the single most important positive contributing factor to students' CIL achievement. Teachers gave low emphasis to developing students' CIL skills, and principals likewise gave little priority to the use of ICT to support students' development of lifelong learning skills. Hong Kong teachers' participation in ICT-related professional development was also relatively low.

Summarizing the ICILS 2013 findings, it is clear that there is a very strong need for improvement in teaching, learning and assessment in schools to help our students develop higher levels of CIL proficiency. This requires concerted leadership efforts at the policy and school levels, and should be a prime priority for the Fourth IT in Education Strategy in Hong Kong, which was launched in August 2015.

Findings from ICILS 2013 also triangulate with local ICT-related research and development projects: only when students have the opportunity to use ICT for learning (i.e. e-learning as opposed to e-teaching) will they really be able to advance in CIL proficiency. Moreover, the more challenging CIL aspects, such as evaluating, managing and sharing information, can only be effectively fostered through e-learning in open-ended, inquiry-based learning tasks that involve authentic, real-life problems and are extended over days or weeks. Assessment practices need to change so that the focus is on helping students understand the criteria of assessment as well as the benchmarks for different levels of achievement for each criterion.
The availability of an integrated online learning and assessment support platform that can incorporate peer- and self-assessment using rubrics and checklists would go a long way towards supporting students' development of CIL throughout the school curriculum.


Chapter 1  Computer & Information Literacy and its Assessment

Serious discussions at the policy level about computer and information literacy (CIL) as an important student learning outcome began to appear in the West in the 1990s. In Hong Kong, around the same time, two policy documents regarding this were also published by the Education and Manpower Bureau as part of the Second IT in Education Strategy of the HKSAR government. These policy documents have led to a series of curriculum initiatives to foster students' CIL and have also raised issues concerning how CIL can actually be assessed. The International Computer and Information Literacy Study (ICILS), conducted under the auspices of the International Association for the Evaluation of Educational Achievement (IEA) in 2013, was the first international comparative study to assess students' performance in CIL. Hong Kong was among the 21 countries/systems that participated in ICILS 2013.

The aim of this book is to report on the findings from ICILS 2013 that are of particular relevance to Hong Kong in comparison with its international peers. Students' learning outcomes can be influenced by personal and family background as well as by school and system level factors, which are also explored within the design of the Study. In order that the research findings can be easily understood by teachers, principals, teacher educators, policy-makers, parents, e-learning technology providers and anyone who is concerned about student learning in the 21st century, this publication is not structured as a formal research report. Instead, the focus is to bring in the relevant local and international policy and contextual backgrounds to highlight the key concepts and findings relevant to the education community and the general public.

Figure 1.1 Logo of the European Computer Driving Licence Foundation (ECDL)
Readers who are interested in the specific research methodology and data analysis techniques adopted in the study can refer to the ICILS 2013 International Report (Fraillon et al., 2014) and the associated Technical Report (Fraillon et al., 2015).


This book is structured around five chapters. Chapter 1 starts with a brief review of the historical development of the concept of CIL, both in the international context and in Hong Kong. This is followed by sections introducing the assessment framework adopted in ICILS 2013 and the assessment design of the performance test booklets used in the study to provide coverage of the various CIL aspects of the framework. Chapter 2 reports on students' performance in ICILS 2013, with a special focus on Hong Kong students' performance, giving a detailed breakdown of their strengths and weaknesses in comparison with two selected high-performing, developed countries, Australia and Korea.

Figure 1.2 National Education Technology Standards (NETS), 1998

Chapter 3 focuses on how students' personal and family background as well as ICT use experience within and outside school influenced their CIL learning outcomes. The findings reported in this chapter draw on analyses of the Hong Kong, international, Australian and Korean data collected through the survey questionnaire administered to students immediately after they submitted their CIL performance test. The sampling method and participation rates for students are also explained. Chapter 4 reports on analyses of the data collected from the teacher and principal questionnaires to examine how school level factors influence students' CIL outcomes. A brief introduction to the principles underpinning the design of the questionnaires and the sampling methods for the principal and teacher questionnaires is also given. As a main objective of this Study is to help contribute to improving students' CIL, the core focus of Chapter 5 is to recommend and describe learning and assessment designs that foster students' CIL.
The designs described in that chapter are selected from case studies conducted by CITE in past and ongoing research projects on e-learning in Hong Kong schools.

1.1 Computer and Information Literacy: a Brief History
Computer and Information Literacy actually comprises two interrelated types of literacy: Computer Literacy (CL) and Information Literacy (IL). CL is the older of the two terms. The European Computer Driving Licence (ECDL), launched in 1995, is among the earliest certification programmes of its kind; it focuses on individuals' technical skills in using basic applications, such as office applications and email, as well as their basic knowledge about computers and information technology (IT) security.


Underpinning this certification programme is the idea that the possession of basic computer skills is just as necessary in a digital society as being literate in reading and writing.

In addition to the policy perspective, concerns about individuals' computer literacy have also been raised by the education and business sectors. They suggest that students' computer-related competence should go beyond the use of basic digital productivity and communication tools (CL), and include the use of IT as a tool for research, problem-solving and decision-making. Further, awareness of the social, ethical and human issues related to the use of digital technology is also deemed important for ordinary citizens. In 1998, the National Education Technology Standards (NETS) published by the US-based International Society for Technology in Education (ISTE) adopted this approach, even though the specific word "literacy" was not used. As the required basic skills are constantly evolving as a result of social and technological development, it is not surprising that the NETS was updated in ISTE's 2007 report. Table 1.1 shows a comparison between the key elements addressed in these two standards.

Table 1.1 The key standards areas specified in the 1998 and 2007 NETS

  1998                                                  | 2007
  Technology Productivity Tools                         | Creativity and Innovation
  Technology Communications Tools                       | Communication and Collaboration
  Technology Research Tools                             | Research and Information Fluency
  Technology Problem-solving and Decision-Making Tools  | Critical Thinking, Problem Solving, and Decision Making
  Social, Ethical, and Human Issues                     | Digital Citizenship
  Basic Operations and Concepts                         | Technology Operations and Concepts

Despite a few similarities between the two standards, the latter clearly has a stronger emphasis on the exercise of higher order abilities in the use of IT, such as creativity, innovation, critical thinking and collaboration, as well as on the importance of digital citizenship.

Professional library associations have been playing an active role in promoting the concept of Information Literacy (IL). The focus of IL is not on computing knowledge or technical skills, but rather on the ability to source, organize, evaluate and make appropriate use of information in different contextual situations for learning and problem solving. For example, the information literacy framework proposed by SCONUL (the Society of College, National and University Libraries, UK), titled Information skills in higher education: a SCONUL position paper (1999), comprises "seven pillars" of competence (see Figure 1.3). Performance in each pillar of competence ranges from "Novice" to "Expert". SCONUL's 2011 revised version was largely similar to the previous one.


The only difference is in the naming of the seven pillars, which were shortened to one word per pillar for easier memorisation: identify, scope, plan, gather, evaluate, manage and present.

The global interest in information literacy education was spurred by the growing recognition that students need to be prepared for lifelong learning to participate effectively in a society in which information and knowledge are increasing at an exponential rate. In the IL framework published by the Australian and New Zealand Institute for Information Literacy, IL is conceptualized as personal empowerment (Bundy, 2004) and depicted as a subset of independent learning skills nested within lifelong learning skills (see Figure 1.4).

Figure 1.3 The seven pillars of information literacy according to SCONUL (SCONUL, 2004, p. 4)

Figure 1.4 Relationship of IL to lifelong learning (Bundy, 2004, p. 5)

1.2 Computer and Information Literacy in the Hong Kong School Curriculum
In Hong Kong, Computer Studies was introduced into the upper secondary school curriculum as an elective subject in 1982. Following the launch of the first Information Technology in Education Strategy (EMB, 1998), the Information Technology Learning Targets (ITLT) were put forward in 2000 by a working group formed under the Curriculum Development Institute. The ITLT document sets out the knowledge, skills and attitudes in using IT tools that students are expected to attain at each of five key stages.


Figure 1.5 lists the eight modules included in the ITLT (EMB, 2000) document for key stages I and II. In the education reform launched in 2000, IT skills, defined as the skills "to seek, absorb, analyse, manage and present information critically and intelligently in an information age and a digitised world", were included as one of the nine generic skills (see Figure 1.6). It is thus clear that there was a conscious shift in focus from ICT technical skills to information skills, within the educational policy arena as well as in the wider educational community. Figure 1.7 shows the covers of the key education reform consultation guideline documents published in 2000.

Figure 1.5 Modules included in the Information Technology Learning Targets for key stages I and II (EMB, 2000, p. 20):
- Using E-mail
- Word Processing
- Using the Internet
- Writing with a Computer
- Drawing with a Computer
- Joy to the Computer World
- Calculating and Charting with Spreadsheet
- Learning to control a Computer through Logo

Figure 1.6 The nine generic skills specified in the Reform Proposal (EC, 2000, p. 24)

In 2004, the Education and Manpower Bureau commissioned a consortium of scholars from four teacher education institutions (BU, CUHK, HKIEd, HKU) to develop an information literacy framework for Hong Kong students as part of the objectives of the Second Information Technology in Education Strategy (EMB, 2004). Figure 1.8 shows the cover of the report published in 2005 and the diagrammatic representation of the concept of IL put forward in this document.

In 2007, as part of the EDB evaluation of the effectiveness of the Second Information Technology in Education Strategy (2004-2007), the Centre for Information Technology in Education (CITE) at the University of Hong Kong was commissioned by EDB to conduct an Information Literacy Performance Assessment Study (ILPA for short) of Primary 5 and Secondary 2 students in a random sample of government-funded schools.
While the details of the assessment will be introduced in the next section, it is important to note here that the description of information literacy in the EMB (2005) document was at a conceptual level too abstract to serve as an assessment framework that could be operationalized into the design of performance assessment tasks. Based on a thorough literature review at the time, the CITE project team adopted the seven dimensions that ETS (2003) used to design its information literacy assessments in higher education, together with ethical use of information, to form the eight-dimensional framework for designing the assessment tasks in the evaluation.


Figure 1.7 Consultation documents published by the Education Commission in 2000 with a focus on preparing students for lifelong and lifewide learning

Figure 1.8 The Information Literacy Framework released by the EMB in 2005 and the graphical representation for information literacy used in the document

The results of the assessment (Law, Lee and Yuen, 2010) showed generally unsatisfactory performance among the assessed students, and large disparities in students' achievement across schools (Law et al., 2007). In view of these findings, the EDB further commissioned CITE to conduct a one-year design-based research and development project with primary and secondary school teachers, in order to develop (1) curriculum designs that promote students' IL and (2) tools for assessing students' IL outcomes (http://iltools.cite.hku.hk/, referred to as ILTOOLS for short).


The students' IL framework (EMB, 2005) recommends that "IL assessment should be formative and developmental […] [and should be] designed for developing the capability of learners in learning different subject disciplines …" (p. 18). As such, the ILTOOLS project was commissioned to focus on IL within the Science KLA. In order to support easy operationalization of the EMB (2005) IL framework for curriculum and assessment integration, the CITE project team crystallized students' IL into the eight-element framework used in ILTOOLS (define, access, manage, integrate, create, communicate, evaluate and ethical use) for working with information, and provided explicit links between this framework and the 12 inquiry skills suggested in the Science curriculum guide (EDB, 2006). Figure 1.9 shows a graphical representation of the IL framework and the scientific inquiry skills framework produced by the ILTOOLS team.


Figure 1.9 Graphical representations of the Information Literacy and Scientific Inquiry frameworks used by the ILTOOLS team

1.3 Assessing Computer and Information Literacy

Assessing computer and information literacy often involves having test takers work in simulated environments. As the Internet has become the major source of information, assessing test takers' proficiency in operating within a simulated Internet environment is one of the major means of CIL assessment.


1.3.1 European Computer Driving Licence

In the case of the European Computer Driving Licence, which focuses on technical skills in basic computer applications, testing is arranged on a module basis. Testing is conducted in a simulated Windows/Microsoft Office environment, which records mouse movements and keystrokes and reports the results immediately upon test completion.

1.3.2 The Australian NAP-ICT Literacy Assessment

Since 2005, the Australian National Assessment Program has conducted an ICT literacy assessment on a sample of Year 6 and Year 10 students every three years (referred to as the NAP-ICT literacy assessment). In this assessment, ICT literacy is considered a generic ability of the students (i.e. independent of subject domain). Three strands are included in the assessment framework, namely working with information, creating and sharing information, and using ICT responsibly. These strands are assessed through six processes, as depicted in Figure 1.10. The NAP-ICT literacy assessment was delivered in schools via online modules supported by purpose-built software applications. The assessment modules usually comprise a sequence of simulated tasks with a variety of response formats, such as multiple choice, drag and drop, execution of simple software commands, short constructed text responses, and construction of information products such as a poster. The NAP-ICT literacy assessment instruments were designed and developed by the Australian Council for Educational Research (ACER), the team that was also responsible for designing the ICILS assessment instrument. Hence, there is great similarity between the NAP-ICT literacy assessment and the ICILS assessment, which will be described in a later section.

Figure 1.10 The Australian ICT Literacy Assessment framework (ACARA, 2015, p.4)

1.3.3 Computer-Based Assessments in PISA

As aforementioned, information literacy, as a generic ability, could enhance students' learning outcomes in different subject areas according to the IL framework in Hong Kong. The assessment of IL, therefore, can be subject dependent. However, there are no well-known examples of performance assessment of IL in subject-based contexts. The Program for International Student Assessment (PISA) conducted by the OECD has offered computer-based assessment (CBA) as an additional option for participating countries since 2009. In PISA 2012, countries could opt to participate in CBA of problem solving, mathematics and reading literacy.

However, the PISA CBA does not assess subject-specific information literacy. It only assesses students' ability to accomplish subject-based tasks, using computers as the medium instead of paper. Exactly the same set of competences is assessed in the paper-based and computer-based assessments of Reading and Mathematics. For problem solving, the assessment framework of the CBA is essentially the same as that of the paper-based assessment, except that in the CBA the problem situation can be interactive (i.e. not all information is disclosed, and the student has to uncover some of it by exploring, such as by clicking particular buttons).

1.3.4 The Hong Kong ILPA Study

As mentioned in the previous section, an assessment of students' IL was conducted by the CITE team in 2007 as part of the evaluation of the effectiveness of the Second IT in Education Strategy in Hong Kong (Law et al., 2007). The goal and design of the study, the Information Literacy Performance Assessment (ILPA), followed ETS's (2003) IL framework and assessed IL in both generic and subject-specific problem-solving contexts. The evaluation was carried out at two grade levels. At the Primary 5 level, students were assessed on their generic IL skills and their ability to use IL tools to solve subject-specific problems in Chinese Language and Mathematics that they would otherwise not be able to tackle based on the Primary 5 curriculum for these two subjects. At the Secondary 2 level, the evaluation assessed students' generic IL skills and their ability to use IL tools to solve subject-specific problems in Chinese Language and Science that are beyond the respective Secondary 2 curriculum requirements. Hence, three tests were administered at each grade level. The same generic IL test was administered to Primary 5 and Secondary 2 students in order to compare the generic skill levels of the two groups. As a result, a total of five different tests were developed in the ILPA Study.

Figures 1.11 to 1.13 present example questions from ILPA that were used to assess students' ability to integrate information in the generic, mathematics and science tests respectively. Integrating information relates to the ability to interpret and represent information by using ICT tools to synthesize, summarize, compare and contrast information from multiple sources.
As shown by these examples, although the three questions were all designed to assess students' ability to integrate information, they differ in the kinds of tools adopted and the understanding needed to complete the question. The question shown in Figure 1.11 asks students to create a PowerPoint presentation on a one-day trip in Hong Kong that includes two scenic spots suitable for the elderly. Students could use the Internet to search for information about appropriate places to visit, their opening hours, traffic routes, and descriptions of the scenic spots in those places.


In this item, students need to create a PowerPoint presentation on a plan for a day trip for the elderly in Hong Kong. They were assessed on their ability to synthesize, compare, contrast and summarize information from multiple digital sources.

Figure 1.11 An item in the Primary 5 and Secondary 2 ILPA generic test that assesses the 'integrate' dimension

In the ILPA Mathematics test, the question assessing students' ability to integrate information for mathematical problem solving was related to geometry (see Figure 1.12). Students were asked to work out the maximum area that can be enclosed by a rectangle with a fixed perimeter. Students were given a Java applet to explore the areas of differently configured rectangles with the same perimeter. With the applet, Primary 5 students should be able to arrive at the optimal configuration (a square) if they record the areas as the length of the rectangle is systematically varied. Solving this problem analytically would require calculus or algebraic optimization techniques that would prove challenging even for senior secondary students.
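The exploratory strategy the applet supports can be mimicked in a few lines of code (a hypothetical sketch; the perimeter value and the loop are assumptions for illustration, not taken from the ILPA applet): systematically vary the length of a rectangle with a fixed perimeter, record each area, and observe that the maximum occurs at the square.

```python
# Explore the areas of rectangles with a fixed perimeter, as a student
# might do with the ILPA applet (the perimeter value here is invented).
PERIMETER = 40

def area(length):
    width = PERIMETER / 2 - length   # length + width = half the perimeter
    return length * width

# Systematically vary the length and record each area.
records = [(length, area(length)) for length in range(1, 20)]
best_length, best_area = max(records, key=lambda r: r[1])

print(best_length, best_area)  # maximum at length 10 (a square), area 100
```

Tabulating the recorded areas in this way is exactly the "systematic variation" strategy the task rewards: no calculus is needed to see the pattern.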


In this item, students can manipulate an interactive applet to observe changes in the area of a rectangle with a fixed perimeter. Students' performance is assessed on the comprehensiveness of their manipulations and observations, and the correctness of their interpretations.

Figure 1.12 An item in the Primary 5 ILPA mathematics test that assesses the 'integrate' dimension

In the ILPA Science test, the task assessing Secondary 2 students' ability to integrate information in scientific problem solving was about population dynamics in pond ecosystems. In the question, shown in Figure 1.13, students were asked to use a visual dynamic simulation tool to explore and observe how adding a foreign species to a local pond would affect the population sizes of the different species living in the pond ecosystem over a given period of time. With the simulation tool, students could observe changes in the numbers of the species through iconic visualization of the species in the pond, as well as through line graphs showing the actual numbers of each species over time. Originally, four species lived in the simulated pond ecosystem: water plants, shrimps, fish and ducks.


In this task, students were provided with a dynamic simulation tool to explore the effect of adding red shrimps (a foreign species) to the local pond ecosystem. Students were assessed on their ability to observe the critical changes in population through both the iconic grid and the graph (see figure below), and their ability to draw appropriate interpretations and conclusions.

This is the screen of the simulated pond ecosystem in which students conduct "ecological experiments". The upper part of the main screen shows the changes in the population size of the different species in the pond in iconic form, and the lower section shows the same information in graphic format. Instructions are given in the lower left-hand corner.

Figure 1.13 Screen dumps for a task in the Secondary 2 ILPA Science test that assesses the  ‘integrate’  dimension.


At the beginning of the task, students were asked to observe and describe how the numbers of the different species changed over 200 days. Then "a visitor brought 20 red shrimps to add to the pond. He thinks this will increase the biodiversity of this pond ecology." At this point, 20 red shrimps also appear in the simulated pond. The students were then asked to observe the changes over another 600 days. They could also click on the species icons on the upper left-hand panel to learn about their behavior. They were then asked to answer four questions, three of which pertain to integrating information:

- Why did most (or all) of the shrimps die?
- Why did most (or all) of the fish die?
- What are the possible impacts of adding a foreign species to an ecosystem?

The topic of this task is the concept of competition among species that share the same niche. The introduction of a foreign species to an indigenous ecology may lead to the extinction not only of the species it competes with, but also of the species that feed on the local species driven to extinction. This is a very challenging task even for high school and university students. Using the simulation tool, students could collate their observations of the trends across species and point out that the indigenous shrimps slowly died off as a result of the increase in red shrimps, and that the fish, which feed on the indigenous shrimps (but not the red shrimps), actually went extinct even before the indigenous shrimps. Hence, just as in the Mathematics task above, by using the simulation tool, Secondary 2 students with the requisite IL skills were able to learn about some important concepts in science which would otherwise be inaccessible.
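The dynamics described above can be illustrated with a deliberately simplified discrete-time model. Everything numeric here is invented for illustration; only the qualitative storyline (the red shrimps outcompete the indigenous shrimps, and the fish, which eat only the indigenous shrimps, collapse first) comes from the task, and the actual ILPA simulation model is not published.

```python
# A toy discrete-time model of the pond dynamics described above.
# Species roles follow the task narrative, but every rate and equation
# below is invented for illustration only.
indig, red, fish = 100.0, 20.0, 50.0    # indigenous shrimps, red shrimps, fish
fish_gone = indig_gone = None           # day each population falls below 1

for day in range(1, 601):
    red = red * (1 + 0.03 * (1 - red / 300))             # invader grows logistically
    indig = indig * (1.04 - 0.0005 * red)                # suppressed by competition
    fish = fish * (0.80 + 0.20 * min(1.0, indig / 150))  # fish starve as prey declines
    if fish_gone is None and fish < 1:
        fish_gone = day
    if indig_gone is None and indig < 1:
        indig_gone = day

# In this toy run the fish collapse well before the indigenous shrimps,
# mirroring the observation students could make with the simulation tool.
print(fish_gone, indig_gone)
```

The design choice that drives the outcome is that the fish need a relatively large prey population to sustain themselves, so their decline begins, and completes, while some indigenous shrimps still survive.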
The above three example assessment tasks show that different kinds of items need to be designed to assess students' ability to integrate information as a generic IL skill or as an IL skill in specific subject learning contexts. Similarly, each of the ILPA generic and subject-specific IL tests (five different tests were designed and administered in the Study, as described earlier) contains items that assess each of the other IL dimensions: define, access, manage, create, communicate and evaluate, in addition to integrate, which has already been described. Findings from ILPA show that when students' IL performance across the different tests is correlated, students' prior learning experience with subject-specific tools for learning purposes matters (Law et al., 2007).

1.4 Definition of CIL in ICILS

ICILS 2013 assesses CIL only as a generic skill, and does not make any explicit assumption about whether performance in IL is subject-matter dependent. The assessment framework used in ICILS measures CIL according to two broad conceptual categories called strands: collecting and managing information, and producing and exchanging information (Fraillon et al., 2014). Each strand is further divided into skills categories labelled aspects. Table 1.2 provides a brief description of the CIL strands and aspects.

Table 1.2 The Computer and Information Literacy Framework adopted in ICILS

Strand 1: Collecting and managing information
1.1 Knowing about and understanding computer use: Basic technical knowledge and skills needed to work with information using computers.
1.2 Accessing and evaluating information: Ability to find, retrieve, and make judgments about the relevance, integrity, and usefulness of computer-based information.
1.3 Managing information: Ability to adopt and adapt schemes of information classification and organization to arrange and store information for efficient use and/or re-use.

Strand 2: Producing and exchanging information
2.1 Transforming information: Ability to use computers to vary the presentation of information to achieve clarity suitable for specific audiences and purposes.
2.2 Creating information: Ability to use computers to design and generate new information or derived information products for specific audiences and purposes.
2.3 Sharing information: Ability to use computers to communicate and exchange information with others.
2.4 Using information safely and securely: Understanding of the legal and ethical issues of digital communication from the perspectives of both information producer and consumer.

If we compare the above CIL assessment framework with the assessment framework adopted in ILPA, which was based on the IL framework for students in Hong Kong published by the EMB (2005), we can see that they are in fact very similar. Table 1.3 provides a comparison of the two assessment frameworks.


Table 1.3 A comparison of the IL assessment framework adopted by the ILPA Study in Hong Kong (Law et al., 2007) and that used in ICILS

ILPA assessment dimension: corresponding ICILS assessment strand and aspect
- Defining information needs: 1.2 Accessing and evaluating information
- Accessing information: 1.2 Accessing and evaluating information
- Managing information: 1.1 Knowing about and understanding computer use; 1.3 Managing information
- Integrating information: 2.1 Transforming information
- Creating information: 2.2 Creating information
- Communicating information: 2.3 Sharing information
- Evaluating information: 1.2 Accessing and evaluating information
- Ethical use of information*: 2.4 Using information safely and securely

1.5 Test Administration and Module Design in ICILS

Data collection for ICILS in schools took place between February and June 2013. Twenty students were randomly selected from each participating school to take part in the CIL performance assessment. The assessment took place in school computer rooms, and the test instruments were delivered using software purpose-designed by the International Study Centre and administered on USB drives connected to the school computers. The test was delivered through USB drives instead of via the Internet to ensure a uniform assessment environment regardless of the quality of the Internet connection of the school computers used. Further, by setting all Internet-related operations to take place in a simulated web environment, the assessment avoided the potential problem of differential exposure to resources and information outside of the test environment.

Four 30-minute test modules were constructed for the CIL assessment in ICILS. A module comprised a set of questions and tasks based on a theme familiar to students, following a linear narrative structure. Each module had the same structure: a number of short independent tasks (usually requiring less than one minute each), followed by a large task expected to take 15-20 minutes to complete. The themes, and hence the titles, of the four modules are: After-School Exercise, Band Competition, Breathing, and School Trip. Each participating student was randomly assigned two of the four modules to complete. A sample screen layout for the test administration is shown in Figure 1.14. The main space is occupied by the sample computer screen that gives the context for the computer-based task.


The bottom part of the screen gives instructions on what the student is required to do. The narrow upper right panel gives a visual indication of how many tasks the student has completed (red boxes) relative to the entire set of tasks (green boxes).

Figure 1.14 A sample screen in an ICILS test module showing the functions of the different sections of the screen

There are two item formats for the short tasks in the ICILS test modules: close-ended and open-ended. Responses to close-ended short tasks include multiple choice and test-environment operations (i.e. mouse clicks to carry out certain tasks in the simulated test environment), which were recorded and scored automatically. Open-ended short tasks required students to enter their responses in their own words, and were scored by trained local expert scorers according to a scoring rubric. Slightly more than half of the score points were allocated to the satisfactory completion of the large task in each module. The work submitted for the large tasks was scored according to the scoring criteria, which were presented to the students at the start of the large task and remained accessible throughout the duration of the task. After-School Exercise, one of the four assessment modules of ICILS, has been made public, and a computer-based demonstration can be accessed at http://www.iea.nl/icils_2013_example_module.html.


In order to illustrate the structure of the test modules, the short tasks and the large task in the After-School Exercise module are described here. Table 1.4 gives a brief description of each short task, the response required, and the assessed aspect according to the CIL framework.

Table 1.4 Overview of the short tasks in the After-School Exercise module

Task 1. Given a sample email, identify the recipients.
   Response type: MC*, tick all that apply. Assessed aspect: 2.3

Task 2. Follow the URL given in plain text in the email message to access a site.
   Response type: copy and paste the URL into the address bar of the simulated web browser. Assessed aspect: 1.1

Task 3. In creating a user account on a public website, identify which personal information is most risky to include in a public profile.
   Response type: MC*, select one. Assessed aspect: 2.4

Task 4. Given an email alert on the simulated web, open the email via the hyperlink.
   Response type: click the hyperlink on the screen. Assessed aspect: 1.1

Task 5. You received an email that tries to trick you into giving your password to the sender (a phishing email). Identify the suspicious feature in the sender address.
   Response type: type an explanation that identifies the suspicious domain name. Assessed aspect: 2.4

Task 6. Based on the email message in Task 5, identify the suspicious feature in the greeting.
   Response type: explanation (there is no personal identifier). Assessed aspect: 2.4

Task 7. Based on the email message in Task 5, identify the suspicious feature in the password reset URL link.
   Response type: explanation (the URL is not the correct one). Assessed aspect: 2.4

Task 8. Explain one problem that may result from making one's email address publicly available.
   Response type: explanation (unwanted email, loss of privacy). Assessed aspect: 2.4

Task 9. Given a document on a social networking site, use the sharing settings to give "can edit" access to a specified person.
   Response type: use the sharing function to add a new user and set the access right to "can edit". Assessed aspect: 1.1

* MC refers to multiple-choice objective-type questions.

In this test module, students were asked to design a poster to advertise the After-School Exercise program and attract fellow students to participate. The exercise program should take about 30 minutes and should be suitable for students over the age of 12. The students had to select a suitable exercise by visiting the simulated websites provided.


They should then design their poster in the simulated poster design environment, for which they had supposedly applied for an account in one of the earlier short tasks. Figure 1.15 shows screen captures of the poster design environment, the simulated website containing exercise information and graphics, and the instructions and assessment criteria for the poster to be constructed.

a. Poster design environment.

b. Website with graphics and text about three exercises.

c. Instructions and assessment criteria for the large task.

Figure 1.15 The poster design environment, information resources, instructions and assessment criteria provided to students for working on the large task

Each poster was scored according to nine criteria: appropriateness of the poster title; layout of images; formatting design of text; colour contrast of text; consistency of colour scheme used for text, graphics and background; editing of information text from the website; completeness of information; persuasiveness of the poster; and full use of the entire poster space. Five of the criteria had a maximum score of two, reflecting two levels of performance. Table 1.5 lists the performance expectation for each score point and the related CIL aspect being assessed.


Table 1.5 Summary of the performance expectation and associated CIL aspect for each score point criterion in the After-School Exercise large task

1. Title design
   1/2: A relevant title has been added and placed in a prominent position. (Aspect 2.2)
   2/2: A relevant title has been added and formatted to make its role clear. (Aspect 2.1)

2. Image layout
   1/1: One or more images are well aligned with the other elements on the page and appropriately sized. (Aspect 2.2)

3. Text layout & formatting
   1/2: Formatting tools have been used to some degree to show the role of the different text elements. (Aspect 2.2)
   2/2: Formatting tools have been used consistently throughout the poster to show the role of the different text elements. (Aspect 2.2)

4. Colour contrast
   1/2: The text mostly contrasts sufficiently with the background to support reading. (Aspect 2.2)
   2/2: There is sufficient contrast to enable all text to be seen and read easily. (Aspect 2.1)

5. Colour consistency
   1/1: The poster shows evidence of planning regarding the use of colour to denote the role of the text, background, and images in the poster. (Aspect 2.3)

6. Information adaptation
   1/2: Some useful information has been copied from the resources and edited to improve ease of comprehension and relevance. (Aspect 2.3)
   2/2: The relevant key points from the resources have been rephrased using the student's own words. (Aspect 2.3)

7. Information completeness
   1/2: Two of the three required pieces of information about the program (when, where, and what equipment is required) have been included in the poster. (Aspect 1.2)
   2/2: All required information about the program (when, where, and what equipment is required) has been included in the poster. (Aspect 1.2)

8. Persuasiveness
   1/1: Uses some emotive or persuasive language to make the program appealing to readers. (Aspect 2.1)

9. Use of full page
   1/1: Uses the full page when creating the poster. (Aspect 2.1)

Except for the two criteria connected with colour use, which were machine scored, the posters created by students were scored by locally trained expert scorers according to the scoring rubrics, as was the case with the open-ended short tasks.


1.6 Summary

This chapter has provided a historical overview of the development of the key concepts underlying CIL, both in an international context and in Hong Kong: computer literacy, information literacy, and computer and information literacy. The CIL framework used in the assessment design in ICILS is similar to the one used in the Hong Kong ILPA Study, which was developed based on the IL framework published by the US Educational Testing Service (ETS, 2003).

While there are several survey instruments designed to gather information that may reflect an individual's CIL, this chapter focused only on those that involve computer-based performance assessment of CIL. We expressly differentiated general computer-based performance assessments (e.g. the CBA in the PISA studies) from those with a particular focus on CIL (such as ICILS or ILPA). Further, based on the IL frameworks published by well-known library associations and the students' IL framework published by the EDB in 2005, we pointed out that IL should be viewed not only as a generic skill, but also as a skill that supports learning in different disciplinary areas. ICILS assesses CIL only as a generic skill, although it is conceivable that assessment of IL can be conducted both as a generic skill in general problem solving and in subject-specific learning contexts.

The ICILS test design placed stronger emphasis on Strand 2 aspects, which were assessed through a large task set within a real-life context familiar to students. Machine scoring was limited to close-ended short tasks requiring multiple-choice answers or test-environment operations, and to the scoring of constructed artefacts with reference to colour contrast. The majority of the tasks were scored by trained human experts based on set criteria. The quality of scoring was ensured through inter-coder reliability checks on double-scored samples.


Chapter 2 Hong Kong Students' Performance in ICILS

A total of 21 countries and educational systems participated in ICILS 2013. Besides Hong Kong, these include Australia, the City of Buenos Aires (Argentina), Chile, Croatia, the Czech Republic, Denmark, Germany, Korea, Lithuania, the Netherlands, Norway, Newfoundland and Labrador (Canada), Ontario (Canada), Poland, the Russian Federation, the Slovak Republic, Slovenia, Switzerland, Thailand, and Turkey. The student population for ICILS was defined as students in Grade 8, provided that the average age of the students was at least 13.5 at the time the assessment was carried out. If the average age of Grade 8 students was below 13.5, Grade 9 students would instead become the target population. Norway was the only country that had Grade 9 students tested in ICILS 2013. In Hong Kong, Secondary 2 was the target student population for the Study.

In order to provide the necessary background for understanding and interpreting Hong Kong students' performance in ICILS, this chapter begins with a brief introduction to the sampling method and criteria of ICILS. This is followed by an illustration of how the CIL scores were computed and linked to different levels of CIL proficiency. Next, the CIL scores and proficiency profiles of all participating systems are reported, together with an in-depth analysis of the strengths and weaknesses in Hong Kong students' CIL performance. For reference and comparison purposes, we also analyse and report the results of Australia and Korea, which have a similar level of economic and social development to Hong Kong.

2.1 Sampling Method and Participation Rates

As mentioned in Chapter 1, the target student population in ICILS is Grade 8, which is equivalent to Secondary 2 in Hong Kong. Student sampling was conducted in two stages. During the first stage of sampling, 150 schools [1] were randomly sampled using PPS procedures (probability proportional to size, as measured by the number of students enrolled in a school).

[1] The number of schools required in the sample to achieve the necessary statistical precision was estimated on the basis of national characteristics, and ranged from 138 to 318 across countries. Countries were asked to plan for a minimum sample of 150 schools, which was adopted as the sample size in Hong Kong.
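To make the PPS idea concrete, here is a generic sketch of the standard systematic PPS procedure (this is not the operational ICILS sampling program, and the enrolment figures are fabricated): lay the schools' enrolments end to end on a line, pick a random starting point within the first sampling interval, and select a school each time a fixed step lands inside its segment, so larger schools have proportionally higher selection probability.

```python
import random

# Generic systematic PPS (probability proportional to size) sampling sketch:
# a school's selection probability is proportional to its enrolment.
def pps_sample(enrolments, n):
    total = sum(enrolments)
    step = total / n                        # sampling interval on the enrolment line
    start = random.uniform(0, step)         # random start within the first interval
    points = [start + i * step for i in range(n)]

    chosen, cumulative, idx = [], 0, 0
    for school, size in enumerate(enrolments):
        cumulative += size                  # end of this school's segment
        while idx < len(points) and points[idx] < cumulative:
            chosen.append(school)           # a point fell inside this segment
            idx += 1
    return chosen

random.seed(1)
enrolments = [random.randint(50, 300) for _ in range(400)]  # 400 fake schools
sample = pps_sample(enrolments, 150)
print(len(sample))  # 150 selections; a school larger than the step could be hit twice
```

The systematic step also spreads the sample evenly across the school-size distribution, which is one reason operational surveys favour this variant of PPS.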


Twenty students were then randomly sampled from all students enrolled in the target grade in each sampled school. In schools with fewer than 20 students in Grade 8, all students were invited to participate. During the sampling process, schools with enrolment sizes closest to each sampled school were also identified to serve as replacements if the sampled school refused to participate. The participation rates required of each country were 85% of the selected schools and 85% of the selected students within the participating schools, or a weighted overall participation rate of 75%. In Hong Kong, 118 schools participated in the student performance assessment and survey, giving a weighted school participation rate of 77%. Within the participating schools, a total of 2,089 students took part in the online performance assessment, giving a weighted student participation rate of 89.1%. The weighted overall student participation rate was thus 68.6%, below the 75% required by IEA for the results to be considered as meeting the statistical precision required for international comparisons of student achievement. Hence, the Hong Kong results are listed in the international report under the category "countries not meeting sample requirements", alongside Denmark, the Netherlands and Switzerland.
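The overall rate reported above is simply the product of the two component rates, which can be checked directly:

```python
# Weighted overall participation rate = weighted school participation rate
# x weighted within-school student participation rate (Hong Kong figures
# as reported in the text).
school_rate = 0.77     # 118 participating schools, weighted
student_rate = 0.891   # 2,089 participating students, weighted

overall = school_rate * student_rate
print(round(overall, 3))  # 0.686, i.e. 68.6%, below the 75% requirement
```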

2.2 Computation of CIL Scores and Countries’  Performance The four test modules comprise a total of 62 discrete questions and tasks, with a total of 81 score points. As introduced in Section 1.5, only a limited set of tasks was machine scored, and the remainder were scored by trained expert scorers according to set criteria. Just under half of the score points were derived from the short tasks. The test modules did not give the same emphasis to each of the seven aspects in the assessment framework (described in section 1.4). Instead, about twice as many score points were assigned to Strand 2 (producing and exchanging information) as compared to Strand 1 (collecting and managing information). The first three aspects in Strand 2 were primarily assessed through the large tasks. The allocation of score points to the aspects and strands in the framework is presented in Table 2.1. The established practice in all IEA studies of student achievement is to use Rasch IRT (item response theory, Rasch, 1960) to construct the achievement scale with a mean of 500 and a standard deviation of 100. The same process was applied to compute the cognitive scale of CIL scores with   a   mean   of   500   and   a   standard   deviation   of   100,   based   on   students’   performance data on the 62 questions and tasks in the four modules and equally weighted national samples. As each student was randomly assigned one of the twelve possible combinations of two out of the four modules (the same two modules presented with a different order of appearance was considered as a different combination in order to control for the influence of module sequence on difficulty), plausible value


methodology was used to derive summary student achievement statistics and to estimate the uncertainty of the measurement (von Davier, Gonzalez, & Mislevy, 2009; Fraillon et al., 2015). Table 2.2 presents the CIL average scores for the 21 participating countries and systems. It also provides, for reference purposes, the respective ICT Development Index scores and student–computer ratios.

Table 2.1 Allocation of module score points to the ICILS assessment framework

CIL strand & aspect                                   % toward overall CIL score
Strand 1: Collecting and managing information
  1.1 Knowing about and understanding computer use    13%
  1.2 Accessing and evaluating information            15%
  1.3 Managing information                             5%
  Strand 1 subtotal                                   33%
Strand 2: Producing and exchanging information
  2.1 Transforming information                        17%
  2.2 Creating information                            37%
  2.3 Sharing information                              1%
  2.4 Using information safely and securely           12%
  Strand 2 subtotal                                   67%
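The module rotation described above, in which each student receives one of the ordered pairs of two distinct modules drawn from the four, together with the rescaling onto the mean-500/SD-100 reporting metric, can be sketched as follows. The module labels and the helper function are illustrative assumptions, not taken from the ICILS technical documentation:

```python
import itertools

# Hypothetical module labels; the real ICILS modules have their own names.
modules = ["M1", "M2", "M3", "M4"]

# Order matters (module sequence can affect difficulty), so the rotation
# uses ordered pairs of two distinct modules: 4 x 3 = 12 combinations.
rotations = list(itertools.permutations(modules, 2))
print(len(rotations))  # 12

def to_cil_metric(theta, pop_mean, pop_sd):
    """Linear transform of a latent ability estimate onto the IEA
    reporting metric with mean 500 and standard deviation 100."""
    return 500.0 + 100.0 * (theta - pop_mean) / pop_sd
```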

As shown in Table 2.2, the mean CIL score of Hong Kong students is 509, slightly above the international mean of 500. Since Hong Kong students' overall participation rate is below 75%, comparing their performance with that of other participating countries is statistically inappropriate. In addition, it can be seen that the CIL score of Hong Kong students is relatively low among the economically developed participating educational systems. As Hong Kong's ICT Development Index is ranked fifth among the 21 participating systems, this should not be a contributing factor to the relatively low CIL performance. Another factor worth noting in Table 2.2 is the standard error of the CIL scores. The standard error of the mean CIL score of Hong Kong students is the largest among all participating systems, reflecting large variations in CIL achievement among Hong Kong students. What were the factors that contributed to such a large diversity in the students' performance? To what extent did the students' family background and school- and teacher-level factors contribute to their CIL achievement? Answers to these questions are explored in the next two chapters.


Table 2.2 Country averages for CIL, years of schooling, average age, ICT Index, student–computer ratios and percentile graph

Educational systems                 Years of   Average   Average CIL    ICT Development Index    Student–computer
                                    schooling  age       score          score (country rank)*    ratio
Czech Republic                      8          14.3      553 (2.1) ▲    6.40 (34)                10 (0.3)
Australia                           8          14.0      542 (2.3) ▲    7.90 (11)                 3 (0.3)
Poland                              8          14.8      537 (2.4) ▲    6.31 (37)                10 (0.5)
Norway (Grade 9)                    9          14.8      537 (2.4) ▲    8.13 (6)                  2 (0.1)
Republic of Korea                   8          14.2      536 (2.7) ▲    8.57 (1)                 20 (2.3)
Germany                             8          14.5      523 (2.4) ▲    7.46 (19)                11 (0.8)
Slovak Republic                     8          14.3      517 (4.6) ▲    6.05 (43)                 9 (0.5)
Russian Federation                  8          15.2      516 (2.8) ▲    6.19 (40)                17 (1.0)
Croatia                             8          14.6      512 (2.9) ▲    6.31 (38)                26 (0.8)
Slovenia                            8          13.8      511 (2.2) ▲    6.76 (28)                15 (0.5)
Lithuania                           8          14.7      494 (3.6)      5.88 (44)                13 (0.7)
Chile                               8          14.2      487 (3.1) ▼    5.46 (51)                22 (4.7)
Thailand                            8          13.9      373 (4.7) ▼    3.54 (95)                14 (0.9)
Turkey                              8          14.1      361 (5.0) ▼    4.64 (69)                80 (16.0)

Educational systems not meeting sampling requirements
Denmark                             8          15.1      542 (3.5)      8.35 (4)                  4 (0.4)
Hong Kong (SAR)                     8          14.1      509 (7.4)      7.92 (10)                 8 (0.8)
Netherlands                         8          14.3      535 (4.7)      8.00 (7)                  5 (0.8)
Switzerland                         8          14.7      526 (4.6)      7.78 (13)                 7 (0.6)

Benchmarking participants
Newfoundland and Labrador, Canada   8          13.8      528 (2.8)      7.38 (20)                 6 (0.0)
Ontario, Canada                     8          13.8      547 (3.2)      7.38 (20)                 6 (0.3)

Benchmarking participants not meeting sampling requirements
City of Buenos Aires, Argentina     8          14.2      450 (8.6)      5.36 (53)                33 (9.4)

▲ Achievement significantly higher than ICILS average; ▼ achievement significantly lower than ICILS average.
( ) Standard errors appear in parentheses. Because results are rounded to the nearest whole number, some totals may appear inconsistent.
* ICT Development Index score and country rank, 2012 (Source: http://www.itu.int/ITU-D/ict/publications/idi/index.html [27/02/14])
[The percentile graph in the original table, showing the 5th, 25th, 75th and 95th percentiles of performance and the mean with its confidence interval on the CIL scale (100–700), is not reproduced here.]

2.3 CIL Proficiency Levels

Fraillon et al. (2014) analyzed the ICILS 2013 achievement data to determine whether the different strands and aspects in fact belonged to different dimensional scales. The analysis found an extremely high latent correlation (0.96) between students' achievement on Strands 1 and 2, indicating that it is appropriate to report CIL as a single achievement scale. A more descriptive CIL achievement scale was then developed to provide a more qualitatively meaningful understanding of the performance of students with different achievement scores. To establish such a scale, an "item map" was produced to rank the items (a unit of analysis that derives a score associated with a question or task) from the least to the most difficult. The item map and student achievement data were further analyzed to establish proficiency levels with a width of 85 scale points and level boundaries at 407, 492, 576, and 661 scale points. The scale is hierarchical in that a student located at a higher position on the scale will be able to successfully complete tasks up to that level of achievement. The scale is constructed so that a student scoring at the lower boundary of a level should have answered about 50% of the items in that level correctly, while a student scoring above the boundary should have answered more than 50% of the items at that level correctly. Table 2.3 describes what a typical student at each of the levels would be able to do in terms of Strand 1 and Strand 2 capabilities. At Level 1, students are able to use a basic range of software commands to access files and complete routine layout tasks and basic editing of text. They are also able to recognize potential risks associated with the misuse of computers by unauthorized users.
Students working at a level below Level 1 are likely to require support and guidance to be able to create digital information products. At Level 2, students are able to locate explicit information in given digital sources, and have basic abilities to select and add content to information products, with some rudimentary ability to follow layout conventions. They are also aware of mechanisms to protect personal information and of the risks that follow from public access to personal information. The key difference between Level 2 and the higher proficiency levels is in students' ability to work autonomously and critically when accessing and using information to create information products. At Level 3, students are able to search for, select and work with information appropriately to edit and create information products. They are aware of the potential bias, inaccuracy and unreliability of information sources, and are able to control the layout and design of their digital products.
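Given the published boundaries, mapping a CIL score to a proficiency level is a simple threshold lookup; a minimal sketch (the label strings are ours):

```python
import bisect

# Level boundaries from the scale construction described above.
BOUNDARIES = [407, 492, 576, 661]
LABELS = ["Below Level 1", "Level 1", "Level 2", "Level 3", "Level 4"]

def cil_level(score):
    """Map a CIL scale score to its proficiency level label."""
    return LABELS[bisect.bisect_right(BOUNDARIES, score)]

print(cil_level(509))  # Level 2 (the Hong Kong mean score)
```

A score exactly on a boundary falls into the higher level, since the boundary is the lower bound of that level.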


Table 2.3 Descriptions of students' typical performance capabilities at the different levels of the CIL scale

Level 4 (above 661)
Strand 1 (collecting & managing information). Able to: select the most relevant information to use for communicative purposes; evaluate the usefulness of information based on criteria associated with need; evaluate the reliability of information based on its content and probable origin.
Strand 2 (producing & exchanging information). Able to: create and adapt information products for a specific audience and communicative purpose; use appropriate software features to restructure and present information using appropriate conventions; show awareness of intellectual property issues in using information on the Internet.

Level 3 (576–660)
Strand 1. Able to: independently use computers as information gathering and management tools; select the most appropriate information source for a specified purpose; retrieve information from given electronic sources to answer concrete questions; recognize that the credibility of web-based information can be influenced by the identity, expertise and motives of the creators of the information.
Strand 2. Able to: use conventionally recognized software commands to edit and add content to, and reformat, information products.

Level 2 (492–575)
Strand 1. Able to: use computers to complete basic and explicit information gathering and management tasks; locate explicit information within given electronic sources.
Strand 2. Able to: make basic edits and add content to existing information products per specific instructions; create simple information products that show consistency of design and layout conventions; demonstrate awareness of mechanisms to protect personal information and of the consequences of public access to personal information.

Level 1 (407–491)
Strand 1. Able to: demonstrate a working knowledge of computers as tools; understand the consequences of computers being accessed by multiple users.
Strand 2. Able to: perform basic communication tasks and add simple content to information products using conventional software commands; demonstrate familiarity with basic layout conventions of electronic documents.


Compared to those at lower proficiency levels, students operating at Level 4 show greater precision in information search and selection, and more sophisticated control of layout and formatting to meet the specific communicative purpose of their digital products. They also demonstrate awareness of the commercial potential of information products and of issues related to intellectual property rights over electronically sourced information. Based on the above descriptions, it can be argued that students operating below Level 3 do not have adequate CIL proficiency to cope with the challenges of working with information in everyday life and work. Just as a student's proficiency in CIL is reflected by his/her scaled CIL score, the item difficulty of questions and tasks indicates the probability of an item being successfully answered by students: the higher the item difficulty, the lower the proportion of submitted answers that are successful. Table 2.4 describes the questions and tasks that students operating at each proficiency level were typically able to answer correctly at least 50% of the time. These questions and tasks provide a more vivid illustration of students' capabilities at each level. They also provide a helpful reference for students, teachers and other educators on the more challenging tasks that students need to be able to tackle in order to move to the next level of CIL proficiency.
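The relationship between a student's scale location and the probability of answering an item correctly follows from the Rasch model used in the scaling: P(correct) = 1 / (1 + e^(b − θ)), where θ is the student's ability and b the item's difficulty on a common logit scale (the mean-500/SD-100 reporting metric is a linear transform of that scale). At θ = b the probability is exactly 50%, which is the sense in which a student at a level boundary answers about half the items in that level correctly. A minimal sketch:

```python
import math

def rasch_p(theta, b):
    """Probability that a student of ability theta answers an item of
    difficulty b correctly under the Rasch model (logit scale)."""
    return 1.0 / (1.0 + math.exp(b - theta))

# At the item's own difficulty, the success probability is exactly 50%.
print(rasch_p(0.0, 0.0))  # 0.5

# Ability above item difficulty gives a success probability above 50%.
print(rasch_p(1.0, 0.0) > 0.5)  # True
```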

2.4 Hong Kong Students' CIL Proficiency Levels in an International Context

The CIL proficiency level of each ICILS student participant can be computed from his/her CIL score. To interpret the Hong Kong findings in an international context without burdening readers with too much information, we have selected two other participating countries that are economically developed and familiar to people in Hong Kong for the purpose of comparison. Figure 2.1 shows pie charts of the percentages of students performing at each of the four proficiency levels, and below Level 1, in Hong Kong, Australia and Korea. It can be seen that Level 2 is the proficiency level with the highest percentage of students in all three systems. In fact, even in the Czech Republic, the country with the highest average CIL score and the highest percentage of students operating above Level 2, that percentage was only 37%. If we take the view that Level 3 is the minimum CIL for adequate functioning in the information world around us, then even in the highest performing countries there is much to be done to promote students' CIL proficiency. In Hong Kong, the situation is even less promising, with only 3% achieving Level 4 and 23% achieving Level 3.


Table 2.4 Examples of ICILS test items mapped to the four CIL proficiency levels based on their item difficulty*

Level 4
Matched Strand 1 items (collecting & managing information): evaluate the reliability of information intended to promote a product on a commercial website; select, from a large set of results returned by a search engine, a result that meets the specified search criteria.
Matched Strand 2 items (producing & exchanging information): select from sources and adapt text for a presentation to suit a specified audience and purpose; use colour appropriately in a presentation suited to the communicative purpose; use text layout and formatting suitably for the different elements of an information poster; balance the layout of text and images in an information sheet; recognize the difference between legal, technical, and social requirements when using images on a website.

Level 3
Matched Strand 1 items: evaluate the reliability of information presented on a crowd-sourced website; select an appropriate website navigation structure for a given purpose.
Matched Strand 2 items: select relevant images from electronic sources to represent a three-stage process; use generic online mapping software to represent text information as a route map; select relevant information according to given criteria to include in a website; select and adapt some relevant information from given sources when creating a poster; demonstrate control of image layout when creating a poster; demonstrate control of colour and contrast to support the readability of poster content; demonstrate control of text layout when creating a presentation; be aware that a generic email greeting indicates that the sender does not know the recipient's identity.

Level 2
Matched Strand 1 items: navigate to a URL presented as plain text; locate simple information within a multi-page website; differentiate between paid and organic search results returned by a search engine.
Matched Strand 2 items: add contacts to a collaborative workspace; insert information into a specified cell in a spreadsheet; use formatting and location to denote the role of a title in an information sheet; use the full page when laying out a poster; demonstrate basic control of text layout and colour use when creating a presentation; explain a potential problem if a personal email address is publicly available.

Level 1
Matched Strand 1 items: open a link in a new browser tab; insert an image into a document.
Matched Strand 2 items: use software to crop an image; place a title prominently on a webpage; create a suitable title for a presentation; identify email recipients on a carbon copy list; name the risk(s) of not logging out after using a publicly accessible computer.

* A few items had scaled difficulties below Level 1. These involve only execution of the most basic skills, such as clicking on a hyperlink.

Figure 2.1 Percentages of Hong Kong, Australia and Korea students performing at each CIL proficiency level

An even more worrisome picture emerges if we look at the percentages of students assessed to perform below Level 1. In Hong Kong, 15% of students performed below Level 1; only four systems had a higher percentage: Turkey, Thailand, the City of Buenos Aires (Argentina) and Chile. Clearly, there is a serious mismatch between our students' CIL proficiency and Hong Kong's overall ICT Development Index ranking of 10th in the world and 5th among the 21 participating systems. Improving CIL proficiency is thus one of the critical educational priorities for Hong Kong if we are to have a workforce and a citizenry that are CIL proficient for life and work in a highly digitized world. In the next section, we explore students' strengths and weaknesses across the seven aspects within the two CIL strands at each proficiency level, in order to seek a better understanding of the possible trajectory of development as students progress up the levels.


2.5 Students' Performance across CIL Aspects

As explained in Section 1.4, the CIL assessment framework in this Study has two strands, which can be further differentiated into a total of seven aspects. Each item in the assessment was designed to assess one of the seven aspects. Hence, it is possible to compute the mean percentage correct for each student for the set of items in a particular aspect, and from these percentage scores to compute the mean performance of the entire population of students for each CIL aspect. Table 2.5 presents the percentage of correct answers and the relative difficulty of each aspect for students in Hong Kong, Australia and Korea. The results clearly show that while these three systems differ greatly in students' overall performance, they all find accessing and evaluating information and transforming information the most difficult, whereas items related to knowing about and understanding computer use and using information safely and securely were the easiest.

Table 2.5 The percentages correct (S.D.) and relative difficulty for the seven CIL aspects for Hong Kong, Australia and Korea

                                                    HK            AUS           KOR
CIL strand & aspect                                 % (S.D.)  D*  % (S.D.)  D*  % (S.D.)  D*
Strand 1: Collecting and managing information
  1.1 Knowing about and understanding computer use  71 (23)   7   76 (23)   7   74 (22)   7
  1.2 Accessing and evaluating information          38 (31)   2   42 (31)   1   40 (31)   1
  1.3 Managing information                          43 (43)   4   45 (43)   3   49 (41)   5
Strand 2: Producing and exchanging information
  2.1 Transforming information                      36 (25)   1   48 (26)   2   46 (27)   2
  2.2 Creating information                          53 (28)   5   59 (24)   5   56 (27)   4
  2.3 Sharing information                           39 (37)   3   49 (35)   4   44 (38)   3
  2.4 Using information safely and securely         58 (25)   6   61 (25)   6   62 (22)   6

D* is the relative difficulty of the aspect in comparison with the others; 1 is most difficult.
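The two-step computation described above (per-student mean percentage correct within an aspect, then the mean over students) can be sketched with hypothetical data; the student and item identifiers below are invented:

```python
# Hypothetical item scores per student, and an item-to-aspect mapping.
scores = {
    "s1": {"i1": 1, "i2": 0, "i3": 1},
    "s2": {"i1": 1, "i2": 1, "i3": 0},
}
aspect_of = {"i1": "1.1", "i2": "1.1", "i3": "1.2"}

def aspect_mean_pct(scores, aspect_of, aspect):
    """Mean over students of each student's percentage correct on the
    items belonging to the given aspect."""
    per_student = []
    for answers in scores.values():
        items = [v for k, v in answers.items() if aspect_of[k] == aspect]
        if items:
            per_student.append(100.0 * sum(items) / len(items))
    return sum(per_student) / len(per_student)

print(aspect_mean_pct(scores, aspect_of, "1.1"))  # 75.0
```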

In the remainder of this section, we report on the common errors and difficulties that Hong Kong students had in completing the tasks in each of the CIL aspects, through analysis of the students' answers to the tasks and questions in the released module, After-school Exercise.


2.5.1 Knowing about and understanding computer use (Aspect 1.1)

Items related to this aspect were found to be the least difficult by Hong Kong and the two reference countries, Australia and Korea. Test items for this aspect are mainly multiple-choice or action items in the short-tasks section of the student test modules. In the After-school Exercise module, tasks under this aspect required students to know how to navigate to another webpage when given different forms of instruction. Nearly all students (93%) could open a web document when given the hyperlink next to the instruction (Figure 2.2a), and 80% could do the same when the hyperlink was placed inside a pop-up bubble (Figure 2.2b). However, only 65% of the students were able to copy and paste a non-hyperlinked URL into the browser address bar to navigate to the designated page (Figure 2.2c).

Figure 2.2 The three tasks that test students' ability to navigate to a designated webpage using different forms of instruction: (a) hyperlink next to the instruction; (b) hyperlink placed inside a pop-up bubble; (c) non-hyperlinked URL that the student has to copy and paste into the browser address bar

2.5.2 Accessing and evaluating information (Aspect 1.2)

This was the second most difficult aspect for Hong Kong students and the most difficult for both Australian and Korean students. In the After-school Exercise module, this aspect was assessed within the large task, under the criterion of information completeness, which was applied to the text on the poster created by the students.


To get full credit of two score points, students had to include all three pieces of information related to the After-school Exercise programme: when the event will take place (both days and time), what people will do, and what equipment and/or clothing will be needed, as explicitly specified in the task criteria presented at the beginning of the large task (see Figure 1.15). Only 16% of Hong Kong students succeeded in putting all three pieces of information on their poster, which is very low compared to Australian (40%) and Korean (31%) students. A further 35% of Hong Kong students provided two of the three pieces of information to earn a partial credit of one point. The poster in Figure 2.3 is an example that received full credit of two score points on information completeness, while the one in Figure 2.4 received no credit at all, as it contained no relevant information matching what was required.

Figure 2.3 A poster designed by a Hong Kong student. (Full credit on information completeness)

Figure 2.4 Sample poster designed by a Hong Kong student (no credit on information completeness)

2.5.3 Managing information (Aspect 1.3)

Managing information was found to be of medium difficulty for students in Hong Kong as well as in Australia and Korea. On the other hand, this is the aspect with the largest standard deviation of the mean for Hong Kong, indicating a large diversity in students' ability to tackle such tasks. In After-school Exercise, this aspect was evaluated through a short task in which students were asked to grant a co-learner the right to edit an online document by changing its sharing setting. Figure 2.5 shows the screen layout for this task. Only 50% of Hong Kong students were able to change the setting correctly, compared to 72% of Australian and 66% of Korean students.


Figure 2.5 Screenshot of the short task in After-school Exercise that tests students' ability to change the sharing setting of a web document

2.5.4 Transforming information (Aspect 2.1)

As evident from Table 2.5, transforming information was found to be the most difficult CIL aspect by Hong Kong students, and the second most difficult by Australian and Korean students. In the After-school Exercise module, it was assessed via four separate criteria in the poster design task: title design, colour contrast, persuasiveness of the poster, and use of the full page.

Title design: The title design was evaluated against two performance expectations (see Table 1.5), one of which relates to this aspect: whether a relevant title has been added and formatted to make its role clear. This requires the title to be placed in a prominent position and formatted to make it distinctive, by capitalizing, using a larger font size, bolding, etc. The posters in Figures 2.3, 2.4 and 2.6 met this criterion. In Figure 2.7, even though the title "FENCING" was capitalized, bolded and underlined, it was not placed in a sufficiently prominent central position to signal that it was a title, and hence failed to meet this criterion. In summary, 49% of Hong Kong students submitted posters that met this criterion, compared to 64% of Australian and 50% of Korean students.


Figure 2.6 Sample poster designed by a Hong Kong student

Figure 2.7 Sample poster designed by a Hong Kong student

Colour contrast: Colour contrast was the second criterion associated with the CIL aspect transforming information, which essentially assesses whether there is sufficient contrast between the text and the background of the poster for easy reading. To meet this criterion, students need some idea of colour schemes and how these can be used to direct readers' attention to important details on the poster. This criterion was machine scored. The colour contrast of the posters in Figures 2.3, 2.4 and 2.6 was considered adequate to earn this score point, even though the contrast differs across these three posters. In comparison, the colour contrast of the poster in Figure 2.7 was poor. Only 11% of Hong Kong students submitted posters that met the criterion for good colour contrast, while the respective percentages for Australia and Korea were 25% and 16%.

Persuasiveness of poster: This criterion considers whether the poster includes persuasive or emotive text (which can be placed anywhere on the poster) to attract people to participate in the programme. The posters in Figures 2.3, 2.6 and 2.7 met this criterion. The relevant texts are:

Figure 2.3 (within the main text): "Finding yourself performed not to well in the PE examination? Want to increase your performance in the PE examination? Want to keep fit? Then join this Marvellous Programme!"

Figure 2.6 (in the title): 30 分鐘運動好, 健康生活齊做到 (translation: 30 minutes of exercise is good; we can all live a healthy life)

Figure 2.7 (at the bottom of the poster): "Isn't it interesting? Come to join us!"


Only 10% of Hong Kong students received the score point for the persuasiveness criterion, compared to 38% of Australian and 60% of Korean students.

Use of full page: This is a relatively simple criterion, assessed by checking whether the students made use of the full area of the poster to display the information. Half of the students in Hong Kong (50%) met this criterion, while in Australia and Korea the corresponding figures were 61% and 57% respectively. Figures 2.3 and 2.4 did not receive any credit in this area. Overall, even though transforming information is also a relatively difficult CIL aspect for Australia and Korea, Hong Kong students achieved at a much lower level than their Australian and Korean counterparts.
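The ICILS report does not publish the algorithm behind the machine scoring of colour contrast. As an illustration only, one standard way to quantify the contrast between text and background colours is the WCAG contrast ratio, sketched here:

```python
def rel_luminance(rgb):
    """WCAG relative luminance for an (r, g, b) tuple with 0-255 channels."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    l1, l2 = sorted((rel_luminance(fg), rel_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0 for black on white
```

By the WCAG convention, a ratio of at least 4.5:1 is considered sufficient for normal body text.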

2.5.5 Creating information (Aspect 2.2)

Creating information was a comparatively well-answered CIL aspect for Hong Kong students. It was assessed through four of the poster design criteria: title design, image layout, text layout and formatting, and colour contrast. As can be seen from Table 1.5, two of these criteria, title design and colour contrast, also apply to the transforming information aspect. In both cases, the level of performance required for the creating information criterion is lower than that for transforming information, as explained below.

Title design: A poster satisfies the creating information criterion if a relevant title has been added and placed in a prominent position. Satisfying this criterion is thus a necessary prerequisite for the title to be further considered against the criterion for transforming information. The titles of the posters in Figures 2.3, 2.4 and 2.6 all satisfy the creating information criterion, while the title in Figure 2.7 does not. The percentages of students meeting this criterion in Hong Kong, Australia and Korea were 69%, 80% and 71% respectively.

Image layout: This criterion requires that one or more images be well aligned with the other elements on the poster and appropriately sized. The posters in Figures 2.4 and 2.7 contain no images and so do not satisfy this criterion. The images in Figure 2.3 are well laid out and aligned with the meaning of the text, and the poster gained the full score for this criterion. In comparison, the images in Figure 2.6 overlap with the main text, and the poster did not satisfy this criterion. Only 32% of Hong Kong students submitted posters satisfying this criterion, while the corresponding figures for Australian and Korean students were 50% and 49% respectively.


Text layout and formatting: This criterion concerns whether the students used formatting tools to show the roles of different text elements on the poster. The assessment differentiates between two levels of performance. At the lower level, a student needs to demonstrate basic control of text layout on the poster. The higher level requires that a consistent layout format be used to show the roles of different text elements. Hence, the poster in Figure 2.4 received only one score point (the lower level of performance), as its formatting was relatively basic, while the poster in Figure 2.3 demonstrated the use of different formatting features for different text elements and received the full score of two points. Only 11% of Hong Kong students received the full score on this criterion, compared to 19% of Australian and 27% of Korean students. Another 41% of Hong Kong students performed at the lower level for this criterion, while the corresponding figures for Australia and Korea were both 46%.

2.5.6 Sharing information (Aspect 2.3)

In the After-school Exercise module, sharing information was assessed through one short task and two poster design assessment criteria, colour consistency and information adaptation. For the short task, students were assessed simply on whether they could identify who received an email via carbon copy. This item was relatively well answered by Hong Kong students, with 69% correct, while the figures for Australia and Korea were 80% and 57% respectively.

Colour consistency: This criterion evaluates whether a poster shows evidence of planning in the use of colour to denote the roles of text, background and images. The default setting of the poster creation software was such that if students did not change at least the background colour or the text colour, the text would be very difficult to read because of poor contrast. The scoring follows a very simple principle: colour should be used intentionally to suit the purpose of the text within the poster, and not just randomly or decoratively. It is not necessary to use multiple colours to receive credit, as readability is the primary concern. All four posters shown in Figures 2.3, 2.4, 2.6 and 2.7 received the full score for this criterion. A total of 76% of Hong Kong students met this criterion, compared to 67% of Australian and 76% of Korean students.

Information adaptation: This criterion assesses the extent to which useful information has been identified and suitably presented in the student's own words on the poster. No credit would be awarded if chunks of text were directly copied from the given resources and pasted with no or very minimal editing. One score point would be awarded if some editing were done to improve ease of comprehension and relevance. Full score (two points) would be awarded if the student rephrased


all the key points on the poster using his/her own words. Information adaptation is the weakest area for Hong Kong students, with only 8% gaining at least one score point (i.e. having done some level of editing), and a dismal 1% scoring two points for substantial rephrasing. This stands in stark contrast with Australia, where the corresponding figures were 52% and 13%, and Korea, with 63% and 33% respectively. None of the four posters shown earlier scored even one point on this criterion. Figure 2.3 does not provide any detail about either of the sports listed on the poster, while on the other three posters (Figures 2.4, 2.6 and 2.7), all the text about the sports was copied directly from the given resources and pasted without editing.

2.5.7 Using information safely and securely (Aspect 2.4)

This CIL aspect was assessed through five short tasks in the After-school Exercise module. The tasks and students' responses are described below.

Figure 2.8 The short task that assessed students' awareness of risks in making personal information public

Risks in revealing personal information: In short task 3, students were asked which of four kinds of personal information would be the most risky to make public in one's profile when opening an account on a public website: name, gender, home address and nationality (see Figure 2.8). Hong Kong students performed excellently on this item, with 96% able to choose "home address" as the correct answer. This compares well with the performance of Australian and Korean students, at 95% and 90% respectively.

Detecting fraudulent (phishing) emails: Short tasks 5, 6 and 7 all referred to the email in Figure 2.9, and asked the student to point out why each of the three elements A, B and C (the latter denoting two URLs, C1 and C2) suggests that this is a phishing email. Samples of Hong Kong students' answers are presented in Table 2.6.


Table 2.6 Sample responses from Hong Kong students to the three tasks related to the phishing email

Task 5: suspicious sender email
  Responses scoring "0" can only say that the email is suspicious, e.g.:
    - I don't know the sender
    - Fake email
    - Not an official website
  Responses scoring "1" can explain that the email did not originate from Webdocs, e.g.:
    - Email is not sent by Webdocs
    - This is not a commercial/professional email
    - There is the word "freemail". Security would not use this word

Task 6: suspicious greeting
  Responses scoring "0" are unclear answers, such as:
    - Should not use "Dear"
    - There should not be a yellow highlight
    - My name is not WEBDOCS
  Responses scoring "1" can clearly state that the greeting does not have the name of the client, e.g.:
    - If real, should use name: XXX
    - Should use customer's name

Task 7: suspicious URL
  Responses scoring "0" are unclear answers, such as:
    - No www
    - No ".com" and no ".hk"
    - Has "reset"
  Responses scoring "1" can identify the problem, e.g.:
    - Not sure that the reset password website belongs to the Webdocs company
    - The URL links to freeweb
    - They are different websites

A is the email sender's address. The student should point out that the domain name of the purported sender's address does not correspond to the purported sending organization's domain. 24% of the Hong Kong students answered this item correctly, compared to 19% and 21% of Australian and Korean students respectively.

B is the greeting to the person receiving the email. If this email actually came from the purported organization, it would include the name of the receiver, so the greeting should be personal. In Hong Kong, only 24% of the students were able to point out that the greeting in B was not a personal one. The percentage of Korean students who answered this question correctly was similar, at 27%, while Australian students performed much better, with 60% correct.


Figure 2.9 The phishing email with three suspicious elements, A, B and C

The third short task related to this phishing email asked students to identify why the two URLs identified by C may be suspicious; they should point out that the explicit URL address C1 is not the same as the URL of the actual hyperlink as indicated by C2. Hong Kong students' performance on this item was similarly poor as on the previous two, with only 24% correct. On the other hand, this item proved to be the most difficult of the three on this phishing email for Australian students, with only 15% correct, while the performance of Korean students was similar to that of Hong Kong, at 23%.

Problem of making one's email address public: The final short task in this module assessing the CIL aspect "using information safely and securely" asked students to name one problem that may arise if one's email address is made public. Any response indicating either that this may result in getting unwanted email such as spam and advertising email, or that this may lead to loss of privacy such as being contacted by strangers or stalkers, would be acceptable. However, some students mistakenly thought that this might lead to one's account being hacked, or one's information being stolen. Samples of students' incorrect and correct responses are presented in Table 2.7.


Table 2.7 Samples of Hong Kong students' responses to the short task on problems that may arise by making one's email address public

Responses scoring "0"

Responses  scoring  “1”

This item was of medium difficulty for Hong Kong students, with 45% of responses scored as correct, while Australian and Korean students found it relatively easy, with 62% and 78% correct respectively.
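The reasoning expected in the two hardest phishing tasks (elements A and C in Figure 2.9) amounts to two simple domain comparisons, which can be sketched in code. The email addresses and URLs below are hypothetical stand-ins, not the ones used in the actual instrument.

```python
# Illustrative sketch of the two domain-mismatch checks behind tasks 5 and 7:
# (A) the sender's domain does not match the purported organization's domain;
# (C) the URL displayed as link text differs from the hyperlink's real target.
from urllib.parse import urlparse

def sender_domain_mismatch(sender_email: str, org_domain: str) -> bool:
    """Task 5 logic: sender's domain differs from the organization's domain."""
    domain = sender_email.rsplit("@", 1)[-1].lower()
    return domain != org_domain.lower()

def link_text_mismatch(displayed_url: str, actual_href: str) -> bool:
    """Task 7 logic: the URL shown as link text differs from the real target."""
    shown = urlparse(displayed_url).netloc or displayed_url
    target = urlparse(actual_href).netloc or actual_href
    return shown.lower() != target.lower()

# Hypothetical phishing indicators mirroring elements A and C:
print(sender_domain_mismatch("security@freemail.example", "webdocs.example"))  # True
print(link_text_mismatch("http://webdocs.example/reset",
                         "http://freeweb.example/reset"))  # True
```

Both checks reduce to comparing two strings, which is exactly the comparison students were expected to articulate in their written explanations.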

2.5.8 Students' performance across CIL aspects and proficiency levels

As mentioned in section 2.4, all of the items in the CIL assessment instruments were scaled and mapped onto four CIL proficiency levels. Tables 2.8 and 2.9 summarize the performance of students from Hong Kong, Australia and Korea for each of the items, mapped onto the CIL aspects and levels. In general, fewer students are able to answer satisfactorily those items mapped at a higher CIL level. Overall, it can be seen that while the performance tasks assess students' CIL, language competence also matters, particularly for items that require an open-ended written response. Hong Kong students tend to be weak at tasks that require the use of language to express their ideas clearly in their own words, such as information adaptation and questions that require explanation.


Table 2.8 Percentage of students who have correctly answered each short task in the After-school Exercise module (mapped to CIL assessment aspect and level)

1.1 Knowing about and understanding computer use
  Task 2: Navigate to a URL given as plain text (Level 2): HK 65, AUS 66, KOR 63
  Task 4: Open webmail from hyperlink in pop-up bubble (Level 1): HK 80, AUS 85, KOR 91
  Task 9: Access a document from hyperlink in email message (Level 1): HK 93, AUS 96, KOR 97
1.3 Managing information
  Task 10: Modify the sharing settings of a collaborative document (Level 2): HK 50, AUS 72, KOR 66
2.3 Sharing information
  Task 1: Identify who received an email by carbon copy (Level 1): HK 69, AUS 80, KOR 57
2.4 Using information safely and securely
  Task 3: Identify personal information risky to set as public in profile (Level 2): HK 96, AUS 95, KOR 90
  Task 5: Identify sender email & organization domain mismatch as suspicious (Level 4): HK 24, AUS 19, KOR 21
  Task 6: Identify a generic email greeting to be suspicious (Level 3): HK 24, AUS 60, KOR 27
  Task 7: Identify difference in URL displayed & the hyperlink address as suspicious (Level 4): HK 24, AUS 15, KOR 23
  Task 8: Explain potential problem in making personal email address public (Level 2): HK 45, AUS 62, KOR 78

Another observation is that while the difficulty of completing a task satisfactorily depends on the specific CIL aspect it assesses, the format and context of the task as well as the performance level required also matter. Hence the students' performance in the various tasks and CIL aspects in the After-school Exercise may not necessarily provide a good reflection of the students' CIL achievement. We can obtain a better estimate of the students' performance by making use of the performance data collected from all four assessment modules. This is reported in the next section.


Table 2.9 Percentage of students who have achieved partial or full score for each assessment criterion (mapped to CIL aspect and level) in the poster design large-task

1. Title design
  Score 1/2 (2.2 Creating information, Level 2): HK 69, AUS 80, KOR 71
  Score 2/2 (2.1 Transforming information, Level 2): HK 49, AUS 64, KOR 50
2. Image layout
  Score 1/1 (2.2 Creating information, Level 3): HK 32, AUS 50, KOR 49
3. Text layout & formatting
  Score 1/2 (2.2 Creating information, Level 2): HK 52, AUS 65, KOR 73
  Score 2/2 (2.2 Creating information, Level 4): HK 11, AUS 19, KOR 27
4. Color contrast
  Score 1/2 (2.2 Creating information, Level 1): HK 66, AUS 76, KOR 73
  Score 2/2 (2.1 Transforming information, Level 3): HK 11, AUS 25, KOR 16
5. Color consistency
  Score 1/1 (2.3 Sharing information, Level 1): HK 76, AUS 67, KOR 76
6. Information adaptation
  Score 1/2 (2.3 Sharing information, Level 3): HK 8, AUS 52, KOR 63
  Score 2/2 (2.3 Sharing information, Level 4): HK 1, AUS 13, KOR 33
7. Information completeness
  Score 1/2 (1.2 Accessing & evaluating information, Level 2): HK 51, AUS 65, KOR 63
  Score 2/2 (1.2 Accessing & evaluating information, Level 3): HK 16, AUS 40, KOR 31
8. Persuasiveness
  Score 1/1 (2.1 Transforming information, Level 3): HK 10, AUS 38, KOR 60
9. Use of full page
  Score 1/1 (2.1 Transforming information, Level 2): HK 50, AUS 61, KOR 57

2.6 Students' CIL proficiency trajectories

From a pedagogical point of view, it would be useful to know the strengths and weaknesses of a student's CIL beyond his/her performance at an item level. In this section, we explore the relative strengths and weaknesses of students' performance in each of the seven aspects, and seek to understand how students at various proficiency levels differ in terms of their achievements in each aspect. Such knowledge provides some initial information about students' trajectory of improvement across the CIL aspects, which may be used to guide teachers in designing programs of CIL learning for students. We also compare such "trajectories" for Hong Kong, Australia and Korea to examine whether they are generally applicable or actually dependent on the educational and cultural context of individual countries. Figure 2.10 is a graphic representation of the percentages correct for each of the seven CIL aspects for Hong Kong, Australian and Korean students presented in Table 2.5.


Figure 2.10 A radar diagram of the mean percentages correct per CIL aspect

The key areas of concern in terms of low percentages correct (below 40%) are aspects 1.2 (accessing and evaluating information), 2.1 (transforming information) and 2.3 (sharing information). It can also be seen that while the two comparison countries are both high-performing systems in ICILS, Hong Kong's achievement in some of the aspects is not too far behind. The key aspects in which Hong Kong students lag behind are 2.1 (transforming information), 2.3 (sharing information) and 1.3 (managing information). In the remainder of this section, we explore how the profile of competence differs across the five groups of students, from Below Level 1 to Level 4. These profiles help us understand how students' CIL competence develops from the lowest to the highest level.


2.6.1 Hong Kong students' improvements in competency profile as they move to higher CIL proficiency levels

Figure 2.11 is the graphic representation of the profiles for Hong Kong students at each of the five CIL proficiency levels. The group of students at Below Level 1 (comprising 15% of the Hong Kong student population) can be considered the least CIL "educated" group of students. It can be seen that even for this group, CIL aspects 1.1 and 2.4 are relatively well developed.

Figure 2.11 A radar diagram showing the mean percentages correct per CIL aspect for Hong Kong students at each of the five CIL proficiency levels

Since Hong Kong students have extremely high computer (98%) and Internet (99%) access at home, and there is pervasive use of Internet technology in the society at large for social communication and entertainment, the relatively high achievement in these two aspects of CIL competence is probably unrelated to students' educational experiences in schools. Comparing the CIL competency profile of this group with that of the Level 1 group, we can see that there is a marked increase in competency for the latter in all seven CIL aspects.


As Fraillon et al. (2014) mentioned, students performing below Level 1 were in fact outside of the competence range that the ICILS assessment was designed for. Clearly, the fact that such a large proportion of students in Hong Kong still operate at such a low level is very worrying. Further, there is a need to find ways to help this group of students comprehensively in all seven CIL aspects in order to reach at least Level 1 proficiency.

The developmental picture becomes rather different when we compare the competency profiles beyond Level 1. From Level 1 to Level 4, the improvements in aspects 1.1 (knowing about and understanding computer use) and 2.4 (using information safely and securely) were much smaller between any two adjacent levels. In two of the other five aspects, 1.2 (accessing and evaluating information) and 2.1 (transforming information), there continued to be marked improvements in students' competence going up the proficiency levels. However, for the remaining three CIL aspects, 1.3 (managing information), 2.2 (creating information) and 2.3 (sharing information), there was close to no improvement between Level 3 and Level 4. This means that our highest performing students were not able to advance further in competence in these three CIL aspects. This is particularly worrying for sharing information (aspect 2.3) and managing information (aspect 1.3), for which the means for Level 4 students were below 60% and below 70% respectively.

2.6.2 Australian and Korean students' improvements in competency profile as they move to higher CIL proficiency levels

It is useful to find out whether other countries show similar developmental patterns in CIL competency as students advance in their CIL proficiency levels. Figures 2.12 and 2.13 show the corresponding developmental profiles for Australian and Korean students respectively. Comparing Figures 2.11 to 2.13, there is a clear resemblance in profile across the three systems among students at Below Level 1 proficiency, and the improvement from Below Level 1 to Level 1 is similarly marked in all seven aspects. On the other hand, Australia appears to have the most balanced improvement in all seven CIL aspects, such that its students at Level 4 achieve a minimum mean percentage correct of 70% for the lowest performing aspect, sharing information (aspect 2.3); and except for managing information (aspect 1.3), this group of highest performing students achieves a mean of at least 80% correct in all the other five aspects.


Figure 2.12 A radar diagram showing the mean percentages correct per CIL aspect for Australian students at each of the five CIL proficiency levels

Korean students also show a more balanced improvement profile than Hong Kong students, though less so than Australian students. Korean students at proficiency Level 4 achieved a minimum mean percentage correct above 70% in all seven aspects.


Figure 2.13 A radar diagram showing the mean percentages correct per CIL aspect for Korean students at each of the five CIL proficiency levels

2.7 Summary

Results from the ICILS 2013 study show that Hong Kong students had lower achievement in computer and information literacy, as measured by the overall CIL score, than all the economically developed countries participating in the study. Based on the CIL proficiency levels defined by the Study, 15% of Hong Kong students did not even reach Level 1 proficiency, the lowest level that the Study was designed to assess. In fact, based on the expected competencies at the different levels of proficiency (see Table 2.3), a person with CIL proficiency below Level 3 is seriously handicapped in his/her ability to cope with the everyday demands of the information world around us. Only 26% of the students in Hong Kong achieved Level 3 or above, while the corresponding percentages were 34% and 35% for Australia and Korea respectively. There is clearly a serious mismatch between the students' CIL proficiency and the role played by computer and information technology in all aspects of social, economic and political development in Hong Kong.


A more refined analysis reveals important differences in students' achievement across the seven CIL aspects in the assessment framework. Hong Kong students performed worst in accessing and evaluating information, transforming information and sharing information, with a mean lower than 40% correct in each of these three aspects, and in managing information (43% correct). By comparing students' competence profiles across the five proficiency levels from Below Level 1 to Level 4, it was found that Hong Kong students did not show similar advancement in all seven CIL aspects, whereas their Australian and Korean counterparts showed much more balanced improvement across all seven. In particular, there was no significant advancement between Levels 3 and 4 in two of the lowest performing aspects, managing information and sharing information. This indicates that even for the highest performing students in Hong Kong, their learning experiences within and outside of school did not help them improve in critical CIL aspects. Clearly, these are issues of serious concern. In chapters 3 and 4, we further explore the school-level conditions and personal background characteristics that influence students' CIL achievement.


Chapter 3 Influence of Students' Background and ICT Use Experience on their CIL

Students' CIL achievement is influenced by hierarchically embedded contextual factors at multiple levels, namely the personal level (e.g. ability, interest), family level (e.g. socioeconomic status (SES) factors such as parental occupation and education), school level (e.g. teachers' qualifications) and system level (e.g. curriculum). The factors at each level can be further categorized into antecedents or processes. Process factors are those that directly influence students' learning in CIL, such as their opportunities to use ICT at home and in school, and their teachers' pedagogical ICT competences. Antecedent factors are those that influence the process factors. Examples of antecedents include students' family SES, the school's student intake, and provisions for teachers' professional development. Data pertaining to antecedent and process factors were collected through two approaches. The first was through the surveys of students, teachers, principals and ICT coordinators. The second was through a national context survey completed by each national research centre. These survey data allow us to investigate how factors at different levels may affect students' CIL achievement.

3.1 Contextual Factors Derived from the Student Survey

Figure 3.1 depicts the conceptual framework adopted in ICILS 2013, based on which the context questionnaires were designed. This chapter reports on the findings related to the contextual data gathered through the student survey. All participating students, after completing the one-hour CIL performance assessment, were invited to take part in a 20-minute online survey. This survey was designed to collect information on the students' personal and family background, as well as their use of and engagement with ICT at home and in school.


Figure 3.1 Contextual factors influencing students' CIL outcomes (Fraillon et al., 2014, p. 37)

Table 3.1 lists the set of 13 variables derived from the student survey. Six of these are antecedents (S_SEX, S_ISCED, S_NISB, S_BASEFF, S_ADVEFF, S_INTRST), while the remaining seven can be considered process indicators (S_TSKLRN, S_USEAPP, S_USELRN, S_USESTD, S_USEREC, S_USECOM and S_USEINF). Conceptually one may differentiate between antecedent and process factors, but in practice they mutually influence each other. For example, more opportunities to learn to use ICT may help to develop students' stronger interest and self-efficacy. In this chapter, we first report the basic descriptive statistics of these context variables and how they relate to the students' CIL scores. This is followed by multilevel analysis to explore how these factors together influence a student's overall CIL score and the standardized scores for each of the seven CIL aspects.

Data collected through the student survey reflect the students' experiences, and the classroom and school level conditions as perceived by the students. It is important to note that variables derived from the student survey may also reflect conditions at other levels. For example, students' reported ICT usage in learning at school may be considered a reflection of their teachers' pedagogical use of ICT. However, for simplicity, we initially treat all these variables/factors as student level factors. Also, we have limited our cross-national comparisons to only two countries, Australia and Korea, so as to provide a sharper focus on systems that are of stronger interest and familiarity to readers in Hong Kong. The entire set of results from all 21 participating countries can be found in Preparing for Life in a Digital Age: The IEA International Computer and Information Literacy Study International Report (Fraillon et al., 2014).
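The logic behind a multilevel analysis, partitioning the variation in CIL scores into a between-school component and a within-school component, can be illustrated with a minimal sketch. The school groupings and scores below are invented for illustration; they are not ICILS data.

```python
# Minimal illustration of the variance partition underlying multilevel analysis:
# total variance = between-school variance + within-school variance, and the
# intraclass correlation (ICC) is the between-school share. Scores are invented.
from statistics import mean, pvariance

schools = {
    "A": [480, 510, 495],
    "B": [540, 560, 550],
    "C": [500, 520, 515],
}

all_scores = [s for scores in schools.values() for s in scores]
grand_mean = mean(all_scores)
school_means = {k: mean(v) for k, v in schools.items()}

# Between-school variance: squared deviation of each school mean from the
# grand mean, weighted by school size.
between = sum(len(v) * (school_means[k] - grand_mean) ** 2
              for k, v in schools.items()) / len(all_scores)

# Within-school variance: average squared deviation from each school's own mean.
within = sum((s - school_means[k]) ** 2
             for k, v in schools.items() for s in v) / len(all_scores)

icc = between / (between + within)
print(round(icc, 3))  # share of score variation attributable to schools
```

A high ICC, as in this toy example, indicates that school membership accounts for much of the score variation, which is precisely the situation in which a multilevel model is preferable to pooling all students into a single regression.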


Table 3.1 List of key context variables* derived from the student questionnaire

S_SEX: Gender of the student
S_ISCED: Students' own expected highest level of education reached
S_NISB: National index of students' socioeconomic background
S_BASEFF: Self-efficacy in basic ICT skills (e.g. edit documents/photos, filing)
S_ADVEFF: Self-efficacy in advanced ICT skills (e.g. handle viruses, create macro)
S_INTRST: Interest and enjoyment in using ICT
S_TSKLRN: Learning of CIL-related tasks at school (e.g. accessing, organizing and presenting information)
S_USEAPP: Use of work-oriented ICT applications (incl. Office suite, programming, graphics and academic software)
S_USELRN: Use of ICT during lessons at school in major, non-ICT school subjects
S_USESTD: Use of ICT for study purposes (e.g. prepare assignments, collaborate)
S_USEREC: Use of ICT for recreation (e.g. music, news, video, reading reviews)
S_USECOM: Use of ICT for social communication (incl. texting & social media)
S_USEINF: Use of ICT for exchanging information (e.g. posting on forums, blogs)

* Note: All except the first two variables are scaled indices derived from multiple item responses in the survey. For details of the construction of these scales, see Fraillon et al. (2015), pp. 187-197.

3.2 Influence of Students' Personal and Family Background

Two questions in the student survey collected personal background variables: the student's gender, and the highest level of education the student expected to reach. Four kinds of family background variables were elicited by the survey: whether the student has recent immigrant status, the language spoken at home relative to the language used in the CIL assessment, socioeconomic status (SES), and the availability of ICT resources at home. Findings related to these variables are reported in this section.


3.2.1 Gender and CIL achievement

In past IEA studies, gender differences in reading literacy were mostly in favour of female students, while achievements in science and mathematics were more commonly in favour of male students, though such trends are changing and vary across countries. The results of ICILS 2013 show that girls performed better than boys in all participating countries, and the differences were significant in all but four countries. Table 3.2 presents the gender differences in the mean CIL score for Hong Kong, Australia and Korea against the international average. In Hong Kong, female students achieved an average of 25 score points higher than their male counterparts. This difference is larger than the international average of 18 points, but similar to that found in Australia.

Table 3.2 Gender differences in CIL (standard errors in parentheses)

Educational system    Males mean scale score   Females mean scale score   Difference (Females - Males)
Hong Kong (SAR)       498 (9.2)                523 (7.5)                  25 (8.3)
ICILS 2013 average    491 (1.0)                509 (1.0)                  18 (1.0)
Australia             529 (3.3)                554 (2.8)                  24 (4.0)
Republic of Korea     517 (3.7)                556 (3.1)                  38 (4.1)

* Statistically significant (p
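A quick way to see why these differences are flagged as significant is to compare each reported difference to its standard error: when the ratio exceeds roughly 1.96, the difference is significant at the 5% level under a two-tailed test assuming approximate normality. The sketch below applies this check to the Hong Kong figures from Table 3.2; the 1.96 cutoff is the standard normal critical value, not something reported in the table.

```python
# Rough significance check for a reported group difference: a difference more
# than about 1.96 standard errors from zero is significant at the 5% level
# (two-tailed, assuming the estimate is approximately normally distributed).
def z_statistic(difference: float, standard_error: float) -> float:
    return difference / standard_error

# Hong Kong gender gap from Table 3.2: 25 score points with SE 8.3.
z = z_statistic(25, 8.3)
print(round(z, 2), z > 1.96)  # 3.01 True
```

The same check applied to the other rows of Table 3.2 (e.g. 38/4.1 for Korea) shows all the listed differences comfortably exceeding the cutoff.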