Understanding Task Features

Performance tasks in ATLAS include two types of features to help you select the task that best meets your needs: attributes and highlights.

Every science performance task makes different trade-offs in terms of what features to highlight or foreground. These features may make some tasks more useful to you than others, depending on when, how, and for what purpose you are using them. The attributes and highlights shown in ATLAS were chosen based on teachers’ insights into what features were most important to them when deciding which tasks to use.

Example of how features appear on a task in ATLAS:

Attributes: Accessibility, Agency, Communication, Metacognition

Highlights: Includes scoring guidance, Authentic phenomena, Designed with teachers

How are attributes and tags evaluated and assigned?

All tasks were independently evaluated for attributes and highlights by a combination of science performance assessment experts and practicing science teachers. Each task attribute and highlight includes a set of look-fors, or indicators, drawn from research and practical implementation. Reviewers considered whether each task met each indicator. For attributes, indicators are combined to produce a “low-medium-high” rating for every task. For highlights, reviewers tagged the tasks with the individual highlights that best reflect the particular strengths and features of a given task. Over time, ATLAS will continue to refine the attribute and highlight tags to ensure that they match teachers’ experiences using these tasks.

Is a task with more highlights or higher levels of attributes better than other tasks?

No. Neither attribute levels nor highlights indicate task quality; they simply provide at-a-glance insight into task characteristics that can help teachers decide whether a given task will work well in their classroom.

How should I think about tags and attributes when I’m selecting a task?

Tags and attributes can help you figure out

  1. which tasks match your current instruction and needs,
  2. which tasks might complement your current instruction, and
  3. how much time and modification might be needed to make a task work for you.

Over time, you can reflect on whether certain tags consistently work better in your classroom, and what that might mean for your current and future instructional goals.



Attributes

Every task is evaluated on indicators for four attributes:

Accessibility

To what degree do tasks...

  • have student-facing materials that meet standards for digital and document accessibility?
  • include reading loads appropriate for a wide range of students, given the purpose of the task?
  • exhibit key features of Universal Design for Learning and Assessment?

Agency

To what degree do tasks...

  • provide students with opportunities to make developmentally appropriate, active, and consequential choices and decisions within the performance task?
  • engage students in cycles of goal-setting, purposeful action, and reflection on meeting those goals?
  • invite students to participate in directing their own learning and choices?

Communication

To what degree do tasks...

  • invite students to carefully interpret and communicate information through a range of modalities and formats?
  • require students to consider audience and perspective when conveying ideas?
  • provide opportunities for students to update their thinking and respond to ideas shared by others?

Metacognition

To what degree do tasks...

  • explicitly invite students to reflect on and connect current experiences with prior and future learning opportunities?
  • give students clear opportunities to consider how the performance task is building their understanding of science?
  • invite students to reflect on ideas and practices they still need to develop to be successful?


Highlights

All ATLAS tasks were also independently evaluated for a set of tags that highlight specific features of each task. There are 23 possible highlights. Tasks can include multiple highlights from each category.

What the task is about

Authentic phenomena
The task engages students in a real-world observation or problem, rather than a contrived example.
Civic engagement
The task positions students, via task activities, to consider and act on the issues, policies, or concerns of a community.
Environmental literacy
The task builds student understanding of and engagement with human interactions with natural systems.
Unsettled science
The task engages students in making sense of phenomena and problems that are active areas of research within the scientific community (i.e., there is not currently a commonly accepted explanation or mechanism for the phenomenon or idea under study).
Digital literacy
The task asks students to engage with, reflect on, and use digital tools and platforms effectively and ethically.

How the task was developed

Designed with teachers
Science teachers were directly involved in writing the task.
Designed with students
K-12 science students were involved in the development of tasks (e.g., through focus groups, student interest surveys, or direct feedback on tasks).
Piloted with students
Tasks were piloted with students and revised based on robust evidence.

How the task supports and surfaces science learning

Collaborative
Students work with others (peers, community) to accomplish the central goals of the task.
Complex reasoning
The task asks students to go beyond simple/single-step application and sensemaking, engaging them in increasingly independent multi-step integrative or evaluative thinking.
Culturally relevant
The task connects science learning to lived experiences, cultural practices, and community contexts. It offers opportunities for students to draw on funds of knowledge and develop cultural competence in both the cultures they identify with and the cultures of others (windows and mirrors).
Emphasizes SEPs, DCIs, or CCCs
The task provides particular opportunities for students to engage with and demonstrate their understanding of science and engineering practices (SEPs), disciplinary core ideas (DCIs), or crosscutting concepts (CCCs); the highlighted dimension is foregrounded relative to the other dimensions.
Peer feedback
The task invites students to provide, receive, and respond to feedback and ideas from peers.
Perspective taking
The task invites students to consider elements of the task (phenomena, problems, claims, evidence) through the lens of someone other than themselves, and make sense of how an alternative perspective may shape sensemaking, priorities, recommendations, or outcomes.
Problem solving
The task invites students to use their understanding of science ideas and practices to address a situation people want to change by defining problems, negotiating criteria and constraints effectively, and iteratively developing, testing, and improving solutions.

What resources are included for task implementation

Guides differentiation
Teacher materials include explicit support for modifying task content, implementation, or interpretation for specific student and classroom needs and contexts.
Includes translations
Student-facing tasks are available in at least one other language.
Includes student work samples
Teacher materials include examples of student responses and relevant student work.
Includes rubrics
Teacher materials include guidance for interpreting student responses and providing feedback to students.
Supports for California’s English Language Development (ELD) Standards
Teacher materials include specific supports for addressing ELD standards through the task.