
Writing assessments: 10 evidence-informed practices to do a good job for your students (with resource links and practical suggestions)


How do we judge whether a student is a “good writer” for their age and years at school? 

How can we describe a particular piece of written schoolwork as of “good”, “average”, or “poor” quality?

How can we identify students who need additional help with their writing?

How can we measure and report students’ progress with their writing skills during and after intervention or teaching?

The answer to all these questions is ‘regular and reliable writing assessments’. But, not surprisingly, some early-career (and experienced) teachers and speech-language pathologists lack confidence when it comes to assessing students’ writing skills.  

If we want to support students to improve their writing, we need to assess their written language skills in a way that allows us to:

  • identify existing strengths and challenges at word, sentence, and text levels;
  • set specific, curriculum-relevant, and functional goals for different kinds of writing; and 
  • monitor progress in real time to check that our teaching or writing intervention is working.

In this article, we share 10 evidence-informed practices we apply when assessing a student’s writing. Where possible, we’ve included links, references and resources you can use to explore each topic in more detail and to improve existing practices.

1. The big picture: writing skills are important and must be taught 

Good writing skills are essential for:

  • academic success; and
  • later career success (Price & Jackson, 2015).

Increasingly, writing is also important for social participation (e.g. texting friends, and using social media apps).

Unlike talking, writing is not ‘biologically natural’. Everyone needs to learn how to do it, although some students need a lot more support than others, including many students with developmental language disorder, autism, learning disorders (including dyslexia, other reading difficulties, and dysgraphia), and other neurodevelopmental differences, disorders and disabilities.

2. To assess a student’s writing properly, you need to understand language 

When you assess a student’s writing quality, you are really assessing their use of written language*. If you are assessing language in any modality, you need to know:

For these reasons, ideally, you should assess a student’s oral language skills as part of any writing assessment, including their:

3. Norm-referenced, standardised writing tests have their place, but…

Many such tests exist, including the Test of Written Language (Hammill & Larsen, 2009) and the Oral and Written Language Scales II (Carrow-Woolfolk). They are sometimes necessary, e.g., for funding eligibility and access to public services. But standardised, norm-referenced tests:

  • have many well-known limitations, meaning results often need to be interpreted with caution;
  • are really expensive; 
  • often require the tester to have special training and qualifications;
  • can be time-consuming to administer, score, and report on;
  • don’t provide enough information to formulate specific goals that are relevant to the student’s learning and curriculum; and
  • can’t be used dynamically to monitor progress, e.g. weekly or monthly.

4. …writing samples often provide more useful information 

As with oral language samples, getting written language samples from a student gives you more authentic information about the student’s writing skills and needs in the real world than you can get from a decontextualised standardised test (e.g. Price & Jackson, 2015). You can take samples from a student’s recent school work (provided it was written without help, e.g. adult scaffolding or editing), or elicit your own from the student using some of the suggestions below.

5. Don’t assume your students can write in complete sentences!

I’ve learned this the hard way. You can’t assume a student knows how to write in complete and grammatically correct sentences – even if they are in high school:

For all these reasons, we almost always assess compound and complex sentence-writing skills.

6. Get samples of different text types

At school and in life, students need to write for different purposes, so we need to assess different kinds of writing. To do a thorough writing assessment, you should ideally obtain at least three different kinds of samples:

Text type: Narratives
Includes:
  • Stories (e.g. Nelson, 2010).
  • Personal recounts or narratives of real events (e.g. Lombardino, 2012).
  • Biographies of other people and histories.

Text type: Expositions (informational and explanatory writing)
Includes:
  • Comparing and contrasting pairs of objects or events (e.g. dolphins v sharks).
  • Enumerating facts about a specific topic (e.g. wilderness survival skills).
  • Solving problems by identifying a problem and then proposing solutions.
  • Outlining procedures and giving directions step-by-step.
  • Describing objects, people, and events.
  • Explaining cause-and-effect relationships by giving reasons for why something happens (e.g. rain) (e.g. Singer, 2007; Ward-Lonergan, 2010).
  • Retelling information passages in the student’s own words.

Text type: Persuasive writing
Includes:
  • Opinion pieces (for younger students).
  • Arguments and essays for older students (e.g. Nippold, 2005; Asaro-Saddler & Bak, 2012).

In part 10 below, we outline some research-based protocols for eliciting narrative, expository and persuasive writing samples from students of different ages. You can also save yourself a lot of time by using or adapting some of the fantastic free language-sampling resources created, developed and tested by speech pathologist Associate Professor Marleen Westerveld, available here.

7. To analyse a writing sample properly – and to gauge progress over time – you need evidence-based criteria and measures. 

Specific practices vary, but we provide some general suggestions below based on measures commonly used in research studies about the writing samples of children and teenagers.

8. When reporting on a student’s writing, include word-level measures: 

Assess: Vocabulary
Examples of measures:
  • Number of different words.
  • Number of abstract and concrete nouns.
  • Number of specific and general all-purpose verbs.
  • Number of metacognitive verbs (e.g. “think”, “know”, “believe”) and “internal state” words (e.g. names of emotions).
  • Number and variety of coordinating and subordinating conjunctions (e.g. “and”, “but”, “so”, “because”, “if”, “while”).
  • Number of adverbial constructs (e.g. “finally”, “however”).
  • Number and variety of adjectives.

Assess: Spelling
Examples of measures:
  • Proportion of words spelled correctly/incorrectly.
  • Proportions of phonological, morphological, orthographic and other spelling errors.
  • See here for a more detailed framework for classifying spelling errors that we use in our clinic.

Note: in writing samples, students may avoid words they can’t spell. As such, targeted dictation probes and formal standardised spelling tests may provide better data. Having said that, some students do better in word-level spelling tests than in discourse, possibly due to the greater cognitive load of writing at the text level.
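
To make these word-level counts concrete, here is a minimal sketch in Python (illustrative only; the sample text, the helper name and the hand-marked spelling errors are invented for this example) of how you might tally the number of different words and the proportion of words spelled correctly in a typed-up sample.

```python
import re

def word_level_measures(sample_text, misspelled_words):
    """Tally simple word-level measures from a typed-up writing sample.

    sample_text: the student's writing, transcribed verbatim.
    misspelled_words: words you have marked by hand as spelled incorrectly.
    """
    words = re.findall(r"[a-zA-Z']+", sample_text.lower())
    errors = {w.lower() for w in misspelled_words}
    total_words = len(words)
    error_count = sum(1 for w in words if w in errors)
    return {
        "total words": total_words,
        "number of different words": len(set(words)),
        "proportion spelled correctly": round(1 - error_count / total_words, 2) if total_words else None,
    }

# Invented example sample and hand-identified spelling errors:
sample = "One day my backpak turned into wings. I flyed over the school and everyone stared."
print(word_level_measures(sample, ["backpak", "flyed"]))
```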

9. When reporting writing assessment results, include sentence-level measures:

Assess: Syntactic (sentence) length
Examples of measures:
  • Average length of:
    * the sentences in the sample;
    * Minimal Terminal Units (T-Units) – a T-Unit is one independent clause plus any subordinate clauses attached to it or embedded within it (Nippold, 2014); and
    * Communication Units (C-Units) – a C-Unit is similar to a T-Unit and is usually used when looking at oral language samples. Unlike T-Units, it includes elliptical responses – utterances that do not include an independent clause but are grammatically acceptable in context (e.g. short responses to questions).

Assess: Syntactic complexity
Examples of measures:
  • Subordination Index (SI), also known as “clausal density” or “clausal complexity”. This is calculated by adding the total number of independent and dependent clauses in the writing sample and then dividing that total by the number of T-Units in the sample (see the worked example after this table).

Assess: Syntactic correctness
Examples of measures:
  • Percentage of grammatically correct T-Units.
  • Number of grammatical errors.
  • Number of grammatical errors per T-Unit.
  • Total number of correct complex sentences.
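
As a worked example of the Subordination Index calculation described above: a sample coded as 12 T-Units containing 12 independent and 6 dependent clauses has an SI of (12 + 6) / 12 = 1.5. The sketch below applies the same arithmetic in Python (illustrative only; the function name and the counts are invented) to clause and T-Unit counts you have coded by hand.

```python
def sentence_level_measures(t_units, independent_clauses, dependent_clauses,
                            grammatical_t_units, total_words):
    """Compute sentence-level measures from counts you have coded by hand.

    Nothing here segments or parses the sample; the clause and T-Unit
    counts all come from your own analysis.
    """
    total_clauses = independent_clauses + dependent_clauses
    return {
        "mean length of T-Unit (words)": round(total_words / t_units, 1),
        "subordination index (clausal density)": round(total_clauses / t_units, 2),
        "% grammatically correct T-Units": round(100 * grammatical_t_units / t_units, 1),
    }

# Invented counts: 120 words segmented into 12 T-Units (12 independent + 6
# dependent clauses), of which 9 T-Units were grammatically correct.
print(sentence_level_measures(t_units=12, independent_clauses=12, dependent_clauses=6,
                              grammatical_t_units=9, total_words=120))
```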

10. When analysing text-level samples, report on discourse-level ‘macrostructure’ features:

(A) General text-level criteria

Assess: Punctuation and writing conventions
Examples of measures:
  • Beginning each sentence with a capital letter.
  • Ending each sentence with a punctuation mark.
  • Capitalising proper nouns and words in titles.
  • Using apostrophes in contractions (e.g. “don’t”, “isn’t”).
  • Using apostrophes to show possessive relationships (e.g. David’s pen).
  • Using commas appropriately to mark clauses, and in lists.

Assess: Productivity (text length)
Examples of measures:
  • Total number of words written.
  • Number of sentences written.
  • Number of T-Units written.
  • Number of C-Units written.
(Take care here: some advanced writers can express themselves more succinctly than developing writers. More doesn’t always mean better.)

(B) Text-type-specific measures, e.g.:

Text type: Narratives

Examples of measures:
  • Story grammar elements included (Stein & Glenn, 1979):
    * Setting
    * Complicating action
    * Reaction
    * Strategy
    * Action
    * Consequence
    * Moral/ending
  • Story grammar elements included (Koutsoftas & Gray, 2012):
    * Setting
    * Initiating event or problem
    * Internal response of main character
    * Internal plan of main character
    * Main character’s attempt to accomplish plan
    * Consequence
    * Resolution
    * Ending
  • Structure elements (Hall-Mills & Apel, 2013):
    * Characters
    * Plot
    * Sensory detail
    * Logical sequence
    * Context

Example assessment protocols:
  • “One day you are on your way to school and your backpack turns into a pair of wings! Tell the story of what happens. Be creative, provide good detail, and be sure your story has a beginning, middle and end.” (Adapted from Koutsoftas & Gray, 2013)
  • “Remember something funny or special that happened in your family. Write the story of what happened…Why is this a funny or special memory for you?” (Adapted from Danzak, 2011)

Text type: Expositions

Examples of measures:
  • Structure/key elements (Hall-Mills & Apel, 2013):
    * Structure (identifiable, appropriate, well-formed)
    * Logical sequence (ideas logical and coherent, well-ordered, clear)
    * Introduction (main idea clear, creative and purposeful)
    * Body (supporting ideas and evidence supporting the thesis)
    * Conclusion (original, creative, thought-provoking in a way that extends the topic)

Example assessment protocols:
  • Younger: “We all admire people for different reasons. Whom do you admire? It can be someone in your family, a friend, professional or celebrity. Describe this person. If you could spend a day with this person, what would you do?” (Danzak, 2011)
  • Older: “Think about the topic of friendship. Write a paper explaining what friendship is. Explain how people become friends and why friendship is important.” (Nippold & Sun, 2010)
  • “Consider which career you would like to pursue after high school. Why would you like to do that? Write about your career choice and reasons for selecting it.” (Hall-Mills & Apel, 2013)

Text type: Persuasive writing

Examples of measures:
  • Number of different reasons given to support the student’s opinion (e.g. Nippold et al., 2005).
  • Structure (Asaro-Saddler & Bak, 2012):
    * Topic sentence
    * 3+ supporting reasons
    * Explanation for each reason
    * Ending sentence

Example assessment protocols:
  • Younger: “Some people think children should be required to attend school in the summertime. Do you agree or disagree? Write what you believe. Explain why you think this.” (Asaro-Saddler & Bak, 2012)
  • Older: “People have different views on animals in circuses…Tell me exactly what you think about the controversy. Give me lots of good reasons for your opinion.” (Nippold et al., 2005)
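
One practical way to report these macrostructure measures is a simple checklist: mark each element as present or absent and tally the total. Here is a minimal sketch in Python (illustrative only; the element labels are abbreviated from the Koutsoftas & Gray list above, and the binary present/absent scoring is a simplification – published protocols may score elements in more detail).

```python
# Story grammar elements, abbreviated from the Koutsoftas & Gray list above.
STORY_GRAMMAR_ELEMENTS = [
    "setting", "initiating event", "internal response", "internal plan",
    "attempt", "consequence", "resolution", "ending",
]

def score_story_grammar(elements_present):
    """Return a present/absent checklist and a simple total for a narrative sample.

    elements_present: the elements you judged (by hand) to be present.
    """
    present = {e.lower() for e in elements_present}
    checklist = {e: e in present for e in STORY_GRAMMAR_ELEMENTS}
    return checklist, sum(checklist.values()), len(STORY_GRAMMAR_ELEMENTS)

# Invented example: the student's story had a setting, a problem, an attempt and an ending.
checklist, score, total = score_story_grammar(["setting", "initiating event", "attempt", "ending"])
print(f"Story grammar elements present: {score}/{total}")
print(checklist)
```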

As noted above, additional language sampling resources developed by Associate Professor Marleen Westerveld are available on her personal website here.

Bottom line

In this article, we’ve:

  • highlighted the importance of good writing assessment practices to identify students who need additional support; 
  • encouraged teachers and speech pathologists to obtain narrative, expository and persuasive writing samples of students’ work using evidence-informed language sampling protocols; and 
  • suggested evidence-informed measures you can use to report on students’ writing strengths and challenges, and to plan additional support when needed. 

If you are interested in finding out exactly what we do in our practice, check out our writing assessment screener. In any event, we hope you find this article helpful and practical as you work to support your students’ writing!

Key source: Price, J. R., & Jackson, S. C. (2015). Procedures for Obtaining and Analyzing Writing Samples of School-Age Children and Adolescents. Language, Speech, and Hearing Services in Schools, 46, 277-293. (As always, all errors of interpretation, and editorial comments on our practices are our own.)

* Important Note: This article does not address conditions that can make writing physically difficult for some students, for example because of motor planning or execution difficulties. Nor does it cover writing sampling for students who use alternative and augmentative communication methods – an important topic worthy of a standalone article. Teachers, speech pathologists and others who support students with physical and/or intellectual disabilities need to know about the many tools and accommodations that can assist students who find it physically difficult or impossible to write by hand or keyboard.

This article also appears in a recent issue of Banter Booster, our weekly round up of the best speech pathology ideas and practice tips for busy speech pathologists, providers, speech pathology students, teachers and other interested readers.
