Trigger warning: This topic is disturbing and sensitive, yet I wish more behavior analysts applied their science to this ugly real-world problem. Let’s face the hard thing together by discussing some effects of initial learning on later behavior and learning. Several references on this topic appear below, covering how acquisition predicts extinction and how variability during acquisition relates to variability in extinction. This article is Part 11 in a series on how behavior analysts can grow toward supporting children and adults affected by trauma, by Camille Kolu, Ph.D., BCBA-D.
Severely aversive experiences affect us for a long time, and acquisition can predict what someone’s behavior will look like during extinction (or how behavior will depend on original learning even long after those variables are “gone”). A BCBA recently asked me for references on this topic during a training I provided to an autism agency on providing safer and more appropriate supports for individuals affected by events we characterize as “traumatic”. Thank you to the BCBA for the excellent question!
At first try, we might have a hard time finding references and resources showing how a young child’s traumatic history leads to bizarre and challenging behavior much later in life. If this seems strange, consider how absurd it would be to suggest that caregivers carefully document and report how they deprived a child of the food, comfort, diaper changes, and other kinds of care the child needed as an infant or growing young person. These tragic events are usually documented after, not while, they occur (if ever). But at least scientists can get familiar with how early learning affects later learning and later behavior. This helps us make sense of otherwise bizarre behaviors, provide important contextual information to caregivers and decision makers, and even inform our preventative treatment of behaviors that don’t seem related to the ongoing situation.
Behavior analysts or psychologists might relate this to how early learning conditions affect subsequent learning, or how the variables present during early learning exert effects on behavior after that situation is no longer present. This discussion provides some examples of literature that might be useful for behavior analysts interested in exploring this topic.
In my work with children and adults after traumatic experiences before and during foster care (or other traumatic events, including long-duration life-threatening illnesses or aversive experiences), I have been collecting data on the types of behaviors that “show up in the behavior stream and repertoire” of children who were exposed earlier (in some cases much earlier) to situations of neglect and abuse. It has been devastating, but also enlightening, to see similarities between the behavior streams of otherwise diverse children exposed to these kinds of behavioral excesses or deficits from their caregivers. In my experience with this population, all but the most “trauma-informed” behavior analysts I worked with were unfamiliar with the cluster of behaviors that reliably suggested traumatic past experiences (and thus called for different protocols sensitive to these children’s histories and needs), while social workers were quite familiar with them.

For example, some common behaviors that cluster (at least on my caseload) in this population of children exposed to severe neglect and abuse include smearing feces, eating items from the trash or related to personal hygiene, and sexual and/or aggressive behaviors toward children and/or animals. When a child showed two or more of these behaviors, this was highly correlated with a history of neglect and/or abuse. Of course, many more variables were involved (such as how long the child was exposed to the neglect and abuse, how young the child was when it occurred, who the perpetrators were, and when the child was removed from care). But in almost all the cases I documented, these behaviors occurred not necessarily in the presence of the offending caregivers, but later, when the child was out of the situation of abuse and neglect. They were, however, related to the stimuli present during the previous abuse (see the later reference on discriminative stimuli and variability).
For example, when a very young child learns that food and comfort don’t come regularly, they try other things before giving up. First, please consider with me that the behaviors of a child who is not giving up were once adaptive and important, and they need to be carefully considered before we use “extinction” on them. This is another plea to behavior analysts and educators to look beyond the “local function” of a behavior before treating it simply as a “behavior for reduction”. A child left sitting in his diaper for hours on end without food or a diaper change might later steal food and smear his feces. These behaviors are not typical for “neurotypical” children exposed to typical caregiving that meets basic needs (e.g., basic appropriate caregiving schedules), but they ARE typical in the sense of being appropriate variations in the behaviors available to a little one when the old or “normal” behaviors no longer worked to produce stimuli meeting basic needs, or to get an otherwise drugged-out and unavailable caregiver to notice a child.
It turns out that there is ample literature suggesting that variability present during conditioning or learning returns during extinction; in other words, even though a child doesn’t “need to use these behaviors anymore,” they reappear when the original situation is no longer present. Put simply, initial learning has sustained effects on later performance (Stokes and Balsam, 1991). This behavioral finding has been part of the animal learning literature since the 1950s (e.g., see Antonitis, 1951; Epstein and Skinner, 1980), but it has also been explored in recent decades in people, from children to adults.
For example, in children, variability in acquisition leads to variability during new learning, even when the need for those original varied responses is no longer part of the learning situation. When inconsistency and variability were required at first on one kind of task for children who eventually learned two different types of tasks, the children’s behavior was more variable when they learned the second (and different) type of task later. In other words, the original requirement to be variable carried over to the way the children learned later (Eisenberger and Armeli, 1997). (To behavior analysts, describing trauma-related behaviors as “variability” might seem strange here, but it just means doing different things. If I am a child whose parents feed me regularly, feed me when I cry as an infant, and feed me reliably in toddlerhood when I request food, I have a fairly limited but effective repertoire of responses that produce food. If I learn that these “typical” behaviors of childhood (crying and making sounds) do not produce caregiver interaction, I try other things and my behavior becomes more “variable”.)
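For readers who like to see the contingencies spelled out, here is a toy simulation. It is not a model from any of the cited studies; the five-response set, the weight-based learner, and the lag-1 rule are my own illustrative assumptions. It contrasts a learner reinforced for varying responses with a learner reinforced for repeating a single response, and then observes both repertoires during extinction:

```python
import random
from collections import Counter

N_RESPONSES = 5  # five arbitrary response options

def simulate(contingency, trials=500, seed=0):
    """Train a simple weight-based learner, then sample 100 extinction responses."""
    rng = random.Random(seed)
    weights = [1.0] * N_RESPONSES  # every response starts equally likely
    last = None
    for _ in range(trials):
        r = rng.choices(range(N_RESPONSES), weights=weights)[0]
        if contingency(r, last):   # "reinforcement" strengthens the emitted response
            weights[r] += 1.0
        last = r
    # Extinction: reinforcement stops; we simply observe what the repertoire produces.
    return Counter(rng.choices(range(N_RESPONSES), weights=weights, k=100))

# Lag-1 variability contingency: reinforced only when the response differs from the last.
varied = simulate(lambda r, last: r != last)
# Repetition contingency: only one specific response is ever reinforced.
stereotyped = simulate(lambda r, last: r == 0)

print("varied repertoire in extinction:     ", dict(varied))
print("stereotyped repertoire in extinction:", dict(stereotyped))
```

Under these assumptions, the learner trained under the variability contingency keeps emitting many different responses in extinction, while the stereotyped learner’s extinction responding is dominated by the single previously reinforced response: a crude echo of the acquisition-predicts-extinction pattern described above.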
Behavior analysts interested in this topic need a firm grasp of how reinforcement relates to the subsequent development and maintenance of behavior in order to make sense of behavior that seems unrelated to the current “function.” I often hear parents mention that the behavior of children with trauma histories “seemed to come out of the blue,” but function aside, the common factor in most of their observations is that the caregiver was present when the behavior happened, or that the behavior functioned to produce or terminate caregiver interaction. This does NOT mean, however, that the behavior functions only to produce “attention” and that current operant reinforcement alone is maintaining it. The presence of the caregiver (or of an adult, period) may instead be functioning as a discriminative stimulus that was paired long ago with other important conditioning variables. For more behavior analytic theory on the effects of previously reinforced behavior on subsequent behavior, please see Denney and Neuringer (1998) on behavioral variability as controlled by discriminative stimuli, and consider Staddon and Simmelhag (1971):
“…one effect of a relaxation of [reinforcement] is a more or less transient increase in the relative influence of the distant past at the expense of the immediate past. In behavioral extinction, this should involve the reappearance of old (in the sense of previously extinguished) behavior patterns” (p. 25).
Ultimately, even if it doesn’t seem to “make sense anymore,” children who needed to respond one way in the past have a difficult time learning to do something different, which can contribute to the paradox of doing poorly in an otherwise supportive environment. Research on how acquisition affects later learning shows that children who HAD to respond a certain way to obtain reinforcement continued to respond in the original required way later, when the previous contingency was no longer in place (Saldana and Neuringer, 1999). This pattern occurs for adults too: when people faced a great increase in response requirements soon after learning began, they responded with sustained high variability; this pattern contrasted with that of other people who were exposed to an equally large change in response requirements, but later in training (e.g., Stokes, Mechner, and Balsam, 1999). And if people developed repetitive (stereotyped) behavior during original learning, it was very difficult for them to develop novel sequences later (see Schwartz, 1982a, 1982b).
There also seems to be a critical period during the original learning process: if variability was reinforced during this period, the result was sustained variability and persistence of the once-reinforced variable behavior, even when that behavior was no longer “needed” (e.g., Stokes and Balsam, 2001).
Overall, behavior analytic theory and literature inform what we observe practically to be true: early learning and behavior show up later, even after the original learning situation is over. Some of this is related to cues present in the new environment, such as those discussed in contextual renewal (e.g., Bouton, 2004; Todd, Winterbauer, and Bouton, 2012), but variability can persist even when the situation seems like pure extinction and nothing about the present situation resembles the original one. For behavior analysts treating behavior after clients have experienced aversive events and disrupted caregiving, our clients benefit from our learning about the basis for these effects. Our clients need our acknowledgement of how the past informs the present.
Antonitis, J. J. (1951). Response variability in the white rat during conditioning, extinction, and reconditioning. Journal of Experimental Psychology, 42(4), 273-281.
Bouton, M. E. (2004). Context and Behavioral Processes in Extinction. Learning and Memory, 11, 485-494.
Eisenberger, R., and Armeli, S. (1997). Can salient reward increase creative performance without reducing intrinsic creative interest? Journal of Personality and Social Psychology, 72, 704-714.
Epstein, R. (1983). Resurgence of previously reinforced behavior during extinction. Behavior Analysis Letters, 3, 391-397.
Epstein, R. (1985). Extinction-induced resurgence: Preliminary investigations and possible applications. The Psychological Record, 35, 143-153.
Epstein, R., and Skinner, B. F. (1980). Resurgence of responding after the cessation of response independent reinforcement. Proceedings of the National Academy of Sciences, U.S.A., 77, 6251-6253.
Denney, J., and Neuringer, A. (1998). Behavioral variability is controlled by discriminative stimuli. Animal Learning and Behavior, 26 (2), 154-162.
Saldana, L., and Neuringer, A. (1999). Is instrumental variability abnormally high in children exhibiting ADHD and aggressive behavior? Behavioural Brain Research, 94, 51-59.
Stokes, P. D., and Balsam, P. (1991). Effects of reinforcing preselected approximations on the topography of the rat’s barpress. Journal of the Experimental Analysis of Behavior, 12, 349-373.
Stokes, P. D., Mechner, F., and Balsam, P. D. (1999). Effects of different acquisition procedures on response variability. Animal Learning and Behavior, 27, 28-41.
Schwartz, B. (1982a). Failure to produce response variability with reinforcement. Journal of the Experimental Analysis of Behavior, 37, 171-181.
Schwartz, B. (1982b). Reinforcement-induced behavioral stereotypy: How not to teach people to discover rules. Journal of Experimental Psychology: General, 111, 23-59.
Staddon, J.E.R., and Simmelhag, V. L. (1971). The ‘superstition’ experiment: A reexamination of its implications for the principles of adaptive behavior. Psychological Review, 78, 3-43.
Stokes, P. D. and Balsam, P. (2001). An optimal period for setting sustained variability levels. Psychonomic Bulletin and Review, 8 (1), 177-184.
Todd, T. P., Winterbauer, N. E., and Bouton, M. E. (2012). Effects of the amount of acquisition and contextual generalization on the renewal of instrumental behavior after extinction. Learning and Behavior, 40, 145-157.
P.S. Now for something (not completely) different: Why the horse painting? People are not the only animals who experience trauma. Thankfully, there are better ways to treat your animals. See “Loading the Problem Loader” by Ferguson and Rosales-Ruiz for a behavior analytic look at using shaping instead of aversive control. Behavior analysis students, you read that right: the author of our field’s beloved article on “the behavioral cusp” is the same professor who works with animals and the people who love them. Keep shaping, friends!
Reference for post-script: Ferguson, D. L., and Rosales-Ruiz, J. (2001). Loading the problem loader: The effects of target training and shaping on trailer-loading behavior of horses. Journal of Applied Behavior Analysis, 34 (4), 409-424.