(Sharing a worthwhile article about Skinner and his training ;) by Gail B. Peterson)





The World's First Look at Shaping:
B.F. Skinner's Gutsy Gamble

Gail B. Peterson

Last summer I told the story of B.F. Skinner's discovery of shaping (Peterson, 2000). This summer I have a follow-up story about the first photographic demonstration of shaping to the American public and, indeed, to the English-speaking people of the world at large. Included here are some pictures which, to my knowledge, have appeared only once before, almost a half-century ago, in LOOK magazine. I suspect that there are few people on the planet today under the age of 50 who have seen these pictures, and probably very few over 50 who have seen them (or remember having seen them). But what I hope will be found of greater interest is the story behind these pictures and the glimpse it gives us of B.F. Skinner's bold ingenuity and downright good luck.

Let me begin by giving a brief recap of last summer's story. It appears that none of the operant behavior Skinner had so carefully studied and written about prior to 1943 had been hand-shaped. The lever pressing by rats described in The Behavior of Organisms (Skinner, 1938) involved no hand-shaping, nor had hand-shaping been used in training the behavior chain performed by the celebrated Pliny (Life magazine, 1937). Although Skinner had built, programmed, and adjusted the equipment by hand, as the animal behaved in the experimental space, the consequences of its behavior were delivered with no hands, being totally controlled by the apparatus instead.

As noted in the earlier article, Skinner clearly had a pretty good hunch, very early on, that new and elaborate forms of behavior could be effectively brought out or "differentiated" from even the most inert original behavioral material by an astute human observer who hand-actuated reinforcement according to the method of successive approximations.

Nevertheless, he evidently never actually tried it until one day in 1943 on the top floor of a flour mill near downtown Minneapolis. The behavior shaped that day was even more unlikely than the geographic location: Skinner, with the help of grad students Keller Breland and Norman Guttman, shaped a pigeon to bowl. The pigeon was trained to swat a little wooden bowling ball with its beak, propelling it down a miniature bowling alley into some tiny bowling pins. As strange as it may sound, the successful hand-shaping of this behavior was a genuine eureka experience for Skinner and his students. Skinner refers to it repeatedly in his memoirs and autobiographies as a very illuminating moment in his career (cf., Skinner, 1958, 1972, 1979).

Why was the discovery of hand-shaping so important? It was important to Skinner at the time, in my opinion, because of the impact it had on his thinking about social behavior, human social behavior in particular, and especially human verbal behavior. But putting theoretical issues aside, history has shown the discovery of hand-shaping to be of monumental practical significance because of the impact it has had on the actual practices of people who need or want to change behavior. It was Skinner's shaping that pigeon to bowl that got the ball rolling, if you will, in our modern day fields of Applied Behavior Analysis (cf., Cooper, Heron, & Heward, 1990; Chance, 1998), behavior modification (cf., Kazdin, 2000; Martin & Pear, 1996; Sarafino, 2000), biofeedback (cf., Olton & Noonberg, 1980; Schwartz, 1995), Precision Teaching (cf., Lindsley, 1971, 1972, 1990, 1992), performance management (cf., Daniels, 2000, 2001), and all the other practical applications of operant conditioning principles. And though it took 40 or 50 years to catch hold, the current revolution we are seeing in the field of practical animal training can also be traced directly to that fateful day in 1943 (cf., Breland & Breland, 1951, 1966; Donaldson, 1996; Pryor, 1994, 1995, 1999; Ramirez, 1999; Reid, 1996).

Throughout his life and career, B.F. Skinner was a man of great self-confidence, with strong personal convictions about the essential correctness of his view on the determinants of behavior. He was so sure of himself and his views, in fact, that he would occasionally give rather glib accounts that, frankly, may have gone a bit beyond the established facts of the matter. I alluded to an example of this above: he described, quite convincingly, the process of hand-shaping several years before he or anyone else had actually ever done it. In his important 1937 "Reply to Konorski and Miller" paper, Skinner gave the distinct impression that he had hand-shaped rats to lever press, an observation which was of central theoretical significance to the point at issue in that historic academic exchange. However, in discussing the matter some 40 years later, Skinner, to his everlasting credit, came clean and fessed up that he had not actually ever shaped lever pressing that way, but "I was sure it could be done" (Skinner, 1979, p. 185). Thus, paragon of empirical science virtue though he most definitely was, this "Reply to K & M" episode shows that he was not above venturing beyond the strict empirical facts from time to time, sticking his theoretical neck out a little and engaging in flat-out speculation, but doing it in a way that didn't sound the least bit like speculation.1 He appears to have done a similar thing again in his 1951 Scientific American article on "How to Teach Animals". This time, however, he soon got called on it.

That 1951 paper is a landmark article in the history of practical animal training. This was the article in which the term shaping was used, at least in print, for the very first time. It was also the article in which the use of a clicker as a conditioned reinforcer was first described (again, in print).2 Skinner also described in that paper something very much like our modern practice of target training, although he didn't call it that and didn't expand upon how a target can be used to prompt other behavior which can then be strengthened by reinforcement. That short four-page article is probably the most concise tutorial one could ever find on the basic principles of operant conditioning. It should be required reading for anyone interested in the field.

Also contained within that neat little 1951 article is a confident speculation (once again, not sounding at all like speculation) which could have proved quite embarrassing to Skinner, and on a fairly grand scale at that. Without having had any experience actually doing it himself beforehand and probably without having seen anyone else do it either3, Skinner confidently described how easy it is, via shaping with positive reinforcement, to train a dog to do tricks. In particular, he gave the example of how easy it is to train a dog to lift its head, turn a little pirouette, and dance.

As a second test, let us say, you want to teach the dog to lift its head in the air and turn around to the right. The general procedure is the same, but you may need some help in sharpening your observation of the behavior to be reinforced. As a guide to the height to which the dog's head is to be raised, sight some horizontal line on the wall across the room. Whenever the dog, in its random movements, lifts its head above this line, reinforce immediately. You will soon see the head rising above the line more and more frequently. Now raise your sights slightly and reinforce only when the dog's head rises above the new level. By a series of gradual steps you can get the dog to hold its head much higher than usual. After this you can begin to emphasize any turning movement in a clockwise direction while the head is high. Eventually the dog should execute a kind of dance step. If you use available food carefully, a single session should suffice for setting up this behavior. (Skinner, 1951, p. 27)
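The procedure Skinner describes is, at heart, a simple loop: reinforce responses above a criterion, and raise the criterion as above-criterion responses become frequent. Purely as an illustration, here is a toy simulation of that loop; every number in it (variability, learning rate, the 60% success threshold, the step size) is an assumption chosen for the sketch, not anything specified in the article.

```python
import random

# Toy simulation of shaping by successive approximation, loosely
# following the procedure Skinner describes above. All parameters
# are illustrative assumptions, not values from the article.

random.seed(1)

mean_height = 10.0   # the dog's typical head height (arbitrary units)
spread = 3.0         # trial-to-trial variability in behavior
criterion = 12.0     # the "line on the wall": reinforce only above this
target = 30.0        # the final height we are shaping toward

recent_hits = []
for trial in range(2000):
    height = random.gauss(mean_height, spread)
    if height > criterion:
        # Reinforcement shifts typical behavior toward the
        # reinforced response (a simple learning rule).
        mean_height += 0.2 * (height - mean_height)
        recent_hits.append(1)
    else:
        recent_hits.append(0)
    recent_hits = recent_hits[-20:]
    # "Now raise your sights slightly": once above-criterion responses
    # are frequent (12 of the last 20 trials), raise the criterion.
    if len(recent_hits) == 20 and sum(recent_hits) >= 12 and criterion < target:
        criterion += 1.0

print(f"final criterion: {criterion:.1f}, typical height: {mean_height:.1f}")
```

The point of the sketch is the self-regulating interplay: raising the criterion too fast starves the success rate, which pauses further raises until behavior catches up, which is exactly why "a series of gradual steps" works where demanding the final behavior outright would not.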

When Skinner's article came out in Scientific American, it was read by a writer for LOOK magazine4 who must also have been something of a skeptic. Joseph Roddy promptly paid Skinner a visit at Harvard, and essentially told him that if this method of training dogs was as slick as he claimed, then LOOK wanted pictures of it for their magazine. Skinner later wrote, "As a poker player might have put it, LOOK was calling me." (Skinner, 1983, p. 42) Skinner unhesitatingly took up the challenge; I think it was a rather gutsy gamble.

Accordingly, Mr. Roddy acquired a dog, a young Dalmatian (registered name "Roadcoach Cheerful", call name "Agnes"), and brought her to Skinner. In the resulting magazine article, Roddy wrote:

At Skinner's workshop, the Harvard Psychological Laboratory in Cambridge, the feeling is that a man can have a dog doing anything reasonable he could want a dog to do within twenty minutes of their first encounter. We doubted that, and visited Skinner with a camera on our hip and Agnes on our leash. Dog and psychologist were introduced to each other. Skinner asked us what we would have the dog do, and we said, "Run up the wall." Twenty minutes later, we were convinced. (LOOK magazine, May 20, 1952, p. 17)

How did Skinner do it? He was darn clever: First of all, he realized that the taking of photographs would, obviously, be a very high priority feature of this whole LOOK magazine enterprise, and it was therefore important that Agnes be at ease with flash bulbs. Also, for the pictures to be fair, accurate, and meaningful, it would be best if each photo corresponded to a progressive step in the shaping process. The answer was to use the flash of the photographer's strobe light itself as the conditioned reinforcer (instead of a clicker). Accordingly, Agnes was given pretraining in which the flash of the strobe was immediately followed by a small cube of beef. This training served both to countercondition any unwanted adverse reactions Agnes might have had to the flash, and to establish the flash as a marker, a bridge, a conditioned reinforcer. So, when the time came to put up or shut up, both Skinner and Agnes were ready. Yes – darn clever5.
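The pretraining step that made the flash work as a marker is itself a textbook piece of Pavlovian pairing: flash, then food, repeated until the flash alone predicts reward. As a minimal sketch only, here is that acquisition process modeled with a Rescorla-Wagner-style update; the update rule and all numbers are my illustrative assumptions, not anything from the article.

```python
# Minimal sketch of the pretraining step: pairing the strobe flash
# with food until the flash itself predicts reward, so it can serve
# as a conditioned reinforcer ("marker"). The learning rule and the
# numbers are illustrative assumptions.

flash_value = 0.0     # how strongly the flash predicts food (0 to 1)
food_value = 1.0      # value of the beef cube itself
learning_rate = 0.3

for pairing in range(20):
    # Each pretraining trial: flash, immediately followed by food.
    # The flash's predictive value moves toward the food's value.
    flash_value += learning_rate * (food_value - flash_value)

print(f"flash value after 20 pairings: {flash_value:.3f}")
```

Once the flash carries this predictive value, it can do double duty exactly as Skinner arranged: marking the instant the criterion was met while simultaneously producing the photograph.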

The actual training and photo shoot took place in the apartment of Skinner's graduate student, Charlie Ferster. Skinner set things up (for real this time) in much the way he had described one might do in the Scientific American piece (see pictures above): he attached some horizontal stripes to the wall which he then used to gauge the dog's responses of lifting its head higher and higher. Then, he simply set about shaping a jumping response by flashing the strobe (and simultaneously taking a picture), followed by giving a meat treat, each time the dog satisfied the criterion for reinforcement. The result of this process is shown above, as it was in LOOK magazine, in terms of the pictures taken at different points in the shaping process. Within 20 minutes, Skinner had Agnes "running up the wall", just as Roddy had requested.

That first demonstration went so well that Skinner and company evidently decided to go for a "twofer". The Fersters had one of those covered kitchen wastebaskets that can be opened by stepping on a pedal at its base, and so for the second shaping demonstration, Charlie trained Agnes to press the pedal and pop the top on the wastebasket. Again, the photographer's flash served as the conditioned reinforcer, and each step in the process was photographed. The results are shown below.

Skinner's gutsy gamble paid off. Things worked out exactly as he had bet they would.

The word "luck" often comes up in discussions of betting and gambling. Was luck involved here? Dare we use that word in connection with the work of a scientist? Yes, I think luck was involved here, and yes, I think we can use that word in the context of science, as long as we don't start getting all mystical and magical about it. Indeed, Skinner himself once acknowledged luck as one of the factors important in scientific discovery (Skinner, 1959, p. 366), so I don't think he would be offended if we explored the role luck might have played in his winning this gutsy gamble. How might it have been involved?

I think it was a sheer stroke of good luck that, of all the possible breeds of dogs he might have chosen, Roddy brought Skinner a Dalmatian and asked him to train it, in essence, to jump. Dalmatians jump. In fact, Gewirtz (2000) has recently described "jumping up" as an almost incorrigible breed characteristic in Dalmatians. It is an empirical question, but I doubt that all dogs of all breeds could be trained, in a mere 20 minutes, to "run up the wall" as high as Agnes did. Skinner was lucky that Agnes was a Dalmatian.

I think it was also a stroke of good fortune that Skinner put those stripes on the wall. A modern Skinnerian dog trainer would, ironically, do things a little differently than Skinner himself did back then in 1952. Today, we would most likely begin by training the dog to approach and touch a target, and then we'd simply move the target higher and higher up the wall. If Skinner had done this explicitly, he could probably have had old Agnes running up that wall in 5 or 10 minutes instead of 20. But he didn't do it explicitly; he did it accidentally. By lucky accident, Skinner did something that was pretty close to this modern target training approach anyway: he taped black stripes on the wall and shaped jumps to successively higher stripes. Although Skinner had intended the stripes to serve purely as stimulus aids to his behavior, i.e., stimulus support for his behavior of sighting higher and higher movements of the dog's head, the photographs clearly suggest that the stripes may also have turned out to provide very important stimulus support for the dog's behavior. Yes, Agnes can be described as "running up the wall", but she can also be described as "running up the stripes on the wall", or jumping at the highest stripes. I think those stripes on the wall were exerting every bit as much stimulus control over Agnes' jumping behavior as they were over Skinner's sighting behavior, even though their intended purpose was strictly for the latter. In this connection, it is interesting to note that, following this 1952 episode with Agnes, the very next report that Skinner published was an experimental study of the powerful control that stimuli which are accidentally present when reinforcement occurs can acquire over behavior (Morse and Skinner, 1957). In this paper Skinner noted that "Accidental, but nevertheless effective, relationships may arise in the sensory control of operant behavior" (p. 308, original italics). "Pending an investigation of these parameters, it may at least be said that incidental stimuli adventitiously related to reinforcement may acquire marked discriminative functions." (ibid., p. 311) One can't help but wonder if this insight wasn't prompted by Skinner's appreciating his good luck when those stripes on the wall, there deliberately with respect to his behavior but accidentally with respect to Agnes', acquired strong control over Agnes' behavior nevertheless.

In any event, it was good fortune for us all that Skinner's gutsy gamble paid off. All the elements came together to make the demonstration work, whether by deliberate ingenuity or accidental good luck. Skinner's faith in shaping was vindicated and reinforced, and the field of operant conditioning continued to evolve.


(1)Or, as Gary Wilkes once put it: "I would say your research leaves little doubt that there may have been a touch of heifer dust in some of Skinner's earlier pronouncements." (Wilkes, pers. comm., September 2000). To be fair to BFS, however, it is totally appropriate, in an inductive science, to predict what will occur in the general case based on observations of a specific case. It's just that most scientists would note that the inductive process has led them to predict or expect some result, rather than to imply with apparent 100% confidence that this already has been or undoubtedly would be the result.

(2)In the context of the original "no hands" approach to "differentiation of a response", it is interesting to note that human hands loom large in David Stone's 1951 illustrations for the article "How to Teach Animals". On one page we see a pair of hands, with the right hand holding a clicker and the left hand holding a treat. On the next page we see a hand poised to pull the chain on an electric light, the flashing of which is being used to reinforce a baby's response of lifting her arm. These illustrations help convey the message that this is a practical way of changing behavior that can be done "by hand".

(3)There is a chance that Skinner may have seen it done, although he doesn't mention having seen it in his autobiographies. But by the time Skinner's 1951 article was written, Keller and Marian Breland had already done much of their pioneering work in practical animal training, including work with dogs. Thus, Skinner may have actually seen the Brelands do this, or perhaps they told him how well it worked.

(4)I have been amazed at the number of people today who say they have never heard of LOOK magazine. This was one of the premier popular magazines of the mid-twentieth century. It came out weekly and was widely read in the USA and around the world. It consisted primarily of photo essays.

(5)We will never know to what extent, if at all, Skinner's clever use of the photographer's flash and camera operation as a response consequence may have been influenced by the classic work of Guthrie and Horton, which had been published just a few years earlier (1946). For entirely different reasons, they had made the operation of a camera (click!) contingent upon the performance of an instrumental pole-pushing response by cats. Although this fact is seldom included in the descriptions of their work given in secondary sources, they also used this method to train a dog (cf., Guthrie and Horton, 1946, p. 67). Certainly, Skinner must have read the Guthrie and Horton monograph, and his analysis of their interesting results would no doubt have rested on attributing conditioned reinforcement properties to the click of the camera.

References

Breland, K., & Breland, M. (1951). A field of applied animal psychology. American Psychologist, 6, 202-204.

Breland, K., & Breland, M. (1966). Animal behavior. The Macmillan Company.

Chance, P. (1998). First Course in Applied Behavior Analysis. Brooks/Cole.

Cooper, J.C., Heron, T.E., and Heward, W.L. (1990). Applied Behavior Analysis. Prentice Hall.

Daniels, A.C. (2000). Bringing out the best in people: How to apply the astonishing power of positive reinforcement. McGraw-Hill, Inc.

Daniels, A.C. (2001). Other people's habits: How to use positive reinforcement to bring out the best in people around you. McGraw-Hill, Inc.

Donaldson, J. (1996). The Culture Clash. James & Kenneth Publishers.

Gewirtz, E.W. (2000) By leaps and bounds. AKC Gazette, September 2000, 83-84.

Guthrie, E.R., and Horton, G.P. (1946). Cats in a puzzle box. New York: Rinehart & Company, Inc.

Harvard-trained dog. (1952, May 20). LOOK, pp. 17-20.

Kazdin, A. E. (2000). Behavior Modification in Applied Settings (6th ed.). Wadsworth.

Lindsley, O.R. (1971) Precision teaching in perspective: An interview. Teaching Exceptional Children, 2, 114-119.

Lindsley, O.R. (1972) From Skinner to precision teaching: The child knows best. In J.B. Jordan & L. S. Robbins (Eds.), Let's try doing something else kind of thing (pp. 1-11), Arlington, VA: Council on Exceptional Children.

Lindsley, O.R. (1990) Precision Teaching: By Teachers for Children. Teaching Exceptional Children, Spring, 10-15.

Lindsley, O.R. (1992) Precision teaching: Discoveries and effects. Journal of Applied Behavior Analysis, 24, 51-57.

Martin, G. and Pear, J. (1996) Behavior Modification: What it is and how to do it (5th Ed). Prentice Hall.

Morse, W.H., and Skinner, B.F. (1957). A second type of superstition in the pigeon. American Journal of Psychology, 70, 308-311.

Olton, D.S. and Noonberg, A.R. (1980). Biofeedback: Clinical applications in behavioral medicine. Prentice-Hall.

Peterson, G.B. (2000). The discovery of shaping, or B.F. Skinner's big surprise. The Clicker Journal: The Magazine for Animal Trainers, No.43 (July/August), 6-13.

Pryor, K. (1994). Lads Before The Wind: Diary of a dolphin trainer (3rd ed.). Sunshine Books.

Pryor, K. (1995). On Behavior: Essays and Research. Sunshine Books.

Pryor, K. (1999). Don't Shoot the Dog! The new art of teaching and training. (revised edition). Bantam.

Ramirez, K. (1999). Animal Training: Successful Animal Management Through Positive Reinforcement. Shedd Aquarium Press.

Reid, P.J. (1996). Excel-erated Learning. James & Kenneth Publishers.

Sarafino, E.P. (2000). Behavior Modification: Principles of Behavior Change (Second Edition). Mayfield Publishing Company.

Schwartz, M.S. (Ed.). (1995). Biofeedback: A practitioner's guide (2nd ed.). Guilford.

Skinner, B. F. (1937). Two types of conditioned reflex: A reply to Konorski and Miller. The Journal of General Psychology, 16, 272-279.

Skinner, B.F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century-Crofts.

Skinner, B.F. (1951). How to teach animals. Scientific American, 185, 26-29.

Skinner, B.F. (1958). Reinforcement today. American Psychologist, 13, 94-99.

Skinner, B.F. (1959). A case history in the scientific method. In S. Koch (Ed.), Psychology: A study of a science. Study I. Conceptual and Systematic. Volume 2. General systematic formulations, learning, and special processes (pp. 359-379). New York: McGraw-Hill Book Co., Inc.

Skinner, B. F. (1972). Some relations between behavior modification and basic research. In S.W. Bijou & E. Ribes-Inesta (Eds.), Behavior modification: Issues and extensions (pp. 1-6). New York: Academic Press.

Skinner, B. F. (1979). The Shaping of a Behaviorist: Part Two of an Autobiography. New York: New York University Press.

Skinner, B. F. (1983). A Matter of Consequences: Part Three of an Autobiography. New York: New York University Press.

This smart University of Minnesota rat works slot machine for a living. (1937, May 31). Life, pp. 80-81.


 
