DBR Weekly Readings Part 5

Summaries.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.

This article discusses specific design experiments with the intent of illustrating commonalities, including themes the authors designate as “crosscutting features” (p. 9). The first crosscutting feature describes the purpose of a design experiment – “to develop a class of theories about both the process of learning and the means that are designed to support learning” (p. 10). Broad interpretation is encouraged for both learning processes and means of support. The second through fifth crosscutting features include:

  • Interventionist methodology which encourages innovation through a basis in design
  • Theories that are vigorously tested in the field (i.e., “put in harm’s way” (p. 10))
  • Iterative cycles of invention and revision
  • Humble theories that question whether and how the theory informs the design

The authors also detail what to expect when preparing for and conducting a design experiment and analysis.

Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36. https://doi.org/10.1080/10508406.2013.778204

This article details a tool for implementing design-based research: conjecture mapping. The tool is meant to lend logic and “argumentative grammar” (p. 19) to a design, as well as to support the design’s assessment and evaluation. The author then details the components of the map while arguing why each component is essential to the tool.

Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253.

This article details the mechanics and modes of generating theory within design-based research. Bell tries to add loose boundaries to a family of theories inherent within design-based research, with the added perspective that those boundaries may not truly exist at all. It fits into this trifecta of articles seeking to increase the legitimacy of design-based research while arguing specific points.

 

Reaction.

I thought these articles reiterated previous thoughts on design-based research and didn’t really add too much to our growing knowledge base. And as thorough as the Sandoval article was, it still wasn’t as detailed as Vanessa’s instructions for conjecture mapping. Bell was interesting, particularly the idea of generating theory at the intersection between nomothetic and idiographic accounts, and of sustaining innovation (as if that were truly a possibility – upheaval can only be sustained for so long).

 

Discussion foci.

  1. What are the limitations for conjecture mapping and when would you decide not to use it as part of your design?
  2. I keep not “getting” humble theory. It’s the theoretical and conceptual framework undergirding your design, correct?

 


DBR Weekly Readings Part 4

Summaries.

Cohen-Vogel, L., Tichnor-Wagner, A., Allen, D., Harrison, C., Kainz, K., Socol, A. R., & Wang, Q. (2015). Implementing educational innovations at scale: Transforming researchers into continuous improvement scientists. Educational Policy, 29(1), 257–277.

This article describes the science of improvement as well as its roots and features. The authors also describe how to conduct research using the science of improvement and how this new research methodology differs from traditional research. I particularly appreciated the graphics within this article, which I thought aptly described the methodology underlying the science of improvement.

[Figure omitted] (Cohen-Vogel et al., 2015, p. 264)

The PDSA cycle, with its iterative nature, creates a convenient method for testing hypotheses generated by the three questions. But more than convenient, it is yet another application of the scientific method. I also appreciated the picture (seen below) that actually showed iterations of the PDSA cycle and how those iterations fit into the implementation of the research methodology.

[Figure omitted] (Cohen-Vogel et al., 2015, Figure 2, p. 265)

 

Donovan, M. S., Snow, C., & Daro, P. (2013). The SERP approach to problem-solving research, development, and implementation. National Society for the Study of Education Yearbook, 112(2), 400–425.

This article discusses and frames SERP’s contributions to DBIR, particularly its attempt to build infrastructure within DBIR projects. Again, the graphics were the most powerful parts of the article for me. Figure 1 describes SERP’s preferred setup when it interacts with a field site (where a DBIR project is being implemented). Figure 2 describes SERP’s process as it moves from design through implementation. What I particularly appreciate about Figure 2 is the description of each group’s (practitioner, researcher, designer) role within the process.

[Figure omitted] (Donovan et al., 2013, p. 407)

[Figure omitted] (Donovan et al., 2013, p. 410)

 

Sabelli, N., & Dede, C. (2013). Empowering design-based implementation research: The need for infrastructure. National Society for the Study of Education Yearbook, 112(2), 464–480.

This chapter describes several infrastructure frameworks, including ones for assessing the frameworks developed, that can help DBIR research become sustainable and successful. The discussion of the considerations needed for scaling up a DBIR project – depth, sustainability, spread, shift, and evolution (p. 468) – was particularly interesting, as I think the scaling-up process is the real trick of DBIR. The authors emphasized building both the human capacity (building relationships and enabling the humans involved) and the technological infrastructure needed to sustain a project. The last paragraphs were reserved for a discussion of funding and a summary.

 

Reaction.

I really enjoyed the gifs in the first two articles. They reminded me of the idea that “a picture is worth a thousand words.” And I started to wonder what information from my own work I should be summarizing in gifs. The flow gifs were of particular interest as they included a huge amount of information but were still easy to follow.

 

Discussion foci.

  1. Personal reflection: what kinds of information within your DBR project can be summarized in gifs?
  2. What infrastructure frameworks have been found to be most successful overall in DBIR?

DBR Weekly Readings Part 3

Summaries. [Instructions: Provide a brief (1 paragraph) summary of EACH reading assigned. This approach will support you to make progress on your final project for this class. Your summary may contain a quote, properly cited in APA format, as well as your interpretation or perspective on the quote. As a rule of thumb, you should spend twice as many words explaining/expanding on/critiquing any quote you use. Never use a quote as your own sentence, even when properly cited. Provide a summary for EVERY article assigned that you read.]

Penuel, W. R., Fishman, B. J., Haugan Cheng, B., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331–337. https://doi.org/10.3102/0013189X11421826

This article describes DBIR (design-based implementation research) more thoroughly, identifying four key elements that differentiate DBIR from traditional DBR. The four elements consist of:

  • “a focus on persistent problems of practice from multiple stakeholders’ perspectives;
  • a commitment to iterative, collaborative design;
  • a concern with developing theory related to both classroom learning and implementation through systematic inquiry;
  • a concern with developing capacity for sustaining change in systems.”

While the first two elements focus on similarities between DBIR and DBR, the last two reflect the larger scope of DBIR and the system-changing nature of this kind of research. The authors then used model examples from the literature to illustrate each element. Challenges were detailed, including the precarious role researchers play, the lack of funding for such projects, and partners’ sometimes only partial willingness to implement and participate in such research. In the future directions section, we again hear a call (as we did in previous readings for DBR) for the establishment of standards of practice and evidence.

 

Fishman, B. J., Penuel, W. R., Allen, A.-R., Cheng, B. H., & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education Yearbook, 112(2), 136–156.

While the first article was a summary of DBIR, this chapter is a more detailed articulation of DBIR’s place within research, as well as the authors’ attempt at answering the questions posed in the previous reading’s future directions section. Standards of practice and descriptions of potential evidence are both touched upon, but as this is an introduction to a book, such discussions are almost always deferred to the chapters that follow. A discussion of what would constitute supporting infrastructure, in terms of both policy and funding, is introduced in the last section before the conclusions.

 

Reaction. [Instructions: This part is a choose-your-own adventure freestyle place to react to one or more of the readings. You could describe how you plan to apply something you read, reflect on your own experiences, interpretations and beliefs. You can also synthesize across the readings. You do not need to do this for each reading—just one overall reaction.]

From the first article, the following principles adopted by SERP (Strategic Education Research Partnership) struck me:

“(a) research and development should be a collaborative endeavor between researchers and practitioners,

(b) partnerships should be based on addressing important problems of practice,

(c) practitioners should have a say in defining those problems.”

I think the reason why these principles struck me is due to my recent realization that I am a researching practitioner, not a practicing researcher. Much of what is described in both DBR and DBIR seems a bit daft at times, mostly because the vast majority of current DBR and DBIR already matches the way I think. And how both redeeming and scary that is!

 

Discussion foci. [Instructions: Include at least 2 questions, wonderings, or topics in total about the articles to encourage in-class discussion.]

  1. We’ve seen a call for potential standards and acceptable evidence from the readings both this week and last for both DBR and DBIR. But both methodologies seem to prize flexibility and context as main tenets. How does one delineate the former (standards and evidence) without compromising the latter (flexibility and context)?
  2. Readings often detail contexts in which the discussed research methodology has been successful. Have there been “epic fails” in DBIR? How were those situations handled and what was done to iterate the “fail” into something more successful?

DBR Weekly Readings Part 2

Summaries.
These summaries are given in the order I read the articles.

Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher, 32(1), 25–28.

The article details the authors’ main contributions to and knowledge of a National Research Council (NRC) report (2002) [full reference below], which describes educational research as a scientific endeavor. The article included an in-depth description of the scientific method via the “guiding principles of scientific research” (Shavelson et al., 2003, p. 26), as well as an emphasis on innovation and iteration within design methodologies. The authors finished the article by matching appropriate methodologies and theories to the generalized questions one could ask when conducting educational research (although I certainly agree with Vanessa’s note on the use of the word systemic as opposed to relationship). By providing these details, the authors helped the reader fit educational research into a science-based research framework.

National Research Council. (2002). Scientific research in education. Washington, DC: National Academies Press. https://doi.org/10.17226/10236

 

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. The Journal of the Learning Sciences, 13(1), 15–42.

This article compares design research to other educational research methodologies, including studies based in controlled environments (such as the laboratory or training environments), ethnographic research, and large-scale intervention studies. Collins’ comparison of controlled-setting research vs. design research (p. 20) from his 1999 article was particularly enlightening:

“1. Laboratory settings vs. messy situations…

  2. A single dependent variable vs. multiple dependent variables…
  3. Controlling variables vs. characterizing the situation…
  4. Fixed procedures vs. flexible design revision…
  5. Social isolation vs. social interaction…
  6. Testing hypotheses vs. developing a profile…
  7. Experimenter vs. co-participant design and analysis…”

In listing the comparative differences between the two research methodologies, Collins et al. also summarize what design research really is: research on learning performed in the real world. This research is innovative, iterative, evolutionary, and messy. Two predominant examples of design research – Brown and Campione’s “Fostering a Community of Learners” and Joseph’s Passion Curriculum – are then detailed to expand on this definition. The details that follow the examples describe how to implement and report a design research study.

 

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14. https://doi.org/10.1207/s15327809jls1301_1

Again, the reader is confronted with an in-depth analysis of what design-based research is and isn’t. While Collins’ (1999) comparison of DBR to controlled-settings research was elucidated clearly in the previous article, it doesn’t hold a candle to the ease of Table 1 within this article. The authors then go on to argue the merits of deciding what constitutes credible evidence, as well as which tools and methodologies are most useful. I found the Collins et al. and Barab and Squire articles to both focus on a call to the learning sciences community: (1) to develop an accepted definition of DBR; (2) to ask the fundamental question that guides all excellent research – why do we care; and (3) to develop and validate methodological practices and tools that will guide design-based research in the future.

 

Reaction.

Shavelson et al. was the article that helped me understand how much DBR (design-based research) and DBER (discipline-based educational research) overlap. It also reiterated the scientific method, which I saw in the DBR readings last week. How affirming this article was! Thank you, Vanessa, for assigning it.

I, of course, IMMEDIATELY downloaded the PDF of the NRC e-book, which you can find here: https://www.nap.edu/catalog/10236/scientific-research-in-education.

This article fits educational research into a framework I already know and love (i.e. scientific research). And reading the NRC e-book is definitely the next thing on my to-do list.

I also intend to send the Shavelson et al. article to CNM’s IRB (Institutional Review Board) (full disclosure – I sit on this Board) to start a discussion on how to approach design-based research in future proposals. Vanessa’s ideas from last week, including housing the research in current instructional practices while detailing the procedures that protect participants and secure data, are essential to the acceptance of such proposals. But allowing the research to evolve in the midst of a proposal, which makes the research contingent either on some kind of trust in the PI or on an ongoing conversation between the PI and the Board, is a new thing for us, and as such, it warrants an internal (and probably prolonged) discussion.

 

Discussion foci.

  1. This question is based on a comment I assume Vanessa made in the Shavelson et al. article. What are the fundamental differences between scientific research and design? And why would emphasizing one detract from the other?
  2. This quote from Barab and Squire (p. 12) – “More generally, as a field we have over-theorized the role of context, and at the same time we have done little to characterize the role of context in ways that can usefully inform our design work” – haunts me, perhaps because I am constantly looking for the broader impacts of a study. Thus my question: how can DBR expand (or how has it already expanded) to increase its transferability?

DBR Weekly Readings Part 1

Summaries.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178.

Wow. This article details the evolution of guiding methodologies, and practical considerations for their implementation, through roughly 30 years of educational psychology research, told from the perspective of Brown’s individual research program. It’s a rather fascinating journey: Brown details her cognitive reasoning for why certain choices were made along the way and supports it with illustrations from her projects. Her evolution (and that of her research program) is profound – from lab-based rote memorization techniques to embracing the chaos of the classroom (and designing ways to assess that chaos) – and Brown explains the shift in research methodology with elegance and relative ease.

 

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

This article provides a summary of what design-based research entails and gives brief examples of “good” (according to the authors) implemented design-based research. Design-based research forms a bridge between theory and practice, specifically research (ever guided by theory) and teaching (ever guided by experience). The overlap of a basic scientific-method scheme (with hypotheses, iteration, and communication as broad themes) and an emphasis on relational undercurrents within educational research seems to form the backbone of design-based research. An appreciation of documentation, particularly to form large databases of information, seems inherent within the design-based research modality as well.

 

Svihla, V. (2014). Advances in design-based research. Frontline Learning Research, 2(4), 35–45.

Through the use of instructional design and experimental results, this article argues that (a) DBR is a fundamental research methodology for the learning sciences, employing both iteration and reflection to inform the refinement of theory and practice, and (b) DBIR (design-based implementation research) is a way to “scale up” DBR. Standards for the conduct of DBR, as well as what kinds of data are most helpful, are also discussed.

 

Reaction.

I continue to embrace the discussion of innovation and failure within research and practice. DBR, in particular, seems not only to support these ideas but also to find them fundamental to any research methodology. In my own teaching practice, providing space for innovation, evolution, and failure is essential, and I try to both provide the space and model these teaching ideals.

 

Discussion foci.

DBR (design-based research) and DBER (discipline-based education research) were two of the first topics I ever heard Vanessa discuss (in Fall 2013), and both topics completely blew my mind. (I did not even realize that DBIR (design-based implementation research) existed until I read Vanessa’s article.) Since that early immersion, I have continually focused on how to bring DBER into my everyday teaching practice. Still, I’m not completely sure I’ve truly understood the distinction between DBR and DBER. Other than my perception that DBR and DBER are different frameworks for how to conduct educational research, how are the two different? And other than a difference in scale, how exactly are DBR and DBIR different?

Intro to a series where I share the cool DBR articles I’m reading

Hi everyone!

I’m currently reading some super interesting articles in the design-based research class (OILS 604) I’m taking with my advisor – Dr. Vanessa Svihla – at the University of New Mexico. It is a doctoral-level class and is part of my coursework for the Learning Sciences Ph.D.

As this is a particularly interesting class, especially in the ways scientific methodologies intersect with educational research, I’ve decided to share my weekly reading summaries via this blog to encourage others to read for themselves and thus discover the gloriousness of DBR (design-based research) and DBIR (design-based implementation research), as well as all of the in-betweens.

Enjoy, y’all!