
AECT STANDARD 5

Instructional Strategy Plan

AECT Standard 5


Assessing/Evaluating - Candidates apply formal inquiry strategies in assessing and evaluating processes and resources for learning and performance. 


EDET 722

FALL 2023

Context & conditions

This artifact was an additional component to artifact 6. The instructional strategy plan was built around the objectives of that front-end analysis and applies World-Related Sequencing as described in MRK’s text.


Scope 

When I was laying the foundation for this learning module on delivering feedback, it became clear that the content naturally aligns with real-world scenarios. Managers don't operate in a vacuum; they engage in a series of actions and considerations when preparing to give feedback. There's a logical flow to it: gathering information, choosing a setting, scheduling, the conversation itself, and then follow-up. To teach this effectively, it made sense to mirror that natural progression. This approach, known as World-Related Sequencing, allows learners to connect new knowledge to their existing understanding of workplace dynamics, making the training more intuitive and impactful.

Adopting this strategy also provides a robust framework for skill transfer. By presenting the information in a sequence that reflects how managers deliver feedback, learners are better equipped to apply the training directly to their jobs. It's not just about theoretical knowledge; it’s about enabling them to walk away and immediately implement what they've learned. This aligns with AECT Standard 5 (2012) in that I had to apply formal inquiry strategies in assessing and evaluating processes and resources for learning and performance. For instance, instead of discussing setting expectations in isolation, the module integrates it into the natural flow of a feedback conversation, showing its relevance in the moment.


Instructional Design

Ultimately, World-Related Sequencing grounds the learning. It respects the learner's experience and leverages it, creating a more engaging and effective learning experience. This strategy isn't just an organizational tool; it's a way to enhance understanding, boost confidence, and foster the development of practical skills that managers can use to build stronger relationships with their teams. It also makes the learning flow more naturally: by structuring the content to mirror how giving feedback works in the real world, the training doesn't feel like a collection of separate lessons. Instead, it's more like a guided walk-through of something learners already know, which draws on their existing knowledge to make the training make more sense.

Creating the Instructional Strategy Plan was important for turning analysis findings into a clear learning design. This process involved more than just identifying what to teach; it required figuring out the best ways to teach it. I chose World-Related Sequencing because it matched the practical steps managers follow when giving feedback to their direct reports. This experience helped me see how instructional strategies work like blueprints that guide learning, making it more relevant and applicable.


Reflection 

The project also helped me value the link between instructional design choices and learning outcomes, especially in skill transfer and learner engagement, as noted in AECT Standard 5. World-Related Sequencing showed me how a structured approach can improve training effectiveness. Exploring other strategies, like conceptual sequencing for more abstract ideas such as vulnerability, also supports the careful decision-making that instructional design requires. This project showed that different learning goals often need different approaches, and it’s the designer’s job to choose the right strategy or combination of strategies to support meaningful learning and performance improvement in each context.

This is a screenshot of the instructional strategy plan. A link embedded within the screenshot directs you to the document.

Formative Evaluation and Report

AECT Standard 5

 

Collaborative Practice - Candidates collaborate with their peers and subject matter experts to analyze learners, develop and design instruction, and evaluate its impact on learners.  


EDET 793

SPRING 2025

Context & conditions

This artifact is an example of a formative evaluation and report conducted by a group on a novice instructional design student’s eLearning module, which teaches the audience about the ASSURE model of instructional design.

Scope 

I worked with Loren Breland and Wadrian Miller to complete the evaluation and report. My part in the work was around helping to write the initial methodology, finding the right SMEs to respond to the SME survey, contributing to assigning individuals to conduct the pre-test and post-test, and combining all the data, along with the report, into a single document for review.

I had never conducted a formative assessment such as this, and I think it was great practice for me and the group, especially since our final eLearning product would require the same SME review and assessment. What I found a bit challenging about the formative assessment was reconciling what we needed to do with how I had previously understood a formative assessment to be defined. For my own eLearning module, I decided to create a module on Dick and Carey’s model of instructional design, which provided more clarity.

Role 

We used a pre-test and post-test as tools to conduct the assessment, but I hadn’t been aware that these same approaches, which I had used in previous designs both professionally and within the program, could qualify as a complete formative assessment. MRK emphasizes the importance of aligning the goals and objectives to the questions, which Wadrian, Loren, and I worked through. We also made sure to include attitude-based questions in the post-test to gather more objective results.

Instructional Design 

Capturing and analyzing data was essential for getting useful insights from the evaluation. For the small group trial, we collected data from pre-test and post-test surveys using Google Forms. This method helped us gather information about participants' initial knowledge and learning gains after finishing the module, as well as how long it took to complete it.

Google Forms also helped us analyze the data. We compared answers to individual questions to identify specific areas for improvement or strong learning outcomes. For the attitude survey, we counted responses to find common views on the module's relevance, clarity, engagement, and organization. We looked at completion times to evaluate the module's efficiency.
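The comparisons described above can be sketched in a short script. This is only a hedged illustration of the kind of analysis we ran; the question keys, response values, and completion times below are hypothetical stand-ins, not the actual study data.

```python
from collections import Counter
from statistics import mean

# Hypothetical small-group data: 1 = correct, 0 = incorrect, one entry per participant.
pre_scores = {"Q1": [0, 1, 0, 1], "Q2": [1, 1, 0, 0]}
post_scores = {"Q1": [1, 1, 1, 1], "Q2": [1, 1, 1, 0]}

# Per-question learning gain: change in the share of correct answers
# from pre-test to post-test, used to flag strong or weak items.
gains = {q: mean(post_scores[q]) - mean(pre_scores[q]) for q in pre_scores}

# Attitude survey tally (e.g., responses to "The module was clearly organized").
attitude = Counter(["Agree", "Agree", "Neutral", "Strongly Agree"])

# Completion times in minutes, averaged to gauge the module's efficiency.
times = [22, 18, 25, 20]
avg_time = mean(times)
```

A gain near 1.0 on a question suggests strong learning; a gain near zero points to an item or content segment worth revising.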

Both the survey data and feedback from the Subject Matter Expert review were combined to give a complete picture of the module’s strengths and weaknesses. This helped inform recommendations for revisions.

Reflection 

This artifact aligns with AECT Standard 5 (2012) because my group explored, evaluated, synthesized, and applied methods of inquiry to enhance learning and improve performance. This project helped me understand the full cycle of a formative evaluation, from assessments through an SME review. I have conducted similar tasks in other courses and in my professional work, but they were limited. In my organization, we have identified a gap in our efforts to evaluate content; when we have conducted an evaluation, it has typically been summative. For example, we would conduct a survey but not ask the right questions to honestly evaluate effectiveness. Collaboration within the group and guidance from Dr. Grant were instrumental in building my confidence and my ability to articulate the significance of thorough evaluation in instructional design. I want to begin implementing extensive formative evaluations within my organization's team.

A screenshot of the formative evaluation and report, with an embedded link directly to the report.