3 C's Planning Documentation
AECT Standard 2
Assessing/Evaluating - Candidates demonstrate an inquiry process that assesses the adequacy of learning and evaluates the instruction and implementation of educational technologies and processes grounded in reflective practice.
EDET 703
SPRING 2024
Context & conditions
For this artifact, I worked with Loren Breland to create a treatment report using Kuhlmann’s Challenge, Choices, and Consequences (3C’s) model. This artifact aligns with AECT Standard 2 (2012) through an inquiry process that assesses the adequacy of learning and evaluates the instruction and implementation of educational technologies and processes grounded in reflective practice. This was the first treatment report I had ever created, but I was able to draw on my existing MS Word skills to create tables and on my access to Adobe Stock to choose appropriate images for the document.
Scope
The treatment report we built was for an eLearning project on beach safety that we designed in Articulate Rise (which, unfortunately, we no longer have access to). We partnered on this project as the major project in EDET 703. Loren and I decided to build a course on beach safety because we both enjoy the beach and saw it as a great opportunity to teach beach safety using the 3C’s model. The topics we chose to cover were the use of sunscreen, avoiding rip tides, understanding weather conditions, and the types of marine wildlife you could encounter. Each topic contained a challenge, choices the learner could make, and the consequences of those choices, either positive or negative.
Role
My role in this project was to lay the framework; gather appropriate materials for reference, engagement, and creativity; and act as the SME for the eLearning components. I enjoyed this project, as it was the first of many in which we built eLearning material. Loren and I worked hand in hand to create the treatment report, incorporating the design, style guide, storyboard, and site map for a high-level overview of what the target audience could expect from the eLearning module. For this project we used elements of both the MRK and ADDIE models to compose the learning content. We did not conduct a full analysis; instead, we focused on the instructional problems and objectives to build a learning opportunity. No formative or summative evaluation was performed either. This was a much simpler project, with the focus on executing the 3C’s within the content.
Instructional Design
What I learned from this project was how to complete a treatment report and how the 3C’s model adds an emphasis on learner accountability. It is essentially a choose-your-own-path approach: the learner chooses a path based on what they know, and then the consequences of their choice, whether good or bad, are unveiled. I had done similar framework-building and simple storyboarding with a team before, but this approach was new to me.
Reflection
The treatment report allowed me to take what I would consider a framework and bring it into a more organized format. I also learned that the 3C’s model helps build more interactive scenarios, which are more engaging and, when done correctly, improve learner retention, especially if the content can evoke emotion. In this project I also learned the importance of including a site map in a treatment report: it shows SMEs or stakeholders how the course will be navigated prior to development, giving them a sense of how the material will flow.
Technology Evaluation Plan
AECT Standard 2
Assessing/Evaluating - Candidates demonstrate an inquiry process that assesses the adequacy of learning and evaluates the instruction and implementation of educational technologies and processes grounded in reflective practice.
EDET 746
Fall 2024
Context & conditions
This artifact is a Technology Evaluation Plan (TEP) I developed to analyze how universities evaluate and integrate new technologies, focusing on comparing the University of South Carolina and Clemson University. Creating this TEP was an important learning experience for me as an instructional design student. It provided a structured way to think about technology adoption in university settings. Before this project, I had a general idea of technology evaluation, but I did not fully appreciate how detailed and strategic it needs to be in higher education.
​
Scope
In this TEP, I dug into how USC and Clemson approached technology evaluation and adoption. This aligns with AECT Standard 2 (2012) because I demonstrated an inquiry process that assesses the adequacy of learning and evaluates the instruction and implementation of educational technologies and processes grounded in reflective practice.
The comparison between the two universities helped me see that while both recognize the core components of a TEP – aligning technology with strategic goals, considering costs, and focusing on learning outcomes – their priorities differ. For example, Clemson highlights the importance of project management tools for large-scale technology initiatives and data analysis, showing its focus on managing complex tech projects. USC, however, strongly emphasizes IT security and accessibility to support the entire university, indicating its priority of creating a secure and accessible IT infrastructure.
Reflection
This project helped me better understand that a TEP isn't a generic checklist that all universities use in the same way; it must be carefully crafted to fit a school's specific situation and goals. When I looked at how USC and Clemson do things, I started to see how a school's strategic goals, its resources, such as available funding, and even the technology it already has in place can all influence how it decides to evaluate new technology.
This artifact is important to me because it shows my growth in comparing different approaches to evaluating technology and my understanding that many factors – such as strategic alignment, cost-benefit analysis, implementation logistics, user support, and data privacy – collectively shape smart decisions about using technology in higher education. It's not as simple as I initially thought.
If I could revisit this artifact, I would think more critically about why the universities prioritize some technologies over others. The one aspect I would re-evaluate is taking a deeper dive into the differences between the technology choices of the two universities. I'd like to understand why Clemson has a stronger focus on project management technology: does it have anything to do with research grants or STEM-related programs? Understanding the why would add substance to the TEP.

