We didn’t use the “card templates” because our information didn’t fit them cleanly and looked messy, but all of the same information is provided here. Ashley couldn’t complete a test due to a weekend conference but will contribute more to future project milestones.
Test Card #1:
Test Name: Validating approach to the college admissions process
Assigned To: Yasser Jamal
Deadline: 11/07/2024
Duration: 2 weeks
- Hypothesis
- We think that students don’t know how to approach the college admissions process and what schools to apply to.
- Test
- To test this assumption, we will put out a pre-survey on admission knowledge. The goal of this survey is to gauge their understanding of different factors in the admissions process such as school selection, application requirements and financial aid options.
- Metric
- We’ll measure admissions knowledge through a knowledge score based on survey responses, with each question weighted by its importance in the admissions process. Students will fall into categories: high knowledge (80-100%), moderate knowledge (50-79%), and low knowledge (below 50%). This initial score will serve as a baseline to identify gaps in understanding and track improvements after targeted guidance.
- Criteria
- We’ll know we’re right if the survey results reveal significant knowledge gaps across key areas of the college admissions process. Specifically, if a majority of students fall into the low knowledge category (below 50%) or struggle with foundational questions—like identifying key admissions deadlines or understanding financial aid basics—this would indicate they lack essential resources.
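As a concrete illustration of the scoring scheme above, the weighted knowledge score and its three bands could be computed as follows. This is only a sketch: the question weights and answers here are made-up assumptions, not real survey data.

```python
# Illustrative sketch of the weighted knowledge score described in the Metric.
# Weights and answers below are hypothetical, not actual survey values.

def knowledge_score(correct, weights):
    """Weighted percentage: weight earned on correct answers / total weight."""
    total = sum(weights)
    earned = sum(w for c, w in zip(correct, weights) if c)
    return 100 * earned / total

def category(score):
    """Bucket a score into the three knowledge bands used in the metric."""
    if score >= 80:
        return "high"
    if score >= 50:
        return "moderate"
    return "low"

# Example: 6 questions, with deadline and financial-aid questions weighted higher.
weights = [2, 1, 1, 3, 2, 1]                       # assumed importance weights
correct = [True, False, True, False, True, False]  # one student's answers
score = knowledge_score(correct, weights)
print(round(score, 1), category(score))  # 50.0 moderate
```

A follow-up survey scored the same way would let the baseline and post-guidance scores be compared directly.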
Learning Card Associated With Test #1:
Insight Name: Understanding students’ lack of knowledge of the college admissions process
Date of Learning: 11/05/2024
Person Responsible: Yasser Jamal
- Hypothesis
- Students don’t know how to approach the college admissions process or which schools to apply to.
- Observation
- After conducting a pre-survey, a majority of the 10 students surveyed scored in the low-to-moderate knowledge range (below 80%) on key aspects of admissions, including school selection, application requirements, and financial aid options. Open-ended responses also revealed common misconceptions and uncertainty around deadlines and scholarship opportunities.
- Learning and insights
- The survey results confirm that students lack foundational knowledge in critical areas of the admissions process. Gaps in understanding indicate a need for targeted guidance, especially around financial aid and school selection criteria.
- Decision and actions
- EduSphere will develop targeted content, including video lessons and interactive resources, focused on college admissions fundamentals like financial aid options, application requirements, and school selection strategies. We’ll also explore offering live Q&A sessions or webinars with admissions experts. A follow-up survey will measure students’ knowledge growth and guide any further content adjustments needed to effectively address their admissions knowledge gaps.
Notes from interviews/learnings for Test #1: Students expressed uncertainty and confusion across key areas of the college admissions process, particularly around school selection, financial aid, and application requirements. Many felt overwhelmed by the number of schools and lacked criteria for narrowing options. Financial aid was a major pain point, with students unclear on eligibility and types of aid, often asking where to find scholarships. Misconceptions about application requirements were common, such as overemphasizing extracurriculars or standardized tests, and students frequently appeared stressed about deadlines, with some believing they were flexible. Overall, students showed a strong interest in personalized guidance to navigate the process and understand what makes a strong application, reflecting a need for clear, structured resources on admissions fundamentals.
——————————————————–
Test Card #2:
Test Name: Analyzing the lack of feedback given to students
Assigned To: Abbie Maemoto
Deadline: 11/06/2024
Duration: 3 weeks
- Hypothesis
- We believe that students want to get help on their assignments in a timely and personalized manner, but are not currently able to due to the limited nature of office hours and other feedback platforms.
- Test
- To verify this, we will run an A/B test in which group A receives personalized, real-time feedback on their projects and group B receives generic, next-day feedback through email. Both groups will receive the prompt “Develop a business model for a hypothetical startup pitching a sustainable tech accessory” and will submit a response to us within 48 hours.
- Metric
- We will measure the difference in the amount of time spent on the assignment, student satisfaction via a post-assignment survey, and performance as scored by an experienced PM.
- Criteria
- We are right if all three metrics above are greater for group A than for group B.
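The success criterion (group A higher on every metric) can be sketched as a simple check. The group names and numbers below are illustrative placeholders, not results from the actual test:

```python
# Hypothetical sketch of the A/B success check; values are made-up placeholders.
from statistics import mean

group_a = {"minutes": [42, 55, 48], "satisfaction": [4, 5, 4], "pm_score": [4.0, 4.5, 4.2]}
group_b = {"minutes": [30, 35, 33], "satisfaction": [3, 4, 3], "pm_score": [3.0, 3.2, 3.5]}

def a_beats_b(a, b):
    """True only if group A's mean exceeds group B's on every metric."""
    return all(mean(a[m]) > mean(b[m]) for m in a)

print(a_beats_b(group_a, group_b))  # True for these placeholder values
```

With real data, a significance test would be a natural next step, but for a sample this small a simple comparison of means is the practical check.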
Learning Card Associated With Test #2:
Insight Name: Understanding how feedback impacts the quality of students’ work and learning
Date of Learning: 11/06/2024
Person Responsible: Abbie Maemoto
- Hypothesis
- We believed that personalized real-time feedback would have a direct correlation to improved performance as indicated by increased time spent on the task and higher student satisfaction.
- Observation
- We observed that students’ first submissions across both groups were extremely brief, including only 2-4 sentences. Upon receiving 3-5 bullet points of targeted feedback, group A’s responses incorporated all of the points outlined. Upon receiving generic feedback (e.g., “Perhaps explore the revenue streams more”), group B added only 1-2 sentences to their original responses.
- Learning and Insights
- From that we learned that targeted feedback motivates students to think critically about their responses and change their approach to the problem in a significant way. On the other hand, generic feedback resulted in very minor changes to the original submissions, indicating that generic feedback does not motivate students to incorporate the feedback.
- Decisions and Actions
- Therefore, we will include real-time, personalized feedback on our platform which selects the top 3-4 targeted areas for improvement, and provides clear, constructive steps to address these points.
Notes from interviews/learnings for Test #2: The difference between personalized and generic feedback, both generated with ChatGPT, significantly affects student motivation and assignment quality. When beginning an assignment, students often feel little incentive to invest effort. However, personalized feedback that acknowledges a student’s specific arguments creates a stronger motivation to engage with and incorporate the feedback. In contrast, generic feedback tends to make students feel less obligated to reciprocate effort, resulting in lower-quality revisions. Our data show that assignments incorporating personalized feedback scored an average of 1.2 points higher than those with generic feedback on a 5-point scale, as evaluated by an experienced PM. These submissions also took about five more minutes to complete, suggesting that personalized feedback encourages more thoughtful revision. Notably, personalized feedback did not appear to affect student satisfaction scores.
——————————————————–
Test Card #3:
Test Name: Analyzing the need for more personalization in the content of learning platforms
Assigned To: Siya Goel
Deadline: 11/04/2024
Duration: 2 weeks
- Hypothesis
- We believe students lack personalized support on educational platforms and in school, limiting their learning and engagement. Platforms like Khan Academy and Udemy offer extensive resources but often don’t tailor content to each student’s unique needs, pace, or goals. This makes it harder for students to connect with material in a way that feels relevant to their individual progress and learning style.
- Test
- To verify this, we will conduct in-depth interviews that compare a personalized learning experience with a generic one, gathering insights on student preferences and engagement. In each interview, we’ll introduce personalized features—such as customized learning paths and projects, and relevant content recommendations—using wireframes to show both generic and personalized interfaces. We’ll ask questions focused on relevance, motivation, and the perceived value of tailored support. By analyzing responses for common themes, we aim to confirm the importance of personalization and prioritize impactful features before MVP development.
- Metric
- We will measure engagement, motivation, perceived relevance, and satisfaction with each experience. We will track responses to questions about the platform’s relevance and support, the likelihood of continued use, and how many people preferred the personalized wireframe over the generic one. Additionally, we will analyze qualitative feedback for recurring themes to identify which personalized features most enhance the user experience.
- Criteria
- We are right if, in the interviews, students consistently express a stronger interest in the personalized features, and indicate that customized content would improve their learning experience. Positive reactions to the simulated personalized interface—compared to the generic one—will suggest that personalization aligns better with their needs and preferences, confirming its importance before MVP development.
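The headline preference share from the interviews can be tallied in a few lines; the responses below are assumed placeholders, not the actual interview data:

```python
# Hypothetical tally of wireframe preferences; responses are illustrative only.
from collections import Counter

responses = ["personalized"] * 17 + ["generic"] * 3  # assumed 20 interviews

counts = Counter(responses)
share = counts["personalized"] / len(responses)
print(f"{share:.0%} preferred the personalized wireframe")  # 85% preferred ...
```

The same tally extends naturally to per-feature preferences (learning paths, projects, recommendations) by counting each theme separately.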
Learning Card Associated With Test #3:
Insight Name: Understanding how much students value personalization in educational platforms
Date of Learning: 11/03/2024
Person Responsible: Siya Goel
- Hypothesis
- We believe students want more personalized content on learning platforms to better meet their unique needs, learning pace, and interests. Current platforms often provide one-size-fits-all content, which doesn’t fully address each student’s unique needs, pace, or style of learning. Tailored content can increase engagement, motivation, and understanding, making learning more effective and enjoyable.
- Observation
- We observed that out of 50 participants, 85% preferred the personalized interface, finding it more relevant to their learning goals. Students in the personalized group had a 20% higher engagement rate, spending an average of 10 extra minutes per session. Additionally, 70% were more likely to continue using a platform with customized learning paths and projects. These results suggest that personalization boosts engagement and user retention.
- Learning and Insights
- From that, we learned that personalization greatly boosts student engagement and retention. Students value customized learning paths and projects, which make them more likely to use the platform consistently. This suggests that prioritizing personalization can enhance satisfaction and promote ongoing platform use.
- Decisions and Actions
- Therefore, we will focus on the two to three top-rated personalized features: customized learning paths, targeted projects, and relevant content recommendations. We’ll also track engagement, retention, and satisfaction metrics to refine our approach based on user feedback and impact.
Notes from interviews/learnings for Test #3: From the interviews, we learned that students strongly preferred personalized learning features, especially customized paths and relevant content recommendations. Many felt more engaged and motivated when content aligned with their individual goals, whereas generic experiences often felt “disconnected” and less motivating. Targeted projects were especially appealing, as students valued projects that matched their strengths and growth areas. Overall, feedback indicated a clear preference for adaptive learning tools that respond to personal needs, with suggestions for adding personalized feedback and goal-tracking to further enhance the experience.